I was hired to launch a Computer Science program in the Middle School. When I arrived, the school had been teaching Computer Science in its high school for several years. Demand for the program was high from the start and remained so throughout my tenure: on average, one-third of the students in the Middle School were enrolled in one of my classes.
The first year I was there, I launched our Computer Science classes. This involved extensive research into curriculum platforms for Computer Science; after I selected CodeHS as our platform, I wrote a proposal to purchase the Pro version. We also began our curriculum review for Computer Science in the spring of that year.
In developing the curriculum, I focused on two key concepts: Top-Down Design and Debugging.
Top-Down Design is the process of taking a large problem and breaking it into smaller, easier-to-solve sub-problems, then connecting the solutions of those sub-problems to create a solution to the larger problem. In short, it is "divide and conquer" applied to computer programming.
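The idea can be sketched with a short Python example (all names here are invented for illustration, not taken from my actual curriculum):

```python
# Top-down design, illustrated: the large problem ("compute a
# student's course grade") is broken into three sub-problems, each
# solved by its own small function, then composed in one place.

def drop_lowest(scores):
    """Sub-problem 1: remove the single lowest score."""
    return sorted(scores)[1:]

def average(scores):
    """Sub-problem 2: average a list of scores."""
    return sum(scores) / len(scores)

def letter_grade(percent):
    """Sub-problem 3: map a percentage to a letter grade."""
    if percent >= 90:
        return "A"
    if percent >= 80:
        return "B"
    if percent >= 70:
        return "C"
    return "F"

def course_grade(scores):
    """The large problem, solved by connecting the sub-solutions."""
    return letter_grade(average(drop_lowest(scores)))

print(course_grade([62, 95, 88, 91]))  # lowest score dropped, then averaged
```

Each function is small enough to write, test, and debug on its own, which is exactly what makes the decomposition valuable for beginners.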
Breaking problems into smaller sub-problems is an essential skill for anyone who wants to create programs on their own; without it, it becomes incredibly difficult to figure out where to start. My curriculum at RIS spent considerable time exercising the skill of identifying the large problem, its sub-problems, and their sub-problems in turn. I also taught students to think about how best to define the boundaries of a given problem. These exercises equipped my students with the skills needed to design a computer program.
Debugging is the process of removing defects from software. It is an incredibly challenging task: according to a variety of studies of professional programmers conducted over the last forty years, programmers spend 30 to 60% of their time debugging. It is also an incredibly frustrating process for beginners, for a wide variety of reasons, some of which are explored in my master's thesis.
I taught debugging through three major streams of delivery.

First, we began class with regular "speed debugging" activities, in which students were shown the most common typos found in student code and asked to spot them. I delivered these through Quizizz and tracked the class-wide average, challenging each class to improve its results over time.

Second, we engaged in group debugging of the teacher's code. This normalized the idea that making mistakes is part of computer programming, so it is OK if your code has mistakes in it. It also helped students develop a growth mindset that builds confidence as they work through setbacks and failures across all subjects. These sessions focused on bugs where the code appears to work but is incorrect; most commonly, such bugs stem from failing to define the boundaries of a given problem or sub-problem.

Finally, students were asked to debug code they had not written. This is a common activity for professional programmers, and it is also important because it teaches students how to read code and understand what it does before they start changing it.
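A hypothetical example of the kind of boundary bug we debugged as a group (the function and scenario are invented for illustration): the code runs without errors and often looks right, but the boundary of the problem is defined incorrectly.

```python
def count_evens_up_to(n):
    """Intended: count the even numbers from 1 to n, inclusive."""
    count = 0
    for i in range(1, n):      # Bug: range(1, n) stops at n - 1,
        if i % 2 == 0:         # so n itself is never checked.
            count += 1
    return count

def count_evens_up_to_fixed(n):
    """Fixed: range(1, n + 1) includes the boundary value n."""
    return sum(1 for i in range(1, n + 1) if i % 2 == 0)

print(count_evens_up_to(10))        # 4 -- wrong: misses 10 itself
print(count_evens_up_to_fixed(10))  # 5 -- counts 2, 4, 6, 8, 10
```

Bugs like this reward careful reading over trial-and-error, since the program "works" on many inputs and only fails at the edges of the problem.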
The second year, I added Robotics courses, which adopted the NGSS Engineering standards for Middle School. The class was structured around the creation and maintenance of an engineering notebook documenting the development and design of the students' robots over the course of the semester.
I selected the engineering notebook as the primary means of assessment because constructing a robot is not the primary skill taught in our robotics classes. Rather, the class focuses on the incremental design of a solution to a challenge, task, or problem. These classes are structured around a series of tasks that students construct robots to complete. Along the way, they document the intermediate versions of their robot and its code on the way to a working solution.
Part of this documentation is identifying the good and bad aspects of how the robot behaved on the challenge table. Students then construct a plan to modify their robot and its code so that the good aspects are preserved and the bad aspects are minimized or removed. We encapsulated this process as "build, test, document, reflect."
This approach to assessing robotics is a key part of all major robotics competition programs, including VEX, FRC, and MakeX.
In my third year, I developed a Digital Skills course intended to build the technology skills needed to succeed in a Google-based school. The course had three areas of focus. First, it taught the skills needed to maintain the lines of communication students use regularly to collaborate with peers and communicate with teachers: managing Google Classroom, keeping their Gmail inbox under control, and organizing their Google Drive. The second area of focus was general competence in the Google suite of software used daily in the Middle School, including Google Docs and Sheets. The final area of focus was basic troubleshooting skills for maintaining their computers, both software and hardware, ranging from how and when to install software updates to an overview of the parts of a computer. As part of troubleshooting, we also covered password management and the basics of cybersecurity.
For our curriculum review, I developed detailed curricula from scratch for six different courses. I then researched other CS programs to see what other schools were doing. In the end, I produced a detailed report, delivered to our administrative team, that included a proposal to expand our robotics program and add Digital Skills to our curriculum.
RIS implemented daily 45-minute blocks for students to engage in their own projects. Student locations during this time were fluid, as students frequently moved to specialist locations to use the 3D printer, access subject-matter experts, or conference with partners working on similar projects. Additionally, a content support program ran during the same time: students joined this block if they felt they needed additional support with a specific concept in one of their classes, or if they needed to work on their learning plan before a reassessment opportunity. This fluidity did not work well with PowerSchool, where we tracked attendance.
To provide consistent tracking while we pursued a permanent system, I created a temporary solution over the weekend before the program launched. Using Python 3 and Django, I built a web application, hosted for free on PythonAnywhere, that let teachers take attendance for their students and move students to a new room, with the teacher in that room seeing that the student had been moved. I also built an attendance-reporting dashboard with limited access, which allowed the principals to see a real-time list of all students marked absent as well as total tardies and absences over the course of a semester.
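The core logic the site implemented can be sketched in plain Python (a simplified, Django-free illustration with invented names; the real tool stored this state in Django models and exposed it through web views):

```python
class FocusBlockTracker:
    """Hypothetical, in-memory model of the tracker's core logic."""

    def __init__(self):
        self.location = {}  # student -> current room
        self.status = {}    # student -> "present" / "absent" / "tardy"

    def assign(self, student, room):
        # Place a student; they count as absent until checked in.
        self.location[student] = room
        self.status[student] = "absent"

    def mark(self, student, status):
        # A teacher checks attendance for a student in their room.
        self.status[student] = status

    def move(self, student, new_room):
        # The receiving room's roster picks the student up immediately.
        self.location[student] = new_room

    def roster(self, room):
        # What a teacher sees for their own room.
        return sorted(s for s, r in self.location.items() if r == room)

    def absences(self):
        # What the principals' dashboard surfaced in real time.
        return sorted(s for s, st in self.status.items() if st == "absent")


tracker = FocusBlockTracker()
tracker.assign("Ana", "Makerspace")
tracker.assign("Ben", "Library")
tracker.mark("Ana", "present")
tracker.move("Ben", "Makerspace")
print(tracker.roster("Makerspace"))  # ['Ana', 'Ben']
print(tracker.absences())            # ['Ben']
```

Keeping location and attendance status separate is what made the fluid room changes workable: moving a student never disturbed their attendance record for the block.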
The solution was well received by students and teachers alike, who found it easy to navigate and use. Additionally, the team running our Focus Block was able to quickly and easily manage the location of students and teachers and keep track of student numbers in rooms to ensure that they weren’t overloaded.
The solution I built was never intended to be permanent. Instead, it was designed as an intermediate tool to track student location while I worked with our PowerSchool Administrator and with PowerSchool itself to learn how to create a plugin that could interface directly with PowerSchool. As an immediate stopgap, though, the website gave us a much more systematic way of tracking students during the year it took to develop that permanent solution.