Mastery of Knowledge and Skills: Claim 1

McAuliffe’s economically diverse scholars outperform their Framingham and state peers on ELA and math state assessments.

“CMCS uses quantitative and qualitative data to improve student outcomes. Stakeholders provided multiple examples of how the school has used data to modify the academic program.” - DESE site visitors, fall 2016

Introduction

We’ve made strong progress in our journey to help all scholars succeed academically at McAuliffe, and we have done so by focusing on implementation of EL Education core practices and on reaching our annual work plan goals and performance benchmarks. During the last four years we have prioritized raising expectations, increasing the rigor of our learning targets, and ensuring that assignments and assessments align to these targets. We have also focused on developing curriculum-embedded assessments and using assessment data to target instruction and provide tier 2 interventions through lab intervention classes. Our state-level data, as shown in the accompanying graphs and in our data profile, demonstrate how our scholars have consistently outperformed their peers in the aggregate, in the low income / economically disadvantaged subgroup, and in the students with disabilities subgroup. Our state-level data also tell a story of changing standards, tests, and test platforms (computer-based and paper-based), and of an EL Education school adjusting to these variables year by year. We begin with the story of testing our scholars from 2013 through 2017. We’ll then share the evidence of our diverse scholars outperforming their Framingham and state peers. Finally, we’ll articulate how we monitor our scholars’ academic success to ensure that all of our scholars are making their way to the top of the mountain, and how McAuliffe is making progress toward meeting our performance benchmarks.

Building Background Knowledge About McAuliffe’s State Assessment Results

Let’s start with a metaphor: when our 6th grade scholars learn about the predator-prey study of moose and wolves on Isle Royale (the longest continuous predator-prey study in the world), they learn that each data point has its own story. Each year, one data point is collected for the wolves and one for the moose. In times of increase or decline, the changes are not driven by predator-prey population dynamics alone; they also reflect situational factors such as a really cold winter, a really hot summer with a surge in the winter tick population, or even a hiker whose dog carries canine parvovirus and spreads it to the wolves on the island.

Such is the case for McAuliffe’s English language arts (ELA) and mathematics data points from the MCAS and PARCC assessments of 2013 through 2016. Each data point has its own story, influenced by the testing rollercoaster ride we have been on in the state of Massachusetts during this period. We wish we had four years of state data to share with you from the same test assessing the same standards. Alas, we do not. Instead, our data should be interpreted in the context of the changing testing climate in Massachusetts, as well as a connectivity challenge our scholars faced when taking the computer-based PARCC assessment in 2014-15.

When grappling with how to present data that span two years of MCAS and two years of PARCC (one computer-based and one paper-based), we spoke with Mark Conrad, EL’s Chief Schools Officer, to receive guidance. Because the MCAS has four achievement levels (Warning, Needs Improvement, Proficient, and Advanced) and the PARCC has five (Levels 1 through 5), we resolved that McAuliffe would use a measure that allows us to compare achievement (proficiency, not growth) across years and with other districts and the state. The achievement measure we have used is the Composite Performance Index (CPI). CPI is a performance measure, not a growth measure. Below is a definition of how a CPI score is calculated. The important information to know is that the state of Massachusetts was able to provide a CPI for schools that took MCAS and a “transitional CPI” for schools that took PARCC in 2015 and 2016. With this information, we are able to share McAuliffe’s achievement on an annual basis and in relation to our comparison district (Framingham) and the state of Massachusetts. Relatedly, the state of Massachusetts did not release any statewide data for the spring 2016 tests. We made a special request to receive this information for the sake of our portfolio, but were met with a standard reply that no statewide test results were released for this year. This is why there is no state data for 2016 on either the graphs or the data profile.

Massachusetts uses the 100-point Composite Performance Index to measure progress toward the goal of narrowing proficiency gaps. The CPI assigns 100, 75, 50, 25, or 0 points to each student participating in PARCC, MCAS, and MCAS Alternate Assessment (MCAS-Alt) tests based on how close they came to scoring Proficient or Advanced. (For example, all students scoring Proficient or Advanced are assigned 100 CPI points; students with very low assessment scores are assigned 0 CPI points.) The CPI for a student group is calculated by dividing the total number of points by the number of students in the group. The result is a number between 0 and 100.
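To make the calculation concrete, here is a minimal sketch in Python of the CPI arithmetic described above. The student point values are hypothetical and purely for illustration; they are not actual McAuliffe results.

    # Minimal sketch of the CPI calculation (hypothetical data).
    # Each student is assigned 100, 75, 50, 25, or 0 points based on
    # how close they came to scoring Proficient or Advanced.
    student_points = [100, 100, 100, 100, 75, 75, 75, 50, 50, 25]

    # CPI for the group = total points / number of students,
    # yielding a value between 0 and 100.
    cpi = sum(student_points) / len(student_points)
    print(cpi)  # 75.0 for this hypothetical group of ten

A group in which every scholar scored Proficient or Advanced would have a CPI of exactly 100.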

To make things even more interesting, Massachusetts rolled out an entirely new test in spring 2017 called the next generation MCAS (next gen MCAS). The state will be using data from the 2017 next gen MCAS to set a baseline for future testing and is not providing a comparable CPI as it has historically. The state also makes it clear that next gen MCAS data should not be compared with previous MCAS or PARCC data, because the test differs substantially in its expectations, question types, and scoring.

McAuliffe is proud to have outperformed the state and Framingham on 2017 testing, in the aggregate as well as in the students with disabilities and economically disadvantaged subgroups. Because we only recently received the full results, we have not integrated a full analysis into this and the next claim. However, we will be happy to answer questions about the results at our portfolio presentation. Click here for the letter sent to the McAuliffe community about the 2017 results. We have included a tab in our credentialing data with 2017 MCAS results to showcase our scholars’ performance compared with the state and Framingham.

McAuliffe Scholars’ Academic Achievement

During the last four years, McAuliffe scholars consistently outperformed our comparison district (Framingham) and the state in the aggregate and in the low income / economically disadvantaged and students with disabilities subgroups, in both English language arts and math. Our theory of action is that the stronger our implementation of EL Education core practices, the stronger our scholars’ academic performance compared with their peers in Framingham and across the state. At the end of our 2015-16 school year, we were proud to reach a score of 100 on the EL Implementation Review. We were also proud of our strong achievement results at the end of 2015-16, following difficult ELA results from 2014-15. The following charts begin with schoolwide English language arts data, exploring aggregate and low income performance schoolwide, by grade, by cohort, and in comparison with Framingham and the state. After that, we’ll guide you through our math performance through the same lenses. Mastery of Knowledge and Skills Claim 2 will showcase the story of our students with disabilities and their successes.

For additional explanation of results, we invite you to read a segment of our application for renewal, submitted to the MA DESE in July 2016, explaining 2012-2015 testing results. We also invite you to read the letter sent to McAuliffe families in September 2016 explaining 2016 PARCC results. Click here for both texts. Together, these texts tell the story of McAuliffe’s achievement during the last four-plus years in the context of our evolving standardized testing climate.

How are our scholars growing as readers and writers compared with district scholars?

The charts below showcase McAuliffe scholars’ ELA performance compared with Framingham and the state of Massachusetts. Please note that the state did not release data in 2015-16 for comparison.

Chart 2.1.A focuses on the entire student body. In this chart, notice that McAuliffe scholars outperformed Framingham in English language arts in each of the four years. McAuliffe scholars outperformed the state in 2012-13 and 2013-14 and had comparable performance in 2014-15.

Chart 2.1.B illustrates grade level performance in the aggregate and compared with Framingham. In this chart, notice that McAuliffe scholars outperformed Framingham at each grade in each year, except for an outlier data point for 6th grade in 2014-15. Sixth grade scholars were the most impacted by the connectivity challenge we faced during PARCC testing that year. The steep drop and then steep incline following 2014-15 indicate that this is, indeed, an outlier in the data set. Eighth graders, who were testing at the same time but were less impacted than 6th graders, also had a subtle dip. Seventh graders tested on a separate day, so they were not interrupted by the technology infrastructure issues.

Chart 2.1.A

Chart 2.1.B

The slideshow to the left showcases our low income/economically disadvantaged scholars' performance in ELA. Note: The state shifted its usage from “low income” to “economically disadvantaged” in 2014-15; they are essentially the same subgroup.

In these charts, notice that McAuliffe scholars outperformed Framingham in English language arts in each of the four years. McAuliffe scholars outperformed the state in 2012-13 and 2013-14 and performed just below the state in 2014-15.

With the exception of one data point, all grade level data indicate that McAuliffe low income scholars are outperforming Framingham low income scholars. The one outlier is 6th grade ELA in 2014-15, the year the technology infrastructure challenges impacted testing conditions. Also noteworthy is the decline in 8th grade low income performance from 2012-13 through 2015-16. Framingham also saw a decline in this subgroup’s performance in 7th grade beginning in 2013-14, with a similar trend in 8th grade from 2013-14 through 2015-16. This observation guided us to look at two cohorts of low income scholars and their performance in ELA over time. Chart 2.1.C allows us to follow two of our low income cohorts from sixth, to seventh, to eighth grade. The data show consistent performance over three years at the school, with slight variability from year to year (e.g., the Class of 2015’s performance dips from 6th to 7th grade and then rises back up in 8th).

Finally, Chart 2.1.D illustrates the McAuliffe and Framingham low income subgroups compared with the McAuliffe and Framingham non-low income subgroups. Our credentialing data profile shows the specific numbers behind each district’s gap between low income and non-low income scholars. In addition to each McAuliffe subgroup outperforming the district, McAuliffe had a smaller gap (or difference) between low income and non-low income subgroups than Framingham. In particular, McAuliffe’s gap decreased from 2014-15 to 2015-16.
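For clarity, the gap here is the simple difference between the two subgroups’ CPI values. In notation (the numbers below are hypothetical, for illustration only):

    \text{gap} = \mathrm{CPI}_{\text{non-low income}} - \mathrm{CPI}_{\text{low income}}

For example, a hypothetical non-low income CPI of 92.0 alongside a low income CPI of 84.5 would yield a gap of 7.5 points; the decrease described above means this difference shrank from one year to the next.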

Chart 2.1.C

Chart 2.1.D

Analysis of each chart shows a decline between 2012-13 and 2014-15 and then a steeper incline in 2015-16. Following the dip in 2012-13, we made some adjustments to our reading and writing curriculum and instructional coaching. In particular, we made curricular adjustments to ensure that the texts our scholars were reading were at the appropriate Lexile level. Some less rigorous texts were replaced with more rigorous ones, such as a shift from book groups that read “Parvana’s Journey” and “Mud City” by Deborah Ellis and “When My Name Was Keoko” by Linda Sue Park to excerpts from “Where Am I Wearing” by Kelsey Timmerman. Our humanities teachers also worked with their instructional coach to align targets more clearly with the updated state curriculum frameworks and the PARCC assessment. When we saw another decline in performance in 2013-14, we were surprised, because we had taken actions to improve our program just the previous year. In making sense of the data, we also considered the impact of the computer-based testing we conducted in 2014-15 and the connectivity challenges that interfered with testing conditions. We had the most connectivity issues during ELA testing (compared with math), and it was our ELA testing that showed the decline in performance. Our theory is that our scholars’ ELA performance in 2014-15 was negatively influenced by testing conditions; this theory was borne out when our scholars took the paper-based assessment in 2015-16, had stable testing conditions, and showed substantial performance improvement.

We acknowledge gaps in performance between scholars who are economically disadvantaged and those who are not. Though the gap is not as wide in ELA as it is in mathematics, we have set goals to continue to provide engaging and meaningful curriculum, instruction, and interventions so that all of our scholars demonstrate strong reading, writing, and communication skills.

How are our scholars growing as mathematicians and problem solvers?

The charts below showcase McAuliffe scholars’ math performance compared with Framingham and the state of Massachusetts. Please note that the state did not release data in 2015-16 for comparison.

Chart 2.1.E focuses on the entire student body. In this chart, notice that McAuliffe scholars outperformed Framingham in each of the four years in mathematics. McAuliffe scholars outperformed the state in each of the three years in which state data was released. Chart 2.1.F illustrates grade level performance in the aggregate and compared with Framingham. In this chart, notice that McAuliffe scholars outperformed Framingham at each grade in each year. McAuliffe grade level performance was also relatively stable during the course of the four years.

Chart 2.1.E

Chart 2.1.F

The slideshow below focuses on our low income / economically disadvantaged scholars’ performance in mathematics. Note: Low income and economically disadvantaged are essentially the same subgroup in the state. In these charts, notice that McAuliffe scholars outperformed Framingham in each of the four years in mathematics. McAuliffe scholars also outperformed the state in two of the three years of data provided. We performed 0.1 points below the state’s CPI in 2014-15, the first year our scholars all participated in computer-based testing. Our grade level cohorts of low income scholars are, for the most part, outperforming Framingham low income students. However, in 2015-16 both seventh and eighth grade McAuliffe scholars performed below their Framingham peers in mathematics. Meanwhile, we saw a substantial increase in sixth grade low income performance from 2014-15 to 2015-16.

The final chart in the slideshow illustrates the McAuliffe and Framingham low income subgroups compared with the McAuliffe and Framingham non-low income subgroups. Our credentialing data profile shows the specific numbers behind each district’s gap between low income and non-low income scholars. In addition to each McAuliffe subgroup outperforming the district, McAuliffe had a smaller gap (or difference) between low income and non-low income subgroups than Framingham. In the last two years, our gap has been similar in size to Framingham’s.

McAuliffe scholars’ mathematics performance, in the aggregate as well as in both subgroups, was steady from 2012-13 to 2015-16. However, McAuliffe’s mathematics performance has historically been below our ELA performance, so mathematics continues to be the content area we’ve identified as most in need of improvement. In addition, we recognize the achievement gaps that exist between our scholars who are economically disadvantaged and those who are not. Gap analysis and comparison with Framingham indicate that our gaps are smaller than the district’s. Nevertheless, we have named it a top priority to improve our scholars’ mastery of knowledge and skills in mathematics, both in the aggregate and specifically for students who are economically disadvantaged.

Our theory of action is that continued implementation of EL core practices, such as using assessment data to inform instruction and intervention, along with our instruction and tracking of habits of work and learning, will help us close the gap. Meanwhile, our work to build a strong culture of mathematics will continue to lift our overall mathematics performance. Specifically, we aim to target the performance of our low income subgroup in 7th and 8th grades. The next section explains how we use student learning data to inform interventions.

Using Data to Inform Instruction and Intervention

We are committed to guiding every scholar in the school up the mountain toward the summit for each lesson, unit, and course. We also recognize that learning is not “neat” much of the time, and so we have systems in place to monitor and track student progress in their classes, to identify struggling scholars, and to provide intervention. Over the course of the last four years we have built a lab intervention and extension program that provides tier 2 instruction for struggling scholars and extension opportunities for those who are not struggling. We have also developed curriculum-embedded assessments, which serve as common assessments of progress toward mastery of learning targets. These practices align with EL core practices pertaining to the use of assessment, diverse scholars, and leadership. Below we’ll share a brief story of each practice with a complementary piece of evidence that helps tell the story.

Lab Intervention and Extension
Over the course of the 2014-15 school year, our school-wide focus on data crystallized our understanding that for many of our scholars, gaps in basic math, reading, writing, and social skills were a roadblock to academic success. After administrators and teachers participated in a site visit to Two Rivers Public Charter School (an EL school in Washington, D.C.) and saw their intervention block, we launched our “Lab” intervention/extension program in 2015-16. In Lab, scholars who have gaps in reading, writing, math, or social skills (what we call “social thinking”) spend three 45-50 minute blocks per week improving their skills in a small group setting. Scholars who are at or above grade level participate in an extension class on topics such as Marine Biology, Japanese Art, or Robotics. Scholars are placed in labs based on standardized assessment data, their progress toward learning targets in their core courses, and teacher recommendations.

The story of our math Lab is a good example of the progress we have seen through this intervention block. We have found that many scholars enter McAuliffe not knowing how to add, subtract, multiply, and divide whole numbers, let alone work with fractions, decimals, and integers. We use a series of computation assessments that allow teachers and scholars to identify the weak areas in a scholar’s math knowledge. Those weak areas are targeted with individualized practice and instruction, then retested to show growth. Scholars are able to choose which skills they want to tackle first and are highly motivated to “test out” of Math Lab. This sample lab assessment illustrates both the foundational skills scholars work on in this intervention block and an assessment tool we used in 2016-17.

For the 2017-18 school year, our math and reading labs have started to use iReady, an online program that allows us to easily monitor scholars’ progress and determine next instructional steps. We look forward to expanding our use of ongoing assessment to track student progress and inform any adjustments to intervention.

Curriculum-Embedded Assessments
To make sure that scholars grow as learners, in the 2015-16 school year we shifted from out-of-house standardized interim assessments (Achievement Network) to curriculum-embedded assessments developed in-house. We made this shift for two reasons: 1) to create high-quality, common assessments across collaborative teaching teams that were genuinely aligned to both the standards and the content we teach; and 2) to ensure an efficient data cycle in which teachers could promptly use results to design lessons that better meet the needs of individuals, small groups, and whole classes.

In ELA, we knew we wanted to measure students’ ability to read and write independently, and we also wanted those texts and writing assignments to be intimately connected to the work scholars were doing in class so that they could be genuine assessments of the learning they had done at McAuliffe. After studying the EL modules, we created sets of mid-unit on-demand writing assessments that measure both ongoing standards (e.g., Common Core Standard R.1) and standards specific to the unit scholars are working on. Instead of using generic prompts to measure scholars’ progress in writing open-response essays, we embed this assessment within the context of the curriculum scholars are already studying. By assessing scholars more authentically, we are able to evaluate progress on discrete writing goals and on specific academic learning targets. For example, this 7th grade interim assessment was part of a unit based on EL’s “Water is Life” module; it made use of the standards and content of the unit but required students to read and write about a text they had not read before.

In math, our math/science instructional coach guided teachers to write assessments of mathematical knowledge and skills that get to the depth and rigor of the standards while mimicking the variety of ways that scholars are assessed on the PARCC and MCAS. See below for samples of assessments that have evolved over time.

In the examples above, our math teachers studied the standards and refined our targets to ensure that they reflected all components of those standards. In order to facilitate student-engaged assessment and data cycles, we placed all targets at the top of each assessment and coded the supported target within each problem. In the 2015-16 assessment linked above, we assessed scholars’ ability to use the Pythagorean theorem to solve problems. Upon reviewing the standard and assessment, we realized that the assessment lacked a check for conceptual understanding. In the 2016-17 assessment you can see that question seven directly assesses scholars’ understanding of the Pythagorean theorem: “Explain how this figure demonstrates how the Pythagorean theorem works.”
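To illustrate the distinction in our own words (this sketch is ours, not an item from the assessments themselves): a procedural item checks whether a scholar can apply the formula, while a conceptual item like question seven asks for the area relationship the theorem expresses.

    % Procedural fluency: solve for the missing hypotenuse of a right
    % triangle with legs 3 and 4
    c = \sqrt{a^2 + b^2} = \sqrt{3^2 + 4^2} = \sqrt{25} = 5

    % Conceptual understanding: the squares built on the two legs
    % together have exactly the area of the square built on the hypotenuse
    a^2 + b^2 = c^2

A scholar can earn the procedural point by computation alone; explaining the second relationship from a figure requires understanding why the theorem is true.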

Now that we have stronger assessments, we are focusing more intensively on using assessment data to inform adjustments to practice. Instructional coaches are developing data cycle calendars that allow for in-depth analysis of the results of assessments and intentional action planning to re-teach skills scholars still need to master.