How Should “Gap-Closing” Schools Measure Success?

May 02, 2016


Kyle Smucker

Analytics and Communication Coordinator, Accelerate Institute

Data-driven schools and education organizations in under-served communities should ask themselves two questions when measuring success: 1. How much are our students expected to grow? and 2. Have our students exceeded those expectations, and are they therefore closing the achievement gap?

Picture this: you are finishing your first year working at a turnaround school. The school serves a largely socio-economically disadvantaged population and received an F grade on the state assessment for several years prior to turnaround. While you are confident you and your staff have had a great impact on the school this year, students are still below the district average. So how can you measure your impact and answer those two key questions?

Scenarios like these are why growth measures (or “value-add”) are so important in our work. Closing the achievement gap doesn’t mean going “0 to 60” in seconds, but it does mean that students are accelerating on a trajectory towards success and college readiness, beyond the status-quo for disadvantaged students. While measuring benchmarks like proficiency is important, growth measures are perhaps a better indicator of success and more useful for schools closing the achievement gap.


How do you measure growth in these scenarios accurately? Measuring growth answers our two key questions in two steps: 1. identify the growth projection for your students, and 2. measure how far beyond that projection your students actually grew. Growth projections are made by comparing your students’ initial scores to a large sample of students who began the year with similar scores.

Some assessments make this easy: NWEA-MAP, for example, conducts national norming studies to determine the typical trajectories of students with different fall scores. When students test again in the winter or spring, you can see the growth students are making relative to those expectations. Last year, our Ryan Fellow-led schools achieved 144% of their growth projections from fall to spring, or “1.44 years of growth.” Because we know students exceeded their initial projections, we can say confidently that Ryan Fellow schools are closing the achievement gap.
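The arithmetic behind a “percent of projection” figure is straightforward: total the growth your students actually made, total the growth the norms projected for students with the same starting scores, and divide. Here is a minimal sketch using made-up numbers (the norm table and scores below are hypothetical illustrations, not real NWEA-MAP norms):

```python
# Hypothetical norm table: fall score -> projected fall-to-spring growth
# for a typical student who started at that score.
projected_growth = {180: 16, 190: 14, 200: 12, 210: 8}

# (fall score, spring score) for a small sample of students.
students = [(180, 204), (190, 212), (200, 216), (210, 220)]

# Total growth the students actually made.
actual = sum(spring - fall for fall, spring in students)

# Total growth the norms projected for students with those fall scores.
expected = sum(projected_growth[fall] for fall, _ in students)

# Ratio of actual to projected growth: 1.0 means students grew exactly
# as expected; above 1.0 means they beat their projections.
pct_of_projection = actual / expected
print(f"{pct_of_projection:.0%} of projected growth")  # prints "144% of projected growth"
```

A ratio above 100% is what lets a school say its students are outpacing peers who started in the same place, which is the whole point of a growth measure.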

Most state tests and assessments have growth projections you can use; some are easier to find and use than others.

However, not all growth projections are the same. Take the case of New Jersey versus New York state growth measures, both of which are called Student Growth Percentiles (SGPs). New York takes disability, English language learner status, and economic status into account when projecting growth, while New Jersey considers only students’ initial scores. It’s important to understand how your growth projections are made and what they can and can’t tell you.
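The core idea of an SGP is a ranking: what percent of students who started at a similar score did this student outgrow? States compute real SGPs with quantile regression over large cohorts, but a simplified percentile-rank sketch conveys the intuition (all numbers below are hypothetical):

```python
# Hypothetical growth (in score points) for a peer group of students who
# began the year with scores similar to our student's.
peer_growth = [4, 6, 8, 9, 11, 12, 13, 15, 18, 20]

# Our student's actual growth this year.
student_growth = 13

# Simplified SGP: the percent of similar-scoring peers this student outgrew.
sgp = 100 * sum(g < student_growth for g in peer_growth) / len(peer_growth)
print(f"SGP: {sgp:.0f}")  # prints "SGP: 60"
```

An SGP of 60 says the student grew more than 60% of comparable peers. The New Jersey/New York contrast above is about how that peer group is defined: by prior scores alone, or by prior scores plus demographic factors.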

What if your school’s assessments don’t make growth projections? If you look in the right places, they probably do! Often, if you search “growth norms” or “expected growth” along with the name of your assessment, you can find a way to make projections for your students. For example, ACT EPAS and the Scholastic Reading Inventory (Lexile) both have normed projections that many teachers and leaders are not aware of. Finally, do a thorough check of your state assessments and how they measure growth. At least 20 states have adopted growth measures similar to those described here.

Accelerate Institute is proud to have the resources and training to determine whether or not we are making an impact, but every analysis starts with those same two questions. Ask yourself: does your school have the resources to look at student growth accurately? I’m happy to help ask and answer any questions about data, and am always looking to learn more. Leave a comment below or contact me at ksmucker@accelerateinstitute.org.



Kyle handles data collection and analysis for the recruitment and assessment of Accelerate Institute program members, as well as managing our social media platforms. Prior to this, Kyle worked as an intern at LIFT in Washington, D.C., a social service non-profit with locations around the U.S. Kyle’s lifelong interest in income inequality and data-driven strategies to combat poverty led to his move from Ohio to Chicago to work with Accelerate Institute.

