How We Moved Our District Forward Through Data-Based Instruction



By Tracey Severns, EdD

Dr. Tracey Severns is a nationally recognized and award-winning educator with over 20 years of experience. During the 2011-2012 school year, she served as the principal of Mt. Olive Middle School and as an adjunct professor in the Educational Leadership program at Centenary College in Hackettstown, NJ. She has received three Principal Fellowship Grants from the Dodge Foundation and the Outstanding Service and Leadership Award from the New Jersey Coalition of Educational Leaders, and was selected as one of eight principals for the 2009 Brazil Award-Winning Principals’ Program. In August 2012, the New Jersey Department of Education hired Dr. Severns as the Deputy Chief Academic Officer.

The Partnership for Assessment of Readiness for College and Careers (PARCC) is a coalition of states committed to testing students exclusively on computers, according to the Common Core State Standards, by the 2014-2015 school year. PARCC has been a hot-button issue for educators and a daunting challenge for many schools as they search for ways to prepare for the change. This account of Mt. Olive Middle School’s early foray into PARCC preparation offers successful practices to help administrators and teachers not only implement the data-gathering technology that PARCC requires, but also use that data to usher in a new era of data-driven education.

Arguments continue to swirl around PARCC as educators, some fearfully, some disdainfully, and some expectantly, express their passionate views on the pending change. I’m sure many schools harbor a range of opinions, and ours was no exception. That’s precisely why we proposed an experiment that we hoped would shed light on what we could expect from PARCC and how we could make sure we would succeed during this period of rapid change and beyond. The experiment showed that PARCC compliance would be a challenge, but a manageable one with the right strategies. Moreover, our results hinted at the possibility of a new era of data-driven teaching that could change the face of education.

The Experiment

As a New Jersey school, Mt. Olive Middle is required to adopt PARCC-mandated digital testing and assessments by the 2014-2015 school year. With such a significant shift on the horizon, Mt. Olive saw the critical importance of helping staff become comfortable with the idea as quickly as possible. Without ample preparation time to smooth adoption and integration, we knew it would be virtually impossible to navigate the change successfully. As such, we undertook an early trial so that students, teachers and administrators would know what to expect from PARCC and how we could work together to improve student performance.


The Technology

PARCC requires schools to implement Common Core State Standards-based online or computer-based assessments for students. PARCC states have committed to building a K-12 assessment system that:

  • Builds a pathway to college and career readiness for all students
  • Creates high-quality assessments that measure the full range of the Common Core State Standards
  • Supports educators in the classroom 
  • Makes better use of technology in assessments
  • Advances accountability at all levels

The selection of the right technology provider was integral to the success of the experiment for Mt. Olive and no easy feat. The district administrators embarked on an extensive search, seeking a comprehensive technology platform that could accomplish the following objectives.

1.  Support the mass transfer of legacy data or old school records onto a manageable data management platform.

2.  Become a true time-saving teaching tool for teachers, not only tracking student performance but also delivering the right customized lessons to help improve learning.

3.  Provide robust support for assessment and analysis from the classroom level all the way to the district level.

4.  Provide the school with comprehensive customer service that would help us get through the inevitable hiccups with a transition of this scale.

Our search led us to a New York-based company called LinkIt!, which met the requirements and provided us with an unprecedented ability to group students based on specific skills and standards deficits. LinkIt! also gave our teachers access to an extensive lesson library featuring curriculum and content from many of the leading publishers. While online lesson resources are increasingly abundant, LinkIt! offered a unique mechanism of providing access to these resources directly via student performance reports through its recommendation engine, saving time and guesswork for many of our teachers.

Choosing the right vendor in an experiment like this proved to be critical. The experiment could not have succeeded had the company not been willing to provide in-depth training and attentive service to help school administration and staff feel comfortable amid a challenging change.

Testing Strategy

We implemented the technology and held an initial round of testing early in the school year with what we called “pre-tests” administered in late October and early November. The driving logic was that initial testing towards the beginning of the school year would act as a comparison point for test results during and at the end of the school year.

Our vendor helped us design all the tests to mirror the content and wording of the New Jersey Assessment of Skills and Knowledge (NJ ASK), the statewide end-of-year tests. Our intention was to give students multiple opportunities throughout the year to demonstrate mastery of standards on these assessments, a process which helped us prepare for the PARCC requirement to track student achievement in relation to state standards at regular intervals.


Training Strategy

Teacher training was by far one of the most difficult aspects of implementing this trial. Much like the varying abilities of our students, teachers had differing levels of technological proficiency, which made it necessary to create a system for individualized education. Fittingly, this helped underscore the importance of using technology to help offer this individualized learning for our students.

Knowing the power of first impressions, we were determined to introduce the program to our teachers in a way that made them feel comfortable and excited. We organized a turnkey training system whereby we identified key teachers that would receive in-depth training from LinkIt!. These educators were chosen based on two prerequisites: credibility with their colleagues and proficiency with technology. Once they had a working knowledge of the LinkIt! platform, they led training sessions for fellow teachers and educational staff.

The school formed a committee to help with the training process, making the decision to focus on how to generate, read and apply the five most popular reports that could be generated through the platform. Teachers and LinkIt! technical support worked together to create step-by-step instructions for each kind of report. Focusing on the most frequently used reporting tools helped streamline the training process and also helped teachers to feel that all of their time was being used effectively to build only the skills they would need.

Professional Learning Groups

Following the initial training program, professional learning groups continued the teachers’ learning process. These groups gave teachers opportunities to teach and learn from each other, as well as to analyze and use the data generated by the technology to support student learning.

It’s important to note the success of the small groups hinged on providing teachers an outlet to meet with one another within the workday. Mt. Olive took the initiative to refocus free periods around data assessment. Teachers could take time to consider the data and how to incorporate that information for the benefit of their classrooms. We felt it would have been an unrealistic expectation for already busy teachers to learn an entirely new model of data-based teaching and the specifics of using the platform without providing them the extra time required. We encourage all schools to maintain this sensitivity to teachers’ schedules to avoid causing burnout.

We created two types of professional learning groups to support ongoing learning on how to implement the technology and data into teaching methods. One type of group brought together teachers of differing subjects who taught the same students so they could look at the report data from the student-centric perspective. The second type of group brought together teachers within a given concentration so they could share reports across classrooms and disseminate any useful observations and tips to help improve learning within that topic area.

Within professional learning groups, teachers were asked to identify what we called S.M.A.R.T. goals, goals that are specific, measurable, attainable, supported with resources, and time-specific, to help students improve on their pre-test scores. With these goals in place, teachers used professional learning groups as an open forum to share triumphs, setbacks and lessons learned. Successes were continuously reinforced, while failures to meet goals were treated as opportunities to reflect on how to approach the situation differently next time. Most importantly, it became clear that data should become the basis for celebration, not reprimand. When shared openly, data helped encourage teachers to reflect and work together to find the most effective methods to help students improve; essentially, data became a tool for teachers to hone their craft.

The professional learning groups were hugely successful in achieving teacher buy-in. Teachers trust the judgment of their fellow teachers, and the opportunity to grow with one another helped the teachers feel supported, as well as part of a larger collective effort to pioneer a new model for education.

The Results

In late April, Mt. Olive held year-end tests using the technology platform. The data was interpreted using a ‘quintile analysis’ made easy by the technology platform. Stated simply, this analysis ranks students by their pre-test scores and divides the school population into five equal groups, or quintiles. For each group, we calculated the percentage of students who improved, as well as the mobility of students between groups, indicating advancement into higher-performing groups.
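For readers who want to try a similar analysis, the steps above can be sketched in a few lines of code. This is a minimal illustration with hypothetical scores, not the platform's actual implementation: students are ranked by pre-test score and split into five equal groups (Quintile 1 for the highest performers, Quintile 5 for the lowest, matching the usage in this article), then we compute the share of students showing positive growth in each quintile and the movement between pre-test and post-test quintiles.

```python
def assign_quintiles(scores):
    """Map each student to a quintile (1 = top fifth) by score rank."""
    ranked = sorted(scores, key=scores.get, reverse=True)
    group_size = len(ranked) / 5
    return {s: min(int(i / group_size) + 1, 5) for i, s in enumerate(ranked)}

def quintile_growth(pre, post):
    """Percent of students in each pre-test quintile with positive growth."""
    quintile_of = assign_quintiles(pre)
    flags = {n: [] for n in range(1, 6)}
    for student, q in quintile_of.items():
        flags[q].append(post[student] > pre[student])
    return {n: 100 * sum(f) / len(f) for n, f in flags.items() if f}

def mobility(pre, post):
    """Count student movements between pre-test and post-test quintiles."""
    q_pre, q_post = assign_quintiles(pre), assign_quintiles(post)
    moves = {}
    for student in pre:
        key = (q_pre[student], q_post[student])
        moves[key] = moves.get(key, 0) + 1
    return moves

# Hypothetical data: 10 students with pre- and post-test scores out of 100.
pre = {"s%d" % i: score for i, score in
       enumerate([95, 90, 82, 80, 71, 70, 62, 60, 51, 45])}
post = {"s%d" % i: score for i, score in
        enumerate([96, 88, 85, 81, 74, 72, 65, 66, 58, 55])}

print(quintile_growth(pre, post))  # growth rate per quintile
print(mobility(pre, post))         # (from, to) quintile transition counts
```

A dedicated platform adds reporting and visualization on top, but the underlying bookkeeping is no more complicated than this.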

When measuring growth within groups, we determined that 97 percent of students showed positive growth, and no quintile experienced negative growth. From the highest to the lowest performing students, across the board they experienced improvement. The lowest achieving quintiles tended to experience the highest percentage of growth, particularly in English language acquisition. While all groups showed positive growth, growth was uneven across grades and subjects. In ELA grades 6-8, 83 percent of students showed growth (with 17 percent of students showing zero or negative growth). In Math grades 6-8, 67 percent of students showed growth (with 33 percent showing zero or negative growth). It is clear that ascertaining where growth is being made (by which students, on which achievements, with which teachers) is integral to figuring out where improvement is needed.

We found that scores on the first assessment were highly predictive of where students scored on the second tests. In regard to mobility across quintiles, the majority of Quintile 1 students remained within Quintile 1 (59 percent in ELA, 68 percent in Math), and any exceptions dropped into Quintile 2.

Although Quintile 5 students showed widespread improvement in their test scores (70 percent in ELA, 71 percent in Math), they were not likely to move out of Quintile 5, so it was clear that we needed to think more deeply about how we were addressing the needs of our lowest performers. Across the middle quintiles there was more fluidity: students were likely to remain in their quintile in math but fluctuated in ELA.

5 Takeaways Moving Forward

The experiment helped us achieve many things, some of the most important takeaways being:

1.  Technology implementation for online or computer-based testing should be thought of as a long-term endeavor, with vendor selection being an integral part of the process. It should be started as soon as possible for compliance with the 2014-15 PARCC deadline.

2.  These technological changes are beginning to introduce a model of data-driven education that encourages teachers to formulate lessons based on evidence of student performance, not by instinct or the class average.

3.  Teachers must be supported throughout this process with resources, time and training. Only when teachers succeed can our students succeed. Having the right technology is a crucial part of the puzzle, but the process that you build around using data is equally important to your success.

4.  Teachers are more willing, open and eager to engage in data-driven teaching when the data is shared in collaborative forums that encourage knowledge sharing of effective teaching methods, not as bases for reprimand.

5.  Teachers felt affirmed in both their own teaching skills and the technology when results showed that students achieved improvement across the board. Despite the fact that there is still plenty of work to do to improve student outcomes, the gains we experienced helped to reinforce belief in and commitment to our process.

Mt. Olive’s experiment demonstrates that implementing the right data collection and assessment technology can drive meaningful change. By coupling that technology with the right level of support for teachers, including opportunities for professional collaboration, Mt. Olive achieved a high level of teacher buy-in and did not falter when the inevitable obstacles arose.

We hope that what we’ve learned will be helpful on your school’s journey and will build trust that sometimes, change is just what we need.

Image attribution: Flickr users flickeringbrad and woodleywonderworks