
Lessons Learned: Launching an SLO Initiative

By Jo-ne Bourassa

In Georgia, as in many states, approximately 75 percent of teachers spend at least part of the instructional day teaching subjects that are not assessed by state tests. To meet the student growth and academic achievement component of Georgia’s Teacher Keys Effectiveness System, teachers of these non-tested subjects must implement student learning objectives (SLOs) to gauge student growth.

As one of the original 26 Race to the Top districts in Georgia, Bibb County School District jumped in early to launch an SLO initiative. In the first year of the pilot, 2012-13, the Georgia Department of Education required that 52 SLOs be given. To help districts prepare, the DOE provided training in assessment alignment and SLO creation.

In Bibb County, we decided to use a pre-test/post-test format to determine student growth over a semester or school year. Several challenges became apparent during year one.

Year One Challenges

To start, we faced a steep learning curve and a massive amount of work. Under Georgia’s system, student growth and academic achievement are measured by student growth percentiles in tested subjects, or by SLOs in non-tested subjects. Only 25 percent of our courses provide growth percentiles through Georgia’s Criterion-Referenced Competency Tests in grades 4-8, or end-of-course tests in high school. That meant we were now responsible for creating and administering SLO assessments, and arriving at SLO scores, for 75 percent of our courses.

To save time, we initially used public-domain SLO assessments created by other districts. This, unfortunately, meant our teachers felt no ownership of the materials. In addition, administering and grading the 52 pre- and post-tests brought almost all other activities to a halt. The tests took two to four days to administer, all via paper and pencil, which consumed our paper and copying budgets. The student scores, about 52,000 in all, were collected by hand in an Excel spreadsheet and sent to the central office for summarizing.

Mid-Year Changes

By the end of the first semester in year one, it became clear that teachers, students, and parents did not take the SLO assessments seriously. They even joked and complained about students taking “SLOW” tests.

So, in December 2012, we decided to change our SLOs to GLOs, or growth learning objectives. We also switched out the labor-intensive assessments for instruments we already had for PreK-3: AIMSweb for reading and math in grades 1-3; district writing assessments for English language arts in grades 1-3; GKIDS portfolio pieces for kindergarten ELA, math, and reading; and Bright from the Start portfolio pieces for PreK literacy and numeracy. We had to live with the assessments in the other grade levels until we could write our own.

Year Two Revisions

For 2013-14, our district was required to have at least one growth measure for every certified teacher from pre-kindergarten through grade 12. This included P.E., fine arts, and career, technical and agricultural education (CTAE) teachers, as well as any course for which a teacher did not already have a GLO or student growth percentile. This necessitated the creation of 100-plus additional GLO assessments.

We knew this would be a nearly impossible task without technology. So, after issuing an RFP and evaluating several systems, in January 2013 we selected the SLO Module from Performance Matters, along with the company’s assessment and data management system. Then, from February to July, we revised and developed 100-plus GLO assessments in ELA, math, science, social studies, P.E., fine arts, and CTAE.

In June and July 2013, we conducted training on the Performance Matters platform for our administrators and testing coordinators. Then, in August, we administered the GLO pre-tests in all 41 schools via plain-paper scanning and online testing. Instead of our having to collect the pre-test data on a spreadsheet, the results were automatically available in the company’s system, which also made them easier to send to the DOE. In addition, our teachers could now see the baseline assessment results and growth targets for each student. This allowed them to more easily monitor students’ progress toward the growth target during the school year, and to identify what their students needed and which standards to focus on to reach that target.

We went through the same process to administer the post-tests. Even though we were now administering twice as many assessments, we saved a significant amount of time and energy over the previous year, when we had to gather and analyze the data by hand. Our teachers and administrators could now access the target score and the pre- and post-assessment results for each student under each SLO. With the SLO Module’s automatic calculations, teachers could see whether each student met the SLO, as well as the overall percentage of students achieving the SLO by class or by course. In addition, school leaders could use the data to place teachers in the appropriate rating category (exemplary, proficient, needs development, or ineffective) on the SLO portion of their annual evaluations.
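
The arithmetic behind those roll-ups is simple enough to sketch. The Python below is a hypothetical illustration of the kind of calculation the module automates, not Performance Matters’ actual logic: the field names, the met-the-target rule (a post-test score at or above a per-student growth target), and the rating cut points are all assumptions made for the example.

    from dataclasses import dataclass

    @dataclass
    class StudentResult:
        # One student's scores under a single SLO (field names are illustrative).
        pre_score: float
        growth_target: float  # per-student target set from the baseline result
        post_score: float

        def met_slo(self) -> bool:
            # Assumed rule: the post-test score must reach the growth target.
            return self.post_score >= self.growth_target

    def percent_meeting_slo(results: list[StudentResult]) -> float:
        # Percentage of a class or course meeting the SLO.
        if not results:
            return 0.0
        return 100.0 * sum(r.met_slo() for r in results) / len(results)

    def rating_category(pct: float) -> str:
        # Hypothetical cut points; Georgia's actual rating bands differ.
        if pct >= 90:
            return "exemplary"
        if pct >= 70:
            return "proficient"
        if pct >= 50:
            return "needs development"
        return "ineffective"

    # Example: two of three students reach their growth targets.
    results = [
        StudentResult(pre_score=40, growth_target=60, post_score=72),
        StudentResult(pre_score=55, growth_target=70, post_score=68),
        StudentResult(pre_score=30, growth_target=50, post_score=51),
    ]
    pct = percent_meeting_slo(results)
    print(f"{pct:.0f}% met the SLO -> {rating_category(pct)}")

Small as this calculation is, running it by hand across 100-plus GLOs and 41 schools is exactly the tallying that consumed year one.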

Lessons Learned

Here are some key takeaways from our successful launch of an SLO initiative.

  • Develop clear and concise test administration instructions to guide school testing coordinators and teachers through the pre- and post-test process.
    We created a spreadsheet with “administration notes” for each course. These notes instruct testing coordinators and teachers how to administer and score each test, which eliminates confusion and ensures consistency in each course.
  • Form two teams of teachers to develop SLO assessments—a team of writers and a team of reviewers.
    Having a team of writers and a separate team of reviewers not only improves the quality of the assessments, but it also encourages teacher buy-in since they are actively involved in the test-creation process.
  • Train teachers on test development and assessment alignment.
    Within each course, teachers should be able to determine which standards are most important for students to master in order to succeed in the course and prepare for the next grade level. They should be able to plan what percentage of course time to spend on each of these “power standards,” taking into account the percentage of the test that will cover each one. They should be able to dissect each standard and identify exactly what students must know, understand, and be able to do to demonstrate mastery. They should also be able to articulate the depth of knowledge (DOK) of each standard, so each assessment item can match that level of cognitive complexity.
  • Allow at least six months, and a large team of teachers, for developing, reviewing, and entering the SLO assessments.
    It is also important to gather input from teachers of English-language learners and students with disabilities who can provide insight into what, if any, modifications might be required to meet specific needs or individualized education plans.
  • Create an item template to ensure consistency.
    Instead of purchasing an item bank, we collected items from a variety of sources, including state-released tests, or developed our own. Each item was created using a template with the following fill-in-the-blank components: question number, DOK level (1-4), curriculum code, question stem, answer choices A-D, and correct answer. Having this template made it much easier to review and vet the items before we put them into the assessment and data management system. (A sketch of what such a template might look like in code follows this list.)
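
To make the template concrete, here is a hypothetical rendering of those fill-in-the-blank components as a small Python record with basic validation. Our actual template was a document, not code, so the structure and checks below are an illustration only, and the curriculum code in the example is made up.

    from dataclasses import dataclass

    @dataclass
    class AssessmentItem:
        # One test item, mirroring the template's fill-in-the-blank components.
        question_number: int
        dok_level: int                  # depth of knowledge, 1-4
        curriculum_code: str            # standard identifier (made up below)
        question_stem: str
        answer_choices: dict[str, str]  # keyed "A" through "D"
        correct_answer: str             # one of "A"-"D"

        def validate(self) -> None:
            if not 1 <= self.dok_level <= 4:
                raise ValueError("DOK level must be 1-4")
            if set(self.answer_choices) != {"A", "B", "C", "D"}:
                raise ValueError("exactly four answer choices, A-D, are required")
            if self.correct_answer not in self.answer_choices:
                raise ValueError("correct answer must be one of the choices")

    # Example item with made-up content:
    item = AssessmentItem(
        question_number=1,
        dok_level=2,
        curriculum_code="ELA.4.RI.1",
        question_stem="Which detail best supports the passage's main idea?",
        answer_choices={"A": "...", "B": "...", "C": "...", "D": "..."},
        correct_answer="B",
    )
    item.validate()  # raises ValueError if any component is malformed

Catching a malformed item automatically, before it ever reaches a reviewer, is the same consistency the template gave us on paper.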

When we launched our GLO initiative more than two years ago, we had no idea how much work it would be to create and administer the assessments, and then crunch the data for each GLO to determine if students achieved the academic goals set at the beginning of each course. The use of technology has allowed us to automate and streamline the GLO process and ensure more accurate calculations for effectiveness ratings for our teachers. It also gives our teachers easier access to the data they need to inform their instruction, so they can meet the primary purpose of the SLOs — to improve student learning in the classroom.

Dr. Jo-ne Bourassa is the Race to the Top coordinator for the Bibb County School District in Macon, Georgia.

Image credit: Zero Creatives/Media Bakery

