Don’t Study?


The summative assessments required by ESSA do not track the growth of individual students from one year to the next, and that is a problem. Under the No Child Left Behind Act, Adequate Yearly Progress (AYP) determinations were based on whether students in one grade at a particular school performed better than the cohort of students who had been in that grade the previous year (Zimmerman & DiBenedetto, 2008). The severe penalties attached to failure to make AYP by this inappropriate standard drove desperate searches for quick-fix programs, increased referrals to special education despite widespread discussion and implementation of the principles of Response to Intervention, and significant class time spent on repeated practice of test-taking skills. All of this, of course, reduced instructional time.

There may be more systemic ways of improving overall school performance. Reeves (2000) analyzed more than 180 schools that faced distinct challenges with respect to AYP. Among them was a group that shared three characteristics: 90% or more of the students were eligible for free or reduced-price lunch, 90% or more were members of an ethnic minority group, and 90% or more had met state or district-wide standards for proficiency in reading or another core area. Systematic comparison of these schools revealed common features that distinguished them from lower-performing schools.

Among those distinguishing characteristics was what Reeves (2000) termed a “laser-like” focus on academic achievement. The ‘90/90/90’ schools made specific learning objectives explicit to students; displayed learning charts, benchmarks, standards, and exemplars throughout the school; and charted student progress on those specific learning goals at least weekly through formative assessment. Mastery learning was a key factor in proficiency assessment (Reeves, 2000; Zimmerman & DiBenedetto, 2008). Teachers did not “cover” the curriculum. Rather, core skills were emphasized, given ample time, analyzed through Curriculum-Based Measurement, and re-taught to mastery when necessary.

Frequent formative assessment through CBM provides multiple opportunities for improvement without penalty. In the traditional model, the test is administered at the end of the unit, and all students, regardless of the grades they earned, move on to the next unit. There is no motivation for improvement, because the opportunity to display competence is lost (Reeves, 2000).

A substantial body of research indicates that the limitations of summative assessment can be overcome through systematic CBM (Davis & Fuchs, 1995; Pemberton, 2003; Safer & Fleischman, 2005; Stecker & Fuchs, 2000; Wesson, 2001; Ysseldyke & Bolt, 2007; Zimmerman & DiBenedetto, 2008). Curriculum-Based Measurement, also called Progress Monitoring (PM), entails the timely and systematic sampling of specific academic or behavioral competencies. The samples, or “probes,” are administered frequently, within the context of the lesson, and are taken directly from the material to be learned. The learning targets are made explicit to students through graphic depiction of student trendline data compared against a goal line (Davis & Fuchs, 1995; Pemberton, 2003; Safer & Fleischman, 2005; Stecker & Fuchs, 2000, 2005; Stecker, Fuchs, & Fuchs, 2005; Ysseldyke & Bolt, 2007; Zimmerman & DiBenedetto, 2008). Standardized administration, benchmark target data, and systematic decision rules prompt teachers to make programmatic and instructional changes when the rate of progress is not sufficient to reach proficiency goals within the projected timeframe (Stecker & Fuchs, 2000).

When teachers employ systematic, data-based decision-making models using individual student outcomes, they make more timely, frequent, and specific instructional changes (Safer & Fleischman, 2005; Stecker, Fuchs, & Fuchs, 2005). When teachers use PM, students learn more, teacher decision making improves, and students become more aware of their own performance (McCurdy & Shapiro, 1992; Pemberton, 2003; Reeves, 2000; Safer & Fleischman, 2005; Stecker & Fuchs, 2000; Wesson, 2001; Ysseldyke & Bolt, 2007). Stecker & Fuchs (2000) found that teachers who made instructional changes in response to students’ specific performance data produced better outcomes than controls. Fuchs & Fuchs (2002) conducted a meta-analysis of controlled experimental studies on PM and concluded: “When teachers use systematic progress monitoring to track their students’ progress in reading, math and spelling, they are better able to identify students in need of additional or different forms of instruction, they design stronger instructional programs, and students achieve better.”
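
To make the decision-rule idea concrete, here is a minimal sketch in Python of how a goal line, a trendline, and two decision rules commonly described in the PM literature (a “four consecutive points” rule and a trend-versus-goal slope comparison) might fit together. This is an illustration under assumed conventions, not an implementation from any of the studies cited above; the function names, thresholds, and sample probe scores are invented for the example.

```python
# Illustrative CBM/PM decision-rule sketch. The four-point rule and the
# slope comparison are common conventions in the PM literature; the exact
# names and numbers here are assumptions made for this example.

def goal_line(baseline: float, goal: float, weeks_to_goal: int) -> list[float]:
    """Expected score each week if the student grows linearly toward the goal."""
    step = (goal - baseline) / weeks_to_goal
    return [baseline + step * week for week in range(weeks_to_goal + 1)]

def trend_slope(scores: list[float]) -> float:
    """Least-squares slope of the student's weekly probe scores."""
    n = len(scores)
    xs = range(n)
    mean_x = sum(xs) / n
    mean_y = sum(scores) / n
    num = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, scores))
    den = sum((x - mean_x) ** 2 for x in xs)
    return num / den

def decide(scores: list[float], baseline: float, goal: float,
           weeks_to_goal: int) -> str:
    """Apply two common decision rules after each new probe."""
    expected = goal_line(baseline, goal, weeks_to_goal)[:len(scores)]
    last4 = list(zip(scores, expected))[-4:]
    if len(last4) == 4 and all(s < e for s, e in last4):
        return "change instruction"   # four consecutive points below the goal line
    if len(last4) == 4 and all(s > e for s, e in last4):
        return "raise the goal"       # four consecutive points above the goal line
    needed = (goal - baseline) / weeks_to_goal
    if len(scores) >= 4 and trend_slope(scores) < needed:
        return "monitor closely"      # trend is flatter than the goal line
    return "continue program"

# Example: six weekly words-correct-per-minute probes for one student,
# aiming to move from 22 to 52 over a 30-week period.
print(decide(scores=[22, 24, 23, 25, 24, 26], baseline=22, goal=52,
             weeks_to_goal=30))  # -> "monitor closely"
```

The point of the sketch is that the decision rules are simple and mechanical, which is what allows them, whether kept on a paper chart or managed by software, to prompt timely instructional changes rather than leaving the judgment until the end-of-unit test.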

For example, Ysseldyke & Bolt (2007) investigated the use of Accelerated Math to monitor progress. They found that students whose teachers used PM with specific decision rules significantly outperformed students taught with the district math curriculum alone. In fact, students in the Accelerated Math condition made an average gain of 5.75 normal curve equivalent (NCE) units on the NWEA over the summer, six times the rate of growth shown over the previous school year. Because the computer-managed PM system included built-in decision rules and goal-raising mechanisms, it was easier to manage than teacher-made systems.

In addition to the specific, real-time diagnostic information PM systems provide to teachers, ongoing mastery-based formative assessment also changes the instructional landscape for students. Research in brain-based education indicates that students need to understand the goals of instruction and their specific skill targets, and Progress Monitoring makes these explicit. Frequent, performance-related assessment without penalty, and with continued opportunity for improvement, also maintains a productive level of arousal without fear or debilitating anxiety (Reeves, 2000; Zimmerman & DiBenedetto, 2008). In addition, PM provides frequent and highly specific feedback to the learner: students are able to view their graphed performance immediately (Davis & Fuchs, 1995; Stecker & Fuchs, 2000; Wesson, 2001; Ysseldyke & Bolt, 2007). Davis & Fuchs (1995) noted that frequent, relevant, graphed performance data may help students regulate their own learning and sustain their motivation because “they attribute success or failure to their effort, or lack of it, rather than forces over which they have little or no control.” Reeves (2000) noted that the ‘90/90/90’ schools provided significantly more feedback to students, in a timely and accurate manner. Wesson (2001) hypothesized that ongoing measurement procedures may increase the amount of practice on the target skills.

There may also be a number of brain-based factors that contribute to the effectiveness of PM. Roediger & Karpicke (2006) used reading passages to study the testing effect in an educational context. They found that immediate testing resulted in better long-term retention than repeated studying of the passage, and, notably, this occurred even though the tests included no feedback. Moreover, Roediger & Karpicke found that the effect in the testing condition was not attributable to added practice, because the re-studying condition provided re-exposure as well. Students in the repeated-testing condition recalled 21% more material after a week than students in the study condition, even though the study-condition students read the passage an average of 14.2 times compared with only 3.4 times for the repeated-testing students. Roediger & Karpicke (2006) concluded that practicing the retrieval skills needed at test during the learning phase is important to retention. They argued that testing is neglected in schools, and that frequent testing would require students to space their studying throughout the year rather than “massing” it just before a test. Spacing study and review over time provides a better cycle of consolidation and integration without decay of the material to be remembered (Cepeda et al., 2006).
