Excellence

"High Achievement always takes place in the framework of high expectation." - Charles Kettering



Saturday, April 2, 2011

Adequate Yearly Progress in Reading May Elude JC McKenna Disabled Students


  • The Intangible Costs of Relentlessly Pursuing NCLB Achievement Goals

It's been an interesting few days since I woke to the Tuesday morning news of the Janesville Public School District trumpeting victory with its WKCE improvement data on page one of the Gazette. I quickly shot off emails to the principals in the district asking for Evansville data. I got one response that the data were "embargoed until April 7 or something like that." Okay... The skeptic/reporter/data hound in me went on high alert. On a whim, I looked for Evansville's 2010 WKCE data on the DPI website in its WINSS system. Et voilà, there they were. To be fair to my source, they expressed astonishment that the DPI would ask for the data to be embargoed yet publish them on its own website.


If you click on the post, you will be directed to the website where any achievement data you please can be accessed. Enter Evansville in the District field and hit enter. The data can be viewed a number of ways. Please be aware that looking at the data, while informative, is not in itself sufficient to draw any conclusions regarding Adequate Yearly Progress (AYP). Every year I forget the definition of AYP and have to review the method by which it is calculated. I forgot to do this last year and got into trouble. Like many government initiatives, it really is more complicated than it needs to be.


There are four components to achieving AYP: 1) reading achievement must meet the annual measurable objective; 2) math achievement must meet the annual measurable objective; 3) 95% or more of students must participate in achievement testing; and 4) the graduation rate and attendance rate must be 85% or more, or show a 2% increase over the previous year. These criteria must be met not only at the district level and the school level, but also for any student subgroup with a population of over 40 students (such as disabled or economically disadvantaged students).


When we all read in the news about WKCE achievement as compared to the No Child Left Behind (NCLB) targets, the data are usually presented as "the percent of students scoring proficient or advanced." That simple sum is not how the "proficiency index" is calculated. Districts receive WKCE results in four categories: Minimum, Basic, Proficient and Advanced. At the beginning of this NCLB journey, the target values for Wisconsin public school districts to achieve in reading and math were established as "Annual Measurable Objectives (AMO) for the proficiency index." This year those values are 80.5% for Reading and 68.5% for Math. To determine whether a district, school or subgroup has achieved AYP, the reader must calculate the Proficiency Index (PI) by adding the percentage of students scoring Proficient, the percentage scoring Advanced, and half the percentage scoring Basic on the WKCE. To illustrate, if a certain school had 20% Basic, 35% Proficient and 25% Advanced achievement in Math, the PI would be 35 + 25 + 0.5(20), or 70%. A school with these Math scores therefore meets the minimum target value of 68.5%.
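
For anyone who wants to see the arithmetic spelled out, here is a minimal sketch of that calculation in Python (the function name is mine; the AMO value is the one quoted above):

    def proficiency_index(pct_basic, pct_proficient, pct_advanced):
        # PI = %Proficient + %Advanced + half of %Basic
        return pct_proficient + pct_advanced + 0.5 * pct_basic

    # Worked example from above: 20% Basic, 35% Proficient, 25% Advanced in Math.
    pi_math = proficiency_index(20, 35, 25)   # 35 + 25 + 0.5*20 = 70.0
    print(pi_math >= 68.5)                    # True: meets this year's Math AMO of 68.5%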


Since I began doing all of these calculations and trying to post this data, I have had technical difficulties. I will spare you the gory details and cut to the chase, to which I hear a collective sigh of relief. I really do love playing with this data, but conclude that it's pretty boring to a good chunk of the population. I am going to qualify my data with the remark that I may or may not completely understand the final step in designation of AYP status. Here's a summary of what I know:



  • Calculate the Proficiency Index values for Math Achievement (MA) and Reading Achievement (RA) for each school and for each subgroup for which 40 or more students are present.

  • Determine if the MA and RA values are at or above the AMOs or fall within statistically measured confidence intervals. If yes, AYP is achieved.

  • If not, perform a Safe Harbor calculation comparing the Basic plus Minimum achievement for the present year with that of the immediately preceding year (SH1).

  • If SH1 shows that the Basic plus Minimum achievement is reduced by 10% or more, AYP has been achieved.

  • If AYP is still elusive, recalculate the Safe Harbor designation by comparing the percentage of Basic plus Minimum achievement for the current year with an average of the Basic plus Minimum achievement for the two prior years (SH2). If SH2 shows a decrease of 10% or more in the Basic plus Minimum achievement categories, AYP has been achieved.

  • If the proficiency index, SH1 and SH2 tests all fall short, AYP is not met (a rough code sketch of this decision chain follows the list).
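
To make that decision chain concrete, here is a rough Python sketch of how I understand these steps. The function name and parameters are mine, I read "reduced by 10%" as a 10% relative reduction, and the confidence-interval allowance is a placeholder since I don't know its value for our district, so treat this as an illustration rather than the DPI's official algorithm:

    def meets_ayp(pi, amo, basic_min_now, basic_min_prev, basic_min_two_prior, ci_margin=0.0):
        # Step 1: the proficiency index is at or above the AMO (or within the confidence interval).
        if pi >= amo - ci_margin:
            return True
        # SH1: Basic plus Minimum reduced by 10% or more compared with last year.
        if basic_min_now <= 0.9 * basic_min_prev:
            return True
        # SH2: the same test against the average of the two prior years.
        if basic_min_now <= 0.9 * (basic_min_prev + basic_min_two_prior) / 2:
            return True
        # All three tests failed: AYP is not met.
        return False

    # Hypothetical example: a PI of 75 against the 80.5 Reading AMO, with Basic plus Minimum
    # dropping from 30% last year to 26% this year (a 13% relative reduction) -- SH1 saves it.
    print(meets_ayp(75, 80.5, 26, 30, 32))   # True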

There are two parts I am unclear on. The first is the SH2 calculation, because I only found it once in the literature and can't find it again. The second is what the statistical confidence intervals are for our district. I will summarize the steps and the AYP status at each step for the district.


Due to statistical limitations, I only ran the data on a school-wide basis for TRIS, JC McKenna and EHS. I was concerned with borderline achievement levels last year for the Disabled and Economically Disadvantaged students, so I focused on those areas. The target values this year are much higher than last year's, so it was reasonable to check these data first.


When I calculated the proficiency index for these groups at each school, reading and math achievement for economically disadvantaged students met AYP outright in every case. The straight proficiency calculation for the disabled population, however, failed the strict comparison in both subjects at all three schools. The Math result at JC McKenna was very close, though, and I'm sure it will meet AYP once the confidence interval is applied. Safe Harbor 1 calculations bring EHS into compliance and SH2 calculations bring TRIS into compliance. Even after the SH2 calculation, JC McKenna's reading scores for disabled students fail to meet AYP.


What does this all mean in these trying times? The first observation is that overcoming a failure to meet AYP now becomes extremely challenging, as each district faces a 6.5-point increase in the Reading target and a 10.5-point increase in the Math target annually until 2014, when POOF, everyone will magically become proficient in Reading and Math. HAH! Borderline performance this year bodes ill for future success. Part of me screams that the NCLB Act is mind-numbingly insane, while the scientific part of me acknowledges that data can be powerful. Then I get ticked off at the Wisconsin DPI for making it so damned easy to meet AYP early on and lulling districts into a false sense of security.
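
As a quick check of that trajectory, here is the target ramp implied by those increments, assuming the same increase is applied each year through 2013-14:

    # AMO ramp implied by the increases quoted above, starting from this year's targets.
    reading_amo, math_amo = 80.5, 68.5
    for year in (2012, 2013, 2014):
        reading_amo += 6.5
        math_amo += 10.5
        print(year, reading_amo, math_amo)   # both targets hit 100.0 in 2014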


There are real issues with the integrity of the WKCE and with its ability to measure anything resembling proficiency in the first place. I attended a seminar at the School Board Convention in 2008 at which one of the co-creators of the WKCE stated that the original test, developed using teacher experts to designate what knowledge constitutes each achievement category, had to be scrapped because so few students could score proficient or advanced on it. The current test is a watered-down version of what students are really expected to learn at each grade level.


There are also statistical challenges faced by smaller districts like Evansville. Each class runs 130-150 kids, so fluctuations in standardized test results are much wider. Each class has its own character, which can sometimes be inferred from test results. Comparing the data for this year's fourth-grade class with last year's fourth-grade class is meaningless in terms of helping teachers become more effective or enabling students to learn more. But that is exactly what the NCLB data crunching calls for. And yet the fundamental question, "Are students progressing?", is not addressed. Like so many educational initiatives, this enormous program was thrown into action with no adequate baseline and a very nebulous goal: to make every kid in America proficient by 2014. In what, pray tell? Taking standardized tests? By whose definition of proficient? I'm sure my definition of proficient differs from yours, just as Florida's differs from Wisconsin's. Without added resources for this, the latest in a long line of unfunded federal mandates, it is guaranteed to fail.


Aside from the obvious costs incurred by districts, states and the feds to administer the test, evaluate results and implement changes to address deficiencies, there are countless intangible consequences of the NCLB Act. Class time spent preparing for and then taking the tests takes away from valuable learning time, all for an impotent test like the WKCE that every player knows is ineffective. The schools try to mitigate that by also administering a useful test, the MAP, thereby tripling the work, since the MAP is given twice a year. The MAP yields individual and aggregate data that show progress within a group rather than in comparison to another group, but it is not the "authorized testing method," so it can't be used instead of the WKCE. There is also the potential problem of teaching to the test instead of to what students should know or are interested in. There isn't any time to explore a subject in more depth if the class wants to, because teachers must push on to the next topic. That is NOT education. It is rote memorization that doesn't teach anybody anything.


I noticed one of those hard-to-pin-down consequences of NCLB last year as I reviewed WKCE data. I sensed a trend suggesting that gains in achievement for groups with borderline results were accompanied by losses in achievement in the general student population. To get a handle on how to measure this phenomenon, I postulated that a simple sum of the percent proficient plus percent advanced (%P+A) would be an easy place to start. Subtracting the 2009 value from its 2010 counterpart measures the change in achievement level for each of the four student groups: Economically Disadvantaged, Not Economically Disadvantaged, Disabled, and Not Disabled. A positive result would indicate increased achievement and a negative result would indicate a loss of achievement. This was done for each of the five academic categories tested (Reading, Math, Language, Science and Social Studies). I used the three subjects tested only at grades 4, 8 and 10 as a sort of "canary in the coal mine": if trouble showed up there while the reading and math scores were being strong-armed into submission, it could be a sign that my concerns are well founded.
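
Here is a small sketch of that quick-and-dirty index; the function name and the numbers in the example are hypothetical, just to show which direction the sign points:

    def pa_change(prof_2009, adv_2009, prof_2010, adv_2010):
        # Change in %Proficient + %Advanced from 2009 to 2010; positive means a gain.
        return (prof_2010 + adv_2010) - (prof_2009 + adv_2009)

    # Hypothetical numbers: a subgroup at 40% Proficient / 15% Advanced in 2009
    # and 38% Proficient / 14% Advanced in 2010 shows a 3-point loss.
    print(pa_change(40, 15, 38, 14))   # -3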


EHS showed across-the-board improvements in all 20 measurements using the "Hammann Quick and Dirty Evaluation of Student Achievement Index (HQDESAI)." This is especially notable because the high school only tests one grade (10) and is not statistically in danger of losing AYP, given the low numbers in its at-risk populations. Each and every one of the four groups listed above had gains, some of impressive magnitude. Excellent work, EHS! Kudos for the turnaround.


JC McKenna posted gains in 12/20 of these measures. Scrutiny of the areas of loss showed Not Economically Disadvantaged students losing ground on 3/5 measures while Economically Disadvantaged students gained across all five subjects. A similar story plays out between the Disabled and Not Disabled groups. The Not Disabled students had gains only in Reading, with losses posted in the four remaining subjects. Conversely, Disabled student achievement showed sometimes impressive gains in exactly the subjects where their Not Disabled counterparts lost ground, while losing ground in Reading. It is a textbook case of what happens when resources are reassigned instead of added to solve a problem: all hell breaks loose where the vacuum forms.


TRIS posted gains in 14/20 categories. Losses in proficiency at TRIS (using the HQDESAI) were more evenly distributed between the economic subgroups. Disabled student achievement posted losses in Science and Social Studies while Not Disabled student achievement showed gains in all five subjects. I don't know the details of this issue, but on principle it concerns me because the district specifically added special education resources at TRIS. The consistent issue at TRIS, however, is the across-the-board, significant loss in Social Studies achievement. The district recently realigned its Social Studies curriculum to better meet state standards. Maybe it's growing pains, but I imagine this will be addressed at future teacher in-service days at TRIS. The other schools posted universally large gains in Social Studies achievement, so the realignment seems to be working at the other levels of instruction across the district.


How can we help all students achieve their best and how do we measure that achievement to guarantee that they are achieving? It is a conundrum. I know there is a big "differentiated instruction" bandwagon in the district right now. I don't think it's been effectively implemented across the district. Whatever they're doing at the High School seems to be effective, though. Maybe that model can be extended to the rest of the district to help our struggling schools regain and maintain AYP. Evansville has great teachers and outstanding students. I'm confident that these passionate educators will find a way to optimize the quality of education during these trying times.
