Friday, April 22, 2011
It's time for a reevaluation of the Human Growth and Development curriculum again. Check out the following link for details regarding this meeting on May 4 from 8 a.m. to 3 p.m.
Finally, there will be the regular Board Business Meeting on May 9 at 6 p.m. Much is happening in the district this time of year. The general mayhem surrounding the unsettled state budget has caused a lot of rumor and innuendo to fly around town. Much of it probably has a kernel of truth, but resist the urge to believe everything. If you hear that "they aren't going to replace any retiring/quitting teachers," the whole story is more likely "they aren't going to interview for such-and-such position until after the official budget is presented by the state," which means after July 1st, because that's when the state promised to have a solid budget proposal. Two of the three years I was on the board, the state didn't come through with final numbers until October. Imagine, if you will, the havoc that plays on the budget-setting needs of each and every district. How can you hire people in August if you don't know whether funds are available? It's a very difficult situation. So remember to take a deep breath and hope for the best. So far, the only teacher layoffs for next year have been due to declining enrollment. That may not be the case for 2012-13, which is shaping up to be even worse.
Why, you say, are we suddenly suffering the slings and arrows of outrageous fortune? It's not, as some would be happy to report, entirely the fault of Gov. Walker. His "budget repair" bill is a dominant player here, but the chickens are coming home to roost, à la the stimulus funds accepted two years ago. What did they hire the temporary special ed position at TRIS with? Stimulus funds. Where did the capital funds for a new handicapped-access van and new PT equipment come from? Stimulus funds. All in all, these funds inflated the budget to the tune of nearly a half-million dollars over two years. Add that to a loss of $550 per kid from the state (about a million) and a loss in enrollment of about 40 kids over the last three years (about a quarter of a million), and a perfect storm of budget misery strikes. Had it not been for the diligent efforts of the board to build up the Fund 10 balance and toe the line on certain expenditures, things would be a lot worse than they are, which is bad enough. So keep your wits about you and remain calm. Hard as it may be, trust the board and the administrators to be sensible in these dark days. And if all else fails, come to a board meeting!
Sunday, April 17, 2011
Thursday, April 7, 2011
Monday, April 4, 2011
I qualified the areas where I may be a bit shaky: the second-level Safe Harbor AYP definition and the width of the typical confidence intervals for our district. Other than that, I completely understand how to arrive at AYP conclusions. You can even find computational websites online entitled "Adequate Yearly Progress Calculator" if you don't trust your own ability to add, subtract, multiply and divide. I got sick of punching numbers and wrote a mini-program in Excel to do the SH2 calculations.
The district has a week or so to verify and qualify any data they are legally able to challenge before the DPI sends out the AYP notifications, at which point the DPI designation becomes official. I don't believe that disqualification of a few students at the Middle School will make any difference in their Reading AYP status. (I think I recall that only 2% of disabled students district wide may be disqualified from consideration in the AYP status calculations. I could be wrong on the exact value, but I know there is a very strict limitation here).
Some may criticize me for publicizing this before the DPI calls it official. I say, why wait when you know how to do the calculations? The city of Janesville was proud and trumpeted its news. The Democrats were very anxious to show just how impotent the Milwaukee voucher program results were and posted that data.
I have always been a take-charge kinda person. I propose that the DISTRICT should begin the process of determining the problems and proposing a solution. Scrutinize the data and listen carefully to what it tells you. Don't wait for the suits at the DPI to tell you that there is a problem. Find it, own it and solve it in a proactive way. I reiterate that my purpose in writing the AYP post was to illustrate the complete idiocy of the NCLB Act and how a committed district is negatively affected in many ways by the numerous intangible consequences of unfunded, scientifically invalid federal mandates.
Sunday, April 3, 2011
- VOTE VOTE VOTE VOTE VOTE VOTE VOTE VOTE VOTE VOTE VOTE VOTE!!!!!!!
Oh, did I mention to VOTE on Tuesday? No matter what your politics, you have a civic responsibility to make your voice heard. Don't like the Supreme Court Justice who can't keep his misogynist views to himself? Get out there and vote for his opponent. Is experience more important to you? Let your voice be heard on Tuesday or you lose all credibility when making noise at a later date. One of the great strengths of our nation is the unalienable right AND responsibility of every citizen to vote. Exercise it. My rant is done.
- VOTE VOTE VOTE VOTE VOTE VOTE VOTE VOTE VOTE VOTE VOTE VOTE!!!!!!!
Saturday, April 2, 2011
- The Intangible Costs of Relentlessly Pursuing NCLB Achievement Goals
It's been an interesting few days since I woke to the Tuesday morning news of the Janesville Public School District trumpeting victory with their WKCE improvement data on page one of the Gazette. I quickly shot off emails to the principals in the district asking for Evansville data. I got one response that the data were "embargoed until April 7 or something like that." Okay... The skeptic/reporter/data hound in me went on high alert. On a whim, I looked for Evansville's 2010 WKCE data on the DPI website in their WINSS system. Et voilà, there they were. To be fair to my source, they expressed astonishment at the DPI asking that the data be embargoed yet publishing it on their own website.
If you click on the post, you will be directed to the website where any achievement data you please can be accessed. Enter Evansville in the District field and hit enter. The data can be viewed a number of ways. Please be aware that observation of the data, while informative, is not in itself sufficient to draw any conclusions regarding Adequate Yearly Progress (AYP). Every year, I forget the definition of AYP and have to review the method by which it is calculated. I forgot to do this last year and got into trouble. Like many government initiatives, it really is more complicated than it needs to be.
There are four components to achieving AYP: 1) reading achievement must meet the annual measurable objective; 2) math achievement must meet the annual measurable objective; 3) 95% or more of students must participate in achievement testing; and 4) the graduation rate and attendance rate must be 85% or more, or show a 2% increase over the previous year. These criteria must be met not only at the district level and the school level, but also for any student subgroup with a population of over 40 students (such as disabled or economically disadvantaged students).
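The four components can be sketched as a single pass/fail check. This is my own illustrative sketch, not DPI code; the function and parameter names are invented, and I've read the "2% increase" criterion as a two-point gain over the prior year, which is an assumption.

```python
def meets_ayp_components(reading_pi, math_pi, participation,
                         grad_or_attend_rate, prior_year_rate,
                         reading_amo=80.5, math_amo=68.5):
    """Return True only if all four AYP components are satisfied.

    reading_pi / math_pi are proficiency index values (percent),
    participation is the test participation rate (percent), and
    grad_or_attend_rate is the graduation or attendance rate (percent).
    """
    reading_ok = reading_pi >= reading_amo
    math_ok = math_pi >= math_amo
    participation_ok = participation >= 95.0
    # 85% or better, OR (assumed) a 2-point gain over the previous year.
    other_ok = (grad_or_attend_rate >= 85.0
                or (grad_or_attend_rate - prior_year_rate) >= 2.0)
    return reading_ok and math_ok and participation_ok and other_ok
```

Remember that this check would have to pass not just district-wide and per school, but for every subgroup of more than 40 students.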
When we all read in the news about WKCE achievement as compared to the No Child Left Behind (NCLB) targets, the data are usually presented as "the percent of students scoring proficient or advanced." That simple sum is not how the "proficiency index" is calculated. Districts receive results from WKCE in four categories: Minimum, Basic, Proficient and Advanced. At the beginning of this NCLB journey, the target values for Wisconsin Public School Districts to achieve in Reading and Math were established as "Annual Measurable Objectives (AMO) for proficiency index." This year those values are 80.5% for Reading and 68.5% for Math. To ascertain whether or not any district or school or subgroup has achieved AYP, the reader must determine the Proficiency Index (PI) by adding up the percentage of students scoring Proficient and the percentage of students scoring Advanced and adding to that half the percentage value of students scoring Basic on the WKCE. To illustrate, if a certain school had 20% Basic, 35% Proficient and 25% Advanced achievement on Math, the PI would be 35+25+0.5(20), or 70%. This calculation indicates that a school with these Math scores meets the minimum target value of 68.5%.
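The proficiency index calculation above is simple enough to capture in a few lines; here is the worked example from the text, with the function name being my own:

```python
def proficiency_index(pct_basic, pct_proficient, pct_advanced):
    """Proficiency Index (PI): Proficient + Advanced + half of Basic."""
    return pct_proficient + pct_advanced + 0.5 * pct_basic

# The example from the text: 20% Basic, 35% Proficient, 25% Advanced.
pi = proficiency_index(20, 35, 25)
print(pi)                 # 70.0
print(pi >= 68.5)         # True: meets this year's Math AMO
```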
Since I began doing all of these calculations and trying to post this data, I have had technical difficulties. I will spare you the gory details and cut to the chase, to which I hear a collective sigh of relief. I really do love playing with this data, but conclude that it's pretty boring to a good chunk of the population. I am going to qualify my data with the remark that I may or may not completely understand the final step in designation of AYP status. Here's a summary of what I know:
- Calculate the Proficiency Index values for Math Achievement (MA) and Reading Achievement (RA) for each school and for each subgroup for which 40 or more students are present.
- Determine if the MA and RA values are at or above the AMOs or fall within statistically measured confidence intervals. If yes, AYP is achieved.
- If no, perform a Safe Harbor calculation comparing Basic-plus-Minimum achievement for the present year with the immediately preceding year (SH1).
- If SH1 shows that Basic-plus-Minimum achievement is reduced by 10% or more, AYP has been achieved.
- If AYP is still elusive, recalculate the Safe Harbor designation by comparing the percentage of Basic plus Minimum achievement for the current year with an average of the Basic plus Minimum achievement for the two prior years (SH2). If SH2 shows a decrease of 10% or more in the Basic plus Minimum achievement categories, AYP has been achieved.
- If the proficiency index, SH1 and SH2 all fail to meet AYP, AYP is not met.
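The steps above can be sketched as a decision cascade. A big caveat: as I note below, I'm not certain of the exact SH2 definition, and I've interpreted "reduced by 10%" as a relative reduction in the below-proficient share (which is my assumption); all names here are my own, not DPI terminology.

```python
def safe_harbor(current_below, baseline_below):
    """Safe Harbor: the Basic-plus-Minimum (below-proficient) share
    dropped by 10% or more relative to the baseline.
    Assumes a relative, not absolute, reduction."""
    if baseline_below == 0:
        return True  # nobody was below proficient to begin with
    return (baseline_below - current_below) / baseline_below >= 0.10

def meets_ayp(pi, amo, below_now, below_last_year, below_two_year_avg):
    """Cascade: straight PI comparison, then SH1, then SH2."""
    if pi >= amo:                                    # straight comparison
        return True
    if safe_harbor(below_now, below_last_year):      # SH1: vs. last year
        return True
    if safe_harbor(below_now, below_two_year_avg):   # SH2: vs. 2-year average
        return True
    return False                                     # AYP not met
```

(The real DPI procedure also applies confidence intervals at the straight-comparison step, which I've left out because I don't know their width for our district.)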
There are two parts I am unclear on. The first is the SH2 calculation, because I only found it once in the literature and can't find it again. The second is what the statistical confidence intervals are for our district. I will summarize the steps and the AYP status at each step for the district.
Due to statistical limitations, I only ran the data on a school wide basis for TRIS, JC McKenna and EHS. I was concerned with borderline achievement levels last year for the Disabled and Economically Disadvantaged students, so I focused on those areas. The target values this year are much higher than last and it was reasonable to check this data first.
When I calculated the proficiency index for these groups at each school, reading and math achievement for economically disadvantaged students met AYP outright at all three schools. The straight proficiency calculation for the disabled population, however, failed to meet the strict comparison in both subjects at all three schools. The Math result at JC McKenna was very close, though, and will meet AYP with the confidence interval, I'm sure. Safe Harbor 1 calculations bring EHS into compliance and SH2 calculations bring TRIS into compliance. Even after SH2 calculations, JC McKenna reading scores for disabled students fail to meet AYP.
What does this all mean in these trying times? The first observation is that overcoming a failure to meet AYP now becomes extremely challenging, as each district faces a 6.5-point increase in the Reading target and a 10.5-point increase in the Math target annually until 2014, when POOF, everyone will magically become proficient in Reading and Math. HAH! Borderline performance this year bodes ill for future success. Part of me screams that the NCLB Act is mind-numbingly insane, while the scientific part of me acknowledges that data can be powerful. Then I get ticked off at the Wisconsin DPI for making it so damned easy to meet AYP early on and lulling districts into a false sense of security.
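Those annual increases follow directly from the 2014 deadline; a quick back-of-the-envelope check, using this year's AMOs from the text:

```python
reading_amo, math_amo = 80.5, 68.5   # this year's AMOs, from the text
years_left = 3                        # 2011 through 2014

# Per-year target increases needed to reach 100% proficiency by 2014:
reading_step = (100 - reading_amo) / years_left
math_step = (100 - math_amo) / years_left
print(reading_step, math_step)        # 6.5 10.5
```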
There are real issues with the integrity of the WKCE and its ability to measure anything resembling proficiency to begin with. I attended a seminar at the School Board Convention in 2008 at which one of the co-creators of the WKCE stated that the original test that was developed using teacher experts to designate what constitutes the knowledge level of each achievement category had to be scrapped because so few students could achieve proficient or advanced on the test. It is a watered down version of what students are really expected to learn at each grade level.
There are statistical challenges faced by smaller districts like Evansville. Each class runs 130-150 kids, so fluctuations in standardized test results are much wider. Each class has its own character that can sometimes be inferred from test results. Comparing the data for this year's fourth-grade class with last year's fourth-grade class is meaningless in terms of helping teachers become more effective or enabling students to learn more. But that's exactly what the NCLB data crunching calls for. And yet, the fundamental question of "are students progressing?" is not addressed. Like so many educational initiatives, this enormous program was thrown into action with no adequate baseline and a very nebulous goal: to make every kid in America proficient by 2014. In what, pray tell? Taking standardized tests? By whose definition of proficient? I'm sure my definition of proficient differs from yours, just like Florida's is different from Wisconsin's. Without adding resources for this, the latest in a long line of unfunded federal mandates, it is guaranteed to fail.
Aside from the obvious costs incurred by districts, states and the feds to administer the test, evaluate results and implement changes to improve deficiencies, there are countless intangible consequences of the NCLB Act. Time spent in class preparing for and then taking the tests takes away from valuable learning time, all to take an impotent test like the WKCE that all the players know is ineffective. They try to mitigate this by administering a useful test like the MAP test, thereby tripling the work, since the MAP is given twice a year. They get individual and aggregate data from the MAP, which shows progress within a group rather than in comparison to another group. But it is not the "authorized testing method," so it can't be used instead of the WKCE. There is also the potential problem of teaching to the test instead of what students should know or are interested in. There isn't any time to explore a subject in more depth if the class wants to, because they must push on to the next topic. That is NOT education. It is rote memorization that doesn't teach anybody anything.
I noticed one of those hard-to-pin-down consequences of NCLB last year as I reviewed WKCE data. I sensed a trend suggesting that gains in achievement for groups with borderline results were accompanied by losses in achievement in the general student population. To get a handle on how to measure this phenomenon, I postulated that a simple sum of the percent proficient plus percent advanced achievement (%P+A) would be an easy way to start. Subtracting this value for 2009 from its counterpart for 2010 measures the change in achievement level for each of four student groups: Economically Disadvantaged, Not Economically Disadvantaged, Disabled, Not Disabled. A positive result would indicate increased achievement and a negative result would indicate a loss of achievement. This was done for each of the five academic categories tested (Reading, Math, Language, Science and Social Studies). I used the three subjects tested only at grades 4, 8 and 10 as a sort of "canary in the coal mine." If trouble showed up there while the district was trying to strong-arm the reading and math scores into submission, it could be a sign that my concerns are well founded.
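The measure described above (my "HQDESAI") is just a year-over-year difference of %P+A; a minimal sketch, with function names of my own invention and made-up sample percentages for illustration:

```python
def pct_prof_plus_adv(pct_proficient, pct_advanced):
    """The %P+A score: simple sum of Proficient and Advanced percentages."""
    return pct_proficient + pct_advanced

def achievement_change(curr_prof, curr_adv, prev_prof, prev_adv):
    """Change in %P+A from the prior year to the current year.
    Positive = gain in achievement; negative = loss."""
    return (pct_prof_plus_adv(curr_prof, curr_adv)
            - pct_prof_plus_adv(prev_prof, prev_adv))

# Illustrative numbers only (not actual Evansville data):
# 2010 at 40% Proficient / 20% Advanced vs. 2009 at 35% / 18%.
print(achievement_change(40, 20, 35, 18))   # 7: a 7-point gain
```

Run for the four groups across the five subjects, this yields the 20 measurements per school discussed below.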
EHS showed universal improvements for all students in all 20 measurements using the "Hammann Quick and Dirty Evaluation of Student Achievement Index (HQDESAI)." This is especially notable because the high school only tests one grade (10) and is not statistically in danger of losing AYP due to the low numbers of the at risk populations. Each and every one of the four groups stated above had gains; some of them were of impressive magnitude. Excellent work EHS! Kudos for the turn-around.
JC McKenna posted gains in 12 of 20 of these measures. Scrutiny of the areas of lost achievement showed Not Economically Disadvantaged students losing on 3 of 5 measures while Economically Disadvantaged students gained across all five subjects. A similar story is seen between the Disabled and Not Disabled groups. The Not Disabled students had gains only in Reading proficiency, with losses posted in the four remaining subjects. Conversely, the Disabled students showed sometimes impressive gains in exactly the subjects their Not Disabled counterparts lost ground on, while losing ground in Reading. It is a textbook case of what happens when resources are reassigned instead of added to solve a problem. All hell breaks loose where the vacuum is formed.
TRIS posted gains in 14/20 categories. Losses in proficiency at TRIS (using the HQDESAI) were more equally distributed amongst the Economic Subgroups. The Disabled Student achievement posted losses in science and social studies while the Not Disabled Student achievement showed gains in all five subjects. I don't know the details of this issue, but just on principle it concerns me because the district specifically added resources at TRIS in special education. The consistent issue at TRIS, however, is the across the board significant loss in Social Studies achievement. The district recently realigned its Social Studies curriculum to better meet state standards. Maybe it's growing pains, but I imagine this will be addressed at future teacher in-service days at TRIS. The other schools posted universally large gains in Social Studies achievement, so the realignment seems to be working at the other levels of instruction across the district.
How can we help all students achieve their best and how do we measure that achievement to guarantee that they are achieving? It is a conundrum. I know there is a big "differentiated instruction" bandwagon in the district right now. I don't think it's been effectively implemented across the district. Whatever they're doing at the High School seems to be effective, though. Maybe that model can be extended to the rest of the district to help our struggling schools regain and maintain AYP. Evansville has great teachers and outstanding students. I'm confident that these passionate educators will find a way to optimize the quality of education during these trying times.