
MEAP: Comparing Similar Districts

April 6, 2011

Based on a suggestion by Mike Reno, I compared MEAP scores from Rochester Community Schools to those of similar districts.  Mike provided the list of districts, but it should not be considered a complete list of all comparable districts.  The results are favorable for Rochester Community Schools and suggest what scores will look like once the MEAP Cut Scores are adjusted.

The numbers below are the average percentage of students who scored in the advanced or proficient range across the various tests.  Note that this is an average over the tests; it does not mean that a given student scored advanced or proficient on every test for their grade.  The percentage of students meeting or exceeding expectations is the percent advanced plus the percent proficient.  Because each percentage is rounded, the sums are often not exact.

District           % Advanced   % Proficient   % Meeting or Exceeding Expectations
State-wide         36%          42%            79%
Rochester          64%          28%            92%
Ann Arbor          59%          30%            89%
Birmingham         62%          31%            92%
Bloomfield Hills   63%          31%            93%
East Lansing       49%          36%            85%
Grosse Pointe      56%          34%            90%
Novi               62%          29%            91%
Troy               63%          29%            91%


Of the districts selected, Rochester Community Schools ranks first in the average percentage of advanced scores on the MEAP and second in scores that meet or exceed expectations.  There may be other districts in the state that scored higher.
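For anyone who wants to reproduce this kind of average from the raw state data, here is a minimal Python sketch.  The file name (meap_2010.csv) and column names are hypothetical stand-ins; the real state download is laid out differently and would need to be massaged into this shape first.

```python
# A minimal sketch of the averaging described above: one row per
# district/grade/subject, averaged per district over all tests.
import csv
from collections import defaultdict

adv = defaultdict(list)   # district -> % advanced on each test
prof = defaultdict(list)  # district -> % proficient on each test

with open("meap_2010.csv", newline="") as f:
    for row in csv.DictReader(f):
        d = row["District"]
        adv[d].append(float(row["PctAdvanced"]))
        prof[d].append(float(row["PctProficient"]))

for d in sorted(adv):
    a = sum(adv[d]) / len(adv[d])    # average % advanced over all tests
    p = sum(prof[d]) / len(prof[d])  # average % proficient over all tests
    # Meet/exceed is advanced + proficient; with pre-rounded inputs the
    # sum can be off by a point, as noted above.
    print(f"{d}: {a:.0f}% advanced, {p:.0f}% proficient, "
          f"{a + p:.0f}% meet/exceed")
```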

(Edit 4/7/11)

I also grabbed the raw data from the state website and ran it through MS Access. I’ve excluded individual schools; the data below covers districts only.

For math, Rochester ranks 13th among districts in the Meet/Exceed Expectations average over all grades, though it is 3rd in Advanced, behind only Novi and Troy.

For reading, Rochester ranks 12th among districts in the Meet/Exceed Expectations average over all grades, though it is 2nd in Advanced, behind only East Grand Rapids Public Schools.

In overall MEAP averages, Rochester is 2nd in Meet/Exceed Expectations behind Spring Lake Public Schools and 1st in Advanced.
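For those who would rather reproduce these rankings without Access, a rough pandas sketch follows.  It assumes the same hypothetical file and column names as the sketch above, plus a Subject field, and is not the exact query I ran.

```python
# A rough pandas version of the ranking done in MS Access. The file and
# column names (District, Subject, PctAdvanced, PctProficient) are
# hypothetical stand-ins for the real state extract.
import pandas as pd

df = pd.read_csv("meap_2010.csv")
df["MeetExceed"] = df["PctAdvanced"] + df["PctProficient"]

def district_ranks(frame, measure):
    """Average a measure per district over all grades/tests, then rank it
    descending so that rank 1 is the top district."""
    avg = frame.groupby("District")[measure].mean()
    return avg.rank(ascending=False, method="min")

# Overall rank, and a per-subject rank for math.
print(district_ranks(df, "MeetExceed").loc["Rochester Community Schools"])
print(district_ranks(df[df["Subject"] == "Math"], "PctAdvanced").head())
```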

Some numbers differ slightly from the table above.  This is likely because the data came from a few different sources.

(End Edit)

Rochester has definitely succeeded in preparing its students for the MEAP tests.  The high percentage of advanced scores portends well for when the MEAP Cut Scores are raised.  The projected changes to the MEAP Cut Scores put the new proficient levels close to the current advanced levels.  So while Rochester’s MEAP scores will fall significantly, they will actually look better compared to state averages.  Jeremy Nielson details how Rochester Community Schools has responded to the rising MEAP Cut Scores.  Hopefully Rochester and the state of Michigan will take this opportunity to meet this new challenge head on.  Our kids’ future depends on it.

P.S. If anyone wants to play with some raw data from the State, save this file and rename it from .odt to .mdb to open it.  It is an Access 2003 database.  If you don’t get a “Save File” option when clicking the link, you should be able to right-click the link and choose something like “Save Link As”.

8 Comments
  1. Mike Reno permalink
    April 7, 2011 7:07 AM

    This is interesting, Joshua.

Can you better describe how you came up with an average?

For example… did you look at the percentage of students that were “advanced” in math, and the percentage of students that were “advanced” in English, and take an average of those two numbers?

    Is this for one grade level, or all grades?

    Just curious…

  2. April 7, 2011 8:59 AM

Mike, this is just the most basic of averages: every test, every grade level. It certainly might be more interesting to look at math or reading averages across the grades for each district; I just haven’t had time for that yet. However, the “compared MEAP scores” link above provides the needed information for those who want to take a closer look.

I have updated the “compared MEAP Scores” link to include averages for Math and Reading. Writing, Science, and Social Studies have too few data points to be useful for averages.

  3. April 7, 2011 9:51 AM

Mike, I also grabbed the raw data from the state website and ran it through MS Access. I’ve excluded individual schools; the data below covers districts only.

For math, Rochester ranks 13th among districts in the Meet/Exceed Expectations average over all grades, though it is 3rd in Advanced, behind only Novi and Troy.

For reading, Rochester ranks 12th among districts in the Meet/Exceed Expectations average over all grades, though it is 2nd in Advanced, behind only East Grand Rapids Public Schools.

    In overall MEAP averages, Rochester is 2nd in Meet/Exceed Expectations behind Spring Lake Public Schools and 1st in Advanced.

    I’ve updated the post to have an Access 2003 database people can play with to look at the data.

I hope RCS doesn’t look at this as a time to rest on its laurels, but as a realization that we have a large number of students who could benefit from advanced and gifted education and the rigorous curriculum of an IB school.

  4. Angela Korn permalink
    April 7, 2011 11:55 AM

    Read those labels carefully:

    http://www.educationreport.org/12365

    What MEAP Scores Mean

    By MICHAEL VAN BEEK | March 22, 2010

    The 2009 Michigan Educational Assessment Program results are in, and schools from Benton Harbor to Midland to Ludington to Detroit are celebrating. For the fifth consecutive year, test scores were up on the whole. That appears to be good news; good enough anyway to briefly overshadow the few dozen districts going bankrupt and Detroit’s recent record-setting ineptitude on a national standardized test.

    The only drawback to these MEAP results is that they don’t really tell us how much kids are learning.

    This shortcoming of the MEAP is not the fault of individual schools or districts. The MEAP is mandated by the state. Schools have no choice but to follow the state’s directions, take four days away from regularly scheduled classes, sharpen their No. 2 pencils, and administer the test. The MEAP’s failure as a reliable source for measuring student performance is due to the inability of both federal and state governments to institute meaningful accountability standards.

    On the federal side, the No Child Left Behind program created perverse incentives for the states agreeing to carry out its standards. The strategy was to set high bars for schools to meet, inject money into the lowest-performing schools, and then have states hold failing schools accountable. As public choice theory would have predicted, some states realized that it would be easier to make more schools meet the NCLB standards than it would be to actually deal with school failures.

    Michigan is one of those states. The MEAP test is the state’s primary NCLB yardstick, and since NCLB went into effect, MEAP scores have increased at remarkable rates. When students in Michigan are compared to students in other states via national standardized tests, however, the results are very different.

A National Center for Education Statistics study (http://nces.ed.gov/nationsreportcard/pdf/studies/2010456.pdf) last year compared state standardized test scores to those of the most consistent national standardized test, the National Assessment of Educational Progress. The study calculated what the NAEP cut score would be for each state based on that state’s definition of a “proficient” student.

Michigan’s definition of “proficient” ranked near the bottom in every subject tested. Michigan ranked 44th and 46th out of 48 states in fourth-grade reading and math, and 35th and 37th in eighth-grade reading and math, respectively.

Other evidence suggesting the MEAP is a poor measure of actual student achievement is the fact that while MEAP scores were soaring to new heights, average scores on other standardized tests didn’t budge. The average ACT score in Michigan from 2004 to 2007 (years in which a similar number of students took the ACT) remained constant. Similarly, from 2002 to 2008, the average SAT score in Michigan slightly decreased. Likewise, scores on the NAEP over the last decade haven’t changed much either. Don’t forget that all the while, graduation rates remained at about 75 percent.

    Yet the MEAP says that from 1999 to 2009, the percentage of “satisfactory” or “proficient” fourth-grade reading scores went from 60 percent to 84 percent. The percentage of seventh-graders who were proficient in math went from 63 to 82 percent. In science, 81 percent of fifth-graders were proficient in 2009 compared to 37.5 percent in 1999. Either the MEAP is picking up on an academic miracle (the likes of which would be unprecedented in government-run schooling) that these other tests are missing, or we’re witnessing what happens when cut scores and designations of satisfactory are changed.

    Further complicating matters, the labels used by the Michigan Department of Education to describe satisfactory scores have varied widely over the last decade. From 2007 to 2009, the MEAP called students either “advanced,” “proficient,” “partially proficient” or “not proficient.” But from 2002 to 2006, student scores were categorized as “exceeded,” “met,” “basic” or “apprentice” (except in 2003 when the lowest level in some subjects was labeled “not endorsed” and 2002 when writing was deemed either “proficient” or “not yet proficient”). From 1999 to 2001, student scores were just “satisfactory,” “moderate” or “low.”

These varying labels make it almost impossible for parents to know how much their kids are actually learning.

    The MEAP is such a poor measure of student performance that many schools have opted to use independent tests to gauge the effectiveness of their academic programs. The only possible thing MEAP scores could tell parents is how their school ranks compared to others, but that’s essentially meaningless since most parents have little choice over which school their children attend.

Federal (and ill-conceived) standards give states like Michigan an incentive to see to it that their schools appear to be constantly improving, while at the same time the MEAP fails to tell parents how much their kids are actually learning.

  5. April 7, 2011 12:39 PM

    I agree that the standards of the MEAP are way too low and that they give false assurance to parents that children are learning. But by digging deeper, they can give important information about how schools and districts are doing.

First, they can provide a general comparison of the level of teaching within a district and between districts. Two schools could both have 80% “Met or Exceeded Expectations”, but if one is 60% advanced and 20% proficient while the other is 20% advanced and 60% proficient, that split indicates whether students who will pass the MEAP anyway are being challenged to do better.

Second, they can be helpful for trend analysis. Say MEAP proficiency in a district has risen 20% over 5 years. Is this good? If state averages have risen only 10%, it is. If state averages have risen 30%, it isn’t.
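    To make those two checks concrete, here is a toy Python illustration using the hypothetical numbers above (not real district data):

    ```python
    # Toy illustration of the two checks above, with made-up numbers.

    def describe_split(advanced: float, proficient: float) -> str:
        """Show how a meet/exceed rate splits into advanced vs. proficient."""
        return (f"{advanced + proficient:.0f}% meet/exceed "
                f"({advanced:.0f}% advanced, {proficient:.0f}% proficient)")

    print(describe_split(60, 20))  # same 80% pass rate as below...
    print(describe_split(20, 60))  # ...but far fewer pushed to advanced

    def relative_gain(district_rise: float, state_rise: float) -> float:
        """Positive: the district improved faster than the state average."""
        return district_rise - state_rise

    print(relative_gain(20, 10))  # +10 points: good relative to the state
    print(relative_gain(20, 30))  # -10 points: lagging the state
    ```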

    There are certainly better measures of education than the MEAP. Looking at average ACT scores can be better. Discovering what percentage of students made a “full year’s growth” could be telling. The percentage of students graduating from college also is an important stat. But it is in combining and sifting through the stats that information can really be gleaned. What type of learning is happening in a district with high MEAP scores, but low ACT scores? If a student scores high on standardized tests, but doesn’t make a “full year’s growth”, does that indicate that the student is not being appropriately challenged? Does a cluster of such students at a particular school show that differentiation is not occurring appropriately there?

    I would be curious as to how many school districts bring in a statistician to run the numbers and discover where they have problems. I doubt many do, but they all should. It would help pinpoint effective and ineffective teachers, programs, and curriculum. And even though the MEAP is a weak measure, it can be helpful when analyzed properly.

  6. Mike Reno permalink
    April 7, 2011 1:33 PM

Rochester had a “Director Level” person who was supposed to analyze assessments… never really saw much analysis.

  7. Angela Korn permalink
    April 7, 2011 3:03 PM

“I agree that the standards of the MEAP are way too low and that they give false assurance to parents that children are learning. But by digging deeper, they can give important information about how schools and districts are doing.”

    That’s the problem with this annual ritual of raising MEAP’s profile via news reports, Joshua.

    Most local parents will read “Advanced” or “Proficient” and tune out, content that THEIR child is enrolled in a place where “92% Meet or Exceed Expectations”.

    And compared to the tragic numbers the media report from our nearby urban centers, they’re right. Right?

    Unfortunately, most parents don’t question what these labels represent, or ask if the measurements are valid and reliable indicators of student performance.

    Most parents don’t “dig deeper” as you correctly suggest they should.

Parents think the numbers are “good enough”, because the school district and the news media tell them so.

    So are they?

    Again, according to the National Center for Education Statistics (cited in my earlier post), the answer is a resounding “NO”.

    Forgive the redundancy, but for the vast majority of local students who are college-bound, MEAP is a very small element of the big picture and parents need to look beyond the MEAP:

    “The study calculated what the NAEP cut score would be for each state based on that state’s “proficient” student.

    Michigan’s definition of “proficient” ranked near the bottom in every subject tested.

Michigan ranked 44th and 46th out of 48 states in fourth-grade reading and math, and 35th and 37th in eighth-grade reading and math, respectively.

Other evidence suggesting the MEAP is a poor measure of actual student achievement is the fact that while MEAP scores were soaring to new heights, average scores on other standardized tests didn’t budge.

    The average ACT score in Michigan from 2004 to 2007 (years in which a similar number of students took the ACT) remained constant. Similarly, from 2002 to 2008, the average SAT score slightly decreased in Michigan.”

    So what else is needed?

    Independent, nationally normed assessments provide local school administrators and teachers with critical performance comparisons between local students and schools and their peers throughout the country.

    Local school districts can take further advantage of post-test workshops (such as those offered by TerraNova) which utilize test results for curriculum and instructional planning suited for their own student population.

    Rochester used to administer a nationally normed assessment (CAT and then TN) to students in the 6th grade (they also offered one in 3rd grade, but dropped that practice years ago).

    The results of these nationally normed assessments should be thoroughly examined and cogently explained and disseminated so that parents, teachers and school board members are fully cognizant of the learning needs within our own district.

    This vital information is essential to the process of allocating resources strategically for high impact instructional planning.

  8. Mike Reno permalink
    April 7, 2011 3:25 PM

    I’ll add, Joshua, that the district still administers the Terra Nova, but can’t give any coherent answer describing what they do with it, other than “use it for math placement.”
