Monday, September 10, 2007

Testing Disconnect

Joanne Jacobs links to a Center on Education Policy study that says most of the states with an exit exam requirement for graduation are not designing those exams to measure whether students are ready for college or work:
Of the 23 states surveyed, only six say that the purpose of the test is to measure the knowledge and skills needed for college-readiness, while nine indicate work-readiness as a purpose.

In contrast, 18 states say that the tests — which are generally aligned to the 10th-grade level — are intended to determine mastery of the state curriculum (e.g. standards, curriculum framework). And 18 states say that the exams are used to provide data to state policymakers on student progress toward state education goals to inform policy decisions.
While this may seem shocking to some, it is not all that surprising to me.

One of the fundamental precepts of testing is knowing what you are measuring and then designing a test to measure that standard. For most school systems and states, what the exit exams measure is not college preparedness or workforce preparedness, but "how well we have done our jobs." That is, schools are testing students to see how well the schools have done educating them.

Such a testing arrangement is very different from testing for college or workforce readiness. The bias itself is not all that surprising, given that it is the Department of Education that is building the exam, largely as a means of demonstrating to the public and, more importantly, to the legislative appropriators that they are not only earning their keep but need more money to keep earning it.

If exit exams are supposed to measure college or workforce readiness, they would have to be built on the standards colleges and employers want. To be fair to the states, that is not easy, since the two groups may be looking for different, if not divergent, skill sets. However, that is the only way exit exams will ever measure readiness for higher education or employment.
