Alaska's "High School Graduation Qualifying Examination" (HSGQE): Web Resources Page:

Development and Implementation (1998 - 2003)

Claudia Krenz, Ph.D. (datafriend @ gmail-.-com)

This web page covers the period--between 1998 and 2003--when Alaska's "High School Graduation Qualifying Exam" (HSGQE) was developed, field tested, and first used as a diploma-sanctioning test: that is, Alaska high school students had to pass achievement tests in reading, writing, and math to receive their high school diplomas. The site map to the right below is divided into two broad categories: local control (pre-NCLB) and post-local control (post-NCLB). In January 2002, the President signed the NCLB into law with broad, bipartisan support. Technically, NCLB still allows states local control over their curriculum in the sense that it is they who develop the content and performance standards--articulate what students should know by when--and hire their own test publishers to use those objectives as blueprints for developing achievement tests measuring student performance (Alaska hired CTB/McGraw-Hill to develop, administer, and score its HSGQE).

Under NCLB, one state's science achievement test could measure knowledge of geology and another's, creationism. NCLB's tolerance for state curricular definitions contrasts with reports of "faith" (and oil) based initiatives--like federal scientists studying climate change being told not to report what their models predict or they observe, the National Park Service being ordered to remove books about the geological history of the Grand Canyon, and the National Institutes of Health having to remove the word "condom" from its HIV prevention pages.


Site Map

Click on a link to your right to go to that section. The text in this file is organized sequentially by these links (with the URLs mentioned in the text organized alphabetically in the Bibliography). The first two links to the right come before the present paragraph; the rest, in order, after it: this first section looks at "accountability" as defined by the NCLB.

The second section describes the HSGQE's development, official definition, different administrations, student performance, and what little's known about the pre-2002 administrations. Issues common to test development are illustrated with the Texas Assessment of Academic Skills (TAAS), required there since 1990; the point is to overview the testing process, highlight issues, and ask what might have been learned (most certainly not to suggest that Texas sets an example to emulate).


Post-NCLB: Post-Local Control


The single nationwide east-coast database
Letter to the gov
Alaska summary
Nikiski Elementary: Ode to a School that Worked
Spring 2002 Scores Retrospectively


Pre-NCLB: Local Control


Alaska's HSGQE
(development, administration, and performance).
What was known about the tests
Where were we?

Earlier: Outside
Constructing the TAAS, living with the TAAS
Lessons from other states
What could have been learned


  Letters to Students
  Author Note
  Footnotes
  Bibliography

While NCLB leaves the states free to establish their own content and performance standards for their own students, the U.S. Department of Education (USED) has various means of influencing said content. In one example, in 2005, one state, Utah, openly defied USED: USED Secretary Spellings called Utah state legislators names and threatened to cut funding for the state's neediest schools.

"The letter  warned that 'the consequences of enacting and 
implementing this bill would be so detrimental to students in Utah'
 .... In the letter's 25-line third paragraph, Mrs. Spellings warned 
that the   department probably would cut off at least $76 million 
in federal funds this year for low-income school districts and 
teacher training" (Archibald, G., Washington Times, 20 April 2005).
Secretary Spellings is also 	quoted as saying that "Under the state 
test, 74 percent of white eighth-graders and 47 percent of Hispanics were 
reported 'proficient or advanced' in math, while just 34 percent of white 
and 7 percent of Hispanic eighth-graders were 'proficient or advanced' on 
the NAEP math test."

The Secretary of Education's threats were quite real: As a Utah newspaper put it, the day the state legislature rebelled against the NCLB

"also happened to be the anniversary of the opening battles of the Revolutionary War. Two-hundred-thirty years ago, Massachusetts militiamen at Lexington and Concord faced off with British troops, firing the famous shot heard 'round the world .... At Lexington and Concord, and in Salt Lake City, the central players were taking a risk. The colonists were risking the wrath of a global superpower--Britain--and the hangman's noose for treason. Utah's stake is bloodless, but a risk nevertheless" (Provo Daily-Herald, 21 April 2005).

The Secretary's argument was "one heck of a good one": comparing percentages of 8th grade Utah students proficient on the federal National Assessment of Educational Progress (NAEP) with scores on Utah's own state achievement test (Utah called its the "UPASS"; Alaska, the "HSGQE") is possible only in the sense that test scores are numbers. Comparing scores on tests that have never been standardized to each other--and that are not necessarily even measuring the same types of achievement (or using the same definitions of "proficiency")--makes no sense: there are so many possible explanations for the observed differences in the percentages reported on the two tests that it's impossible to single out a particular one. The wonder would have been if the scores had been the same!

--

Despite its seeming tolerance of different state curricular standards, USED insists upon a uniform data entry format: student last names must begin in column 1, address in--say--column 21, school lunch status (free, reduced) in column 32, math state achievement test scores in columns 50-55. In short, USED wants each state to format its individually identifiable data about its public K-12 children the same way (uniform data entry practices would facilitate the merging of state data into a single nationwide database). Skim the text summaries on a search engine results page--from a simple search on nclb data format handbook pbdmi OR eden over the last year--for a sense of the kinds of data being collected. As put by Virginia, "NCLB (No Child Left Behind) requires that data be collected on individual students. This means a complete redesign of the current VDOE school data collection system. In addition, the U. S. Department of Education (ED) is developing a new automated system for collecting data to meet the requirements of the No Child Left Behind Act and other federal education programs. This new system, called the Performance-Based Data Management Initiative (PBDMI), is based on the principle collect once and share."
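For concreteness: a fixed-width layout like the one sketched above can be read with a few lines of code. This is a minimal illustration in Python; the column positions are the hypothetical ones from the paragraph above, not an actual USED specification.

    # Hypothetical fixed-width layout from the paragraph above; Python
    # slices are zero-based, so "column 1" is index 0.
    RECORD_LAYOUT = {
        "last_name":  (0, 20),   # columns 1-20
        "address":    (20, 31),  # columns 21-31
        "lunch":      (31, 32),  # column 32: F = free, R = reduced
        "math_score": (49, 55),  # columns 50-55
    }

    def parse_record(line: str) -> dict:
        """Slice one fixed-width line into named fields."""
        return {name: line[a:b].strip() for name, (a, b) in RECORD_LAYOUT.items()}

The point of such a rigid layout is exactly what the paragraph says: once every state writes the same columns in the same order, merging fifty state files into one is trivial.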

USED responded more rapidly than FEMA to children displaced by Hurricane Katrina, instructing schools to enroll them as "migrant" students so the schools could be reimbursed (NCLB expressly prohibits creation of a "single nationwide database" except "to better coordinate services" for students switching schools).
Alaska has doubtless already given its K-12 data to USED.

NCLB is reminiscent of the old Greek myth about the Trojan horse: while everyone, including me, was squawking about irrational accountability testing, individually identifiable data about US public K-12 students were being silently sent behind secure socket layers. How will USED use these data? Targeted advertising is possible. I doubt these data will be used for educational research. If this relatively well funded data collection part of the NCLB were on the up-and-up, why would USED indulge in secretive and threatening messages? (Just so: if the NSA says it is not intercepting international email, one wonders; if the NSA says it is intercepting international email, there's little doubt.)


Alaska Summary

In 1998, the Alaska State Legislature passed its "Quality Schools Initiative"--even though, at the time, Alaska students were doing well compared to their counterparts in other states. Click here for a sorted list showing the percent of each state's 8th graders proficient on the National Assessment of Educational Progress math test (the closest thing the US had to a gold standard for evaluating student achievement across the country). Alaska students were certainly doing better than students in California and Texas, home states of the whining oil execs who convinced the Alaska leg that its "work force" wasn't up to becoming good employees. I don't think the leg noticed that its students were already doing well before it decided to inflict accountability testing on real students in real time.


Accountability Testing in Alaska under Tony. This page is a compilation of links about educational testing in Alaska since 1998 and the implementation of the "Quality Schools Initiative," from an educational testing and measurement perspective.

Believing such exams should be subject to public scrutiny--call it "sunshine"--is not synonymous w/ the position that test scores tell us everything. There is a middle ground between the abyss of assuming no multiple-choice test can ever measure anything relevant or worthwhile--and the positivist glory heights of assuming the tests we use measure what we want them to measure, so that there is no need to examine empirical data about test reliability and validity. Consider also the phrase "scientifically valid and reliable" that peppers the pages of nclb.gov and ed.gov: the phrase has it exactly backwards. It is easy to imagine reliable tests that are not valid (an easy example being measuring height when you're interested in hearing) but not the opposite: reliability--and there are several ways to estimate it--is a necessary but not sufficient condition for validity. It is, as Donald Campbell wrote, "the role of methodology to chart the course between naive credulity and inert skepticism."

Alaska was one of the last states to implement standards-based reforms: the leg passed a law, as most other states had earlier, mandating that schools a) administer a battery of tests to students [3 partial days per academic year for the testing alone] and b) aggregate those scores over local schools to hold each "accountable." This "accountability testing" was on top of what has always normally occurred in the classroom (the typically enormous lag between testing and scoring made pedagogical use of these scores unlikely). Students took CTB tests of unknown merit (insofar as of unknown relevance to state-articulated performance objectives); their scores were aggregated over individual local schools and made public, typically on the web. Here's a 1) sorted list of how the other states compared to Alaska in degree of accountability testing in 1998.

Two years before the leg enacted the Quality Schools Initiative, Alaska's students had scored higher on the National Assessment of Educational Progress (NAEP) than students in the home states of the oil execs who convinced the leg that Alaska's students were insufficiently educated. Specifically, the percentage proficient and above on the NAEP math test exceeded CA's and TX's by roughly ten percentage points:

Thirty percent of AK's 8th graders--compared to 21% of TX's and 17% of CA's--scored "proficient" or above on the NAEP math test (psychometrically the closest thing the U.S. had to a "gold standard" in measuring student achievement).

Here's a 2) sorted list of how the other states compared to AK, TX, and CA in % proficient on NAEP 8th grade math in 1996. In summary, the first list shows that in 1998 Alaska ranked low in degree of standards implementation (and so is at the top of that list). The second shows that in 1996 Alaska ranked relatively high in math achievement (and so is at the bottom of that list).

A state that didn't recognize the caliber of its own students--that used whatever the oil execs said as its litmus test--wouldn't and didn't ask whether implementing such a testing program--often described w/ phrases like "standards-based reforms," "high-stakes testing," and above all "accountability"--was a wise move. Had the AK leg looked, it might have noticed, from correlating two variables--accountability-testing grade (as assigned to each state by Education Week on the Web) with % proficient on NAEP 8th-grade math--the following Pearson product-moment correlation: 3) -.39.

This statistically significant correlation is in the *wrong* direction insofar as the accountability testing model would predict that imposing its reforms enhances, increases, or accelerates achievement.
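For readers who want to check this kind of computation themselves, here is a minimal sketch of a Pearson correlation in Python--with made-up numbers, not the actual EWW grades and NAEP percentages, which are in the two sorted lists linked above.

    import numpy as np
    from scipy import stats

    # Made-up illustration only: x = a numeric coding of each state's
    # EWW accountability-testing grade, y = its percent proficient on
    # NAEP 8th-grade math. The real values are in the linked lists.
    eww_grade = np.array([4, 4, 3, 3, 2, 2, 1, 1, 0, 0])
    naep_pct  = np.array([19, 22, 21, 25, 26, 24, 28, 30, 29, 31])

    r, p = stats.pearsonr(eww_grade, naep_pct)
    print(f"r = {r:.2f}, two-tailed p = {p:.3f}")

    # With all 50 states (n = 50), r = -.39 gives
    # t = r * sqrt(n - 2) / sqrt(1 - r**2) = approx. -2.9, i.e., p < .01,
    # which is why the text can call the correlation statistically
    # significant.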

Had the leg concluded, based on a single statistical correlation, that testing was so inimical to achievement that it should be abolished, I would have considered them foolish. Sort of like I consider them foolish for never questioning the link between the state's articulated performance objectives--also a gold standard--and the tests that purportedly measure them, for never insisting on examining empirical data from its test publisher pertaining to test reliability and validity ("idiots" is perhaps too strong; "innumerates" certainly is not--and "spongiform encephalopathy" comes to mind). The current debacle is that the class of 04 must pass a test which AK's education department says is so flawed that the state has changed test publishers (click here to read what DRC, the new testing contractor, has to say about itself: whatever the benefits of changing publishers--real values like more flexible scheduling and rapid scoring--so much dust will be raised that visibility--the chance to watch the as yet unfolding spectacle--is guaranteed to decrease).

Reason suggests that resource and time consumption is an aspect of accountability testing that should be considered (student time spent both being tested and preparing to be tested--in Alaska, originally at the 3rd, 6th, 8th, and 10th through 12th grades--plus teacher time, all subtracted from what would otherwise have been spent in instructional activity--at the least, time for some unique and wonderful Alaska history lessons). But reason isn't operative in a logical positivist state like Alaska. Maybe that's the reason for the "brain drain" (the state demographers' term for the 40% of the state's grads who leave the state never to return).

Accountability Testing in Alaska under Frank. The national NCLB law is based on Texas' high school exit exam, the TAAS: USED Secretary Paige--former head of the Houston schools--doubtless did not know that the Houston education miracle *poof* was fraud (created by the miracle of cooking the high school dropout data, not dissimilar to cooking Enron's books--misleading accounting policies, to say the least). The law also mandates the impossible: that in less than a dozen years 100% of all students everywhere score proficient (no accidents, no boycotts, everyone--perfect everywhere).

The 2/21/03 state school board massacre is another example of shock-and-awe tactics: leave the state rudderless, w/o anyone to respond to the mounting federal pressure.

This law's coup de grace, though, is the dozens of categories in which local schools can fail: attendance not high enough on test day? you're not making AYP (not making AYP? then "in need of improvement"--busy, busy schools). Alaska, to its credit, did initiate a statistical definition of AYP that reduced the number of Type I errors--schools doing fine being erroneously classified as failing, based on their aggregated test scores, because of random error. And it protests that its Title 1 funding depends on literal compliance--without even obvious "accommodations" for reality situations like the closest school being a mountain range away (at least until recently). Even now, w/ the increasing political hoopla, this ESEA reauthorization still appears as well read and understood as Patriot was before it was so overwhelmingly passed (I wish I could afford to provide more free r&d).

The NCLB acknowledges what has been observed in countless empirical studies: a high-SES/white versus low-SES/non-white difference on a lot of different standardized tests. There are numerous interpretations of these test score differences. Under one common positivist paradigm, what a for-profit company says a test measures is what it actually measures, whether "iq" or "achievement"; the Alaska leg typifies this kind of naive empiricism. Others argue that such scores mean nothing at all until at least quasi-validated. Ironically, since schools that can afford to opt out do so, those bearing most of the burden may be those initially most in need of help. Children get left behind every day--poverty, abusive parents, the list is long--and the law at least superficially addresses the problem of a climate of "low expectations" by requiring that test scores be reported disaggregated by groups like the vanilla census 2000 ethnicity categories--which are of little relevance to Alaska. No surprise--being based on the "Texas education miracle"--the law does not require the states to provide disaggregated dropout data.

Targeted advertising--based on purchases at the old Carr's/new Safeway's--has already begun on the Kenai Peninsula: how will the data on the little kids be used? Certainly for advertising; certainly for more Texas--Enron, the Houston *poof* education miracle--math; who knows to what totalitarian ends. I am not optimistic. These data will not be used for educational research, which USED regards as being conducted by "the dullest yeomen." USED presents the PBDMI as the solution; I see it as a Trojan Horse. If this relatively well funded data collection part of the NCLB were on the up-and-up, why would USED indulge in secretive and threatening messages?


A page at the whitehouse.gov site says that public support for the NCLB is "rock solid," although that doesn't seem so in states like Utah (Alaska officials proclaim the state a leader in obeying the NCLB [while simultaneously insisting on noting the numerous discrepancies between the law and reality]). Go to an automatically generated news site (news.google or news.yahoo) and paste NCLB into its search window to find out what communities other than your own are saying.


--
That the "houston miracle" has become something of an urban legend was suggested by the Texas Project Grad rep who spoke of looking forward to working with /ken-eye/ schools, having already helped "fix the Houston schools" (APRN, 3.04). Not surprisingly, APRN neither corrected the pronunciation nor challenged the assertion.


Accountability Testing in Alaska under Frank (contd.): the part of the NCLB no one talks about, the Performance-Based Data Management Initiative. That the accountability part of the NCLB is the HSGQE and the TAAS on steroids keeps the schools preoccupied (so many ways to fail, busy schools writing self-improvement plans, and, after a few sequential years of not making AYP, so much money to be made assisting those "failing").

You may remember the old Greek myth about the Trojan Horse: after besieging Troy for a decade, the Greek fleet sailed off (behind a nearby island), leaving a cool-looking wooden horse outside the city gates--hollow like a chocolate Easter bunny but filled w/ enough Greek soldiers to open the city gates from within when attention was distracted, which it soon enough was, despite the warnings of Cassandra and Laocoon. Virgil does not mention whether those Greeks smiled silently as they opened the gates of Troy from within.

The other part of the NCLB I came across at a whitehouse.gov site in the process of updating this page: the plan to create a single nationwide database on every U.S. public K-12 kid. No one, of course, speaks of it, because there is much money to be made in educational technology these days: so many software applications for entering NCLB-compliant data, custom interfaces, lots of money. Type "hsgqe" into a google window and see a testing industry advertisement; type in "nclb" and see many, many ads.

Although the NCLB requires the local schools to provide scores disaggregated by groups, USED wants individual test scores, as well as names, addresses, and phone numbers. Several examples of state pages talking about data entry are shown below. One problem--in the grand tradition of unfunded federal mandates--is funding for the local schools, the "first responders" in the present context. Sometimes local schools have to fork over the bucks for the data entry software themselves:

Clegg says the issue of having to upgrade systems to the new Federal Performance Based Data Management Initiative came as a surprise to many districts throughout Iowa at a tough financial time. "It's going to be quite a change in terms of software acquisition. Our estimated cost is going to be right around 365 to 385 thousand dollars to fully implement this new federal initiative. There are no federal dollars to be provided for this. There are no state dollars that are going to be provided for it. It's going to be borne by the local taxpayers eventually to pay for this new federal requirement." The new system will require the district to add immunization records and new state student identification numbers to records that are electronically tracked. (KCIN Enews Iowa, 29 Jan 04)

Any such costs within my own school borough are apparently covered under a single line item, "computer," which is too large a number to include only hardware and connections (but likely includes the computer costs of the new NCLB carpetbagger "supplemental service providers"--exact amount unknown, since that's "insider" information). Indeed, the KPBSD has historically funded the University of Maine to "analyze" student data (cost unknown). There was no open bidding process (the point isn't that I do better work but that the money didn't support the UA system). I wonder whether "data" from the current "advisory" ballot will be sent to UMaine too, and whether the NCLB data on the 1st graders--who don't vote, write letters, or have unions--will be more secure than that on the teachers.
I see no other interpretation of the NCLB data entry anecdotes--listed below under the red panic button (yes, my hair is on fire)--than that, although expressly prohibited from doing so, USED is creating a single nationwide database: call it soft- and hardware evolution, Moore's law, whatever. It is now very easy for the feds to collect data on every kid in the country--especially when the states do their own data entry ("clerk work," as Rep. Dayton put it; "no way in hell are we going to do all they're asking," said another Utah official). Individually identifiable data about every Alaska public K-12 kid would fit onto a 1.4 MB floppy. But, in Alaska, officials face this orwellian threat w/ quasi-bovine tranquility: as they do all else, silently. All the more tragic because Alaska children will be more adversely impacted than those in other states (kids get SSNs young up here).

I interpret all of USED's bombast about rigor--the importance of "scientifically valid and reliable data"--as being about wanting to collect accurate data. Reading through the e-gov documents, I see little evidence that those--too often former DoubleClick ad execs--speaking w/ the voice of USED understand quantitative longitudinal research (one discussion of randomization I read merits an A+ as farce in a literature class--so target-rich, the documents of these advertising industry execs--and there's, of course, the new "Institute of Education Sciences" and so many methodological gaffes). Many in Alaska are more than comfortable w/ the idea of not being particularly easy to find ("we don't need no stinking rules," says a local radio dj). It's hard to understand why no one is concerned about individually identifiable data about their children. Perhaps it's because their local officials tell them that the feds are only getting the aggregated data, not the individually identifiable data--which flows not through the pipeline crudely but silently behind SSLs.

Click on the "e-gov" link to read an appeal to the Alaska gov about USED's data collection efforts, the other part of the NCLB, whose funding has not been cut since enactment (how convenient to have data on your citizens in the new American century of total information dominance).

July 04. That said about the professional stance from which I view the HSGQE: it is neither technically sound--the little empirical evidence there is suggests that the test does not meet even minimal standards of reliability and validity--nor legally defensible: in a Florida case from the early 1980s (Debra P. v. Turlington), the courts ruled that the state does *not have the right* to deny a public K-12 student a diploma unless s/he has been given the opportunity to learn the material on which the test is based ... Yet that is exactly what happened to over 700 Alaska students this spring (hobbled by its expensive, inadequate Microsoft software, the state could not initially even say exactly how many had "failed" its flawed test). Unknown is how many failed the test because they had been given no opportunity to learn what was expected and how many failed because the test was flawed. Either way, it is a rank injustice--one that could only pass muster in a state as mired in 19th c. colonial thinking as Alaska (high school exit exams are part of the "standards-based reform" fad that has swept state legislatures across the country since the early 1990s ... AK was one of the last states to implement an exit exam: it could have learned from the others--for example, that high school drop-out rates increase when exit exams are implemented--but it chose not to).

As to my personal opinion: first, chickadees know more about Alaska and Alaska values than I do: I can testify only as a technical person, but, like everyone else, I've got opinions. I've only been here since the leg passed the Quality Schools Initiative in 98--and I've watched in shock and awe as state "elected" officials repeatedly respond to reason, science, and empirical fact with silence (pretending away reality by not bespeaking it). What I see here is a fundamental lack of respect for Alaska students (small wonder some 40% leave the state never to return)--and Alaska citizens generally--on the part of those in name only "senator," "governor," etc. They are like ostriches hiding their heads in the sand ... but then it is not their children who bear the cost of being irrationally branded by tests like the HSGQE. This reality will not go away for others' children (it is my fear, although I lack the data to do more than surmise, that this burden is not spread equally around the state, that rural schools are more vulnerable than urban ones).

The HSGQE will not go away--testing's on the horizon for years to come--because of the federal NCLB (to think otherwise is to waste what could be energy used practically). Some states--CA most recently--have stopped using their exit exams as diploma-sanctioning tests (other states have likewise gotten on the bandwagon and then gotten off; a few states, like Iowa, resisted the standards-based reform bandwagon totally--and their students are doing very well academically). As someone wiser than I said, 'dogs may bark, but the caravan rolls on.'

I fear that matters will get worse. I quite agree with you about the potentially devastating "feedback" failing such a test provides (all the more a tragedy when the test yields substantively meaningless scores) ... Under the NCLB, small schools are automatically deemed to be making "adequate yearly progress," which results in the state not needing to issue the school a 'report card' to "hold it accountable"--the federal FERPA laws spell this out (a fine UAF grad was, I believe, the first to observe this [I remember an emailed spreadsheet crosstabulating the FERPA definitions with Alaska's school compositions]). Of course, the FERPA reasoning is sound in that, since the state won't issue a small school a 'report card,' it can't possibly violate some students' right to privacy ... However, if the burden of this test is not borne evenly across this state, if the students most penalized by the test are at small schools, and if their aggregated scores do not contribute to the state's overall "adequate yearly progress"--education used to be under local control, no longer--then those students most penalized will be invisible, will not appear in the state accountability report, its ledger; i.e., AK could be failing--in the real sense, not the test-score sense--a major portion of its students and still look good to the feds. And then there is also the problem of the NCLB's Performance Based Data Management Initiative (which is collecting data on all public K-12 students). It is ironic that there is a FERPA law at all when student privacy rights are being systematically violated throughout the country.
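The crosstab in that emailed spreadsheet is easy to reproduce in outline. A minimal sketch in Python/pandas, with made-up school data--the minimum-n reporting threshold and all counts here are hypothetical, not Alaska's actual figures:

    import pandas as pd

    # Hypothetical reconstruction of the spreadsheet described above:
    # for each school, is enrollment below the minimum-n reporting
    # threshold (so no 'report card' is issued)? All values made up.
    MIN_N = 25  # hypothetical reporting threshold
    schools = pd.DataFrame({
        "region":     ["rural", "rural", "rural", "urban", "urban"],
        "enrollment": [18, 22, 40, 350, 600],
    })
    schools["below_threshold"] = schools["enrollment"] < MIN_N
    print(pd.crosstab(schools["region"], schools["below_threshold"]))

If the "below threshold" column turns out to be concentrated in the rural rows, that is exactly the invisibility problem described above.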


June 04. The source of my frustration is that, although a latecomer to implementing "standards-based reforms"--the hallmark of which is a high-stakes test like the HSGQE--the leaders of the State of Alaska have chosen to learn nothing from the mistakes made earlier by other U.S. states. Take the example of Texas, an early innovator, its high school exit exam dating back to 1990: when its exam was implemented, the high school dropout rate increased, as it has in state after state (if Alaska is tracking this phenomenon, it is not doing so publicly). Again, although gratified to have had the opportunity to assist a few, I have watched aghast--an outside observer, external to the system--as Alaska has repeated all the mistakes made years earlier by the other states. More recently, over 700 Alaska young people were denied high school diplomas on the basis of their HSGQE test scores. This is irrational, because the current Commissioner of Education has deemed the current HSGQE so flawed--a view supported by the existing empirical research--that he fired CTB, the state's testing contractor since the Quality Schools Initiative's inception. Doubtless our leaders' attention has turned to that future test; what need to delve into any prior test data? Yet even a cursory examination can be informative: Tables 37 through 39 of CTB's most recent technical report indicate that up to 25% of the students sitting for the 2003 HSGQE did not respond at all to the open-ended items on the 3 subtests (http://www.eed.state.ak.us/tls/assessment/HSGQE/TechnicalReports/Spring03TechReport.pdf). Although the state did not require CTB to break down these data by subgroups, it's hard not to wonder whether these percentages were evenly distributed across Alaska schools.


You can use the internet to focus your search. For example, click on this link, and a new advanced-search engine window will open. Look down that page until you see the line for specifying a search domain: paste in state.ak.us (and use the pull-down menu near the top right to specify 100 results per page). Now type in your search terms: search, for example, on HSGQE and retake and fee to get a results page containing links to relevant state documents (you would not find this page in such a search, because this page is not official). The results page also contains summaries, which can themselves be informative (forgotten the name of the current "Commissioner of Education"? search on his title and you'll likely find his name in a summary on the results page).

Another advantage of accessing official Alaska documents through a search engine is that the results page contains links both to the PDF documents provided by the state and to automatically generated text versions. The text versions are useful because you can cite relevant portions by copying and pasting them, obviously more efficient than typing them (which is the only way to cite the PDF documents provided by the state). If discussing, say, the fee schedule for retaking the HSGQE w/ a friend, it's obviously easier to get the regulations right by pasting--versus typing--them into your email. Using an engine as an interface is the easiest way I know to track state documents--good luck in your search!
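The same recipe can be written down mechanically. A minimal sketch in Python of the query the paragraph above builds by hand--the parameter names (q, num, as_sitesearch) are the ones Google's advanced-search form uses; other engines use different names:

    from urllib.parse import urlencode

    # Reconstruct the advanced search described above: restrict hits to
    # the state.ak.us domain and ask for 100 results per page.
    params = {
        "q": "HSGQE retake fee",
        "as_sitesearch": "state.ak.us",  # the "search domain" box
        "num": 100,                      # results per page
    }
    print("http://www.google.com/search?" + urlencode(params))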

I used google as the example above, but here are links to the advanced search interfaces for altavista, hotbot, and yahoo. The actual organization of these forms is a bit different from the example in the paragraph above, but they are easy to use.


Also included on this e-gov page, towards the bottom, are links to AK's performance standards (representing Alaskans' consensus about what high school students are supposed to know--what the CTB tests are supposed to measure, what will be covered in the tests). Towards the bottom of that state HSGQE page are links to CTB's test-item maps, which consist of AK performance standards cross-tabulated by the number and kind of CTB test items (another way to use these maps is as a checklist against the courses your brother has and hasn't already taken). On the math test, for example, geometry comprises about 10 percent of the questions, 5 of them multiple-choice--like a study guide. Another useful online resource--given your goal of finding ways to help your brother prepare for the HSGQE before taking it in the 10th grade--is the AK governor's 03-04 operating budget (which equates the HSGQE with the GED and includes phone numbers and email addresses).

I did not find a comparable online directory listing HSGQE tutor availability by school. I'll check again in a few weeks, when things are a little less chaotic: AK e-gov was only recently redesigned--and, while its bugs were being worked out, it was doubtless adversely affected by the most recent round of worms (AK servers being exclusively Microsoft).


December 04: Sounds like a very interesting course: I'm sure you already know to go to www.nclb.gov for the propaganda, to www.ed.gov/policy/elsec/leg/esea02 for the actual law, and to a site like news.google.com for what the journalists are saying about the NCLB.

You asked about the impact of the NCLB on children with "culture and languages barriers." In the case of someone whose native language is not English, who wants to learn English, having a non-English native tongue could be considered a barrier. Similarly, an individual school that literally rises and falls on aggregated English proficiency test scores--the NCLB's punishments for not meeting the state's "annual measurable objective" (AMO) are draconian after several years of 'non-compliance'--could regard students whose native language is not English as a 'barrier' to be overcome (whether its students want or need to learn English or not). So your question could be recast as one about the impact of the NCLB on children whose native languages and cultures are different from those of the ideologues who formulated the law.

At one level the effects are already obvious: over 700 Alaska students were denied diplomas based on scores from a test deemed so bad that the state has changed test publishers. Perhaps, with the new oil price windfalls, the governor will eliminate the cuts he earlier made to the state's GED program. To the extent that not having a diploma closes doors on future opportunities, these individual students are screwed. And they are just the beginning: the number of individual students left behind will only cumulate.

In Alaska's case, the NCLB groups one quarter of AK's public K-12 students into one category, "Native Alaskan" (per the broad census 2000 categories). Grouping numerous unique ethno-socio-linguistic cultures into a single group shows that the diversity of these cultures is not even on the radar of the movers and shakers (statistically, what this mono-grouping creates is a situation where between-group differences--expected to be large given Alaska's diversity--wash each other out). At another level, your question could be recast as asking what the impact of the NCLB will be on the continued viability of those languages and cultures ... this year Alaska was named the leader, amongst the states, in the president's "faith based initiatives"--the leader in promulgating a particular religion: Christianity. I remember reading a press release about our "constitutional" right to prayer in the schools on an nclb.gov page. And on Dec. 8, one newspaper reported that the Bush administration urged the U.S. Supreme Court to permit Ten Commandments displays in courthouses. As to my opinion: it's bad now and it will get worse.


totalitarian?

What happens when you ignore reality? It doesn't go away. What do you call a state that ignores reality? A fool: Alaska. What happens when you discuss reality? You make more informed decisions. What do you call a state that faces the issues? A leader: Utah.

Rep. Margaret Dayton, R-Orem, is preparing a bill for the 2004 Legislature, which convenes next month, that would free Utah of NCLB requirements. "I don't want the schools to be accountable to the feds. And the way this is written, our local school boards will be doing clerk work (for them)," Dayton said. "(NCLB) is a very burdensome, highly intrusive bill. I think it's very deflating to the education community." .... "Turning your back on $107 million sounds like a tough option for Utah to face right now," Matheson said. "That's a lot of money, and Utah's education dollars are stretched pretty thin already." Still, State Superintendent of Public Instruction Steve Laing believes the discussion is politically worth having. "I have some serious concerns about No Child Left Behind, and I think a discussion held in a state so overwhelmingly Republican as Utah is would be informative and helpful," he said. "I think it would be very persuasive with the federal congressional delegation and administration" (Deseret Morning News, 12/18/03).

"With Bush facing re-election this year, his No Child Left Behind an administration centerpiece, and Utah being one of the most Republican states in the nation, it was clear the Bush White House was taking no chances with the Utah Legislature. Bush's education officials quickly dispatched top aides, who met Friday with Dayton, other top GOP legislative leaders and state education bosses. There, federal officials said Utah would lose $106 million in NCLB funding if it opted out of the program entirely. Dayton and others then had to weigh whether the money would be worth keeping with so many strings attached. We as the state Legislature are not going to be reduced to clerks" for the federal government, Dayton said .... the federal education officials who visited Utah last week simply laughed .... Utah could wind up the big loser if the Bush administration decides to make an example of the state -- by pulling all its federal education funding, for example -- in order to whip other doubters and recalcitrants into line during this no-holds-barred election season" (Salt Lake Tribune, 2/14/04). Another protesting state is New Hampshire, which recently reduced state funding for testing to $1.00.

Alaska is the leader in obeying the NCLB: the gov and his daughter support it, as does the rest of the congressional delegation. The gov says he has fully funded Alaska education; if you believe that, I'd like to sell you the Brooklyn Bridge. What do you call a government that gives its K-12 data to USED and its senior data to Wells Fargo Bank? An enemy of the people? Pretty damn dumb? What do you call a national law based on the *poof* Houston education miracle, one of numerous Texas frauds so obvious even the NYT noticed? Do the gov and his daughter favor fraud? No, I think them just deluded. After all, Alaska schools have no Bechtel toilets that pump sewage when flushed; we're not at that level of incompetence yet. As usual, the real losers are the students. I am happy for Alaska's pot smokers--that they have a right to privacy under the state Constitution, that they have civil rights: too damn bad the rest of us don't.


Alaska media can be counted on to obfuscate the issues. Regardless of merit--not because they had suddenly come up to previously articulated standards--ALL 17 unapproved state accountability plans, including Alaska's, were approved en masse, on 10 June 2003, the day before a congressional deadline, in a special Rose Garden ceremony. To the left below, read what Education Week on the Web recently said of that Rose Garden event; to the right below, read what the Anchorage Daily News said of the same event.
"Since the plans themselves, and the basis for approving them, are not yet .... available, it's hard to know what to make of it ... we are still negotiating ... said Kentucky ... Iowa's plan was approved, even though the state has no statewide academic-content standards and intends to use off-the-shelf, norm- referenced tests to measure progress ... The department also allowed Nebraska's plan ... [and] Maine's ... and Pennsylvania's [and] Missouri's .... approved on the condition that they were going to do these things" (Olson, 03).

Iowa's "plan" was that it was planning to take the first steps towards standards-based reform--articulating content standards--a step Alaska took years ago. It seems more sensible to say that USED granted general amnesty in a special ceremony than to say it approved the specifics of the 17 individual state "accountability plans."

Yet--in a fantasy worthy of the New York Times--the Anchorage Daily News said of the same ceremony: "Sen. Lisa Murkowski has earned ... applause for her hard work in persuading federal education officials that Alaska presents obstacles .... that make ... impractical stringent implementation of the No Child Left Behind Act. Although you would not know it from most news accounts .... Murkowski fought for a waiver ... demanded ... won her point after tangling with education officials right up until the night before the bill was signed ... in the rose garden" (6/19/03).

Credit is certainly due: it took work to orchestrate Secretary Paige's recent trip to Alaska (it also took work, on the part of Alaska's school board, to craft a quantitative challenge to the NCLB--based on the insight of a recent UA graduate student--and to develop a statistical definition of AYP). Crediting Lisa alone is nonsense.


Alaska is not unique: to look at approval of Alaska's plan in isolation is to miss the forest for the trees. The NCLB has two components, an under-funded accountability part--17 plans for which were approved in the recent Rose Garden ceremony--and a fully funded data collection part, which is little known. Nor are Alaska's schools without merit: one parent who telephoned said that AK was one of the few states whose public schools he'd allow his kids to attend (he was mad about academic time lost to testing, his kids' time wasted).

It is interesting that the PBDMI was renamed EDEN--and interesting to look at the online forms showing the data the states must supply (they take as long to fill out as they did on paper), with spaces to insert this and that aggregate datum: the data entry tables are so convoluted that words like "gender" appear more than a dozen times in a single form. In a digital world--and ignorance of reality is no excuse for a state whose self-image is open source but whose government is a province of Redmond--it is easy for anyone who really wants to intrude into any networked system. In this world, this current reality situation, are you comfortable with forms that aren't transparent, on their face understandable? I can imagine queries whose answers could only be obtained w/ the individually identifiable data--which means the individually identifiable data would be in the state (client) computer's cache and could, even w/o an overt request for said data, be whisked out of the client alaska.state computer to the host USED computer.

Like w/ Patriot, the credit goes to the people of Alaska. And so now comes the responsibility: to not act now is to make a decision. If my local newspaper is to be trusted, the courts--remember, part of the "balance of powers"--have recently ruled that an Alaska citizen's right to privacy under the Alaska Constitution is so great as to trump federal marijuana statutes. If the pot smokers have a right to privacy, what would you say about the little kids who currently have no voice?

Not New: CTB's Spring 2003 tech report is out. Table 39 lists inter-rater reliability coefficients for the 9 constructed-response writing items (subtract the number of multiple-choice (MC) items from the total number of items in Table 1 to get the number of constructed-response items: 35 - 26 = 9). These coefficients range from a low of about 70% to highs of about 85% (remarkably, this table indicates that students did not submit responses--questions were left blank--from 15 to 20% of the time for each of the 9 items). And yet the students of Alaska have no substantive right of appeal or due process.
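For readers unfamiliar with the statistic: percent inter-rater agreement is just the share of papers on which two raters award the same score. A minimal sketch in Python, with made-up scores--CTB's actual figures are the ones in Table 39:

    import numpy as np

    # Made-up scores from two raters scoring the same 10 essays on a
    # 0-4 rubric; the real HSGQE figures are in CTB's Table 39.
    rater1 = np.array([3, 2, 4, 1, 3, 2, 0, 3, 2, 4])
    rater2 = np.array([3, 2, 3, 1, 3, 2, 0, 2, 2, 4])

    exact_agreement = 100 * np.mean(rater1 == rater2)
    print(f"exact agreement: {exact_agreement:.0f}%")  # 80% here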

CTB also published a Spring 2002 HSGQE and Benchmark technical report.

  • The technical report--the only empirical clue as to whether or not the CTB test scores mean squat (CTB also helps Alaska out with its "Terra Novas")--comes out after the present year's test administration (OASIS data would answer questions about the ASD's missing one thousand, like whether the losses are related to HSGQE scores). The HSGQE is a diploma-sanctioning test--real consequences for real students--so it must be established that the test is an adequate measure of the articulated performance objectives (to not do so is to not show common sense ... to not do so in the present context, when LEAs are to be held accountable by those same scores, is folly approaching the sublime).

  • To my surprise, CTB flamed me for summarizing its first two technical reports: Alaska parents, Alaska's test publisher writes me, are not the intended audience of its "contract driven" reports--anyone seen the contract? Although technical issues are involved, Alaska parents, in my experience, understand them when they're explained: they understand that, before drawing inferences from test scores, one must first, rationally, establish what the tests measure; they understand it's unreasonable to expect randomized, double-blind, case-control experimental data from a test publisher. A hypertext page is, though, possible, with links to specifics like test reliability: CTB presents its coefficients alpha and percent inter-rater agreement, validity estimates, and multitrait-multimethod matrices (Campbell & Fiske, 1959)--see the sketch after this list. Should anyone want to find out more about the tests used to--or said to--measure "accountability" in Alaska, s/he should be able to do so: the psychometric characteristics of the tests must be public for accountability to occur. Such a site--a technical statement by the publisher of what's available, in an e-gov format allowing citizen access, not graphic-hog pages that old MB-less machines, which many people still use, cannot handle--is particularly useful in the context of the NCLB, consumptive as it is of existing resources, waxing as Alaska's Constitutional Budget Reserve wanes.
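Coefficient alpha, mentioned above, is also simple to compute once you have an examinee-by-item score matrix. A minimal sketch in Python, with a tiny made-up matrix--CTB's actual alphas are the ones in its technical reports:

    import numpy as np

    def cronbach_alpha(items: np.ndarray) -> float:
        """Coefficient alpha for an (examinees x items) score matrix."""
        k = items.shape[1]
        sum_of_item_vars = items.var(axis=0, ddof=1).sum()
        var_of_totals = items.sum(axis=1).var(ddof=1)
        return (k / (k - 1)) * (1 - sum_of_item_vars / var_of_totals)

    # Tiny made-up example: 5 examinees, 4 items scored 0/1.
    scores = np.array([[1, 1, 1, 0],
                       [1, 0, 1, 1],
                       [0, 0, 1, 0],
                       [1, 1, 1, 1],
                       [0, 0, 0, 0]])
    print(f"alpha = {cronbach_alpha(scores):.2f}")

Note that a high alpha by itself establishes only internal consistency--a reliability estimate--not that the test measures the state's performance objectives, which is the validity question this page keeps raising.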


The local papers and radio call-in shows contain an increasing number of critical if not outright hostile discussions of the underfunded accountability part of the NCLB. Often mentioned are the unintended bad side-effects of the increased testing. There is no discussion of the fully funded--I think its budget even increased in FY04--Performance Based Data Management Initiative. Do you, who distrust the accountability part of the NCLB that you understand so well, trust the fully funded data collection initiative you probably don't understand (people's eyes often glaze over at the mention of computers and data, numbers, matrices)? Especially since a file of individually identifiable Alaska K-12 data would be less than 1.4 MB--in the context of today's common 80 GB hard drives, it would hardly be noticeable. As in every other state in the union, the schools enter the individual student data and send it to a central state repository: the schools do not know what happens to the data after they are submitted to the state. Does the state know? Do you trust what the state says because it says it? To each of my email buds who've remarked on AK's stand on Patriot, I reply that the leg deserves no credit (like any shapeshifter, it easily assumes different postures); the credit belongs to the people of Alaska. Go to google--I use it as an example search engine because it is currently the best--and paste in Performance Based Data Management Initiative--or PBDMI, or its newer name, EDEN (Education Data Exchange Network).


The state noted, in a document that is no longer online, that "In order to initiate an appeal to a student's Fall 2003 Alaska Scale Score, a re-score of a student's test book is required. An appeal to a student's Alaska Scale Score, in one or more of the Reading, Writing and Mathematics content areas will incur processing charges. A handling charge of $54.00 per student, per content area, is required for each appeal and must be submitted with a purchase order. If the re-scored student's performance level changes from NON-PROFICIENT to PROFICIENT, CTB will provide the District with a new student report, at no charge. If the re-scored student's performance level does not change, CTB will invoice the District for the cost of the re-score process. The Individual Student Report is the only report that may be appealed. Summary reports will not be re-generated to accommodate changed student scores." Why Alaskans are allowing their incompetent leaders to railroad their own kids is beyond my understanding. And to give away the data on your own kids boggles any sense of "fair play" I ever understood. Ignorance needn't be a bastion of the last frontier; functionally, it is.

I hear from a reader that CTB is outsourcing exam construction--go to elance.com: who knows who will be doing what to generate test items and test scores and test statistics--for which there is no appeal for most students--reflecting Alaska's "unique" performance objectives (it won't be me, since I've pissed off the brass [and elance doesn't work]). Jeez, how long until we find out that Halliburton is Alaska's new testing contractor? DEEDs has proposed replacing CTB as its testing contractor, putting out an RFP last December. A seven-member state committee will review proposals from five other contractors--all w/ preexisting contracts w/ other "large" (read: more lucrative) states. The abysmal turn-around time is mentioned as one reason for the change, as is--it's only taken 5 years for AK "leaders" to notice--the importance of tests being aligned w/ state standards. Who's going to decide what's a good proposal when no one in AK DEEDs has expertise in--or shows leadership other than in ignoring (let's thank them)--the psychometric issues common to all high-stakes testing? Having a new test publisher will not magically resolve existing problems. Why now? To raise dust, to make it more difficult for people to see what's going on (anyone who says the NCLB doesn't contain provisions written to benefit the test publishers has not read the law). The losers, of course, are the kids. Since your leaders refuse to take responsibility--lead the pack in ignoring reality--who is left? I wish you luck, but I do not think that will suffice.


New: "Security was tight when Texas State Board of Education members were given results ... from ... new ... test. Guards stood outside their locked meeting room, and board members were asked to sign a secrecy pledge ... 'The results were grim' ... [but] Federal officials ... are satisfied lower standards" aren't an early unintended NCLB byproduct (Dillon, 03).

The Single Nationwide Database

Introduction

According to NCLB Sec. 9531, there is a "Prohibition on nationwide database." However, under the not well known, fully funded "Performance-Based Data Management Initiative," such a database is being created. Searches on the PBDMI are turning up state.us pages like the following:

Nebraska districts used to "enter data ... within their districts and once per month export that data on a disk to the central repository in Lincoln." Now they use a "client/server operation ... utilize a web browser to ... enter data along with generating necessary reports" (August 02).

Oregon says "This year ... grant access to kindergarten teachers to enter data ... due at the Department by Nov. 1" (September 02).

A New Jersey NCLB reference manual says "Federal regulations require LEAs to collect and submit data to the NJDOE. The data is compiled and forwarded to the USDOE."

Before the NCLB, Nebraska backed up its student data on floppies and sent them to its capital in Lincoln: now, or soon, the data go from there into a single nationwide database. I'd guess that all the states are extremely pro-active when it comes to the anonymity of student data in their individual home systems--just a hunch (just like I'd guess that the "N" in "NPMIS" stands for "Nebraska," that one of the "M's" in "MMARS" stands for "Minnesota," and that there's an "Alaska" somewhere in Alaska's OASIS numbers: many abbreviations are easy to figure out). My impression is that the single nationwide database has yet to be completed, that there is still time--but not much--to stop, prevent, block the transmission of individually identifiable AK student data to USED for inclusion in it.

It was not helpful that the governor's office "announced the retirement of Shirley Holloway, commissioner of the Department of Education and Early Development, effective March 3. Also Thursday, Susan Stitham, chairwoman of the State Department of Education, said she received a message on her answering machine from Murkowski's office telling her she was relieved from duty and thanking her for time served" (Pesznecker, 2/21/03). It is not helpful to scatter resource people in these NCLB resource-consumptive times.

Some will think this state bashing--but AK and most of the other 49 states were already on the accountability bandwagon before the NCLB became law and made USED the one to whom one is accountable. It's like a joke I heard when I lived in Montana about why North Dakota dogs had stub noses (from chasing parked cars).

E-gov

You can also go online and read--for yourself--USED describing said database in its own words. Check out an email I sent to the gov (the last "cc:" bounced: I'd assumed that, having accepted the seat and the office, she would keep the email address: so much for assumptions). You can paste the blue URLs below into your browser or click them: be sure to read about the "educationadvisor" site belonging to "Evaluation Software," a USED-funded TX data collection company, which mentions variables like "last name" (do a backwards directory crawl and find other HottestTopics like "student IDs"). And don't forget the new USED research office, which espouses "cognitive variables."
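A "backwards directory crawl" just means walking up a URL's path one directory at a time to see what else the server lists there. A minimal sketch in Python, using the educationadvisor.com document URL cited in the email below:

    from urllib.parse import urlsplit, urlunsplit

    # Given one document's URL, generate each parent directory so you
    # can probe what else is listed there.
    def parent_urls(url: str):
        parts = urlsplit(url)
        segments = parts.path.strip("/").split("/")
        for i in range(len(segments) - 1, 0, -1):
            path = "/" + "/".join(segments[:i]) + "/"
            yield urlunsplit((parts.scheme, parts.netloc, path, "", ""))

    doc = "http://www.educationadvisor.com/documents/OCIO2001/SEAMtgNotesDC_06_03_02.doc"
    for u in parent_urls(doc):
        print(u)  # .../documents/OCIO2001/ then .../documents/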

 a) From: claudia <msdata @ srv.net>
Date: Fri Feb 28, 2003  9:09:59 AM America/Anchorage
To: Governor @ gov.state.ak.us
Cc: Lt_governor @ gov.state.ak.us, stevens @ senate.gov, donyoung @ house.gov, 
murkowski @ senate.gov
Subject: a single national USED database containing individually 
identifiable data on every AK K-12 kid

February 28, 2003

Dear Governor Murkowski,

Since DEEDs and the state school board are currently rudderless, I guess that makes you the adult in charge: I write to express my concern about the second and not well known part of the NCLB, the "Performance-Based Data Management Initiative." I speak as a citizen, as someone who regularly conducts and interprets statistical analyses, and as someone who has maintained a web page on the HSGQE since its inception in 1998 (the proud mom of a 33-year-old, I am not part of AK K-12).

The goal of this initiative, which is not mentioned in USED's "NCLB Desktop Reference" (www.ed.gov/admins/lead/account/nclbreference/reference.pdf), is a single national database consisting of individually identifiable data--is last name sufficiently specific? (www.educationadvisor.com/documents/OCIO2001/SEAMtgNotesDC_06_03_02.doc)--on every K-12 kid in the country (www.whitehouse.gov/omb/budget/fy2003/bud13.html; commdocs.house.gov/committees/edu/hedcew5-73.000/hedcew5-73.htm), with data from other federal databases to be merged to it by next year. In one of the preceding URLs, a USED spokesperson disingenuously noted that the states have recognized the value of individual student data for ascertaining student learning: that statement is true insofar as student-level data are the only way to examine student learning (something which Alaska currently does not do).

However, since, amongst other data points, USED vaporized its educational research office (OERI) last November--replacing it with an "Institute of Research" headed by someone who uses scientific-looking graphs to summarize literature reviews--I cannot imagine the intended use of said database being "educational research." It will be, intentionally or not, Orwellian concerns aside, a "hot" commercial property, and, once created, were the NCLB later repealed, it would continue to exist, its data being bought and sold. I suspect you are not a statistics or internet user, Governor Murkowski. You may, however, be a science fiction reader, in which case you'll understand what I mean when I say that said database will follow those in it--Alaska kids and every other kid in the country--throughout their lives like a plague of flies in a William Gibson novel. Individually identifiable student data has always existed, heretofore at a local level: I cannot imagine it being anything but abused at a national level.

Sincerely,

Dr. Claudia Krenz
Box 7050
Nikiski, AK 99635


b) From: Governor <office_of_the_governor @ gov.state.ak.us>
Date: Mon Mar 10, 2003  2:48:24 PM America/Anchorage
To: claudia <msdata @ srv.net>
Subject: Re: a single national USED database containing individually  
identifiabledata on every AK K-12 kid
Received: from gov.state.ak.us ([127.0.0.1]) by jnumail1.state.ak.us 
(Netscape Messaging Server 4.15) with ESMTP id HBK4SP00.JN7; Mon, 10 Mar 
2003 14:48:25 -0900 
Message-Id: <3E6D2448.32F37C5A@gov.state.ak.us>
Organization: Alaska Office of the Governor
X-Mailer: Mozilla 4.79 [en]C-CCK-MCD {SillyDog}  (Windows NT 5.0; U)
X-Accept-Language: en
Mime-Version: 1.0

Thank you for writing to Alaska Governor Frank H. Murkowski. The concerns, opinions, and/or information you have sent is important and valuable to the Governor. Although he is for obvious reasons unable to respond to each and every email himself, your message has been received and is being reviewed by the appropriate staff person in this office who can best address your need, suggestion, or comment.


c) From: claudia <msdata @ srv.net>
Date: Mon Mar 17, 2003  7:42:18 AM America/Anchorage
To: Governor <office_of_the_governor @ gov.state.ak.us>
Subject: Re: a single national USED database containing individually 
identifiable data  on every AK K-12 kid
TO: Office of the Governor, Juneau
FROM: Dr. Claudia Krenz, Nikiski

Since the schools are also now otherwise occupied, I am heartened that "the appropriate staff member" is looking into the creation of a nationwide database under the fully funded "Performance-Based Data Management Initiative" part of the NCLB [the most efficient way I've found to search the new law is to go to google advanced, paste in "legislation," "ESEA," your query term (e.g., "privacy"), and "ed.gov" as domain]. Although the www.ed.gov/policy/elsec/leg/esea02/ pages assert the confidentiality and privacy of student records (and FERPA is specific about the aggregated data), I am concerned, because a) The federal government has an atrocious record of protecting data entrusted to it (USED all the more since it runs on Microsoft products--why not, in these budget-cutting times, switch to the more reliable and considerably more economical Linux?). b) Despite a general prohibition against such a database (SEC. 9531, www.ed.gov/policy/elsec/leg/esea02/pg112.html#sec9531), the URLs listed in my earlier email speak of entering data and creating a database without restrictions like "coordinating migrant education activities" (SEC. 1308, www.ed.gov/policy/elsec/leg/esea02/pg8.html#sec1308). I for one used to be confident in the NAEP's reported assessment results, less so now (SEC. 411, www.ed.gov/legislation/ESEA02/pg97.html). Although a nationwide database could be used for educational research, it could also be otherwise used: imagine, for example, a new entry in your medical record, "inflamm. of R troch. bursa," being copied into a UHaul database in Reno and then your email "inbox" being flooded with independent advertisements purporting "inflamm. of R troch. bursa" cures (inconsequential insofar as blocked by another spam filter). What does "Nothing in this section shall be construed to ... prohibit the distribution of scientifically or medically true or accurate materials" mean (SEC. 9526, www.ed.gov/policy/elsec/leg/esea02/pg112.html#sec9526)? Less banal uses than this example are also imaginable.

I hope that "the appropriate staff member" understands the NCLB's implications with reference to what used to be under local control: information about individual students--because I don't; the little I've observed, however, from reading USED pages discussing creating such a database (without reference to "migrant education") and state pages talking about data entry (again, without restriction) are a concern. Finally, I found: "Not later than April 30, 2003, the Secretary shall report to the Committee on Health, Education, Labor, and Pensions of the Senate and the Committee on Education and the Workforce of the House of Representatives the Secretary's findings and recommendations regarding the maintenance and transfer of health and educational information for migratory students by the States" ("the Secretary" being former Houston school superintendent Rod Paige). Again, I write to express my concern; only someone of the governor's stature can provide the digital leadership needed to prevent AK K-12 data being transferred into an initially single nationwide database housed in USED computers on the east coast.

What do you think?

Were I a modern K-12 parent--although prohibited under Sec. 9531--I'd worry that if it looks like a duck and quacks like a duck, it might be a duck: are you willing to assume the data collection effort is limited to migrant students (Sec. 1308(b), www.ed.gov/policy/elsec/leg/esea02/pg8.html#sec1308)? The search engines--yes! internet as a repository library!!--have, as summarized in this section, served up USED pages talking about creating a database; they have served up state pages talking about entering individually identifiable data about their K-12 kids and sending it--through SSLs (makes no noise and wouldn't, especially in Alaska's case, be that big a file by today's standards)--to USED: You're the parent, the adult in charge.

Did you know that, under SEC. 9506, "a private school [or home school] that does not receive funds or services under this Act ... are to be excluded from assessment" (www.ed.gov/policy/elsec/leg/esea02/pg111.html#sec9506)?

Would you be concerned that, strapped for cash--and its education offices currently rudderless--a State might not even notice? That, search as you might, you find no discussion of the implications of the States talking about sending data to USED, and USED talking about creating a database? I'm not sure which "Ferengi Law of Acquisition" it falls under, but the data warehousing people--who went on to fight for access to individually identifiable medical records--wouldn't you expect them to salivate over a single nationwide public K-12 database? Can you imagine its value to spammers (email address is a variable listed in USED's previously mentioned "data dictionary")? Would you automatically trust an education establishment whose definition of "no child" is an aggregated score, an average? One that accepts anything as an "accountability plan" (e.g., Iowa)? Do you think the likelihood that a State wouldn't notice would be increased by individually identifiable data being one among many digital "accountability" demands? I think such a database's abuse inevitable once it's created.

*Alaska used to favor local control (Hensley, 1981; Foster, 1982). In 1998, the leg, jumping onto the "standards-based reform" bandwagon, passed the Quality Schools Initiative (and I posted the first version of this page).

Ironically, in 1996, AK students outscored students in the home states of the oil execs--whose whining is credited with convincing the leg to pass the initiative: 30% of Alaska's 8th graders scored at or above the proficient level on the venerable National Assessment of Educational Progress (NAEP) math test--compared to only 17% of CA's and 21% of TX's.

*Alaska's performance standards (for writing, reading, and math) are the closest it has to a "gold standard." The courts think that--in the interest of fairness (not to mention common sense)--diploma-sanctioning tests must be aligned with the curriculum (e.g., Debra P. v. Turlington): the burden of proof lies with CTB, AK's test contractor, to so demonstrate. Demonstrating that a test measures what you want it to measure is an essential but, alone, fairly low standard for meaningfulness. CTB doesn't meet it: its Spring 2000, Fall 2000, and Spring 2001 technical reports were so lacking in detail one could not tell whether they're about the "HSGQE" or the "AHSGE" (Alabama's exit exam); one hopes that student test scores are a function of their responses--but, given CTB's quality control problems, such an assumption may be unwarranted. This is a problem of fact that will not disappear merely because it goes unnoticed and unreported (even Hawaii's tests are aligned with its content and performance standards).

*Although the effective date of the HSGQE was postponed, the ship of State nevertheless blindly assigns AK students CTB scores and then aggregates them to assign schools "report cards." Reminds me of the old joke about the physicist, chemist, and economist marooned with a case of canned food but no openers: The physicist started throwing rocks at trajectories that might puncture metal; the chemist started looking for plants that might dissolve metal; and the economist said "Let us pretend the cans are open." Just so, the relationship between Alaska's performance standards and CTB's tests: assuming that the one is measured by the other is not helpful.

*In 2001, the federal government passed the "Leave No Child Behind" law (NCLB), jumping onto the standards bandwagon initiated by the states. The first part of the NCLB is called "accountability," and it has the states busy: USED initially rejected the accountability plans of 45 of the 50 states--including AK (4 Feb 03). States are also required to show "adequate yearly progress" (AYP), as defined by USED, most recently in last December's Federal Register. Since the impact of chance increases monotonically with test unreliability--and CTB has not shown evidence supporting the alignment of its tests with AK performance standards--you'd expect, for reasons having to do with the "sampling distribution of the mean" and the "law of large numbers," half a state's schools not to show AYP just by chance. The title "Randomly Accountable" summarizes the issue: there is increasing anecdotal evidence that schools judged meritorious by a variety of measures--including presidential proclamation--are not showing AYP (Dillon, 03; Winerip, 03). IMHO the states are too busy to notice that, just by chance--whose impact is inversely related to reliability--half their schools aren't going to show AYP no matter how well they're doing.
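Here's a back-of-the-envelope simulation of that chance argument--a sketch only, every number invented: imagine schools whose true proficiency rate sits exactly at the AYP target, so that any one year's shortfall is pure sampling error.

  import random
  random.seed(1)

  TARGET = 0.50     # invented AYP proficiency target
  N_SCHOOLS = 500   # invented number of schools
  N_TESTED = 100    # invented number of students tested per school

  missed = 0
  for _ in range(N_SCHOOLS):
      # every school's true rate equals the target; only chance varies
      passing = sum(random.random() < TARGET for _ in range(N_TESTED))
      if passing / float(N_TESTED) < TARGET:
          missed += 1

  print("%d%% of schools miss AYP by chance alone" % (100 * missed // N_SCHOOLS))

Roughly half miss, though every simulated school is doing exactly as well as asked.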

There is a second and not well known part of the NCLB, the "Performance-Based Data Management Initiative," whose goal is a single nationwide database consisting of individually identifiable data on every public K-12 kid in the country. Stands to reason that those most likely to be directly impacted by the existence of such a database would be the most geographically stable. Glad my kid is out of the current data-collection sweep [her UNIX is better than mine, too, so I know she knows how to stay out of trouble].

*As a parent, I had it easy: my daughter was in public junior high when I started my doc work in educational testing and measurement ... whenever an irrational test popped up--the schools had to get parental permission to administer them back then--I'd give my blanket approval (whatever paperwork was needed) for her to use her time--testing occurs in real time--as she chose. Your kids, though, have it harder, the impact of standards-based reform being cumulative (3 days a year times 6 years: 18 fewer learning days [minus, too, "test prep" time]).

 

 

Nikiski Elementary: A School that Worked

Nikiski Elementary is a school that "worked," still "works," and will likely close, perhaps even for the best of its students (little elementary kids grow into bigger junior high kids). The "No Child Left Behind" (NCLB) law requires reading by 3rd grade: I've personally observed Nikiski Elementary kids reading by Halloween of 1st grade. I overheard in a grocery line that at least part of first grade consisted of reading (Great book you've selected; finished? read to yourself, read aloud, listen to someone reading, start a new book, here's another pass to the library, choose the book you like best, reread it, here's a new book by your author). I'm told that, in second grade, starting to read "chapter books" was perhaps a hotter playground topic than starting to write cursive. I remember walking through a building where all the room doors were kid renditions of book covers (and in town I've seen kids wearing t-shirts so decorated). I've heard people talking about kids looking forward to reading to their principal. Nikiski Elementary is IMHO a community of learners (what else but "learners" to call teachers coming together to brainstorm ways to help a particular kid, "critical thinking" being one of its hallmarks) worthy of note ... Perhaps all Alaska K-6 schools work the same. Nikiski Elementary will not be counted; it will not be left behind; it will be gone, probably less noted and remembered than Abraham Lincoln's words (the school budget here is bleak). But I bet those young bullfrogs will fare well (stands to reason that elementary kids who take pleasure in reading will grow into junior high kids who still enjoy reading--and read better then than they did when they were younger), which is, I believe, called "learning." The NCLB talks about learning, but its accountability provisions don't measure the learning of any child, indeed "no child."

 

 

HSGQE: Spring 2002 Retrospectively

Spring 2002 aggregated HSGQE test scores are now online. This single file consists of a series of alphabetically organized unnumbered tables, school aggregate by school aggregate (measured in situ). Each row (reading, writing, math) has columns for "number and % proficient," "number and % not proficient," "Oct. 1 enrollment," and "Participation rate." Each table has two footnotes: the first saying that the spring 2002 (11 months ago) results are not comparable to those of spring 2001 (23 months ago) because the test was "refocused," and the second that "PR is calculated by dividing the total count of students tested by the October, 2001 enrollment." Some tables have a footnote stating that results aren't reported because reporting them would violate confidentiality (FERPA underscores the importance of preserving student confidentiality; see Nash, 02 for an excellent discussion of these issues). The "participation rate" shown on the third page is surprising: "114.3%"? Do these numbers say anything about Alaska students?
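For what it's worth, here's how a participation rate tops 100% under that formula--the counts below are invented; five midyear arrivals suffice:

  tested = 40          # students sitting the spring test
  oct_enrollment = 35  # students enrolled the previous October 1
  print("PR = %.1f%%" % (100.0 * tested / oct_enrollment))   # -> PR = 114.3%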

Not until an empirical link between the HSGQE and state performance standards has been established. It's possible [that's what I learned how to do in grad school]; it should be part of CTB's technical reports to the state--but its submitted reports are too vague to be useful ... I recently came across an article about CTB "quality control problems" [read and be astonished that CTB's technical reports have gone 100% uncommented upon]: All the more reason not to take HSGQE results on faith.

Until such a link is established let us call the HSGQE "homebrew." These numbers in any case shed no light on individual achievement: they're just aggregated in situ scores, just like that part of the NCLB requiring "accountability" through aggregated school "report cards."

 

 

HSGQE: Development
1998. Alaska State legislature passed a law requiring that, from 2002 onwards, high school seniors pass a minimum competency examination (the HSGQE) to receive a diploma. Most of the other 49 had already done so, Texas for example over a decade earlier.
*The content of this examination is grounded in Alaska's educational vision and goals for its youth, which educators translated into more specific content standards--originally for English/language arts, math, science, geography, history, healthy life skills, arts, world languages, and technology.

Alaska has also developed performance standards in three of these domains--reading, writing, and math--whose final versions were approved by the Board in January 1999.

I don't show them here but, in my web scavenging, I came across numerous DEEDs reviews, meeting minutes, agenda items, reports, and whatnot. It takes a lot of work to develop specifications like these.

It is from the performance standards in these three content domains (reading, writing, and math) that HSGQE items were developed. Thus, although Alaska has articulated content standards for, say, geography, there are no geography items on the HSGQE.

The performance standards in each of the three domains were broken into four age groups:

*5 to 7,
*8 to 10,
*11 to 14, and
*15 to 18 years.

*Alaska planned, from the beginning, to use test results to rank individual schools and districts.1 The school accountability section of the 1998 Alaska Statutes (Sec. 14.03.123) states that

Beginning in August 2002 ... each public school in each district will be given a performance category .... based on multiple student measures including student achievement [my emphasis].

I have yet to locate any information about the nature of these other measures.

In 1998, Alaska budgeted $1.5 million to develop the high school exit exam and $16 million to cover the next year's cost of the new education funding statute (Information Exchange, 25(12), 98). CTB-McGraw Hill was awarded the contract to develop the test (Information Exchange, 24 (21), 98). The proposed test-development schedule was:

1. Item development and selection in the spring of 1998;
2. Field testing during the 1998-1999 school year;
3. Final item pool development during the 1999-2000 school year;
4. The examination administered to approximately 10,000 tenth graders in spring 2000.

CTB/McGraw Hill, of course, develops high school exams for other states.

1999. Initial HSGQE items field tested. During this period, Alaska high school students spent three days taking a draft HSGQE: they were not, however, assigned scores based on their test performance. On each subtest--reading, writing, and math--they answered a combination of multiple-choice and constructed-response (short and long) items. CTB presumably used these Alaska student data to select and refine items for the first official HSGQE--but then, of course, reading, writing and math are the same here as outside.

 

 

HSGQE: Administration

2000. HSGQE administered as diploma sanctioning test. This time Alaska students were assigned scores based on their HSGQE test performance, and those scores were posted on their transcripts. Although CTB has stated that, at any one time, only one form or version of the HSGQE was administered, it's likely that the tests between Spring 2000 and Fall 2001 were somewhat different (as a test publisher, CTB would be expected to use banks of items that had been equated)--and, of course, new test items would be field tested at different administrations.

The reading subtest was based on the 8 performance standards established by Alaskans; the writing subtest, likewise, on its performance standards; and the math subtest on 31, of which 24 were covered on the Spring 2000 test administration.

On its assessment information page, DEEDs lists proposed and actual testing dates from 1999 to 2005 (another posting, for the year 2000, shows how much coordination and thus time is required to administer these tests). CTB scored the tests, no doubt spending less time on the multiple-choice items and more on the HSGQE constructed-response items (check out an overview).

Alaskans set cut scores. CTB rank ordered each HSGQE subtest's individual items by difficulty level (based, presumably, on the Spring 2000 data) and presented these results, in booklet form, to three committees of Alaskans charged with setting the "cut" or lowest passing score--DEEDs has posted a description of the process it presented to the Legislature. One Alaska parent, who sat on the HSGQE math cut-score-setting subcommittee, took notes (Weiss, 00):

The subcommittee contained few math teachers or people with math backgrounds: its members first took the HSGQE and were then given answer sheets and scoring rubrics. They were next given a booklet of the rank ordered math items and asked to independently draw a line, within their booklet, at the place where they thought the cut score should be, i.e., where they no longer found items they thought it reasonable to expect Alaska seniors to know. These first-round results were discussed, followed by two more rounds of individual bookmarking and discussion; as typically occurs, cut scores converged over the three rounds. Some subcommittee members disagreed with the difficulty ordering presented in the booklet, noting that some items at the hard end of the spectrum seemed easy and some easy ones, hard--concluding that the rank ordering in the booklet was likely due to its being based on scores from "sophomores who hadn't taken a lot of the math courses that are covered in the test itself."
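For readers unfamiliar with the procedure, here is a stripped-down sketch of the bookmark idea: rank items by empirical difficulty, place a bookmark, read off a raw cut score. The item p-values below are invented.

  # item id -> proportion of field-test examinees answering correctly (invented)
  p_values = {"A": 0.91, "B": 0.84, "C": 0.77, "D": 0.63,
              "E": 0.52, "F": 0.38, "G": 0.29, "H": 0.17}

  # order from easiest to hardest, as in the booklets handed to panelists
  booklet = sorted(p_values, key=p_values.get, reverse=True)
  print("Booklet order:", booklet)

  bookmark = 5   # a panelist judges the first 5 items fair to expect of seniors
  print("Implied raw cut score: %d of %d items" % (bookmark, len(booklet)))
  # Note the subcommittee's complaint: if the p-values come from sophomores who
  # hadn't yet taken the relevant courses, the ordering itself can be wrong.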

Cut scores adjusted to account for the "Standard Error of Measurement." The SEM is a mathematical expression of error: Tests are not perfect, and taking that fact into account is just common sense. Think of an individual student's score as being composed of two parts, truth and error. In the context of the HSGQE, the former refers to the student's mastery of Alaska's performance standards and the latter, to anything influencing the student's score that's not related to his/her mastery.

Some error is systematic--often called "bias:" Wouldn't you expect students who had had the opportunity to learn the material covered by the test to do better than those who hadn't? Some error is random: wouldn't you expect that a student whose cat hadn't died the day before the test would do better than a student of equal ability whose cat had? The SEM is a way of statistically quantifying that error. Using a computationally simplified example, suppose the cut score for a test was set at 355 and its SEM calculated to be 5, that you got a score of 360, someone else got a score of 350, and I, a score of 340. Under classical test theory, it is expected that--were the test taken an infinite number of times--your true score would fall in the interval 360 ± 5, the other student's, in the interval 350 ± 5, and mine, 340 ± 5. There'd be no question that you passed and I flunked, but what about that student whose true score is expected to occur in the same interval as the cut score? Should the inherent error in every test be taken into account when deciding who fails? Would you be making distinctions an eighth of an inch wide if the markings on your ruler were a quarter-inch thick? No, I wouldn't either. Adjusting the cut score to take measurement error into account makes sense.
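The same example in code--a sketch; the one rule implemented (fail outright only when the whole ± 1 SEM band lies below the cut) is a sensible choice, not necessarily the adjustment Alaska used:

  CUT, SEM = 355, 5   # the cut score and SEM from the example above

  def classify(observed, cut=CUT, sem=SEM):
      if observed >= cut:
          return "pass"
      if observed + sem >= cut:
          return "band reaches the cut: benefit of the doubt"
      return "fail: even observed + SEM falls short"

  for score in (360, 350, 340):
      print(score, "->", classify(score))
  # 360 -> pass; 350 -> benefit of the doubt; 340 -> fail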

Taking error into account is not "dumbing down" the test: Assuming that student test scores perfectly reflect student achievement is, though, dumb.

Expect error to creep into the estimation of true scores from the most irrelevant of differences: suppose that one school refers to the amount that remains after one quantity is subtracted from another as the "remainder" and another school as the "difference;" suppose also that a state cares only whether students can derive the correct answer when subtracting one number from another--not what that answer is labeled; suppose now that "difference" is the word used in the state's math subtest ... all things being equal, wouldn't you expect students from the first school to have a greater chance of being confused by the wording of the test questions than those in the second school--just because their subtraction lessons had used a synonym of the word actually used on the test ("adjectives" and "modifiers" offering another illustration)? Factors influencing obtained scores--whether individual or aggregated at the school level--that are irrelevant to a state's performance objectives add error to those scores. Without as much as looking out the window, one can conclude that error's an inevitable component of test scores--and that its existence should be taken into account when interpreting them.

Expect error even in quantitative measures of error: a test's SEM is, after all, computed from scores aggregated over all students. Would a particular group's SEM be different from the overall SEM? This is an answerable question in the sense that the SEM can be computationally estimated--and reasonably so when error is evenly distributed--by multiplying the standard deviation of the group's individual test scores by the square root of the quantity (1 minus its test reliability coefficient). I'd want to consider that, plus what the data suggest about a test's reliability and validity, before interpreting results obtained by administering it--and to examine the variability in group scores before interpreting a measure of central tendency like a mean or median.
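In code, that computational estimate is one line; both inputs below are invented:

  import math

  def sem(sd, reliability):
      # standard deviation times the square root of (1 - reliability)
      return sd * math.sqrt(1.0 - reliability)

  print("overall:  %.1f" % sem(30.0, 0.90))   # -> 9.5
  print("subgroup: %.1f" % sem(45.0, 0.90))   # -> 14.2: same reliability,
                                              #    wider spread, bigger SEM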

2001. Alaska State Legislature postponed HSGQE's effective date from 2002 to 2004. Alaska sophomores taking it the previous Spring thought--when they were taking it--that they had to pass it to get their diplomas in 2002: they didn't know that the legislature was going to grandfather them--and students slated to graduate in 2003--out.

Some students in that initial cohort dropped out of high school, convinced by their HSGQE scores that they couldn't make the grade. I don't think anyone knows how many.

Although students graduating before 2004 were no longer required to pass the HSGQE, they still had to take it: Those who did not pass continued to take it, and results were posted on their diplomas and transcripts (Information Exchange, 29 (8), 29 Mar 01). By current Alaska law, students completing high school are now required to pass the HSGQE to receive a high school "diploma;" those not doing so are to receive a "certificate of achievement" (initially called a "certificate of attendance").

The State Legislature also somewhat changed the emphasis of HSGQEs to be administered from Spring 2002 onwards (not surprisingly, the biggest changes were in the math subtest). DEEDs has posted practice tests for the current HSGQE [and the individual benchmarks: reading (grades 3 and 6), writing (grades 3 and 6), math (grades 3 and 6), and all three (grade 8)]. DEEDs has also posted a discussion of accommodations that could be expected (in particular, guidelines for participation of special education and LEP students and a form for petitioning a change in testing location). Also online were lists of Quality Schools contacts; Alaska teacher, administrator, and school standards; DEEDs' Statistics Home Page (which includes many reports, among them educational "report cards" for the pre-HSGQE years, concluding with 1998-99); and School Designator Committee minutes (Feb 22-23, April 13-14, Oct 25-26 2000; Jan 16-17, March 28-29 2001). The School Designator Committee is charged with labeling Alaska's schools, based at least in part on HSGQE and benchmark test results.

2002. New version of HSGQE administered. "On the math test, geometry [is] worth 10 percent ... instead of 21 percent" (Peninsula Clarion, 27 Sept 01). Note that the original 31 math performance standards were collapsed into 6 broad categories.

Since test emphasis changed, Alaskans needed to set new cut scores (Information Exchange, 30(15), 19 Jun 02) and CTB, to analyze the resulting data. DEEDs is currently developing a new division of accountability and assessment (out of current staff) and has requested more monies to meet its assessment mandates: Alaska's assessment contract with CTB "increased by $498,900 for FY 2002 and by an additional $770,000 for FY 2003" (Information Exchange, 30(10), 26 Apr 02).

Alaska State Legislature postponed labeling of individual schools from 2002 to 2004. Since the School Designator Committee thought the available basis for labeling so incomplete that it did not want to see the labels applied, this seems a sound move (Pesznecker, 02).

 

 

HSGQEs Spring 2000 to Fall 2001: Student performance

Student passing rates aggregated 2 statewide are available for Spring and Fall 2000 and Spring 2001. Links to other score breakdowns--by district, ethnicity, and gender--are posted under "Assessment Results" on DEEDs' Assessment Site Map.

What do these scores mean? There are many possible explanations. High failure rates--certainly the case for the HSGQE math subtest--would be expected if, for example, students had not had the opportunity to learn the material covered by the test.

Similarly, high failure rates within a particular group would be expected if test items were biased against it, purposefully or accidentally. Or if its testing place was flooded on testing day or if it was in the middle of a chickenpox outbreak or any of the preceding. But why bother trying to interpret test-score differences without first learning the degree to which the test is a reasonable measure of what you want measured? Is the HSGQE "reliable" and "valid"? To rule out competing and equally compelling explanations of student performance, the HSGQE must be shown to be a reliable and valid measure of Alaska's articulated performance objectives.

 

 

HSGQEs Spring 2000 to Fall 2001: What do the test scores mean?

I've looked, online and off, and found little supporting the reliability and validity of the HSGQE. The publicly available evidence suggests rather the opposite. The data presented in CTB's two technical reports--covering the Spring 2000, Fall 2000, and Spring 2001 test administrations--are not promising. CTB, for example, presents its opportunities-to-teach and to-learn surveys in support of the HSGQE's curricular validity, but their data are difficult to interpret both because specifics like survey dates and response rates are not stated and because they're not organized by the same performance standards. 3

RELIABILITY

Statistical data about the reliability of the HSGQE show that the percentage agreement rate between the first pair of CTB employees scoring each constructed response, Spring 2000, ranged from a high of 75% to a low of 40% and, Fall 2000, from a high of 50% to a low of 20% (Table 17 in the 1st and Table 39 in the 2nd CTB report). These low rates of inter-rater reliability (the reports say nothing about how scorer disagreements were resolved) are a concern, because about a fourth of the total reading, a fifth of the math, and an eighth of the writing items were constructed responses. Inter-rater reliability is, however, but one form of reliability.
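For concreteness, percentage agreement is just this; the rubric scores below are invented:

  rater1 = [3, 2, 4, 1, 3, 2, 2, 4, 3, 1]   # first reader's rubric scores
  rater2 = [3, 2, 3, 1, 2, 2, 1, 4, 3, 2]   # second reader's, same ten papers

  agree = sum(a == b for a, b in zip(rater1, rater2))
  print("exact agreement: %d%%" % (100 * agree // len(rater1)))   # -> 60%
  # At 40% (let alone 20%), the two readers usually did NOT assign the same score.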

The HSGQEs' coefficients alpha--which show how well a test hangs together--are close to or greater than .9 (shown in Table 10 in the 1st and Tables 6 and 36 in the 2nd), 1.0 being perfect. Alphas for the writing subtest were a tad lower than those for reading and math across all three test administrations.

Formally speaking, a test's validity coefficient cannot be larger than the square root of its reliability coefficient. CTB's technical reports thus suggest a validity ceiling somewhere between the square roots of the HSGQE's inter-rater and alpha reliability coefficients.
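Both coefficients are computable from standard formulas. Here is a toy calculation of alpha, and of the ceiling sqrt(alpha), on an invented--and absurdly small--examinee-by-item matrix:

  import math

  scores = [            # rows = examinees, columns = items (1 right, 0 wrong)
      [1, 1, 1, 0],
      [1, 0, 1, 1],
      [0, 0, 1, 0],
      [1, 1, 1, 1],
      [0, 0, 0, 0],
  ]

  def var(xs):          # sample variance
      m = sum(xs) / float(len(xs))
      return sum((x - m) ** 2 for x in xs) / (len(xs) - 1)

  k = len(scores[0])
  item_vars = [var([row[j] for row in scores]) for j in range(k)]
  total_var = var([sum(row) for row in scores])

  alpha = (k / (k - 1.0)) * (1.0 - sum(item_vars) / total_var)
  print("alpha = %.2f; validity ceiling = %.2f" % (alpha, math.sqrt(alpha)))
  # -> alpha = 0.79; validity ceiling = 0.89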

VALIDITY

Another study available online 4 raises questions about the HSGQE's construct and predictive validity (based on year 2000 Anchorage data). This study compared nationally normed CAT percentiles with HSGQE cut scores: the corresponding CAT percentiles were the 25th, 60th and 91st for reading, writing, and math. The authors reasoned that, in a national sample of sophomores,

about 75% would have passed the state's reading test, about 40% would have passed the state's writing test, and about 9% would have passed the state's math test.

A minimum competency exam would not be easy on reading, so-so on writing, and super hard on math. At the very least, this study calls into question the HSGQE administered Spring 2000.
 

  In summary, in 1998 the Alaska State legislature passed a law requiring that Alaska students pass a qualifying exam to receive a diploma. In 1999 students took a draft HSGQE but were not assigned scores. In 2000 the HSGQE was first administered as a diploma sanctioning test. In 2001 the legislature changed the HSGQE's effective date to 2004. In 2002 a new version of the HSGQE was administered and the legislature postponed the labeling of individual schools until 2004.

Some argue that labeling is a bad idea to begin with: Schools that all agree are in trouble will be further burdened by being labeled "Deficient" and "In Crisis" (McCoy, 02).

Go to the top of this page and search on the word housing to get a taste of what's to come (and has already happened outside).

Another problem is that a school's test results are confounded with its demographic characteristics like Title 1 status (Figlio, 02): if not, why not "take the teachers and staff ... of the most successful school ... and exchange them with ... the least successful school" and then later retest (Berkowitz, 02)?

It costs money, and it takes time to take and administer the HSGQE. Yet the preponderance of publicly available evidence suggests that the initial HSGQE was not a reliable and valid test. Assigning students scores according to their performance on an unreliable test is conceptually similar to assigning them scores based on the spins of roulette wheels ... It does not seem inappropriate to advocate a new civil liberty: freedom from irrational testing.

  What do the numbers being posted on their transcripts mean?

  • If they're from the first three HSGQE administrations--Spring 2000, Fall 2000, Spring 2001--the available data suggest the numbers do not mean much: it's illogical to think an unreliable test a good measure of any kind of academic performance. One Alaska parent commented on the actual lessons being taught by the HSGQE (Spangler, 00). It has been sad--I posted the first version of this page in 1998--to watch Alaska repeat the mistakes earlier made outside. If not "world class" standards and students, why not "world class" tests?

  • If they're from the fourth or fifth administrations--Fall 2001 and Spring 2002--we don't know: I at least have found nothing about the Fall 2001 administration. And it has yet to be determined who passed the Spring 2002 HSGQE.

 

 

 

Where are we now?

One thing we do know is that the HSGQE isn't going to go away, given the new Federal Leave No Child Behind law. 6 Alaska already meets many NCLB requirements: It has reading and math tests for grades 3 and 8. It has already articulated content standards for science--and will need to articulate performance objectives (as it has already had experience doing by setting them for reading, writing, and math) and test them. DEEDs has been submitting "report cards" to the public for years, and the School Designator Committee is already developing a labeling system for schools. DEEDs has already added National Assessment of Educational Progress (NAEP) links to its Assessment web site. Alaska will also need to define basic, proficient, and advanced levels for the three subjects on both benchmark tests (as it has already had experience doing by setting cut scores for the HSGQE).

For now, proficiency at the 4th, 5th, 7th, and 9th grades has been defined (by whom I know not): at least 61% of a school's students must correctly answer at least half the questions asked on CTB's off-the-shelf tests, the CAT and Terra Nova (Pesznecker, 02b). Schools falling short are labeled as having made "no progress" and publicly identified, 50 so far. Results from the 90 schools with fewer than 11 students are not reported, to maintain student confidentiality.
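As a sketch, that rule classifies a school like so (student counts invented; a 40-question test assumed):

  N_QUESTIONS = 40   # assumed test length, for illustration only

  def progresses(correct_counts, threshold=0.61):
      # "proficient" = answered at least half the questions correctly
      proficient = sum(c >= N_QUESTIONS / 2.0 for c in correct_counts)
      return proficient / float(len(correct_counts)) >= threshold

  school_a = [25, 30, 18, 22, 35, 19, 28, 21, 24, 17]   # 7 of 10 at >= 20
  school_b = [25, 30, 18, 22, 35, 19, 14, 16, 12, 17]   # 4 of 10 at >= 20
  print(progresses(school_a), progresses(school_b))     # -> True False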

It is assumed that Alaska will meet all NCLB reporting requirements.

Like the other 49, Alaska will send the U.S. Department of Education blocks of text and links--in whatever the specified e-format--which will be pasted between chunks of USED text-and-link boilerplate to create 50 annually updated web pages available at the nclb.gov web site.

Like the other 49, Alaska will struggle with problems like a lack of viable alternatives: whether you live in an area where there is no other school in your "district" or in a city where there are none with empty seats, you have no choice. Tutoring, however, is currently being provided in many areas by "'supplemental' educational service providers."

It is to be hoped that Alaska will accord the numbers required by the Feds the attention they merit--and that Alaska will do what's needed to make sure that student test results accurately reflect the state of education in Alaska: Tests should be shown to be aligned with the curriculum--and to be reliable and valid measures of it [nothing's perfect, but the questions at least should be asked]. It is also to be hoped--when addressing questions like whether a test item is biased for or against a particular group--that Alaska will use ethnicity categorizations descriptive of its student population instead of the generic printed ones from the Census 2000 form.

In its only two publicly available technical documents about the HSGQE and benchmarks, CTB divides Alaska's 134,391 K-12 public students into five categories: "African-American, Asian-Pacific Islanders, Hispanic, Alaskan Native, and White." This does not make sense, because the corresponding percentages are 4.5, 5.2, 3.2, 24.9, and 62.2 (NAEP state profile, 00). Grouping culturally and linguistically diverse groups--that constitute a quarter of Alaska's students--into a single "Alaskan Native" category doesn't make sense statistically either: within-group diversity would be expected to increase statistical variability, which would make detecting biased items more difficult.

Finally, it is to be hoped that Alaska will look at passing rates at the level of individual performance standards--seems more useful to say "good on a/work on b" than "you passed/flunked."


 

 

TAAS: Constructing an exit exam

I overview some of the issues involved in constructing an exit examination like Alaska's with links to another test, the Texas Assessment of Academic Skills (TAAS), which, incidentally, is made by the Psychological Corporation. I do this because Texas has required a high school exit examination since 1990. I describe the steps in developing its test only to overview the process--most definitely not to suggest that Texas sets an example that Alaska should emulate.

Not that Texas would agree: "In our perspective, Texas is the model of the nation for standards and accountability," said Linda Edwards, a spokesperson for Governor Bush (Houston Press, 98), who has been a strong TAAS supporter for at least several years (Houston Chronicle, 15 Oct 97).

Texas uses results to distinguish between 1) students above the minimum competency level and those below it and 2) "low" and "high" performing schools. Texas' current officials treat these aggregated scores as "report cards" for the different teachers, schools, and districts--and encourage Texas citizens to do likewise.

Exit exams are not inexpensive: Texas's budget included $47 million for yearly costs like developing new test questions and for printing, grading, and distributing the TAAS; $55 million for cash bonuses--$750 each for educators whose aggregated TAAS scores were in the top quartile--and the rest for developing performance standards in additional content areas and testing more students (Brooks, P.A., Austin American-Statesman, 13 Dec 98).

The test development process begins with the articulation of specific goals. Those goals form a "blueprint" from which the individual items on the test are constructed: If one cares only about apples, then questions about oranges are not appropriate.
After an exam question, i.e., test item, is developed, it's reviewed: is it relevant to the State's stated goals? is it unbiased? These questions must be answered whatever the response format of the item. Items that "pass" initial muster are combined to form tests which are then administered to students. New items are continually being developed because multiple versions of the test are necessary. These items are "tested" by including them in student test booklets but not in the calculation of student scores.

Although I found nothing stating that there would be multiple versions of the HSGQE, I can't imagine that there wouldn't be, particularly from year to year.

Naturally, of course, if you've got different tests, you've got to equate them: It wouldn't do for Johnny to get the "easy" version and Suzy the "hard" one.
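The simplest equating method, for the curious, is linear: give a form X score the form Y score occupying the same standardized position. The two score samples below are invented; operational equating designs are far more elaborate.

  def mean(xs):
      return sum(xs) / float(len(xs))

  def stdev(xs):
      m = mean(xs)
      return (sum((x - m) ** 2 for x in xs) / (len(xs) - 1)) ** 0.5

  form_x = [38, 42, 47, 51, 55, 60, 63, 68]   # scores on the "hard" form
  form_y = [45, 49, 53, 58, 61, 66, 70, 74]   # scores on the "easy" form

  def equate(x):
      # the same z-score on form X maps to this form Y score
      return mean(form_y) + stdev(form_y) * (x - mean(form_x)) / stdev(form_x)

  print("%.1f" % equate(53))   # a 53 on the hard form ~ a 59.5 on the easy one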

Using reasonable tests does not solve all problems. It is also essential that tests be sufficiently reliable and valid. If, for example, a written essay is to be scored, will different judges grade it the same? Does writing an essay have anything to do with the state's goal of increasing literacy?

The preceding are technical questions that must be answered empirically. If the answer to any of these questions is not affirmative, then the test is garbage--and, as we know, garbage in, garbage out.

Even mundane details of test administration cannot be left to chance. It won't do to have some students get 45 minutes to take the test while others get however much time they want. Nor would it be acceptable if one student's test were lost or confused with another's: a lot of paper is generated when every graduating senior in a state takes a test (wonder how it'd work out in pounds): someone has to keep it organized. Obviously, test administration includes (or should) quality assurance measures. What happens after the tests are administered is of equal concern. Take a recent example from New York State, where 8,000 students were mistakenly sent to summer school and 3,500 held back because of incorrectly reported scores (Saltpeter, J. & Foster, K., TechLearning, n.d.).

Another decision to be made somewhere along the line is what's a "passing" score? Cut scores must be set because an exit examination is a "minimum competency" examination. The State does not care how George did relative to Dmitri--only how well George did relative to the cut score--which itself is usually determined somewhat arbitrarily: well, let's see, how about, say, 70%?...no, too low...75%...Great!

The State also needs to determine what to report to students and the public. How about test items themselves? What about the raw data? I'm still looking for last year's HSGQE data (with, of course, all identifying information removed). No one has refused my request--but you know how email tag works. Texas, on the other hand, has a page to welcome researchers (if you're going to have to make something public, you might as well do it gracefully).

In Texas, actual scores are reported to students, then aggregated over classrooms, schools, and districts and made freely available to all. If memory serves, Texas did not release actual test items until a sufficient number of people were sufficiently annoyed; currently, they release the previous year's items (indeed, you can download all of the 1999 TAAS tests--and read discussions of their accuracy).

And for a while, thanks to the Texas Business and Education Alliance, you could take a version of the TAAS online. I took the test and show, below, a question I was asked.

  3. A teacher took 6.25 gallons of water on a field trip for her 25
   students and 2 chaperones to drink. By noon, the students drank
   2.75 gallons of water. How much water was left for the remainder
   of the field trip for the students and chaperones to drink?

  A. 4.5 gallons
  B. 3.5 gallons
  C. 9 gallons
  D. 3.75 gallons
  E. Not Here

Much to my astonishment, "A" was scored as the "correct" answer. Just goes to emphasize that the devil is always in the details.

Keep in mind, too, that as the stakes increase so does the pressure. No one, of course, encourages anyone else to cheat. The pressure to do well on exit exams like the TAAS has though ostensibly led some school administrators to do just that:
My personal favorite is the one where there were 5 times more erasures on one school's score sheets than on others'--where 90% of those erasures were from changing incorrect into correct answers--and where "follow-up investigation of older tests by a forensic scientist showed tests had been tampered with for years" (Bushweller, K., American School Board Journal, 97). The school with the erasures was in Maryland, and the test was the Iowa Test of Basic Skills. Other states also have their share of test tamperers--among them, Florida (Miami-Herald, 6 Nov 98) and Texas (Philadelphia Inquirer, 7 April 99).

In summary, the basis for any exit exam is goals. What do you think is important to test? The ongoing translation of goals into test items is by no means a straightforward process. Many technical criteria must be met before we can conclude that the test means anything at all.  If the test is not technically adequate or if it is improperly administered, then results are garbage.

 

 

TAAS: Living with an exit exam

Texans do not agree that their exit exam is a valuable part of their educational system:
*State officials are solidly behind the TAAS.

The Texas Commissioner of Education thinks the state of education in the State of Texas is good and wants to expand the use of the TAAS (Press Release 98).

Indeed, part of his recent proposal included monetary bonuses for teachers with high scoring students.

State officials were also quick to defend the TAAS against bias charges brought by the Mexican-American Legal Defense and Education Fund (Amarillo Globe-News, 24 Oct 97).

Keep in mind that what we're talking about here is elected officials. The 1998 Texas Democratic Party Platform said to "Use the TAAS test as one measuring stick of success, not the sole factor in determining whether a child passes from one grade to the next, and rely on the classroom teacher's judgment of the child's overall performance to determine whether a child advances to the next grade ...." But then the Democrats didn't get elected.

*The Texas Public Policy Foundation (98) thinks Texas education is going down the tubes, pointing out that

*more and more children are being classified as special ed and thus exempt from taking the TAAS.

*while Texas students' TAAS scores are increasing, their SAT scores are decreasing--even as the SAT itself is becoming easier.

Others note that the TAAS itself is focused at too low a level. Examples of old TAAS items like

At a restaurant Steve ordered food totaling $6.85. If he paid with a $20 bill, how much change should he receive?

sure seem more akin to 6th grade. Maybe better items are what Texas ex-Governor Bush meant when he said he wanted a more rigorous TAAS (Stutz, T., Dallas Morning News, 5 Dec 98). Ironically, the questions on a high school math exam parody are more challenging.

Texans do agree that Texas education orbits around the TAAS.

  Conceptually, it was intended to do so.

The TAAS was instituted "in order to provide parents and teachers information about the progress of students on specific objectives; to help identify both outstanding and under performing schools; to provide accountability to taxpayers about the money spent on public education; and to restore the value and meaning of a high school diploma" (TFT Legislative Hotline, 27 May 98).

  If you're a student, you've got to take the test. If you're an educator, your career will be affected by the TAAS:

"Teacher evaluations are linked in part to their school's overall performance. Superintendents are largely judged by their districts' scores" (Melendez, M., Fort Worth Star-Telegram, 98).

It's a numbers game, and concern over TAAS performance dominates numerous educators' discussions: "That is a problem when you're looking at taking "at risk" students first. If you don't think one student makes a difference, go look at Hurst Junior High School where one student made them a low-performing campus one year...It only takes one student to get you there." [my emphasis] (quoted in a report from the Texas Public Policy Institute). Tsk, pesky students.

  Texas itself is plastered with TAAS "stuff" (for lack of a better word).

Don't be surprised to see a list of local schools and their "grades" in the housing section of the classifieds in the Austin American-Statesman or a searchable TAAS database in the Dallas Morning News. TAAS scores influence property values. There's even a cottage industry that's grown up around the TAAS--TAAS preparation materials: AMSCO School Publications and Sleek Software Corporation (with its amazing TAAS tutor) would like to help you prepare for the TAAS (just use your Visa), as would their competitors. There are probably even ads on radio and TV, in magazines and newspapers, maybe on billboards and lamp posts, too.

The orbit around the TAAS is faster for Texas schools with low TAAS scores.

As noted in the Nacogdoches High School Dragon Echo, attempts to raise TAAS scores included both

  • once a week TAAS drills--with disciplinary action taken against students who refused or turned in blank tests (Cannaday, L. & Haynie, M., Dec 97)

  • incentives--ranging from free cokes and "bonus points" (redeemable for credit on regular classroom assignments) to used cars (Hetrick, M., Feb 98).
The curriculum then is reduced to little more than test content. I may be way off base here, but I do wonder whether the reason Texas has the lowest high school graduation rate in the nation is that it bores its students out of school.

Texas teachers are encouraged to "teach to the test."
Texas teachers are expected to organize daily instruction around the TAAS (called "teaching to the test"). The State Commissioner of Public Instruction advocates doing so. Other Texans, though, are diametrically opposed: In a letter to the editor of the Star-Telegram, P. Burkham forcefully restated a point raised for several years (Amarillo Today, 25 July 97):

More than three decades in public school, including several years of dealing firsthand with the TAAS, showed me that timid leaders fearfully follow the "teaching-to-the-test" bit. This is not educating a child. It is appeasing state leaders who don't know which end of the classroom to stand in (Fort Worth Star-Telegram, 16 Feb 99).


In summary, although Texans don't agree about the quality of a Texas education, they do agree that the K-12 system spins around the TAAS. Schools offer students a variety of incentives to "do well" on the TAAS, and teachers are encouraged to "teach to the test," a practice lauded by some and decried by others.


Well, all that's for the Texans to wrestle out. It's their state. And the TAAS is currently a state test.

 

 

Lessons from other states

There are numerous guides and summaries of other states' assessment activities, including a list of which states have exit exams and which don't. Also available are summaries of the whats and hows of assessment in Alaska and the other states in 1997.

If you're interested in exit examinations in general, educational reform, and their consequences from a U.S. national perspective, there's a lot of info out on the web.

From a historical perspective, you may want to check out an 1880 Illinois teacher's exam.

You may also want to read what contemporary educators are saying about exit examinations in their professional discussion groups and journals. You can also consult Web Resources for Educational Outcomes (99), the Northwest Regional Lab's catalog of assessment models (99). The National Assessment of Educational Progress (NAEP) provides the most accurate assessment of the state of education in the U.S.

It won't take much reading for you to discover that there's a war out there. The war is being fought on many fronts, among them:

  • Curriculum. At present, one of the hottest battles is over mathematics. Do take a look at the Mathematically Correct Home Page, an outgrowth of California parents protesting the "new new math" approach, the protesting parents themselves being an outgrowth of California's jumping onto the "new new math" bandwagon and onward into the achievement cellar. Another extensive site is Alexander Hu's Educational Deform Page, complete with his listing of educational rebels by state.

  • The new Leave No Child Behind Act ... which is, not surprisingly, based on the Texas model (the current Secretary of Education was the superintendent of a Texas school district before assuming his post). Should there be such a beast? At the very least, the costs of constructing and administering it will be subtracted from instructional dollars (also true for state exit exams). It's interesting that, in 1997, when the testing wing of the National Research Council first convened to discuss a national test, their focus was as much on minimizing the potential risks and unintended consequences of the testing as it was on maximizing its intended benefits. One of those unintended consequences would be possible homogenization of the curriculum across the country. And, beyond that, whose  curriculum anyway? A U.S. Federal site lauded a newly developed national math test, while others condemned it as worse than no test at all (Clopton, P., 98), i.e., the mathematically but not politically correct people protested the "new new math" emphasis of the test. Another unintended consequence occurred when Texas first implemented its exam, back in the 1990s: the high school drop out rate increased (Haney, 01), not a positive outcome.

  • How test scores are computed and used. An Educational Testing Service essay on testing in the schools (too much of the wrong kind) makes a particularly insightful point (Barton, P.B., 95): since the goal is to educate the individual child, the appropriate measure is knowledge gain in that  individual child, not everyone's scores pooled across classrooms, grades, schools, and districts. Would we be surprised to find both exceptionally high and exceptionally low scoring students at both affluent and impoverished schools? No. Suppose that the high scoring students learned nothing while the low scoring students learned a lot? A stew of aggregated scores wouldn't show us this. Would we be surprised to find more high scoring students in affluent than in impoverished schools? No. Aggregated scores simply reflect this and thus are merely tautological. In both instances aggregated scores tell us nothing about how our schools are doing. From a statistical perspective, I'm sure you're curious to know, this means that we need to use a within-subjects design to analyze the data.
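A toy illustration of that last point--all scores invented: the cross-sectional average is flat while every child still enrolled gained.

  year1 = {"Ana": 30, "Ben": 45, "Cal": 60, "Dee": 85}
  year2 = {"Ana": 50, "Ben": 60, "Cal": 70, "Eve": 40}  # Dee left, Eve arrived

  # cross-sectional view: this year's school average vs. last year's
  print(sum(year1.values()) / float(len(year1)),
        sum(year2.values()) / float(len(year2)))   # -> 55.0 55.0: "no change"

  # within-subjects view: follow the same children
  stayed = set(year1) & set(year2)
  print(dict((kid, year2[kid] - year1[kid]) for kid in stayed))
  # -> every continuing child gained 10 to 20 points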

It's long past time to Stand and Deliver. Jaime Escalante, where are you now?

Finally, if you want a laugh--I sure need one--read some of the reasons given for the very poor performance of U.S. students compared to those in other countries: one "...major critique—actually more like a dose of Prozac—said the country is doing fine so how could the schools have a problem? 'Low Scores are No Disgrace' soothed one. 'Stupid Students, Smart Economy?' asked another."

 

 

Pre-HSGQE: What we might have learned from other states

First, what about students who pass the exit exam in the 10th grade?

Will they endure so many drills that they'll be bored out of school? Why will a single measure, the HSGQE, determine whether students receive certificates of achievement or high school diplomas while schools will be judged by both the HSGQE and other measures (whatever they may be)?

Second, what about "teaching to the test"? Conceptually, it's an idea with a lot going for it: what could be more reasonable than teaching and testing on the content that you expect to be learned? A state wants students to multiply by grade x; teachers in the earlier grades teach students to multiply; students are tested on multiplication problems when they reach grade x.

So what's there to oppose? "Teaching to the test" could lead to problems if

  • the exit examination didn't cover ALL content standards and

    Alaska's HSGQE focuses solely on reading, writing, and math. Alaska has a dozen or so other content domains, like geography, which are not covered in the HSGQE. The so-called 3Rs are a reasonable place to begin--and, with inevitable time and money constraints, one has to begin somewhere (actually, I don't know of a state that hasn't begun with the 3Rs).

  • it was required or expected that all subjects be taught and

    Although I did not find anything on the web addressing Alaska's plans for content areas like geography, which don't appear on the HSGQE, I assume Alaska wants them taught.

  • educators were judged only on the basis of their students' aggregated exit exam scores.

    If Alaska wants Alaska children to understand subjects like geography, it would be silly to reward and punish educators solely on the basis of aggregated HSGQE scores, which cover only the 3Rs. In brief, if you're only buying beans, it's hard for me to imagine myself planting corn (or at least as much as usual).

    It was certainly North Carolina's experience that untested subjects were deemphasized (Raleigh News-Observer, 5 Feb 98); similarly, why should principals spend money for science enrichment courses when that money could be spent increasing reading, writing and math scores?

There are many other lessons: that we don't seem to have learned any is more than slightly discouraging.


Footnotes

1 Although ostensibly easy to interpret, school averages can be very misleading, especially over time. Imagine two schools, "Cloud's School for Grrlzz with Low Test Scores" and the nearby "School with Very High Test Scores" (SVHTS); imagine that, as the names suggest, Cloud's average test scores were the lower of the two. Now suppose that Cloud's closed (couldn't make the rent) and all the Grrlzz went to ... you guessed it: come testing time, SVHTS's average would undoubtedly decrease. It would still be the same great school as before, but its average test scores would not reflect that fact (a numerical sketch follows). [back to main text]
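
The arithmetic behind that footnote, as a minimal sketch (both schools and all scores are, of course, hypothetical):

    # The receiving school's average drops with no change in its teaching,
    # purely from the change in who is being averaged.
    cloud_scores = [40, 45, 50]   # Cloud's School for Grrlzz with Low Test Scores
    svhts_scores = [85, 90, 95]   # School with Very High Test Scores

    before = sum(svhts_scores) / len(svhts_scores)
    combined = svhts_scores + cloud_scores   # Cloud's closes; its students transfer
    after = sum(combined) / len(combined)

    print(f"SVHTS average before: {before:.1f}")   # 90.0
    print(f"SVHTS average after:  {after:.1f}")    # 67.5 -- same school, lower average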

2 When monitoring student learning, as was mandated by the Quality Schools Initiative, the proper unit of analysis is, statistically speaking, the individual student, not some aggregated group of scores (ETS made the point around a decade ago). And the proper research design is the split plot, longitudinal not cross-sectional, with individual students across time nested within schools (Hayes, 94; Winer, 71). Breakdowns aggregated by school ignore short- and long-term demographic changes like students moving between test administrations: when annual cross-sectional school aggregates are used, scores from the same student could be added to one school's aggregate one year and another's the next. That does not make good statistical sense; it just adds unnecessary measurement error. Alaska is not alone in doing dumb things with its exit exam data: the other 49 are doing pretty much the same, some worse. Virginia's legislature mandated that its Standards of Learning test results account for students who had initially failed but, after taking a remediation program, passed ... by adding their scores to the numerator but not the denominator of its SOL performance statistic, which, arithmetically speaking, can only increase the percentage passing--to over 100% (Goldhaber, 02); a toy calculation follows this footnote. And, speaking of dumb things done with numbers: in 1897 the Indiana House passed a bill that would have legislated a value of pi--changing pi not at all but making them look ridiculous. It's hard to top the beginning of the last century, when the powers that were in Turkey switched numeral systems by decree--or its end, when one group of U.S. NASA engineers used metric units in their Mars Climate Orbiter work while another didn't. [back to main text]
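
And the Virginia arithmetic as a minimal sketch; the counts are invented, but the gimmick is pure numerator inflation:

    # Remediated passers join the numerator but never the denominator, so the
    # reported "percentage passing" can only go up -- even past 100%.
    took_this_year = 100
    passed_this_year = 90
    remediated_passers = 15   # earlier failers who passed after remediation

    reported = (passed_this_year + remediated_passers) / took_this_year
    honest   = (passed_this_year + remediated_passers) / (took_this_year + remediated_passers)

    print(f"reported pass rate: {reported:.0%}")   # 105% -- more passers than takers
    print(f"honest pass rate:   {honest:.0%}")     # 91%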

3 In particular:

There were two other problems:

In summary, CTB's two reports address the kinds of issues you'd expect a test publisher to address in a technical report to its client: descriptive statistics, evidence of reliability, bias (lack of), and opportunities-to-teach and to-learn. Some report tables are, not surprisingly, identical, e.g., those showing demographic breakdowns (Tables 21-22 in the 1st and Tables 18-19 (and 42-43) in the 2nd), as are formulas and the like; the content of others, also not surprisingly, differs, e.g., those describing raw data (Table 10 in the 1st and Tables 6 (and 36) in the 2nd) from the three HSGQE administrations. The data presented are, though, difficult to interpret, primarily because of their imprecision and inconsistent organization. I wonder, too, about CTB's statistical bias analyses, given that they subdivided Alaska students into the vanilla Census 2000 five. If "world class" standards and students, why not "world class" reports (or at least technically sufficient ones)? [back to main text]

4 As noted in this study, the first opportunity-to-teach survey was administered only to test proctors--not necessarily the teachers doing the teaching (this is not stated in CTB's 1st report; nor is the fact that only two thirds of the first group of students tested completed the opportunity-to-learn survey; nor is the date the surveys were administered). Further, while the State Department of Education Technical Advisory Committee faulted the first opportunity-to-teach and opportunity-to-learn questions as too general, exactly those same questions were asked the next year (although of a broader spectrum of teachers). [back to main text]

5 CTB's intended audience is "the Alaska Department of Education and Early Development ... the Technical Advisory Committee, and CTB/McGraw-Hill" (page 3); CTB states that "No part of this publication may be reproduced or distributed in any form or by any means" (page 2); DEEDs posted CTB's two reports on the web; I assume that puts them in the public domain, but I'm not sure. Is CTB's work for the State of Alaska "work for hire"? It would also be interesting to actually see the actual words in the actual contract(s) between CTB and the State of Alaska. The only thing I know about them came from a reader who leaked me a copy of a report (in, if memory serves, the Alaska Budget Report) about the contract between CTB and the State under which the first HSGQE was administered (Spring 2000); as I remember, there were some major gaffes on the State's part, like not specifying a due date for deliverables like student scores. [back to main text]

6 DEEDs has posted a link to an interesting set of papers from a recent Fordham Foundation conference about the new law: as of this year, Alaska and the other 49 must comply with "HR1 Title I Part A, Sec. 1111 and HR1 Title VI, Part A," the new Federal "Leave No Child Behind" law [NCLB]. The NCLB mandates nationwide testing--with each state articulating its own performance standards, developing its own test, and establishing its own proficiency level--for all 3rd through 8th graders--in reading and math and, within 5 years, science--as well as biennial participation by a sample of 4th and 8th graders in the National Assessment of Educational Progress' reading and math tests (participation in the latter being a new Title I requirement). For the first year, the Feds budgeted $387 million to help states articulate their standards, develop and implement their tests, "reward" individual teachers and schools, and provide the "supplemental" educational services required for out-of-compliance schools.

The NCLB's language emphasizes individual children. However, what states are to report is test score aggregates--scores grouped by economic background, ethnicity (Census 2000 printed categories), English proficiency, and disability. States are to report--to the Feds and to their citizens--the adequate yearly progress of each aggregated group at all its schools (unless doing so would violate a student's right to anonymity, e.g., DEEDs does not report scores for an aggregated group at a school numbering 10 or less). According to a recent FAQ at the new law's home page, each school's "report card" is to include the following (a toy computation follows the starred items below):

*"Adequate yearly progress" will be measured by the percentage of a school's students scoring at its state proficient level--in a dozen years all students are "required" to score at or above that level (Goldhaber, 02). Even taking into account that the NCLB's proficient level is equivalent to the NAEP's Basic (Thernstrom, 02), such a goal presumes that no student--that's 0.0000%--in any grade in any classroom in any school in any state will accidentally be off a column on a bubble sheet or deliberately boycott a test.

*The first year a school is identified as needing improvement, it must develop a two-year plan, and all its students can transfer, at its expense, to a higher-performing school "in [its] district;" if out of compliance for three "consecutive years, it remains in school improvement and the district must ... also provide supplemental educational services;" if out of compliance for five years, it "will be identified for restructuring." It appears that states will administer AYP-based consequences to their own schools.
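
A minimal sketch of such a "report card" computation. The cut score, group names, and scores are hypothetical; the suppression rule paraphrases the anonymity caveat above:

    # Percent at or above "proficient" per aggregated group, suppressing
    # groups of 10 or fewer students, as described above.
    PROFICIENT_CUT = 300   # hypothetical state-set scale-score cut
    MIN_REPORTED_N = 11    # groups numbering 10 or less are not reported

    scores_by_group = {    # hypothetical scale scores at one school
        "economically disadvantaged": [250, 310, 320, 280, 305, 290,
                                       315, 330, 260, 295, 340, 310],
        "limited English proficient": [240, 260, 310],   # only 3 students
    }

    for group, scores in scores_by_group.items():
        if len(scores) < MIN_REPORTED_N:
            print(f"{group}: not reported (n={len(scores)})")
            continue
        pct = 100 * sum(s >= PROFICIENT_CUT for s in scores) / len(scores)
        print(f"{group}: {pct:.0f}% proficient (n={len(scores)})")

Note what never appears in such a computation: any within-student gain, and any measure of variability within a group.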

The Feds will monitor compliance at least in the sense that dollars will be spent and time expended doing so. Other than that, how could anyone substantively evaluate 50 different sets of reading and math standards measured in 50 different ways by 50 different tests? How even to evaluate content--science knowledge will be tested in 5 years--when some states teach creationism (Gandal, 02)? The NCLB web site explicitly disavows knowledge about the caliber of state-provided data (although it elsewhere states that tests should be "statistically valid and reliable"). On, for example, its official "Education Information for Alaska" page, one sees a) the new law's logo, b) a list of Alaska government officials and DEEDs web site links, and c) the following boilerplate caveat:

The U.S. Department of Education does not ... guarantee the accuracy ... of this outside information .... inclusion of links is not intended to reflect their importance ... endorse any views expressed ... or the organizations sponsoring the sites [my emphasis].

"Outside" in the sense of not being generated by USED, said caveat suggesting that it'll likely just post whatever names and links the states report [e-format being efficient]. A requirement that states make data about the psychometric adequacy of their tests publicly available thus seems unlikely--not even data about the variability within aggregated group scores are required on the NCLB school "report cards." Questions about psychometric defensibility--is a test aligned with its state's curriculum? to what degree is it reliable, valid? is it biased for or against given groups?--are answerable, and it makes sense to answer them before interpreting test score data like aggregated passing rates. It appears unlikely though that doing so will be required. Parents in some states have challenged particular tests and testing practices in the courts for not being psychometrically adequate, two early cases being MALDF's suit against Texas [bias] and Debra P.'s from Florida [opportunity to learn]. [back to main text]


Bibliography
[authors, dates, titles, & URLs included when available]

Last updated for link rot: 7/07

 Abilene Reporter-News. TAAS Preparation is Big Business (1998)

 Alaska. State Statutes, Sec.14.03.123 (1998)

 Alaska Department of Education and Early Development (DEEDs)
  1998 Content Standards:
   English
   Geography
   Healthy Lifeskills
   History
   Language Arts
   Math
   Science
   Technology
   World Languages
  HSGQE:
   Accommodations (19 Feb 1999)
   Bookmarking Presentation to Legislature (00)
   Test Years 1999 - 2005
   Written Test Scoring Protocols (FastFacts Overview)
  Other:
   Alaska Standards: Content and Performance Standards for Alaska Students (Feb 00) [print]

   Alternative Testing Location petition.

  Commissioner Holloway's Memo to Superintendents:
  Grade 10 Students Required to Take HSGQE (2 Aug 01)

  Educational Vision and Goals

  Funding Recommendations (5 Feb 01)

  Organizational Chart

  Participation Guidelines

  Quality Schools Contacts

  School Designator Committee Minutes (Feb 22-23, April 13-14,
  October 25-26 00; Jan 16-17, March 28-29 01)

  Work Plan: 2002 to 2006 (Mar 01)

 Performance Standards
  (Staff): Administrators; Schools; Teachers
  (Students): Math; Reading; Writing

 Practice Tests
  HSGQE
  Benchmarks: Reading (grades 3 and 6), Writing (grades 3 and 6), and Math (grades 3 and 6); Reading, Writing & Math (grade 8)


 Statistics
  Report Cards: 1998-1999; Fall 2000 and Spring 2001
  Aggregated Scores: Spring and Fall 2000; Spring 2001
 Recent
  Statistical definition of AYP
  NCLB page (includes times scheduled for audio conferences and written public comments)

  Alaska State Board of Education. Meeting, Anchorage, AK (19 - 21 Mar 03). The smileys and resumes are coded statements for the Murkowski massacre of sitting state school officials.

 Alaska State Board of Education. Resolution 01-2003 (21 Mar 03). NCLB challenge: in addition to language--that alone is enough--the argument is that AK's "assessment plan" encompasses more kids than the NCLB requires and so is more rigorous.

The NCLB's definition of AYP, made law by having been published in last December's Federal Register, stipulates that kids in small schools--as defined by FERPA (see Nash, 02 for original insight into what this means for Alaska)--are assumed to be making AYP, which means, functionally, that their scores don't count, don't contribute to an LEA's "bottom line." AK's definition of AYP includes those students.

 Allen, M. and Yen, W. Introduction to Measurement Theory. Belmont, CA: Wadsworth (1979)

 Amling, K. Letter to the Editor, Anchorage Daily News (22 Nov 02)

 Amsco School Publications Home Page

 Anchorage Daily News. Schools Fight, Editorial (19 June 03)

 ASR-CAS Joint Study Group. Making Valid and Reliable Decisions in Determining Adequate Yearly Progress. Washington, D.C.: Chief State School Officers (02)

 Bach, D. Some Students Boycott or Blow off the WASL, Seattle Post-Intelligencer (26 April 03)

 Barton, P.B. Too Much Testing of the Wrong Kind in K-12 Education. ETS: Princeton, NJ (1995)

 Berkowitz, H. Communities Must Step in for Kids, Anchorage Daily News (18 Nov 02)

 Boruch, R. The Virtues of Randomness, EducationNext (Fall 02)

 Brooks, P.A. Austin American-Statesman (13 Dec 1998)

 Buckham, P. Letter to the Editor, Star-Telegram (16 Feb 1999)

 Bushweller, K. Teaching to the Test, American School Board Journal (Sept 1997)

 Campbell, D.T., & Fiske, D.W. Convergent and discriminant validity in the multitrait-multimethod matrix. Psychological Bulletin, 56, 81-105 (1959)

 Canedy, D. Critics of Graduation Exam Threaten Boycott in Florida, New York Times (13 May 03) tourist industry listens

 Cannaday, L. & Haynie, M. Nacogdoches Dragon Echo (Dec 1997)

 Center for Educational Reform. Education Reform Nationwide (Summer 1997)

 Clopton, P. Testimony to the U.S. House of Representatives Committee on Education and the Workforce (21 Jan 1998)

 Clopton, P., Bishop, W., & Klein, D. Statewide Mathematics Assessment in Texas

 Cockerham, S. Lawmakers Criticize Education Cost Study, Anchorage Daily News (24 Feb 03)

 CTB-McGraw Hill
  Home Page
  Test-Item Maps: HSGQE 2000 and 2002 (Math, Reading, Writing); benchmarks (grades 3, 6, and 8)
  "Technical Report Benchmark Assessments and the High School Qualifying Exam Spring 2000 Form C" (12 Feb 01) 3
  "Technical Report Benchmark Assessments and High School Graduation Qualifying Exams Fall 2000 Form B HSGQE and Spring 2001 Form C Embedded Field Test Data" [no date]
  Technical Report Spring 2002 Benchmark Assessments and the High School Qualifying Exam

 Cohen, M. Implementing Title I Standards, Assessments, and Accountability. No Child Left Behind: What Will It Take? Conference, Fordham Foundation, Washington D.C. (Feb 02) [access forbidden]

 Dallas Morning News. TAAS Results Database (1999)

 Dewan, S. What Does it Really Mean to Pass the TAAS? Houston Press (3 April 1998)

 Dillon, S. States Lower Academic Standards to Avoid Penalties, New York Times (25 May 03).

 Dillon, S. Thousands of Schools May Run Afoul of New Law, New York Times (16 Feb 03)

 Education Week on the Web. Testing Over Time (16 Jun 1999)

 Educational Testing Service. Too Much Testing of the Wrong Kind (6 June 1999)

 Eisenhower Regional Consortia. Key State Policies on K-12 Education: High School Exit Exam and Policies on Graduates' Preparation (1998)

 Federal K-12 Leave No Child Behind Act Home Page

 Fikac, P. State Officials Defend TAAS from Minority Detractors, Amarillo Globe-News (24 Oct 1997)

 Figlio, D. Aggregation and Accountability, No Child Left Behind: What Will It Take? Conference, Fordham Foundation, Washington D.C. (Feb 02) [access forbidden]

 Foster, S. Alaska Board's 'Effective-Schooling' Plan Will Allow Local, Education Week on the Web (26 May 1982)

 Gandal, M. Multiple Choices: How will States Fill in the Blanks in their Testing Systems? No Child Left Behind: What Will It Take? Conference, Fordham Foundation, Washington D.C. (Feb 02) [access forbidden]

 Goldhaber, D. What Might Go Wrong with the Accountability Measures. No Child Left Behind: What Will It Take? Conference, Fordham Foundation, Washington D.C. (Feb 02) [access forbidden]

 Haney, W. Revisiting the Myth of the Texas Miracle in Education: Lessons about Dropout Research and Dropout Prevention, Conference, Dropout Research: Accurate Counts and Positive Interventions, Cambridge, MA. (01)

 Hayes, W. L. Statistics. Fort Worth: Harcourt Brace, 5th Ed. (1994)

 Henriques, D. and Steinberg, G. None of the Above, FairTest (20 May 01)

 Hensley, W. Speech by Willie Hensley at Bilingual Conference. Bilingual Conference: Anchorage, AK (Feb 1981).

 Hetrick, M. Nacogdoches Dragon Echo (Feb 1998)

 Hu, A. The Critical Education Reform / Deform Page

 Illinois Teacher's Examination (1880)

 Indiana State Legislature. House Bill 246 (1897)

 Information Exchange
  24 #21 (1998)
  25 #12 (18 Dec 1998)
  26 #3 (22 Jan 1999)
  29 #8 (29 Mar 01)
  30 #10 (30 Apr 02)
  30 #15 (19 Jun 02)

 Ivey, E. State Exams: Another Hoop to Jump through, or a Means to a Better Education? Frontiersman (30 Sept 02)

 Jaeger, R. and Tucker, C. A Guide to Practice for Title 1 and Beyond. Washington, D.C.: Chief State School Officers (1998)

 Johnson-Lewis, M. Testing Head Start to Death, Black Commentator (20 Feb 03)

 Kane, T., Staiger, D., and Geppert, J. Randomly Accountable, EducationNext (Fall 02)

Kenai Peninsula Borough School Board.

 Krenz, C. Homebrew: Calculating the State of U.S. Education are Calculated (03)

 Krenz, C. Littering the Information Superhighway (02)

 LeMay, I. State Exams: Another Hoop to Jump Through, or a Means to a Better Education? Frontiersman (30 Sept 02)

 Loy, W. BP May Face Fine in Welder's Death: State Agency Claims Oil Company Violated Worker Safety Laws, Anchorage Daily News (24 May 03)

 Lynch, B. Laws Ignore Students' Real Needs, Anchorage Daily News (11 Mar 03)

 Mathematically Correct Home Page

 Math Exam Parody

 McCoy, R. One Set of Labels Won't Fit All Schools, Anchorage Daily News (15 Feb 02)

 Melendez, M. Mixed Grades for TAAS, Fort-Worth Star-Telegram (1998)

 Mexican American Legal Defense Fund, Complaint Filed in the United States District Court for the Western District of Texas San Antonio Division (14 Oct 1997)

 Miami-Herald (6 Nov 1998)

 Nash, A. Educational Graduate Qualifying Exams: Social and Economic Factors in the State of Alaska. Master's Project, Department of Resource and Applied Economics, University of Alaska-Fairbanks (July 2002)

 National Assessment of Educational Progress (NAEP)
  Important Aspects of No Child Left Behind Relevant to NAEP (02)
  FAQ (02)
  Report Cards (Alaska, California, and Texas) (01)

 National Center for Educational Statistics. The Condition of Education 1998 (8 Feb 1999)

 National Council on Measurement in Education. Standards for Educational and Psychological Testing. Washington, DC: American Educational Research Association (1999).

 National Research Council Board on Testing and Assessment. Voluntary National Test Workshop (12 Jun 1997)

 No Child Left Behind Act (NCLB)
  Education Information for Alaska Page (02)
  FAQ (02)
  Paige Outlines Adequate Yearly Progress Provisions Under "No Child Left Behind" (24 July 02)
  Paige Releases Number of Schools in School Improvement in Each State (1 July 02)
  Reaching Out...Raising American Indian Achievement (02)

 North Carolina State University. Internet Resources for Higher Education Outcomes Assessment (31 Aug 1999)

 Northwest Regional Educational Lab. Reform Models (4 April 1999)

 Olson, L. All States Get Federal Nod on Key Plans, Education Week on the Web (18 June 03)

 Olson, L. NAEP Board Worries States Excluding Too Many From Tests, Education Week on the Web (19 March 03)

 Paige, R. Dear Colleague Letter (24 July 02)

 Paige, R. Letter (2003). Evalsoft--a TX data collection company--is the only one receiving USED extra-mural funding that I've found.

 Peninsula Clarion. (27 Sept 01)

 Pesznecker, K. Education Commissioner Out, Anchorage Daily News (21 Feb 03)

 Pesznecker, K. State Tells 50 Schools to Improve, Anchorage Daily News (23 Nov 02)

 Pesznecker, K. School Designator Committee Runs Out of Time, Anchorage Daily News (11 Apr 02)

 Philadelphia Inquirer. School District and Official Indicted in Test Tampering (7 April 1999)1

 The Psychological Corporation Homepage

 Richard, A. Montana Leads Choir of Rural Concerns Over 'No Child' Law, Education Week on the Web (3 April 03)

 Robbins, M. Teachers Unhappy, Report Says, Amarillo Today (25 Jul 1997)

 Robelen, E. Researchers Helping NAEP Board Find Value in Background Queries, Education Week on the Web (2 Oct 02)

 Robison, C. & Walt, K. Bush Defending TAAS on Heels of Federal Suit, Houston Chronicle (16 Oct 1999)

 Saltpeter, J. & Foster, K. TechLearning

 School Ratings, Classified Section, Austin American-Statesman (1998)

 Struck J. State by State Assessment Resources and Released Items, Iowa Educational Services (19 Aug 02)

 Simmons, T., State Weighs Adding Topics to ABCs Tests, Raleigh News-Observer (5 Feb 1998) 2

 Sleek Software's Incredible TAAS Tutor

 Spangler, S. A Parent's Opinion of the Exam, Parents News (Mar 00)

 Stutz, T. Plan Expands TAAS, Adds Bonuses, Dallas Morning News (5 Dec 1998)

 Stevenson, H.W. A TIMSS Primer: Lessons and Implications for U.S. Education, Fordham Report, (July 1998)

 Stofflet, F., Fenton, R., and Straugh, T. Construct and Predictive Validity of the Alaska State High School Graduation Qualifying Examination: First Administration, Conference, American Educational Research Association, Seattle WA, Apr 01

 TAAS Tests from 1998 (available on Texas Representative Scott Hochberg's Website)

 Texas Democratic Party Platform (1998)

 Texas Education Agency
  Press Release (4 Dec 1998)
  Researcher Page
  Technical Digest:
   Chapter 2, "Test Development" (26 Feb 1997)
   Chapter 3, "Test Administration"
   Chapter 4, "Quality Assurance Procedures"
   Chapter 6, "Standards"
   Chapter 8, "Reliability" (26 Feb 1997)
   Chapter 9, "Validity"
   Chapter 10, "Equating"

 Texas Education Consumer's Organization. TAAS: How Accurate Is It?

 TFT Legislative Hotline (27 May 1998)

 Texas Public Policy Foundation
  Design for Mediocrity, a Report on Current Reforms in Texas Schools (1998)
  Public School Choice in Texas - TEA Response
  The True State of Texas Education

 Thernstrom, M. Comments. No Child Left Behind: What Will It Take? Conference, Fordham Foundation, Washington D.C. (Feb 02) [access forbidden]

 Tompkins, R. Leaving Rural Children Behind, Education Week on the Web (26 March 03). Arkansas, Illinois, Iowa, New Hampshire, New York, North Carolina, Maine, Massachusetts, Mississippi, Montana, Pennsylvania, Tennessee, and Vermont are having problems with the "highly qualified teachers" and AYP parts of the NCLB, some going as far as to say that if it's not equitable, it's not constitutional. CA is, of course, clueless, and TX, well, suffice it to say there are several happy TX businesses, like Evaluation Software Publishing, Inc. (creator of USED's Federal Data Dictionary-Phase 1, Texas: Austin [n.d.]).

 U.S. Census Bureau. Alaska Quick Facts. (00)

 U.S. Department of Education. Voluntary National Tests (25 June 1998)

 Viadero, D. Scholars Aim to Connect Studies to Schools' Needs, Education Week on the Web (19 March 03)

 The Weekly Standard, EXAM SCAM - The Latest Education Disaster: Whole Math. (4 April 1997)

 Weiss, T. I Helped Set the Exit Exam Scores. Parents News (Mar 00)

  Whitehurst, G. Research on Teacher Preparation and Professional Development, White House Conference on Preparing Tomorrow's Teachers. States that "Contrary to our intuitions and anecdotes ... Coleman's methodology is now understood to have been seriously flawed. All of his analyses were conducted on data that had been aggregated to the school level" (my emphasis). Here the new head of the new Institute of Education Sciences--both less than 6 months old, and I've got a problem even with the name: education is not "science" like chemistry and physics--is saying that aggregating by schools (which is how school "report cards" are created) is methodologically flawed.

 Winer, B.J. Statistical Principles in Experimental Design. NY: McGraw-Hill, 2nd Ed. (1971)

 Winerip, M. Defining Success in Narrow Terms, New York Times (19 Feb 03)




Author Note

I began this page, little more than an annotated bibliography, winter 1998, my first in Alaska, just after the state legislature enacted its "Quality Schools Initiative." I had numerous interesting conversations with parents and students throughout the state. That there is no money to be made being against accountability testing as practiced in the contemporary U.S. does not, though, mean this was a "labor of love": it's damned hard to watch supposedly sane individuals like state legislators do stupid things. It's even worse to be able to do no more than sit by while the state's students are stiffed: personally, I think it is wrong to steal time from children by making them take tests of little merit (insufficiently reliable and valid scores) ... and then to stumble on NCLB's pbdmi/eden data collection agenda. I posted this page in the same spirit as anyone else posting useful--to some--information without commercial value: I didn't expect a thank you but was surprised to have been /link/ flamed by the state's first test publisher. Consult other sources for information about Alaska's HSGQE after 2003 (there has been ample time, under NCLB, for numerous schools to have been labeled as "failing"). Last updated for link rot 3/07.



  Alaska CSSA Fall 2003 HSGQE Request Student Report Appeals (2003)

  CTB-McGraw Hill, Technical Report Spring 2003 Benchmark Assessments and the High School Qualifying Exam

  DRC home page: AK's new testing contractor

  Editorial, Salt Lake Tribune (2/14/04)

  Center for Research and Evaluation, University of Maine. Kenai Peninsula Borough School District Companion Document for Achievement Results (Jan 03)

  Holthouse, R., No Child Left Behind is a Public Travesty, Anchorage Daily News (03 Sept 03): Houston TAAS score improvements as unsubstantiated as WMD; AK can't afford the hit

  Herbert, B. Claims to Being Education President is Left Behind, Syndicated Column included in Anchorage Daily News (02 Sept 03): White House cuts $200 million NCLB accountability funding

 Henriques, D. Errors Fill School Tests, Study Finds, NYT in Anchorage Daily News (02 Sept 03): Massachusetts, Nevada, Georgia, Minnesota, New Hampshire expect testing errors to increase

 KCIN Enews Iowa (29 Jan 04)

 Loshbaugh, S., Survey gives high rating to 3 Rs: Results of school district mail-out emphasize importance of academics, safety, the Peninsula Clarion (13 Aug 01). Another UMaine "analysis" from another borough school district survey w/ a response rate of 1,600 out of 7,400 mailed questionnaires.

 Rhodes, K., & Madaus, G. Errors in Standardized Tests: A Systemic Problem, National Board on Educational Testing and Public Policy Monograph. Boston College, Lynch School of Education (2003)

 Toomer-Cook, J. Reject U.S. funds, Orem's Dayton Says, Deseret Morning News (12/18/03)

  Winter, G. California Will Wait Until 2006 to Require High School Graduates to Pass Exit Exam, New York Times (10 July 03)

  USED, Performance Based Data Management Initiative, The Federal Register, Vol. 68, No. 164 / Monday, August 25, 2003