This is one post in a multi-part series on the progression of education policies in the U.S. since its founding. Click here to see a list of all the posts in this series.
By Matthew Lynch
During the 1980s, American educators and the public first became concerned in a sweeping way about the quality of education in tax-funded schools. The National Commission on Excellence in Education was formed in 1981, and in 1983 the group released A Nation at Risk, an in-depth report that warned of the dangers that could result from mere mediocrity in U.S. public schools. With the help of the national media, which sank its teeth into the story, the report seemed to awaken concern from sea to shining sea: in every school room and around every dining room table. Though that concern might have rested purely on how well American students were learning, even larger worries loomed, chiefly the future of the economy. It marked the first time in the history of American public schools that citizens began to compare their students to those in other countries like Japan, China and even England. The assumption that America the Beautiful was also the best at everything, including educating its kids, was turned on its head. People started to worry, really worry, about where the youth of the nation would lead it in the coming years.
Reform started to take place, but at the local level. Schools took it upon themselves to address the problem of underprepared and disengaged students by adding graduation requirements and raising teacher salaries. Universities jumped in by raising the requirements for education degrees, in an attempt to give K-12 students the advantage of stronger teachers. Books by reformers like Allan Bloom argued for a stronger curriculum, one built around the ideas behind the facts rather than the facts alone. Simply teaching what was on the page was clearly not working when it came to educating the whole student and preparing young Americans for the workforce and citizenship.
One big reform push of the late 1980s, as educator and historian Diane Ravitch points out in a piece about the decade in public education, was the move away from multiple-choice tests as a form of assessment. Free-response items and short-essay options were beginning to gain favor with educators across the country as truer indicators of what students had actually comprehended. While multiple-choice questions started to fade from routine school exams, it is interesting to note that they are still the main way state assessments are delivered today. As a matter of efficiency, those easily scanned answer sheets make the most sense. As a way to truly assess what students do and do not know, they are lacking.
Ravitch also points out in her essay that “one of the most promising developments” of public education in the 1980s was the recognition from the business community that schools needed its help. Businesses provided funding to schools, set up scholarship programs, and looked for other opportunities to give students the help they needed to feel encouraged in their educational pursuits.
That momentum carried over into the 1990s, but instead of a renewed dedication to the goals of public education, the American public and reformers looked outside the system for answers. The phrase “school choice” began to resonate throughout the country, with people wondering what could be done to funnel public dollars to alternatives to public schools. Funding for schools with a religious theme had actually been discussed in the past, over 100 years earlier, when it was first suggested that parochial schools receive a government stipend to help with expenses. Fearing the rising Irish Catholic population, state lawmakers put the kibosh on any such plans, citing separation of church and state. As parents began to question the value of the public education their kids were receiving, they came to feel entitled to other options when the local tax-funded school performed under par.
A new ideology began to take shape in the form of charter schools: publicly funded, non-religious schools given the freedom to innovate outside the constraints of public school regulations. To some, it seemed like a smart way to provide more educational options while lighting a fire under public schools, which until then had faced no real competition. To critics, the plan to use taxpayer dollars to fund new schools only diverted money away from the place where it was really needed: the existing public schools.
The school choice debate still rages on today, with a renewed call for vouchers for religious schools thrown into the mix. Some states, like Texas and Florida, allow wide-ranging school choice options, while others, like Mississippi, have virtually none. The effectiveness of all this “choice” is also difficult to determine. According to a Stanford University report, in 2013 there were 6,000 charter schools in 42 states and the District of Columbia serving 2.3 million students, a figure that had risen 80 percent since 2009. Yet the looser accountability that charter schools are afforded, compared with traditional public schools, can also make them unreliable. In 2013, 17 charter schools closed in Columbus, Ohio alone. Nine of those schools lasted only a few months before being forced to shut down. Students who had left their own public schools, and even private ones, to give the shiny new charter schools a chance were forced to return to their original ones.
The closing of 17 traditional public schools in one city is completely unheard of, and for good reason. Public schools are investments in more than just the students who attend them. They represent the communities where they are established, and even if they have a rough year or two, they continue to strive for excellence for their students. It’s not that charter and other schools of choice do not share this passion for their students, but with the freedom to wander in any direction they choose, these schools can be very volatile.
The 1990s also ushered in a new age of accountability in public schools, triggered by the quality concerns of the 1980s. The roots of the No Child Left Behind Act, signed into law in 2002, were planted in the educational reform movements of the ’90s. NCLB was a reauthorization of the much older Elementary and Secondary Education Act of 1965. Both acts focused on ways to bring greater equality to public education, but NCLB also put a strong emphasis on raising student test scores and holding teachers accountable. In ways that were genuinely groundbreaking, NCLB put pressure on everyone, from top education policymakers to teachers in the classroom, to heighten achievement.
While the basic tenets of NCLB have been scrutinized and questioned at every turn, the truth remains that they are still a very large part of the educational system in today’s public school classrooms. The release of the Common Core State Standards in 2010, and their adoption by most states in the years that followed, took the ideology of NCLB to a new level. Though adoption is voluntary and varies by state, these new accountability measures look eerily similar to the federally mandated ones of NCLB. Instead of getting away from empty assessments that often take the shape of multiple-choice questions, it seems that public school systems are simply adding to the void. It’s not a pretty picture, but it is the reality educators are facing right now, and it will certainly impact this generation of K-12 students. Follow my series on the progress of the U.S. educational system to learn more about where we’ve been, and where we need to go, as collective educators.