edchat

How Did We Get Here? Part VII: A Nation of Public School Students at Risk

This is one of a multi-part series on the progression of education policies in the U.S. from its founding. Click here to see a list of all the posts in this series

By Matthew Lynch

During the 1980s, American educators and the public first became concerned in a sweeping way about the quality of education in tax-funded schools. The National Commission on Excellence in Education, formed in 1981, released A Nation at Risk in 1983, an in-depth report that warned of the dangers that could result from mere mediocrity in U.S. public schools. With the help of the national media, which sank its teeth into the story, the report seemed to awaken concern from sea to shining sea – in every school room and around every dining room table. Though the concern might have centered purely on how well American students were learning, even larger worries loomed, chiefly the future of the economy. It marked the first time in the history of public schools in America that citizens began to compare students to those in other developed countries like Japan, China and even England. The assumption that America the Beautiful was also the best at everything, including educating its kids, was turned on its head. People started to worry, really worry, about where the youth of the nation would lead it in the coming years.

Reform started to take place, but at the local level. Schools took it upon themselves to correct the problem of underprepared and unmotivated students by adding graduation requirements and raising teacher salaries. Universities jumped in by heightening the requirements for young educators to earn their degrees, in an attempt to give K-12 students the advantage of stronger teachers. Books written by reformers like Allan Bloom stressed the need for stronger curriculum that emphasized the ideas behind things, not just the facts. Simply teaching what was on a page was clearly not working when it came to teaching the whole student and getting young Americans prepared for the workforce and citizenship.

One big push of late-1980s reform, as pointed out by reformer and educator Diane Ravitch in a piece about the decade in public education, was the elimination of multiple-choice tests as a form of assessment. Free-response items and short-essay options began to gain favor with educators across the country as truer indicators of what students had really comprehended. While multiple-choice options started to fade from routine school exams, it is interesting to note that they are still the main way state assessments are delivered today. As a matter of efficiency, these easily scanned answer sheets make the most sense. As a way to truly assess what students do and do not know, they are lacking.

Ravitch also points out in her essay that “one of the most promising developments” of public education in the 1980s was the recognition from the business community that the schools needed its help. Businesses provided funding to schools, set up scholarship programs and looked for other opportunities to give students the help they needed to feel encouraged in their educational pursuits.

That momentum carried over into the 1990s, but instead of a renewed dedication to the goals of public education, the American public and reformers looked outside for answers. The phrase “school choice” began to resonate throughout the country, with people wondering what could be done to funnel public dollars to alternatives to public schools. Funding for schools with a religious theme had actually been discussed in the past, over 100 years earlier, when it was first suggested that parochial schools receive a government stipend to help with expenses. Fearing the rising Irish Catholic population, state lawmakers put the kibosh on any such plans, citing separation of church and state. As parents began to question the value of the public school education provided to their kids, they began to feel entitled to other choices when the tax-funded school available to them performed under par.

A new ideology began to take shape in the form of charter schools – publicly funded, non-religious schools that were given freedom to innovate outside the constraints of public school regulations. To some, it seemed like a smart way to provide more educational options while lighting a fire under public schools, which until then had faced no real competition. To critics, the plan to use taxpayer dollars to fund new schools only diverted money away from the place where it was really needed: actual public schools.

The school choice debate still rages on today, with a renewed call for vouchers for religious schools thrown into the mix. Some states, like Texas and Florida, allow wide-ranging options in school choice while others, like Mississippi, have virtually none. The effectiveness of all this “choice” is also difficult to determine. According to a Stanford University document, there were 6,000 charter schools in 42 states and the District of Columbia serving 2.3 million students in 2013 – a number that rose 80 percent from 2009. Yet the lighter accountability that charter schools are afforded, in comparison with traditional public schools, can also make them unreliable. During 2013, 17 charter schools closed in Columbus, Ohio alone. Nine of those schools lasted only a few months before being forced to shut down. Students who had left their own public schools, and even private ones, to give the shiny new charter schools a chance were forced to go back to their original schools.

The closing of 17 public schools in one city would be completely unheard of, and for good reason. Public schools are investments in more than just the students who attend. They represent the communities where they are established, and even if they have a rough year or two, they continue to strive for excellence for their students. It’s not that charter and other schools of choice do not share this passion for their students, but with the freedom to wander in any direction they choose, these schools can be very volatile.

The 1990s also ushered in a new age of accountability in public schools, triggered by the quality concerns of the 1980s. The roots of the No Child Left Behind Act, signed into law in 2002, were planted in the educational reform movements of the 1990s. NCLB was a reauthorization of the badly outdated Elementary and Secondary Education Act of 1965. Both acts focused on ways to bring higher levels of equality to public education, but NCLB also had a strong focus on bolstering student test scores and holding teachers accountable. In ways that were completely groundbreaking, NCLB put pressure on everyone from top education policymakers to teachers in the classroom to heighten achievement.

While the basic tenets of NCLB have been scrutinized and questioned at every turn, the truth remains that they are still a very large part of the educational system in today’s public school classrooms. The release of the Common Core State Standards in 2010, and their adoption by most states in the years that followed, took the ideology of NCLB to a new level. Though voluntary on a state-by-state basis, these new accountability measures look eerily similar to the federally mandated ones of NCLB. Instead of getting away from empty assessments that often take the shape of multiple-choice questions, it seems that public school systems are simply adding to the void. It’s not a pretty picture, but it is the reality educators are facing right now, and it will certainly impact this generation of K-12 students. Follow my series on the progress of the U.S. educational system to learn more about where we’ve been, and where we need to go, as collective educators.

How Did We Get Here? Part VI: Unified, Then Divided, Public Schools

This is one of a multi-part series on the progression of education policies in the U.S. from its founding. Click here to see a list of all the posts in this series

By Matthew Lynch

Public education in the U.S. remained largely unchanged through the first and second World Wars. Improvements in communication, particularly radio transmission, brought the worldwide goings-on of the battles into schools. Though not part of any textbooks or formal testing, wartime knowledge became something of value in public schools, and patriotism grew in its role as a virtue. Unlike the Civil War, which divided the nation, the World Wars in the first half of the 20th century knit the union more tightly together. As millions of men fought beyond U.S. soil, women filled their job roles and kids continued to attend school. Going to school and learning was, in its own way, a sign of solidarity.

The united feel of public education was all but destroyed in the 1950s and 1960s as issues of desegregation plagued the nation. Many Americans cheered the changes, of course, but enough opposed desegregation to make it a bleak time in U.S. public school history. If public education was, after all, meant to provide common knowledge and life skills in equal ways to all children in America, then the doctrine of “separate but equal” certainly needed to be discarded. Change is difficult though, even in one of the most progressive nations in the world. The World War I and II-era solidarity in public school classrooms faded, replaced for arguably the first time in U.S. history with controversy within the schools. As the adults of the nation debated what should be allowed in schools and what was best for students, the children lived it out.

These two decades mark an important change in the role and perception of public schools in America. Before schools started taking on bigger issues like desegregation, abuse and childhood hunger, they were places that served the needs of the nation. That tide turned in the mid-1900s as public schools began to lead instead of just follow. Public schools stopped simply adhering to what was dictated for the next generation in terms of learning and citizenship and began to blaze a trail for the rest of society where collective belief systems were concerned. It may have been too late to change the minds of disenfranchised adults who had grown up accepting their worlds in a particular way, but it was still early for students. Needed change was not going to happen overnight, but it was needed just the same.

Schools became the vehicles for future change, starting with the youth of the nation. The focus was no longer just on economics, or raising ideal citizens; core ideologies were being shaped in public school classrooms across the nation.

This characteristic of public schools is still evident today. Take anti-bullying campaigns, for example, particularly as they relate to lesbian, gay, bisexual and transgender students. While many parents (and even some school boards) are fighting against anti-bullying policies that are designed to protect LGBT students, schools across the country are adopting them at a rapid pace anyway. The same is true of healthy eating programs and the push to get kids away from television and computers and involved in active pursuits instead. Schools cannot change what is being taught at home, or even what students themselves believe. Yet by leading change through example and actual policy, the thought is that future generations will have a different approach to important issues than their parents did. Like the socially conscious efforts of Dewey, public schools establish principles that then govern a particular group of K-12 students as adults.

The 1970s brought even more equality to public schools with the passage of the Education for All Handicapped Children Act. This was the first federal law mandating that public schools accepting federal funding also provide a free education, and meals, to children with mental or physical disabilities. It was not enough to simply accept these students; schools had to create a teaching plan that would give them an education as close as possible to that of their non-handicapped peers. Though separate classrooms were inherent to the plan, schools were instructed to keep special education students as near to their peers as possible. By the same token, public schools became even stronger when it came to truly opening their doors to all students, cementing schooling as a right of American life. Follow my series on the progress of the U.S. educational system to learn more about where we’ve been, and where we need to go, as collective educators.

Want college to be affordable? Start with Pell Grants

Donald E. Heller, University of San Francisco

In her speech accepting the Democratic presidential nomination, Hillary Clinton talked about free college and student debt relief.

Convention speeches are not normally known for providing details of policy proposals, and in keeping with tradition, Clinton offered few details of her own. Now that we are past the conventions and into the campaign, presidential nominees Hillary Clinton and Donald Trump are likely to speak in more detail about their specific policies.

What is missing in the debate about free college, however, is a discussion of the role of Pell Grants, the centerpiece of the federal government’s student aid programs. These grants, which used to cover almost the entire cost of a college education for poor students, today cover less than a third. The current Republican budget proposal would erode them even further, threatening the ability of students from poor and moderate-income families to attend and graduate from college.

From my perspective as a researcher who has studied questions of college access for two decades, any discussion of free college has to include the role of Pell Grants in college affordability.

What are Pell Grants and why are they important?

Pell Grants were created in the 1972 reauthorization of the Higher Education Act. This coming academic year they will provide grant aid of up to US$5,815 to students from low- and moderate-income families.

Last year, over eight million undergraduates across the nation received a total of about US$30 billion in Pell Grants.


Data from the U.S. Department of Education show that in the 2011-12 school year (the most recent data available), 41 percent of all undergraduate students received a Pell Grant, almost double the 22 percent of students who received them in 1999.

For most students, the funding they receive from the Pell program outstrips what they receive in aid from either their state or the institution they attend.

Using data from the U.S. Department of Education, I calculated that the average Pell Grant recipient received an amount from that program that was five times greater than what they received in state grant aid and 2.6 times greater than the amount of scholarship assistance received from the institution attended.

Without Pell Grants, in other words, many low-income students would not be able to attend college, or would not be able to attend full time and make good progress toward earning their degree.

Pell Grant value dips, tuition increases

In a book I edited a few years ago, I demonstrated that back in the 1970s, a student attending a public, four-year university and receiving the maximum Pell Grant would have approximately 80 percent of the price of her college education – tuition, housing, food, books and miscellaneous costs – covered by the grant.

If the student had no resources of her own to contribute, the remaining 20 percent of the cost was often made up through state grants, scholarships from the university, work study and perhaps a small amount of student loans.

Today the maximum Pell Grant covers only about 30 percent of the price of attending college for that same student. The erosion in the value of the grant has two causes: 1) the rising price of college attendance and 2) a drop in the real value of Pell Grants.

Since 1985, average tuition prices at public four-year colleges and universities have increased 222 percent after adjusting for inflation. The situation at private four-year colleges and community colleges is only slightly better – average prices in those two sectors have increased more than 130 percent in real terms over the same three-decade period.

Pell Grants, in contrast, have grown much less rapidly. The average grant increased only 30 percent in inflation-adjusted dollars during this same period.

Former U.S. President George W. Bush after he signed a bill on Pell Grants. Larry Downing/Reuters

In the latter half of the 1980s and through most of the 1990s, Congress and a series of presidents – Ronald Reagan, George H.W. Bush and Bill Clinton – allowed the purchasing value of Pell Grants to decline even further.

The maximum Pell Grant actually dropped 19 percent in real dollars between 1985 and 1996. While federal funding over the last two decades has allowed it to regain some of its value, the maximum Pell Grant today is still below the 1975 level in inflation-adjusted dollars.

Impact of GOP proposal

As bad as this situation is, it could get much worse. The current Republican spending plan in the House of Representatives proposes to place a cap on the maximum Pell Grant, freezing it at its 2015-16 level for the next 10 years.

While it is hard to predict for sure what will happen to tuition prices over the next decade, it is fairly certain that prices will continue to rise. This will cause the value of the Pell Grant to erode even further during this period.

Students protesting against rising college costs. Max Whittaker/Reuters

For example, again based on my calculations, if college prices increase 3 percent per year over the next decade and Pell Grants are held at their current level, the grant’s purchasing power at public four-year institutions would drop from 30 percent of total college costs today to only 21 percent in 2026.

At private four-year institutions, the Pell value would drop from 17 percent of costs today to only 12 percent 10 years from now.
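For readers who want to see the compounding at work, here is a minimal sketch of that projection in Python. It is not the author’s own calculation: it simply assumes, as the article states, 3 percent annual price growth and a frozen grant, with starting coverage shares of 30 percent (public) and 17 percent (private) taken from the article. Small differences from the reported figures are likely due to rounding or base-year choices in the original.

```python
# A minimal sketch (not the author's code) of the purchasing-power
# projection described above: college prices grow 3% per year for a
# decade while the maximum Pell Grant stays frozen, so the share of
# costs the grant covers shrinks by the compounded price growth.

GROWTH_RATE = 0.03  # assumed annual increase in college prices (from the article)
YEARS = 10          # projection horizon, roughly 2016 -> 2026

def projected_coverage(initial_share: float, growth: float, years: int) -> float:
    """Share of college costs a frozen grant covers after `years` of price growth."""
    return initial_share / (1 + growth) ** years

# Starting coverage shares cited in the article: 30% at public
# four-year institutions, 17% at private four-year institutions.
for sector, share in [("public four-year", 0.30), ("private four-year", 0.17)]:
    future = projected_coverage(share, GROWTH_RATE, YEARS)
    print(f"{sector}: {share:.0%} of costs today -> {future:.1%} in ten years")

# Prints roughly 22% for public and 13% for private institutions --
# close to the article's 21% and 12%; the small remaining gap is likely
# rounding or a slightly different base year in the original calculation.
```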

The Republican proposal, if enacted, would undoubtedly have an impact on the college access and success of students from low- and moderate-income families. Constraining the grant aid available to them from federal financial aid programs could force more students to drop out of college, take longer to earn their degrees, or attend only a community college rather than a four-year institution.

The impact on college access for these students would be detrimental to the nation as a whole. As President Obama noted in his first address to Congress in 2009, the future growth of our economy will depend on having more workers with post-secondary credentials. Without a Pell Grant program that keeps pace with college costs, we will be unable to attain this goal.

Clinton and Trump should be talking about the issue of college affordability on the campaign trail. But they need to address all of the policies that help make college affordable for students and their families.

Funding for the Pell Grant program is a critical component of that.

The Conversation

Donald E. Heller, Provost and Vice President of Academic Affairs, University of San Francisco

This article was originally published on The Conversation. Read the original article.

Helping Students to Develop Presentation Skills

Show and Tell

As a young mom I was not familiar with the concept of “show and tell”. My eldest son was 4 years old at the time, and he had to take a toy to school, show it to the class, tell them a little bit about it and answer the eager audience’s questions. I thought this was such a great idea for introducing children to the world of public speaking and presentations! After all, public speaking is not necessarily a talent but a skill, and the younger a child is when they begin to learn this skill, the better.

Apart from being a mom, I am also a sixth-form teacher and am all too aware that some students genuinely struggle when asked to present information to a group. I can see that this may be a problem when students go on to tertiary education, and also later in life. For personal and professional success, effective presentation skills delivered in a confident manner are vital.

That is why presentation skills need to be nurtured from a young age, before the student really has an awareness of being in the spotlight and possibly being faced with stage fright. Public speaking and presentation skills can be fostered to such an extent that they become natural. “Show and tell” helps a child to prepare a talk about a concrete object rather than an abstract topic, it helps to create an awareness of vocal projection and, most importantly, it helps to build confidence.

Spotlight 

By the time my second son had to do “show and tell”, we had perfected the practice! We progressed from showing (and telling about) favorite toys to eventually using PowerPoint. By then my sons were 8 and 10, and their confidence surprised their teachers. “Show and tell” built their public speaking skills and helped them feel comfortable talking in front of a group of peers! They were also confident because every time they were expected to present information to the class, they were well prepared. Confidence and preparation are crucial to effective presentation!

My 7-year-old daughter has to talk about her summer holidays in class soon. I know that if she is well prepared, she will feel confident and be able to give a good presentation. She was super excited when I suggested that she make a mysimpleshow video to introduce her holiday experience. Afterwards she will also show holiday photographs and talk about each of them. I know that if the presentation goes well, she will be more confident and keen to present again when she gets her next spotlight topic.

Presentations

When asked about the basics of speech making, my advice to students and parents is simple:

  1. Prepare the speech/presentation very well – plan carefully what you’ll say and use speech cards with highlighted keywords
  2. Practice the presentation a few times – if possible, do it in front of a test audience, like your family
  3. Pay attention to proper posture – be mindful of weird mannerisms that may distract the audience
  4. Make eye contact
  5. Speak loudly and clearly
  6. Be confident! If the audience senses that you are nervous, they will also be nervous

My advice to teachers?

If you are teaching little ones:

  • Keep the “show and tell” and spotlight going from a young age. It does wonders to build confidence!

If you are teaching older students:

  • Regularly include short student presentations in your classes to emphasize the basics of speech making
  • Suggest various ways to make presentations more interesting to an audience, like the use of objects or the showing of short video clips as part of the presentation.

Educators play a vital role in helping students to learn and experience public speaking. Leadership in the community, business world or any organization demands effective presentation skills. Leaders are expected to be able to make presentations without any qualms. So, let’s foster great presentation skills from a young age and right through our students’ school careers, to ensure that they acquire a skill that will be very useful to them throughout their lives.    

LGW Irvine is a secondary school teacher specializing in history, performing arts and languages. With a keen interest in writing, she has published Teacher Planners and an AFL Teacher Handbook. Her presentations include in-depth courses in study methods and essay writing, as she has a particular interest in helping others reach their full potential in those areas. Her current projects include History Revision Guides as well as Study Methods workbooks.

Contact Information

Simpleshow gmbh

mysimpleshow.com

How Did We Get Here? Part V: Public Education as a National Need

This is one of a multi-part series on the progression of education policies in the U.S. from its founding. Click here to see a list of all the posts in this series

By Matthew Lynch

By the early 1900s, the idea that every American child had the right to an education had gained broad acceptance. Even students destined for a life in the mines, or on the railroads, deserved basic spelling, arithmetic and science lessons. Public schools were places to absorb the common learning priorities that other students were also absorbing throughout the country. This view of public schools gave all children (at least the white ones) an equitable start in life, at least when it came to the actual curriculum presented. School as a national institution was established with the sole purpose of giving students base knowledge. From there, students were free to carve out the lives they wanted, or to follow a predetermined path based on family or geographic limitations.

Just after the start of the 20th century, a new public education ideology began to emerge, hinting that schools should be more than spots to memorize facts. According to reformers like University of Chicago professor John Dewey, public schools needed to serve a greater good – for the individual and the country. Dewey was a leading figure of the Progressive Movement, which insisted schools be socially conscious places where more than book learning took place.

While Dewey’s theories were widely known and discussed, they did not see much realization at the height of his popularity. Much like reformers in today’s public school districts, Dewey faced bureaucratic red tape at every turn and a climate unfriendly to change. In the eyes of educators, schools were established for learning what was written in a textbook, not for other purposes, particularly ones that could easily be duplicated in homes.

Though slow to gain adoption in his own time, Dewey’s theories of public schools as socializers, and agents of change for the better, are certainly evident in school systems today. Consider public awareness campaigns, like First Lady Nancy Reagan’s “Just Say No” initiative that infiltrated schools in the 1980s, or the emphasis on Earth Day every April in public schools throughout the nation, or First Lady Michelle Obama’s current Let’s Move campaign that offers specific health awareness programs to schools. In the U.S., schools are the front lines for initiating change in behaviors as a nation and telling (more so than showing) students what is “right” or “wrong” in cultural terms.

Along with the base, common knowledge that accompanies the facts in textbooks, K-12 students in America are expected to know a set of life truths before they graduate, or decide to drop out, like: smoking will kill you, drugs will kill you, obesity will kill you, taking care of the environment is not an option, stealing is bad, going to jail is bad, lying is bad, and cheating is bad too. Though not religious institutions, public schools have transformed over the past century from agents of factual information into ethics-infused entities. It is not enough for students to pass a test at the end of each grade and at the end of a K-12 career; to be true contributors to society, they must have moral compasses and understand the responsibilities of citizenship.

Of course, some schools are better at this than others. In areas impacted by high poverty and crime rates, it is more difficult to graduate students who rise above their circumstances. Even middle- and upper-class school districts have their own bad apples. Still, these students are certainly aware of the right and wrong ways to live their lives, at least in theory, even if they do not truly believe those truths themselves. Dewey would certainly be proud of the socially conscious approach of today’s public schools, even if disappointed by the outcomes of such efforts.

Though his theories were not particularly political, Dewey’s ethically minded approach fed into the nation’s thirst for patriotism. Part of contributing to society was loving it, and all its symbolism, too. Consider the morning ritual of nearly every public school in the nation since the early 1920s: reciting the Pledge of Allegiance. In his young adult novel Nothing But the Truth, author Avi challenges patriotic rituals in public schools through the character of a young man who refuses to quietly listen as “The Star-Spangled Banner” plays in his classroom. The boy becomes a national celebrity, with both supporters and detractors. The supporters believe he was trying to sing along and should be celebrated for it. The detractors say he didn’t show enough respect for the song by refusing to stay silent as it played. On the final page of the book, the boy is asked to sing the national anthem on a radio talk show and admits he doesn’t know the words. The point of the novel, then, is that much of what students learn, at least when it comes to patriotism, is based not on intrinsic loyalty but on an imposed one.

The same can be said of the other ethics-based lessons that are part of American public schools today. The principles that constitute being a good citizen are, in the end, suggestions for living, not commands. Students are still guided by their own free will, though aware of the consequences. The idea, however, that schools should at least be presenting a socially conscious agenda started in the early-to-mid 20th century and still permeates K-12 public school classrooms today. Follow my series on the progress of the U.S. educational system to learn more about where we’ve been, and where we need to go, as collective educators.

How Did We Get Here? Part IV: Mann Reforms to Public Education

This is one of a multi-part series on the progression of education policies in the U.S. from its founding. Click here to see a list of all the posts in this series

By Matthew Lynch

The first attempt at regulating exactly what American students were learning in those schoolhouses came in 1837 from now-famed education reformer Horace Mann. When he took over as secretary of the newly created Massachusetts Board of Education, he set out to create a common way of teaching educational content, particularly to elementary students. He borrowed his ideas from a Prussian model that also stressed formal training for educators.

Along with shared content, Mann’s reforms brought about the first age-grade systems, in which students were promoted based on age, not academic aptitude. While this allowed greater concentration on subjects that increased in difficulty as students grew older, it also planted the roots of American students as passive learners rather than active ones. The idea that each student of a certain age should master content by a certain time, based on set criteria that were in no way customized, was adopted as a way to keep students moving through the public education system and advancing in their studies, rather than idling on topics they already knew while younger students learned them for the first time.

States around the nation rushed to duplicate Mann’s ideas in their own schools, and multi-age classrooms disappeared over the coming decades. As the American population rose, it just made sense to accommodate students in a more segmented way. Age grading was meant to improve the efficiency of classrooms and of the entire public education system. The more students who could be passed through the public schools, the better. It made economic sense, and in the minds of reformers like Mann, it also meant a more highly educated public.

Though Mann’s system of age grading was introduced over 175 years ago, it is still the main form of organization in public, and private/independent, schools in America today. While some students are retained (or held back) when they do not master the material at hand, the practice of socially promoting students based solely on their ages is more common than you might think. It is difficult to measure exactly how many students are passed on to the next grade based more on their age than on their academic merit, because teachers are understandably not keen to admit it. Retention is simple to measure but tells only half the story. Of the students who are not retained, how many of them should be?

In the past two decades, the social stigma attached to students who are held back a grade has begun to fade. A Public Agenda survey from 2003 found that 87 percent of parents would rather their children be held back than promoted if they have not mastered their grade-level material. There was once a time when a student who was held back was viewed as being outside the “norm” of what we have come to expect in U.S. classrooms. That is changing, though, as parents begin to push back against social promotion.

The so-called “redshirting” of Kindergartners is rising in popularity each year. Rather than having to make the decision to hold a child back (most often it happens in grades K-2), parents are simply delaying the start of school instead. In the mid-1990s, just 9 percent of children entering Kindergarten were age 6 or older. According to U.S. Department of Education statistics, by 2007 that figure had risen to 16.4 percent. It’s reasonable to assume this rise is due at least in part to the increased academic demands placed on children at such a young age. Redshirting is becoming so common that Kindergartners who are 6, going on 7, are not a strange sight at all.

While it does speak to the maturity of American parents, sending children to school later does throw a wrench into the traditional age-grade system. Teachers are often ill-prepared to deal with students who fall outside the age specifications of their classrooms, and in cases where both a 5-year-old and a 7-year-old are in the same room, understandable differences in behavior and maturity are evident. Adhering strictly to an age-grade system for only some students puts a strain on the others. Teachers who hope to avoid problems for their colleagues in higher grades often take the easier route of promotion by age.

Despite the pitfalls of the age-grading system, the positive impact of Mann’s endeavors should not go unnoticed. Along with age grading, he emphasized the need for mandatory attendance. Public education was not a perk of American life; it was a necessity. He believed that for the nation to truly advance, its youth belonged in classrooms (not just in fields or factories) and that states should implement attendance policies to support this view. While it took some time for this view to see mass appeal, his advocacy for mandatory public schooling found resonance. By 1900, 34 states had implemented compulsory schooling laws, 30 of which required students to stay in school until the age of 14. Ten years later, 72 percent of the children in the U.S. went to school. Just 10 years after that, every state had a compulsory attendance policy. By 1940, half of all young adults in the U.S. were high school diploma recipients. Follow my series on the progress of the U.S. educational system to learn more about where we’ve been, and where we need to go, as collective educators.


How Did We Get Here? Part III: The Birth of the American Public School

This is one of a multi-part series on the progression of education policies in the U.S. from its founding. Click here to see a list of all the posts in this series

By Matthew Lynch

There were public schools in America as far back as the mid-1600s in the Massachusetts Bay Colony, but the first truly American public schools began appearing in Pennsylvania at the end of the 18th century. Money was no object, as even the poorest of citizens were welcomed through the schoolhouse doors and offered a public education. The New York Public School Society came soon after, in 1805, and by 1870 all states had at least a minimal public program in place to educate students en masse. These programs were voluntary, though.

What was taught in these early schools varied by region but rested on a basic set of ground rules for bringing up American students right. Public education was meant to unite American families through a common interest: raising educated children who would soon be at the helm of the nation’s future. Basic education was not something reserved for the elite. Reading, writing and basic arithmetic were necessities of American life and were important when it came to guiding the young nation.

The learning resources of early America were understandably limited. It was too early for much variety in American-made textbooks or other learning tools. Many of the texts used in these schools had been developed in England and were repurposed for American pursuits. The need for purely American educational texts did exist, though, and slowly but surely they began to take shape.

In the 1780s, Noah Webster set out to create a textbook that would teach children the realities of spelling in the new land. Until that point, spelling textbooks were mainly imports from England that sought to teach kids the most unusual and difficult, yet least used, words in the English language. Webster saw the impracticality of this, and set out to change it. The American Spelling Book (shrunk down by Webster from a much longer, pompous title suggested by his editor) became a staple for learning in homes and the few organized educational models that existed. In accomplishing this, Webster established the first systematic method for learning in the young country that was practical, easy to navigate and widely used. Even as late as 1866, after many other spelling books had been written and updated, Webster’s original version was still selling 9 million copies annually.

In the early 1800s, several other publishing companies followed suit, piggybacking on the idea of nationalism through learning. Popular titles included The United States Spelling Book and The American Preceptor. When it came to arithmetic, the titles were more complex but the patriotic theme remained. A New and Complete System of Arithmetic Composed for the Use of the Citizens of the United States and A Plain, Practical and Systematic Compendium of Federal Arithmetic were two of the more popular textbooks widely circulated in the early 1800s. Using these purely American titles was still a new and exciting way to remind citizens that the country was in its infancy and that everything needed to be reinvented with a purely American spin. The idea that persists today – that universal texts need the American touch – was born in this era of national construction, when what was taught began to be just as important as how it was presented.

From the start, then, public education in the United States was about moving students collectively in the direction the nation wanted to go. Individualism and customized learning were certainly not common terms, and the choices for education were slim. The accepted curriculum for one American was good enough for another. This base learning was rooted in the need not only to obtain knowledge, but to use education as a way to build up a nation that was still teetering dangerously on the edge of failure. Parents did not encourage their children to learn spelling or arithmetic so they could have a “better life” but so they could continue to have a free one. Education was a means of survival, and banding together around the same education goals, at least when it came to common people, was a way to build the entire nation up. Sure, there was some educational elitism through private schooling and university systems, but when it came to the public institutions of learning, every student deserved and received the same set of knowledge.

As the country continued to expand, both in sheer numbers and in land mass, public education became more segmented. Until the 1840s, public schools were under local control, with little input from the state and virtually no federal oversight. Attendance was rising, though. The U.S. Census of 1840 shows that 3.68 million children ages 5 to 15 attended school, about 55 percent of the population in that age bracket. Around this time the one-room schoolhouse took shape, with older students acting as helpers to the younger ones. Students learned in common ways from teachers and from each other. When it came to teachers, there was no formal credentialing, which is why young, single women often filled the role. They were available, and able, and they served as educators until they married and were needed full-time in homes of their own. As in the post-Revolutionary days, education was the responsibility of everyone in the community.

It’s important to understand how public schools started in order to reach the point where we understand WHY they operate the way they do today. Follow my series on the progress of the U.S. educational system to learn more about where we’ve been, and where we need to go, as collective educators.


How Did We Get Here? Part II: Early Learning in America

This is one of a multi-part series on the progression of education policies in the U.S. from its founding. Click here to see a list of all the posts in this series

By Matthew Lynch

When the public school systems of America were first founded in the late 1700s, they were practical places. In a growing country trying to build a stable economy on the world stage, public schools stood as the building blocks for the next generation of U.S. workers. Students did not need more than a few years to learn the basics of what they would need to propel the nation forward to its next level. Socialization and side skills, like manners, were perks of the public school system but not primary goals. The success of a particular education path was determined by how well the student functioned in society upon completion.

Without a national system in place to address educational issues, the task fell on the shoulders of the private institutions of Colonial America. Education was not mandatory and there were very few paid educators. Learning endeavors were voluntary – both on the part of the students and of those who were able to teach. Knowledge did come with value attached, but not as much, say, as learning the skill of a particular trade. Apprenticeships were common and “idle” learning was viewed negatively.

Much of the prescribed education during this cornerstone period of the nation’s growth happened in the home. Learning to read was often a task assigned to mothers, and without access to paper, most colonial children learned to write by tracing letters in the ashes of the fireplace. When children had mastered enough to read on their own, they were handed a Bible or another piece of British literature. As a result of this devotion to learning set forth by mothers, most children who did attend school already knew how to read when they arrived at the age of 7 or 8.

That’s an interesting thing to consider, particularly since children entering today’s Kindergarten classrooms generally do not arrive proficient in literacy skills. Kindergarten students, at least the ones considered “ready,” should know the alphabet, how to write their first and last names, and how to count to 20. From there, today’s teachers are expected to turn them into voracious readers who fall in love with literacy. By the age of 5 or 6 (more parents than ever are choosing to delay Kindergarten starts by one year for their kids), the seeds of interest have already been planted in kids’ heads. What’s more – by the age of 5, many learning opportunities have already been missed. Research has found time and again that the first five years of a child’s life are the most vital to an individual’s overall knowledge base.

Waiting until Kindergarten to learn reading, math and other language concepts means a large missed window of learning opportunities. Children are hard-wired for learning but in contemporary America, the first five years of life are widely regarded as ones that belong to the “play” category. There are certainly things that are learned through play and everyday life but concentrated, organized learning tends to be reserved for years following the first half-decade of a child’s life. This was not the case when it came to Colonial families. Learning was a responsibility of home life and one that was not relegated to the few formal education systems that existed.

In the late 1700s, when children knew enough to reach their perceived potential in life, they left school to enter the workforce – whether on the family farm or in a trade outside their homes. Women used their own knowledge sets on the domestic front, and eventually to teach their own children. Very few Colonial-time students went on to college and very few really needed it. For all intents and purposes, school was for learning the practical side of life and for developing a shared sense of knowledge among the youth of the young nation.

Clearly, today’s U.S. educational system is much different — so how did we get here? Follow me in this series to learn more about the progression of education in America from those earliest days to contemporary classrooms.

Zero tolerance laws increase suspension rates for black students

F. Chris Curran, University of Maryland, Baltimore County

The State Senate of Michigan is currently considering legislation that would scale back “zero tolerance” discipline policies in the state’s public schools.

Zero tolerance discipline laws require automatic and generally severe punishment for specified offenses that could range from possessing weapons to physical assault. They leave little leeway for consideration of the circumstances of the offense.

The bill, already approved by the State House, proposes to add provisions that would consider the contextual factors around an incident, such as the student’s disciplinary history, and would ask whether lesser forms of punishment would suffice.

In other words, suspension and expulsion would no longer be as “mandatory” and there would be a little more “tolerance” in these state discipline laws.

As a researcher of education policy and school discipline, I would highlight that these revisions, some of which have been passed in other states, represent a significant change of course for state school discipline law.

In fact, my recent work and that of others suggests that the shift away from zero tolerance approaches is for the better.

Why zero tolerance policies were introduced

Throughout the 1990s, the number of states with zero tolerance laws, those requiring suspension or expulsion for specified offenses, increased significantly.

The rapid adoption of such laws was spurred in part by the passage of the 1994 Gun-Free Schools Act, federal legislation that required states to adopt mandatory expulsion laws for possessing a firearm in school.

These safety concerns were further heightened by the 1999 shooting at Columbine High School, a public high school in Littleton, Colorado.

Following Columbine, by the early 2000s, nearly every state had a zero tolerance law in place. Many of these laws expanded beyond firearms to include other weapons, physical assaults and drug offenses.

Push back against zero tolerance

Clearly, such zero tolerance laws were meant to improve the safety and order of the school environment. However, in recent years, they have been seen as being overly prescriptive and as contributing to racial disparities in school discipline.

For instance, there are cases of students being suspended for accidentally bringing a pocketknife to school. In one high-profile case, a student was suspended for chewing a pastry into the shape of a gun.


Additionally, federal data show that black students are suspended at rates two to three times higher than their white peers.

As a result, in 2014, the U.S. Department of Justice and Department of Education issued a joint “Dear Colleague” letter directed to public school districts. The letter was a call for reductions in the use of suspensions and expulsions and, instead, for a focus on ensuring the fair use of school discipline for students of all backgrounds.

Here’s what new research shows

In a newly published study, I explored the implications of state zero tolerance laws – laws that require school districts to adopt zero tolerance policies.

In particular, I sought to find out if they contributed to increased use of suspensions and if they led to racial disparities. Given claims by proponents of such laws that they increase the safety and order of the school overall, I also wanted to see if these laws contributed to decreases in perceptions of problem behaviors in the school as a whole.

I used national data collected by the U.S. Department of Education as part of the Civil Rights Data Collection and the Schools and Staffing Survey. The sample included thousands of school districts and principals spanning the late 1980s to the mid-2000s.

The study revealed three important findings.

First, the study showed that state laws requiring schools to have zero tolerance policies increased suspension rates for all students. Second, suspension rates increased at a higher rate for African-American students, potentially contributing to racial disparities in discipline. Finally, principals reported few decreases in problem behaviors in schools, suggesting that the laws did not improve the safety and order of schools.

The findings, in context

The findings show that the adoption of state zero tolerance laws results in increases in district suspension rates. For the average-sized district, such laws resulted in approximately 35 more suspensions per year.

Though this number may seem small, the potential impact is quite large.

A recent study by researchers at UCLA, for example, suggests that a one percentage point reduction in the suspension rate nationally would result in societal gains of over US$2 billion through reduced dropout and increased economic productivity. In short, state zero tolerance laws may be imposing significant financial costs on society.


Furthermore, the burden of these costs is not shared equally across all groups.

The results of my study suggest that the increase in suspension rates for black students as a result of these laws is approximately three times the size of that for white students.

Coupled with other research that finds links between zero tolerance policies and racial disparities, this finding demonstrates that these laws, though supposedly neutral with regard to race, are disproportionately impacting students of color.

Recent data released by the U.S. Department of Education’s Office for Civil Rights also point to persistent disparities by race in the use of school discipline.

No reduction in misbehavior

Proponents of zero tolerance discipline have argued that the use of suspensions and expulsions increases the safety and order of the learning environment as a whole. My study found evidence to refute this claim.

In my data set, principals rated the degree to which various behaviors (e.g., fighting, disrespect, use of drugs, weapons) were problems in their schools.

I found that, in the view of principals, the presence of a state zero tolerance law did not decrease their ratings of the degree to which these various behaviors were problems. In other words, state zero tolerance laws did not appear to be contributing to improved levels of safety and order overall.

What the results mean for policy and practice

Students, parents and other stakeholders have an expectation that schools should be safe and orderly environments that treat all students equitably. While it is imperative that schools take active steps to achieve these goals, the findings of my work call into question whether state zero tolerance discipline laws are the most effective way to do so.

While suspension and expulsion may still be appropriate tools in some circumstances, it is important for schools to consider context, and states to allow such discretion, in the administration of school discipline. Furthermore, it is important to have safeguards in place to ensure that such discretion is utilized equitably for students of color, who too often experience disproportionate disciplinary exclusion.

The revised disciplinary laws under consideration in Michigan, and similar revisions to school disciplinary policies in other states, represent more promising steps toward ensuring effective and fair school discipline.

The Conversation

F. Chris Curran, Assistant Professor of Public Policy, University of Maryland, Baltimore County

This article was originally published on The Conversation. Read the original article.

The Diversity Responsibility Colleges Face Following the 2016 Election

The presidential race between Hillary Clinton and Donald Trump will go down as one of the most unpredictable, and contentious, in American history. The candidates’ personalities, paired with the 24-7 news cycle and social media, made an already inundated time for political messaging completely saturated. It pushed people to their breaking points, revealing the worst in some and the true colors of many.

The end result is a country that will truly never be the same. Whether it’s neighbors with opposing yard signs who can no longer see eye to eye, or family members disinvited from Thanksgiving celebrations, the very real impact of this election season will linger long after the votes were cast.

Things have changed for colleges and universities too. It’s too soon to know exactly what to expect in the way of legislation, funding and federal support for the higher education landscape over the next four years, but there are some intangible effects that are already evident. The most basic of these lessons is this: We aren’t as far along as a diverse nation as we thought.

And it isn’t just set-in-their-ways adults, either. On the night of Trump’s victory, a black baby doll with a noose around its neck was found in an elevator at predominantly white Canisius College in Buffalo, New York. This is just one example proving that the nation’s youngest adults are not all enlightened when it comes to diversity and equality; there is still a lot of work to be done, and much of it should happen on college campuses. Yet on the higher education scene, students are still being marginalized – whether that discrimination is direct or a result of policy.

So where do we go from here? As a collective college and university system, how do we piece together our latest revelations about our nation and apply them to building a more diverse ecosystem?

Colleges must recognize the new normal.

It starts with colleges acknowledging that we truly aren’t as progressive, as a nation, as we thought. Of course, those of us who have made diversity our life’s work have long been aware of the holes in the equality spectrum. But now an entire nation is seeing them, some people for the first time. Whether they choose to acknowledge it or not, a deeper awareness of the plight of many marginalized Americans was revealed during election season. It will impact the way we treat each other and it will impact the atmosphere of college campuses. People simply know more. That knowledge raises the stakes for what colleges teach and how they interact with their student bodies.

Colleges must acknowledge everyday injustices.

Discrimination isn’t always outright. It doesn’t always manifest in hate crimes or racial slurs. Many times it is subtly ingrained in our societal fabric – penetrating our psyche to the point that we don’t even notice it anymore. This is especially true for the traditionally privileged of society – the white, middle-class males, if we are going for a stereotype. The unfair things these Americans have faced pale in comparison to those faced by minority groups, and even women. When you have never been exposed to the kind of establishment racism and division that is common to disadvantaged populations, that type of existence feels far-fetched – maybe even made up. It takes movements like Black Lives Matter, or …, or even the obvious xenophobia and racism that arose during the election season to really wake a person up.

Colleges must step up when it comes to eliminating inequality across the board with a more proactive approach. Instead of having a crisis team on call, universities must work consistently to give all students the opportunities they deserve. They must also call on the workforce beyond the college years to do the same. Where there is a student at a disadvantage, questions must be asked as to what led to that point – and how it can be fixed.

There is no easy fix for where we are as a nation when it comes to diversity. Colleges and universities have the responsibility to spearhead positive change, though. The next generation of adults deserves better opportunity and higher education is the starting place.