
10 Tips for EdTech Entrepreneurs

Edtech entrepreneurs all set out with good intentions: to improve education. While noble, these intentions meet many obstacles, and failure comes easily. Only 14% of teachers use digital curricula weekly, so while the doors of opportunity seem open, the market is saturated with products that never make their way into the classroom. If edtech entrepreneurs are to flourish, they need to address the fears that educators have about technology while pushing innovation. Below are 10 tips for edtech entrepreneurs.

  1. Talk to Teachers and Students

As with any product, you need to know what the consumer needs. Market research is a fundamental aspect of entrepreneurship, and the edtech industry is no different. A product created without an understanding of its practical use, its classroom implementation, or students’ needs will fail. Entrepreneurs also need to be aware that their ideas about education (which may stem from their high school days) may be outdated, with those concerns long since addressed.

  2. Create EdTech That Serves a Purpose

Not all edtech entrepreneurs have worked in the education field. It is, therefore, important that the product be defined and that it serves a purpose. Entrepreneurs should never aim to replace educators but aid students and teachers to better do their work. Having a defined purpose is also vital when approaching investors.

  3. Do Research into Pricing Options and Investors

The death of any edtech product is unrealistic pricing. As with any product, profit margins should be slim in the beginning. Aim to appeal to a broad market and be aware of pricing models. EdSurge provides a comprehensive insight into pricing models and how startups can best price their product in the hopes of both enticing buyers and making a profit.

  4. Start Small

Edtech entrepreneurs would love to have thousands of children using their product, but that reality is only accomplished over time. Be realistic about who the product is aimed at and how much content will be available. Products that offer 500 unique lesson plans are not built overnight. Start small by offering consumers a product that is budget-friendly for both parties and whets their appetite for more.

  5. Assemble a Group of Creative Minds

Any edtech product requires the skill sets of a number of people. By employing or co-creating with individuals who share your vision, you can be reassured that the end product will be able to compete in the market. Sourcing freelance web designers and content writers is an excellent way to start: their rates are usually lower than those of full-time professionals, but the quality can be comparable.

  6. Download and Play with EdTech

In order to create a unique product, it is important to know what is on the market and how those products are succeeding or failing. The best way to do this is to download and use as much edtech as possible. Not only does this give you insight into the competition, but it ensures that the product being created is not a duplicate. Sites like eLearning Industry have databases full of products available for free download.

  7. Stay on Top of EdTech Trends

The edtech industry moves quickly, and for a product to succeed it must be both innovative and accessible. Reading articles, attending seminars, and staying in the know increases your chances of creating a product that is in line with what is on offer. Thinking outside the box is always encouraged, but the product must be in line with current technological literacies.

  8. Advertise Smart

Advertising should make up 7–8% of the gross revenue of any new business. On top of this, it is important to advertise in the right spaces. Be proactive in finding web pages, magazines, and other ways of getting your product seen by educators and prominent insiders. Advertising is fundamental; without it, a great product can go unnoticed.

  9. Interact with the EdTech Community

Networking may sound like a buzzword from the early 2000s, but for new entrepreneurs it is invaluable. Connecting with other professionals in edtech opens doors to meeting investors, collaborating, and learning from influential entrepreneurs who have found success. With the internet, networking is easier than ever.

  10. Stay True to Your Goals

This point may sound sentimental, but staying true to goals is vital for any emerging business. Edtech products that promise to improve students’ vocabulary should do just that. This is not only a good life lesson; investors, educators, and advertisers will be more likely to back a product that stays true to its initial intentions. Growing and morphing are always good, but at the beginning they do little to encourage success.

So, if you have an excellent idea for an edtech project or are already in the process of creating one, these tips offer something for everyone. It is important to stay focused and make sure that the good intentions that fueled the idea materialize into a product that any teacher would be proud to have in their classroom.

Disengaged Students, Part 19: Lack of Support for Teachers

In this 20-part series, I explore the root causes and effects of academic disengagement in K-12 learners and explore the factors driving American society ever closer to being a nation that lacks intellectualism, or the pursuit of knowledge for knowledge’s sake.

New teachers enter the job with a certain amount of naive enthusiasm.  Experience discourages them rapidly. This is likely true to some degree in every profession, but teachers seem particularly vulnerable to burnout. Compare the attitude of first-year teachers to that of teachers who have put in two or more decades on the job. Perhaps you know people in both groups firsthand. The veteran teachers are likely to lack enthusiasm.  Even if they have a little excitement left, they have generally stopped voicing new ideas or looking for ways to innovate their classrooms and schools. Some of this is the natural consequence of time and fatigue which take their toll on people in every career. But this weariness is intensified by working in a highly demanding job that comes with little support and with much adversity caused by students, parents, administrators and the community at large.

The Underappreciation of Teachers Is Underestimated

If you think it is inaccurate for teachers to be painted as martyrs, consider these facts. The National Center for Education Statistics found that the average teacher’s salary was only 3 percent higher in the 2010–2011 school year than it was in 1990–1991. At the end of the 2012 school year, the number of days teachers took for stress-related time off was 10 percent higher than four years earlier. American teachers make only 67 to 72 percent of what a person with a bachelor’s degree typically makes over the course of a career.

Some professions may be able to implement measurable and reasonable benchmarks for higher compensation. Teaching has too many intangible outcomes that are not valued by cut-and-dried American policymakers and the public. While teachers who demand better working conditions and pay are often portrayed as greedy or uncaring, most educators say that their gripes are not about their paychecks per se but about the value society places on their work. In a culture that shows appreciation through financial rewards, teachers are clearly undervalued.

Teaching Performance and Pay

This issue is further complicated when pay-for-performance rules are implemented. Such rubrics compromise the teaching experience.  They reward teachers for teaching to standardized tests instead of encouraging critical thinking skills that provide a long-term foundation for educational betterment. Performance-based incentives for educators are flawed because they turn students into assets that have an impact on the earning potential of teachers.

Teachers need accountability, but their work rightly includes many immeasurable goals that fall outside of testable material. All occupations have their own forms of review and feedback, but the constant-testing culture of contemporary education leaves many teachers feeling inadequate and under-appreciated.

For teachers to truly lead students out of academic disengagement and into a thirst for knowledge, they need to know that others believe in the value of their vocation. Children may not naturally cling to intellectual initiatives, but if enough adults in their lives do, it can have positive results for academic engagement. It is a teacher’s job to impart academic rigor and a love of learning that go far beyond outcomes measured on a standardized test. Without encouragement from outside forces, that job becomes nearly impossible.


Teaching the next generation of cybersecurity professionals

Nasir Memon, New York University

Each morning seems to bring new reports of hacks, privacy breaches, threats to national defense or our critical infrastructure and even shutdowns of hospitals. As the attacks become more sophisticated and more frequently perpetrated by nation-states and criminal syndicates, the shortage of defenders only grows more serious: By 2020, the cybersecurity industry will need 1.5 million more workers than will be qualified for jobs.

In 2003, I founded Cyber Security Awareness Week (CSAW) with a group of students, with the simple goal of attracting more engineering students to our cybersecurity lab. We designed competitions allowing students to participate in real-world situations that tested both their knowledge and their ability to improvise and design new solutions for security problems. In the past decade-plus, our effort has enjoyed growing interest from educators, students, companies and governments, and shows a way to close the coming cybersecurity workforce shortage.

Today, with as many as 20,000 students from around the globe participating, CSAW is the largest student-run cybersecurity event in the world. Recruiters from the U.S. Department of Homeland Security and many large corporations observe and judge each competition. (Registration for this year’s competition is still open for a little while.)

But the pipeline for cybersecurity talent cannot begin in universities. High school students and teachers also participate in CSAW events to teach young people the computer science and mathematics skills that will allow them to succeed at the university level.

Teaching students to be adversarial

Thousands of students join together to learn about cybersecurity. CSAW, CC BY-ND

The main draw of CSAW is our Capture the Flag event, a contest in which the team members must pool their skills to learn new hacking methods in a series of real-world scenarios. Named after the outdoor game where two teams play to find and steal the enemy’s hidden flag, it includes multiple games that cover a broad range of information security skills, such as cryptography (code-making and breaking), steganography (hiding messages in innocent-looking images or videos) and mobile security.

Teams start by being assigned systems that have security flaws, and are given a certain amount of time to identify and fix them. Then each team is set against an opponent, and must protect its own system while attacking the other team’s. The hidden “flags” are data files stored on the opposing system. In the real world, these would contain critical information – such as credit card numbers or codes for controlling weapons. In the game, they contain information that proves a team “captured” that “flag,” with which the team is awarded a certain number of points, based on how difficult that particular challenge was.
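To make the flag-and-points format concrete, here is a minimal sketch of the cryptography side of a Capture the Flag task: a flag hidden with a single-byte XOR cipher that an attacking team recovers by brute force. This is an illustrative toy, not an actual CSAW challenge; real contest problems are far more involved.

```python
# Toy CTF cryptography challenge: the "flag" is a data file (here, a string)
# encrypted by XOR-ing every byte with one secret key byte. The attacking
# team brute-forces all 256 keys and checks for the known flag format.

def xor_encrypt(plaintext: bytes, key: int) -> bytes:
    """XOR each byte with a single-byte key. XOR is its own inverse."""
    return bytes(b ^ key for b in plaintext)

def capture_flag(ciphertext: bytes, flag_prefix: bytes = b"flag{") -> bytes:
    """Try every single-byte key; return the plaintext that looks like a flag."""
    for key in range(256):
        candidate = xor_encrypt(ciphertext, key)  # decryption == encryption
        if candidate.startswith(flag_prefix):
            return candidate
    raise ValueError("no flag found")

# The "defending" team hides the flag; the "attacking" team recovers it.
hidden = xor_encrypt(b"flag{xor_is_not_encryption}", 0x5A)
print(capture_flag(hidden).decode())  # flag{xor_is_not_encryption}
```

Submitting the recovered flag string is what earns a team its points, with harder challenges (stronger ciphers, layered defenses) worth more.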

There are many Capture the Flag competitions held throughout the country, which helps make our event the most popular of the week’s six competitions. It is also the most grueling: Teams must work for 36 hours straight, testing each participant’s ability to stay focused enough to create new solutions to emerging problems.

This type of challenge-based learning is vital in a field in which new threats emerge regularly. It also instills in students an adversarial mindset, which is an essential quality for successful security professionals. Learning the different ways to break a system firsthand is a vital first step to learning how to secure it.

Adapting on the fly

In one CSAW competition, the Embedded Security Challenge, students break into teams that must be able to work quickly at both attacking and defending each other from various threats. This is an attack/defense game like Capture the Flag, but focuses on vulnerabilities in hardware, rather than software. Last year, competitors were tasked with altering the digital results of a mock election – exposing potentially real threats to everyday elections.

This ability to quickly adapt as new threats are perceived is a top priority for security personnel. That’s a key element of all CSAW competitions – the idea that successful cybersecurity is not limited to mastering what’s known. Rather, students and professionals alike must constantly push their abilities to intercept future threats in an ever-evolving field. The cybersecurity industry – and all operations that rely on it, from small businesses to major military installations – depends on its practitioners’ ability to innovate. Every year, we change the types of challenges to reflect new threats, such as the recent rise of ransomware.

Cybersecurity efforts must extend well beyond national borders; this year CSAW will dramatically increase its international activities. A collaboration with NYU Abu Dhabi and the Indian Institute of Technology Kanpur will allow teams in the Middle East, India, North Africa and the United States to compete simultaneously. The competitors in these games in an educational setting, in the U.S. and around the world, will – not long from now – be the protectors of our most sensitive personal and national data. We need them to be prepared.

Nasir Memon, Professor of Computer Science and Engineering, New York University

This article was originally published on The Conversation. Read the original article.

Where are new college grads going to find jobs?

Michael Betz, The Ohio State University

College graduates of the new millennium are different from previous generations, not just because they prefer Snapchat to email and have mountains of school loans, but also because of where they choose to live.

In the past, several factors such as the proportion of a city’s workers who are college educated, job prospects, income levels, and city amenities have influenced college graduates’ decisions on where to live.

So, are college graduates looking for the same things as they did in the earlier decades? Or has a changing world affected what attracts educated workers?

These choices matter to city officials, who work hard to attract the next generation of skilled workers through the lure of exciting new entertainment districts or by becoming a hub of a trendy new industry.

As a regional economics researcher, studying the migration of highly skilled workers, I found that there have been changes in how college graduates choose where they go to work, which has implications for the health of the national economy.

Grads in the 1990s

Previous research found that in the 1990s – after controlling for lots of other city characteristics like population, income and amenities – the proportion of the population with a college degree increased more in cities that already had lots of college grads.

Boulder, Colorado. Let Ideas Compete, CC BY-NC-ND

For example, small college towns like Boulder and Ann Arbor had some of the country’s highest proportions of populations holding a bachelor’s degree – around 42 percent of the population at the start of the 1990s.

These towns ranked first and eleventh in the population of college graduates. Over the next decade, they added 10.2 and 6.2 percentage points to their respective totals.

There are many reasons why highly educated workers might be attracted to towns with a high population of college graduates.

Art fair in Ann Arbor, Michigan. Michigan Municipal League, CC BY-ND

One, there is evidence that productivity from better educated workers can “spill over” to other workers, enabling them to earn higher wages. Two, better educated places tend to offer more cultural amenities and entertainment options, which college graduates value highly.

If this trend goes on long enough, it could lead to segregation by education level across cities, as cities with better educated workers continue to attract more educated workers. In fact, one prior study suggests this is exactly what happened in the three decades between 1970 and 2000.

We were interested in seeing whether these trends still held, or if different factors were now luring graduates. So what is this generation of graduates looking for?

Large cities attract grads

A lot has changed for college graduates of the new millennium.

After all, in the last decade there have been two recessions (including the Great Recession in 2008), a continued decline in overall interstate migration, and significant industry restructuring from the loss of so many manufacturing jobs. Other studies have highlighted the trend of college-age adults living with their parents longer, providing some evidence that the struggling economy was swaying behaviors.

We wanted to see if the same economic forces had also changed where college graduates were moving.

We used data from the U.S. Census on the share of population who are college graduates in 358 Metropolitan Statistical Areas in the United States for two separate decades: 1990-2000 and 2000-2010. We controlled for many other city characteristics like size, median income and natural amenities.

We found it was not necessarily the better educated cities that attracted graduates as in the 1990s, rather it was the large cities (holding city education levels constant) that attracted graduates post-2000.

Bigger cities are attracting more graduates. Aurelien Guichard, CC BY-SA

Bigger cities usually have more diversified economies and therefore pose less risk of unemployment. It could be that graduates preferred places where they would increase their odds of landing any job during a time when the overall economy was struggling.

In previous decades, graduates might have risked failure by chasing down that dream job in a city that was less diversified and therefore riskier. Take Des Moines, Iowa, for example: a city of about 200,000 people that is highly specialized in the insurance industry. The percentage of the population who were college grads increased by six percent in the 1990s but slowed to less than four percent after 2000.

We believe that, because the economy was so strong in the 1990s, graduates were willing to take those risks. They went to cities with industries that employed highly educated people and were growing fast, but that were also riskier. If things didn’t work out, they believed they could always find a decent job somewhere else.

In the first decade post-2000, much of that perceived job security had eroded.

What can policymakers do?

Our research is important because it further refines models of regional college graduate growth and updates our understanding of how the mix of factors influencing city college graduate growth has changed in the new millennium.

The link between education and economic growth is well-established and intuitive. Better educated workers are more productive in their jobs and more likely to start new businesses that create even more jobs. College graduates are more productive workers, have higher civic engagement, and help create new businesses.

Because of these things, our results have important policy implications for decision-makers aiming to lure talented workers to their cities. While city officials can do little to make their cities quickly grow in population, they can enact policies such as investing in education and supporting local entrepreneurs that create more stable labor markets.


Michael Betz, Assistant Professor of Human Development and Family Science, The Ohio State University

This article was originally published on The Conversation. Read the original article.

20 Top Virtual Reality Apps that are Changing Education

*The Edvocate is pleased to produce its “Best of the Best” resource lists. These lists provide our readers with rankings for education-related blogs, twitter accounts, influencers, products, etc. These lists are meant to be fluid, and for that reason, they are regularly updated to provide up to the moment information.*

Virtual reality is one of the hottest edtech trends. Students are not only given the opportunity to immerse themselves in a subject but can travel the world from their desk chairs. While not readily available in every classroom, programs such as Google Cardboard aim to make VR headsets cheap and accessible. The majority of students in the USA own a cell phone, and with many of these educational apps available on both iOS and iTunes-enabled devices, they are becoming accessible to more students. Educationally, these VR apps allow students to visualize concepts that were once confined to the pictures in a textbook. Below are 20 virtual reality apps that are changing education.

  1. Star Chart – With over 20 million users, this app brings the universe a little closer. Students can learn about constellations by aiming their phones at the night sky, and additional features let them interact with facts about planets and space discovery.
  2. Google Translate – While conventional Google Translate may not sound like a VR app, its new camera feature lets students translate 30 languages by aiming their camera at printed text and watching in real time as it is translated. This additional feature is great for language students.
  3. Cleanopolis – Fighting climate change becomes interactive with this app. Students learn about CO2 and battle alongside Captain Clean to save the world. Not only is this a fun game, but its educational quality would make it great in any science classroom.
  4. Public Speaking VR – Practice the skills of public speaking with this immersive VR experience. With photorealistic environments, students can prepare for a job interview or a class presentation.
  5. Quiver – Watch colored-in creations come to life with Quiver. Through VR technology, 2D images become 3D and “walk” off the page. Ideal for younger students.
  6. Boulevard – Art classes can now be supplemented with visits to some of the world’s best art museums. Students can tour six art museums, interact with famous artworks, and learn about the art, all thanks to the advancements of VR technology.
  7. Unimersiv – History comes alive with the apps developed by Unimersiv. Students can explore ancient Greece, the Titanic, or the Egyptian mysteries.
  8. InMind – Neurons and brain tissue have never looked more realistic. Travel into the brain and learn about anatomy with this great app.
  9. Apollo 11 VR – Be part of one of the most significant space expeditions. Through VR technology, students get a front seat in this documentary-style app. This award-winning app is pushing the possibilities of VR as an educational tool.
  10. Earth AR – See the globe from new, unseen angles. Motion detection and zooming capabilities make geography more interactive.
  11. Cospaces – Creating virtual realities is not as impossible as it sounds. Students are actively involved in the creation and creative process that goes into building a VR world.
  12. TiltBrush – Creating 3D paintings is every artist’s dream, and now with TiltBrush it is a reality. Painting is done using a handheld “paintbrush,” and the creation possibilities will be awe-inspiring for any creative student.
  13. Anatomy 4D – Study the human body with clear images that come to life. Ideal for biology students or anyone with an interest in the inner workings of the body.
  14. Sites in VR – Explore famous landmarks in all their splendor. With an emphasis on Islamic temples, tombs, and ancient cities, students will get to see sites that would otherwise be inaccessible.
  15. King Tut VR – Explore the tomb of the legendary Egyptian king and get lost in secret chambers full of hieroglyphics and treasures.
  16. Flashcards: Animal Alphabet – Made for younger students, this immersive flashcard game teaches students words while bringing it all together with some colorful animal friends.
  17. Imag-n-o-tron – Stories jump off the page with Imag-n-o-tron. Downloadable content makes this app suitable for any age. Students improve their reading while engaging with complementary images, making the VR world an educational space.
  18. EON Experience – This collection of VR lessons encapsulates everything from physics to history. Students or teachers can create their own VR lessons from preloaded content.
  19. Titans of Space – This guided tour of space is as informative as it is breathtaking. With voice-overs, facts, and scored music, it is a cutting-edge VR product.
  20. Discovery VR – The Discovery TV channel compiled all the content for this app. Students can explore exotic natural locations and interact with our planet in a futuristic way.

Virtual reality allows students to engage with educational content in a whole new way. They can travel, create art and dissect a frog. The inclusion of VR into the curriculum will also allow teachers to supplement their talents with that of engineers and pioneers of virtual technology.

Disengaged Students, Part 18: Religious Fundamentalism, TV and Anti-Intellectualism

In this 20-part series, I explore the root causes and effects of academic disengagement in K-12 learners and explore the factors driving American society ever closer to being a nation that lacks intellectualism, or the pursuit of knowledge for knowledge’s sake.

While the struggle between intellectualism and anti-intellectualism is longstanding, the 40 years leading up to the first election of President Barack Obama have been especially harsh for intellectuals.

In her bestselling book The Age of American Unreason, historian Susan Jacoby identifies some key factors in this harsh climate. A rise in religious fundamentalism, a decrease in Americans’ general knowledge of the sciences, and the public’s reliance on “infotainment” to deliver facts have all contributed to the dangerous ideological conditions that citizens face collectively. Jacoby talks about how legendary American ideas like the “self-made man” actually do more harm than good when it comes to progress, and how the celebrity status of both right- and left-wing pundits preys upon the vulnerabilities of everyday Americans.

Faith versus Fact

Religious fundamentalism, loosely defined as a set of beliefs that adhere strictly to a religious text despite new knowledge or changes in modern thought, tends to spike during times of intense or rapid change. If you look at Jacoby’s 40-year timeline, beginning in 1978 and ending as the first African-American President was awarded a spot in the White House, the reasons for a rise in fundamentalist thought systems are easy to surmise. In America’s nearly 250 years of history, change has never stopped barreling forward, and the momentum of the past four decades continues to gain steam.

Fundamentalists generally dislike change and view it as a departure from simpler times and therefore as a danger to their way of life. Fundamentalists tend to like absolutes and singular, straightforward answers to topics that others may consider more complex. If one passage in the Bible, for example, can be thrown out as a story or parable rather than a factual description, then who is to say that the rest of the scriptures are literally true? Though not always tangibly damaging to its adherents, fundamentalism generally bars progressive thought in science or in other realms of knowledge.

When a core population becomes increasingly fundamentalist, an argument can be made that the intellectual progress of the entire nation is hindered, at least in part; and yet it is the American way to encourage citizens to express religious beliefs of all varieties even if those beliefs run counter to scientific or other fact-based thought.

TV: Educational Help or Hindrance?

Jacoby also explores the educational value of television as a medium. She admits she was raised in a Midwestern household that put faith in the potential for television to educate a wider group and bring heightened awareness to world issues. She notes that the original teaching nature of television has given way to a culture which depends on what emanates from the airwaves instead of using that information as a supplement to other forms of self-discovery and education. The Internet has only compounded video culture, making it possible for amateurs with unfounded opinions to carry the same weight as well-learned experts on the same issues.

But with so many accessible opinions, shouldn’t the state of intellectualism actually be improving? Understanding other opinions, after all, is essential to truly having a well-rounded, and therefore well-grounded, view of the world. The problem here is that too much information has the tendency to overwhelm and saturate readers. Rather than take the time to read, compare and debate each position, people (Americans, especially) tend to choose the quickest, easiest answer that best fits their belief systems instead. The availability of so much information, and the tendency of perception to favor presentation over actual facts, has created a culture that has far too many ideas to process. The result is a blanket “live and let live” approach to dissenting opinions that may seem tolerant and even intellectual at first glance, but that reeks of indifference.

Lacking Basic Knowledge

Anti-intellectualism also manifests itself in Americans’ lack of general knowledge on topics outside their immediate needs. In her book, Jacoby talks about the juxtaposition between the record-high numbers of U.S. college graduates and the large percentage of Americans who seem to lack basic, foundational knowledge. A 2009 study conducted by Harris Interactive and commissioned by the California Academy of Sciences and Citizens discovered that most Americans fail when it comes to answering basic science questions correctly. Respondents could not accurately answer how long it takes for the Earth to revolve around the Sun, or what percentage of the Earth is covered with water.

Despite these dismal results, 80 percent of the Americans surveyed insisted that science education is “absolutely essential” to navigate issues like the U.S. economy, healthcare system and global reputation. This wacky divergence of idealized knowledge and actual knowledge is a classic example of anti-intellectualism in American culture. Citizens place high value on scientific aptitude but on the other hand, are not terribly concerned by the fact that they do not possess it.

Perhaps having too much access to knowledge has actually had the reverse intended effect. Could it be that the information is so easy to come by that Americans have just stopped trying?

Coding, Robotics and the Jobs of the Future

Since as early as the 1800s, fears of robots taking over human jobs have been a reality. As we enter the true age of robotics, those concerns are resurfacing, and educators are unsure which jobs their students will be competing for. For example, IT jobs will grow by 22% through 2020, and jobs in STEM are expected to see similar growth. Educators are expected to equip their students with skills that will translate into careers, yet they have no idea what those skills should be. While timeless skills such as critical thinking, languages, and mathematics aid in every career, they do not provide the specialized skills that “jobs of the future” may require. So, what are the jobs of the future, and how can we best prepare students for them?

Programming jobs are growing 50 percent faster than the job market overall. With such rapid growth, it is important to note that not all coding jobs fall within the technology sector: health care, manufacturing and finance need coders just as much as the tech industry does. If current K-12 students are to fill these positions, they need to engage with STEM subjects from a young age. Coding products and “beginner guides” are being marketed to children as young as three in the hopes of sparking a passion for coding.

Coding is the backbone of many technologies, and in the future it will be an important tool for entrepreneurs and innovators. However, only one in 10 U.S. schools teaches children to code, so companies look to cheaper (foreign) coders for positions available in the USA. If schools are to align themselves with the future job prospects of their students, it is vital that coding be taught as a skill in its own right rather than merely touched on in computer class.

Robotics is another career field set to see exponential growth. The global competition to create artificial intelligence (AI) is aggressive, and robotics engineering is touted as a job of the future. As with coding, the need for robotics spans job sectors and is not solely focused on creating AI. In 2015, the robotics industry saw a 15% increase in sales, which shows that people are making and buying more robots than ever.

The Robot Academy and other organizations have recognized the lack of robotics in traditional school curricula and aim to provide resources for both teachers and students. STEM subjects are vital for future careers in the robotics sector, and if students are not offered the opportunity to create, program and think about robots, there will be a shortage of these skills in the future.

However, not all “jobs of the future” have their roots in technology. According to the US Bureau of Labor Statistics, these are some of the “Top 30 Fastest Growing Jobs by 2020”:

  • Veterinarians
  • Mental health counselors
  • Meeting, convention and event planners
  • Home health aides

What is particularly interesting about these jobs is that they have a very “human” aspect to them. So while coding and robotics may seem to be at the forefront, there is still a need for care and humanity: virtues that need to be instilled along with coding and STEM principles. This is reassuring for humanities students who may feel threatened by a future that seems geared toward those with mathematical abilities.

As educators, it is important to teach skills that will remain invaluable. Fostering a love of learning, a commitment to innovation, and ethics are fundamental to any position. By understanding the jobs of the future, educators can better prepare themselves and ensure that the curriculum is in line with the expectations and job openings that will be available. On a side note, these articles were not written by a robot.

Is it time to eliminate tenure for professors?

Samantha Bernstein, University of Southern California and Adrianna Kezar, University of Southern California

The State College of Florida recently scrapped tenure for incoming faculty. New professors at this public university will be hired on the basis of annual contracts that the school can decline to renew at any time.

The decision has been highly controversial. But this is not the first time tenure has come under attack. In 2015, Wisconsin Governor Scott Walker called for a reevaluation of state laws on tenure and shared governance. As of March 2016, a new policy at the University of Wisconsin has made faculty vulnerable to layoffs.

The tenure system provides lifetime guarantees of employment for faculty members. The purpose is to protect academic freedom – a fundamental value in higher education that allows scholars to explore controversial topics in their research and teaching without fear of being fired.

It also ensures that faculty can voice their opinions with university administration and ensure that academic values are protected, particularly from the increasingly corporate ideals invading higher education institutions.

Our research on the changing profile of university faculty shows that while the university enterprise has transformed dramatically in the last hundred years, the tenure employment model remains largely unchanged. So, has the tenure model become outdated? And if so, is it time to eliminate it altogether?

Growth of adjunct faculty

The demographic of higher education faculty has changed a lot in recent years. To start with, there are very few tenured faculty members left within higher education.

Tenure-track refers to that class of professors who are hired specifically to pursue tenure, based largely on their potential for producing research. Only 30 percent of faculty are now on the tenure-track, while 70 percent of faculty are “contingent”. Contingent faculty are often referred to as “adjuncts” or “non-tenure track faculty.” They are usually hired with the understanding that tenure is not in their future at that particular university, and they teach either part-time or full-time on a semester-to-semester or yearly basis.

There are fewer tenured faculty in the higher education system. St. Ambrose University, CC BY-NC

Most contingent faculty have short-term contracts which may or may not be renewed at the end of the contract term. As of 2010, 52 percent of contingent faculty had semester-to-semester part-time appointments and 18 percent had full-time yearly appointments.

Researchers suggest that the increase in contingent appointments is a result of the tenure model’s failure to adapt to the significant and rapid changes that have occurred in colleges and universities over the last 50 years.

The most significant of these changes is the rise of teaching-focused institutions – the largest growth occurring in community colleges, technical colleges and urban institutions whose primary mission is to educate students, with little or no research mission. Between 1952 and 1972, the number of community colleges in the United States nearly doubled, from 594 to 1,141, to accommodate a large increase in student enrollments, leaving four-year institutions to focus on research and development.

Campuses changed, not tenure system

Most commentators have described the growth of contingent faculty as a response to financial pressures in the 1990s.

But our research shows that this growth actually began in the 1970s, when market fluctuations caused unexpected growth in college enrollment. Between 1945 and 1975, college enrollment in the United States increased by 500 percent. However, rising costs and a recession in the late 1970s forced administrators to seek out part-time faculty willing to work for lower wages in order to accommodate these students. The practice increased dramatically thereafter.

In addition to enrollment changes, government funding for higher education decreased in the late 1980s and ‘90s. The demand for new courses and programs was uncertain, and so campuses needed more flexibility in faculty hiring.

Further, over the last 20 years new technologies have created new learning environments and opportunities to teach online.

Tenure-track faculty incentivized to conduct research were typically not interested in investing time to learn about new teaching technologies. Consequently, a strong demand for online teaching pushed institutions into hiring contingent faculty to fill these roles.

As a result, what we have today is a disparity between the existing incentive structures that reward research-oriented, tenure-track faculty and the increased demand for good teaching.

Why the contingent faculty model hurts

Critics of tenure argue that the tenure model, with its research-based incentives, does little to improve student outcomes. But the same can be said of the new teaching model that relies so heavily on contingent faculty – it is not necessarily designed to support student learning.

Research on contingent faculty employment models illustrates that they are poorly designed and lack many of the support systems needed to foster positive faculty performance.

For example, unlike tenure-track faculty, contingent faculty have little or no involvement in curriculum planning or university governance, little or no access to professional development, mentoring, orientations, evaluation, campus resources or administrative support; and they are often unaware of institutional goals and outcomes.

Furthermore, students have limited access to or interaction with these faculty members, which research suggests is one of the most significant factors impacting student outcomes such as learning, retention and graduation.

Studies have shown that student-faculty interaction provides students with access to resources, mentoring and encouragement, and allows them to better engage with subject material.

Studies show lower graduation rates as a result of the faculty workforce model. Sakeeb Sabakka, CC BY

Recent research on contingent faculty has also identified some consistent and disturbing trends related to student outcomes that illustrate problems related to new faculty workforce models. These include poor performance and lower graduation rates for students who take more courses with contingent faculty, and lower transfer rates from two-year to four-year institutions.

Using transcripts, faculty employment and institutional data from California’s 107 community colleges, researchers Audrey Jaeger and Kevin Eagan found that for every 10 percent increase in students’ exposure to part-time faculty instruction, they became 2 percent less likely to transfer from two-year to four-year institutions, and 1 percent less likely to graduate.

Additionally, studies of contingent faculties’ instructional practices suggest that they tend to use fewer active learning, student-centered teaching approaches. They are also less engaged with new and culturally-sensitive teaching approaches (strategies encouraging acknowledgment of student differences in a way that promotes equity and respect).

Meanwhile, even as the pool of Ph.D. students grows, the number of tenure-track positions available to graduates is shrinking. As a result, a disconnect has evolved between the types and number of Ph.D.s on the job market in search of tenure and the needs of, and jobs available within, colleges and universities.

Some estimates show that recent graduates have less than a 50 percent chance of obtaining a tenure-track position. Furthermore, it is graduates from the top-ranked quarter of graduate schools who make up more than three quarters of tenure-track faculty in the United States and Canada, specifically in the fields of computer science, business and history.

A new tenure system?

We appear to be at a crossroads. The higher education enterprise has changed, but the traditional tenure model has stayed the same. The truth is that universities need faculty who are dedicated to teaching, but the most persuasive argument in support of tenure – its role in protecting academic freedom – has come to be too narrowly associated with research.

Academic freedom was always meant to extend to the classroom – to allow faculty to teach freely, in line with the search for truth, no matter how controversial the subject matter. Eliminating tenure completely will do little to protect academic values or improve student performance.

Instead, the most promising proposal that has emerged many times over the last 30 years is to rethink the traditional tenure system in a way that would incentivize excellent teaching, and create teaching-intensive tenure-track positions.

Under an incentive system, when considering whether to grant tenure, committees can take into account excellence in teaching, by way of student evaluations, peer review, or teaching awards. For faculty on a teaching-intensive track, tenure decisions would be made based primarily on their teaching, with little or no weight given to research.

Though not every contingent faculty member would be eligible for such positions, these alternative models can change the incentive structures inherent in the academic profession. They may be able to remove the negative stigmas surrounding teaching in the academy and may eliminate the class-based distinctions between research and teaching faculty that have resulted from the traditional tenure model.

The Conversation

Samantha Bernstein, PhD Student, University of Southern California and Adrianna Kezar, Professor of Higher Education, University of Southern California

This article was originally published on The Conversation. Read the original article.

Eliminating inequalities needs affirmative action

Richard J. Reddick, University of Texas at Austin; Stacy Hawkins, Rutgers University, and Stella M Flores, New York University

The Supreme Court has upheld the affirmative action admission policy of the University of Texas. Abigail Fisher, a white woman, applied to the University of Texas at Austin (UT Austin) in 2008. She sued the university after she was denied admission, on the grounds that the university’s race-conscious admissions policy violated the equal protection clause of the Fourteenth Amendment.

On Thursday, June 23, the Supreme Court ruled that the race-conscious admissions program was constitutional – a decision that the three scholars on our panel welcome. They tell us why existing educational inequalities need considerations of race and ethnicity in admissions.

How else do you eliminate inequality?

Richard J. Reddick is an associate professor in educational administration at University of Texas at Austin.

UT Austin’s history on legal decisions about race in higher education goes back to Sweatt v. Painter (1950), a case that successfully challenged the “separate but equal” doctrine articulated in Plessy v. Ferguson (1896). The landmark case helped pave the way for Brown v. Board of Education (1954), which outlawed racial segregation in education.

The next test, in the Hopwood v. Texas (1996) case, came from the other direction. Cheryl Hopwood was a white applicant who was denied admission. She challenged UT Austin’s use of race in its admissions decisions as unconstitutional. The U.S. Court of Appeals for the Fifth Circuit barred the consideration of race in admissions at universities and colleges in Texas. This decision was overruled in 2003.

Fisher, then, was another challenge to the university’s renewed efforts to provide educational opportunity and access to underrepresented students at predominantly white institutions.

UT Austin’s history on legal decisions about race in higher education goes back to Sweatt v. Painter. qmechanic, CC BY-NC-SA

Opponents of affirmative action often argue that metrics, such as test scores and class rank, that appear to be neutral, should be the method by which to admit students.

These arguments fail to consider the real impact that racial and socioeconomic discrimination has on educational opportunity. School resources and teacher quality differ significantly, and intangibles such as leadership opportunities often depend on subjective criteria such as teacher recommendations.

Furthermore, many students from underrepresented communities confront challenges in navigating school systems. We additionally know that standardized testing can show bias in certain populations.

In other words, these “neutral” measures actually reinforce social inequities.

The most selective institutions of higher education in the nation no longer rely solely on these metrics. They seek out students with a variety of experiences – factors that may not always correspond to test scores and class ranking.

Today’s ruling is a reassurance, as fleeting as it might be, that the massive task of eliminating educational inequality – which correlates to many other forms of inequality – can be supplemented by approaches in college admissions that consider race and ethnicity.

It does not minimize the importance of eradicating racial discrimination in all walks of life: in the words of UT Austin president Greg Fenves, “race continues to matter in American life.”

However, emphasizing the significance of careful, narrowly tailored approaches to enhancing diversity at predominantly white institutions is a victory for the scholars, researchers, administrators and families who have demonstrated how diversity provides significant educational benefits for all students and American society.

What are the implications for other colleges?

Stacy Hawkins is an associate professor at Rutgers University, where she teaches courses in Employment Law and Diversity in the Law.

The Supreme Court’s decision is cause for both celebration and circumspection.

Justice Anthony Kennedy, the court’s moderate swing justice, whose opinion was rightly predicted to be the key to the decision, undoubtedly shocked many by voting for the first time to uphold a race-conscious admissions policy.

However, the decision is more consistent with Justice Kennedy’s prior decisions, notwithstanding the difference in outcome, than might appear at first blush.

On the one hand, Justice Kennedy reaffirmed his commitment to diversity as a compelling educational interest in 21st-century America (a view he expressed in prior cases on diversity in higher education, as well as in primary and secondary schools).

On the other hand, however, Justice Kennedy also reaffirmed his long-standing belief that, notwithstanding this interest, race may play no more a role than is absolutely necessary to achieve the educational benefits of diversity.

In striking this delicate balance, Justice Kennedy sanctioned the University of Texas’ race-conscious admissions policy today, but gave fair warning that the future of this policy is by no means secure.

More important perhaps than the implications of this decision for the University of Texas is what, if any, implications this decision may have for other colleges and universities?

As Justice Kennedy acknowledged, the University of Texas is unique in its use of race to narrowly supplement a plan that admits the overwhelming majority of students (at least 75 percent) on the sole basis of high school class rank without regard to race, a feature that was critical to Justice Kennedy’s approval of the policy.

Thus, the vast majority of colleges and universities may still be left to wonder about the constitutionality of their own race-conscious admissions policies that operate more widely than Texas’ does.

With a similar case against Harvard University currently winding its way through the federal courts, the answer may not be far off.

Affirmative action bans exist in many states

Stella M. Flores is an associate professor of higher education at the Steinhardt School of Culture, Education, and Human Development at New York University.

Demography, economy and diversity are key issues facing the nation’s colleges and universities and should also be a part of their policy design.

In Fisher v. Texas today, Justice Kennedy’s opinion clearly states two outcomes. The first is that the university’s deliberation that race-neutral programs had not achieved their goals was supported by significant statistical and anecdotal evidence.

Admissions policies at universities play a key role in diversifying key areas. Supreme Court image via www.shutterstock.com

The second is that universities have the obligation to periodically reassess their admissions programming using data to ensure that a plan is narrowly tailored so that race plays no greater role than is necessary to meet its compelling interests. This is in essence an accountability mechanism for universities to follow using data and research.

Admissions policies at universities play an important role in the ability to diversify key fields relevant to the nation’s economy, including law, medicine, STEM, education and public policy, so that they can appropriately reflect and serve the unprecedented demographic expansion facing our country.

The decision ensures that pathways to the nation’s most critical educational and employment fields will stay open.

But there are other considerations and realities, including the following. First, some of the nation’s most racially diverse states will still operate under affirmative action bans due to state legislation and referenda. These include California, Florida, Michigan, Arizona and Oklahoma.

Second, there is still a clear need for additional effective policies and efforts beyond a consideration of race in college admissions to address the disconnect between the demographics of the nation and its public K-12 schools and who is represented at selective colleges and universities.

Retracting the use of race nationally would have been a step toward increasing racial and ethnic inequality in schools and society. But we’re in a time where race really matters in this country and in how we learn together as a diverse society in our classrooms. This decision reflects this reality.

The Conversation

Richard J. Reddick, Associate Professor in Educational Administration, University of Texas at Austin; Stacy Hawkins, Associate Professor, Rutgers University, and Stella M Flores, Associate Professor of Higher Education, New York University

This article was originally published on The Conversation. Read the original article.

Just graduated? Does it make you feel like a grown up?

Michael Vuolo, The Ohio State University and Jeylan T Mortimer, University of Minnesota

We may think that a simple age cutoff – such as 18 – should make us feel like adults. And why not? After all, crossing an age threshold can bestow certain rights, such as voting, military enlistment, purchase of certain substances as well as adult images or videos.

From our perspective as researchers who study the transition from adolescence to adulthood, these legally defined age markers are hardly a good indicator of when we feel like adults. They can be subject to change and have no universal or even national standard.

For example, the minimum purchase age for both alcohol and recreational marijuana is 21, but the purchase of recreational marijuana is not legally permitted in all states. While the tobacco purchase age is typically 18, two states and several cities recently raised it to 21.

In addition, individuals may not always “feel” like adults simply because they have passed an age marker.

So, when do we “feel” like adults?

Path to adulthood

Our idea of “adult” is bound closely to both our objective attainment of certain roles as well as our subjective evaluation of the timing of those roles.

Scholars working in this area have identified five important role transitions marking adulthood: finishing school, leaving home, acquiring stable work, marrying and parenting.

Although each of these adult roles has been considered alone or in pairs, little is known about how people traverse all the roles simultaneously and how achieving these markers of adulthood affects considering one’s self an “adult.”

People may feel “on time” or “off time,” depending on whether they achieve adult roles at the “right time.” In other words, feeling like an adult may be tied to achieving multiple roles marking adult life rather than any single one, and doing so in a timely manner compared to peers.

Pathway to adult life? Elizabeth Donoghue, CC BY-NC-ND

A typical pathway was laid out in the early and mid-20th century: exit school, get a job, move out of the parental home, get married, and have children.

While this might be considered the “normal” pathway even today, these transitions do not occur in such a neat and predictable order for many contemporary young people. Furthermore, the time to complete them has become longer.

It is commonplace today for young people to return to school after beginning work, move back in with parents (or never leave), have children prior to marriage, or work in less secure part-time jobs.

Different transition paths

Given the myriad possible paths through these roles, our research seeks to find frequent patterns or commonalities in the ways roles marking adulthood are traversed from ages 17 to 30 and what they mean for considering oneself an adult.

The study is based on a sample of 1,010 freshmen of St. Paul Public Schools, a school district in Minnesota. The survey started in 1988 and continued nearly annually through 2011. Over more than 20 years, this study has examined the consequences of work and other formative experiences in adolescence for the transition to adulthood.

Using a method that could identify distinct patterns in the timing and sequencing of adult roles, we found that the traditional school-to-work transition followed by “family formation” (that is, getting married and having children – around age 25) described above still exists.

However, only about 17 percent of young people follow that path today. Rather, most youth take four other pathways to adulthood.

Two of those paths involve a traditional school-to-work transition in one’s early twenties. But they are different in when they choose to form a family: one group delayed forming a family until their late twenties (20 percent); another did not do so by age 30 (27 percent).

The two remaining paths were distinguished by a low likelihood of attending college and by early marriage and kids; members of both groups had children by age 22.

But even these two paths defined by early parenting differed from one another: One group of early parents married and acquired full-time work (15 percent). The other, however, had much lower chances of achieving those roles (20 percent).

In other words, there were several objective ways to traverse the transition to adulthood.

Marriage, parenthood are critical

The question remains, do the members of these groups feel like an adult when they reach their mid-twenties? Have they acquired an adult identity? Do they think they are on or off time in achieving the five markers of adulthood?

Given the social acceptance of the traditional school-work-marriage-kids pathway, individuals following that path were more inclined to view themselves “entirely” as adults. They considered themselves “on time” with regard to marriage and financial independence, relative to their peers.

Early parents who married and acquired full-time work also felt entirely like adults, although they considered themselves “very early” in traversing those markers.

Truly feeling like an adult is tied to forming one’s own family. Kim Davies, CC BY-NC-ND

By contrast, the early parents who did not get married or acquire stable work felt “very early” on parenthood, but “very late” on other markers like marriage, cohabitation and financial independence.

The other two groups, who took the traditional school-to-work transition but delayed or forwent marriage and kids, felt “not entirely” like adults. They believed that they were “very late” on parenthood.

While they achieved several traditional markers of adulthood, including finishing school, getting a job, and moving out on their own, they still did not feel like adults without marriage and parenthood.

It would appear that truly feeling like one has become an adult is tied to forming one’s own family via marriage and parenthood.

When do we “feel” adult?

Our research shows that there are many pathways that young people take in transitioning to adulthood. Adulthood is a subjective process that no one marker appears to be able to define, though marriage and parenthood are particularly important.

Moving away from the more traditional school-to-work transition allows for a period of exploration, as young people figure out what they want to do in life. Acquiring markers of adulthood is associated with leaving behind deviant behavior, such as heavy partying and even theft, usually committed at younger ages. Furthermore, in ongoing research, we find that early parents without partners have poor objective and subjective health outcomes.

But, to come back to the original question, when do we “feel” like adults, there is no simple answer.

Individuals become adults when they feel like adults, but this feeling is tied to the timely acquisition of certain markers, especially marriage and parenthood. Such subjective assessments are socially constructed.

In time, as the four “non-traditional” pathways become more commonplace, perhaps what is perceived as “on time” adulthood will shift so that individuals following those paths will view themselves as adults earlier in life.

The Conversation

Michael Vuolo, Assistant Professor of Sociology, The Ohio State University and Jeylan T Mortimer, Professor of Sociology, University of Minnesota

This article was originally published on The Conversation. Read the original article.