Tuesday, July 26, 2011
By LAURA PAPPANO
Published: July 22, 2011
William Klein’s story may sound familiar to his fellow graduates. After earning his bachelor’s in history from the College at Brockport, he found himself living in his parents’ Buffalo home, working the same $7.25-an-hour waiter job he had in high school.
It wasn’t that there weren’t other jobs out there. It’s that they all seemed to want more education. Even tutoring at a for-profit learning center or leading tours at a historic site required a master’s. “It’s pretty apparent that with the degree I have right now, there are not too many jobs I would want to commit to,” Mr. Klein says.
So this fall, he will sharpen his marketability at Rutgers’ new master’s program in Jewish studies (think teaching, museums and fund-raising in the Jewish community). Jewish studies may not be the first thing that comes to mind as being the road to career advancement, and Mr. Klein is not sure exactly where the degree will lead him (he’d like to work for the Central Intelligence Agency in the Middle East). But he is sure of this: he needs a master’s. Browse professional job listings and it’s “bachelor’s required, master’s preferred.”
Call it credential inflation. Once derided as the consolation prize for failing to finish a Ph.D. or just a way to kill time waiting out economic downturns, the master’s is now the fastest-growing degree. The number awarded, about 657,000 in 2009, has more than doubled since the 1980s, and the rate of increase has quickened substantially in the last couple of years, says Debra W. Stewart, president of the Council of Graduate Schools. Nearly 2 in 25 people age 25 and over have a master’s, about the same proportion that had a bachelor’s or higher in 1960.
“Several years ago it became very clear to us that master’s education was moving very rapidly to become the entry degree in many professions,” Dr. Stewart says. The sheen has come, in part, because the degrees are newly specific and utilitarian. These are not your general master’s in policy or administration. Even the M.B.A., observed one business school dean, “is kind of too broad in the current environment.” Now, you have the M.S. in supply chain management, and in managing mission-driven organizations. There’s an M.S. in skeletal and dental bioarchaeology, and an M.A. in learning and thinking.
The degree of the moment is the professional science master’s, or P.S.M., combining job-specific training with business skills. Where only a handful of programs existed a few years ago, there are now 239, with scores in development. Florida’s university system, for example, plans 28 by 2013, clustered in areas integral to the state’s economy, including simulation (yes, like Disney, but applied to fields like medicine and defense). And there could be many more, says Patricia J. Bishop, vice provost and dean of graduate studies at the University of Central Florida. “Who knows when we’ll be done?”
While many new master’s are in so-called STEM areas — science, technology, engineering and math — humanities departments, once allergic to applied degrees, are recognizing that not everyone is ivory tower-bound and are drafting credentials for résumé boosting.
“There is a trend toward thinking about professionalizing degrees,” acknowledges Carol B. Lynch, director of professional master’s programs at the Council of Graduate Schools. “At some point you need to get out of the library and out into the real world. If you are not giving people the skills to do that, we are not doing our job.”
This, she says, has led to master’s in public history (for work at a historical society or museum), in art (for managing galleries) and in music (for choir directors or the business side of music). Language departments are tweaking master’s degrees so graduates, with a portfolio of cultural knowledge and language skills, can land jobs with multinational companies.
So what’s going on here? Have jobs, as Dr. Stewart puts it, “skilled up”? Or have we lost the ability to figure things out without a syllabus? Or perhaps all this amped-up degree-getting just represents job market “signaling” — the economist A. Michael Spence’s Nobel-worthy notion that degrees are less valuable for what you learn than for broadcasting your go-get-’em qualities.
“There is definitely some devaluing of the college degree going on,” says Eric A. Hanushek, an education economist at the Hoover Institution, and that gives the master’s extra signaling power. “We are going deeper into the pool of high school graduates for college attendance,” making a bachelor’s no longer an adequate screening measure of achievement for employers.
Colleges are turning out more graduates than the market can bear, and a master’s is essential for job seekers to stand out — that, or a diploma from an elite undergraduate college, says Richard K. Vedder, professor of economics at Ohio University and director of the Center for College Affordability and Productivity.
Not only are we developing “the overeducated American,” he says, but the cost is borne by the students getting those degrees. “The beneficiaries are the colleges and the employers,” he says. Employers get employees with more training (that they don’t pay for), and universities fill seats. In his own department, he says, a master’s in financial economics can be a “cash cow” because it draws on existing faculty (“we give them a little extra money to do an overload”) and they charge higher tuition than for undergraduate work. “We have incentives to want to do this,” he says. He calls the proliferation of master’s degrees evidence of “credentialing gone amok.” He says, “In 20 years, you’ll need a Ph.D. to be a janitor.”
Among the new breed of master’s, there are indeed ample fields, including construction management and fire science and administration, where job experience used to count more than book learning. Internships built into many of these degrees look suspiciously like old-fashioned on-the-job training.
Walter Stroupe, a retired police first lieutenant and chairman of the department of criminal justice at West Virginia State University, acknowledges that no one needs to get the new master’s degree in law enforcement administration the school is offering beginning this fall. In fact, he concedes, you don’t even need a college degree in West Virginia to become a police officer, typically the first step to positions as sheriff and police chief.
Still, Dr. Stroupe says, there are tricky issues in police work that deserve deeper discussion. “As a law enforcement officer, you can get tunnel vision and only see things from your perspective,” he says. “What does a police officer do when they go up to a car and someone is videotaping them on a cellphone?” The master’s experience, he hopes, will wrangle with such questions and “elevate the professionalism” among the police in the state.
These new degrees address a labor problem, adds David King, dean of graduate studies and research at the State University of New York at Oswego, and director of the Professional Science Master’s Program, which oversees P.S.M. degrees across the SUNY system.
“There are several million job vacancies in the country right now, but they don’t line up with skills,” he says. Each P.S.M. degree, he says, is developed with advisers from the very companies where students may someday work. “We are bringing the curriculum to the market, instead of expecting the market to come to us,” he says.
That’s why John McGloon, who manages the technical writing and “user experience” team at Welch Allyn, the medical device company, helped shape the master’s in human-computer interaction at Oswego. He says employers constantly fear hiring someone who lacks proper skills or doesn’t mesh. Having input may mean better job candidates. This summer, Mr. McGloon has three SUNY Oswego interns. “We plug them right into the team,” he says. “Not only can you gauge their training, you can judge the team fit, which is hard to do in an interview.”
While jobs at Welch Allyn may not require a master’s, the degree has been used as a sorting mechanism. After posting an opening for a technical writer, Mr. McGloon received “dozens and dozens” of résumés. Those in charge of hiring wondered where to start. “I said, ‘Half of our applicants have master’s. That’s our first cut.’ ”
Laura Georgianna, in charge of employee development at Welch Allyn, confirms that given two otherwise equal résumés, the master’s wins. A master’s degree “doesn’t guarantee that someone will be much more successful,” she says. “It says that this person is committed and dedicated to the work and has committed to the deep dive. It gives you further assurance that this is something they have thought about and want.”
The exposure to workplaces, and those doing the hiring, makes master’s programs appealing to students. “The networking has been unbelievable,” says Omar Holguin. His 2009 B.S. in engineering yielded only a job at a concrete mixing company. At the University of Texas, El Paso, which is offering a new master’s in construction management, he’s interning with a company doing work he’s actually interested in, on energy efficiency.
There may be logic in trying to better match higher education to labor needs, but Dr. Vedder is concerned by the shift of graduate work from intellectual pursuit to a skill-based “ticket to a vocation.” What’s happening to academic reflection? Must knowledge be demonstrable to be valuable?
The questions matter, not just to the world of jobs, but also to the world of ideas. Nancy Sinkoff, chairwoman of the Jewish studies department at Rutgers, says its master’s, which starts this fall, will position students for jobs but be about inquiry and deep learning.
“I would imagine in the museum world, I would want to hire someone with content,” she says hopefully. “To say, ‘I have a master’s in Jewish studies,’ what better credential to have when you are on the market?”
“This will make you more marketable,” she is convinced. “This is how we are selling it.”
Whether employers will intuit the value of a master’s in Jewish studies is unclear. The history department at the University of South Florida has learned that just because a content-rich syllabus includes applied skills (and internships) doesn’t mean students will be hired. “Right now, yes, it’s very hard to get a job” with a master’s in public history, says Rosalind J. Beiler, chairwoman of the history department, noting that the downturn hurt employers like museums and historical societies.
The university is revamping its master’s in public history, a field that interprets academic history for general audiences, to emphasize new-media skills in the hopes of yielding more job placements. “That is precisely the reason we are going in that direction,” she says.
“Digital humanities,” as this broad movement is called, is leading faculty members to seek fresh ways to make history more accessible and relevant in their teaching and research. A professor of Middle Eastern history, for example, has made podcasts of local Iraqi war veterans in a course on the history of Iraq.
It may be uncomfortable for academia to bend itself to the marketplace, but more institutions are trying.
In what could be a sign of things to come, the German department at the University of Colorado, Boulder, is proposing a Ph.D. aimed at professionals. Candidates, perhaps with an eye toward the European Union, would develop cultural understanding useful in international business and organizations. It would be time-limited to four years — not the current “12-year ticket to oblivion,” says John A. Stevenson, dean of the graduate school. And yes, it would include study abroad and internships.
Dr. Stevenson sees a model here that other humanities departments may want to emulate.
It does, however, prompt the question: Will the Ph.D. become the new master’s?
Laura Pappano is author of “Inside School Turnarounds: Urgent Hopes, Unfolding Stories.”
Monday, July 18, 2011
Posted by Gene Marks
I was in a Rite Aid pharmacy the other day, about to pay for my stuff at their new bank of automated self-checkout kiosks, when I heard one woman behind me say to her friend, “Oh, I would NEVER use those things. They take jobs away from people.”
What’s that? You’d like to work for my small business? I appreciate your interest. And I, like so many others, feel terrible about how long you’ve been unemployed. We would like to do something about the situation. We’d like to help you. But there’s something you (and the woman from the Rite Aid) need to know. I’m not sure how to say this kindly so it’s best I just say it: many of us don’t really need more employees.
Of course the fact that you’re out of a job has a lot to do with the state of the economy. Growth is anemic. The uncertainty in the current business environment is holding a lot of us back from making the investments that we’d like to make. And regulations, and the prospect of more regulations, let alone higher taxes to pay for our country’s deficits, are giving many of us pause. For that we can certainly blame many: our politicians, the government, the banking system, the media…even ourselves.
But it’s not just that. In fact, one of the biggest reasons why you don’t have a job (and the prospects of finding a job are not encouraging) can also be blamed on someone else: Microsoft. And other technology companies like them.
Because there’s something else going on in this economy. Just look at the chart below. It shows that our country’s Gross Domestic Product, while growing at a painfully slow pace, is now higher than it was before the 2008 recession. And yet it’s common knowledge among those who track these things that there are seven million more people without jobs than there were at the start of the recession. Which means that businesses are producing more products and services than ever before…but with seven million fewer people. And by the way…corporate profits are at an all-time high too.
Manufacturers are leading the charge. Just look at how manufacturing productivity has risen over the past thirty years in this country while the number of people employed to make stuff has decreased.
I know you need a job and I know this is a very difficult situation. And I don’t want to sound cruel because I’m trying to help you. And to get help with a problem the first thing we have to do is diagnose the problem. So here’s the cold, hard truth about why you’re unemployed: most businesses don’t need you any more. We can do just as much, if not more, without you.
Over the past twenty years, the technology industry, led by companies like Microsoft, has given us powerful databases, operating systems, networks and software applications that have made it easier for us to accomplish more tasks than before with fewer people. And it’s not just Microsoft you can blame.
Blame Sage, who makes Enterprise Resource Planning and Customer Relationship Management software that has enabled businesses to automate their marketing campaigns, build workflows for alerting managers when inventory needs to be replenished and generate work orders and invoices that are immediately emailed without employing teams of people.
Blame Rackspace and Amazon and other cloud-based infrastructure providers, who allow us to host all of our business applications on their servers, thereby eliminating many positions in our information technology departments and cutting back on time wasted on downed computers and security flaws.
It’s true that the costs of healthcare and other regulations have discouraged many of us from hiring full time employees. But at the same time we’ve come to realize that maybe we don’t need as many full time employees as we used to. And because technology has advanced so much, even over the past few years, we’ve seen an explosion of outsourcing among businesses, both small and large.
For little cost, companies like mine can easily set up systems for external access and collaboration. We use remote desktop services (again from Microsoft), but also from companies like Citrix Online and LogMeIn, so that our contractors can access our networks to do their work. We use cloud-based applications like Box.net, Basecamp, Salesforce.com and NetSuite to manage projects, share data and schedule tasks with both employees and approved outsiders wherever they are. Thanks to Microsoft, Google and companies like Zoho and Dropbox, we can now easily put our entire office in the cloud – documents, spreadsheets, presentations, databases, projects.
And we can communicate with our outsourced help, wherever they are, more quickly and easily than before. We make free phone calls using Skype and inexpensive mass calls (or texts) using products like VoiceShot. We hold free conference call sessions using Freeconferencecall.com. We share our desktops using Glance and Join.me. We hold training sessions using Webex. We use video tools like Oovoo to meet face to face virtually.
And finding outsourced help is easier than it’s ever been. That’s because we can search sites like Craigslist, Elance and Guru. And when we find qualified people to accomplish specific tasks for us, we can use these sites to set up our relationships, manage our payments and communicate our needs.
Which is why so many of the tasks once done by companies are now being outsourced to individuals and other companies who can, using their own internal technology, perform those same tasks with far fewer people. Most of the clients I work with outsource their payroll to companies like ADP and Paychex. Many outsource their bookkeeping to firms that do nothing else, but do it more efficiently. Most companies now have internet-based phone systems where an automated attendant redirects calls to people’s cell phones, and voice mail messages are sent to them via text with no humans in the middle.
Are you starting to see the picture? I know you want to be hired full time by me. And I want to be doing my part. But please understand: I’m running a business. I want to make profits. And these tools are letting me make more profits by employing people only when I need them rather than carrying them on my payroll.
It’s not all Microsoft’s fault. What they’re doing is nothing compared to what’s happening on the shop floor. Because, quietly and without fanfare, companies like the Oystar Group are making machines that fill tubes faster than before, requiring fewer shifts of people to complete an order. And equipment from Keller Technology enables cosmetics and pharmaceutical manufacturers to produce more product with fewer people. And software and consulting firms like Intuitive ERP and Epicor are helping manufacturers change their internal processes to create more products in less space and with fewer resources, particularly people.
We know this is true in our own lives. Things are lasting longer and working better. We’re keeping our cars well beyond 100,000 miles. We’re letting our fridges and toasters and kettles do their jobs well beyond the lifespan our parents got out of theirs. New developments in flooring, painting and construction are resulting in longer use of our homes. Because technology is better. Have you ever had a TV repairman come to your house? How many times has your washing machine broken down over the past twelve years and thousands upon thousands of cycles? Because of technology, fewer people are needed to manufacture and service the durable equipment we use, because this stuff works better and for a much longer period of time.
And with all that, we still need you. Don’t believe me? Look at last month’s Monster Employment Index or read Gallup’s recent Job Creation data. Both surveys find that job availabilities are at their highest level since before 2008. But these are not the same jobs that existed before 2008. That’s because we don’t need as many receptionists, clerks, cashiers, bookkeepers, inventory stockers and maintenance people as we used to. Technology has helped us cut back on all of that. Go to your local supermarket (or Rite Aid) and you’ll see what I mean.
But we do need programmers. And experienced customer service people. We need engineers, scientists, high-end equipment operators, nurses, lab technicians and (very soon) capable construction workers too. In other words: people with skills. As a business owner it’s a no-brainer to me that if I can profit from your skills I may very well be persuaded to hire you. What expertise can you bring to me that a machine can’t do for much less? I have to meet that challenge with my own customers. That’s the challenge that we all face.
Of course, all economies are cyclical. And more jobs will be created once the economy again begins to grow. No one knows when this will happen and right now, in our current political environment, many aren’t feeling too confident that this will happen anytime soon. But even in a growing economy will we ever see 5-6% natural unemployment again? This may never happen. And if it doesn’t, please don’t just blame the politicians. Blame Microsoft. And other tech companies like them. It’s because of them that I’m not hiring you.
Besides Forbes, Gene Marks writes weekly for The New York Times and frequently for The Huffington Post and American City Business Journals. He runs a ten-person consulting firm outside of Philadelphia and can be followed on Twitter.
Friday, July 15, 2011
What to Expect When You Leave College and Begin Working 9-5
1. You will, somehow, start eating at Subway all the time.
2. There will be a growing pains period when your friends constantly text you at 2 p.m. saying, “We’re at the beach! Come!” and you will sit and get disproportionately mad, thinking thoughts like, “Don’t they know I’m at work, wtf,” and “Who the hell is free at 2 p.m.?”
3. You will suddenly need to “buy stamps.”
4. Welcome to college loans. Despite the fact that your university job made you a piddly $400 per pay period and you now make significantly more, you will envy and be mystified by the days when you could afford $80 worth of art supplies/ shoes/ whatever per month.
5. You’ll get a little fat. Once you work full-time, you’re sitting at a desk 8-9 hours per day and guess what, there are free cookies and pies all the time. There just are.
6. Say a tearful goodbye to Regis and Kelly, or whatever guilty daytime TV you used to love but are too embarrassed to Tivo.
7. Slowly, you will start to become a normal person again. You will go to bed before midnight. You will wake up early and read the newspaper, no hangover in sight. You will join a gym and think about volunteering. You might even bike with your colleagues on the weekends.
8. Your friends will describe your clothing style on the weekends as “work appropriate, minus the sweater” and on the weekdays as “weekend clothes, covered up for work.”
9. Since most of your friends are either still in college/ bartending/ working in retail, the rest of your life does not quite match that of your settled down co-workers, who will inevitably find pictures of you mock kissing a girl in a pool while smoking and holding a beer, or find a tweet of you talking about your roommates doing acid.
10. You will be constantly sleep deprived. You’re still not sure how to not watch Netflix until 2 a.m., but you also become miraculously trained to wake up at 7 every morning. This also means that your weekends involve you waking up amongst party peers/ boyfriend, whoever and reading several magazines while everyone leisurely slumbers till 11.
Monday, July 11, 2011
Newly elected Jumaane Williams didn't let Tourette's stand in his way-now wants to help others
Flatbush community organizer Jumaane Williams - who toppled incumbent City Councilman Kendall Stewart in last week's primary - is used to standing up for himself.
Williams was diagnosed with Tourette's syndrome as a teenager growing up in Brooklyn and spent years battling school bureaucrats who wanted him out of gifted classes and into special ed.
Now Williams, 33, wants to draw from the experience to help others.
"Especially as a kid of color, if my mother hadn't been so involved I would have been in special ed," said Williams, who tested into the Philippa Schuyler Middle School for the Gifted and Talented and Brooklyn Technical High School before attending Brooklyn College, where he earned his bachelor's and master's degrees.
Friday, July 8, 2011
My rating: 5 of 5 stars
Taking Back Control
Self-education in this economy is necessary for success, but self-education is most often discouraged within the education system. In my own self-education, I came across this book and many others that are helping steer me in the right direction. One could say the only thing I really needed to learn in school was how to read, how to write, and how to check out a library book. Everything else was superfluous.
This, however, is a list of invaluable information I gleaned just from Bach’s book:
1. Education can only happen in an environment in which people feel respected, and that their learning is necessary. They need love and encouragement from their teachers to succeed, not by way of high marks, but by formulating a personality that comes from knowing things and the curiosity to know more.
2. In school, and even in the working environment, people most often succeed when they have a sense of uniquely belonging. They want to be a part of something, but they want to bring to that something their own unique contribution. This is necessary in the classroom for a student who wants to learn, but doesn't simply want to follow along in the textbook and regurgitate facts. Think of a pack of wolves rather than a school of fish. We want children and adults who devour their own sought-out information, not passive fish who glean what they "can".
3. Criticism and intimidation are not the same thing, but in the school system they go hand in hand. Most people who are "bad" at science and math say it is because they are intimidated by numbers. People who are "bad" readers say they are intimidated by words. Numbers shouldn't be threatening, and speed reading should be discouraged. Criticism should be healthy, and failure should be funny.
4. People should be encouraged to take pride in what they can uniquely do, which encourages them to be successful at it, and other things. They should be encouraged to learn outside of school, and for that learning to count.
5. Adults in the workforce need to enrich their lives by continuing to learn. Learning after college, in the workplace, should always happen! Experts know that being an expert means knowing who to ask. Create your own syllabus of books to study. Create a syllabus of questions to spark your curiosity. Don't ignore your curiosity. Learn, explore. Know that you are smart, and no matter what your vocation, become a professional intellectual.
As a graduate, do you have all the information you need to succeed in the working world? Of course not; no one does. I especially do not, which is why I am continuing to learn, and doing so for free. I may not have the degree, but I will know as much as someone who does. I am taking control of what I am learning. I am in the driver's seat again. Are you in the driver's seat of your education?
View all my reviews
Wednesday, July 6, 2011
My rating: 5 of 5 stars
I think this book is an excellent example of the great divide between generations. Now, much larger than a gap, this "new" divide is between those who are still pursuing education/information/reason, and those who are clinging to it with the last breath in their ancient body.
It is a wonder that already intelligent, free-thinking individuals find themselves caught holding a very empty bag. Well, actually, that bag is not quite so empty as it is filled with the mounting debt required for this "education".
Bottom line: those same students have the power of language and a love of knowledge and want more control over the process of attaining more and pursuing their highest self. This was once the purpose of a higher education. Unfortunately, the purpose of higher education now is to line the pockets of higher-up administrators, politicians, and lobbyists. Often villainized, tenured professors only see a fraction of that payout.
It comes nowhere near the grubby little hands of undergrad students, destitute grad students, and hopeful yet slighted non-tenure track profs and adjunct faculty. In this economy one can't afford to go into the red for the job they will never find with the help of their college or university. This book encourages the self-reliant, like a dissatisfied customer, to take their business elsewhere.
This book continues to fight that good fight against that sad, pathetic problem in higher education, festering and growing by the day.
And to that I say: "God Bless, America!"
View all my reviews
The best way to deal with being told to quiet down and watch your language on the phone while riding the rails is probably not to pull out the "I'm educated so I'm not doing anything wrong" card and continue to yell and disrupt the rides of others. One woman did just that last week on a Metro-North train out of New York City.
In a video posted by another rider last week (which has since been taken down from YouTube but is still viewable on Gawker), a train employee is berated by the passenger, who, according to the uploader, had been swearing loudly on her phone and disturbing other passengers.
It's especially fun as the woman seems to have adopted some sort of "I'm very posh" accent as she yells things like, "Do you know what schools I've been to? How well-educated I am?" and "I'm sorry, do you think I'm a little hoodlum?"
She also asks for her money back repeatedly, before the employees walk away from her. The person who uploaded the video adds that the train conductor then chimes in on the loudspeaker, reminding passengers to keep it down, "especially those people who went to Harvard or Yale or are from Westport."
It is a little known fact that if you went to college, you are allowed to be rude to anyone you want to, even if you're at fault. Oh wait, that's not true at all.
Monday, June 27, 2011
This "Can I Just Tell You?" segment was written and voiced by NPR's Tony Cox.
Some thoughts about school and the struggles black kids face. Lots of folks with lots of experience have lots of opinions about what to do to better educate young African-American males. Harvard scholar Henry Louis Gates recently offered yet another glimpse into the issue, suggesting in a piece for the website The Root that the need is dire, which of course it is.
But for many of us in education — and to my mind that includes parents, family and friends — the problem is more than knowing what's needed. It's knowing how to get it done and make it work, how to get young African-American men not only interested but engaged in learning, and enjoying rather than dreading the journey. That requires a lot of commitment from them and from us, and there are no shortcuts.
Besides my work here at NPR, I am a tenured professor in broadcast journalism at California State University, Los Angeles. I primarily teach writing, and it troubles me to no end to see young black men struggle in my classes because they can't or don't see the value of an education and the effort required to obtain one. Records show black male students badly lagging in their graduation rates from colleges and universities. When we see them on campus, they often dress differently, speak differently, have different expectations, and in the classroom can sometimes be difficult to reach.
I get that life for them is tough, sometimes in ways that I don't fully appreciate, even though I grew up in the '60s in South Central Los Angeles. My challenges for survival back then are different in many ways from the hardships these young men face today.
That said — can I just tell you? Education was a very useful weapon in my struggle for survival, and I'm convinced it still is. Maybe more so now.
So how do we convince these young men that the sacrifice is worth it? What do we do? I've scratched my head searching for answers and then asked myself: What have you tried that's worked? A couple of things, actually, which come from my decade of teaching and remembering those who taught me.
The first thing is to not give up on these young men — no easy task when you're fighting with someone you're trying to help. Persistence is required of teachers because learning isn't like a light switch that you flip on and off. Success is more gradual, and it takes time to realize its effect and impact.
Secondly, recognize that each black male is different and deal with each individual accordingly. A hard push works for one, while a pat on the back or a kind word works better for another. You need to have more than one teaching "move" you can go to.
It's important to not forget history, because that history puts in context the ongoing sociological and financial disadvantages that many black boys (and girls) face from the outset in pursuit of an education.
That means teachers must fight for things like accurate and unbiased class materials and textbooks; encourage participation whenever possible; be firm and fair when assessing their skills; promote programs that offer opportunities outside the classroom through internships, scholarships, part-time jobs and community organizations; and be honest when talking to these young men, many of whom have already experienced enough of adult life to know a con when they see or hear one. They read teachers more closely than they read textbooks.
I know it's not that simple, but sometimes it's the small steps that have made the most difference in my relationship with students. Learning how to talk to my black male students — and how to listen to them without prejudgment — is a lesson that I had to learn in order to do my job better. All of which leads me to one last, important point.
Don't try this unless you're fully committed to making a difference. Because like my dad used to always tell me, "Half an effort is worse than no effort at all."
Friday, June 24, 2011
Thursday, June 16, 2011
Picking their next role: Joe College or hot young star?
Young actors face a tough decision: career or upper education. Some, like Emma Watson, think higher education is worth it. Others, like Blake Lively, skip it.
Emma Watson, who plays Hermione Granger, of the "Harry Potter" films.
(Matt Sayles / AP / February 14, 2011)
By Amy Kaufman, Los Angeles Times
June 12, 2011
For most 18-year-olds, a university degree is an expensive but necessary investment leading to personal growth and a well-paying job. But for Watson, already a multimillionaire as a result of playing Hermione Granger in the "Harry Potter" movies, the calculus was more complex.
Watson opted to attend Brown University — a decision that confounded Hollywood directors and publicists.
"I've had to say no to stuff that people have been gobsmacked about. I've had big directors say to me, 'What do you mean, you can't do this movie? We don't understand,'" the actress, now 21, said recently by phone from her native England. "I always hear, 'What do you mean she can't do this magazine cover?' or 'What do you mean she can't have this meeting for a once-in-a-lifetime opportunity?' And my agent will say, 'She's at school, sorry.'
"Yes, it's hard for me to turn down amazing opportunities. But I've been working solidly since I've been 9 years old. So for me, to have this space to learn and figure myself out a bit is obviously worth it."
Transitioning from child star to adult actor never has been easy. But the explosion of kid-oriented entertainment on cable TV and in the movies means more teens than ever are competing to make the leap into adult acting jobs. So opting to take time out for a college degree — never a requirement in Hollywood to begin with — seems increasingly difficult.
Blake Lively, star of the hot teen soap "Gossip Girl," faced the same decision as Watson but chose a different route. She said she dreamed throughout her childhood of attending an Ivy League school and worked toward that goal at Burbank High School, maintaining a 4.2 grade point average while cheerleading, joining a nationally competitive show choir, playing sports and being elected class president.
But when she began to find success starting at age 17 in the film "The Sisterhood of the Traveling Pants," those around her pushed her to skip out not only on college but also the rest of high school. (She decided to finish anyway.)
"Everybody said, 'Strike while the iron is hot.' And everybody is so replaceable these days that to maintain your 'heat,' or whatever, you are supposed to put aside school," said Lively, who's now 23 and building a film career, including roles in last year's "The Town" and next weekend's "Green Lantern."
"One of the reasons why I wanted to do 'Gossip Girl' was because we had talked about giving me one day a week to go to Columbia starting the second season, once things slowed down. But things never slowed down. The show took off, and they were never able to carve out the time in my schedule. It still makes me sad every day that I didn't have that college experience."
As Lively discovered, choosing college can mean swimming against a tide of advice from family, friends, agents and managers, many of whom are quick to point out that many onetime teen stars — including Leonardo DiCaprio, Drew Barrymore and Scarlett Johansson — went on to big adult careers without attending a university. (Such members of an actor's inner circle, of course, might themselves lose out on income if a young actor decides to spend years at college rather than on film sets.)
"Nobody cares if you went to school unless you're on the business side of Hollywood," said Cindy Osbrink, head of the youth theatrical department at the Osbrink Talent Agency, whose clients include Dakota Fanning and her sister, Elle.
Complicating the decision further, Osbrink says, is that many young stars find that upon turning 18, their job opportunities suddenly expand because they no longer face restrictions on how many hours they can work as they did when they were minors. "It's a huge advantage to be a high school graduate of legal age" in the acting world, because 18-year-olds can often play younger roles, she said.
Brad Pitt, who attended the University of Missouri's journalism school, acknowledged that many actors develop into well-rounded people without a formal education. But he believes some performers who stop their schooling at an early age may be making a strategic error that could hurt them down the line.
"I worry for the young, young guys, because they haven't experienced enough to know not to get eaten up by the machine," he said. "I worry that they get defined before they really know who they are. … When they blow up too big at too young an age, they don't get the luxury to make the mistakes. They get defined and discarded."
Of course, some of Hollywood's most acclaimed actors who started in the business at a young age are college grads. Jodie Foster, 48, who studied literature at Yale, has won two Academy Awards. Natalie Portman, 30, who majored in psychology at Harvard University, won the lead actress Oscar this year for "Black Swan."
And James Franco, 33, who hosted this year's Oscars and was nominated for lead actor for "127 Hours," has been perhaps the most active actor-scholar of late: He is enrolled in Yale University's English PhD program and North Carolina's Warren Wilson College for poetry. In May, he earned a master's degree from New York University's Tisch School of the Arts and Columbia University's MFA writing program, after already graduating from Brooklyn College for fiction writing last year.
Yet as Franco and some other actors have found, it can be awkward to be a celebrity on campus. Students are known to doze off during lectures, but when Franco fell asleep during a class at Columbia, someone snapped an embarrassing picture of him, mouth agape, that ricocheted around the Internet. Foster famously had two stalkers while at Yale, one of whom, John Hinckley Jr., followed her to the New Haven, Conn., campus and later shot President Ronald Reagan in an attempt to impress her.
Monday, June 6, 2011
By Alexandra Nikolchev
June 3, 2011
A classroom in Shanghai.
Ezra Klein’s Washington Post blog recently featured a guest post by Columbia University journalism student Dana Goldstein entitled “Is the U.S. doing teacher reform all wrong?” Goldstein focuses on the findings of a recent study by the National Center on Education and the Economy, which compares education policies in five top performing countries — Finland, China, Japan, Singapore and Canada — with the United States. One of the main conclusions is that, basically, the way the U.S. recruits, prepares and evaluates teachers is completely out of step with this group of high-achieving countries.
Public schools in the United States have emulated the Teach for America model: Young, enthusiastic people are thrown into classrooms, often without any experience and little to no required formal coursework. There is no U.S. policy system that pairs new teachers with experienced mentors. Teachers are granted little autonomy in their classrooms and their performance evaluations are largely based on student test scores.
In contrast, teachers in top performing countries must commit to teaching as a serious profession before they enter their classrooms. Each candidate must first go through a system that requires high levels of training and education. As a result, teacher autonomy in the classroom is prioritized and there is less emphasis on student test scores.
The report concludes “that the strategies driving the best performing systems are rarely found in the United States, and conversely, that the education strategies now most popular in the United States are conspicuous by their absence in the countries with the most successful education systems.”
As the report suggests, understanding what systems are being implemented for teachers in academically high-achieving countries should factor into our own policy reform efforts here in the U.S.
To hear more on what might make a positive difference for U.S. teachers and education, watch our “Fixing Education” series of interviews. Need to Know sat down with educators and policymakers from around the world at the “Celebration of Teaching and Learning” organized by WNET in New York City. We wanted to get a global perspective on successful strategies for education reform. A number of those interviewed, including Finland’s Minister of Education and Science and Hong Kong’s Under Secretary for Education, echoed the sentiment that education is more effective when the teachers are well-trained and respected as professionals.
Thursday, June 2, 2011
A pair of new studies suggests a correlation between intelligence and a thirst for alcohol. What's the connection?
posted on October 26, 2010, at 1:11 PM
Don't worry, all that excessive drinking is just a sign of your intelligence. According to two long-term studies — one American, one British — there's a correlation between smarts and a thirst for alcohol. The "more intelligent children in both studies grew up to drink alcohol more frequently and in greater quantities than less intelligent children," says Liz Day at Discover. Why might this be the case?
It's all about evolution: Drinking alcohol was "unintentional, accidental, and haphazard until about 10,000 years ago," says Satoshi Kanazawa at Psychology Today. Smart people are generally early adopters and, in the context of human history, "the substance [alcohol] and the method of consumption are both evolutionarily novel."
"Why intelligent people drink more alcohol"
Alcohol makes up for boring early years: "I'm surprised" by the findings, says Joanne Hinkel at The Frisky, so "here’s my pop-psychology theory" to explain it: "All that studying in childhood repressed kids so much that they’re still trying to compensate well into adulthood for all that fun they missed." Granted, that's just a theory.
"Brain types booze more — are you surprised?"
Drinking is the only way to deal with morons: Smart people "booze so we can tolerate everyone else," says Greg at Food & Wine Blog. When sober, we tend to "take people’s responses at literal face value." But after a few drinks, "we can relax a bit, stop being so anal with semantics and let comments slide a bit."
"Speculative reasons why smart people drink more"
Most college students I have talked to are excited about the real world after school – excited about the work, the perks, but most of all, the freedom. In the real world, there are no tests or papers looming over their heads, no professors to answer to, no dealing with the stresses and dramas that invariably accompany the college experience. Yeah, college is fun, but there’s almost a mythic quality about life beyond college: it’s substituting the sweats for suits, the kegs for martinis, the hookups for a steady, sickeningly-attractive significant other… While college seniors go through the requisite nostalgia in their last few months as students, this nostalgia is still often dampened by lofty expectations for the next stage in their lives.
Why then, do so many young professionals hate their jobs?
(I must preface this by limiting my observations to those in the field of business. Most would-be doctors I know are happily trucking away in med school, most would-be lawyers are busily debating each other in law school, and the rest of my graduating class—those who are doing research in Bolivia or writing articles for Mother Jones—seem, for the most part, relatively satisfied. Which raises the question: are jobs in the business fields overly cruel, or are the people who go into business just overly hateful? Note: This observation also excludes investment bankers, who should expect to hate their jobs even before they start.)
* The College Hangover: Many young people are thrown into the fire right out of school. You’re not used to waking up before noon and having to look somewhat presentable. You’re not used to being “on” all the time, every single day, at least five days a week. If only you could skip work without anyone noticing (like college lectures), and still get your big performance bonus…that would be the life. Of course, that will never happen, and thus the nostalgia for college never really goes away. However, the College Hangover only serves as a legitimate excuse for your first few months out of school… After that, if you’re still falling asleep at work in reminiscence of those college glory days, well, you should lay off the drinking.
* The Bottom of the Totem Pole: You were a pretty big deal in college… president of some organization, captain of some sports team, leader of the beer pong circuit. Now, you’re the entry-level analyst who is seen as the little know-it-all who wants to shoot straight to the top, but in actuality is only making a contribution as a master formatter or lunch bitch. You’re relegated to modeling (thankfully we’re talking only about Excel), and making sure that someone less smart than you looks more smart than everyone else. Of course, no one is as smart as us, so it’s a tough reality to stomach.
* Those Lofty Expectations: You thought it was going to be first-class, up in the sky, sipping champagne, living the life… Your job was supposed to be glamorous, impressive, and telling of your smarts, skills, and talents. You thought that you’d be challenged every second of the day; you would have interesting coworkers, exciting projects, and intellectual discussions. You’d be an integral part of the company, just short of the glue that holds everything together. Unfortunately, most of us don’t have interesting projects all of the time, and we certainly know a couple of coworkers who have a few screws loose. We don’t foresee the hours of administrative tasks and unrewarded legwork that are part of the daily grind. You start asking yourself why you are here, what you are doing with your life, and how you can get into a new role/company/industry that is way more glamorous than what you are in now…or so you’d like to think.
* Too Much Freedom: When you’re young, there’s an ordered sequence of how things happen. After pre-school you go to kindergarten. After kindergarten, you’re in first grade. After first grade… etc, etc. The proverbial “life train” goes through a predictable sequence: elementary school, middle school, high school, college—from A to B. But after graduating from college, you’re alone at the train station, and only YOU have to figure out where to next. Get on the banking train, or the consulting one? Marketing, or sales? It always seems like the other train is moving faster, with nicer seats and greener grass on their side of the scenic route to your future. Anxiety strikes. Uneasiness festers. Resentment grows. You end up curled up in the corner of the caboose, hugging your knees, thinking you should have become a doctor instead… at least that would’ve delayed the decision-making for a few more years.
* Your Job Actually Sucks: If you liked the train analogy above, then your standards for quality have obviously been lowered from your time spent on the job. Maybe all that modeling/formatting/Excel-ing is getting to your head. Or maybe your job actually sucks. Hey, it happens. Perhaps it’s time to go to business school then.
Regardless of all the reasons why many people hate their jobs, most of them are still in these jobs…so perhaps “hate” is a strong word. Only a few recent graduates I know have been so fed up that they decided to quit well-paying, respectable jobs and brave unemployment. So, despite all the negatives, there must be some reason why we are still in the grind. Maybe it’s the money, or the benefits and perks, or the hope that things will get better. Or perhaps we are just paralyzed by fear that the next job will be worse. The main challenge is to balance the expectations of our jobs with a tempered ambition. There will always be days when unemployment looks preferable, but unless that starts to happen day after day, week upon week (meaning, Your Job Actually Sucks and you should start updating that resume), I’d say to just put your head down, put the hate aside, file it all under “Learning Experience”, and get to work.
Monday, May 23, 2011
Graduation quotes for the new generation
It's time to stop using "The Graduate." Here are some cultural references that college kids can relate to
It's nearly the end of May, and across the country thousands of fresh-faced 20-somethings will be entering the workplace after years of toiling away at collegiate studies. I recently went to a commencement address for a family member and heard not one but two references to Dr. Seuss' "Oh the Places You'll Go!" In the same speech.
Sandwiched between these words of wisdom -- taken from a book designed for babies -- was the obligatory non sequitur from some faculty member attempting to explain why the advice of "Plastics" was so funny in "The Graduate." Maybe it would have been less irritating if these weren't the exact same two quotes preached at me when I accepted my diploma. Isn't it about time we threw out these two clichéd references and updated them with some more applicable cultural dialogue?
This is why I've started peppering my commencement addresses with more "hip" movie lines to appeal to a younger audience. In case anyone wants to hire me to talk at next year's graduation, I have my list ready:*
1. Hello, class of 2012! As you embark on this next phase of your life, I want to say just one word to you. Just one word: "Rango."
2. (Point to someone in the audience, preferably in front row.) "'The leads are weak'? The fucking leads are weak?!?! You're weak. What's my name? I drove an eighty thousand dollar BMW. That's my name." Guys, this may sound harsh, but this is exactly what your first boss will sound like, especially if he's played by Alec Baldwin.
3. When in doubt, always remember the gospel of Matt Damon as he told off that douchey guy in a Harvard bar:
"In 50 years you're going to start doing some thinking on your own and you're going to come up with the fact that there are two certainties in life: one, don't do that, and two, you dropped 150 grand on a fucking education you could have got for a dollar fifty in late charges at the public library."
(If this does not go over well, go to a backup "Good Will Hunting" quote: "How do you like dem apples?")
4. In Aaron Sorkin's "The Social Network," we are told a million dollars "isn't cool." It is followed up by the statement that what is actually cool is a billion dollars. I think that's something we can all agree is a sound life philosophy! (Pause for laughter.)
But also? How crazy is it that Mark Zuckerberg didn't even finish college? Turns out he didn't even need a degree to make those "cool" billions!
Anywhoozle, good luck with those student loans.
5. A young man named Peter Parker was once told, "With great power comes great responsibility." Well, the good news here is that as far as I know, not one student graduating today has been bitten by a radioactive spider and subsequently turned into a superhero. So relax: as far as that 'personal responsibility' thing goes, it will most likely never come up in real life situations. I speak from experience.
6. In closing I say to you, graduates of Arizona State: "You are not a beautiful and unique snowflake. You are the same decaying organic matter as everyone else, and we are all part of the same compost pile." That's from Chuck Palahniuk's "Fight Club," and that's the real world people, so get used to it.
Also, it's a great film. Netflix it sometime.
*I also do Bar Mitzvahs.
Friday, May 20, 2011
May 4, 2011 This article appeared in the May 23, 2011 edition of The Nation.
A few years ago, when I was still teaching at Yale, I was approached by a student who was interested in going to graduate school. She had her eye on Columbia; did I know someone there she could talk with? I did, an old professor of mine. But when I wrote to arrange the introduction, he refused to even meet with her. “I won’t talk to students about graduate school anymore,” he explained. “Going to grad school’s a suicide mission.”
The policy may be extreme, but the feeling is universal. Most professors I know are willing to talk with students about pursuing a PhD, but their advice comes down to three words: don’t do it. (William Pannapacker, writing in the Chronicle of Higher Education as Thomas Benton, has been making this argument for years. See “The Big Lie About the ‘Life of the Mind,’” among other essays.) My own advice was never that categorical. Go if you feel that your happiness depends on it—it can be a great experience in many ways—but be aware of what you’re in for. You’re going to be in school for at least seven years, probably more like nine, and there’s a very good chance that you won’t get a job at the end of it.
At Yale, we were overjoyed if half our graduating students found positions. That’s right—half. Imagine running a medical school on that basis. As Christopher Newfield points out in Unmaking the Public University (2008), that’s the kind of unemployment rate you’d expect to find among inner-city high school dropouts. And this was before the financial collapse. In the past three years, the market has been a bloodbath: often only a handful of jobs in a given field, sometimes fewer, and as always, hundreds of people competing for each one.
It wasn’t supposed to be like this. When I started graduate school in 1989, we were told that the disastrous job market of the previous two decades would be coming to an end because the large cohort of people who had started their careers in the 1960s, when the postwar boom and the baby boom combined to more than double college enrollments, was going to start retiring. Well, it did, but things kept getting worse. Instead of replacing retirees with new tenure-eligible hires, departments gradually shifted the teaching load to part-timers: adjuncts, postdocs, graduate students. From 1991 to 2003, the number of full-time faculty members increased by 18 percent. The number of part-timers increased by 87 percent—to almost half the entire faculty.
But as Jack Schuster and Martin Finkelstein point out in their comprehensive study The American Faculty (2006), the move to part-time labor is already an old story. Less visible but equally important has been the advent and rapid expansion of full-time positions that are not tenure-eligible. No one talks about this transformation—the creation of yet another academic underclass—and yet as far back as 1993, such positions already constituted the majority of new appointees. As of 2003, more than a third of full-time faculty were working off the tenure track. By the same year, tenure-track professors—the “normal” kind of academic appointment—represented no more than 35 percent of the American faculty.
The reasons for these trends can be expressed in a single word, or buzzword: efficiency. Contingent academic labor, as non-tenure-track faculty, part-time and full-time, are formally known, is cheaper to hire and easier to fire. It saves departments money and gives them greater flexibility in staffing courses. Over the past twenty years, in other words—or really, over the past forty—what has happened in academia is what has happened throughout the American economy. Good, secure, well-paid positions—tenured appointments in the academy, union jobs on the factory floor—are being replaced by temporary, low-wage employment.
* * *
You’d think departments would respond to the Somme-like conditions they’re sending out their newly minted PhDs to face by cutting down the size of their graduate programs. If demand drops, supply should drop to meet it. In fact, many departments are doing the opposite, the job market be damned. More important is maintaining the flow of labor to their domestic sweatshops, the pipeline of graduate students who staff discussion sections and teach introductory and service courses like freshman composition and first-year calculus. (Professors also need dissertations to direct, or how would they justify their own existence?) As Louis Menand puts it in The Marketplace of Ideas (2010), the system is now designed to produce not PhDs so much as ABDs: students who, having finished their other degree requirements, are “all but dissertation” (or “already been dicked,” as we used to say)—i.e., people who have entered the long limbo of low-wage research and teaching that chews up four, five, six years of a young scholar’s life.
If anything, as Menand notes, the PhD glut works well for departments at both ends, since it gives them the whip hand when it comes to hiring new professors. Graduate programs occupy a highly unusual, and advantageous, market position: they are both the producers and the consumers of academic labor, but as producers, they have no financial stake in whether their product “sells”—that is, whether their graduates get jobs. Yes, a program’s prestige is related, in part, to its placement rate, but only in relative terms. In a normal industry, if no firm sells more than half of what it produces, then either everyone goes out of business or the industry consolidates. But in academia, if no one does better than 50 percent, then 50 percent is great. Programs have every incentive to keep prices low by maintaining the oversupply.
Still, there’s a difference between a Roger Smith firing workers at General Motors and the faculty of an academic department treating its students like surplus goods. For the CEO of a large corporation, workers are essentially entries on a balance sheet, separated from the boardroom by a great gulf of culture and physical distance. If they are treated without mercy, that is not entirely surprising. But the relationship between professors and graduate students could hardly be more intimate. Professors used to be graduate students. They belong to the same culture and the same community. Your dissertation director is your mentor, your role model, the person who spends all those years overseeing your research and often the one you came to graduate school to study under in the first place. You, in turn, are her intellectual progeny; if you make good, her professional pride. The economic violence of the academic system is inflicted at very close quarters.
How professors square their Jekyll-and-Hyde roles in the process—devoted teachers of individual students, co-managers of a system that exploits them as a group—I do not know. Denial, no doubt, along with the rationale that this is just the way it is, so what can you do? Teaching is part of the training, you hear a lot, especially when supposedly liberal academics explain why graduate-student unions are such a bad idea. They’re students, not workers! But grad students don’t teach because they have to learn how, even if the experience is indeed very valuable; they teach because departments need “bodies in the classroom,” as a professor I know once put it.
I always found it beautifully apt that my old department occupies the same space where the infamous Milgram obedience experiments were conducted in the early 1960s. (Yes, really.) Pay no attention to the screams you hear coming from the next room, the subjects were told as they administered the electric shocks, it’s for their own good—a perfect allegory of the relationship between tenured professors and graduate students (and tenured professors and untenured professors, for that matter).
Well, but so what? A bunch of spoiled kids are having trouble finding jobs—so is everybody else. Here’s so what. First of all, they’re not spoiled. They’re doing exactly what we always complain our brightest students don’t do: eschewing the easy bucks of Wall Street, consulting or corporate law to pursue their ideals and be of service to society. Academia may once have been a cushy gig, but now we’re talking about highly talented young people who are willing to spend their 20s living on subsistence wages when they could be getting rich (and their friends are getting rich), simply because they believe in knowledge, ideas, inquiry; in teaching, in following their passion. To leave more than half of them holding the bag at the end of it all, over 30 and having to scrounge for a new career, is a human tragedy.
Sure, lots of people have it worse. But here’s another reason to care: it’s also a social tragedy, and not just because it represents a colossal waste of human capital. If we don’t make things better for the people entering academia, no one’s going to want to do it anymore. And then it won’t just be the students who are suffering. Scholarship will suffer, which means the whole country will. Knowledge, as we’re constantly told, is a nation’s most important resource, and the great majority of knowledge is created in the academy—now more than ever, in fact, since industry is increasingly outsourcing research to universities where, precisely because graduate students cost less than someone who gets a real salary, it can be conducted on the cheap. (Bell Labs, once the flagship of industrial science, is a shell of its former self, having suffered years of cutbacks before giving up on fundamental research altogether.)
It isn’t just the sciences that matter; it is also the social sciences and the humanities. And it isn’t just the latter that are suffering. Basic physics in this country is all but dead. From 1971 to 2001, the number of bachelor’s degrees awarded in English declined by 20 percent, but the number awarded in math and statistics declined by 55 percent. The only areas of the liberal arts that saw an increase in BAs awarded were biology and psychology—and this at a time when aggregate enrollment expanded by something like 75 percent. On the work that is done in the academy depends the strength of our economy, our public policy and our culture. We need our best young minds going into atmospheric research and international affairs and religious studies, chemistry and ethnography and art history. By pursuing their individual interests, narrowly understood, departments are betraying both the values they are pledged to uphold—the pursuit of knowledge, the spirit of critical inquiry, the extension of the humanistic tradition—and the nation they exist to serve.
We’ve been here before. Pay was so low in the nineteenth century, when academia was still a gentleman’s profession, that in 1902 Andrew Carnegie created the pension plan that would evolve into TIAA-CREF, the massive retirement fund. After World War II, when higher education was seen as an urgent national priority, a consensus emerged that salaries were too small to attract good people. Compensation soared through the 1950s and ’60s, then hit the skids around 1970 and didn’t recover for almost thirty years. It’s no surprise that the percentage of college freshmen expressing an interest in academia was more than three times higher in 1966 than it was in 2004.
But the answer now is not to raise professors’ salaries. Professors already make enough. The answer is to hire more professors: real ones, not academic lettuce-pickers.
Yet that’s the last thing schools are apt to do. What we have seen instead over the past forty years, in addition to the raising of a reserve army of contingent labor, is a kind of administrative elephantiasis, an explosion in the number of people working at colleges and universities who aren’t faculty, full-time or part-time, of any kind. From 1976 to 2001, the number of nonfaculty professionals ballooned nearly 240 percent, growing more than three times as fast as the faculty. Coaching staffs and salaries have grown without limit; athletic departments are virtually separate colleges within universities now, competing (successfully) with academics. The size of presidential salaries—more than $1 million in several dozen cases—has become notorious. Nor is it only the presidents; the next six most highly paid administrative officers at Yale averaged over $430,000 in 2007. As Gaye Tuchman explains in Wannabe U (2009), a case study in the sorrows of academic corporatization, deans, provosts and presidents are no longer professors who cycle through administrative duties and then return to teaching and research. Instead, they have become a separate stratum of managerial careerists, jumping from job to job and organization to organization like any other executive: isolated from the faculty and its values, loyal to an ethos of short-term expansion, and trading in the business blather of measurability, revenue streams, mission statements and the like. They do not have the long-term health of their institutions at heart. They want to pump up the stock price (i.e., U.S. News and World Report ranking) and move on to the next fat post.
If you’re tenured, of course, life is still quite good (at least until the new provost decides to shut down your entire department). In fact, the revolution in the structure of academic work has come about in large measure to protect the senior professoriate. The faculty have steadily grayed in recent decades; by 1998 more than half were 50 or older. Mandatory retirement was abolished in 1986, exacerbating the problem. Departments became “tenured in,” with a large bolus of highly compensated senior professors and room, increasingly squeezed in many cases, for just a few junior members—another reason jobs have been so hard to find. Contingent labor is desirable above all because it saves money for senior salaries (as well as relieving the tenure track of the disagreeable business of teaching low-level courses). By 2004, while pay for assistant and associate professors still stood more or less where it had in 1970, that for full professors was about 10 percent higher.
What we have in academia, in other words, is a microcosm of the American economy as a whole: a self-enriching aristocracy, a swelling and increasingly immiserated proletariat, and a shrinking middle class. The same devil’s bargain stabilizes the system: the middle, or at least the upper middle, the tenured professoriate, is allowed to retain its prerogatives—its comfortable compensation packages, its workplace autonomy and its job security—in return for acquiescing to the exploitation of the bottom by the top, and indirectly, the betrayal of the future of the entire enterprise.
* * *
But now those prerogatives are also under threat. I am not joining the call for the abolition of tenure—a chorus that includes two of last year’s most widely noticed books on the problems of America’s colleges and universities, Higher Education?, by Andrew Hacker and Claudia Dreifus, and Crisis on Campus, by Mark Taylor. Tenure certainly has its problems. It crowds out opportunities for young scholars and allows academic deadwood to accumulate on the faculty rolls. But getting rid of it would be like curing arteriosclerosis by shooting the patient. For one thing, it would remove the last incentive for any sane person to enter the profession. People still put up with everything they have to endure as graduate students and junior professors for the sake of a shot at that golden prize, and now you’re going to take away the prize? No, it is not good for so many of academia’s rewards to be backloaded into a single moment of occupational transfiguration, one that sits like a mirage at the end of twelve or fifteen years of Sinaitic wandering. Yes, the job market would eventually rebalance itself if the profession moved, say, to a system of seven-year contracts, as Taylor suggests. But long before it did, we would lose a generation of talent.
Besides, how would the job market rebalance itself? If the people who now have tenure continued to serve under some other contractual system, the same surplus of labor would be chasing the same scarcity of employment. Things would get better for new PhDs only if schools started firing senior people. Which, as the way things work in other industries reminds us, they would probably be glad to do. Why retain a 55-year-old when you can replace her with a 30-year-old at half the price? Now that’s a thought to swell a provost’s revenue stream. Talk about efficiency.
And what exactly are you supposed to do at that point if you’ve spent your career becoming an expert in, say, Etruscan history? Academia exists in part to support research the private sector won’t pay for, knowledge that can’t be converted into a quick buck or even a slow one, but that adds value to society in other ways. Who’s going to pursue that kind of inquiry if they know there’s a good chance they’re going to get thrown out in the snow when they’re 50 (having only started to earn a salary when they were 30, to boot)? Doctors and lawyers can set up their own practice, but a professor can’t start his own university. This kind of thing is appalling enough when it happens to blue-collar workers. In an industry that requires a dozen years of postsecondary education just to gain an entry-level position, it is unthinkable.
Nor should we pooh-pooh the threat the abolition of tenure would pose to academic freedom, as Hacker and Dreifus do. “We have scoured all the sources we could find,” they write, “yet we could not find any academic research whose findings led to terminating the jobs of college faculty members.” Yes, because of tenure. If deans and trustees and alumni and politicians rarely even try to have professors fired, that is precisely because they know they have so little chance of making it happen. Before tenure existed, arbitrary dismissals were common. Can you imagine what the current gang of newly elected state legislators would do if they could get their hands on the people who teach at public universities? (Just look at what happened to William Cronon, the University of Wisconsin historian whose e-mails were demanded by the state Republican Party after he exposed the role of the American Legislative Exchange Council in Governor Scott Walker’s attack on public employee unions.) Hacker and Dreifus, who recognize the importance of academic freedom, call instead of tenure for presidents and trustees with “backbone” (a species as wonderful as the unicorn, and almost as numerous). Sure, and as long as the king is a good man, we don’t need democracy. Academics play a special role in society: they tell us things we don’t want to hear—about global warming, or the historical Jesus, or the way we raise our children. That’s why they need to have special protections.
* * *
But the tenure system, which is already being eroded by the growth of contingent labor, is not the only thing that is under assault in the top-down, corporatized academy. As Cary Nelson explains in No University Is an Island (2010), shared governance—the principle that universities should be controlled by their faculties, which protects academic values against the encroachments of the spreadsheet brigade—is also threatened by the changing structure of academic work. Contingent labor undermines it both directly—no one asks an adjunct what he thinks of how things run—and indirectly. More people chasing fewer jobs means that everyone is squeezed for extra productivity, just like at Wal-Mart. As of 1998, faculty at four-year schools worked an average of about seven hours more per week than they had in 1972 (for a total of more than forty-nine hours a week; the stereotype of the lazy academic is, like that of the welfare queen, a politically useful myth). Not surprisingly, they also reported a shrinking sense of influence over campus affairs. Who’s got the time? Academic labor is becoming like every other part of the American workforce: cowed, harried, docile, disempowered.
In macropolitical terms, the erosion of tenure and shared governance undermines the power of a large body of liberal professionals. In this it resembles the campaign against teachers unions. Tenure, in fact, is a lot like unionization: imperfect, open to corruption and abuse, but incomparably better than the alternative. Indeed, tenure is what professors have instead of unions (at least at private universities, where they’re banned by law from organizing). As for shared governance, it is nothing other than one of the longest-standing goals of the left: employee control of the workplace. Yes, professors have it better than a lot of other workers, including a lot of others in the academy. But the answer, for the less advantaged, is to organize against the employers who’ve created the situation, not drag down the relatively privileged workers who aren’t yet suffering as badly: to level up, in other words, not down.
Of course, some sectors of the academy—the ones that educate the children of the wealthy and the upper middle class—continue to maintain their privilege. The class gradient is getting steeper, not only between contingent labor and the tenure track, and junior and senior faculty within the latter, but between institutions as well. Professors at doctoral-granting universities not only get paid a lot more than their colleagues at other four-year schools; the difference is growing, from 17 percent in 1984 to 28 percent in 2003. (Their advantage over professors at community colleges increased during the same period from 33 percent to 49 percent.) The rich are getting richer. In 1970 (it seems like an alternative universe now) faculty at public colleges and universities actually made about 10 percent more than those at private schools. By 1999 the lines had crossed, and public salaries stood about 5 percent lower. The aggregate student-faculty ratio at private colleges and universities is 10.8 to 1; at public schools, it is 15.9 to 1—almost 50 percent higher.
Here we come to the most important issue facing American higher education. Public institutions enroll about three-quarters of the nation’s college students, and public institutions are everywhere under financial attack. As Nancy Folbre explains in Saving State U (2010), a short, sharp, lucid account, spending on higher education has been falling as a percentage of state budgets for more than twenty years, to about two-thirds of what it was in 1980. The average six-year graduation rate at state schools is now a dismal 60 percent, a function of class size and availability, faculty accessibility, the use of contingent instructors and other budget-related issues. Private universities actually lobby against public funding for state schools, which they see as competitors. In any case, a large portion of state scholarship aid goes to students at private colleges (in some cases, more than half)—a kind of voucher system for higher education.
Meanwhile, public universities have been shifting their financial aid criteria from need to merit to attract applicants with higher scores (good old U.S. News again), who tend to come from wealthier families. Per-family costs at state schools have soared in recent years, from 18 percent of income for those in the middle of the income distribution in 1999 to 25 percent in 2007. Estimates are that over the past decade, between 1.4 million and 2.4 million students have been prevented from going to college for financial reasons—about 50 percent more than during the 1990s. And of course, in the present climate of universal fiscal crisis, it is all about to get a lot worse.
* * *
Our system of public higher education is one of the great achievements of American civilization. In its breadth and excellence, it has no peer. It embodies some of our nation’s highest ideals: democracy, equality, opportunity, self-improvement, useful knowledge and collective public purpose. The same president who emancipated the slaves and funded the transcontinental railroad signed the Morrill Land Grant Act of 1862, which set the system on its feet. Public higher education is a bulwark against hereditary privilege and an engine of social mobility. It is altogether to the point that the strongest state systems are not to be found in the Northeast, the domain of the old WASP aristocracy and its elite private colleges and universities, but in places like Michigan, Wisconsin, Illinois, Virginia, North Carolina and, above all, California.
Now the system is in danger of falling into ruin. Public higher education was essential to creating the mass middle class of the postwar decades—and with it, a new birth of political empowerment and human flourishing. The defunding of public higher education has been essential to its slow destruction. In Unmaking the Public University, Newfield argues that the process has been deliberate, a campaign by the economic elite against the class that threatened to supplant it as the leading power in society. Social mobility is now lower in the United States than it is in Northern Europe, Australia, Canada and even France and Spain, a fact that ought to be tattooed on the foreheads of every member of Congress, so directly does it strike at America’s identity as the land of opportunity.
But it was not only the postwar middle class that public higher education helped create; it was the postwar prosperity altogether. Knowledge, again, is our most important resource. States that balance their budgets on the backs of their public universities are not eating their seed corn; they’re trampling it into the mud. My state of Oregon, a chronic economic underperformer, has difficulty attracting investment, not because its corporate taxes are high—they’re among the lowest—but because its workforce is poorly educated. So it will be for the nation as a whole. Our college-completion rate has fallen from second to eighth. And we are not just defunding instruction; we are defunding research, the creation of knowledge itself. Stipends are so low at the University of California, Berkeley, the third-ranked research institution on the planet, that the school is having trouble attracting graduate students. In fact, the whole California system, the crown jewel of American public higher education, is being torn apart by budget cuts. This is not a problem; it is a calamity.
Private institutions are in comparable trouble, for reasons that will sound familiar: too much spending during the boom years—much of it on construction, much of it driven by the desire to improve “market position” relative to competitors by offering amenities like new dorms and student centers that have nothing to do with teaching or research—supported by too much borrowing, has led to a debt crisis. Among the class of academic managers responsible for the trouble in the first place, an industry of reform has sprung up, along with a literature of reform to go with it. Books like Taylor’s Crisis on Campus, James Garland’s Saving Alma Mater (2009) and the most measured and well-informed of the ones I’ve come across, Robert Zemsky’s Making Reform Work (2009), propose their variously visionary schemes.
Nearly all involve technology to drive efficiency. Online courses, distance learning, do-it-yourself instruction: this is the future we’re being offered. Why teach a required art history course to twenty students at a time when you can march them through a self-guided online textbook followed by a multiple-choice exam? Why have professors or even graduate students grade papers when you can outsource them to BAs around the country, even the world? Why waste time with office hours when students can interact with their professors via e-mail?
The other great hope—I know you’ll never see this coming—is the market. After all, it works so well in healthcare, and we’re already trying it in primary and secondary education. Garland, a former president of Miami University of Ohio (a public institution), argues for a voucher system. Instead of giving money to schools, the state would give it to students, and the credit would be good at any nonprofit institution in the state—in other words, at private ones as well. The student would run the show (as the customer should, of course), scouring the market like a savvy consumer. Universities, in turn, “would compete with each other…by tailoring their course offerings, degree programs, student services, and extracurricular activities” to the needs of our newly empowered 18-year-olds, and the invisible hand would rain down its blessings.
But do we really want our higher education system redesigned by the self-identified needs of high school seniors? This is what the British are about to try, and in a country with one of Europe’s most distinguished intellectual traditions, they seem poised to destroy the liberal arts altogether. How much do 18-year-olds even know about what they want out of college? About not only what it can get them, but what it can give them? These are young people who don’t know what college is, who they are, who they might want to be—things you need a college education, and specifically a liberal arts education, to help you figure out.
* * *
Yet the liberal arts, as we know, are dying. All the political and parental pressure is pushing in the other direction, toward the “practical,” narrowly conceived: the instrumental, the utilitarian, the immediately negotiable. Colleges and universities are moving away from the liberal arts toward professional, technical and vocational training. Last year, the State University of New York at Albany announced plans to close its departments of French, Italian, Russian, classics and theater—a wholesale slaughter of the humanities. When Garland enumerates the fields a state legislature might want to encourage its young people to enter, he lists “engineering, agriculture, nursing, math and science education, or any other area of state importance.” Apparently political science, philosophy, history and anthropology, among others, are not areas of state importance. Zemsky wants to consider reducing college to three years—meaning less time for young people to figure out what to study, to take courses in a wide range of disciplines, to explore, to mature, to think.
When politicians, from Barack Obama all the way down, talk about higher education, they talk almost exclusively about math and science. Indeed, technology creates the future. But it is not enough to create the future. We also need to organize it, as the social sciences enable us to do. We need to make sense of it, as the humanities enable us to do. A system of higher education that ignores the liberal arts, as Jonathan Cole points out in The Great American University (2009), is what they have in China, where they don’t want people to think about other ways to arrange society or other meanings than the authorized ones. A scientific education creates technologists. A liberal arts education creates citizens: people who can think broadly and critically about themselves and the world.
Yet of course it is precisely China—and Singapore, another great democracy—that the Obama administration holds up as the model to emulate in our new Sputnik moment. It’s funny; after the original Sputnik, we didn’t decide to become more like the Soviet Union. But we don’t possess that kind of confidence anymore.
There is a large, public debate right now about primary and secondary education. There is a smaller, less public debate about higher education. What I fail to understand is why they aren’t the same debate. We all know that students in elementary and high school learn best in small classrooms with the individualized attention of motivated teachers. It is the same in college. Education, it is said, is lighting a fire, not filling a bucket. The word shares its root with “educe,” from the Latin for “to lead forth.” Learning isn’t about downloading a certain quantity of information into your brain, as the proponents of online instruction seem to think. It is about the kind of interchange and incitement—the leading forth of new ideas and powers—that can happen only in a seminar. (“Seminar” being a fancy name for what every class already is from K–12.) It is labor-intensive; it is face-to-face; it is one-at-a-time.
The key finding of Richard Arum and Josipa Roksa’s Academically Adrift (2011), that a lot of kids aren’t learning much in college, comes as no surprise to me. The system is no longer set up to challenge them. If we’re going to make college an intellectually rigorous experience for the students who already go—still more, for all the ones we want to go if we’re going to reach the oft-repeated goal of universal postsecondary education, an objective that would double enrollments—we’re going to need a lot more teachers: well paid, institutionally supported, socially valued. As of 2003 there were about 400,000 tenure-track professors in the United States (as compared with about 6 million primary- and secondary-school teachers). Between reducing class sizes, reversing the shift to contingent labor and beefing up our college-completion rates, we’re going to need at least five times as many.
So where’s the money supposed to come from? It’s the same question we ask about the federal budget, and the answer is the same. We’re still a very wealthy country. There’s plenty of money, if we spend it on the right things. Just as we need to wrestle with the $700 billion gorilla of defense, so do universities need to take on administrative edema and extracurricular spending. We can start with presidential salaries. Universities, like corporations, claim they need to pay the going rate for top talent. The argument is not only dubious—whom exactly are they competing with for the services of these managerial titans, aside from one another?—it is beside the point. Academia is not supposed to be a place to get rich. If your ego can’t survive on less than $200,000 a year (on top of the prestige of a university presidency), you need to find another line of work. Once, there were academic leaders who put themselves forward as champions of social progress: people like Woodrow Wilson at Princeton in the 1900s; James Conant at Harvard in the 1940s; and Kingman Brewster at Yale, Clark Kerr at the University of California and Theodore Hesburgh at Notre Dame in the 1960s. What a statement it would make if the Ivy League presidents got together and announced that they were going to take an immediate 75 percent pay cut. What a way to restore academia’s moral prestige and demonstrate some leadership again.
But leadership will have to come from somewhere else, as well. Just as in society as a whole, the academic upper middle class needs to rethink its alliances. Its dignity will not survive forever if it doesn’t fight for that of everyone below it in the academic hierarchy. (“First they came for the graduate students, and I didn’t speak out because I wasn’t a graduate student…”) For all its pretensions to public importance (every professor secretly thinks he’s a public intellectual), the professoriate is awfully quiet, essentially nonexistent as a collective voice. If academia is going to once again become a decent place to work, if our best young minds are going to be attracted back to the profession, if higher education is going to be reclaimed as part of the American promise, if teaching and research are going to make the country strong again, then professors need to get off their backsides and organize: department by department, institution to institution, state by state and across the nation as a whole. Tenured professors enjoy the strongest speech protections in society. It’s time they started using them.
May 4, 2011 | This article appeared in the May 23, 2011 edition of The Nation.
Some books on the topic:
Unmaking the Public University: The Forty-Year Assault on the Middle Class. By Christopher Newfield.
The American Faculty: The Restructuring of Academic Work and Careers. By Jack H. Schuster and Martin J. Finkelstein.
The Marketplace of Ideas: Reform and Resistance in the American University. By Louis Menand.
Wannabe U: Inside the Corporate University. By Gaye Tuchman.
Higher Education? How Colleges Are Wasting Our Money and Failing Our Kids—and What We Can Do About It. By Andrew Hacker and Claudia Dreifus.
Crisis on Campus: A Bold Plan for Reforming Our Colleges and Universities. By Mark C. Taylor.
No University Is an Island: Saving Academic Freedom. By Cary Nelson.
Saving State U: Why We Must Fix Public Higher Education. By Nancy Folbre.
Saving Alma Mater: A Rescue Plan for America’s Public Universities. By James C. Garland.
Making Reform Work: The Case for Transforming Higher Education. By Robert Zemsky.
The Great American University: Its Rise to Preeminence, Its Indispensable National Role, Why It Must Be Protected. By Jonathan R. Cole.
Academically Adrift: Limited Learning on College Campuses. By Richard Arum and Josipa Roksa.