Several hundred fast food workers have walked off their jobs at McDonald’s, Burger King, Taco Bell and other chains in New York City, where most earn the minimum wage of $7.25 an hour.
At the moment the media seems to be treating the walk-out as more of a novelty than anything else—the Times is reporting it on their City Room blog. Commenters around the Web are nearly unanimous in saying that challenging the fast-food giants isn’t going to work.
I’d suggest, though, that this could be a distant early signal of a labor trend that we may see increasingly as the decade goes on—indeed, a labor trend that may be necessary just to keep the American economy working.
More and more of the new jobs we’re seeing in this recovery are service jobs. In larger cities, college grads compete with each other to land jobs folding shirts at J. Crew or steaming milk at Starbucks. Blue collar jobs are already being automated or outsourced, and low- and mid-level white collar jobs are next. In the long run, the functions that can’t be moved into cyberspace involve actually giving customers physical goods or providing hands-on services, like home health care attendants.
These are the jobs that will still be here in 2020. These are also, however, jobs that rarely produce either a living wage or a career path. In a sense they are like the factory jobs of the early 20th century, before unions and the American labor movement.
I’m acutely aware of all the optimistic arguments that say as old jobs are automated, new jobs appear—new jobs with higher wages that require more intellect, jobs that machines can’t do. But I’m not sure that’s always going to be true as we begin to automate more and more white-collar positions and put intelligent robots to work.
It’s generally assumed on both sides of the political aisle that a healthy, thriving middle class is crucial to the American economy. But that’s not how the future is shaping up.
So we may have a choice: do we increase the wages of service workers, as we did with factory workers a century ago, and give them a path into the middle class? Or do we increasingly redistribute income via the government through measures like the earned income tax credit?
I think the first option—raising service industry wages—is a healthier alternative than a permanently shrinking middle class. But that will require employers to go along, and that’s not going to happen without things like fast food strikes.
On the other hand, it may already be too late for that. Behind the scenes even fast food automation is moving ahead quickly. By 2020 you may be ordering on an iPad and picking up your meal from a conveyor belt, with few low-wage service workers in sight.
I’ve been on the road this week speaking to some fairly traditional groups--lawyers, insurance executives--and I’m getting questions about Marissa Mayer’s edict canceling work-from-home arrangements for Yahoo workers:
“Listen, if a Silicon Valley outfit can’t make it work, then maybe this telecommuting thing just isn’t such a good idea.”
First, I explain that the Yahoo situation isn’t typical. Mayer inherited a company that, having lacked direction for years, probably doesn’t even know what all its people are doing in the first place. Temporarily herding them all into the office is likely a good way to sort it out. (Not to mention that new mother Mayer built a nursery next to her office, so the work-at-home issue is moot for her.)
But then I emphasize that telecommuting itself is here to stay, and will grow only more important. We tend to forget that the US is still the fastest-growing developed nation on earth--now over 300 million, on our way to 400 million sometime in the early Forties. If you think it’s crowded out there now, just wait. Traffic congestion already adds one entire work week of sitting in the car to the average worker’s life each year, and that number keeps going up.
It’s going to be increasingly difficult to explain to young office workers why they have to get in the car and commute every day in order to sit in a cubicle and send emails and IM and do videoconferences. That will become even more of an issue for employers later in this decade as baby boomers finally retire and the competition for talented millennials really heats up.
Sure, there will always be good reasons for people to meet in person, although those occasions may diminish as telepresence systems get ever more “real” and the next generation of workers brings a new comfort with virtual work. But the need to meet in person every once in a while doesn’t mean that you have to move your entire workforce into the office every day.
The trend is utterly inevitable. And perhaps fifty years from now there will even be an online trivia competition in which one of the questions is:
“What was a rush hour?”
Give away what you used to sell, and sell what you used to give away.
It sounds like a zen parable, but it’s also something that more of my clients are pondering, as their business models move into the cybersphere.
My career started at Rolling Stone, so I naturally think of the music industry as an example. Years ago, if you were a rock and roll band, the way you made money was simple: you recorded an LP or CD, and when it was released, you went on tour to promote it.
You needed as much exposure as possible, so ticket prices were low and tours often didn’t make much money. You nearly gave the t-shirts away, because when you left Philadelphia you wanted to make sure that every kid in town had Led Zeppelin Summer 1976 displayed on their backs. Where you made your money was album sales, and everything else was marketing toward that end.
Now, for reasons ranging from online piracy to lower royalties for streaming services like Spotify, the value of recorded music has dropped precipitously. So some bands actually give away their music and instead make money on tours and selling merchandise (like those t-shirts). Baby boomers who haven’t been to a rock concert in a while are often stunned by $200 ticket prices and wonder: when did that start? Well, it started when it became clear that nobody was ever again going to get rich on CDs.
Something similar happened in journalism: in the early days of the Internet we found it impossible to charge for online news--but people were happy to pay $4 or $5 for a single article from the archives. Fresh news, it seemed, was supposed to be free, but once it was a few days old it was information and readers were willing to pay for it. (Explaining that to a grizzled old newspaper editor was a real challenge.) While more newspapers are now finally charging for news, some still use a business model wherein today’s news is free but you have to be a subscriber to see anything older than 24 hours.
Where else does this happen? One client used to make excellent money as a clearing house for government environmental records that were otherwise hard to access. But they recognized that sooner or later those records would be easily accessible online, so they turned their free Website into a for-pay community for environmental professionals. Lawyers, too, are increasingly mulling a future in which basic legal services may be either automated or outsourced--so perhaps the real value they offer is the advice they give away for free during those client lunches and dinners. Lately I’ve even heard of corporate travel agencies that earn bonuses for NOT booking travel for employees but instead talking them into using telepresence.
Clearly one size doesn’t fit all--but it’s an interesting question for almost every intellectual property or services company to consider: Give away what you used to sell, and sell what you used to give away.
No matter what my speaking topic, the Q&A portion often turns into a discussion of Millennial behavior, either from the perspective of parents or employers. Did our parents spend this much time at professional meetings talking about us? Perhaps--but I also think that we're seeing not only traditional generational tut-tutting about the youngsters' strengths and shortcomings, but also a deeper kind of bewilderment about the impact of virtual communication and relationships.
Last week, at a major international consulting firm, I heard a story from the head of internships that combines two dominant themes.
This particular company runs an extensive multi-year internship program that begins with undergraduates. The selection process is so demanding that when a student lands an internship, she can be pretty sure that she's going to get a job after graduation.
In this case, the interns were invited to a weekend field trip in a large Eastern city, mixing with some of the company's higher-ranking officers. At one point, the whole group went from one venue to another via bus. One young woman found herself sitting next to the company's CFO on the bus. And she proceeded to spend the entire ride texting on her mobile phone.
Afterwards, the CFO went to the internship coordinator and said he was sorry, but someone who can't make small talk on a short bus ride just isn't going to work out at the firm.
At the end of the day the internship coordinator took the young woman aside and said that, regretfully, they were going to have to remove her from the program. And then the coordinator just had to ask: "What were you thinking? Sitting next to a senior officer of the company and spending all your time texting?"
"I was texting my father," the girl explained. "To ask him what you should say to a CFO."
When I worked for Newsweek in the Eighties and Nineties I was always surprised by how much stature and respect the magazine received in Europe and Asia, given that it was a) in English and b) relatively small in circulation. But it was, of course, read by English-speaking power players (there always seemed to be multiple Rolex and Patek Philippe ads), and the marketing was excellent: newsstands everywhere seemed to have Newsweek awnings.
In China the demise of the print edition was seen as a major watershed in American publishing. In several interviews there I tried to explain that there are many other print magazines still doing well--that Newsweek's need to drop the print edition was the result of some business and editorial missteps in addition to the changing publishing environment. I had always expected Newsweek to keep a print edition at least through this decade--increasingly a luxury product for a diminishing audience, more or less collating the best of that week's Website--but not to vanish quite so quickly.
My Chinese interviewers were genuinely surprised to hear that there are still many print magazines being started in the U.S. The real lesson of Newsweek print's demise is that there is no longer much margin for error in publishing on paper--whether you're an 80-year-old brand name or a freshly-minted start-up.
I’ve long predicted that the real turning point on climate change action will be driven by extreme weather events. Humans aren’t built to sense climate…our time frame is weather, and that’s what we respond to.
We’ve already seen two examples historically. Australia was the only developed country besides the US not to sign the Kyoto Protocol on carbon mitigation, since they sell lots of coal to China. Then Australia went through the worst drought in its history—coincidentally, just about the time that Al Gore’s An Inconvenient Truth was released. The Australian electorate blamed climate change for the drought and elected a new government, one of whose first acts was to sign the Protocol and initiate climate change legislation.
Something similar happened in Russia after their record-breaking drought several years ago. A decade earlier Vladimir Putin had said that Russia welcomed global warming—the wheat could grow longer and they’d need to buy fewer coats! Then came the drought, which devastated their wheat crop and spawned such enormous fires in the countryside that Moscow was choking on dense smoke. Soon thereafter Dmitry Medvedev announced that Russia needed to face the fact that climate change was a real threat.
Of course, humans being humans, once the extreme weather subsided (and the global economy tanked) both Australia and Russia grew less enthusiastic about carbon reduction. But the seed had been planted.
Now it’s the U.S.’s turn, and New York Mayor Bloomberg’s abrupt endorsement of Obama as the candidate best suited to tackle climate change is another example of extreme weather as sudden motivator.
Inevitably, Americans will lose interest in the climate change issue once the damage is repaired and we have a few months of normal weather. But now another prominent American, whose Wall Street loyalties insulate him from dismissal as just another liberal tree-hugger, is on the record about climate change, and another seed is planted.
I suspect it will take the rest of this decade, and a series of extreme weather events worldwide, to finally create a global awakening. In the US, for example, it might be a Category 4 hurricane hitting Miami, which my insurance clients say would almost certainly bankrupt the state of Florida. When private insurers began to shun Florida coastal property, Florida basically self-insured, and there’s not enough money in the state treasury to cover a major hit.
Someday, in short, there will be a number of extreme weather incidents around the globe in a relatively brief period of time. And that will finally catalyze the sophisticated social networks of the late Teens to create a worldwide movement demanding action on climate change.
It will be the Millennial generation, not the Boomers, who lead this movement. Climate change is not a Boomer issue—most Boomers will be happy if they’re still sitting on the porch in 2040, when the global impacts get truly dire. It will be the Millennials and their children whose futures are truly at stake. For them, companies and countries that continue to emit excessive carbon dioxide will be seen as international criminals. And only then will serious worldwide carbon reduction begin.
So, for the first time since I was 18, I am soon going to be without a car. That's no small emotional transition for a southern California native who grew up in a culture where if you hadn't been in the car for an hour, you hadn't gone anywhere. It was a world in which every boy in high school counted down the hours until you were old enough to take the driving test. Since back then it was age 16, you pretty much started counting down when you turned 12.
When I moved to New York City twelve years ago, I had my car shipped out from San Francisco. And even though the costs of garaging and insuring a vehicle in New York are ridiculous, I always felt that it was worth the price. I even started planning how to convince my apartment building to install a charging station for the plug-in hybrid I would buy next.
But no more. Two things changed my mind. The first is that traffic in New York City, never good, has gotten dramatically worse in the past decade. Driving, always difficult, is now just about impossible: there is almost no time that you can count on a trip without severe congestion somewhere along the line. And the second is that there are now four Zipcars in the basement parking garage of my building, available for hourly rental whenever I really must have a car.
Suddenly I'm far more sympathetic to the thesis that automobiles are becoming less interesting to the Millennial generation.
The open road is not exciting if you spend most of your time sitting in traffic, and it's going to get worse: the US is still the fastest-growing industrialized nation on earth and we'll add 15 million more licensed drivers just in the next three years. Even the smallest towns I visit these days have rush hours and traffic back-ups.
Add to this the fact that more Millennials are moving back into city centers or close-in suburbs, where there is either mass transit or services like Zipcar.
Obviously, once you start a family, the car becomes more important, so it's not as if the automobile industry is going to collapse. But it will be a fundamental shift in the American psyche when the car--once a symbol of pleasure and freedom--becomes just another somewhat onerous duty of adulthood.
I often look ahead to the year 2020 for industries ranging from finance and media to transportation, energy and more. But last month it was an enterprise closer to my heart: the Professional Convention Management Association educational meeting in San Antonio, where I talked about “Imagining the Convention of 2020.”
Will virtual conferences and events supplant their real-world predecessors? The short answer is, of course, no. Consider last summer’s five-year college reunions for the class of 2006—the first class to graduate with Facebook in full flower. Ever since 2006, alumni organizers have worried about this class: these graduates have been talking to each other regularly on social networks ever since they graduated. Would they still want to meet in person as well? The answer was yes: five-year reunions for the class of 2006 were well-attended. Hugs and beers, in short, are still not effectively shared online.
However: virtual events will become a far larger part of the conference industry in the late Teens and Twenties. The shift will be driven by much better (and cheaper) video displays and ubiquitous high-bandwidth connectivity. Add to that a new generation of virtually-adept attendees—and the increasing cost, both economic and environmental, of conference travel. And crucially, conference sponsors—the folks who buy the booth space and sponsor those lunches and coffee breaks—will also start to move more of their marketing budgets into the virtual world. Thus the conference industry needs to think hard today about how to make money on virtual events.
Too many conference planners remind me of newspaper publishers a decade ago, for whom online was a sideshow that didn’t get the intense focus it deserved. Now, as print revenues plunge, newspaper publishers have far less time and money left to reinvent their businesses. The printed newspaper may still be the centerpiece, but the audience is spending far more time online. Ironically, traditional publishers are increasingly turning to real-world events to bolster their bottom lines. Event organizers need to do this in reverse: begin to think of events as content, and figure out how to “publish” them to larger audiences.
The first problem is that virtual conferences are today a Babel of various interfaces and technical standards. That’s not how it is in the physical world, where every conference venue is fundamentally familiar—whether in Hong Kong or New Orleans, attendees immediately recognize registration booths, hallways, meeting rooms, the convention floor. And at every convention center the exhibitors’ trucks deliver standardized booths to the loading docks, and those booths are then set up, in a standard way, on the show floor. But virtual conferences vary wildly in look, navigation and function, confusing to attendees and frustrating for exhibitors who have no interest in building a different digital “booth” for every show that comes along.
More of my thoughts are in this interview in PCMA’s magazine, Convene. But it’s worth noting that for me the best part of the PCMA speech was after I was finished: the people I met the rest of that day in San Antonio. That’s an experience we can’t yet entirely duplicate in the virtual world—but we will grow much better at it in the years ahead. The conference organizers who get it right will literally reinvent the meaning of event.
The other day an association hired me for a return speaking visit, noting that the last time I’d spoken to them, five years ago, I’d predicted that there would be “tablets everywhere.” And that was several years before the iPad was launched.
I was pleased that they had remembered—and happy that I’d been right. And while I don’t think that precise prognostication is the most valuable offering of the futurist, the fact is that anyone who talks about the future almost inevitably starts making predictions.
For me, predictions started with writing science fiction thirty years ago. And going back over my old short stories I see that even some of my odder predictions ultimately came true. For example: flowers implanted with bioluminescent genes so they glow in the dark in a story called “Lilies of the Trench.” Or a violin with electronic bow and strings so that it can only be heard through headphones in “Klysterman’s Silent Violin.”
Lately, along with tablets, I’ve predicted what I call heads-up goggles—eyeglasses with clear glass that also have a small projection of your computer screen down in one corner. In my speech The Virtualization of America (and the World) I describe how via devices like heads-up goggles we will someday be connected to the virtual world almost constantly. Kids born in 2025 will have to be taught what “offline” means—because “online” will be the normal state of things.
I’ve been talking about heads-up goggles for five years or so, so I was happy earlier this year when Google announced their Google Glasses program. Google, in fact, predicted their heads-up goggles would be on sale by the end of 2012. That’s a prediction I don’t agree with—the technology is still too complex and expensive. I suspect Google is just trying to grab some mindshare with a premature announcement. Google Glasses is, after all, a pretty catchy name. Although I think Apple should hurry up and announce something as well—“iGlasses” sounds even better.
But the real value of the futurist is to talk about directions, rather than destinations—to use examples of what might be in the future to suggest ways to change what we do today. If you rely too much on making precise predictions, sooner or later you’re going to get in trouble. Futurists as a breed are still often mocked for their predictions of “flying cars.” Indeed, in 1957, on the cover of Popular Mechanics, a futurist promised flying cars for everyone by 1967.
Wrong, of course. But flying cars seem to be eternal. At this spring's New York Auto Show a company called Terrafugia showed off a $279,000 flying car and said they already had 100 orders. So Popular Mechanics was off by 45 years and a little too optimistic about the price tag. But that’s all too often par for the course on predictions—and I feel quite comfortable in predicting that the same will be true for many years to come.
The news today about Steve Jobs's decision to delay surgery for cancer and instead use alternative therapies fits his view of the world perfectly. Steve was famously one of the ultimate control freaks in technology. And Apple products have always reflected that--they tend to be closed boxes, rather hostile to user interventions. How many companies could, at this late date, still get away with selling a mobile phone in which you're not even supposed to change the battery yourself?
And so it makes sense that when Steve confronted the idea of surgery, it seemed a lot like letting someone else open the box. It was a process outside his control, and so he opted for alternative therapies--diet, acupuncture--that were both external and controllable.
Contrast that to Andy Grove of Intel who, when diagnosed with prostate cancer in the mid-Nineties, went on an intense, scientifically rigorous search for the very best treatment, and even documented the process in an article for Fortune.
Grove took an engineer's approach to his disease. Steve's was more of an artist's approach. And of course, Grove just turned 75 last month and, while now battling Parkinson's disease, remains deeply involved in funding and writing about medical research.