Back to Work!
Create New Opportunities in the Wake of Job Loss
By Stephen P. Adams, Elizabeth Cody Newenhuyse
Moody Publishers. Copyright © 2009 Stephen P. Adams
All rights reserved.
THE JOB SQUEEZE
Layoffs. Restructuring. Reengineering. Buyouts. Downsizing. Rightsizing. Reduction in force. It all adds up to the same thing: disappearing jobs. An employment black hole across America.
In 1996, The New York Times was telling us just how bad things were: "On the battlefields of business, millions of casualties." The big story was a half-million Americans a year being laid off—some permanently—with ripple effects for millions of families.
Fast-forward to 2009, when more than a half-million Americans a month were being laid off early in the year, and this time the story was the Times itself, which economic reality had finally reached. A hundred of its employees were let go, and the remaining workers took a 5 percent pay cut. In February alone, about 1,500 newspaper journalists were looking for another situation.
Even though, as of this writing in late summer 2009, the Great Recession of 2008–2009 appears to be easing, or at least "less bad," the same cannot be said about the overall shape of the labor market. According to some forecasts, the U.S. unemployment rate won't return to "normal" levels until 2014. In July 2009, 14.5 million Americans were out of work. Manufacturing and construction have been particularly hard-hit, but so have white-collar sectors.
So if you're out of a job—you're in good company.
Theories abound on why the current unemployment situation is so dire, even as other indicators, like the stock market, seem to be trending upward. When the financial markets cratered in 2008 and many investors saw serious erosion in their portfolios, consumers stopped spending. The credit crunch affected countless businesses. Large bellwether industries like the automotive Big Three slashed their workforces, particularly GM, as the government stepped in to stabilize the company's shaky finances. The housing crash affected a host of industries, from construction to real estate sales to big-box stores.
Moreover, some jobs may be gone forever.
Mortimer Zuckerman, chairman and editor-in-chief of U.S. News & World Report, wrote that this was the first recession since the Great Depression to entirely "wipe out all job growth from the previous expansion." Add in some other real-life situations—those who have stopped looking for work or have taken part-time jobs, for example—and the real number would be more like 16.5 percent or 25 million people involuntarily idle, he said. But what about when the economy turns around? Zuckerman again:
The likelihood is that when economic activity picks up, employers will first choose to increase hours for existing workers and bring part-time workers back to full time. Many unemployed workers looking for jobs once the recovery begins will discover that jobs as good as the ones they lost are almost impossible to find because many layoffs have been permanent.
Just prior to this major recession, female representation in the workforce was pegged at 49.1 percent, according to Labor Department figures. So, after millions of new job losses, has that number changed? Considering that as many as 80 percent of those pink slips went to men, America is entering the era of the majority-female workforce. Various reasons are given for that disproportionate impact. They tend to boil down to two facts: higher-paid individuals are more vulnerable to layoffs, and the education and health care sectors, where women make up the bulk of the workforce, have been less affected. But the more serious question may be the long-term impact on a society in which the new norm is the stay-at-home dad.
Family policy expert and political consultant Jim Pfaff offers this perspective on the current situation:
WELCOME TO "POST-POST-INDUSTRIAL AMERICA"
This current recession is unlike anything we have seen in three generations. The surreal landscape on the back end of the "collapse" of financial institutions has distorted the way we have traditionally evaluated the job market. Post-Industrial America is quickly becoming Post-Post-Industrial America, where job seekers find the employment landscape in disarray.
In past recessions America had a strong manufacturing base. Companies like General Motors, Chrysler, Ford, and even American Motors were mainstays; workers displaced by layoffs simply returned when things got better. Westinghouse, General Electric, Amana, Maytag, and other electrical and appliance manufacturers went through similar cycles of hiring, laying off, and rehiring. Beginning with the dot-com boom in the 1990s, graduating college students found new avenues of opportunity, and the country emerged as a post-industrial economy where intellectual capital had become as valuable as the latest production technology.
The world of finance exploded, too, with abundant venture capital, mergers and acquisitions, and personal investment capital flowing into an ever-expanding investment market. Wall Street was roaring just like many new dot-com start-ups, offering real opportunity for significant long-term advancement for new university grads.
Then, when the Internet boom went bust, it seemed that all the promise of the boom had been short-lived. But it wasn't. Internet companies reformed and retooled. We learned that the Internet was the business platform of the future, and this was the first shakeout of weak players. Dot-commers either rode out the storm or joined forces to build new online companies and services, building upon the lessons learned. Wall Street was banged up and knocked down, too, but quickly returned for another round with little employee displacement. Once again, a recession didn't seem so bad because everyone just hung on until they could jump back on the same merry-go-round.
The recession that began in December 2007, however, is a totally different animal. This time around, many of the newly unemployed will have no choice but to find a new career. This may mean some will have to go backward in their career to find the way back.
THE FUTURE OF JOBS
Only a generation or two ago, it was common for workers to have one career and not unusual even to have the same job all their working days. Lifetime employment was the norm. In 1996 experts were saying the average person would have about ten jobs in a lifetime and three career changes—and in the future would have six to ten career changes in their lives. By May 2009, Sony's "Did You Know" YouTube clip was winding its way around the e-world with a mind-blowing collection of exponential social and technological trends, including this update on the above figures: "The U.S. Department of Labor estimates that today's learner will have 10–14 jobs ... by the age of 38" (italics added). Other factoids: "1 in 4 workers has been with their current employer for less than a year" and "1 in 2 has been there less than 5 years."
Yet, in the scheme of things, jobs are a relatively recent phenomenon. Work itself is as old as the first patch of weeds in the garden after the fall. William Bridges, author of JobShift: How to Prosper in a World without Jobs, calls jobs a "social artifact" of the industrial era. Unfortunately, those of us who depend on this social artifact for many of our needs are living in what's been called the Post-Industrial Era:
The job concept emerged early in the nineteenth century to package the work that needed doing in the growing factories and bureaucracies of the industrializing nations. Before people had jobs, they worked just as hard but on shifting clusters of tasks, in a variety of locations, on a schedule set by the sun, the weather, and the needs of the day. The modern job was a startling new idea—to many people, an unpleasant and even socially dangerous one.
The Industrial Revolution began powering up in England and western Europe shortly after the American Revolution in the late eighteenth century, driven by such breakthroughs as steam-powered spinning machines and looms. Until then, the workplace had been the village, the field, and the home, where farmers, craftsmen, and families did their work without time clocks, employment contracts, and management consultants. Now the factory with its regimentation began to become the norm.
By the early twentieth century, another principle came to the fore—narrow functional specialization and scientific management, as defined by Frederick Taylor, the father of management gurus. This involved breaking jobs down into a large number of simple tasks the worker would repeat over and over with machinelike efficiency under a command-and-control type of supervision. Ironically, today's management consultants counsel quite different approaches—self-directed work teams, participative management, etc. Different times call for different prescriptions.
By 1914, Henry Ford introduced the assembly line, dividing tasks into narrow functional specialties and ushering in the era of mass production. It was the ideal means of manufacturing goods in great quantity with a large population of low-skilled and uneducated workers, many of them immigrants. Companies could train an individual to perform the same job routine repeatedly without requiring a great deal of independent thought. That was left to the growing cadre of supervisory and support personnel.
Under this approach, the United States became the manufacturing giant of the world—the largest producer and exporter of goods with the largest labor force and the highest wages. This remarkable success gave America the highest standard of living in the world, but the U.S. occupation of the catbird seat was far from permanent. The golden era of dominance, according to historians and economists, lasted approximately thirty years, from the end of World War II to the onset of the global economy and the Technology Revolution—1945 to 1975.
The illusion of permanent prosperity coincided with the advent of the baby boomer generation—76 million babies born from 1946 to 1964. The 1960s, though overshadowed by the threat of nuclear war, began with the glow of a Camelot presidency and the belief that America could "bear any burden, pay any price," in the words of John F. Kennedy. Expectations couldn't have been higher. Until her military might became bogged down in Vietnam, it seemed there was little that was not within America's grasp. President Kennedy vowed that America would put a man on the moon before the end of the decade. And, though Kennedy did not live to see it himself, the promise was fulfilled in 1969, when Neil Armstrong took a giant leap for mankind.
Economically, too, America was a juggernaut. But success begat complacency, and without serious competition from abroad—at first—there was little incentive for U.S. manufacturers to worry about down-the-road problems with cost and quality. It became the age of inflation, as workers and companies pursued an upward spiral that eluded President Nixon's wage and price controls. But around the corner lurked two other major forces—consumerism and the quality movement.
Ralph Nader's career as a consumer advocate and corporation basher was launched with the 1965 publication of Unsafe at Any Speed, an indictment of U.S. automakers for shoddy design and workmanship, as evidenced by General Motors' Corvair. Unwittingly, American manufacturers were setting themselves up as juicy targets once the Japanese and others figured out that they could beat the Yanks at their own game and capture a major portion of the global market with lower-cost, higher-quality goods.
Experts say the wake-up call began in 1973 with the Arab oil embargo, which decisively demonstrated that no nation—not even the U.S.A.—can go it alone and that the price of industrial might is energy dependence. The so-called global economy began to take center stage with the fall of Communism and the unification of Europe under the European Union. Then the Western Hemisphere followed suit with the removal of trade barriers through the North American Free Trade Agreement (NAFTA), whose opponents argued bitterly that it would cost U.S. jobs.
Foreign competition was one of four factors cited by the New York Times in a series published during the recession of the early to mid-nineties. The other three: technological progress that lets machines replace hands and minds, the ease of contracting out work, and payroll cuts to make companies more attractive on Wall Street. To these could be added a generation of baby boomer managers much more willing than the previous generation to trim staff size, and changes in the accounting practices mandated by law. And these factors, among others, are still in play in the current downturn.
THE TECHNOLOGY REVOLUTION
The Technology Revolution began a sharp spike in 1976 with the appearance of the personal computer, which offered the promise of putting mainframe power into the hands of ordinary people. Over the next twenty years, the PC began to redefine the workplace, changing the speed of work and the way thousands of different kinds of tasks are carried out. By the 1980s, computing power was doubling and prices were being cut in half approximately every eighteen months, a pace popularly associated with Moore's Law. By the 1990s, automakers were pointing out that the onboard computers in their cars were more powerful than those aboard the Apollo spacecraft.
But this new technology also was beginning to do something else a little less user-friendly: The long-feared specter of automation finally was taking its toll on jobs through industrial robots, computerized machinery, and microelectronic techniques. Soon even white-collar workers, long immune to such vicissitudes, began to fall prey to the pink slip as their jobs, too, began to disappear.
As long as computers have been around, the greater wonder may be why it's taken so long for this revolution to begin chopping heads. The answer may involve the ways technology is handled or, especially in the early stages, mishandled. Experts have noted that when the electric motor was first introduced, it took a remarkably long time to change things and fulfill its potential of portable power. At first, it was used simply as a drop-in replacement for the giant, smoky steam engine, which had no place on the factory floor; its power was still transmitted to individual machines and workstations through a Rube Goldberg tangle of belts and pulleys running off one central driveshaft from the outside.
Eventually, the industrial engineers figured out that there was a more efficient way of doing things, and the next technological corner was turned. Similarly, it wasn't until recent years that the power of computing reached critical mass in the workplace in a very direct human sense. Harry S. Dent Jr., author of JobShock, said the tendency at first was to use information technology merely to enhance the old ways of the "paper-shuffling bureaucracy" rather than advance real innovation. Hence the older-generation manager whose desktop computer was little more than a high-tech paperweight. But that's all changing with a vengeance, Dent suggested in a section of his book entitled "Computers Are the New Office Workers."
"Sure, we've had computers in offices," he wrote, "but we haven't used them to real advantage in most organizations. The workplace has been filled with an older generation used to working in a hierarchical command and control system. This generation is less willing to take risks than the newer generation moving into positions of power now and in the future: baby boomers. The members of this newer generation have already proved they will take necessary, calculated risks to bring in new, creative ways of conducting business when they have come into power."
And now, of course, many of the jobless are themselves boomers who wonder if they have a place in today's—and tomorrow's—workplaces.
John Naisbitt and Patricia Aburdene, authors of Megatrends, accurately forecast the current situation in their book Re-inventing the Corporation, when they predicted a major flattening of the ranks of middle managers in America. Staff managers who supervise people would give way to small groups, work teams, and other self-management structures, according to Naisbitt and Aburdene, and line managers in charge of systems would be replaced by computers.
Their words, published in 1985, seem eerily prophetic now:
Today, computers are replacing middle managers at a much greater rate than robots are replacing assembly line workers. Once indispensable to senior executives, many middle managers are now watching computers do their job in a fraction of the time and at a percentage of the cost. The whittling away of middle management presents serious problems for all those baby boomers about to enter middle management. The number of men and women between thirty-five and forty-six, the prime age range for entering middle management, will increase 42 percent between 1985 and 1995. Clearly, millions of baby boomers who aimed for middle management will never reach their goal. There simply will not be enough middle management jobs. It is a scary thought for some people.
Excerpted from Back to Work! by Stephen P. Adams, Elizabeth Cody Newenhuyse. Copyright © 2009 Stephen P. Adams. Excerpted by permission of Moody Publishers.
All rights reserved. No part of this excerpt may be reproduced or reprinted without permission in writing from the publisher.