When fast-food workers walked off the job in several U.S. cities a few weeks ago, it was like a blast from the past. For one day, at least, American labor was flexing organizing muscles that have largely atrophied over the past few decades. And the prognosis has only been getting worse: Michigan Governor Rick Snyder signed legislation in December limiting the power of unions in a state that once served as a symbol of union might. The fight over Detroit’s financial future is a further sign of things to come: with municipal and state finances still a mess nationwide, even the liberally inclined but nonunion man on the street has begun to turn on his organized labor neighbors as the fight over which matters more—government employee pensions or basic city services—moves from the theoretical into the very real.
But let’s get back to the fast-food furor. The workers have a simple demand: a minimum wage of $15 an hour, roughly equivalent to $30,000 a year. According to the National Employment Law Project, the average front-line fast-food worker (a category that covers most non-management positions, including cashiers, cooks and deliverymen) makes barely more than half that amount, or $8.94 an hour. That’s less than $20,000 a year.
Employers counter that wages of $15 an hour are too high for their business model, while also seeking the moral high ground by claiming that these jobs play a crucial role for millions of Americans as an “entry point” into the work force. By comparison, McDonald’s former CEO, Jim Skinner, received $27.7 million in compensation in 2012. And David Novak, CEO of Yum Brands (the parent of KFC, Taco Bell and Pizza Hut), took home $11.3 million for his role in bringing Doritos taco shells to a salivating public.
While workers’ complaints haven’t focused on the CEO-worker pay gap, they might as well have, given that CEO pay continues to skyrocket even as corporate America has seen the systematic dismantling of the collective bargaining capability of its lower rungs. According to the A.F.L.-C.I.O., the CEO-to-worker pay ratio stood at an unprecedented 354:1 in 2012, versus 281:1 in 2002 and 201:1 in 1992. Since 1982, the average income of the top 1 percent of earners in the United States has grown 125 percent, while that of the 99 percent is up just 10 percent.
When, exactly, did things start to get so out of hand? A question like that usually serves as a rhetorical device for the answer, “It was a gradual change with no precise beginning.” Not this time. The long death of American organized labor began in 1935, when the Wagner Act mandated collective bargaining with unions. When the managers of that era sought advice about how to deal with the rising power of unions, they turned to outside advisers such as McKinsey & Company, the secretive strategy-consulting firm that has had an outsize influence on corporate America for as long as “corporate America” has existed. McKinsey was never so foolish as to get labeled anti-union. But to its clients (corporate executives), it was clear which side of that growing struggle the consultants were on. And they remain there to this day.
Anyone who has worked in the corporate milieu knows that the arrival of McKinsey on the scene tends not to be a sign of good news for the rank and file. What is less known is McKinsey’s role in the creation of the CEO-to-worker gap itself. In 1951, General Motors hired McKinsey consultant Arch Patton to conduct a multi-industry study of executive compensation. The results appeared in Harvard Business Review, with the specific finding that from 1939 to 1950, the pay of hourly employees had more than doubled, while that of “policy level” management had risen only 35 percent. Adjusted for inflation, top management’s spendable income had actually dropped 59 percent during the period, whereas hourly employees had improved their purchasing power.
The “academic” imprimatur of Harvard Business School’s house organ gave the work a certain credibility, and the study suddenly became an annual affair, appearing in HBR for more than a decade thereafter, at which point it moved into McKinsey Quarterly. From 1948 to 1951, HBR had one article a year on executive compensation. A few years later, the review was running five times that number. This was actually a perfect moment for the new “field of study,” because in the post-World War II years, there was a shortage of executive talent, and corporate leaders had begun poaching executives not just from the competition but also from entirely different sectors. And they had to know how much to offer, did they not? Moreover, in the post-Depression years, no one had wanted to talk out loud about compensation. But after the war, they were ready to raise the volume.
But let’s get back to 1952. Juan Trippe, then CEO of Pan American World Airways, caught wind of the research and engaged Mr. Patton to work on a study of stock options for his management team. (Yes, the stock option as compensation kicker goes back that far as well.) Before long, managers everywhere had taken special note of the news that they were underpaid, and demand for Mr. Patton’s blessing on executive pay packages went through the roof. Although longtime McKinsey boss Marvin Bower considered the franchise beneath the firm’s calling as high-order problem-solver, he also knew a cash cow when he saw one. For several years, Mr. Patton personally accounted for almost 10 percent of the firm’s billings. At the end of the war, only 18 percent of companies in the country had bonus plans. By 1960, about 60 percent of them did.
Mr. Patton then did what anyone who crushes it in a new field does: he wrote a few books explaining his wisdom. In 1961, he published two books with McGraw-Hill: Men, Money and Motivation: Executive Compensation as an Instrument of Leadership and What Is an Executive Worth? In short, the man was a singular force in the creation and dissemination of the new philosophy of executive pay.
McKinsey’s influence is such that it tends to have that effect: the firm showed similar sway in the 1990s with The War for Talent (likewise first a management concept and then an eventual book), which was all about paying your best people too much and firing your bottom 10 percent every year. That thinking gave us the entirely ridiculous concept of the chief talent officer—and Enron.
The point is, when McKinsey embraces an idea and wholeheartedly pushes it on its clients, it tends to become widespread. And there’s no denying that this is what happened with Mr. Patton. He published one of his findings in Men, Money & Motivation: a CEO’s compensation tended to increase with the size of a company. Unsurprisingly, CEOs have been gung-ho about M&A ever since. Another finding: in the 1940s and 1950s, the No. 2 man at companies made an average of 55 percent to 60 percent of the CEO’s pay, a “spread” that Mr. Patton considered totally reasonable. Mr. Patton was also involved in studies showing that, over a 10-year period, companies that paid bonuses boosted profits twice as quickly as non-bonus-payers. You can’t make this stuff up, people. Or maybe you can.
An envious competitive set has sometimes accused McKinsey of recycling its work, changing little more than the client’s name in its reports. The same might be said for the “academic” output. “[Mr. Patton wrote] the same article  times for the Harvard Business Review,” one former McKinsey staffer told me while I was writing my book The Firm: The History of McKinsey and Its Secret Influence on American Business, which will be published by Simon & Schuster next month. In total, Mr. Patton wrote more than 60 articles on the subject over the years. And is it any wonder? Considering the topic, the M.B.A. crowd was more than willing to read that particular story again and again. (One of McKinsey’s secrets over the years has been its consultants’ personal relationships with CEOs. It doesn’t take a genius to see the benefit of having a guy like Mr. Patton around.)
Mr. Patton was arguably one of the first management gurus, a guy asked for by name, and after he retired from McKinsey, he was named chairman of a presidential commission on legislative, judicial and executive branch salaries in 1973. Even the government wanted a piece of that action! But so did everyone—then and now. Indeed, once started, the demand to increase and justify executive compensation became a perpetual motion machine that’s still chugging along in 2013.
It’s not as simple as that, of course. During his decades researching executive compensation, Mr. Patton used McKinsey’s trademark rigorous analysis to consider multiple factors in search of the best and most appropriate incentives. And he always insisted that performance appraisal be a crucial part of the compensation exercise, a fact that has clearly been lost on many boards of directors in the ensuing years. At this point, the whole exercise is pretty much preposterous: CEOs’ pay keeps rising no matter their performance, even as workers’ compensation has been frozen in place. Things have only gone from bad to worse, and all that money hasn’t changed a thing.
But invoking the name McKinsey as a rubber stamp for self-serving corner-office decision-making is a long corporate tradition, and for six decades now, those holding the strings of the corporate purse have chosen to simply skip the “rigorous” part in rationalizing ever more absurd levels of CEO pay. In its 1996 obituary, the Times reported Mr. Patton’s chagrin at how managers had abused his survey, in large part by assuming that “all [executives] were above-average performers.” Asked in the 1980s how he felt about the effect of his work, he replied simply: “guilty.” Consider, too, that the CEO-to-worker pay ratio in 1982 was a relatively paltry 42:1. One wonders how he’d feel today.