

Against Human Resources

On H.R.'s dominance in the workplace.


There was no such thing as “human resources” before 1958, when the term first appeared in print in an academic paper. The art of keeping one’s workforce in good order used to be called “personnel management” or “industrial relations,” and before about 1920 there was no such thing as that, either. It was not thought to be a separate type of management or something one could specialize in. For most of human history, workers and their bosses had face-to-face relationships. Only when corporations became so large that an owner could no longer learn the names of all of his employees did anyone start to talk about “human resources” in the abstract.

And even then it was hardly inevitable that the systematic science of selecting and managing workers would end up looking like the schoolmarmish, therapeutic, risk-averse paper-pushing that characterizes H.R. departments today. One textbook defines H.R. as “a largely behavioral science approach to the study of nonunion work situations, with particular emphasis on the practice and organization of management.” This is a pithy way of saying that H.R. sees bosses as economic actors and workers as psychological ones. From the beginning, H.R. has been the discipline addressed not so much to workers’ welfare as to their feelings.

As soon as the field of human resources was isolated from the rest of management, extravagant claims started to be made on its behalf. “Our task is nothing less than to rehumanize industry,” one psychologist declared in 1919. Henry Ford II said in 1946 that “solving the problem of human relations in production” could be as big a revolution as the assembly line. More recently, Silicon Valley C.E.O.s have mixed human resources with California-style spiritualism. Tony Hsieh of Zappos called the management system he adopted, Holacracy, “the next stage in the evolution of human consciousness.” His book, Delivering Happiness: A Path to Profits, Passion, and Purpose, spent twenty-seven weeks on the New York Times best seller list.

Zappos employees were not quite as enthusiastic about Holacracy. When the company offered buyouts to anyone who would not commit to the system, nearly twenty percent of employees took the money and quit. In November 2020, Hsieh barricaded himself inside a pool shed in New London, Connecticut, got high on nitrous oxide and marijuana, and burned himself and the shed to the ground. He was forty-six.

Modern human resources originated in a single study, known as the Hawthorne experiment, published in 1933. In a way, the Hawthorne experiment was a bookend to Frederick Winslow Taylor’s more famous pig-iron study at Bethlehem Steel in 1899, the study that launched scientific management, which enjoyed such a vogue among progressives in the early part of the last century. The takeaway from Taylor’s experiment was that there is “one best way” to perform any task, in this case transporting pig-iron from A to B, and an employer merely has to coax workers to conform to the objective standard. The takeaway from the Hawthorne experiment was that there is no “one best way” where human beings are concerned, and the real key to productivity lies in treating workers not as cogs but as individuals.

The Hawthorne experiment revolved around six female workers from a Western Electric factory near Chicago. They were plucked from the assembly line where they worked making telephone parts and put in a special room where experts could isolate various conditions and see their effects on productivity. The trials ran through more than a dozen iterations over several years and experimented with a wide array of motivators: free snacks, longer breaks, shorter hours, increased pay, productivity bonuses, softer lighting. To the experimenters’ delight, each new change led to a perceptible increase in efficiency.

Next the experimenters took away all the perks they had added and reverted to the working conditions from the start of the experiment—and, to their surprise, productivity did not fall back to its old level. The gains held. Then they put all the perks back, and productivity rose higher than ever.

To make sense of this puzzling result, the Harvard professor in charge of the experiment concluded that it was not any particular enticement that motivated the women to work harder. It was the feeling of being special. Over the years of the experiment, the subjects had overcome their initial reticence and become active participants, forming relationships with the experimenters and taking pride in their involvement in something that had the potential to affect how their peers were treated. If bosses wanted to increase productivity, they should strive to replicate that feeling: the sense that management was genuinely interested in workers’ input.

In retrospect, a few problems with this interpretation emerge. The first is that the subjects did not reflect the average American employee. They were young women—mostly teenage girls. (The ages of the six were eighteen, eighteen, nineteen, nineteen, twenty-four, and twenty-eight.) Of course flattery goes far with that demographic. We did not need a Harvard study to tell us that. What motivates a nineteen-year-old girl trying to earn pocket money or a dowry is not the same as what motivates a thirty-something male breadwinner with five dependents.

The idea that the secret to workplace harmony lay in being attuned to workers’ feelings, rather than in material perks like paying them more, was eagerly embraced by employers. Between 1920 and 1940, the share of companies with personnel departments rose to thirty percent. By the end of World War II, it was two-thirds. Women flocking to industrial work during the war contributed to this increase, according to the historian Loren Baritz. “Women workers, especially, troubled managers who had neither the time nor the skill necessary to deal with their problems,” he wrote. “Nondirective therapy and counseling” aimed at “individual personality problems” was not something that employers had found it necessary to provide in the days when their workers were male. It was a service that H.R. stood ready to provide.

The growth in H.R. slowed after World War II—if Baritz is right, partly because of the withdrawal of women from the workplace—and did not resume until the late 1960s. Then it exploded. The number of H.R. workers in America increased tenfold between 1960 and 2000. The only reason the growth has stopped is that now the vast majority of employees in this country are subject to the supervision of human resources to one degree or another. The empire of H.R. has reached the shores of the continent and can expand no further.

The wedge that H.R. used to shove its way into the American workplace was anti-discrimination law. The Civil Rights Act of 1964 was not just a new law but a new type of law, a bigger revolution than anything passed during the New Deal. Before, business regulation had been categorical: you may not hire workers under x years of age or force them to work more than y hours in a day. Violations were easy to determine; they were straightforward questions of fact. With the Civil Rights Act, for the first time in the legal history of the Anglosphere, regulation sought to scrutinize a boss’s motivations. Whether an act was lawful or not depended on whether the reasons behind it were pure.

In retrospect, it is odd that personnel departments were the site of growth in response to anti-discrimination law. If you want to know whether your business is in compliance with the law, typically you ask a lawyer. But as the Harvard sociologist Frank Dobbin explains, lawyers were “unwilling to recommend compliance strategies not yet vetted by judges. Professional norms directed them to advise clients about black letter law and not to speculate wildly about what the courts might or might not approve.” Personnel departments, having no such inhibitions, filled the vacuum.

Another musty old legal tradition upended by the civil rights revolution was the presumption of innocence. In an anti-discrimination lawsuit, if a plaintiff can make a prima facie case that some demographic is underrepresented in a company’s workforce—a matter of bare statistics, which doesn’t necessarily imply any ill will or bias—the burden of proof shifts to the employer, who now has to prove that the disparity has an innocent explanation. An employer’s investment in diversity programs could go a long way in convincing a judge of his good intentions. In this new regulatory paradigm, doing the bare minimum to comply with the law was not enough. Employers were forced to compete with one another to demonstrate their commitment to diversity. This was a license for H.R. departments to let their imaginations run wild.

Take maternity leave. In the early 1970s, paid time off for working mothers was promoted by H.R. departments as a way of protecting companies from legal liability in sex discrimination lawsuits. The connection was not strictly logical. Failing to offer maternity leave was not a form of discrimination, exactly. Maternity leave was simply a tangible demonstration that the company cared about women, which could be useful if that question ever came before a judge. Eventually the Supreme Court pulled the rug out from under even this tenuous logic when it ruled in 1976 that civil rights law did not require paid maternity leave, but by that time most large firms already had it. Naturally, H.R. departments paid no penalty for being wrong about what the law required.

Eventually H.R. departments added new rationales for their existence. Instead of selling themselves as guardians against lawsuits, they began talking about the importance of diversity in a globalized marketplace or the need to attract the best employees in an increasingly diverse America. The sociologist Lauren Edelman pinpoints 1987 as the year when the benefits of diversity overtook protection against lawsuits as the justification for H.R. programs in management periodicals. The majority opinion in the 2003 Supreme Court case Grutter v. Bollinger, written by Sandra Day O’Connor, invoked both the substance and the prose style of H.R. from this era: “Major American businesses have made clear that the skills needed in today’s increasingly global marketplace can only be developed through exposure to widely diverse people, cultures, ideas, and viewpoints.”

None of these purported benefits was ever backed by empirical proof. This has been a chronic problem for H.R. Studies from as far back as the 1960s have shown that anti-bias training does not reduce bias, anti-harassment training does not reduce sexual harassment, grievance procedures don’t avert lawsuits, and diverse workplaces are not more efficient or creative than the average. H.R. departments have confidently repeated each of these claims over the years, and managers have deferred to them in the belief that the experts know what they are talking about. But study after study has shown every one of these H.R. best practices to be snake oil.

The flimsiness of the evidence was ironic, because the way H.R. abolished every rival system for managing workers was by tattling to judges about the lack of evidence behind them. Under civil rights law, any policy that has a disparate impact on minorities is presumptively illegal. A company must prove the policy is relevant to a bona fide business objective—for example, an aptitude test can only be given to prospective hires if an employer goes to the trouble of proving that high scorers perform better on the job. The standards for that proof are high. In the case of aptitude tests, Nathan Glazer estimated in 1975 that validating a test cost forty thousand dollars minimum—a cost that had to be repeated not just for each test but for each business location, since a test validated for a factory in one state was not automatically valid at the same company’s factory in another state where conditions might be different.

Needless to say, H.R. policies were never subjected to this exacting standard of proof; if they had been, they would not have survived. Nevertheless, the demand for empirical proof was a powerful weapon that H.R. used to expand its empire and destroy all its rivals. The casualties included not just I.Q. tests but every kind of hiring program that could not be subsumed into H.R. Many factories used to have family hiring programs, where brothers and cousins by the score ended up working at a single plant. Lawsuits were filed asking whether companies could prove that the advantages of these programs outweighed their (inadvertent) discriminatory impact. Sure, the companies said, everyone knows a man works harder when he knows his performance reflects on the family’s honor. But they didn’t have a study to prove it. Those family hiring programs are all gone now.

Consider the characteristics of H.R. as described so far: rules are ambiguous rather than explicit; conciliation is prized and conflict of any kind abhorred; emotions and subjective perceptions are valued more than objective conditions. The common thread uniting all of these themes is that they are feminine. It is no accident that women dominate H.R. Three-quarters of all H.R. professionals are female, a ratio that has been stable for at least thirty years. The “H.R. lady” is not just a stereotype. She is a statistical reality.

The result has been the feminization of the American workplace, the inevitable effect of giving H.R. ladies veto power over everything that happens there. This feminization has happened even in the most unlikely workplaces. Astrophysics is a predominantly male profession. Yet Dr. Matt Taylor found himself in the middle of an international scandal in 2014 when, during a press conference to announce that his team had become the first in history to land a spacecraft on a comet, he showed up wearing a rockabilly-style shirt with busty pinup girls on it. The shirt was denounced as disrespectful to women. His tearful forced apology was a conspicuous triumph for H.R. ladies everywhere.

There is a masculine alternative to H.R. It is called a union. In any given workplace, H.R. ladies and union reps perform many of the same functions. If you have a conflict that needs adjudicating, want to make sure the company gives you all the vacation days you’re entitled to, or have a complaint about workplace conditions, you go to them. Underneath this functional similarity, however, the two models of workplace relations rest on very different assumptions.

The idea behind unions is that workers and bosses are fundamentally in conflict. They don’t have to hate each other, by any means, but their interests diverge, and the best way for them to reach agreement is to have a fair fight by clearly defined rules. This is the opposite of H.R.’s ethos, which is all about denying that conflict exists and finding win–win solutions—or at least solutions that everyone will pretend are win–win after they have been badgered into accepting the consensus.

Obviously these two models could not coexist. H.R. defeated unionism, or at least outlasted it. It could be a coincidence that H.R. happened to receive a boost from anti-discrimination law at exactly the time when unions were growing weaker for unrelated reasons, such as the decline of manufacturing. But at the time all this was playing out, unions certainly perceived the H.R. model as a direct threat. The labor correspondent of the Financial Times wrote in 1985 that the question among many of his union sources was, “If the law can give workers what they want, will they want us?”

Choosing the H.R. path over the union path had consequences for American workers. It tilted the advantage toward white-collar employees. A study in 2017 found that only twenty percent of workplace discrimination lawsuits were filed by blue-collar workers; the rest were filed by professionals, managers, salesmen, and other office workers. Female supervision also made impossible many of the things that, for men, had made the workday bearable: banter with the lads, obviously, but also competitiveness, which to an H.R. lady looks too much like conflict and carries the risk of hurting the loser’s self-esteem. Unions never would have gone for the “bring your whole self to work” ethos that H.R. has wielded so effectively on behalf of left-wing identity groups.

Above all, the replacement of unions by H.R. departments was a humiliating experience for workers. In 1985, the Ford plant in Dagenham, England, embarked on a “hearts and minds” program intended to woo its British union officials into being less antagonistic. One day a union man walked into the head office and saw a shop steward “with a magazine and a pair of scissors and he was cutting out pictures. ‘What the fuck are you doing?’ the union man asked. It was a confidence-building exercise, the steward answered sheepishly. ‘I am going back to the plant, and I’m going to tell your members what you’re doing,’ the union man said. ‘Pack it up and get a bit of dignity inside you.’”

The purpose of H.R. was to bring order to the anarchy of early industrialism, when two workers doing the same job often received different pay and seniority, and qualifications counted for nothing against the whims of a foreman. Its ultimate effect has been just the opposite. For all its formalities and proceduralism, it has made workplaces more capricious. Instead of clear-cut rules, workplaces are governed by the H.R. manager’s sense of what qualifies as “not a good look.”

Employers go along with it because they are afraid of lawsuits, and they are right to be. Civil rights law makes up a huge chunk of the federal docket, accounting for as much as twenty percent of cases in federal district courts in recent years. Living in fear of such lawsuits is rational: the average settlement is forty thousand dollars, according to the Equal Employment Opportunity Commission.

More than fifty years after the law was passed, employers still can’t get a straight answer on how to avoid violations. The rules are so unclear that it makes sense for workers to roll the dice on a lawsuit and take their chances. The sociologist Ellen Berrey conducted a quantitative study of eighteen hundred workplace discrimination lawsuits for her book Rights on Trial. “During data collection, we initially asked the research assistants who read and coded the case files to try and assess the merits of the cases,” she writes. They dropped that column because the merits were too ambiguous to call. “Some were clear instances of ‘frivolous’ cases and a few seemed to have ‘smoking guns,’ but the vast majority fell in between.”

The uncertainty of the legal landscape suits H.R. departments just fine. It makes them more necessary. Otherwise, the need for “human resources” might be subject to question. If we put the payroll department in charge of vacation days and supervisors in charge of morale, what exactly would be left for “human resources” to manage? Corporations did perfectly well without human resources for a long time. Maybe they could again.

The rise of H.R. coincided precisely with increasing workforce participation by wives and mothers, something the post-war American industrial regime had been built to avoid. Women flooded into America’s offices so quickly and in such great numbers that one wonders in retrospect whether jobs were created to make positions for these keen new recruits, regardless of whether the jobs added much value to the firms. In an era when having too few women among your employees could be illegal in and of itself, the possibility cannot be discounted.

This unprecedented shift in the workforce has taken many women away from their children and prevented others from ever becoming mothers at all. Many of these women became H.R. ladies, where their job was to treat grown men as if they were children. It would have been more efficient all around if the mothering had been left to families and workplaces left to the professionals. Too many women have jobs; too many jobs are fake; these problems overlap. H.R. is at the center of that Venn diagram.

Helen Andrews is editor of the American Conservative and the author of Boomers: The Men and Women Who Promised Freedom and Delivered Disaster.