The Great Grocery Squeeze
The concept of the food desert has been around long enough that it feels almost like a fact of nature. Tens of millions of Americans live in low-income communities with no easy access to fresh groceries, and the consensus is that these places just don’t have what it takes to attract and sustain a supermarket. They’re either too poor or too sparsely populated to generate sufficient spending on groceries, or they can’t overcome a racist pattern of corporate redlining.
But these explanations fail to contend with a key fact: Although poverty and ruralness have been with us forever, food deserts arrived only around the late 1980s. Prior to that, small towns and poor neighborhoods could generally count on having a grocery store, perhaps even several. (The term food desert was coined in 1995 by a task force studying what was then a relatively new phenomenon.)
The high-poverty, majority-Black Deanwood neighborhood of Washington, D.C., is typical of the trend. In the 1960s, the area had more than half a dozen grocery stores, according to a study by the anthropologist Ashanté Reese. These included a branch of the local District Grocery Stores co-op, a Safeway supermarket, and independent Black-owned businesses such as Tip Top Grocery on Sheriff Road. By the 1990s, however, the number of grocery stores in Deanwood had dwindled to just two, and today the neighborhood has none.
A similar story played out across rural America, following the same timeline. Up until the 1980s, almost every small town in North Dakota had a grocery store. Many, in fact, had two or more competing supermarkets. Now nearly half of North Dakota’s rural residents live in a food desert. (The USDA defines a food desert as a low-income census tract where the nearest grocery store is more than 10 miles away in a rural area or more than one mile away in a city.)
A slew of state and federal programs have tried to address food deserts by providing tax breaks and other subsidies to lure supermarkets to underserved communities. These efforts have failed. More food deserts exist now than in 2010, in the depths of the Great Recession. That’s because the proposed solutions misunderstand the origins of the problem.
Food deserts are not an inevitable consequence of poverty or low population density, and they didn’t materialize around the country for no reason. Something happened. That something was a specific federal policy change in the 1980s. It was supposed to reward the biggest retail chains for their efficiency. Instead, it devastated poor and rural communities by pushing out grocery stores and inflating the cost of food. Food deserts will not go away until that mistake is reversed.
The structure of the grocery industry has been a matter of national concern since the rise of large retail chains in the early 20th century. The largest was A&P, which, by the 1930s, was rapidly supplanting local grocery stores and edging toward market dominance. Congressional hearings and a federal investigation found that A&P possessed an advantage that had nothing to do with greater efficiency, better service, or other legitimate ways of competing. Instead, A&P used its sheer size to pressure suppliers into giving it preferential treatment over smaller retailers. Fearful of losing their biggest customer, food manufacturers had no choice but to sell to A&P at substantially lower prices than they charged independent grocers—allowing A&P to further entrench its dominance.
Congress responded in 1936 by passing the Robinson-Patman Act. The law essentially bans price discrimination, making it illegal for suppliers to offer preferential deals and for retailers to demand them. It does, however, allow businesses to pass along legitimate savings. If it truly costs less to sell a product by the truckload rather than by the case, for example, then suppliers can adjust their prices accordingly—just so long as every retailer who buys by the truckload gets the same discount.
For the next four decades, Robinson-Patman was a staple of the Federal Trade Commission’s enforcement agenda. From 1952 to 1964, for example, the agency issued 81 formal complaints to block grocery suppliers from giving large supermarket chains better prices on milk, oatmeal, pasta, cookies, and other items than they offered to smaller grocers. Most of these complaints were resolved when suppliers agreed to eliminate the price discrimination. Occasionally a case went to court.
During the decades when Robinson-Patman was enforced—part of the broader mid-century regime of vigorous antitrust—the grocery sector was highly competitive, with a wide range of stores vying for shoppers and a roughly equal balance of chains and independents. In 1954, the eight largest supermarket chains captured 25 percent of grocery sales. That statistic was virtually identical in 1982, although the specific companies on top had changed. As they had for decades, Americans in the early 1980s did more than half their grocery shopping at independent stores, including both single-location businesses and small, locally owned chains. Local grocers thrived alongside large, publicly traded companies such as Kroger and Safeway.
With discriminatory pricing outlawed, competition shifted onto other, healthier fronts. National chains scrambled to keep up with independents’ innovations, which included the first modern self-service supermarkets, and later, automatic doors, shopping carts, and loyalty programs. Meanwhile, independents worked to match the chains’ efficiency by forming wholesale cooperatives, which allowed them to buy goods in bulk and operate distribution systems on par with those of Kroger and A&P. A 1965 federal study that tracked grocery prices across multiple cities for a year found that large independent grocers were less than 1 percent more expensive than the big chains. The Robinson-Patman Act, in short, appears to have worked as intended throughout the mid-20th century.
Then it was abandoned. In the 1980s, convinced that tough antitrust enforcement was holding back American business, the Reagan administration set about dismantling it. The Robinson-Patman Act remained on the books, but the new regime saw it as an economically illiterate handout to inefficient small businesses. And so the government simply stopped enforcing it.
That move tipped the retail market in favor of the largest chains, which could once again wield their leverage over suppliers, just as A&P had done in the 1930s. Walmart was the first to fully grasp the implications of the new legal terrain. It soon became notorious for aggressively strong-arming suppliers, a strategy that fueled its rapid expansion. By 2001, it had become the nation’s largest grocery retailer. Kroger, Safeway, and other supermarket chains followed suit. They began with a program of “self-consolidation”—centralizing their purchasing, which had previously been handled by regional divisions, to fully exploit their power as major national buyers. Then, in the 1990s, they embarked on a merger spree. In just two years, Safeway acquired Vons and Dominick’s, while Fred Meyer absorbed Ralphs, Smith’s, and Quality Food Centers, before being swallowed by Kroger. The suspension of Robinson-Patman enforcement had created an imperative to scale up.
A massive die-off of independent retailers followed. Squeezed by the big chains, suppliers were forced to offset their losses by raising prices for smaller retailers, creating a “waterbed effect” that amplified the disparity. Price discrimination spread beyond groceries, hobbling bookstores, pharmacies, and many other local businesses. From 1982 to 2017, the market share of independent retailers shrank from 53 percent to 22 percent.
If you were to plot the end of Robinson-Patman enforcement and the subsequent restructuring of the retail industry on a timeline, it would closely parallel the emergence and spread of food deserts. Locally owned retail businesses were once a mainstay of working-class and rural communities. Their inability to obtain fair prices beginning in the 1980s hit these retailers especially hard because their customers could least afford to pay more. Shoppers who could travel to cheaper chain stores in other neighborhoods or towns increasingly did so, draining what business the local stores had left. (Food deserts were not, by the way, a consequence of suburbanization and white flight, as some observers have suggested. By 1970, more Americans already lived in suburbs than in cities. Yet, at that point, low-income neighborhoods had more grocery stores per capita than middle-class areas. The relationship didn’t begin to reverse until the 1980s.)
Why didn’t large chains fill the void when local stores closed? They didn’t need to. In the 1960s, if a chain like Safeway wanted to compete for the grocery dollars spent by Deanwood residents, it had to open a store in the neighborhood. But once the independent stores closed, the chains no longer had to invest in low-income areas. They could count on people to schlep across town to their other locations. Today, in fact, many Deanwood residents travel to a Safeway outside the neighborhood to shop. This particular Safeway has had such persistent issues with expired meat and rotting produce that some locals have taken to calling it the “UnSafeway.” Yet, without alternatives, people keep shopping there.
In rural areas, the same dynamic means that Walmart can capture spending across a wide region by locating its supercenters in larger towns, counting on people in smaller places that no longer have grocery stores to drive long distances to shop for food. An independent grocer that tries to establish itself in a more convenient location will struggle to compete with Walmart on price because suppliers, who can’t risk losing Walmart’s business, will always give the mega-chain a better price. Indeed, during the height of the pandemic, when supply-chain disruptions left grocery manufacturers struggling to meet demand, Walmart announced stiff penalties for suppliers who failed to fulfill 98 percent of its orders. Suppliers complied by shorting independent grocers, who scrambled to keep staple products in stock even as Walmart’s shelves were full.
The problem of food deserts will not be solved without the rediscovery of the Robinson-Patman Act. Leveling the pricing playing field would restore local retailers’ ability to compete. This would provide immediate relief to entrepreneurs who have recently opened grocery stores in food deserts, only to find that their inability to buy on the same terms as Walmart and Dollar General makes survival difficult. With local grocery stores back on the scene in these neighborhoods, chain supermarkets may well return, too, lured by a force far more powerful than tax breaks: competition.
The Biden administration has begun to connect the dots. Alvaro Bedoya, a member of the Federal Trade Commission, has been an outspoken proponent of Robinson-Patman enforcement, and the FTC under Chair Lina Khan is widely expected to file its first such case in the coming months. But Donald Trump’s election casts doubt on the long-term prospects for a Robinson-Patman revival. Although the law has garnered support among some GOP House members, powerful donors are calling for corporate-friendly appointments to the FTC. One can hope that the incoming administration realizes that the rural and working-class voters who propelled Trump to power are among those most affected by food deserts—and by the broader decline in local self-reliance that has swept across small-town America since the 1980s. A powerful tool for reversing that decline is available. Any leader who truly cared about the nation’s left-behind communities would use it.