Americans have long debated the proper role of government in the economy. In 2010 that debate focused on issues including government rescues of big financial institutions and automobile manufacturers in the recent severe recession, mandated expansion of health insurance coverage to more people, tougher financial regulation and offshore oil drilling.
The debate goes back to the nation’s founding. A range of taxes imposed by the British helped trigger the Revolutionary War in 1775. Alexander Hamilton, the United States’ first secretary of the Treasury, succeeded in establishing a national central bank but lost his campaign for a federal policy to promote strategically important industries. The central bank charter was allowed to expire in the 1830s; the United States had no central bank from then until the creation of the Federal Reserve in 1913.
Government intervenes in the economy in at least four ways:
• It provides such goods and services as roads, education, public safety and national defense.
• It transfers income between groups of people, most notably to retirees from younger workers through the Social Security and Medicare programs.
• It collects taxes and borrows money to pay for spending.
• It regulates business activity.
Federal, state and local governments have regulated the economy from the beginning, intervening to advance the interests of specific regions, industries and individuals. Just how far the government should go in doing this has long been a source of debate.
The legal justification for federal economic regulation rests on a few sections of Article I of the U.S. Constitution. These give Congress authority to collect taxes and duties, borrow on the credit of the nation, pay the federal government’s debts, create a U.S. currency and regulate its value, establish laws governing bankruptcy and the naturalization of immigrants, and grant patents and copyrights.
The most general — and controversial — language lies in Article I, Section 8, which authorizes Congress “To regulate Commerce with foreign Nations, and among the several States, and with the Indian Tribes.”
Courts interpreted the Constitution’s “commerce clause” narrowly in the 19th century. Later, with the consent of the courts, the federal government interpreted the clause to justify far-reaching programs that the nation’s Founding Fathers could probably never have imagined. In the 1960s, for example, the courts affirmed civil rights laws against racial discrimination on the basis of Congress’ power to regulate interstate commerce. Beginning in the 1990s, a number of court rulings sought once again to narrow the scope of the commerce clause to issues directly connected with economic activities.
In the life cycle of an American business, the first step is the least regulated of all. An entrepreneur seeking to form a new business need only register with state tax authorities. Those entering specific professions like medicine and law may need a license, typically awarded only after passing a comprehensive examination, but starting a company requires no permission.
No legal business in the United States escapes some regulation. Laws passed by Congress and regulations adopted by administrative agencies so authorized by Congress seek to prevent businesses from exercising monopoly power or operating fraudulently. Financial regulations aim to protect people’s savings and investments from business mismanagement or unscrupulous practices [see the sidebar on financial regulation]. Health and safety regulations are designed to protect the public from unsafe foods, drugs, toys, autos, airlines and other products and services.

Another set of statutes and regulations protects workers’ health and safety on the job, and other legal provisions balance the rights of workers and employers. In most states workers are considered “at will” employees, meaning they can be discharged whenever the employer chooses — except in narrowly defined circumstances. Under federal law, workers may not be fired because of their race, gender, age or sexual preference. A federal “whistleblower” law protects employees who disclose an employer’s illegal activities.
Congress in 1898 gave railroad workers the right to organize labor unions and authorized government mediation of labor-management disputes in that industry. During the Great Depression, Congress passed the National Labor Relations Act of 1935 (commonly known as the Wagner Act), which more specifically set out the rights of most private-sector workers to form labor unions, to bargain with management over wages and working conditions and to strike to press their demands. The Fair Labor Standards Act of 1938 established a national minimum wage, prohibited oppressive child labor and provided for overtime pay in designated occupations.
For more than a century, enforcement of U.S. antitrust laws (also called competition laws) has reflected the evolving debate over government regulation. By the end of the 19th century, concerns about economic power had focused on monopolies that controlled commerce in industries as diverse as oil, steel and tobacco, and whose operations were often cloaked in secrecy because of hidden ownership interests.
The monopolies typically took the form of “trusts,” with shareholders giving control of companies to a board of trustees in return for a share of the profits in the form of dividends. More than 2,000 company mergers were accomplished from 1897 through 1901. In the latter year, Theodore Roosevelt became president and began a “trust-busting” campaign aimed at what he called the “malefactors of great wealth.”
Under Roosevelt and his successor, William Howard Taft, the federal government won antitrust lawsuits that broke up most of the major monopolies, including John D. Rockefeller’s Standard Oil trust; J.P. Morgan’s Northern Securities Company, which dominated railroads in the Northwest; and James B. Duke’s American Tobacco trust.
The government’s main antitrust authority resides in two laws. The Sherman Antitrust Act of 1890 aims to stop conspiracies among companies to fix prices and restrain trade; it also empowers the federal government to break up monopolies into smaller companies to promote competition. The Clayton Act of 1914 defines anti-competitive and unfair practices more specifically and gives government the right to prevent mergers of companies that could undermine competition. Additional federal statutes address specific industries.
In deciding how far government should go to protect competition, courts at first focused on the conduct of dominant companies, not their size and power alone. In 1911, the Supreme Court set down its “rule of reason,” which held that only unreasonable restraints of trade — those that had no clear economic purpose — were illegal under the Sherman Act. A company that gained monopoly power by producing better products or following better strategies should not face antitrust penalties.
During the Great Depression, however, Congress passed the Robinson-Patman Act aimed at maintaining a balance between nationwide manufacturing and retailing businesses on one side and small businesses on the other. The idea that the law should preserve a competitive balance by restraining dominant companies regardless of their conduct was reinforced by court decisions into the 1970s. At the peak of this trend, the federal government was pursuing antitrust cases against IBM Corporation, the largest computer manufacturer at the time, and AT&T Corporation, the national telephone monopoly.
In the 1980s, under President Ronald Reagan, the federal government shifted its competition policies in line with the philosophy of University of Chicago academics, such as Nobel Prize-winning economist Milton Friedman. According to “Chicago School” theory, government antitrust enforcement usually fails to promote competition. Chicago School proponents assert that self-correcting market forces will almost always restore competition.
Each presidential administration interprets antitrust law with varying degrees of aggressiveness. Under President Bill Clinton in the 1990s, for example, the Justice Department prosecuted the Archer Daniels Midland (ADM) company for allegedly conspiring with Asian partners to monopolize the sale of several feed products and additives. Eventually three ADM executives went to prison, and the company paid $100 million in fines.
The Clinton administration also launched a case in 1998 against Microsoft Corporation, which then controlled most of the market for personal computer operating system software. When Microsoft built its Internet Explorer browser into its dominant Windows operating system, antitrust regulators accused the company of leveraging its market power over operating systems to dominate the browser market.
A federal judge ruled against Microsoft, but an appeals court reversed much of that decision. In the appeals court’s view, Microsoft’s offer of its browser software for free, while hurting smaller competitors, nevertheless benefited consumers and allowed the kind of innovation that ultimately promotes economic competition. Under President George W. Bush, the Justice Department settled the case rather than pursue a breakup.
The severe recession that began in late 2007 shattered many people’s belief that markets are self-correcting and need no regulation. President Barack Obama pledged to enforce antitrust law with vigor. His Justice Department prosecuted cases against a number of foreign air freight carriers and against several Asian manufacturers of liquid crystal display panels, collecting more than $1 billion in fines in 2009, the second-highest total for any year.
Rapid globalization has also forced reconsideration of competition law. Fewer U.S. markets remain primarily domestic; more U.S. producers compete against foreign companies that operate under different regulatory regimes. For more than a decade, the Justice Department has been forging cooperative agreements with antitrust authorities in foreign countries. It entered into such an agreement with its Russian counterpart in 2009 and has started engaging relatively new competition authorities in China and India.