I | INTRODUCTION |
United States
History: the story of how the republic developed from colonial beginnings in
the 16th century, when the first European explorers arrived, until modern times.
As the nation developed, it expanded westward from small settlements along the
Atlantic Coast, eventually including all the territory between the Atlantic and
Pacific oceans across the middle of the North American continent, as well as two
noncontiguous states and a number of territories. At the same time, the
population and the economy of the United States grew and changed dramatically.
The population diversified as immigrants arrived from every part of the
world. From its beginnings as a remote English colony, the United States has
developed the largest economy in the world. Throughout its history, the United
States has faced struggles, both within the country—between various ethnic,
religious, political, and economic groups—and with other nations. The efforts to
deal with and resolve these struggles have shaped the United States of America
into the 21st century.
This is one of seven major articles that
together provide a comprehensive discussion of the United States of America. For
more information on the United States, please see the other six major articles:
United States (Overview), United States (Geography), United States (People),
United States (Culture), United States (Economy), and United States
(Government).
II | EARLY CULTURAL INTERACTION |
Early American history began in the collision
of European, West African, and Native American peoples in North America.
Europeans “discovered” America by accident, then created empires out of the
conquest of indigenous peoples and the enslavement of Africans. Yet conquest and
enslavement were accompanied by centuries of cultural interaction—interaction
that spelled disaster for Africans and Native Americans and triumph for
Europeans, to be sure, but interaction that transformed all three peoples in the
process.
A | Native America in 1580 |
The lands and human societies that European
explorers called a New World were in fact very old. During the Ice Ages much of
the world’s water was bound up in glaciers. Sea level dropped by hundreds of
feet, creating a land bridge between Alaska and Siberia. Asians walked across to
become the first human inhabitants of the Americas. Precisely when this happened
remains unknown, but most scientists believe it occurred more than 15,000 years ago.
When the last glaciers receded about 10,000 years ago (thus ending this first
great migration to America), ancestors of the Native Americans filled nearly all
of the habitable parts of North and South America. They lived in isolation from
the history—and particularly from the diseases—of what became known as the Old
World. See also First Americans.
The Native Americans who greeted the first
Europeans had become diverse peoples. They spoke between 300 and 350 distinct
languages, and their societies and ways of living varied tremendously. The
Aztecs of Mexico and the Incas of Peru built great empires (see Aztec
Empire; Inca Empire). In what is now the United States, the Mississippians
(see Mound Builders) built cities surrounded by farmland between
present–day St. Louis, Missouri (where their city of Cahokia was larger than
medieval London), and Natchez, Mississippi. The Mississippians’ “Great Sun” king
ruled authoritatively and was carried from place to place by servants, preceded
by flute–players. The Pueblo peoples of the Southwest lived in large towns,
irrigated their dry land with river water, and traded with peoples as far away
as Mexico and California.
In the East, the peoples who eventually
encountered English settlers were varied, but they lived in similar ways. All of
them grew much of their food. Women farmed and gathered food in the woods. Men
hunted, fished, and made war. None of these peoples kept herds of domestic
animals; they relied on abundant wild game for protein. All lived in family
groups, but owed their principal loyalties to a wider network of kin and to
their clans. Some—the Iroquois in upstate New York and the Powhatan confederacy
in Virginia—formed alliances called confederacies for the purposes of keeping
peace among neighbors and making war on outsiders. Even within these
confederacies, however, everyday political organization seldom extended beyond
villages, and village chiefs ruled their independent–minded people by
consent.
B | West Africa in 1580 |
In Central and West Africa, the great inland
kingdoms of Mali and Ghana were influenced (and largely converted) by Islam, and
these kingdoms had traded with the Muslim world for hundreds of years. From the
beginning, slaves were among the articles of trade. These earliest enslaved
Africans were criminals, war captives, and people sold by their relatives to
settle debts. New World demand increased the slave trade and changed it. Some of
the coastal kingdoms of present–day Togo and Benin entered the trade as
middlemen. They conducted raids into the interior and sold their captives to
European slavers. Nearly all of the Africans enslaved and brought to America by
this trade were natives of the western coastal rain forests and the inland
forests of the Congo and Central Africa.
About half of all Africans who were
captured, enslaved, and sent to the Americas were Bantu–speaking peoples. Others
were from smaller ethnic and language groups. Most had been farmers in their
homeland. The men hunted, fished, and tended animals, while women and men worked
the fields cooperatively and in large groups. They lived in kin–based villages
that were parts of small kingdoms. They practiced polygyny (men often had
several wives, each of whom maintained a separate household), and their
societies tended to give very specific spiritual duties to women and men.
Adolescent girls and boys were inducted into secret societies in which they
learned the sacred and separate duties of women and men. These secret societies
provided supernatural help from the spirits that governed tasks such as hunting,
farming, fertility, and childbirth. Although formal political leaders were all
men, older, privileged women exercised great power over other women. Thus
enslaved African peoples in the New World came from societies in which women
raised children and governed one another, and where men and women were more
nearly equal than in America or Europe.
C | European Exploration |
In the century before Columbus sailed to
America, Western Europeans were unlikely candidates for worldwide exploration.
The Chinese possessed the wealth and the seafaring skills that would have
enabled them to explore, but they had little interest in the world outside of
China. The Arabs and other Islamic peoples also possessed wealth and skills, but
they expanded into adjacent territories rather than across uncharted
oceans. The Ottoman Turks captured Constantinople in 1453 and by the 1520s had
nearly reached Vienna. These conquests gave them control over the overland trade
routes to Asia as well as the sea route through the Persian Gulf. The conquests
also gave them an expanding empire to occupy their attention.
Western Europeans, on the other hand, were
developing the necessary wealth and technology and a compelling need to explore.
A group of new monarchs was creating nation-states in Britain and in continental
Europe—states with unprecedentedly large treasuries and military establishments.
The population of Western European nations was growing, providing a tax base and
a labor force for new classes of large landholders. These “elites” provided
markets for goods that were available only through trade with Asia. When the
expansion of Islam gave control of eastern trade routes to Islamic middlemen,
Western Europeans had strong incentives to find other ways to get to Asia.
They were also developing sailing
technology and knowledge of currents and winds to travel long distances on the
open sea. The Portuguese led the way. They copied and improved upon the designs
of Arab sailing ships and learned to mount cannons on those ships. In the 15th
century they began exploring the west coast of Africa—bypassing Arab merchants
to trade directly for African gold and slaves. They also colonized the Madeira
Islands, the Azores, and the Cape Verde Islands and turned them into the first
European slave plantations.
The European explorers were all looking for
an ocean route to Asia. Christopher Columbus sailed for the monarchs of Spain in
1492. He followed the familiar prevailing winds to the Canary Islands, off the
northwest coast of Africa, and then sailed on. In about two months he landed in
the Caribbean on an island in the Bahamas, thinking he had reached the East
Indies. Columbus made three more voyages. He died in 1506, still believing that
he had discovered a water route to Asia.
The Spanish investigated further. Italian
navigator Amerigo Vespucci sailed to the northern coast of South America in 1499
and pronounced the land a new continent. European mapmakers named it America in
his honor. Spanish explorer Vasco Núñez de Balboa crossed the Isthmus of Panama
and in 1513 became the first of the European explorers of America to see the
Pacific Ocean. That same year another Spaniard, Juan Ponce de León, explored the
Bahamas and Florida in search of the fountain of youth.
The first European voyages to the northern
coast of America were old and forgotten: The Norsemen (Scandinavian Vikings)
sailed from Greenland and stayed in Newfoundland for a time around 1000. Some
scholars argue that European fishermen had discovered the fishing waters off
eastern Canada by 1480. But the first recorded voyage was made by John Cabot, an
Italian navigator in the service of England, who sailed from England to
Newfoundland in 1497. Sailing for France, Giovanni da Verrazzano explored nearly
the whole Atlantic coast of the present United States in 1524, and Jacques
Cartier entered the Gulf of St. Lawrence in 1534. By that time, Europeans had
scouted the American coast from Newfoundland
to Brazil. While they continued to look for shortcuts to Asia, Europeans began
to think of America for its own sake. Spain again led the way: Hernán Cortés
invaded Mexico in 1519, and Francisco Pizarro did the same in Peru in 1532.
D | Cultural Interaction: The Columbian Exchange |
What was to become American history began
in a biological and cultural collision of Europeans, Native Americans, and
Africans. Europeans initiated this contact and often dictated its terms. For
Native Americans and Africans, American history began in disaster.
Native Americans suffered heavily because
of their isolation from the rest of the world. Europe, Africa, and Asia had been
trading knowledge and technologies for centuries. Societies on all three
continents had learned to use iron and kept herds of domestic animals. Europeans
had acquired gunpowder, paper, and navigational equipment from the Chinese.
Native Americans, on the other hand, had none of these. They were often helpless
against European conquerors who had horses, firearms, and steel armor and
weapons.
The most disastrous consequence of the
long-term isolation of the Americas was biological. Asians, Africans, and
Europeans had been exposed to one another’s diseases for millennia; by 1500 they
had developed an Old World immune system that partially protected them from most
diseases. On average, Native Americans were bigger and healthier than the
Europeans who first encountered them. But they were helpless against European
and African diseases. Smallpox was the biggest killer, but illnesses such as
measles and influenza also killed millions of people. The indigenous population
of Mexico, for example, was more than 17 million when Cortés landed in 1519. By
1630 it had dropped to 750,000, largely as a result of disease. Scholars
estimate that on average the population of a Native American people dropped 90
percent in the first century of contact. The worst wave of epidemics in human
history cleared the way for European conquest. See also United States
(People): Disease and Death in Early America.
Europeans used the new lands as sources of
precious metals and plantation agriculture. Both were complex operations that
required labor in large, closely supervised groups. Attempts to enslave
indigenous peoples failed, and attempts to force them into other forms of bound
labor fared only slightly better, collapsing as workers died of
disease. Europeans turned to the African slave trade as a source of labor for
the Americas. During the colonial periods of North and South America and the
Caribbean, far more Africans than Europeans came to the New World. The slave
trade brought wealth to some Europeans and some Africans, but the growth of the
slave trade disrupted African political systems, turned slave raiding into
full–scale war, and robbed many African societies of their young men. The
European success story in the Americas was achieved at horrendous expense for
the millions of Native Americans who died and for the millions of Africans who
were enslaved.
III | COLONIAL EXPERIMENTS |
Beginning in 1519, Spain, Portugal, France,
the Netherlands, and England established colonies in the Americas. Spain built a
great mining and agricultural empire in Mexico, South America, and the
Caribbean. Portugal created a slave-based agricultural colony in Brazil. In
North America the French and Dutch established rudimentary European societies
and, more importantly, elaborate, long-term trading networks with the indigenous
peoples. Among the European invaders of North America, only the English
established colonies of agricultural settlers, whose interest in Native
Americans was less about trade than about the acquisition of land. That fact
would have huge implications in the long struggle for control of North
America.
A | New Spain |
Spain was the first European nation to
colonize America. Cortés invaded Mexico and (with the help of smallpox and of
Native American allies) defeated the Aztec Empire between 1519 and 1521. By 1533
Pizarro had conquered the Incas of Peru. Both civilizations possessed artifacts
made of precious metals, and the Spanish searched for rumored piles of gold and
silver. They sent expeditions under Hernando de Soto, Francisco Vásquez de
Coronado, and Álvar Núñez Cabeza de Vaca as far north as what is now Kansas and
Colorado. They were looking for cities made of gold and did not find them. But
in 1545 they did discover silver at Potosí, in what is now Bolivia, and in
Mexico around the same time. New World gold and silver mines were the base of
Spanish wealth and power for the next hundred years.
Shortly after the conquests, Catholic
missionaries—Jesuits until 1571, Franciscans and Dominicans after that—attempted
to convert Native Americans to Christianity. They established missions not only
at the centers of the new empire but also in New Mexico and Florida. Spanish
Jesuits even built a short–lived mission outpost in Virginia.
After defeating indigenous peoples,
Spanish conquerors established a system of forced labor called
encomienda. However, Spanish governmental and religious officials
disliked the brutality of this system. As time passed, Spanish settlers claimed
land rather than labor, establishing large estates called haciendas. By
the time French, Dutch, Swedish, and English colonists began arriving in the New
World in the early 17th century, the Spanish colonies in New Spain (Mexico), New
Granada (Colombia), and the Caribbean were nearly 100 years old. The colonies
were a source of power for Spain, and a source of jealousy from other European
nations.
B | New France |
By the 1530s French explorers had scouted
the coast of America from Newfoundland to the Carolinas. Samuel de Champlain
built the foundations of what would become French Canada (New France). From 1604
to 1606 he established a settlement at Acadia in Nova Scotia, and in 1608 he
traveled up the St. Lawrence River, made contact with the Huron and Algonquin
peoples, and established a French settlement at Québec.
From the beginning, New France
concentrated on two activities: fur trade and Catholic missions. Missionaries
and traders were often at odds, but both knew that the success of New France
depended upon friendly relations with the native peoples. While Jesuits
converted thousands of Native Americans, French traders roamed the forests. Both
were among the first white explorers of the interior of North America, and
France’s ties with Native Americans would have important implications for the
next 150 years. By 1700 the French population of New France was 14,000. French
Canada was a strategically crucial brake on English settlement. But the much
smaller sugar islands in the Caribbean—Saint-Domingue (Haiti), Guadeloupe, and
Martinique—were economically far more valuable to France.
C | Dutch Settlements |
Another contender for influence in North
America was the Dutch, inhabitants of the leading commercial nation in the early
17th century. Sailing for the Dutch in 1609, Henry Hudson explored the river
that now bears his name. The Dutch established a string of agricultural
settlements between New Amsterdam (New York City) and Fort Orange (Albany, New
York) after 1614. They became the chief European traders with the Iroquois,
supplying them with firearms, blankets, metal tools, and other European trade
goods in exchange for furs. The Iroquois used those goods to nearly destroy the
Huron and to push the Algonquins into Illinois and Michigan. As a result, the
Iroquois gained control of the Native American side of the fur trade.
The Dutch settlements, known as New
Netherland, grew slowly at first and became more urban as trade with the
indigenous peoples outdistanced agriculture as a source of income. The colony
was prosperous and tolerated different religions. As a result, it attracted a
steady and diverse stream of European immigrants. In the 1640s the 450
inhabitants of New Amsterdam spoke 18 different languages. The colony had grown
to a European population of 6,000 (double that of New France) on the eve of its
takeover by England in 1664.
D | First English Settlements |
The Spanish, French, and Dutch wanted to
find precious metals in the Americas, to trade with the indigenous peoples, and
to convert them to Christianity. Their agricultural colonies in the Caribbean,
Mexico, and South America were worked by African slaves and by unwilling native
peoples, and relatively few Europeans settled permanently in those places. In
contrast, England, a latecomer to New World colonization, sent more people to
the Americas than other European nations—about 400,000 in the 17th century—and
established more permanent agricultural colonies.
English migrants came to America for two
main reasons. The first reason was tied to the English Reformation. King Henry
VIII broke with the Catholic Church in the 1530s. Through a series of political
and religious twists and turns, the new Church of England developed a Protestant
theology, but it retained much of Catholic liturgy and ritual forms. Within the
Church of England, radical Protestants, later called Puritans, wanted to
suppress the remaining Catholic forms. The fortunes of the Puritans depended on
the religious preferences of English monarchs. Queen Mary I, who ruled from 1553
to 1558, was a committed Catholic who tried to roll back the tide of religious
change; she executed hundreds of Protestants and chased many more into exile.
Her successor, Elizabeth I, invited the exiles back and tried to resolve
differences within the English church. The Stuart kings who followed her, James
I and Charles I, again persecuted Puritans. As a result, Puritans became willing
to immigrate to America.
The second reason for English
colonization was that land in England had become scarce. The population of
England doubled from 1530 to 1680. In the same years, many of England’s largest
landholders evicted their tenants, fenced the land, and raised sheep
for the expanding wool trade. The result was a growing number of young, poor,
underemployed, and often desperate English men and women. It was from their
ranks that colonizers recruited most of the English population of the mainland
colonies.
IV | GROWTH OF THE ENGLISH COLONIES |
Permanent English settlement began in the
Chesapeake Bay area in 1607 and in Massachusetts in 1620. The histories of the
two regions during their first century and a half are almost opposite. Virginia
began as a misguided business venture and as a disorderly society of young men.
Massachusetts settlers were Puritans. They arrived as whole families and
sometimes as whole congregations, and they lived by laws derived from the Old
Testament. Over time, however, Virginia was transformed into a slave-based
tobacco colony where slaves were carefully disciplined, where most white
families owned land, and where a wealthy and stable planter-slaveholder class
provided much of the leadership of revolutionary and early national America. New
England, on the other hand, evolved into a more secularized and increasingly
overpopulated society based on family farms and inherited land—land that was
becoming scarce to the point that increasing numbers of whites were slipping
into poverty.
A | The Chesapeake |
A1 | Virginia |
Jamestown, the first permanent English
settlement in America, began as a business venture that failed. The Virginia
Company of London, a joint stock company organized much like a modern
corporation, sent 104 colonists to Chesapeake Bay in 1607. The company wanted to
repeat the successes of the Spanish: The colonists were to look for gold and
silver, for a passage to Asia, and for other discoveries that would quickly
reward investors. If the work was heavy, the colonists were to force indigenous
peoples to help them. The composition of the group sent to Jamestown reflected
the company’s expectations for life in the colony. Colonists included
silversmiths, goldsmiths, even a perfumer, and far too many gentlemen who were
unprepared for rugged colonial life.
The colonists found a defensible spot on
low ground and named it Jamestown. None of their plans worked out, and the
settlers began to die of dysentery and typhoid fever. At the end of the first
year, only about one-third remained alive. The Native Americans were
troublesome, too. Organized into the large and powerful Powhatan confederacy,
they grew tired of demands for food and launched a war against the settlers that
continued intermittently from 1609 to 1614.
In 1619 the Virginia Company
reorganized. The colony gave up the search for quick profits and turned to
growing tobacco. Under the new plan, colonists received 50 acres from the
company for paying a person’s passage to Virginia. The new settlers were
indentured servants who agreed to work off the price of their passage. Thus
settlers who could afford it received land and labor at the same time. In 1624
King James I of England made Virginia the first royal colony. He revoked the
Virginia Company’s charter and appointed a royal governor and council, while the
House of Burgesses, the assembly the settlers had elected since 1619, continued
to meet. Despite fights with
the Powhatan confederacy (about 350 settlers died in one attack in 1622), the
Virginia colony began to prosper. It had found a cash crop, a source of labor,
and a stable government.
A2 | Maryland |
In 1634 Cecilius Calvert, 2nd Baron
Baltimore, founded Maryland under a royal charter, which made the colony
Baltimore’s personal property. Baltimore, a Catholic nobleman, hoped to
establish a refuge for English Catholics and sell large estates to individuals
who would operate as feudal lords.
Neither the plans for feudalism nor for
a Catholic refuge worked out, however. More Protestants than Catholics
immigrated to Maryland. In 1649 Baltimore granted religious toleration to all
Christians, but Protestants did not stop opposing him. They even overthrew
Baltimore’s government on several occasions. Baltimore’s dreams of feudalism
failed as well. Freed servants preferred farming on their own to staying on as
tenants, and the colony quickly evolved as Virginia had: Planters (many of them
former servants) imported servants from England and grew tobacco.
A3 | Mortality Rate |
Chesapeake tobacco growers needed
able–bodied servants. Most of those imported to Virginia and Maryland were
young, poor, single men. Disease, bad water, and hostile native peoples produced
a horrific death rate. In 1618 there were 700 English settlers in Virginia. The
reorganized Virginia Company sent 3,000 more before 1622. A headcount that year
found only about 1,200 still alive. Still, surviving planters continued to
import servants. Some servants lived long enough to end their indentures, but
many others died. In addition, there were too few women in the Chesapeake to
enable surviving men to build families and produce new Virginians. More than
two-thirds of men never married, and the white population of Virginia did not
begin to sustain itself until at least the 1680s. Before that, the colony
survived only by importing new people to replace those who died.
A4 | Introduction of Slavery |
White servants worked Chesapeake tobacco
farms until the late 17th century. But earlier in the century, English tobacco
and sugar planters in the Caribbean had adopted African slavery, long the chief
labor system in Portuguese and Spanish sugar colonies in the Caribbean. By 1700
the English islands were characterized by large plantations and by populations
that were overwhelmingly African. These African slaves were victims of a
particularly brutal and unhealthy plantation system that killed most of them. It
was not a coincidence that these islands produced more wealth for England than
its other colonies. See also Slavery in the United
States: Introduction of Slavery.
Before the 1680s, Chesapeake planters
purchased few African slaves, and the status of Africans in Virginia and
Maryland was unclear. Some were slaves, some were servants, some were free, and
no legal code defined their standing. The reasons for the slow growth of slavery
in the Chesapeake were not moral but economic. First, slave traders received
high prices for slaves in the Caribbean—higher than Virginians could afford,
particularly when these expensive laborers were likely to die. White indentured
servants cost less, and planters lost little when they died. But Chesapeake
colonists—both English and African—grew healthier as they became “seasoned” on
their new continent. At the same time, the English economic crisis that had
supplied servants to the colonies diminished. These changes made African slaves
a better long–term investment: The initial cost was higher, but the slaves lived
and reproduced.
Beginning around 1675, Virginia and
Maryland began importing large numbers of African slaves. By 1690 black slaves
outnumbered white servants in those colonies. Virginia now gave white servants
who survived their indentures 50 acres of land, thus making them a part of the
white landholding class. At the same time, the House of Burgesses drew up legal
codes that assumed a lifetime of bondage for blacks. In the early 18th century,
the Chesapeake emerged as a society of planters and small farmers who grew
tobacco with the labor of African slaves. There had been slaves in Virginia
since 1619. But it was not until nearly 100 years later that Virginia became a
slave society.
B | The Beginnings of New England |
New England began as a refuge for
religious radicals. The first English settlers were the Pilgrims. They were
Separatists—Protestants who, unlike the Puritans, seceded from the Church of
England rather than trying to reform it. They sailed for the New World in 1620.
After difficult early years, they established a community of farms at Plymouth
that was ultimately absorbed by the Massachusetts Bay Company.
B1 | Religion in the New England Colonies |
A much larger Puritan migration began in
1630. The Puritans objected to the corruption and extravagance of the Stuart
kings, who considered alliances with Catholic monarchs and paid no attention to
Puritan demands for religious reform. The Puritans came to believe that God
would destroy England for these sins. They obtained a charter from the
Massachusetts Bay Company and made plans to emigrate—not to hide in the
wilderness from God’s wrath, but to preserve Protestant beliefs and to act as a
beacon of truth for the world. A thousand Puritans migrated to Massachusetts in
1630. But this Great Migration ended in 1642, when the Puritans became involved
in a civil war against the Stuart kings. The Puritans eventually won and ruled
England until 1660. When the migration ended, Massachusetts had 13,000 European
inhabitants.
The Puritans left England because of
religious persecution, but they, too, were intolerant. In Massachusetts they
established laws derived from the Bible, and they punished or expelled those who
did not share their beliefs. The Puritans established a governor and a general
court (an assembly elected by adult male church members) and governed
themselves. Although they refused to secede from the Church of England, they did
away with bishops and church hierarchy and invented congregationalism. In this
type of Protestantism, each congregation selected its own minister and governed
its own religious life (although outside authority sometimes intervened to
punish heresy).
Government officials were expected to
enforce godly authority, which often meant punishing religious heresy. Roger
Williams was a Separatist who refused to worship with anyone who, like nearly
all Puritans, remained part of the Church of England. Massachusetts banished
him, and he and a few followers founded Providence in what is now Rhode Island.
Anne Hutchinson was a merchant’s wife and a devout Puritan, but she claimed that
she received messages directly from God and was beyond earthly authority. This
heresy, a belief contrary to church teachings, was known as Antinomianism. She,
too, was banished, and she moved to Rhode Island. Puritan
magistrates continued to enforce religious laws: In the 1650s they persecuted
Quakers, and in the 1690s they executed people accused of witchcraft.
B2 | Growth of New England’s Population |
Once the Puritan migration to New
England stopped in 1642, the region would receive few immigrants for the next
200 years. Yet the population grew dramatically, to nearly 120,000 in 1700. Two
reasons explain this. First, in sharp contrast to the unhealthy Chesapeake,
Massachusetts streams provided relatively safe drinking water, and New England’s
cold winters kept dangerous microbes to a minimum. Thus disease and early death
were not the problems that they were farther south. Second (again in contrast to
the Chesapeake) the Puritans migrated in families, and there were about two
women for every three men, even in the early years. Nearly all colonists married
(typically in their mid–20s for men and early 20s for women), and then produced
children at two-year intervals. With both a higher birth rate and a longer life
expectancy than in England, the Puritan population grew rapidly almost from the
beginning.
C | The Restoration Colonies |
By 1640 England had founded six of the 13
colonies that would become the original United States. In 1660, after the end of
Puritan rule, Charles II was crowned king of England, an event known as the
Restoration. Charles founded or took over six more colonies: New York (taken
from the Dutch in 1664), New Jersey, Pennsylvania (including what became
Delaware), and North and South Carolina. All were proprietary colonies—huge land
grants to individuals or small groups who had been loyal to the king during the
civil war.
These colonies shared other similarities
as well. None of them was well–funded; they could ill afford to import colonists
from overseas. Thus they tried to attract settlers from other colonies as much
as from the Old World. These colonies made it easy to own land, and they tended
to grant religious toleration to all Christians. The result (even though
Pennsylvania began as a Quaker colony under the wealthy proprietor William Penn)
was a more ethnically mixed and religiously pluralistic European population than
had come to New England or to the Chesapeake. These new colonies were populated
not only by the English, but also by the Dutch and eventually by Scots,
Scots–Irish, and Germans. Their populations included Quakers and other religious
dissenters.
D | Settlers and Native Americans |
The French and Spanish came to the New
World to trade with the indigenous peoples, to convert them to Christianity, and
sometimes to turn them into a labor force for mining and agriculture. In
contrast, the English settlers wanted farmland. Thus they posed a far greater
threat to the Native Americans. Wars were the result. In New England a Wampanoag
chief named Metacomet (the English called him King Philip) became worried about
English intrusion on his land and ordered attacks on the settlements in 1675.
For the next year Metacomet and his allies destroyed 12 of 90 Puritan towns and
attacked 40 others, capturing or killing one in ten adult male English settlers.
The Puritans counterattacked in the summer of 1676. They killed Metacomet, sold
his wife and chief supporters into slavery in the West Indies, and scattered his
coalition. With that, the power of coastal Native Americans in New England was
broken.
In the same years (1675 to 1676) in
Virginia, land–hungry settlers led by a planter named Nathaniel Bacon picked a
fight with the Susquehannock people. The settlers’ goal was simply to end Native
American occupation of lands that whites wanted. When Governor William Berkeley
objected, the rebellious settlers forced the House of Burgesses to back their
war (see Bacon’s Rebellion). Later, they marched on Jamestown and burned
the colonial capital. Shortly after that, Bacon died of disease, and his
rebellion sputtered out. But a new treaty signed with the Native Americans in
1677 made much of their land available to white settlers.
E | The English and their Empire |
The English had colonies before they had a
colonial policy or an empire. The English government had little interest in
directly governing its colonies. The government was, however, mercantilist: It
wanted colonial economic activity to serve England. The Navigation Act of 1651
stipulated that imports into English harbors and colonies could be carried only
in English ships or those of the producing country. A second Navigation Act in
1660 decreed that colonial trade could be carried only in English ships and that
crucial commodities such as tobacco and sugar could be sent only to England or
another English colony. Further Navigation Acts in 1663 and 1696 regulated the
shipment of goods into the colonies and strengthened the customs service. For
the most part, the Navigation Acts succeeded in making colonial trade serve
England. They also made the colonists accustomed to and dependent upon imported
English goods. But the acts did not amount to a colonial administration. Private
companies, wealthy proprietors, and the settlers themselves did what they wanted
without official English interference.
The crown tried to change that. In
1684 it revoked the charter of the Massachusetts Bay Company. Then in 1686 the
new king, James II, created the Dominion of New England from the colonies of Massachusetts, New
Hampshire, Rhode Island, Plymouth, and Connecticut (all colonies that had been
derived from the original Massachusetts Bay colony), along with New York and New
Jersey. The king sent Sir Edmund Andros to be royal governor of this huge area.
However, the king had problems at home. He was a Catholic, and he threatened to
leave the throne in the hands of his Catholic son. In 1688 England’s ruling
elites deposed James II and replaced him with his daughter Mary and her husband,
a militant Dutch Protestant, William of Orange. As part of the agreement that
made him king, William accepted the English Bill of Rights, which ended absolutist
royal government in England. The accession of William and Mary is known in
English history as the Glorious Revolution.
American colonists staged smaller versions
of the Glorious Revolution. Massachusetts and New York revolted against the
Dominion of New England. At the same time, the Protestant majority in Maryland
revolted against Charles Calvert, 3rd Baron Baltimore, and his Catholic elite.
William could have punished all these rebels and re–established the Dominion of
New England. Instead, he reorganized Massachusetts, New York, and Maryland as
royal colonies with elected legislative assemblies and royally appointed
governors. By 1720 the crown had transformed most of the mainland colonies along
these lines; Pennsylvania, Delaware, and Maryland (restored to its proprietors
in 1715) remained proprietary, while Connecticut and Rhode Island kept their
charter governments. The Glorious
Revolution ended absolutism in England, and it ensured that government in the
mainland colonies would be both royal and representative.
F | Colonial Society |
The colonies over which the English were
beginning to exercise control were growing rapidly. In 1700 approximately
250,000 Europeans and Africans were living in what would become the United
States. In 1775 there were approximately 2.5 million. Much of the increase was
due to immigration: the forced migration of enslaved Africans, and the willing
migration of English, Scots-Irish, and Germans. See also United States
(People): European and African Immigration in the Colonies.
The middle colonies were much more diverse
than the northern colonies. The English majority contended with a variety of
European settlers, with a large Native American presence on the western edges,
and with a significant minority of African slaves. In Maryland and Virginia, the
early English settlers had been joined, particularly in the western counties, by
Scots, Scots–Irish, and Germans. In the eastern counties, African slaves—many of
them natives of Africa—often outnumbered whites.
South Carolina and Georgia had white
populations as diverse as those in the Chesapeake, and their slave populations
were African–born and ethnically diverse. One historian has noted that a slave
would have met more different kinds of Africans in one day in South Carolina
rice fields than in a lifetime in Africa.
By far the greatest source of population
growth, however, was a phenomenal birth rate and a relatively low death rate.
Americans in the 18th century had many children, who in turn survived to have
children of their own. American population growth in these years may have been
unprecedented in human history. See also United States (People):
Birthrates in Native America and Colonial America.
The household was the central institution
of colonial society. In Puritan society in particular, families were the
cornerstone of godly government. As one historian put it, Puritans experienced
authority as a hierarchy of strong fathers—beginning with God, descending
through government officials and ministers, and ending with the fathers of
families. These families were patriarchal: Fathers ruled households, made family
decisions, organized household labor, and were the representatives of God’s
authority within the family. Fathers passed that authority on to their sons.
Puritan magistrates inspected families to ensure that they were orderly, and it
was a capital crime (at least in the law books) to commit adultery or to strike
one’s father.
Households in other 18th–century colonies
may have been less godly, but they were almost equally dominated by fathers, and
most white men had the opportunity to become patriarchs. Land was relatively
abundant, and Americans seldom practiced primogeniture (which gave the oldest
son his father’s full estate) or entail (which prevented men from dividing
their land). Fathers tended to supply all of their sons with land (daughters received
personal property as a dowry). Thus most American white men eventually owned
their own land and headed their own households.
As populations grew and as colonial
economies developed, however, that independence based on property ownership was
endangered. Good farmland in the south came to be dominated by a class of
planters, while growing numbers of poor whites became tenants. The pressure of a
growing population on the supply of farmland made tenancy even more common in
New Jersey and Pennsylvania (research puts the proportion at about 25 percent by
mid-century), while in New England more and more fathers found themselves unable
to provide for their sons. On the eve of the American Revolution (1775-1783),
American white men prided themselves on a widespread liberty that was based on
economic independence. Meanwhile, the land ownership that upheld that
independence was being undermined.
G | 18th-Century Slavery |
In the first half of the 18th century, the
mainland colonies grew dramatically but in very different ways. The Chesapeake
and the Carolinas grew plantation staples for world markets—tobacco in the
Chesapeake and North Carolina, rice and indigo in the coastal regions of South
Carolina and Georgia—and they were committed to African slave labor. Fully 70
percent of South Carolina’s population was black, nearly all of them imported
directly from Africa during the 18th century. The numbers were so huge
and the malarial wetlands they worked on were so unhealthy that masters
encouraged slaves to organize their own labor and to work unsupervised. Because
so many slaves lived and worked relatively unsupervised in this area, African
cultures—language, handicrafts, religious experience and belief, and
more—survived most fully among American slaves in South Carolina. Rice planters
of South Carolina permitted this cultural independence because it was easier and
because the slaves made them lots of money. South Carolina’s lowland planters
were the wealthiest group in the mainland colonies.
Farther north, the tobacco colonies of
Virginia and Maryland were equally committed to slave labor, but slaves led
somewhat different lives there than in the Deep South. The African population in
these colonies began to replace itself through reproduction as early as 1720
(compared with 1770 in South Carolina). Still, Chesapeake planters continued to
import new slaves from Africa; about 70,000 went to Virginia in the 18th century
and about 25,000 to Maryland. Slaves in these colonies tended to live and work
in smaller, more closely supervised groups than slaves farther south, and their
cultural memory of Africa, although often strong, was less pervasive than that
of Carolina slaves. In addition, white Virginians and Marylanders were turning
to wheat as a secondary crop, a development that required mills and towns, and
thus slave labor in construction, road building, and some of the skilled
crafts.
H | Northern Agriculture |
Around the middle of the 18th century, a
heavily populated and increasingly urbanized Europe lost the capacity to feed
itself, providing an important market for North American farmers. The middle
colonies, particularly Pennsylvania, became the breadbasket of America. After
Pennsylvania farmers provided for their families from their farms and by trading
with neighbors, they sent their surplus production of corn and wheat, as much as
40 percent of what they produced, on to the Atlantic market. New England farmers
worked soil that was poor and rocky, but used the same system.
Economists call this system safety–first
or subsistence–plus agriculture: Farmers provided for household and neighborhood
needs before risking their surplus in distant and unpredictable markets. In
profitable years, farmers were able to buy finished cloth, dishes and crockery,
tea and coffee, and other goods that colonial trade with England provided—goods
on which more and more Americans depended by 1770.
I | Religion |
British North America in the 18th century
was a religiously and ethnically diverse string of settlements. New England’s
population was overwhelmingly English, descended from the Great Migration of the
1630s. New England had a reputation for poor land and intolerance of outsiders,
and immigrants avoided the region. New Englanders continued to practice
congregationalism, although by the 18th century they seldom thought of
themselves as the spearhead of the Reformation. A wave of revivals known as the
Great Awakening swept New England in the 1730s and 1740s, dividing churchgoers
into New Light (evangelical Calvinist) and Old Light (more moderate) wings. An
increasing minority were calling themselves Baptists.
Nearly all Europeans in these colonies
were Protestants, but individual denominations were very different. There were
Presbyterians, Lutherans, Baptists, Anglicans, Dutch Reformed, Mennonites, and
Quakers. While the Church of England was the established church (the official,
government–supported church) in the Chesapeake colonies, German and Scottish
non-Anglicans were migrating south from the middle colonies, and Baptists were
making their first southern converts. Although most Chesapeake slaves were
American–born by the late 18th century, they practiced what they remembered of
African religions, while some became Christians in 18th-century revivals. See
also United States (People): Religion in the Colonies.
V | RESISTANCE AND REVOLUTION |
A | The Wars for North America, 1689–1763 |
Seventeenth-century colonists fought wars
with the coastal Native American peoples upon whom they had intruded.
Eighteenth-century colonial wars, in contrast, usually began in Europe, and they
pitted the English colonies against French and Spanish empires in North America.
These empires posed a number of problems for English colonists. Spanish Florida
offered refuge to runaway slaves from the southeastern colonies. The French
built an interior arc of settlements from Québec to New Orleans; they also made
trading agreements with Native Americans. The French trading empire impeded the
expansion of English settlements, and the strength of the French and their
Native American allies was a constant concern to the British and to American
settlers.
The English and French fought frequently:
in King William’s War (1689-1697; known in Europe as the War of the League of
Augsburg), in Queen Anne’s War (1702-1713; the War of the Spanish Succession),
in King George’s War (1744-1748; War of the Austrian Succession), and in the
French and Indian War (the Seven Years’ War), which began in America in 1754 and
ended in Europe in 1763. In all of these wars, the French had the assistance of
most Native Americans of the interior.
During the course of these wars, the
English gained strength in relation to their French and Spanish rivals, and in
the French and Indian War, with strong help from colonial militias, they
expelled the French from mainland North America. In 1763 Britain became the lone
European imperial power in North America between the Atlantic Ocean and the
Mississippi River. (The Spanish, allies of the French, gave up Florida but took
over French claims in New Orleans and in lands west of the Mississippi as
compensation.) Within 20 years the British would lose most of what they had
gained.
Victory in the French and Indian War gave
the British an enlarged mainland empire but also brought new problems. First,
the war had been expensive: The interest alone on Britain’s debt required half
the government’s revenues, and the overtaxed British people could not be asked
to pay more. Second, the acquisition of French and Spanish territory gave the
British new administrative tasks. They acquired not only vast tracts of land,
but also the French settlers and indigenous peoples who lived there.
The difficulties became clear in early
1763, when an Ottawa chief named Pontiac became worried about losing the French
allies who had helped keep British settlers out of the interior. Pontiac led an
uprising of a broad Native American coalition that included Seneca, Wyandots,
Delawares, Shawnee, Miami, Ottawa, and other nations. They attacked British
forts and frontier settlements in Pennsylvania and Virginia. During the summer
of 1763 they killed as many as 2,000 settlers, but they could not dislodge the
British from their fortified strongholds at Detroit, Niagara, and other places
in the interior. Settlers responded by murdering Native Americans, most of whom
had done nothing. The British government realized that it needed not only more
revenue but also a military presence and a colonial administrative policy to
establish British authority and keep the peace in North America.
B | Break with Britain |
B1 | Constitutional Understandings: Britain |
British officials believed that the
British government—and Parliament in particular—had the constitutional power to
tax and govern the American colonies. The rulers of Parliament assumed what they
called parliamentary sovereignty. Parliament, they insisted, was dominant within
the British constitution. Parliament was a brake against arbitrary monarchs;
Parliament alone could tax or write legislation, and Parliament could not
consent to divide that authority with any other body. As Thomas Hutchinson, the
royal governor of Massachusetts, put it, there could be no compromise “between
the supreme authority of Parliament and the total independence of the colonies.
It is impossible there should be two independent legislatures in one and the
same state.”
B2 | Constitutional Understandings: America |
The Americans, however, had developed a
very different opinion of how they should be governed. By the 1720s all but two
colonies had an elected assembly and an appointed governor. Contests between the
two were common, with governors generally exercising greater power in the
northern colonies and assemblies wielding more power in the south.
Governors technically had great power.
Most were appointed by the king and stood for him in colonial government.
Governors also had the power to make appointments, and thus to pack the
government with their followers.
The assemblies, however, had the “power
of the purse”: Only they could pass revenue (tax) bills. Assemblies often used
that power to gain control over appointments, and sometimes to coerce the
governor himself. This was particularly true during the French and Indian War,
when governors often asked assemblies to approve revenue bills and requisitions
to fund the fighting. Assemblies used their influence over finances to gain
power in relation to governors.
Colonists tended to view their elected
assemblies as defenders against the king, against Parliament, and against
colonial governors, who were attempting to increase their power at the expense
of popular liberty. Thus when the British Parliament asserted its right to tax
and govern the colonies (it had long regulated colonial trade, but it had never
taxed the colonists directly), ideals clashed.
The British elite’s idea of the power that its Parliament had gained since 1689
collided with the American elite’s idea of the sovereignty of its own
parliaments. The British assumed that their Parliament legislated for the whole
empire. The Americans assumed that while the parts of the empire shared British
liberties and the British king, the colonies could be taxed and governed only by
their own elected representatives. The British attempt to tax the colonies was
certain to start a fight.
B3 | Toward Independence |
Parliament passed the Sugar and Currency
acts in 1764. The Sugar Act strengthened the customs service, and on the surface
it looked like the old Navigation Acts. The Sugar Act was different, however,
because it was designed not so much to regulate trade (a power that colonists
had not questioned) as to raise revenue (a power that colonists denied
to Parliament). The Currency Act forbade colonies to issue paper money—a move
that many colonies saw as an unconstitutional intervention in their internal
affairs. Individual colonies petitioned against these measures, but a unified
colonial response to British colonial reform did not come until 1765.
B3a | The Stamp Act Crisis |
That year, Parliament passed the Stamp
Act, which required all legal documents, licenses, commercial contracts,
newspapers, pamphlets, dice, and playing cards to carry a tax stamp. The Stamp
Tax raised revenue from thousands of daily transactions in all of the colonies.
In addition, those accused of violating the act would be tried in Vice–Admiralty
Courts—royal tribunals without juries that formerly heard only cases involving
maritime law. The colonial assemblies petitioned the British, insisting that
only they could tax Americans. The assemblies also sent delegates to a Stamp Act
Congress, which adopted a moderate petition of protest and sent it to England.
Other Americans took more forceful measures. Before the Act went into effect, in
every large colonial town, mobs of artisans and laborers, sometimes including
blacks and women, attacked men who accepted appointments as Stamp Act
commissioners, usually forcing them to resign. American merchants also organized
nonimportation agreements, which put pressure on English merchants, who in turn
pressured the British government.
In spring 1766 a newly elected
Parliament repealed the Stamp Act, believing it had been unwise. Parliament did
not, however, doubt its right to tax the colonies: Alongside the repeal it
passed the Declaratory Act, which reaffirmed Parliament’s right to legislate
for the colonies “in all cases whatsoever.”
B3b | Townshend Acts |
In 1767 a new ministry led by
Chancellor of the Exchequer Charles Townshend addressed the North American
situation. Townshend drew up new taxes on imports (tea, lead, paper, glass,
paint) that Americans could receive only from Britain. More ominously, he
earmarked the revenue from these duties for the salaries of colonial governors
and judges, thus making them independent of the colonial assemblies. He also
strengthened the organization responsible for enforcing customs duties and
located its headquarters in Boston, the center of opposition to the Stamp Act.
Finally, he moved many units of the British army away from the frontier and
nearer the centers of white population.
Clearly, the Townshend Acts were meant
not only to tax the colonies but also to exert British authority. When colonial
assemblies protested the duties, royal governors dissolved them. Americans
rioted. They also agreed to boycott all imported British goods—particularly tea.
The British responded by landing troops at Boston (the center of resistance) in
October 1768. Tensions between townspeople and soldiers were constant for the
next year and a half. On March 5, 1770, tensions exploded into the Boston
Massacre, when British soldiers fired into a mob of Americans, killing five
men.
In Britain on the day of the Boston
Massacre, Parliament repealed all of the Townshend Duties except the one on
tea—a powerful reminder that it would never relinquish its right to tax and
govern Americans. The Americans, in turn, resumed imports of other goods, but
continued to boycott tea.
B3c | Other British Measures |
The Tea Act of 1773 maintained the tax
on tea and gave the English East India Company a monopoly on the export of that
commodity. The company’s tea ships ran into trouble in American ports, most
notably in Boston, where on December 16, 1773, colonials dressed as Native
Americans dumped a shipload of tea into the harbor (see Boston Tea
Party).
Britain responded to this Boston Tea
Party with the Intolerable Acts of 1774, which closed the port of Boston until
Bostonians paid for the tea. The acts also permitted the British army to quarter
its troops in civilian households, allowed British soldiers accused of crimes
while on duty in America to be tried in Britain or in another colony, and
revised the Massachusetts charter, replacing the colony’s elected council with one appointed by the king.
At the same time, the Québec Act
organized a British government in Canada that frightened many Protestant,
libertarian Americans: It allowed the Catholic Church to remain established in
French Canada, and it established a government with fewer liberties than
Americans enjoyed. Some Americans saw the act as a model for what the British
had in mind for them. Along with the Intolerable Acts and the Québec Act came
clear signs that Britain would use whatever military force it needed to subdue
the Americans.
B3d | Continental Congress |
In September 1774 every colony but
Georgia sent delegates to the First Continental Congress in Philadelphia,
Pennsylvania. The Congress refused to recognize the authority of Parliament and
instead sent a petition to the king. The petition stated the principle that
Parliament could not legislate for the colonies without their consent and
extended this principle beyond taxation to any legislation.
While the British army occupied Boston,
Massachusetts established a provincial congress that met in Concord. The new
congress became the de facto government of Massachusetts. The British responded
by sending an army out from Boston to seize arms and American leaders at
Concord. They were met by Massachusetts militiamen, and colonial protest turned
into revolutionary war at the battles of Lexington and Concord in April 1775. A
Second Continental Congress met the following month and proclaimed the militia
that had routed the British in the countryside a Continental Army, with George
Washington as its leader. In August, King George III proclaimed the colonies to
be in rebellion. The British army, after a costly victory at the Battle of
Bunker Hill in June 1775, evacuated Boston in March 1776 and sailed for Nova
Scotia. With that, there was virtually no British military presence in the
rebellious 13 colonies.
Through 1775 and into 1776, the
Americans fought without agreeing on what the fight was about: Many wanted
independence, while others wanted to reconcile with the king but not with
Parliament. The pamphlet Common Sense by Anglo-American philosopher
Thomas Paine presented powerful arguments opposing kings and supporting a pure
republic. It changed the minds of many colonists.
The British hired about 30,000 German
mercenaries (Hessians) to help put down the Americans, and that, too, convinced
some Americans that there could be no reconciliation. Congress appointed a
committee to draft a declaration of independence. Thomas Jefferson, a
congressman from Virginia, took on the job of writing the first draft. Congress
voted for independence on July 2, 1776, and adopted the formal Declaration of
Independence two days later, on July 4. (Most delegates did not sign the
document until August.)
The Declaration of Independence was
primarily a list of grievances against the king. But the opening paragraphs
amounted to a republican manifesto. The preamble declared (and committed future
generations of Americans to the proposition) that “all men are created equal,”
and that they possess natural rights that include “Life, Liberty, and the
pursuit of Happiness.” Perhaps most important, the declaration insisted that
governments derive their just powers from the consent of the governed. Protest against
British colonial rule had been transformed into a republican revolution.
C | The American Revolution |
In 1776 the prospects for American victory
seemed small. Britain had a population more than three times that of the
colonies, and the British army was large, well–trained, and experienced. The
Americans, on the other hand, had undisciplined militia and only the beginnings
of a regular army or even a government. But Americans had powerful advantages
that in the end were decisive. They fought on their own territory, and in order
to win they did not have to defeat the British but only to convince the British
that the colonists could not be defeated.
The British fought in a huge, hostile
territory. They could occupy the cities and control the land on which their army
stood, but they could not subdue the American colonists. Two decisive battles of
the war—Saratoga and Yorktown—are cases in point. At Saratoga, New York, a
British army descending on the Hudson Valley from Canada outran its supply
lines, became tangled in the wilderness, and was surrounded by Americans. The
Americans defeated a British detachment that was foraging for food near
Bennington, Vermont, then attacked the main body of the British army at
Saratoga. The British surrendered an army of about 5,800 (see Battles of
Saratoga).
More important, the American victory at
Saratoga convinced France that an alliance with the Americans would be a good
gamble. The French provided loans, a few troops, and, most importantly, naval
support for the Americans. The French alliance also turned the rebellion into a
wider war in which the British had to contend not only with the colonials but
also with a French navy in the Caribbean and on the American coast.
In the battle of Yorktown, the climactic
campaign of the war, the vastness of America again defeated the British. In 1781
Charles Cornwallis led an army through Virginia almost without opposition,
then retreated to a peninsula at Yorktown. There he was besieged by George
Washington’s army and held in check by the French navy. Unable to escape or to
get help, Cornwallis surrendered an entire British army. His defeat effectively
ended the war. In the Treaty of Paris of 1783, the British recognized the
independence of the United States and ceded to the new nation the territory
between the Atlantic and the Mississippi.
D | The Revolution: Winners and Losers |
Colonial elites—large landholders and
plantation masters—benefited most from American independence: They continued to
rule at home without outside interference. Below them, property–holding white
men who became full citizens of the American republic enjoyed the “life,
liberty, and property” for which they had fought. White women remained excluded
from public life, as did most white men without property. But the Americans for
whom the legacy of revolution proved disastrous—or at best ambiguous—were Native
Americans and African American slaves.
In 1760 the British defeated the French in
North America, and Native Americans lost the French alliance that had helped
protect and strengthen them for 150 years. In the Revolution, they tended to
side with the British or to remain neutral, knowing that an independent republic
of land–hungry farmers posed a serious threat. The six Iroquois nations divided
on this question, splitting a powerful confederacy that had lasted more than 200
years. When some Iroquois raided colonial settlements, Americans responded by
invading and destroying the whole Iroquois homeland in 1779. Farther south, the
Cherokee people sided with the British and lost heavily. Up and down the
frontier, Native Americans and backcountry militia kept up unsettling and
sporadic fighting throughout the war. After the British ceded territory on both
sides of the Appalachians to the Americans in 1783, Native Americans—who had not
been defeated—ignored maps drawn by whites and continued to fight through the
1790s. Native American military power east of the Mississippi was not broken
until 1815. The key to that defeat was the fact that the independent American
republic was now expanding without opposition from either France or
Britain.
The results of the American Revolution for
American slaves were ambiguous. Early in the war, the governor of Virginia, Lord
Dunmore, had promised freedom to any Virginia slave who joined the British army.
Thousands took the offer, and many more thousands seized wartime opportunities
to disappear. (When Colonel Banastre Tarleton raided Charlottesville, Virginia,
many of Thomas Jefferson’s slaves cheered the British as liberators.) On the
other hand, thousands of blacks (primarily in the North) fought on the patriot
side.
American independence had differing effects
on blacks. On the one hand, it created an independent nation in which
slaveholders wielded real power—something that slaves would remember in the
1830s, when Parliament freed slaves in the British Caribbean without asking the
planters. On the other hand, the ideology of natural rights that was fundamental
to the Revolution was difficult to contain. Many whites, particularly in the
North, came to see emancipation as a logical outcome of the Revolution. Vermont
outlawed slavery in its constitution, and in the 1780s and 1790s most Northern
states took steps to emancipate their slaves. Even Chesapeake planters flirted
seriously with emancipation. Perhaps most important, slaves themselves absorbed
revolutionary notions of natural rights. Following the Revolution, slave
protests and slave rebellions were drenched in the rhetoric of revolutionary
republicanism. Thus American independence was a short–term disaster for the
slaves, but at the same time, it set in motion a chain of events that would
destroy American slavery.
VI | FORGING A NEW NATION |
A | State Constitutions |
In May 1776, even before declaring
national independence, the Second Continental Congress told the states to draw
up constitutions to replace their colonial regimes. A few ordered their
legislatures to draw up constitutions. By 1777, however, the states had
recognized the people as the originators of government power. State
constitutions were written by conventions elected by the voters (generally white
men who held a minimum amount of property), and in a few states the finished
constitutions were then submitted to voters for ratification. The Americans
(white men who owned property, that is) were determined to create their own
governments, not simply to have them handed down by higher authorities.
Without exception, the states rejected the
unwritten constitution of Britain—a jumble of precedents, common law, and
statutes that Americans thought had led to arbitrary rule. The new American
states produced written constitutions that carefully specified the powers and
limits of government. They also wrote the natural rights philosophy of the
Declaration of Independence into bills of rights that protected freedom of
speech and of the press, guaranteed trial by jury, forbade searching without
specific warrants, and forbade taxation without consent. Seven states appended
these to their constitutions; some of the other states guaranteed these rights
through clauses within their constitutions.
These first state constitutions, although
all republican and all demonstrating distrust of government power—particularly
of the executive—varied a great deal. In Pennsylvania, radicals wrote the most
democratic constitution, in 1776. It established a unicameral (one–house)
legislature to be chosen in annual secret-ballot elections that were open to all
male taxpayers; the executive was a 12–man committee without real power. Nearly
all of the other states adopted constitutions with two–house legislatures,
usually with longer terms and higher property qualifications for the upper
house. They had elective governors who could veto legislation, but who lacked
the arbitrary powers of prerevolutionary executives. They could not dissolve the
legislature, they could not corrupt the legislature by appointing its members to
executive office, and the legislature could override their vetoes.
In these revolutionary constitutions—drawn
up hurriedly in the midst of war—Americans were groping toward written
constitutions with clearly specified powers. These constitutions featured limits
for legislatures, executives, and the courts, with a clear separation of power
among the three. They also guaranteed the citizens certain inalienable rights
and made them the constituent power. On the whole, state constitutions reflected
fear of government (and particularly executive) tyranny more than they reflected
the need to create forceful, effective government.
B | The Articles of Confederation |
Americans began their revolution without
a national government, but the Continental Congress recognized the need for a
government that could conduct the war, form relations with other countries,
borrow money, and regulate trade. Eight days after the Declaration of
Independence was adopted, a committee headed by John Dickinson of
Pennsylvania submitted a blueprint for a powerful national government. Among
other things, Dickinson’s plan gave all the states’ western land claims to the
national government, and it created a congress in which states were represented
equally rather than by population. The plan shocked delegates who considered the
new nation a loose confederation of independent states, and they rejected
it.
The Articles of Confederation, which
included a strong affirmation of state sovereignty, went into effect in March
1781. They created a unicameral legislature in which each state had one vote.
The articles gave the confederation jurisdiction in relations with other nations
and in disputes between states, and they secured control of western lands
for the national government. In ordinances passed in 1784, 1785, and 1787 the
Confederation Congress organized the new federal lands east of the Mississippi
and between the Ohio River and the Great Lakes as the Northwest Territory. This
legislation organized the land into townships six miles square, provided land to
support public schools, and organized the sale of land to developers and
settlers. The Northwest Ordinance of 1787 guaranteed civil liberties in the
territory and banned slavery north of the Ohio River. The
creation of the territory was among the solid accomplishments of the
Confederation government. Still, the government lacked important powers. It
could not directly tax Americans, and the articles could be amended only by a
unanimous vote of the states. Revolutionary fear of centralized tyranny had
created a very weak national government.
The weakness of the national government
made resolving questions of currency and finance particularly difficult. Neither
the national government nor the states dared to tax Americans. To pay the
minimal costs of government and the huge costs of fighting the war, both simply
printed paper money. While this money was honored early in the war, citizens
learned to distrust it. By 1780 it took 40 paper dollars to buy one silver
dollar. When the Confederation Congress requisitioned funds from the states, the
states were very slow in paying. And when the Congress asked permission to
establish a 5 percent tax on imports (which would have required an amendment to
the articles), important states refused. Under these circumstances the national
government could neither strengthen the currency nor generate a stable income
for itself.
The Confederation also had problems
dealing with other countries. In the Treaty of Paris that ended the Revolution,
for instance, Americans agreed to pay prerevolutionary debts owed to British
merchants, and to restore confiscated property to colonists who had remained
loyal to the king (Loyalists). States refused to enforce these provisions,
giving the British an excuse to occupy forts in what was now the Northwest
Territory of the United States. In 1784 Spain closed the port of New Orleans to
Americans, thus isolating farmers in the western settlements whose only access
to the rest of the world was the Mississippi River, which emptied into the
Gulf of Mexico below that port. The Confederation Congress could do little
about these developments. These
problems also extended to international trade. In the 1780s Britain, France, and
Spain all made it difficult for Americans to trade with their colonies; at the
same time, the British flooded American ports with their goods. Gold and silver
flowed out of the country. The result was a deep depression throughout most of
the 1780s. The Confederation Congress could do nothing about it.
The Confederation also had trouble
dealing with Native Americans. The Confederation Congress negotiated dubious
land–cession treaties with the Iroquois in New York and with the Cherokee,
Choctaw, and Chickasaw nations in the South. The Creeks (as well as many of the
Native Americans supposedly represented at the negotiations) resisted the
onslaught of white settlers, and the Confederation was powerless to do anything
about the wars that resulted.
The Confederation had internal problems
as well. The economic disruptions of the Revolution and the 1780s left many
farmers unable to keep up with their mortgages and other debts. State
governments had often met this problem by printing paper money and by passing
stay laws that prevented creditors from seizing the property of their debtors.
In Massachusetts, however, the upper house of the legislature protected the
investments of creditors by voting down debtor–relief legislation. In 1786
farmers in the western counties, led by revolutionary veteran Daniel Shays, held
conventions to demand the abolition of the upper house. They mobbed county
courthouses, destroyed the records of many of their debts, and then marched
on a federal arsenal at Springfield, where they were repulsed and scattered by
the militia (see Shays’ Rebellion). Yet Shays’ rebels retained enough
support to elect a legislature that in the following year enacted a stay
law.
C | The Constitutional Convention |
International troubles, the postwar
depression, and the near–war in Massachusetts (as well as similar but less
spectacular events in other states) led to calls for stronger government at both
the state and national levels. Supporters wanted a government that could deal
with other countries, create a stable (deflated) currency, and maintain order in
a society that some thought was becoming too democratic. Some historians call
the citizens who felt this way cosmopolitans. They tended to be wealthy,
with their fortunes tied to international trade. They included seaport merchants
and artisans, southern planters, and commercial farmers whose foreign markets
had been closed. Most of their leaders were former officers of the Continental
(national) army and officials of the Confederation government—men whose wartime
experiences had given them a political vision that was national and not
local.
In the 1780s cosmopolitans were
outnumbered by so-called locals, who tended to be farmers living in isolated,
inland communities with only marginal ties to the market economy, and who tended
to be in debt to cosmopolitans. In the Revolution, most locals had served in
militias rather than in the national army, and they preserved a localist, rather
than nationalist, view of politics. They also preserved a distrust of any
government not subject to direct oversight by citizens. The new state
governments had often reapportioned legislative districts to give new,
fast-growing western counties greater representation. Locals tended to control
legislatures and (as in Shays’ Massachusetts) promote debtor relief, low taxes,
and inactive government—a situation that caused cosmopolitans to fear that the
republic was degenerating into democracy and chaos.
In September 1786 delegates from several
states met at Annapolis, Maryland, to discuss ways to improve American trade.
They decided instead, with the backing of the Confederation Congress, to call a
national convention to discuss ways of strengthening the Union. In May 1787, 55
delegates (representing every state but Rhode Island, whose legislature had
voted not to send a delegation) convened in Philadelphia and drew up a new
Constitution of the United States. The delegates were cosmopolitans who wanted
to strengthen national government, but they had to compromise on a number of
issues among themselves. In addition, the delegates realized that their
Constitution would have to be ratified by the citizenry, and they began
compromising not only among themselves but also on their notions of what
ordinary Americans would accept. The result was a Constitution that was both
conservative and revolutionary.
The biggest compromise was between large
and small states. States with large populations favored a Virginia Plan that
would create a two–house legislature in which population determined
representation in both houses. This legislature would then appoint the executive
and the judiciary, and it would have the power to veto state laws. The small
states countered with the New Jersey Plan, which called for a one–house
legislature in which every state, regardless of population, would have one
vote. In the resulting compromise, the
Constitution mandated a two-house legislature (see Congress of the United
States). Representatives would be elected to the lower house based on
population, but in the upper house two senators would represent each state,
regardless of population. Another compromise settled an argument over whether
slaves would be counted as part of a state’s population (if they were counted,
Southern representation would increase). The convention agreed to count each
slave as three–fifths of a person.
The president would be selected by an
electoral college, in which each state’s number of votes equaled its
congressional representation. Once elected, the president would have important
powers: appointing other officers of the executive department as well as
federal judges, commanding the military, directing foreign affairs, and
vetoing laws passed by Congress. These powers,
however, were balanced by congressional oversight.
The Senate had to confirm major
appointments and ratify treaties with foreign countries, and only Congress
could declare war. Congress also had the power to impeach the president or
federal judges, and Congress could override a president’s veto. The Constitution
also declared itself the supreme law of the land, and listed powers that the
states could not exercise. See also United States (Government).
Thus the Constitution carefully separated
and defined the powers of the three branches of the national government and of
the national and state governments. It established checks and balances between
the branches—and put it all in writing. The stated purpose of the document was
to make a strong national government that could never become tyrannical.
D | Ratification |
The proceedings of the Constitutional
Convention were kept secret until late September 1787. The Confederation
Congress sent the completed Constitution out for ratification by state
conventions elected for that purpose—not by state legislatures, many of which
were hostile to the new document. Thus the Constitution—which began “We the
people”—created a government with the people, and not the state legislatures, as
the constituent power.
The Federalists, as proponents of the
Constitution called themselves, were cosmopolitans who were better organized
than their opponents. Particularly in the beginning of the ratification effort,
they made greater use of pamphlets and newspapers. In New York, Federalist
leaders Alexander Hamilton, John Jay, and James Madison composed the powerful
and enduring Federalist papers to counter doubts about the proposed new
government. By January 1788 conventions in Delaware, Pennsylvania, New Jersey,
Georgia, and Connecticut had ratified the Constitution.
Opponents of the Constitution, who called
themselves Anti–Federalists, were locals who feared a strong national government
that would be run by educated and wealthy cosmopolitans who operated far away
from most citizens. They were particularly distrustful of a Constitution that
lacked a bill of rights protecting citizens from government attacks on their
liberties.
Ratification contests in the remaining
states were close, but by July 1788, 11 states had ratified, often with promises
that the new government would enact a bill of rights. (North Carolina eventually
ratified in 1789. The last state, Rhode Island, did not send delegates to the
Constitutional Convention and did not ratify the Constitution until 1790.)
VII | LAUNCHING THE NATION: FEDERALISTS AND JEFFERSONIANS |
George Washington was unanimously elected
the first president of the United States in 1789. He presided over a
revolutionary republic that was overwhelmingly rural. The country’s 4 million
people filled the nation’s territory at a density of only 1.7 persons per
square km (4.5 per square mile); the comparable figure for 1998 was 29.5 per
square km (76.4 per square mile).
A | Americans and their Government, 1790–1815 |
Most Americans lived in rural,
self–sufficient neighborhoods. Farm families produced a variety of plants and
animals, consumed much of what they produced, and traded much of the rest within
their neighborhoods. Since the mid–18th century Americans had been sending
surpluses to Europe and to the slave islands of the Caribbean; in return they
received molasses, rum, crockery, tea and coffee, ready–made cloth, and other
European (primarily British) manufactured goods.
Two groups were more heavily dependent
on international trade, and both had tended to support the new Constitution. The
plantation slave–masters of the South grew staple crops for world markets: rice
and indigo in South Carolina and Georgia, tobacco in North Carolina and the
Chesapeake. The markets for these goods were in Europe (again, primarily
England). Northeastern seaport merchants also had a vital stake in overseas
trade.
From the 1790s to 1820, southern farms
and slavery changed dramatically. In the Chesapeake, tobacco had worn out much
of the soil, and world markets for the crop were down. Chesapeake planters began
growing less tobacco and more grain, a change that required fewer slaves. Many
planters trained their slaves as carpenters, blacksmiths, and other craftsmen
and rented them to employers in the towns. Other planters, believing that the
natural rights philosophy of the revolution left little moral room for slavery,
freed their slaves; but many more simply sold their slaves at high prices to
cotton planters farther south and west.
In the 1790s planters south of Virginia
had found that they could make money by growing cotton, thanks to the cotton gin
invented by American Eli Whitney to separate sticky seeds from the cotton
fibers. The result was a stunning boom in cotton. The United States produced
only 3,000 bales of cotton in 1790; that figure jumped to 335,000 by 1820. The
cotton boom created a brisk market in slaves. From 1778 to 1808 the United
States imported as many African slaves as it had imported during the whole
previous history of the slave trade. Nearly all of these slaves entered the
country through Charleston or Savannah and ended up working the cotton
plantations of the Deep South. Another reason for the rise in slave imports was
a promise in the Constitution that the national government would not end the
nation’s participation in the international slave trade until 1808, and planters
wished to stock up before the market closed. The slave–driven economy of the
late 18th and early 19th centuries produced huge amounts of plantation staples,
nearly all of them sold to international (primarily English) buyers.
In 1790 there were few cities. Only 5
percent of the population lived in towns with more than 2,500 inhabitants. And
only five communities (Boston, New York City, Philadelphia, Baltimore, and
Charleston) had more than 10,000 inhabitants. Each of these five cities was an
Atlantic seaport and handled the exporting of American farm staples and the
importing of Old World manufactured goods. They performed very little
manufacturing of their own. After 1793, when Britain and France entered a long
period of war, American seaports handled increased exports as war–torn Europe
bought large quantities of American food. They also began to handle more of the trade
between European countries and their island colonies in the Caribbean.
Thus the work of the plantations, the
seaport towns, and (to a lesser extent) the farms of the United States was tied
to foreign trade. The new government of the United States worked to foster and
protect that trade, and these efforts led the new nation into the War of
1812.
A1 | Growth of Democracy |
Another potential problem for members
of the new government who prized order was the rapid growth and increasing
democracy of American society. The revolutionary rhetoric of equality and
natural rights seeped into every corner of American life. Even the poorest white
men demanded the basic dignity that republics promised their citizens. Some
women began to dream in that direction, as did slaves. In 1800 a slave named
Gabriel organized a slave revolt in Richmond, Virginia, planning to march his
small army into the state capital under the banner “Death or Liberty”; the
plot was betrayed before it could be carried out.
Religious change also contributed to
the new democratic character of the republic. The established churches of
colonial days (Congregationalists in New England, Anglicans—now renamed
Episcopalians—further south) declined, in part because they were relatively cold
and formal, and also because their status as established churches aroused
democratic resentment. At the same time, a great revival among the common people
made Baptists and Methodists the largest American churches. Baptists grew from
400 to 2,700 congregations between 1783 and 1820; Methodists grew from 50 to
2,700 churches in the same years. These churches emphasized preaching over
ritual, stressed Bible–reading congregations over educated ministers, favored
spiritual freedom over old forms of hierarchical discipline, and encouraged
conversions. Of crucial importance to the revival was the conversion of slaves
and, in turn, the slaves’ transformation of Christianity into a religion of
their own. By the second decade of the 19th century, most American slaves were
Christians—primarily Baptists and Methodists. Slaves and free blacks
participated in the revival and were taken into white churches. But white
prejudice and blacks’ desire for autonomy soon resulted in separate African
American congregations. By the early 19th century black Methodist and Baptist
congregations had become fundamental to a growing African American cultural
identity.
Finally, at the western edges of this
increasingly disorderly and democratic republic were Native American peoples who
remained free and on their own land. The Shawnee, Delaware, and other peoples
north of the Ohio River in particular had not been defeated in the Revolution
and did not accept the jurisdiction of the United States over their land. These
northwestern tribes could also rely on help from the British in Canada.
Thus at the edges of the republic—in
the forests of the interior and on the Atlantic Ocean—the new government faced
important problems of diplomacy, problems that sometimes degenerated into war.
Within the republic, the government had to contend with a democratic citizenry,
many of whom deeply distrusted law and authority that came from a distant
capital.
A2 | The Bill of Rights |
The new government of the United
States convened in New York City in early 1789. The First Congress immediately
passed a tariff on imports that would provide 90 percent of the government’s
revenue. It also created a system of federal courts. Congressmen then turned to
the bill of rights that some of the state ratifying conventions had promised
their citizens. Congress ultimately passed 12 amendments to the Constitution.
Ten of these were ratified by the states and became the Bill of Rights.
The First Amendment protected the
freedoms of speech, press, assembly, and religion from federal legislation. The
Second and Third amendments guaranteed the right to bear arms and made it
difficult for the government to house soldiers in private homes—provisions
favoring a citizen militia over a professional army. The Fourth through Eighth
amendments defined a citizen’s rights in court and when under arrest. The Ninth
Amendment stated that the enumeration of these rights did not deny other
rights retained by the people, and the Tenth Amendment said that powers not granted the national
government by the Constitution remained with the states and citizens.
B | The Debate over Federalism |
The new national government was
dominated by men who had led the movement for the Constitution, most of whom
called themselves Federalists. They were committed to making an authoritative
and stable national state. This became clear early on when President Washington
asked Secretary of the Treasury Alexander Hamilton to offer solutions to the
problems of the national debt and government finances. Hamilton proposed that
the federal government assume the revolutionary war debts of the states and
combine them with the debt of the United States into one national debt. The
federal government would pay off the parts of the debt that were owed to
foreigners, thus establishing the international credit of the new government.
But the new government would make the domestic debt permanent, selling
government bonds that paid a guaranteed high interest rate. Hamilton also
proposed a national bank to hold treasury funds and print and back the federal
currency. The bank would be a government–chartered and government–regulated
private corporation. The bank and the permanent debt would cement ties between
private financiers and the government, and they would require an enlarged
government bureaucracy and federal taxation. Hamilton asked for a federal excise
tax on coffee, tea, wine, and spirits. The latter included whiskey, and the
excise quickly became known as the Whiskey Tax. The tax would provide some of
the funds to pay interest on the national debt. It would also announce to
western farmers that they had a national government that could tax them.
Hamilton’s plan increased the power of the national government. See also
Federalism.
Hamilton’s measures promised to
stabilize government finances and to establish the government’s reputation
internationally and its authority in every corner of the republic. They would
also dramatically centralize power in the national government. Many citizens and
members of Congress distrusted Hamilton’s plans. The assumption of state debts,
the funding of the national debt, and stock sales for the Bank of the United
States would reward commercial interests, nearly all of them from the Northeast,
who invested in the bank and the bonds to pay the debt. Also, establishment of
the bank required Congress to use the clause in the Constitution that empowers
the legislature “to make all laws which shall be necessary and proper” to carry
out its specified powers—a clause that some feared might allow Congress to do
anything it wanted. Finally, the government would require a large civil service
to administer the debt and collect taxes—a civil service that would be appointed
by the executive. To Madison, Jefferson, and many others, Hamilton’s plans for
the national government too closely duplicated the powerful, debt–driven,
patronage–wielding British government against which they had fought the
revolution.
Jefferson became the leader of a group
that called themselves Democratic Republicans. They wanted the United States to
remain a republic of the small, property-holding farmers who, they believed,
were its most trustworthy citizens. Democratic Republicans envisioned a central
government that was strong enough to protect property but not strong or active
enough to threaten property or other republican rights. Jefferson feared the
national debt, the federal taxes, and the enlarged civil service that Hamilton’s
plans required.
When Jefferson was elected president in
1800, he paid off much of the debt that Hamilton had envisioned as a permanent
fixture of government. The Jeffersonians then abolished federal taxes other than
the tariff, reduced the number of government employees, and drastically reduced
the size of the military. They did, however, retain the Bank of the United
States. Internationally, the Jeffersonians had no ambitions other than free
trade—the right of Americans to trade the produce of their plantations and farms
for finished goods from other countries.
C | Foreign Affairs, 1789–1812 |
Unfortunately for both Federalists and
Democratic Republicans, it was very hard for the United States to act as a free
and neutral country in the international arena because of the wars that followed
the establishment of a republic in France (see French Revolution;
Napoleonic Wars). The French republic became violent and expansionist, and
Britain led three coalitions of European powers in wars against its expansionist
activities. These wars affected the domestic policy and the foreign policy of
the new United States (see American Foreign Policy).
Federalists valued American sovereignty,
but they also valued the old trading relationship with Britain, with which
Americans did 90 percent of their trade. The Federalists also admired British
political stability, and they sided with Britain in its wars against
France.
In Jay’s Treaty of 1794 the Washington
administration tried to create a postrevolutionary relationship with Britain.
The British agreed to abandon the forts they occupied in the Northwest
Territory. An American army under General Anthony Wayne had defeated the
northwestern Native Americans at the Battle of Fallen Timbers in 1794, and the
British were glad to leave. But the British refused to allow Americans to trade
internationally on any basis other than as part of the British mercantile
system. The Federalists, knowing that they could ask for nothing better,
agreed.
The French regarded Jay’s Treaty as an
Anglo–American alliance. They recalled their ambassador and began harassing
American merchant ships at sea. By 1798 the Americans and the French were
fighting an undeclared naval war in the Caribbean. During this crisis,
Federalists passed the Alien and Sedition Acts. These acts undermined civil
liberties and were clearly directed against Jeffersonian newspaper editors, who
were critical of the Federalist-dominated government. The Federalist government
also began to raise a large army. The size of the Federalist government and the
danger of Federalist repression were the principal issues in the election of
1800. Campaigning for civil liberties and limited government, Thomas Jefferson
was elected president.
Jeffersonians cared more about farmers
than about the merchants who carried their produce to Europe and imported
European goods—particularly when those merchants operated within established
British trade networks and voted for Federalist candidates. Jeffersonians
demanded that the United States be free to trade with any nation (a demand
unlikely to be granted during wartime) and that both France and Britain respect
American sovereignty and neutral rights.
During most of Jefferson’s first term,
Europe was at peace during a break in the Napoleonic Wars. The one major
foreign policy event was a huge success: Jefferson’s purchase of the Louisiana
Territory
from France in 1803 (see Louisiana Purchase). The purchase gave western
farmers free use of the river system that emptied below New Orleans, removed the
French presence from the western border of the United States, and provided
American farmers with vast new lands on which to expand their rural republic.
Ignoring the fact that independent Native American peoples occupied the
Louisiana Territory, Jefferson proclaimed his new land a great “empire of
liberty.”
Britain and France again went to war a
few weeks after the Louisiana Purchase. Americans once again tried to sell food
and plantation crops and to carry goods between the warring European powers and
their Caribbean colonies. Both sides permitted this trade when it benefited them
and tampered with it when it did not. In 1805 the British destroyed the French
navy at the Battle of Trafalgar off the Spanish coast and became dominant on the
ocean. Britain outlawed American trade with France and maintained a loose
blockade of the American coast, seizing American ships and often kidnapping
American sailors into the Royal Navy. This happened to as many as 6,000
Americans between 1803 and 1812.
The Americans could not fight the
British navy, and President Jefferson responded with “peaceable coercion.”
Believing that Britain needed American food more than America needed British
manufactures, he asked Congress in 1807 for an embargo that would suspend all
U.S. trade with foreign nations. Jefferson hoped to coerce Britain and France
into respecting American sovereignty. The embargo did not work, however. Britain
found other sources of food, and the American economy—particularly in the
seaports—stopped. American exports were valued at $108 million in 1807. They
dropped to $22 million the following year. In 1808 James Madison, Jefferson’s
friend and chosen successor, easily won the presidential election against a
Federalist opposition.
D | The Threat of a Second War with Britain |
The United States declared war on
Britain in 1812. The first cause of the war was British interference with
American shipping. The second was military assistance that the British in Canada
were providing to the Native American peoples of the United States interior. In
Ohio, Native Americans defeated two American armies before being defeated
themselves by American troops under General “Mad” Anthony Wayne in 1794. They
and indigenous peoples in other parts of the Northwest Territory continued to
resist white encroachment. Beginning in 1805, the Shawnee, Delaware, and other
northern tribes formed an unprecedentedly large political and military alliance
under the Shawnee leader Tecumseh. Americans under William Henry Harrison,
governor of the Indiana Territory, attacked and defeated them at the Battle of
Tippecanoe in 1811. But Tecumseh’s army, along with Creeks from the South who
had joined him, was a serious threat to white settlement. All of this Native
American resistance was encouraged and supplied by the British in Canada.
After the embargo failed, most
northeastern representatives in Congress were willing to reconcile with Britain
on British terms. Westerners and Southerners, however, would not compromise the
safety of western settlements and the freedom of the seas. Led by young members
who came to be called War Hawks (including Henry Clay, the 34-year-old Speaker
of the House), Congress prepared for war. It would be the first war declared
under the Constitution, and President Madison was careful to leave the actual
declaration to Congress. But in June 1812 he sent a message to Congress listing
British crimes on the ocean and on the frontier. The message ended: “We behold …
on the side of Britain a state of war against the United States, and on the side
of the United States a state of peace toward Britain.” Congress, led by Southern
and Western Jeffersonians, declared war two weeks later.
E | The War of 1812 |
The United States entered the War of
1812 to defend its sovereignty, its western settlements, and its maritime
rights. American leaders knew that they could not fight the British navy. They
decided instead to fight a land war, with Canada as the prize. Americans
reasoned that they could get to the British settlements in Canada more easily
than the British could. The capture of Canada would cut western Native Americans
off from British supplies and allow Americans to hold a valuable colony hostage
until the British agreed to their demands.
General William Hull, governor of the
Michigan Territory, led an American invasion of Canada in 1812. The British and
Native Americans threw him back, besieged him at Detroit, and forced him to
surrender his whole army. A second invasion of Canada from western New York
failed when New York militiamen refused to cross into Canada to back up American
regulars who had captured Queenston Heights below Niagara Falls (see
Battle of Queenston Heights). Tecumseh’s northern Native American
confederacy was an important part of the British effort. In the South, Creek
warriors terrorized Tennessee and killed about 250 settlers who had taken refuge
at Fort Mims in Alabama (see Massacre of Fort Mims).
The war went better for the Americans in
1813. Commodore Oliver Hazard Perry defeated a British fleet and gained control
of Lake Erie—and thus of the supply lines between British Canada and the
American Northwest. Americans sailed across Lake Ontario and raided and burned
York (now Toronto). Farther west, Americans led by William Henry Harrison chased
the British and Native Americans back into Canada. At the Battle of the Thames
in October, Americans killed Tecumseh. The following spring, American General
Andrew Jackson, with Cherokee allies, defeated and then slaughtered the Creeks
at Horseshoe Bend in Alabama. With these two battles the military power of
Native Americans east of the Mississippi River was finally broken.
The British went on the offensive in
1814. The Royal Navy had blockaded the Atlantic Coast throughout the war and now
began raiding American cities. In the summer, the British raided Washington,
D.C., and burned down the Capitol and the White House. In September the British
attacked Baltimore, but were held off by Americans at Fort McHenry who defended
the harbor. (It was this engagement that inspired a witness, American poet
Francis Scott Key, to write “The Star-Spangled Banner,” which later became the
national anthem.) The British then moved their attention to the Gulf Coast. At
New Orleans, Andrew Jackson’s army soundly defeated the British on January 8,
1815. Neither side in the Battle of New Orleans knew that the war had ended the
previous month with the signing of the Treaty of Ghent.
New England Federalists, opponents of
the war, were also unaware of the treaty when they met at Hartford, Connecticut,
in December 1814. With their commerce destroyed, some wanted to secede from the
United States and make a separate peace with Britain. But the Hartford
Convention settled for proposed amendments to the Constitution (all of which
were directed at the Jeffersonian Republicans’ southern and western majority).
However, when members of the Hartford Convention carried their proposals to
Washington in February, they found the capital celebrating Jackson’s victory at
New Orleans and the end of the war. Thus the Hartford Convention became the
final disgrace for the New England Federalists.
The War of 1812 had been a product of
the Napoleonic Wars in Europe. After Napoleon was defeated in 1814, neither the
Americans nor the British cared to keep on fighting. In the treaty, the British
abandoned their Native American allies, and the Americans dropped their
complaints about maritime rights. Both assumed that peace would eliminate issues
that had been created by war in Europe.
VIII | UNITED STATES EXPANSION |
A | Era of Good Feelings |
The year 1815 marks a watershed in
American history. Before that date American history was closely tied to European
history—particularly to the French Revolution and the Napoleonic Wars. With
Napoleon’s defeat and the success of the Congress of Vienna in 1815, a long
period of peace began in Europe. American leaders paid less attention to
European trade and European war, and more to the internal development of the
United States.
This was bad news for Native Americans
east of the Mississippi River, who had lost their last European ally, Britain,
in 1815. Now they faced only land-hungry Americans who were determined to turn
Native American hunting lands into farms. By the 1830s the federal government
was moving the eastern Native Americans to new lands beyond the Mississippi,
while whites filled their old lands with farms and plantations and began eyeing
more lands to the west.
B | Expansion: Northwest Territory |
In the 1780s there were few white
settlers in the Northwest Territory (the states of Ohio, Indiana, Illinois,
Michigan, Wisconsin, and eastern Minnesota). By 1860 more than one in five
Americans lived in the Northwest, and the geographic center of the population of
the United States was near Chillicothe, Ohio. Nearly all white migrants were
farmers, and they reached the area in two streams.
Before 1830 most migrants were
Southerners, mainly poor and middling farmers from Kentucky, Tennessee, and
western Virginia. In the southern regions of Ohio, Indiana, and Illinois, they
settled near rivers that empty into the Ohio River, providing access to the
Mississippi and the Gulf of Mexico.
Southern migrants in the Northwest
worked their land Southern style. They planted cornfields but left most of their
land wooded, allowing hogs to roam freely and fend for themselves. In this way
farmers subsisted (within their households and through bartering with neighbors)
with relatively little labor or reliance on outside markets.
Trade down the Mississippi became safe
only after Jefferson purchased the Louisiana Territory in 1803 and the army
ended Native American resistance in the Northwest and Southwest in the War of
1812. The trade route became efficient and profitable only with the development
of river steamboats in the 1810s.
After 1830 a new stream of migration
reached the Northwest Territory from the northeastern states. Most of the new
settlers were New Englanders (many of whom had spent a generation in western New
York) who reached their new lands via New York’s Erie Canal, Great Lakes
steamboats, and other new forms of transportation. By the 1840s they were joined
by immigrants from Germany and Scandinavia. Most of these were intensive
commercial farmers. Rather than allow cattle and hogs to roam freely (often
trampling tilled fields), they put their animals in pens. They also planted huge
fields of grain and put up fences.
In 1820 the Northwest Territory sent
only 12 percent of its farm produce to markets outside the region—a sign that
nearly all Northwestern farmers limited their economic lives to their families
and neighbors. By 1840 exports accounted for 27 percent of what Northwestern
farmers produced, and by 1860—with railroad connections to the east
completed—the figure stood at 70 percent. The figures were even higher in the
northern, grain-growing areas. Increasingly, the market for Northwestern farm
products was not in Europe but in the towns and cities of the east as well as
such local centers as Cincinnati, Ohio, and Chicago, Illinois. In turn, these
cities provided farmers with manufactured goods. Land that only a generation
earlier had been occupied by independent Native American peoples was now the
center of a great internal commercial revolution.
C | Expansion: The Southwest |
Equally dramatic was the rapid
settlement of the trans-Appalachian South. At the conclusion of the War of 1812,
Andrew Jackson forced the Creeks to cede huge territories in the Southwest.
Settlers, often with the help of state governments, began pressuring the
Cherokee, Choctaw, and other tribes to give up their lands. The land was eagerly
sought by Southeastern whites who had small, worn-out farms, and who faced lives
of tenancy and rural poverty.
The best lands, however, were taken by
planters who since the 1790s had been reaping huge profits from the cotton boom.
Fertile land beside navigable rivers in Georgia, Alabama, Mississippi,
Tennessee, Louisiana, Arkansas, and Missouri became slave plantations devoted to
cotton. These cotton farms were among the largest, the most intensely
commercialized, and the most profitable business operations in the Western
Hemisphere.
Farmers who owned few or no slaves took
higher, more isolated, and less fertile land in the same states. Like their
cousins who settled north of the Ohio River, they practiced a mixed agriculture
that included animals and plants (primarily hogs and corn), provided for
themselves and their neighbors, and sold the surplus to outside markets. Some of
those markets were reached by floating produce downriver to the seaports, while
other markets were on plantations that grew only cotton and that bought food
from farmers in their region.
The big cotton farms relied on slave
labor, and slaves performed the immense task of turning a huge trans-Appalachian
wilderness into cotton farms. Much of the slave population that was moved west
came from the slave centers of South Carolina and coastal Georgia. But the
cotton boom also provided a market for Virginia and Maryland slaves who were not
as economically useful as they had been in the 18th century. In the 1790s, as
the cotton boom began, about 1 in 12 Chesapeake slaves was moved south and west.
Chesapeake slave exports rose to 1 in 10 in the first decade of the 19th century
and 1 in 5 between 1810 and 1820. The movement of slaves from the Chesapeake to
the new cotton states was immense. The Cotton Belt of the Deep South had become
the center of American slavery. See also Slavery in the United States:
Growth of Slavery.
D | The Indian Removal Act |
With the expansion of the white
agricultural frontier came the final blows to Native American independence east
of the Mississippi. In New York, the once mighty Iroquois were limited to
reservations near the new towns of Buffalo and Syracuse; many of the Iroquois
moved to Canada. The Shawnee, who had led Native American resistance in the
Northwest Territory until 1815, were scattered. Many of the most defiant members
moved to Canada. Others relocated to Missouri, then to Mexican territory in east
Texas or to eastern Kansas.
In the South the 60,000 remaining
Cherokee, Choctaw, Chickasaw, Creek, and Seminole were pressured by the national
government to sell away most of their land at pennies per acre. Legislation
passed in 1819 provided small amounts of government money to train southern
Native Americans in plow agriculture and Christianity on their reduced lands.
The plan took hold among many of them, and whites began calling them the Five
Civilized Tribes. But even as these efforts continued, settlers moved onto lands
that Native Americans had not ceded while the federal government looked the
other way. In his final annual message to Congress in 1824, President James
Monroe recommended that the indigenous peoples who remained in the east be
removed to new lands west of the Mississippi.
The Cherokee, Creek, Choctaw, and
Chickasaw nations rejected the idea of removal and insisted that the national
government live up to the treaties that guaranteed them what was left of their
territory. At the same time, Southern state governments insisted that they and
not the federal government had jurisdiction over Native American lands within
their borders. The claim reinforced southern notions of states’ rights; it also
held the promise of more Native American land for settlers.
The situation reached a crisis in
Georgia, where Governor George Troup extended state jurisdiction to Native
American lands and began giving the lands to poor whites by means of a lottery
in 1825. Troup also sent state surveyors onto Creek lands and warned President
John Quincy Adams not to interfere with this exercise of state authority. Faced
with this threatening situation the Creek and the Cherokee reorganized
themselves as political nations, stripping local chiefs of power and giving it
to national councils. In 1827 the Cherokee nation declared itself a republic
with its own government, courts, police, and constitution.
By 1830 the situation had become a
crisis. New president Andrew Jackson, a Tennessee plantation owner and a famous
fighter of Native Americans, refused to exercise federal jurisdiction over
Native American affairs, allowing southern states to find their own solutions.
The Cherokee took the state of Georgia to court, and in 1832, in the case of
Worcester v. Georgia, John Marshall, chief justice of the Supreme
Court of the United States, ruled that Georgia’s extension of its authority over
Cherokee land was unconstitutional. President Jackson simply refused to enforce
the decision, allowing southern states to continue to encroach on Native
American lands.
In the Indian Removal Act of 1830,
Congress—with Jackson’s blessing—offered Native American peoples east of the
Mississippi federal land to the west, where the United States government had the
authority to protect them. Many of them accepted. Then in 1838, Jackson’s
successor, Martin Van Buren, sent the U.S. Army to evict 18,000 to 20,000
Cherokee remaining in the South and move them to what is today Oklahoma. In all,
4,000 Native Americans died on the march that became known as the Trail of
Tears. Jackson, who more than any other person was responsible for this removal
policy, argued, “What good man would prefer a country covered with forests and
ranged by a few thousand savages to our extensive Republic, studded with cities,
towns and prosperous farms, embellished with all the improvements which art can
devise or industry execute, occupied by more than 12,000,000 happy people, and
filled with all the blessings of liberty, civilization, and religion?” Again,
the white empire of land and liberty came at the expense of other races. See
also Indian Wars: Native American Removal Policy.
E | The Trans-Mississippi West, 1803–1840s |
In 1804, a year after the Louisiana
Purchase, President Jefferson sent an expedition under Meriwether Lewis and
William Clark to explore the purchase and to continue on to the Pacific Ocean.
The Lewis and Clark Expedition traveled up the Missouri River, spent the winter
of 1804 to 1805 with the Mandan people, and with the help of a Shoshone woman
named Sacagawea traveled west along the Snake River to the Columbia River and on
to the Pacific.
Even as they traveled, mounted bands of
Sioux were conquering the northern Great Plains. The Sioux had already cut off
the Pawnee, Otoe, and other peoples of the lower Missouri from the western
buffalo herds and were threatening the Mandan and other agricultural peoples on
the upper reaches of the river. Throughout the first half of the 19th century,
epidemics of European diseases traveled up the Missouri River. The worst of them
came in the 1830s, when smallpox killed half the Native Americans along the
river. The Sioux, who lived in small bands and moved constantly, were not as
badly hurt as others were. They used that advantage to complete their conquest
of the northern sections of Jefferson’s great “Empire of Liberty.”
Farther south, white settlers were
crossing the Mississippi onto the new lands. Louisiana, already the site of New
Orleans and of Spanish and French plantations, became the first state west of
the Mississippi in 1812. Southerners were also moving into the Arkansas and
Missouri territories. Missouri entered the Union in 1821, Arkansas in 1836.
Settlers also began moving into Texas, in the northeastern reaches of the
Republic of Mexico, which won its independence from Spain in 1821. Mexico at
first encouraged them but demanded that new settlers become Catholics and
Mexican citizens. Mexico also demanded that they respect the Mexican
government’s abolition of slavery within its territory. Settlers tended to
ignore these demands, and they continued to stream into Texas even when the
Mexican government tried to stop the migration.
By 1835 the 30,000 Americans in Texas
outnumbered Mexicans six to one. When the Mexican government tried to strengthen
its authority in Texas, the American settlers (with the help of many of the
Mexicans living in that province) went into an armed revolt known as the Texas
Revolution. Volunteers from the southern United States crossed the border to
help, and in 1836 the Americans won. They declared their land the independent
Republic of Texas and asked that it be annexed to the United States. The
question of Texas annexation would stir national politics for the next ten
years.
Americans considered the plains that
formed most of the Louisiana Purchase (the lands over which the Sioux had
established control) to be a desert unsuitable for farming. Congress designated
the area west of Arkansas, Missouri, and Iowa and north of Texas as Indian
Territory in the 1840s. But Americans were already crossing that ground to reach
more fertile territory on the Pacific, in California and Oregon (which included
present-day Washington and much of present-day British Columbia). See also
American Westward Movement: Beyond the Mississippi.
These lands were formally owned by
other countries and occupied by independent indigenous peoples. California was
part of Mexico. The Oregon country was jointly occupied (and hotly contested) by
Britain and the United States. American settlers, most of them from the Ohio
Valley, crossed the plains and poured into Oregon and the Sacramento and San
Joaquin valleys in California after 1841. As populations in those areas grew,
members of the new Mormon Church, after violent troubles with their neighbors in
Ohio, Missouri, and Illinois, trekked across the plains and the Rocky Mountains
in 1847 and settled on Mexican territory in the Salt Lake Valley.
F | The Monroe Doctrine |
The American government in these years
was expansionist. With the end of the second war between Britain and the United
States, the heated foreign policy debate that had divided Federalists and
Jeffersonian Republicans since the 1790s quieted down. In the years after 1815
most American politicians agreed on an aggressively nationalist and expansionist
foreign policy. John Quincy Adams, who served as secretary of state under James
Monroe, did the most to articulate that policy. In the Rush-Bagot Convention of
1817 he worked out an agreement with Britain to reduce naval forces on the Great
Lakes, and in the Convention of 1818 he settled the U.S.-Canadian border from
Minnesota to the Rocky Mountains along the 49th parallel. For the first time in their history,
Americans did not have to worry about an unfriendly Canada.
Americans turned their attention south
and west, and to Spain’s crumbling empire in the New World. In the Adams–Onís
Treaty of 1819, Spain ceded Florida to the United States. The treaty also
established the border between Louisiana and Spanish Texas, a border that ran
west along the Arkansas River, over the Rocky Mountains, and to the Pacific
along the present southern borders of Idaho and Oregon. Thus the treaty gave the
United States its first claim to land bordering the Pacific Ocean, although it
shared that claim with Britain.
In part, the Spanish were willing to
give up territory because they had bigger things to worry about: Their South
American colonies were in revolt, establishing themselves as independent
republics. Spain asked the European powers that had stopped Napoleon’s France to
help it stop revolutionary republicanism in Spanish America. Britain, however,
did not agree and instead proposed a joint British–United States statement, in
which both nations would oppose European intervention in Latin America and would
agree not to annex any of the former Spanish territories.
Secretary Adams answered with what
became known as the Monroe Doctrine. In it, the United States independently
declared that further European colonization in the Americas would be considered
an unfriendly act (which agreed with the British proposal). The Monroe Doctrine
did not, however, include the British clause that would have prevented
annexation of former Spanish territory. Although he had no immediate plans to
annex them, Adams believed that at least Texas and Cuba would eventually become
American possessions. At the same time, the United States extended diplomatic
recognition to the new Latin American republics. In short, the Monroe Doctrine
declared the western hemisphere closed to European colonization while leaving
open the possibility of United States expansion.
G | Manifest Destiny |
Few American migrants questioned their
right to move into Texas, Oregon, and California. By the mid-1840s expansion was
supported by a well-developed popular ideology holding that it was inevitable and
good for the United States to occupy the continent “from sea to shining sea.” Some
talked of expanding freedom to new areas. Others talked of spreading the
American ethic of hard work and economic progress. Still others imagined a
United States with Pacific ports that could open Asian markets. Before long,
some were imagining a North America without what they considered the savagery of
Native Americans, the laziness and political instability of Mexicans, or the
corrupt and dying monarchism of the British. God, they said, clearly wanted
hard-working American republicans to occupy North America. In 1845 a New York
City journalist named John L. O’Sullivan gave these ideas a name: Manifest
Destiny. It is, he wrote, “our manifest destiny to overspread the continent
allotted by Providence for the free development of our yearly multiplying
millions.”
H | Annexation: Oregon and Texas |
The new Republic of Texas asked to be
annexed to the United States as early as 1837. The governments of Presidents
Andrew Jackson and Martin Van Buren took no action for two reasons. First, the
question of Texas annexation divided the North and South. Up to the 1840s,
trans-Mississippi expansion had extended Southern society: Louisiana, Arkansas,
and Missouri were all slave states. Texas would be another, and Northerners who
disliked slavery and Southern political power imagined that the Texas territory
could become as many as 11 new slave states with 22 new proslavery senators.
Annexation of Texas was certain to arouse Northern and antislavery opposition.
President John Tyler, who supported the South, tried to annex Texas in 1844 but
was defeated by congressional Northerners and by some Southern members of the
anti-Jacksonian Whig Party. The second reason for avoiding annexation was that
Mexico still considered Texas its own territory. Annexation would create a
diplomatic crisis, and perhaps lead to war.
In the presidential election of 1844
the Whig Party nominated Henry Clay of Kentucky. Clay refused to take a stand on
the annexation of Texas. The Democrats rejected former president Martin Van
Buren, who opposed annexation, and nominated James K. Polk of Tennessee. Polk
ran on a pro-annexation platform: He would annex Texas, and he would assert
American ownership of all of Oregon’s territory disputed with Britain. Polk’s
position on Oregon was intended to reassure Northerners that expansion would
benefit them as well as the South.
This position on Oregon was, however, a
radical change from earlier policies. Previously, Americans had not claimed land
north of the 49th parallel, the present-day United States–Canada border on the
Pacific. Polk claimed all the land up to latitude 54°40’ north, the present
southern boundary of Alaska, which at the time was owned by Russia. The British,
on the other hand, claimed territory as far south as the Columbia River. After
Polk won the election, both sides sought to avoid a serious dispute; they backed
down and accepted the boundary that exists today between Washington State and
British Columbia. The compromise avoided war, but it convinced Northern
expansionists that Polk (and behind him, the Democratic Party) cared more about
Southern expansion than about Northern expansion.
I | War with Mexico |
There was ample reason for that
suspicion. While Polk compromised with Britain on the Oregon boundary, he stood
adamant against Mexico on the question of Texas. Mexico warned that it would
consider the annexation of Texas by the United States a declaration of war. A
Texas convention voted to join the Union on July 4, 1845. Polk and a Congress
strongly favoring annexation not only offered to take Texas into the Union but
also set the southern boundary of the new state at the Rio Grande—150 miles
south of what most people had agreed was the Texas–Mexico border. The new
boundary gave Texas far more Mexican land (including much of present-day New
Mexico and Colorado) than the Texas Revolution had given it. Polk knew that the
additional territory would provide a gateway to New Mexico and California,
territories of northern Mexico that he and other expansionists coveted along
with Texas. While annexing Texas, Polk offered to buy New Mexico and California
from Mexico for $30 million in late 1845—an offer that the Mexicans angrily
refused. Polk then provoked a war with Mexico in which he would win all that he
had offered to buy.
As Mexico prepared for war, Polk sent
troops into the disputed area north of the Rio Grande. Mexico sent troops north
of the Rio Grande and in spring 1846 fought a skirmish in which the Americans
suffered more than a dozen casualties. Congress declared war on Mexico that May.
Near-unanimous congressional support for the declaration hid the fact that most
Whigs and many Northern Democrats were deeply suspicious of a Southern war to
annex new territory for slavery.
In the war the Americans launched a
three-pronged offensive. General Zachary Taylor invaded northern Mexico from
Texas, capturing the city of Monterrey in September 1846. A second American army
under General Stephen Kearny occupied Santa Fe in August of that year. Kearny
then sent part of his force to join Taylor at Monterrey and marched the rest of
his army west to California, where American settlers had already established an
independent “Bear Flag Republic.” At the same time, the U.S. Navy seized
California ports.
Having lost Texas, California, New
Mexico, and large portions of Chihuahua and Sonora in northern Mexico, the
Mexicans marched toward Taylor’s army near Monterrey. Taylor held off determined
attacks by a Mexican army about three times as large as his own and won the
Battle of Buena Vista in February 1847. The next month the third prong of the
U.S. offensive was launched when General Winfield Scott landed at Veracruz. Five
months later he had fought his way to Mexico City.
As happened in much of the war, the
Mexican army was larger and fought bravely, but the Mexican government and high
command were divided and often incompetent, and the Americans were better armed
and better led. In particular, the Mexicans had no answer to American artillery.
After a series of bloody battles in September 1847, Scott’s army occupied Mexico
City, and the war was over.
The Treaty of Guadalupe Hidalgo in 1848
ceded Texas (with the Rio Grande boundary), California, and New Mexico to the
United States, which agreed to pay Mexico $15 million. The Mexican Cession gave
the United States present-day west Texas, New Mexico, Arizona, California,
Nevada, Utah, most of Colorado, and part of Wyoming. The northern third of
Mexico had become the southwestern quarter of the United States.
The Mexican War was a straightforward
land grab. The ease with which the United States won and the arrogance with
which it behaved created a distrustful and sometimes violent southern border
area for the country. More immediately, the lands ceded by the Treaty of
Guadalupe Hidalgo became the object of contest and resentment between the slave
and free states—a conflict that would widen into the American Civil War 13 years
later.
IX | SOCIAL DEVELOPMENT: NORTH AND SOUTH |
The regions of the United States that
argued about the Mexican War and its aftermath had grown in divergent ways since
agreeing to be a nation in 1788. The North had experienced a market revolution
based on commercial agriculture and the growth of cities and industry. The
South, on the other hand, remained tied to a plantation system that depended on
slave labor and international markets. The plantation system enslaved the
one-third of all Southerners who were black and excluded more and more poor
whites.
A | The Market Revolution in the North |
By the 1820s, farmers had stopped producing mainly for themselves and their
neighbors and selling only the excess on international markets. Most Northern
farms had become business operations. They
specialized in a small range of marketable crops (grain, meat, dairy products)
and sold the food they produced to an internal market made up of Americans who
had moved to towns, cities, and industrial villages.
In turn, these urbanized and
industrialized Northerners provided farmers with finished goods (hats, shoes,
cotton cloth, furniture, tools) that had previously been made in rural
households and neighborhoods or imported from Europe. With this self-sustaining
internal market, the North stepped out of the old colonial relationship in which
America produced food and raw materials for Europe (primarily Britain) in
exchange for foreign finished goods. The northern United States was no longer on
the colonial periphery of the world market economy. It was taking its place as
part of the financial and industrial center. See also Industrial
Revolution: The Industrial Revolution in the United States.
This internal market revolution would
have been impossible without dramatic improvements in transportation. After 1815
Congress repeatedly considered nationally planned and funded internal
improvements. But these plans were voted down by congressmen who favored states’
rights and a strict construction of the Constitution—the notion that Congress
could legislate only in areas explicitly granted to it by the Constitution.
State governments took up the slack by building roads and canals themselves and
by subsidizing private corporations that built them. The result was a system of
roads, canals, and—by the 1840s and 1850s—railroads that reflected no single
vision of a national system. Instead, the transportation map reflected the
ambitions of the most prosperous and active states.
The first and most spectacular example
was the Erie Canal, completed by the state of New York in 1825. It connected the
Hudson River at Albany with Lake Erie at Buffalo. The canal provided farmers in
western New York and in the sections of the Northwest that drained into the
Great Lakes with a continuous water route east to New York City—and from there
to national and international markets. Steamboats provided a similar service for
farms in areas that drained into the Ohio and Mississippi rivers. The upriver
trip from New Orleans to Louisville, Kentucky, had taken three to four months
via keelboat before 1815. Steamboats cut that time to one month. In the 1850s
railroads, although more expensive than water routes, brought the manufacturing
towns and the food–producing farmers even closer together. These improvements
quickly reduced the cost of transportation. The cost of moving farm produce and
manufactured goods over long distances fell 95 percent between 1815 and 1860.
With that drop, farmers could grow wheat in Indiana and sell it at a profit in
New York City, while New England manufacturers could make work shoes and sell
them to the farmers of Indiana. Transportation had transformed the old Northeast
and the new Northwest into an integrated market society.
B | The Growth of Cities |
In the 1820s the urban population of the
United States began growing faster than the rural population, and from 1820 to
1870 American cities grew faster than they ever had or ever would again. For the
most part, that explosive urban growth was driven by the commercialization of
agriculture.
In the early republic every American city
was an Atlantic seaport engaged in international trade. After 1820 new inland
towns and cities rose up to serve farmers’ commercial operations. The fastest
growing urban area in the country in the 1820s, for instance, was Rochester, New
York, a flour–milling and shipping center serving the farmers of western New
York. In subsequent decades western cities such as Cincinnati and Chicago grew
quickly. At the same time, towns devoted to manufacturing for rural markets
across the nation—towns such as Lowell, Massachusetts—grew at an almost equal
rate.
Even in the old seaports, the fastest-growing sectors of the economy were not
the docks and warehouses of the old mercantile economy but the neighborhoods
devoted to manufacturing for the American market and the wholesalers who
served it. The huge internal
market provided by northern and western farm families was by far the biggest
source of urban growth in these years.
C | Standards of Living |
The commercial and industrial
transformation of the North and West increased standards of living. Food was
abundant, and manufactured goods found their way into even the poorest homes.
Yet the bounty of progress was distributed much more unevenly than in the past,
and thousands made the transition to commercial–urban society at the expense of
economic independence.
As American cities grew, the nature of
work and society in the city changed in fundamental ways. In 1800 nearly all
manufacturing was performed by master artisans who owned their own workshops and
hired at most a few journeymen (wage-earning craftsmen) and apprentices. After
1815 the nature of manufacturing work changed. As production speeded up, many
masters stopped performing manual work and spent their time dealing with
customers and suppliers and keeping records. The number of journeymen increased,
and they often worked in workshops separate from the store. Increasingly,
less-skilled work (sewing together pieces of shoes, assembling ready–made
clothing from pieces cut in uniform sizes) was farmed out to women who worked in
their homes. Thus successful masters became businessmen, while most skilled men
and thousands of semiskilled women became members of a permanent working class.
Although there had been rich and poor neighborhoods in early seaport towns,
class segregation and stark contrasts between rich and poor became much more
prevalent after 1820.
In the northern and western countryside
there were signs of prosperity. Wallpaper, manufactured dishes and furniture,
and other finished goods were finding their way into most farmhouses, and paint,
ornamental trees, and flowers were dressing up the outside. Yet even in the
countryside, the distance between rich and poor increased, and the old
neighborhood relationships through which much of the local economy had been
transacted became weaker. Debt, for instance, had always been a local, informal
relationship between neighbors. After 1830 a farmer’s most important and
pressing debts were to banks, which required annual payments in cash. Commercial
society also demanded good roads to transport products, and public schools to
teach literacy and arithmetic; local taxes rose accordingly. Farmers spent less
effort maintaining necessary relations with neighbors and more effort earning
cash income to pay taxes and debts. Those who could not establish or maintain
themselves as farmers tended to move out of agriculture and into towns and
cities.
Women and men who left rural communities
to take up wage labor experienced the transition in different ways. White men,
whose citizenship and social standing had rested on being independent property
owners with patriarchal responsibilities, experienced wage labor as a
catastrophic fall from grace. Relatively few, however, ended up in factories,
and those who did took more-skilled and better-paying jobs.
Until the 1840s the factory work force of
the Northeast was made up primarily of women and children. Women who left poor
New England farms (and the crumbling patriarchy that often governed them) and
moved into factory villages valued the independence that wage labor provided
them.
D | Immigrants |
Beginning in the mid–1840s, New England’s
factory work force was increasingly dominated by Irish immigrants—refugees who
often saw factory work in America as a big improvement over famine and
colonialism back home. Much of the labor force in Northern cities and factory
towns and on the new transportation projects was composed of German and,
particularly, Irish immigrants. A trickle of Irish and German newcomers had been
coming to America since the 18th century. There were large German-speaking areas
in the mid-Atlantic states, and the Irish were sufficiently numerous and
politically active to become the targets of the Federalists’ Alien Act of 1798.
These early immigrants possessed craft or agricultural skills, and most of them,
like their British neighbors, were Protestants. A newer immigration grew quickly
after 1815, peaking in the 1840s. The new immigrants were landless peasants
driven from their homelands by famine (see Irish Famine). They took
menial, low-paying jobs in factories and as servants, day laborers, and
transport workers—replacing white women in factories and blacks in household
service and on the docks. Most of these new immigrants were Catholics, and they
arrived in such numbers that by 1850 Catholics were the largest single
denomination in the United States. They overwhelmingly sided with the Democratic
Party in politics.
Many American entrepreneurs welcomed this
new supply of cheap labor. But militant Protestants and many native-born working
people perceived the immigrants as a cultural and economic threat. Arguments
over immigration would shape Northern politics for more than a century after
1830.
E | Northern Blacks |
As the North passed gradual emancipation
laws, freed slaves moved toward cities. In 1820 African Americans made up about
one-tenth of the populations of Philadelphia and New York City. They were
excluded from white churches and public schools and, increasingly, from the
skilled crafts, dock labor, and household service at which they had been
employed. Attacks on individual blacks were routine, and occasionally
full-blown race riots erupted—in Cincinnati in 1829 and in New York and
Philadelphia in 1834, for instance. African Americans responded by building
their own institutions: Methodist and Baptist churches, Masonic lodges, schools,
charitable and social organizations, and newspapers. It was from within this web
of institutions that they protected themselves and eventually demanded freedom
for Southern slaves. See also African American History: Free Black
Population.
F | The Market Revolution in the South |
The South experienced a market revolution
of a different kind. In the years leading to the American Civil War, the South
provided three–fourths of the world’s supply of cotton, which was a crucial raw
material for the international industrial revolution. In the same years, cotton
accounted for one–half to two–thirds of the value of all American exports,
contributing mightily to a favorable balance of trade. The plantation was a
business of worldwide significance, and the cotton boom made thousands of
planters rich. At the same time, however, the South’s commitment to plantation
agriculture stunted other areas of its economy, opened the region to intense
international criticism over slavery, and led ultimately to political and
economic disaster.
Plantation agriculture led to an
undemocratic distribution of wealth among whites. The plantation economy
rewarded size: Big farms were more profitable than small ones. Successful
planters were able to buy more slaves and good land, depriving less-successful
planters of these benefits and concentrating wealth in fewer and fewer hands. In
1830, 35 percent of Southern households included slaves. By 1860 the figure
stood at 26 percent, with fewer than 5 percent of white households owning 20 or
more slaves. Most whites lacked the fertile land, the slave labor force, and
the access to transportation that would have brought them into the market
economy. Together with the slaves, these whites formed a huge majority of
Southerners who had minimal ties to the market and who bought few manufactured
goods.
The result was that the South remained in
a colonial trade position in relation to Britain and, increasingly, to the
northeastern United States. Without regional markets, there was very little
urbanization or industrialization in the South. Southern states financed few
internal improvements: Plantations tended to send goods to markets via the river
system, and smaller farmers preferred low taxes and unobtrusive government to
roads and canals. The few Southern cities and large towns were ports on the
ocean or on the river system. These cities were shipping centers for cotton
exports and for imports of manufactured goods. Manufacturing, shipping, banking,
insurance, and other profitable and powerful functions of the market economy
stayed in London and—increasingly—in New York.
F1 | Changes in Slavery |
During the cotton boom, slaveholders
attempted to organize plantation slavery as a paternalistic system in which the
planter exercised a fatherly authority in every area of slaves’ lives. Some
evidence suggests that discipline of slaves became more strict and systematic in
the second quarter of the 19th century, and that whippings and other forms of
physical punishment persisted. The brisk interstate slave trade often destroyed
family and community ties among slaves. At the same time, however, the food
eaten by slaves improved, and more slave families lived in individual cabins
than had in the past. After 1830, masters who had participated in Baptist and
Methodist revivals (and who had been frightened by a bloody Virginia slave
revolt led by Baptist preacher Nat Turner) provided religious instruction to
their slaves. The goal of these changes, proudly stated by the planters, was
to make slaves dependent on their masters not only economically but also
emotionally.
For their part, slaves learned to put
the masters’ paternalistic efforts to their own uses. They accepted the food and
housing, listened to the preachers, endured the labor discipline, and then made
their own lives within slavery. Slave family forms, for instance, were a mix of
the European nuclear model and African matriarchy and village kinship, shaped by
the limits imposed by slavery. And while they became Christians, slaves
transformed Christianity into a distinctly African American faith that served
their own spiritual interests. In particular, Moses the liberator (not the
slaveholders’ patriarchal Abraham) was the central figure in slave
Christianity.
F2 | Growing Isolation of the South |
The slave–based plantation economy of
the South was economically successful: Planters were making a lot of money. But
in the long term, Southern commitment to slavery isolated the region morally and
politically and led to disaster because most other white societies were coming
to brand the institution as barbaric.
Northern states abolished slavery soon
after the revolution. Slaves in Haiti revolted and formed an independent black
republic in 1804 (see Haitian Slave Revolt). Four years later the British
(whose navy controlled the oceans) outlawed the African slave trade. In ensuing
years, the Republic of Colombia, or Gran Colombia (present-day Venezuela,
Ecuador, Panama, and Colombia), Mexico, Peru, Chile, and other mainland colonies
won wars of independence against Spain. Each of the new South and Central
American republics outlawed slavery. Finally, the British Parliament emancipated
slaves on British islands in the Caribbean in 1833. By then Brazil, Cuba, and
the southern United States were the only remaining large-scale slave societies
in the world. Southern slavery was producing profits for the masters, and
political and moral isolation for the region.
X | JACKSONIAN DEMOCRACY |
A | Transforming Democracy |
After 1815 Americans transformed the
republic of the Founding Fathers into a democracy. State after state revoked
property qualifications for voting and holding office—thus transforming
Jefferson’s republic of property holders into Andrew Jackson’s mass democracy.
Democracy, however, was not for everyone. While states extended political rights
to all white men, they often withdrew or limited such rights for blacks. As part
of the same trend, the state of New Jersey took the vote away from propertied
women, who formerly had possessed that right. Thus the democratization of
citizenship applied exclusively to white men. In the mid–19th century, these men
went to the polls in record numbers. The election of 1828 attracted 1.2 million
voters; that number jumped to 1.5 million in 1836 and to 2.4 million in 1840.
Turnout of eligible voters by 1840 was well over 60 percent—higher than it had
ever been, and much higher than it is now.
At the same time, however, popular
political activity other than voting declined. Judging by available evidence,
state and national governments received fewer petitions than in the past, and
they paid less attention to the ones they received. In the 1830s, when Congress
received hundreds of antislavery petitions, it simply refused to read them.
Petitioning, parading, and mobbing (each of which included Americans who were
not white males) had all been crucial to the American Revolutionary movement,
and they had continued to play important roles in Jeffersonian America. By the
1830s and 1840s, spontaneous parades and mob actions played smaller roles in
political life, and more-respectable citizens viewed such activities as
disorderly and criminal. Popular participation in politics was more and more
limited to voting.
Furthermore, voting was organized not by
the voice of the citizenry, but by a national two–party system staffed by
disciplined professionals. These professionals included candidates, appointed
office holders, newspaper editors, and local leaders who organized voters, wrote
party platforms, and developed party ideologies in ways that only partially and
indirectly reflected popular wishes. Thus political participation was
democratized by the 1830s. But democracy included only white men, and even they
were transformed from citizens to spectators.
B | Origins of the Party System |
Neither the Jeffersonians nor their
Federalist opponents admitted to being a political party. To them the term
party meant the same as faction. It also meant the victory of
selfishness and contention over the selfless unanimity they felt a republic
needed.
However, two events caused important
politicians to reconsider the value of parties. First, the Panic of 1819, an
economic downturn, introduced Americans to the cycle of boom and bust that
would come to characterize the new market economy throughout the 19th century.
Some Jeffersonians blamed the panic on the Bank of the United
States, which had been rechartered in 1816. They argued that if the disciplined
coalition of Southern and Western farmers that had elected Jefferson had still
been in place in 1816, Congress would not have rechartered the bank and the
panic would not have happened.
The second event that caused politicians
to reconsider the value of political parties was Missouri Territory’s
application for admission to the Union in 1818. Missouri’s proposed constitution
allowed slavery, and that provision caused heated argument in Congress,
revealing angry differences between representatives of slave states and free
states. Congress ultimately compromised, balancing the new slave state of
Missouri by admitting Maine as a free state (see Missouri Compromise).
Congress then declared that slavery would be allowed in the Louisiana Purchase
territories south of a line drawn west from the southern border of Missouri.
Slavery would be banned north of that line. The immediate crisis was solved, but
the fault line between slave and free states remained open.
The same politicians (Martin Van Buren of
New York was the most active of them) who opposed the Bank of the United States
also argued that Jefferson’s coalition of slaveholding and nonslaveholding
farmers would never have permitted the dangerous, divisive question of slavery
to get into congressional debate. They organized a disciplined coalition for
states’ rights and limited government that supported Andrew Jackson for the
presidency in 1828. That coalition became the Democratic Party.
In the 1820s, many politicians had come to
believe that organized parties were essential to democracy. Parties gave
ordinary men the power to compete with the wealth, education, and social
connections of traditional leaders. Parties also created disciplined
organizations that could control political debate.
C | Democrats and Whigs |
Beginning with Jackson’s administration,
the Democrats were opposed by the Whig Party. The Whigs were led by Henry Clay
of Kentucky, Daniel Webster of Massachusetts, and others who called for an
active national government and who had a nationalist answer to the growing
problem of slavery.
The Whigs proposed what they called the
American System. They wanted a high tariff that would protect Northeastern
factories from European competition while it generated revenue for the national
government. They proposed high prices for government land in the West—a policy
that would slow westward movement and that also would increase federal revenue.
They insisted that the Bank of the United States be maintained to stabilize
currency and to discipline smaller banks. And they wanted to use the money that
tariffs and the sale of lands would give the government to build and maintain
roads and other internal improvements.
The result, they promised, would be a
society with a national market under a government that fostered prosperity and
order. At the same time, the national character of the Whig economy would
discourage arguments among the three sections of the nation—the Northeast, the
South, and the West. The Northeast would manufacture goods for the South and
West. The South would supply cotton to Northeastern factories, and the West
would provide food for both the South and the Northeast. The prosperity of each
section would depend on friendly relations with the other two, and none of them
would want to bring up the divisive question of slavery.
Andrew Jackson and his Democratic
successors proposed to limit the role of government in the market revolution and
in resolving the tensions among the sections. They wanted to abolish the Bank of
the United States, set tariffs at low levels, sell government land at low rates,
and leave the question of internal improvements to the states.
Democrats hoped to create a national
government that never meddled in local affairs (one of the most important of
those affairs being slavery), that played no favorites, and that kept taxes low.
On the question of slavery and states’ rights, Jacksonians favored minimal
central government within a permanent union. When South Carolina threatened the
Union by attempting to nullify the protective tariff of 1828 (Southerners termed
it the Tariff of Abominations because it penalized Southern states that exported
cotton and imported Old World manufactured goods), Jackson threatened South
Carolina with a federal invasion (see Nullification). At the same time,
he let Southerners know that slavery was safe as long as a Democratic Party
committed to states’ rights was in power. Even more than the Whigs, the
Democrats were committed to avoiding any congressional debate that could
possibly affect slavery.
In the 1830s and 1840s Democrats and Whigs
built the most completely national two–party system that Americans have ever
had—both parties relied on support from all sections of the country, and both
were evenly matched in most states. Within that system, politicians knew that
arguments between the North and South must be avoided. Such arguments would,
first of all, split the Whig and Democratic parties in which politicians were
making their careers. Second, and more dangerous, the breakdown of the national
two–party system could realign the parties along North–South lines and focus
national politics on the differences between the North and South. Political
leaders feared that such a breakdown could lead ultimately to disunion and
perhaps civil war. Most historians agree that the national party system’s
eventual breakdown was a crucial cause of the American Civil War
(1861-1865).
D | Social Reforms |
In the second quarter of the 19th century
Americans built a number of institutions and social movements dedicated to
improving the morals of individuals and of society in general. The most
prominent reformers were Northern, middle–class Whigs who had been influenced by
evangelical revivals in the 1820s and 1830s. Those revivals taught an ethic of
improvement: Sin and disorder, they said, were not inevitable results of Adam’s
fall (as described in the Bible). They were the results of bad choices made by
free and morally accountable men and women. Beginning in the 1820s, these
middle–class evangelicals proposed reforms that would teach Americans to make
good moral choices and thus, one individual at a time, improve society and
perhaps make it perfect.
D1 | Schools |
The most pervasive and enduring result
of these movements was a system of tax–supported public schools. The great
school reformers were Northern Whigs such as Horace Mann of Massachusetts and
Calvin Stowe of Ohio (husband of novelist Harriet Beecher Stowe). They proposed
public school systems that were centralized at the state level and that made
attendance mandatory. These schools were geared to teaching patriotism, manners,
and civility, along with reading, writing, and arithmetic.
Among Whig reformers, the goal of public
schools was to build character in individual students. Ultimately, reformers
wished to make a perfect society by filling it with perfect individuals.
Democrats supported the schools, but saw them as a means of providing equal
opportunity to all whites. Democrats, and Southerners from both parties, also
tended to support local control over schools, to favor shorter school years, and
to make efforts to keep taxes low.
D2 | Prisons |
A second institutional reform was
concerned with prisons and asylums. Northern Whig evangelicals proposed new
forms of prisons that were meant less to punish the bodies of criminals (through
whippings, incarceration, and execution) than to improve their souls.
Pennsylvania built a prison in which convicts sat alone in their cells with only
Bibles to keep them company. Most other states adopted the Auburn System, which
took its name from a pioneering prison in New York. Under this system, prisoners
slept in solitary cells but worked in groups—although a policy of absolute
silence was enforced. The products of prison workshops were sold to outside
markets. Whigs favored this system because it promised to rehabilitate criminals
by teaching them personal discipline and respect for work, property, and other
people.
D3 | Temperance |
The largest and most sustained organized
social movement in American history was the temperance crusade against the use
of alcohol that began in the 1820s. Again, Northern Whig evangelicals took the
lead. They argued that alcohol abuse, and the violence and personal and social
disintegration that accompanied it, had gotten out of control. In fact, per
capita alcohol consumption, which had grown steadily since the 1790s, was at an
all–time high in the 1820s.
Middle–class evangelicals assumed that
poverty, crime, family violence, poor child rearing, and almost every other
social ill was traceable to heavy drinking. A sober citizenry, they argued,
would result in a society free of crime and violence, filled with happy homes
and quiet streets. In the 1840s working people formed their own temperance
movement—first through the Washingtonian Temperance Society, and then through
temperance lodges. Members of both groups turned in the 1850s to campaigns for
statewide prohibition. Beginning with Maine in 1851, 13 states had adopted
legislation outlawing alcohol by 1855. Of those states, all but Delaware were
in the North.
E | Radical Reform |
The great belief of Northern middle–class
evangelicalism—a belief behind most middle–class reform—was that human nature
was not irreparably damaged by original sin. In the 17th and 18th centuries
Protestants had been certain that nearly all of mankind was damned. Only a few
would be saved, they believed, and those only by the arbitrary grace of God, not
by their own efforts. These Protestants also thought that most human beings were
incapable of correct moral behavior unless they were coerced.
In the first half of the 19th century,
most Americans—including nearly all Southern whites—continued to believe that
people were morally defective. Coercive institutions, such as the patriarchal
family and slavery, were necessary to impose order on naturally disorderly
people. Northern middle-class evangelicalism promoted the belief that human
beings could change. Evangelists preached that women and men were moral free
agents who could give themselves to God and thus escape a life of sin.
In this view of human nature, institutions
that hindered individual freedom were unnecessary. Such institutions prevented
men and women from assuming responsibility for themselves, thus making it
impossible for them to freely give themselves to God and to a life of Christian
love and moral improvement. The implications of this view were individualistic
and anti–institutional. Some rejected all human government, and a few of these
joined utopian communities such as the one at Oneida, New York,
which practiced a form of free love to remove elements of power from relations
between men and women (see Oneida Community). Others, including many
utopians, became radical feminists and abolitionists.
E1 | Women’s Rights |
The delegates to the first Women’s
Rights Convention at Seneca Falls, New York, in 1848 were veterans of other
reforms. They were members of missionary societies, of the temperance movement,
and of the moral reform crusade (a movement to end prostitution, obscenity, and
lewdness). Most of all, they were veterans of an antislavery movement that
attacked patriarchy and hierarchy in all forms. Many applied the logic of social
reform to themselves, and they began to think of themselves as human beings
first and as women second. See also Women’s Rights: Early Struggles
for Equal Rights in the United States.
Noting that Jesus made no distinction
between the proper duties of women and men, delegates to the Seneca Falls
convention attacked the subordinate status of women. Beginning with a manifesto
based on the Declaration of Independence, women at Seneca Falls demanded civil
and legal equality for women. In particular, they wanted the right to vote. In
the American republic, political participation, they argued, separated people
who counted from those who did not.
E2 | Abolition |
The logic of Northern social reform
applied more clearly to slavery than to nearly any other habit or institution.
From the beginning, slaves resisted their own enslavement. In the 18th century,
Quakers and a few other whites opposed the institution. The American Revolution,
with its rhetoric of universal natural rights, called slavery into serious
question. Northern states abolished it, and Southern evangelicals, along with
some of the leading slaveholders of the upper South, thought about liberating
the slaves. After 1816 the American Colonization Society proposed to
“repatriate” freed slaves to Africa, although the intent of this organization
was less to liberate slaves than to deport free blacks. Free blacks understood
that, and most of them opposed returning to Africa.
But it was not until the 1830s that
significant numbers of middle–class Northerners began to agitate for the
immediate emancipation of slaves and for their incorporation as equals into the
republic. Like other social reforms, abolitionism took root among the most
radical Northern Whigs, and it was based in the middle–class revivals of the
1820s and 1830s.
In 1831 William Lloyd Garrison, a Boston
abolitionist, published the first issue of The Liberator, an antislavery
newspaper. In 1833 Garrison helped form the American Anti–Slavery Society. The
new movement included Northeastern Quakers and Unitarians and Northern
blacks.
Abolitionism’s largest base of support,
however, was among the evangelical middle class of New England, upstate New
York, and the Old Northwest (the former Northwest Territory). These people lived
in a reform culture that saw moral free will and Christian love as pitted
against brutality and power. As the New England Anti–Slavery Society put it in
1833, antislavery “means, finally, that right shall take the supremacy over
wrong, principle over brute force, humanity over cruelty, honesty over theft,
purity over lust, honor over baseness, love over hatred, and religion over
heathenism.” It was in such stark opposites that evangelical reformers viewed
the world.
Sometimes working with white
abolitionists, sometimes working independently, Northern free blacks also
demanded freedom for the slaves. Hundreds of anonymous women and men operated an
Underground Railroad that hid escaped slaves, often smuggling them to Canada.
Along with pamphleteer David Walker, orator and editor Frederick Douglass, and
uncompromising mystic Sojourner Truth, they formed a dedicated wing of the
antislavery movement.
Abolitionists knew that they were a
minority. They also knew that the two parties—Democrats in particular—wanted to
keep moral and sectional questions out of politics and would try to ignore the
abolition movement. They decided to attack the party system as well as slavery.
They organized a postal campaign in 1835, sending massive amounts of antislavery
literature through the mails. Southerners and most Northerners branded this
literature as dangerous, and the Democratic administration could not avoid the
issue.
In the next year, abolitionists began
sending hundreds of petitions to Congress. Some of the petitions were against
annexation of slaveholding Texas; others demanded the abolition of slavery in
the District of Columbia or the end of the interstate slave trade. Each of these
issues was within the constitutional sphere assigned to Congress.
In the process of building these
campaigns, abolitionists turned themselves into an organized movement. They also
urged the national government to debate slavery—something that most congressmen
from both sections and both parties wanted to avoid. President Andrew Jackson,
rather than formally censor the mail, simply allowed local postmasters to
destroy mail that they considered dangerous. And Democrats in Congress, with
help from Southern Whigs, devised a gag rule whereby Congress tabled antislavery
petitions without reading them. At the same time, Northern mobs attacked
abolitionists and their sympathizers as threats to racial purity and social
order.
These measures gave abolitionists what
many of them had wanted: They tied the defense of slavery to assaults on free
speech and the right of petition. No less a figure than ex-president John Quincy
Adams, who had returned to government as a congressman from Massachusetts, led
the fight against the gag rule. It was a fight that convinced many Northerners
that Southern slavery corrupted republican government and threatened Northern
civil liberties. Beginning as a tiny radical minority, abolitionists had helped
force the nation to confront the troublesome problem of slavery.
XI | COMING OF THE CIVIL WAR |
A | An Overview |
As early as the Constitutional
Convention of 1787, American leaders had known that they could not settle the
differences between the states committed to slavery and those that were not. The
three–fifths rule, the constitutional promise not to halt the international
slave trade until 1808, and the banning of slavery in the Northwest Territory
were all attempts to avoid confronting differences between the North and
South.
Some Northerners thought Southerners
would recognize the inefficiency of slavery and end it voluntarily—a hope that
was dashed by the cotton boom and the South’s recommitment to slavery. Many
Southerners thought that an agrarian coalition uniting the South and West could
keep Northeastern commercial interests from running the country. They realized
that hope when a South–West coalition elected Thomas Jefferson president in
1800.
But by the 1830s the market revolution
had tied Northeastern factories and Northwestern farms into a roughly unified,
commercialized North. Most Northerners were committed to free–market capitalism,
individual opportunity, and free labor, and many contrasted what they believed
to be the civilizing effects of hard work and commerce with the supposed
laziness and barbarism of the slave South. For their part, white Southerners
began to see themselves as a beleaguered minority.
Following the 1819 crisis over statehood
for Missouri, a national two–party system developed, and both parties worked to
prevent sectional differences from becoming the focus of politics. They were
successful until the Mexican War gave the United States huge new territories.
Territorial questions had to be handled by Congress, and the question of whether
slavery would be allowed into lands ceded by Mexico immediately became the
all–consuming issue in national politics. By the mid–1850s the old party system
was in ruins. An antislavery Republican Party became dominant in the North and
elected Abraham Lincoln president in 1860. With an antislavery party in control
of the White House, slave states seceded beginning in December 1860. The Union
refused to let them go, and the Civil War began.
B | The Wilmot Proviso |
Both the North and the South saw the
issue of slavery in the territories as a simple question of right and wrong, but
the issue traveled through elaborate twists and turns from 1846 through the
beginning of the Civil War.
Many Northern Democrats in Congress were
disappointed with President James K. Polk (1845-1849). Some represented
market–oriented constituencies that supported a moderately protective tariff and
federal internal improvements. Polk was a Southerner and an old Jacksonian, and
he opposed both of those measures. Northern Democrats also disliked Polk’s
willingness to compromise with the British on expansion into Oregon, while he
went to war with Mexico over Texas. It looked to many Democratic Northerners as
though the Democratic Party was less interested in the expansion of the agrarian
republic than in the expansion of slavery.
Among these Democrats was Congressman
David Wilmot of Pennsylvania. In 1846, during the war with Mexico, he proposed
what became known as the Wilmot Proviso, banning slavery from all territory
taken from Mexico. In subsequent years the proviso was repeatedly attached to
territorial legislation. In the House, combinations of Northern Whigs and
Democrats passed it several times, but the proviso was always stopped in the
Senate. The Wilmot Proviso would become the principal plank in the platform of
the Republican Party.
President Polk and his cabinet favored
extending the Missouri Compromise line west to the Pacific, a solution that
would allow slavery in the New Mexico Territory and in Southern California, but
ban it from Colorado, Utah, Nevada, and Northern California. Neither the North
nor the South favored Polk’s solution. In 1848 Lewis Cass, the Democratic
candidate for president, proposed allowing the residents of individual
territories to decide the question of slavery for themselves—a solution that
became known as popular sovereignty.
Again, there was too little support. While the Wilmot Proviso stood as the
extreme Northern position, John C. Calhoun, a senator from South Carolina, staked
out an extreme position for the South. Slaves, he said, were property, and
masters could carry their slaves into any territory of the United States.
C | The Compromise of 1850 |
Although no proposed solution was
acceptable to all sides, the question of slavery in the territories could not be
postponed. In 1848 gold was discovered in California, and thousands of Americans
rushed to the region (see Gold Rush of 1849). The previous year, Brigham
Young had led Mormon settlers to the Salt Lake Valley, in what became the
northeastern corner of the Mexican Cession in 1848. At the same time,
slaveholding Texas claimed half of New Mexico. It was at this point that
politicians proposed a series of measures that became known as the Compromise of
1850. California was admitted as a free state. The remainder of the land taken
from Mexico was divided into Utah and New Mexico territories and organized under
popular sovereignty. The Texas claims in New Mexico were denied. The slave trade
(but not slavery) was banned in the District of Columbia, and a stronger
fugitive slave law went into effect. These measures resolved the question of
slavery in the territories in ways that tended to favor the North, then enacted
additional measures important to both antislavery and proslavery forces. The
compromise was less a permanent solution than an answer to an immediate crisis.
It would satisfy neither section. One historian has called it the Armistice of
1850.
D | The Fugitive Slave Law |
The one element of the Compromise of 1850
that explicitly favored the South was the Fugitive Slave Law. A federal law of
1793 required that slaves who escaped to a free state be returned if the master
could offer proof of ownership to a state court. The new law turned these cases
over to federal commissioners, and it denied a captured slave the right to
testify in his or her own behalf or to be tried before a jury. The law violated
Northerners’ notions of states’ rights, it infringed on civil liberties in the
North, and it turned Northerners into direct participants in Southern slavery.
Northern citizens, even those who had not previously opposed slavery, refused to
support the law. While some hid fugitives or helped spirit them into Canada,
nine Northern states passed personal liberty laws that forbade state officials
from helping to enforce the Fugitive Slave Law. In 1852 Harriet Beecher Stowe
published a sentimental antislavery novel, Uncle Tom’s Cabin, as a direct
challenge to slavery in general and the Fugitive Slave Law in particular. It
sold 300,000 copies that year, and 1.2 million by summer 1853.
E | The Kansas–Nebraska Act |
The Compromise of 1850 created a
smoldering truce that lasted only a few years. By 1853 settlers had moved west
of Missouri into what is now Kansas. Congress drew up legislation organizing the
remaining federal lands in the Louisiana Purchase into the Kansas and Nebraska
territories. Under the Missouri Compromise, none of this land was open to
slavery. But Southerners, along with 15 of 20 Northern Democrats in the Senate,
organized the new territories under popular sovereignty: The new states could
decide for themselves whether or not to allow slavery. The Kansas–Nebraska Act
thus abolished the Missouri Compromise line and enacted popular sovereignty—a
measure that was becoming the Democratic Party’s answer to the question of
slavery in the territories.
Northern Whigs in Congress all voted
against the act, leading Southern Whigs to leave the party and join the
Democrats. At the same time, many Northern Democrats openly opposed the
legislation. Thus the Democratic Party shifted and became more overtly Southern,
while Northern Whigs and many Northern Democrats joined coalitions that in 1854
became the Republican Party, exclusively Northern and antislavery. Political
parties were reorganizing along sectional lines.
F | Bleeding Kansas |
With the territory organized under
popular sovereignty, voters would decide the question of slavery in Kansas.
Antislavery settlers flooded the territory, and in response, proslavery
Missourians moved in. When elections were held for the territorial legislature
in March 1855, about 5,000 Missourians crossed the border to vote illegally for
proslavery candidates. The resulting legislature legalized slavery in Kansas.
Antislavery forces refused to accept these results. They organized a convention
that wrote an antislavery constitution and they elected their own
legislature.
While this controversy raged in Kansas,
Charles Sumner, an antislavery senator from Massachusetts, gave an impassioned
antislavery speech in which he insulted a number of Southern senators. He said
that one of them, Andrew Butler, “had chosen a mistress to whom he has made his
vows … the harlot, Slavery.” Congressman Preston Brooks of South Carolina was
Butler’s nephew. He was determined to punish Sumner’s attack upon his family’s
honor. He walked onto the floor of the Senate, found Sumner at his desk, and
beat him unconscious with a cane. White Southerners almost unanimously applauded
Brooks, while Northerners ranted against Southern savagery. At almost the same
time as the attack on Sumner, in May 1856, proslavery Kansans attacked an
antislavery stronghold at Lawrence. In retribution, an antislavery fanatic named
John Brown murdered five proslavery settlers in what became known as the
Pottawatomie Massacre. A small–scale civil war was being fought in Kansas.
G | The Dred Scott Case |
At this point the Supreme Court, with a
Southern majority among the justices, tried to settle the problem of slavery in
the territories. It chose the Dred Scott case to do so. Scott was a slave
owned by a U.S. Army doctor who had brought him to the free state of Illinois
and the Territory of Wisconsin, which was free under the Missouri Compromise.
Scott sued for his freedom on that basis.
The Supreme Court answered with a
powerful proslavery decision in 1857. First, the majority stated that blacks
(whether free or slaves) could not be citizens of the United States. As a
result, Dred Scott’s case should never have entered the federal courts. The
court went on to declare that the Missouri Compromise was invalid because
Congress had no right to exclude slaves (who were legal property and therefore
protected under the Constitution) from any territory. With that, the Supreme
Court had adopted the extreme Southern position on the question of slavery in
the territories, and declared the policy of the Republican Party and of a
majority of Northerners unconstitutional.
Meanwhile, Kansas submitted two
constitutions in its application for statehood—one that permitted slavery and
one that did not. President James Buchanan, a Northern Democrat and a solid
supporter of the South, sent the Lecompton (proslavery) Constitution to Congress
with a strong recommendation that it be accepted. In a congressional debate that
at one point broke into a fistfight, enough Northern Democrats finally defected
from their party to reject the Lecompton Constitution. The controversy deeply
divided the Democratic Party in the North and made the election of an
antislavery Republican as president in 1860 very likely.
H | The Election of 1860 |
The breakup of the party system produced
four presidential candidates in the election of 1860. The Democratic Party split
angrily into Northern and Southern wings. Southern Democrats nominated
Buchanan’s vice president, John C. Breckinridge of Kentucky, while Northern
Democrats chose Senator Stephen A. Douglas of Illinois. What remained of the
Whigs renamed themselves Constitutional Unionists and nominated Senator John
Bell of Tennessee (see Constitutional Union Party). Republicans passed
over better–known candidates and nominated Abraham Lincoln of Illinois.
Lincoln had become known nationally when
he staked out the Republican position on slavery in the territories and held his
own in a series of public debates in a Senate race with Douglas in 1858. He was
also known for a speech in which he stated that the United States could not long
endure as a “house divided” between Northern free–labor capitalism and Southern
slavery. On the crucial question of slavery in the territories, Lincoln assured
the South that no president could constitutionally dismantle the institution in
the states. But he would preserve the territories for free labor, thus putting
slavery “in the course of ultimate extinction.”
The election results were starkly
sectional. Breckinridge carried 11 slave states, including the entire Deep
South. Bell carried a few
Upper South states. Douglas, while coming in second in the popular vote, won
only in Missouri and a part of New Jersey. Lincoln carried every Northern state
and thus won an overwhelming victory in the Electoral College—and he did so
without a single electoral vote from a slave state. The Republican Party, with
an antislavery platform and an entirely Northern constituency, had elected a
president of the United States. No possible new coalition would enable the South
to keep that from happening repeatedly.
XII | THE CIVIL WAR |
A | The South Secedes |
White Southerners fully realized what
had happened: National politics now pitted the North against the South, and the
North had a solid and growing majority. The South would never again control the
federal government or see it controlled by friendly Northerners. Many saw no
alternative to seceding from the Union.
Southerners justified secession with
what was called the compact theory. This theory held that the
Constitution had created not a perpetual union but a compact between independent
states that retained their sovereignty. The compact could be broken in the same
way that it had been created: with state conventions called for that purpose. By
this means South Carolina seceded from the Union in late December 1860. By
February 1 (before Lincoln’s inauguration) six more states from the Deep South
had left the Union.
Northerners—including President
Buchanan, Stephen Douglas, and other Democrats—denied the right of secession.
The more lawyerly among them reminded the South that the Constitution was
written “to form a more perfect Union” than the Articles of Confederation,
which had declared that “the Union shall be perpetual.” A union made more
perfect could hardly be less than perpetual; thus secession was a legal
impossibility. And in practical terms, Northerners argued, secession
would be a fatal disaster to the American republic. Republics had a history of
splitting into smaller parts and descending into anarchy. Secession, Lincoln
argued, was revolution. Many Southerners agreed and claimed that they were
exercising their sacred right to revolt against oppressive government.
Congress tried to come up with
compromise measures in early 1861, but there was no way of compromising in the
argument over secession. The seven states of the lower South (South Carolina,
Mississippi, Florida, Alabama, Georgia, Louisiana, and Texas) formed themselves
into the Confederate States of America. Their Constitution was nearly identical
to the Constitution of the United States, although it affirmed state
sovereignty, guaranteed slavery, and limited the president to a single six–year
term.
In his inaugural address, Lincoln was
conciliatory without compromising on secession. He also hinted that the national
government would use force to protect military garrisons in the Confederate
states—in particular, the garrison in Charleston Harbor, South Carolina, which
had abandoned Fort Moultrie for the stronger Fort Sumter. When Lincoln tried to
resupply Fort Sumter in April 1861, Confederate forces fired thousands of
artillery rounds into the fort, forcing its surrender. With that, the Civil War
began.
With the beginning of the war, Virginia,
North Carolina, Tennessee, and Arkansas seceded and joined the Confederacy.
Unionist legislative majorities kept the remaining slave states of Maryland,
Kentucky, Delaware, and Missouri from joining the rebel states. Meanwhile, when
Virginia seceded from the Union, its western counties broke away from that
state and became the new state of West Virginia. Thousands of men from these
border states, however, traveled south and joined the Confederate Army.
B | North vs. South |
On paper, the North possessed
overwhelming military superiority over the South. The North had a free
population of about 22 million. The South had a population of 9 million,
including almost 4 million slaves. The North was a modern industrial power; the
South was overwhelmingly rural. The North possessed nine–tenths of the nation’s
industrial capacity, four–fifths of its bank capital, and three–fourths of its
taxable wealth. The North financed 60 percent of its war effort through the sale
of bonds in its prosperous region. Its paper currency inflated by only 80
percent during the whole war. The South, on the other hand, had to finance the
war by printing paper money that inflated 9,000 percent in four years.
Yet the South had advantages as well. To
succeed, the South did not have to invade and conquer the North. The South had
only to prevent the North from invading and conquering the Confederacy. In a
similar situation during the American Revolution, the British had far greater
military superiority over the Americans than the Union possessed over the
Confederacy, but the British failed to subdue the American revolutionaries. Many
predicted that the Union would fail as well. The South had only to prolong the
war until the North gave up and went home. In addition, the South’s economic
backwardness was an advantage: Northern armies had to operate in hostile
territory in which transportation and communications were very difficult.
Finally, improved weapons (most notably rifled muskets that were accurate at
more than 300 yards) gave a lethal advantage to entrenched defenders over
opponents who attacked them across open ground. Union soldiers did most of the
attacking.
Differing objectives of North and South
and the topography of the contested ground helped determine the nature of the
war. In the west, Northern armies used the Mississippi and Tennessee rivers
(navigable streams that ran into the South) to capture Confederate territory and
to control the river system. By the spring of 1863 the Union controlled all of
the Mississippi River except a Confederate stronghold at Vicksburg, Mississippi.
That city fell in July, and the Confederacy was cut in half (see Campaign
of Vicksburg).
In northern Virginia, however, the South
defended terrain cut by Chesapeake inlets and east–west rivers that Union
armies had to cross. In this theater the South also had General Robert E. Lee, an almost
mystically skilled commander who constantly outthought his attackers and forced
them to assault him under bad conditions. On two occasions, Lee invaded Northern
territory. He suffered defeats at the Battle of Antietam (in Maryland) in 1862
and the Battle of Gettysburg, Pennsylvania, in 1863. For the remainder of the
war he fought defensively. General Ulysses S. Grant took control of the Union
Army opposed to Lee in early 1864 and attacked Lee that spring. In horrific
battles at the Wilderness, Spotsylvania, and Cold Harbor (all in northern
Virginia), Grant took heavy casualties before trapping and besieging Lee at
Petersburg, south of Richmond, Virginia.
At the same time, Union General William
Tecumseh Sherman marched from Chattanooga, Tennessee, to Atlanta, Georgia. After
a monthlong siege, he captured and burned Atlanta. While Atlanta and Petersburg
were besieged, Northern voters reelected Lincoln in 1864 in an election that was
regarded as a referendum on the war. The South had succeeded in avoiding defeat
and in turning the contest into a war of attrition. But it had not, as
Southerners had hoped, broken the North’s will to continue fighting.
While Grant and Lee faced each other at
Petersburg, Sherman left Atlanta and set out across country to Savannah,
Georgia, destroying everything in his path that was of military value and much
that was not (see Sherman’s March to the Sea). Sherman then turned north,
burned the South Carolina capital at Columbia, and kept moving into North
Carolina. Before Sherman could join Grant, Lee’s army fled Petersburg. Grant
caught the retreating army at Appomattox, and Lee surrendered. At a cost of 360,000 Union dead
and 260,000 Confederate dead, the United States had been preserved.
C | The Emancipation Proclamation |
At first, the Union and the Confederacy
fought only over the question of secession. The leaders of both sides wanted to
avoid talking about slavery—which all of them knew was the root cause of the
war. Southerners did not present the war as a defense of slavery for two
reasons. First, most white Southerners owned no slaves and might not fight to
protect slavery. Second, the South was trying to win recognition and help from
Britain and France—neither of which would have supported a war for slavery. The
North included many abolitionists, but it also included Democrats and
border–state slaveholders who would fight for the Union but not for
abolition.
As the war dragged on, however, even
Northern anti–abolitionists came to see slaves as a great economic and military
asset for the South. Slaves grew most of the South’s food and performed work
that freed white Southerners for military service. At the same time, thousands
of slaves made the issue of slavery unavoidable by abandoning their masters—even
those in the border states who were Unionists—and fleeing to Union lines. Union
Army commanders called these escaped slaves contrabands (captured property). As
the number of contrabands grew, President Lincoln proposed a gradual,
compensated emancipation of slaves in border states. Lincoln hated slavery on
moral grounds. But he could justify emancipation only as a military necessity in
the war to save the Union.
In a preliminary Emancipation
Proclamation, issued after the Northern victory at Antietam in September 1862,
Lincoln declared that slaves in all states that remained in rebellion on January
1, 1863, would be “forever free.” The proclamation exempted pro–Union border
states and parts of the Confederacy already under Union occupation, and it was
carefully worded as a measure to assist the North in preserving the Union. But
it transformed the Union Army into an army of liberation—fighting to end slavery
as well as to preserve the Union.
Blacks confirmed their emancipation by
enlisting in the Union Army. The North resorted to conscription in 1863 and
gladly accepted volunteers from among freed slaves. Black troops were at first
used mainly in support roles and were paid less than white soldiers, but they
soon proved themselves in combat, and in 1864 Congress granted them equal pay.
In January 1865 Congress passed the
13th Amendment, which ended slavery forever. It was ratified and became part of
the Constitution in December 1865.
D | Results of the Civil War |
The Civil War finally established the
United States as a nation–state. Secession and state veto power had been
recurring questions from the beginning of government under the Constitution.
Americans before the Civil War spoke of the United States as a plural noun. Walt
Whitman, the great poet of the Union, declared in the prewar years that “the
United States need poets.” Since the Civil War the United States has been a
singular noun (The United States is …). Thus at the highest constitutional
levels, the Founders’ Latin motto E Pluribus Unum (“From many, one”)
finally became a reality.
However, the unification of the country
went further than most Northerners had wanted. The enormous government debt
incurred during the war, followed by the postwar occupation of the South,
created a central government more powerful than even the most nationalistic
Americans had imagined before the war. The many had indeed become one, but only
under a national government that would have frightened most of the Founding
Fathers.
The Civil War had long-term economic and
social results as well. The South was the theater of war, and the physical
destruction of that region was enormous. White Southerners lost their plantation
labor system and their huge investment in slaves. Egyptian and Indian cotton had
entered world markets during the war, and American cotton never regained its
prewar dominance. The South remained the poorest region of the United States for
a very long time.
The Northeast’s economic dominance was
secured by the war, and—although historians debate this point—the war seems to
have sped Northern economic development. Finally, the status of the
trans–Mississippi West (the great prize in the argument between North and South)
was settled on Northern terms. In 1862 Republicans and Northern Democrats passed
the Homestead Act, which gave free government land to settlers if they turned
the land into farms (see Homestead Laws). In the same year Congress
subsidized private companies that built a railroad from Omaha, Nebraska, to
Sacramento, California. The same Congress, in the Morrill Land–Grant College
Act, gave huge amounts of federal land to state governments for the purpose of
establishing state universities. Southerners had blocked similar bills for many
years. With the South out of Congress, Northerners imposed their blueprint for
Northern–style family farms, public education, and market society upon the
West.
Disfranchised groups often saw their
positions improve as a result of the war. Irish and German immigrants had
experienced (and returned) the hostility of native–born Americans in the decades
before the war. About one in four Union soldiers was an immigrant, and their
help in defeating the South temporarily eased anti–immigrant feeling.
Northern women saw new possibilities
open up during and after the war. In wartime they often took jobs previously
done by men on farms and in factories, and thousands served in the Union nursing
corps. Partly as a result, postwar women’s political and reform groups were
larger and more militant than the groups that preceded them.
Finally and perhaps most importantly,
the Civil War was a watershed in the history of African Americans. The war
permanently ended slavery. At the same time, it raised questions about the
economic, social, and political place of African Americans in the United States.
Those questions have been near the center of American public life ever since,
providing the strongest evidence that E Pluribus Unum is a contested
possibility and not an established fact of American history.
XIII | RECONSTRUCTION |
As the Civil War ended, the United States
faced unprecedented tasks: to bring the defeated Confederate states back into
the Union and to determine the status in American society of almost 4 million
former slaves. These goals dominated the years from 1865 to 1877, the era known
as Reconstruction. During these years, Congress imposed a legislative revolution
that transformed the South. Republican legislators passed ambitious laws,
approved major constitutional amendments, overhauled Southern state governments,
and spurred extensive change in the former Confederacy. The most significant
change that Congress made was to enfranchise African American men. Black
suffrage, the pivot of Reconstruction policy, was the era’s major achievement. For more
information, see African American History: Reconstruction.
A | Congress vs. Johnson |
The process of reconstruction (the
process by which the divided nation was reunited) had in fact begun in 1863 when
President Lincoln announced a plan to restore the Southern states to the Union.
Radical Republicans in Congress opposed Lincoln’s plan. After Lincoln was
assassinated in April 1865, they turned hopefully to President Andrew Johnson.
In May 1865 Johnson announced his restoration plan, called Presidential
Reconstruction. His plan disqualified former Confederate civil and military
officers from holding office but brought the ex-Confederate states back into the
Union on undemanding terms.
Presidential Reconstruction took effect
in the summer of 1865. Johnson gave pardons to thousands of Southerners, and
former Confederate states passed “black codes” that curtailed the freed slaves’
rights. Enraged Republicans united in opposition to Johnson, denouncing the
black codes and the president. When the 39th Congress, dominated by Republicans,
convened in December 1865, Republicans planned to revoke the black codes and to
replace Johnson’s program.
In 1866 they passed two laws over the
president’s vetoes: the Civil Rights Act to protect the rights of freed slaves
and an act that extended the life of the Freedmen’s Bureau. The bureau was
designed as a relief organization for blacks and whites who were left destitute
by the war. It also helped blacks by establishing schools, supervising labor
relations, and protecting them from violence and intimidation.
Johnson’s vetoes provoked Republicans
to pass the 14th Amendment, which guaranteed the civil rights of all citizens,
whatever their race, and restricted the political power of former Confederates.
Johnson denounced the proposed amendment because he believed it was an
unconstitutional invasion of states’ rights. After the congressional elections
of 1866, Republicans maintained enough power to pass their own reconstruction
program.
In 1867 Congress passed the
Reconstruction Act, followed by three supplemental acts passed later the same
year and in 1868. These acts invalidated the state governments formed under
Lincoln’s and Johnson’s plans and divided the ex-Confederacy into five military
districts. The acts also provided that voters—all black men and white men not
disqualified by the 14th Amendment—could elect delegates to write new state
constitutions that ensured black male suffrage. A state could be readmitted to
the Union once it had met a series of requirements, including ratification of
the 14th Amendment. Black enfranchisement made Congressional Reconstruction more
radical than Johnson’s plan. Still, even Congressional Reconstruction provided
only a temporary period of military rule, and it did not take property away from
former Confederates or punish them for treason.
When President Johnson tried to block
the new Reconstruction laws, Republicans again united, this time in order to
remove him from office. The House approved 11 charges of impeachment, but
Johnson escaped conviction in the Senate by one vote. Congress then passed the
15th Amendment, which guaranteed black suffrage. Women’s rights advocates
complained that the new amendment ignored their demands for enfranchising women,
but to Republican leaders the woman suffrage issue was not vital. Black
suffrage, in contrast, was imperative: Only with the votes of African Americans
could Republicans control the former Confederate states.
B | Political Developments in the South |
With Congressional Reconstruction in
place, the Southern states, supervised by federal troops, formed new state
governments that were dominated by Republicans. By the end of March 1870 all of
the former Confederate states had been readmitted to the Union. Black male
suffrage was vital to the Congressional plan. By giving 700,000 former slaves
the right to vote, Congressional Reconstruction created a new electorate in the
South; blacks held voting majorities in five states.
Reconstruction-era voters provided
support for a Southern Republican Party, a fragile coalition made up of
carpetbaggers (Northerners who moved south after the war), scalawags
(Southerners, usually former Whigs who joined the Republicans), and African
Americans. Under Republican rule, Southern states built roads and bridges,
promoted railroad development, funded state institutions, started state school
systems, enlarged state government, and increased state budgets. Republican
rule, however, was brief, less than five years in most states.
Southern Democrats, white landowners,
and white voters generally opposed Republican rule. They tried to dismantle
Republican power by terrorizing blacks to prevent them from voting. Without
black votes, the Democrats would be able to defeat the Republican Party and
reclaim their power. The best-known terrorist group was the Ku Klux Klan, formed
in 1866 to suppress black suffrage and restore white supremacy. Klan members
attacked Freedmen’s Bureau officers, white Republicans, and black voters.
Republicans in Congress tried to combat terrorism with three “enforcement acts”
of 1870 and 1871. The acts sought to protect voters, supervise elections, and
punish those who impeded black suffrage. Federal efforts virtually suppressed
the Ku Klux Klan, but violence and intimidation continued, and ex-Confederate
hostility to emancipation seethed.
C | Freedom for Blacks |
Emancipation was a momentous
experience; to former slaves, it represented autonomy and freedom from white
control. Freedom brought waves of migration within the former Confederacy. Newly
freed peoples moved to cities or to other plantations, sought out family members
from whom they had been separated, and secured legal marriages, sometimes in
mass ceremonies. They also formed new institutions. Black churches provided
former slaves with spiritual support. Seeking literacy for themselves and their
children, former slaves started freedmen’s schools. The Freedmen’s Bureau and
Northern philanthropy helped establish more than 4,000 African American schools
and some advanced institutions, such as Howard University in Washington, D.C. In
several locales, blacks strove for integrated public facilities. In 1875
Congress passed a Civil Rights Act to bar segregation in public places.
Typically, former slaves sought not integration with whites but freedom from
white interference.
A paramount black goal was to own
land, which signified independence, but Southern whites retained control over
the land. Reconstruction did not redistribute land in the South, and most former
slaves lacked the resources to buy it. From 1865 to 1866, newly freed African
Americans began to sign labor contracts with planters to do field work in
exchange for wages, housing, food, and clothing. But they found the new system
too similar to slavery, and planters disliked it, too. The labor system that
evolved, sharecropping, seemed preferable. Under this system, landowners divided
plantations into small units and rented them to blacks for a portion of the
crop, usually one-third or one-half. Former slaves favored the new sharecropping
system, which provided more independence than the wage system. Planters also
appreciated the sharecropping system because they retained control of their land
and split the risk of planting with sharecroppers. Owners of large plantations
held on to their powerful positions in society.
A major depression in 1873 drove many
white farmers into sharecropping as well. By 1880 sharecroppers, black and
white, farmed four-fifths of the land in the cotton states. Many sharecroppers
were forced into a cycle of debt; rural merchants who loaned money to buy
supplies charged high interest rates for the loans and secured them with liens
or claims on the next year’s crop. Frequently the loans could not be repaid, and
sharecroppers fell into debt.
Sharecropping bound the South to
easily marketable cash crops that brought in the most income. Southerners did
not diversify their crops or protect their land against soil depletion. As a
result, the productivity of Southern agriculture declined over the years.
D | Political Developments in the North |
While Southern Republicans struggled
to keep Reconstruction afloat, their Northern counterparts faced a changing
economy and other problems. During the Reconstruction years, the North
industrialized rapidly and also endured a massive depression. At the same time,
political corruption became commonplace. These problems distracted Northerners
from the goals of Reconstruction.
The administration of Ulysses S.
Grant, who won the presidential election of 1868 on the votes of newly
enfranchised freedmen, was ridden with scandal. But fraud, bribery, and
corruption in office were not limited to the Grant administration. In New York
City, Democratic boss William M. Tweed looted the city treasury. In the election
of 1872 the Republican Party split over corruption in the Grant administration,
and some Republicans formed the Liberal Republican Party. The split failed to
dislodge Grant, but it meant dwindling support for Reconstruction policy.
A devastating five-year depression
that began with the panic of 1873 also shifted the focus of Republicans in the
North. Banks closed, jobs were destroyed, and businesses went bankrupt. Labor
protests multiplied, and violent incidents occurred; industrial conflict began
to replace regional conflict. Disputes also arose over currency, notably over
inflationary greenbacks, first issued during the Civil War. As a result of the
depression, prices for farm products fell. Forced to take on more debt, farmers
began to call for an increase in the amount of money in circulation. They
believed that a larger money supply would cause prices to rise, increase the
price of their crops, and raise their incomes. Those who favored a stable
currency, in contrast, urged withdrawal of greenbacks from circulation.
Meanwhile, the Supreme Court began to
roll back Reconstruction policy. In the Slaughterhouse Cases in 1873, the
Supreme Court ruled that the 14th Amendment did not give the federal government
control over the entire domain of civil rights. The cases are historically
important because they first posed the problem of defining how state citizenship
related to U.S. citizenship.
The Supreme Court of the 1870s and
1880s discarded other Reconstruction policies. In 1876 and 1883, the Court upset
two out of three of the enforcement acts. The Court also ruled in 1883 that
Congress could not impose a national ban on discrimination in public
accommodations, thus overturning the Civil Rights Act of 1875. The Court’s
decisions reinforced Republican willingness to shed the obligations of
Reconstruction, which many now considered a political liability.
E | “Redemption” |
In the 1870s Republican rule in the
South faltered. After 1872, membership in the Republican Party fell, as
terrorist groups used violence and intimidation to diminish black votes and curb
Republican support. Mobilizing white votes, Democrats sought to regain control
of state governments. Redemption, the Democrats’ term for their return to
power, followed swiftly, as the Republican coalition collapsed.
Once in office, Democrats dismantled
the changes that Republicans had imposed. They rewrote state constitutions, cut
state budgets and social programs, and lowered taxes. They also imposed laws to
curb the rights of sharecroppers and tenants and to ensure a powerless black
labor force. One such law forced debtors to work the land until their debts were
paid.
By the fall of 1876, Democrats had
returned to power in all Southern states except South Carolina, Florida, and
Louisiana. The presidential election that year ended in a dispute over the
electoral votes of these three states. Each party claimed victory. A special
electoral commission gave the contest to Republican Rutherford B. Hayes. But the
commission’s decision had to be ratified by Congress. To secure the election of
their candidate, Republican Party leaders struck a bargain with Southern
Democrats. Republicans vowed, among other promises, to remove federal troops
from Southern states. Democrats promised to accept Hayes and treat blacks
fairly. Under the Compromise of 1877, Hayes became president, the last federal
troops left the South, and the era of Reconstruction ended.
The 1877 bargain ended federal
occupation of the South and Northerners’ efforts to ensure the rights of
Southern blacks. “Today … the Government of the United States abandoned you,”
the Republican governor of South Carolina told his African American supporters,
as the last federal soldiers departed. The Southern Republican Party virtually
vanished. Black voting was not completely extinguished, but violence and
intimidation caused it to decline.
Southern Democrats had triumphed. They
remained in firm control of Southern states without Northern interference.
Ex-Confederates, although humiliated by defeat in the Civil War, regained power.
But the South was now tied to racial oppression and economic backwardness.
The Republicans’ ambitious plan for
Reconstruction failed, although it did leave two positive legacies: The 14th and
15th Amendments ensured black rights and gave the vote to black men. To maintain
the rights of Southern blacks, however, would have meant a far longer period of
military rule—which both Republicans and Democrats of the 1870s wished to
avoid—and postponed any hope of national reunion. Only in the 1960s would the
nation begin to confront the consequences of failing to protect the rights of
black citizens. In the last third of the 19th century, Americans turned to their
economic future—to developing the nation’s vast resources, to wresting profit
from industry, and to settling the trans-Mississippi West.
XIV | THE TRANS-MISSISSIPPI WEST |
After the Civil War, hope of economic
opportunity lured migrants and immigrants west to the Great Plains and Rocky
Mountain region (see American Westward Movement). Settlers battled Native
Americans for desirable lands, carved out farms, and built mines and ranches. By
the end of the century, the Western territories had turned into states, and
their residents had become part of a rapidly industrializing economy.
A | Native Americans Living on the Plains |
The Native Americans of the Great
Plains included diverse tribes—among them the Blackfoot, Sioux, Dakota,
Cheyenne, Comanche, Arapaho, Navajo, and Apache. After the Civil War, the Native
Americans confronted a growing stream of settlers—prospectors, ranchers, and
farm families. The newcomers brought with them new diseases that ravaged the
tribes. The settlers also killed off the buffalo and thus damaged the Native
American economy.
The Plains peoples defended their land
and their way of life from the oncoming settlers. Fierce battles took place in
the 1860s and 1870s between the Plains peoples and federal troops. Ultimately,
disease and conflict reduced the population and power of the tribes.
Displacement by settlers and concentration on Indian reservations, mainly in
Oklahoma, Wyoming, and the Dakotas, challenged the traditional Native American
way of life.
In the late 19th century, Congress
developed a new policy toward Native Americans. Instead of isolating them on
reservations, as had been done in the mid-1800s, the new policy sought to
assimilate Native Americans into the cultural mainstream. Congressional
policymakers responded to pressure from two different groups. First, some people
sought to suppress Native American culture by converting Native Americans to
Christianity and turning them into farmers. Second, land-hungry settlers and
speculators wanted the Native Americans removed from desirable land in the
reservations.
The Dawes Severalty Act, passed by
Congress in 1887, addressed both concerns. The law broke up reservations and
encouraged private farms. Native American families received individual plots of
land, carved from reservations, as well as farm equipment. These families were
to give up their communal way of life on the reservations and become independent
farmers. But few Native Americans profited from the Dawes Act; the greatest
beneficiaries were land speculators, who under the law were able to buy the best
pieces of reservation land.
In 1890 at the Battle of Wounded Knee
federal troops fired on a group of Sioux and killed between 150 and 370 men,
women, and children. The massacre at Wounded Knee marked the end of Native
American resistance to settlement. For more information, see Native
American Policy.
B | Railroads |
The building of the railroads spurred
western settlement. In 1862 Congress authorized construction of two railroads to
link the Midwest and the West Coast. The Union Pacific Railroad extended
westward from Nebraska; the Central Pacific Railroad went eastward from the
Pacific Ocean. The meeting of the two railroads at Promontory Summit, Utah, in
1869 signified a new era in Western history.
Federal and state governments had long
encouraged the growth of railroads. When Congress authorized building the
transcontinental railroad in 1862, it agreed to loan hundreds of millions of
dollars to the two corporations to construct it. Congress also gave the railroad
companies millions of acres of Western land, which the railroads sold to repay
their loans. In effect, major railroad companies, with federal support, became
colonizers of the West.
To attract settlers who would establish
farms and become paying customers, the railroads advertised in the East and in
Europe. They provided free trips west and offered long-term loans to settlers.
Once the settlers had set up farms, they depended on the railroads to ship their
produce. Farmers often fell deeply into debt to the railroads, and to repay
these debts they frequently relied on a single cash crop—typically wheat.
Reliance on a single crop made their incomes dependent on fluctuating world
markets and thus precarious.
The railroads became very powerful.
They established monopolies in specific locales, cut off service, fixed prices,
and discriminated among customers. A railroad might offer rebates to favored
customers or charge more for a short haul than a long one. Aggrieved by such
practices, farmers soon tried to curb the power of railroad corporations.
C | Farmers |
Federal land policy attracted settlers
and land speculators. The Homestead Act of 1862 provided land, originally 160
acres, at no cost if the settler agreed to cultivate the land for at least five
years. As settlers moved into arid areas farther west, however, the 160-acre
plots proved insufficient, so the size of land grants increased.
As farmers settled more western land
from 1870 to 1900, the nation’s agricultural production doubled. Several factors
increased productivity. New farm machinery included the steel plow, which could
slice through the heavy soil of the plains, and the twine-binder, which gathered
bundles of wheat and tied them with string. New varieties of grain, such as
drought-resistant sorghum, enlarged harvests. Barbed wire, patented in 1874,
enabled farmers to protect their property from roaming livestock. Finally, the
railroads made it possible for Western farm produce to be sold in Eastern
cities.
However, pioneers who established farms
in the plains—in Wisconsin, Minnesota, Iowa, Kansas, Nebraska, and the
Dakotas—faced difficult and isolated lives. They also lost much of their
independence. By the late 19th century, farmers had grown increasingly dependent
on large businesses. Railroads transported their crops, banks loaned them money,
manufacturers sold them farm machinery, and unstable international markets for
wheat and corn determined their income. Overproduction, meanwhile, drove prices
down. Farmers were frustrated by sagging prices, rising debt, high interest
rates, and railroad practices such as fixed prices or discrimination among
customers. Farmers no longer felt in charge of their own fates.
To try to address some of their
problems, farmers joined together in 1867 and founded the National Grange of the
Patrons of Husbandry, or the Grange, which established cooperative stores and
urged laws to curb railroad abuses. In a number of states, including Illinois,
Iowa, Minnesota, Wisconsin, and California, the Grangers supported the passage
of laws that regulated railroad rates and practices (see Granger
Movement).
In 1887 Congress passed the Interstate
Commerce Act, which sought to deal with some of these problems. The law required
railroad companies that transported passengers or goods to establish reasonable
rates, to publish those rates, and to adhere to them. It also banned unfair
practices, such as rebates to favored customers. Finally, it created a new
agency, the Interstate Commerce Commission (ICC), to supervise railroad
operations. The new law, however, did little to curb railroad power. Railroads
gained control of the ICC, evaded the law, and won repeal of the Granger laws
that regulated rates; farmers’ protests grew.
D | Miners and Ranchers |
Starting with the California Gold Rush
of 1849, a series of mining booms spurred settlement in the West. When gold
prospects in California dimmed, thousands of prospectors moved eastward into the
Rocky Mountains, seeking gold, silver, copper, and other minerals. Spectacular
gold rushes of the late 19th century drew prospectors to mining camps in Boise,
Idaho; Helena, Montana; and the Black Hills of South Dakota. Some mining towns,
such as Denver, became cities. Others, such as Virginia City in the Sierra
Nevada mountains, boomed while prospectors worked the mines, only to become
ghost towns when the prospectors left. The era of individual prospectors was
limited; by the end of the century, they had been replaced by large mining
companies in the Western states.
In the 1860s and 1870s the railroads
transformed the cattle industry, just as they had transformed farming—by
transporting cattle to urban markets in the East. When a rail line reached
Abilene, Kansas, in 1867, Texas ranchers began to drive their cattle north to
Abilene. The cattle then traveled east, destined for packing houses. The cattle
industry began to grow rapidly as railroads made the business more
profitable.
Large-scale ranchers profited, although
the cowboys who drove the herds contended with dull lives and difficult jobs. By
the 1880s, the open-range cattle industry extended from Texas to the Dakotas.
Then the cattle boom peaked. The disastrous winter of 1886-1887, which followed
an unusually dry summer, wiped out herds and forced ranchers into bankruptcy.
Those ranchers who remained in business raised smaller herds of high-grade
cattle, grew crops to feed them, and, to conserve this food supply, fenced in
their livestock with barbed wire. The open range, in which cattle grazed freely,
ended. Some ranchers moved farther west, to Wyoming and Montana.
E | Multicultural West |
Races and ethnicities mingled in the
late-19th-century West. Immigrants from Scandinavia and ethnic Germans from
Russia settled farms in Minnesota and Wisconsin. Irish, Cornish, and Slovak
miners moved to the mountain states. Other Europeans went west as speculators,
adventurers, and prospectors, and some remained as residents. Chinese
immigrants, over 90 percent men, arrived in California in the 1850s. They formed
communities in Western cities, labored on the transcontinental railroad, and
moved eastward with the railroad and mining booms. Japanese immigrants reached
California in the 1890s and settled mainly in rural areas in the Pacific
Northwest. Among African Americans who migrated to the West, a small number
worked as cowboys; some founded all-black communities such as Langston,
Oklahoma, and Nicodemus, Kansas. When the United States acquired Texas, New
Mexico, and California at the end of the Mexican War in 1848, it incorporated
many Mexicans who lived in what had been the northern border area of Mexico.
Clusters of Native Americans lived throughout the region.
The mixture of peoples in the West
spurred competition and antagonism more than harmony. Virulent anti-Chinese
sentiment in California pitted native-born workers against immigrants. The
growth of the cattle industry affected land ownership in the Southwest, to the
detriment of Mexican Americans. The United States had promised Mexico to protect
the freedom and property of Mexicans who remained in the area ceded to the
United States, but American ranchers and other settlers took control of
territorial governments and forced Hispanic settlers off their land.
Antipathy and violence, moreover,
pervaded much of Western life. Hostilities flared not only between settlers and
Native Americans, but also between ranchers and farmers, sheepherders and cattle
ranchers, Mormons and non-Mormons (in Utah), and labor and management. Yet
despite all these tensions, Americans and new immigrants poured into the
West.
By the 1890s, the western half of the
continent was linked firmly to the nation’s industrial economy. Huge
meat-packing plants in Chicago and big corporations determined the profits of
ranchers. Indebted farmers on the plains, who felt oppressed by railroads and
dependent on distant markets, voiced their grievances through farmers’
alliances. Mining became a big business. Finally, cities arose from mining
towns, from cattle depots, and as “gateways” on the borders of the plains. West
or east, the nation was becoming more urban and industrial.
XV | INDUSTRIALIZATION AND URBANIZATION |
From 1870 to 1900 the United States became
the world’s foremost industrial nation. It emerged as the leader in meatpacking,
in production of timber and steel, and in the mining of coal, iron, gold, and
silver. Overall, the nation experienced a stunning explosion in the scale of
industry and in the pace of production. By the turn of the century,
industrialization had transformed commerce, business organization, the
environment, the workplace, the home, and everyday life.
Many factors fueled industrial growth in
the late 19th century: abundant resources, new technology, cheap energy, fast
transport, and the availability of capital and labor. Mines, forests, and
livestock in the west provided raw materials for major industries, as did iron
in Ohio and oil in Pennsylvania. Railroad expansion enabled businesses to move
raw materials to factories and to send products to urban markets. A steady
stream of immigrants arrived to work in America’s mines and factories.
Technological advances transformed
production. The new machine-tool industry, which turned out drilling, cutting,
and milling machines, sped up manufacturing. A trail of inventions, including
the telephone, typewriter, linotype, phonograph, electric light, cash register,
air brake, refrigerator car, and automobile, led to new industries. Finally,
business leaders learned how to operate and coordinate many different economic
activities across broad geographic areas. Businesses were thus able to become
larger, and the modern corporation became an important form of business
organization. For more information, see Industrial Revolution: The
Industrial Revolution in the United States.
A | Corporations and Consolidation |
In the 19th century, states reduced the
requirements for businesses to incorporate. A corporation is a legal entity
that is distinct from the individuals who own and control it. The corporation
(not the individual shareholders) is responsible for repaying the corporation’s
debts, so shareholders risk only what they invest; this is known as limited
liability.
corporate form of business organization made it possible for entrepreneurs to
finance large-scale enterprises because corporations issue stock, certificates
representing shares of ownership in a corporation. By issuing stock, a
corporation can enable thousands of individuals to pool financial resources and
invest in a new venture.
Businesses also grew by combining into
trusts. In a trust, a small group of business people, called trustees, acquire
enough shares in several competing firms to control those companies. The
trustees are then able to manage and direct a group of companies in a unified
way—in effect, creating a single firm out of competing firms. The trustees could
prevent competition among the firms that were part of the trust. A leading
example was the Standard Oil Trust, formed in Ohio in 1882 by John D.
Rockefeller and his associates. Within a decade, trusts dominated many
industries.
States tried to regulate trusts, but big
businesses eluded state control. Afraid that trusts would destroy competition,
Congress in 1890 passed the Sherman Antitrust Act. The act banned businesses
from joining together in ways that controlled markets, as trusts had been doing.
It also outlawed monopoly, in which only a single seller or producer supplies a
commodity or a service. But the law defined neither trust nor monopoly and was
poorly enforced. The courts threw out cases against the trusts and used the law
mainly to declare unions illegal combinations in restraint of trade. For
instance, the courts declared that unions that organized boycotts or strikes
impeded the flow of commerce and thus violated federal law. Standard Oil,
however, continued without interference. In 1892, to avoid Ohio laws, Standard
Oil incorporated in New Jersey as a holding company, a corporation with only one
purpose: to buy out the stock of other companies.
Corporations introduced new styles of
management, or business organization. The railroads, which needed to manage
crews, fuel, repairs, and train schedules over large areas, were the first to
develop new management techniques. The railroads also developed standard time,
which the United States adopted in 1883. Steel industry tycoon Andrew Carnegie,
who continually sought less costly ways to make steel, also introduced new
management techniques. The Carnegie Steel Company used precise accounting
systems to track the costs of all processes and materials involved in making
steel. To do this work, Carnegie hired middle managers and encouraged them to
compete with one another.
New business practices led to larger
corporations. Andrew Carnegie practiced vertical integration; he bought
companies that sold supplies to the steel industry, including coal and iron
mines and a railroad line. Carnegie thereby controlled every stage of the
productive process from raw materials to marketing. Finally, he engaged in
horizontal consolidation by acquiring his competitors. He priced his products so
low that competitors could not compete and make a profit. Then he bought them
out. By 1899 Carnegie’s company was the world’s biggest industrial corporation
and produced one-fourth of the nation’s steel. However, vertical integration and
horizontal consolidation helped concentrate power in a few giant corporations
and limited competition.
According to business magnates such as
Rockefeller and Carnegie, their huge enterprises provided new products at lower
costs and enriched the nation, as well as themselves. Stressing the value of
competition, captains of industry argued that it ensured the survival of the
most competent. Business leaders also endorsed a policy of laissez-faire.
Government, they believed, should leave business alone. In fact, the federal
government adopted policies to benefit big business. Congress passed high
tariffs (taxes on imported products) that impeded foreign competition; federal
subsidies to railroads enriched investors; and courts penalized labor more often
than business.
B | Labor |
The trend toward large-scale production
changed the structure of the labor force and the nature of work. From 1870 to
1900, as the industrial work force expanded, the unskilled worker replaced the
artisan or autonomous craftsperson. The typical workplace was more likely to be
a large factory than a small workshop. Striving for efficiency, employers
replaced skilled labor with machines and low-paid workers. Factory tasks became
specialized, repetitive, and monotonous. The need for unskilled labor drew women
and children into the industrial work force. Some performed piecework, work paid
for according to the amount produced rather than the hours worked, in crowded
tenements; others operated machinery in textile mills and garment plants.
Industrial labor in the late 19th century was often hazardous. Workers lacked
protection against industrial accidents, long hours, wage cuts, layoffs, and
sudden bouts of unemployment.
As the industrial work force grew,
tensions increased between labor and management. They disagreed over issues such
as wages, length of the working day, and working conditions. Labor unions
emerged to protect the rights of workers and to represent them in negotiations
with management. Most employers vigorously opposed trade union activity, and
struggles between workers and employers often became violent.
The first national labor organization,
the Knights of Labor, organized in 1869, tried to include all workers. The
Knights reached their greatest strength between 1884 and 1885, when railroad
strikes raged, and then declined. As the Knights of Labor faded, a new
federation of local and craft unions, the American Federation of Labor (AFL),
was organized in 1886. Led from 1886 to 1924 by Samuel Gompers, an immigrant
cigar maker from England, the AFL welcomed skilled workers, almost all of them
men. The AFL focused on hours, wages, working conditions, and union recognition
by management. It also favored use of economic weapons such as strikes and
boycotts.
Late-19th-century unions attracted only
a small portion, perhaps 5 percent, of the work force, but strikes involved far
more workers. In the last quarter of the century, thousands of strikes aroused
public concern, and several large, violent events evoked fear. The great
railroad strike of 1877 was a wildcat strike (a strike by a union local
without consent of the national union to which it belongs) set off by wage cuts
on a single railroad line. It became a nationwide protest that almost ended rail
traffic and led to scores of deaths. Only the arrival of federal troops ended
the strike.
In the 1880s, a decade of 10,000 strikes
and lockouts, workers often succeeded in averting wage reductions and winning
shorter hours. Most strikes concerned local grievances, but some closed down
entire industries and incurred reprisals. The Haymarket Square Riot in Chicago
in 1886 grew out of a strike against a company that built agricultural
machinery. Union leaders called a protest meeting at which police intervened and
a bomb exploded, causing many deaths. Eight people were convicted of murder, and
four were hanged. Repelled by the violence, the public blamed the labor movement
for the casualties at Haymarket Square, and the Knights of Labor lost
influence.
At the end of the 19th century, business
often defeated workers’ demands. In the 1890s, at employers’ requests, federal
troops crushed strikes at Idaho silver mines, Carnegie’s steel plants, and
Pullman railway works. The Pullman strike began when workers for the Pullman
Palace Car Company protested wage cuts. The protest led thousands of workers to
join the American Railway Union, led by Eugene V. Debs. But employers, who
united to break the union, sought an injunction, a court order requiring workers to return to work, and obtained one under the Sherman Antitrust Act of 1890.
Federal troops arrived to enforce the injunction against the union, riots
ensued, the strike was crushed, and Debs was arrested, convicted, and
imprisoned. The injunction was a powerful tool for business to use against
labor.
Besides the injunction, union organizers
faced other obstacles, such as blacklists (lists of union activists circulated
among employers) and attacks by Pinkerton detectives (agents of a private
detective firm that guarded factories, protected railroads, and battled labor).
In some instances, employers forced workers to sign “yellow dog contracts,” in
which they promised not to join unions. Management retained the upper hand.
C | Immigration |
Industrial workers of the late 19th
century were often foreign-born. From 1865 to 1885, immigrants arrived mainly
from northern and western Europe, as they had before the Civil War; the largest
groups came from England, Ireland, Germany, and Scandinavia. From the mid-1880s
until World War I began in 1914, the number of newcomers from southern, eastern,
and central Europe increased. Many new immigrants were Slavs—Poles, Czechs,
Russians, Ukrainians, Croatians—and others, including Jews, from the
Austro-Hungarian and Russian empires. Among the new immigrants were also Greeks,
Romanians, and Italians, mainly from southern Italy or Sicily. Record numbers of
immigrants arrived in the United States, some 9 million from 1880 to 1900, and
13 million from 1900 to 1914. For more information, see United States
(People): Growth through Immigration and Immigration: From 1840 to
1900.
Late-19th-century immigrants left their
European homes to escape economic problems—scarce land, growing populations, and
the decline of subsistence farming. They came to the United States in hope of
economic gain. Most settled in the United States permanently, but others came
only to amass some capital and then return home. Immigration dropped off during
depressions, as in the 1870s and 1890s, and again during World War I, with
smaller downturns in between. Immigration was encouraged by new technology such
as steamships, which reduced the time needed to cross the Atlantic from three
months to two weeks or less.
Where immigrants settled depended on
their ethnicity and on when they arrived. In the post-Civil War decade, for
instance, Scandinavian immigrants used the Homestead Act to start Midwestern
farms. Two decades later, immigrants usually moved to industrial towns and
cities, where they became unskilled laborers in steel mills, meatpacking plants,
and the garment trade. In Milwaukee, Wisconsin, where the population increased
tenfold from 1850 to 1890, large numbers of Poles and Eastern Europeans found
work in rolling mills and blast furnaces. By 1910 immigrants and their families
constituted over half the total population of 18 major cities; in Chicago, eight
out of ten residents were immigrants or children of immigrants.
Immigrants’ lives changed dramatically
after they arrived. Uprooted, usually from rural areas in Europe, immigrants had
to adjust to industrial labor, unfamiliar languages, and city life. Clinging to
their national identities and religions, immigrants prepared ethnic foods, read
foreign-language newspapers, and celebrated ethnic holidays. At the same time,
they patronized urban amusements, found community support in local political
machines, and adapted to the new environment. Men outnumbered women in new
immigrant communities because men often preceded their wives and families.
Immigrants’ huge numbers, high
concentrations in cities, and non-Protestant faiths evoked nativist or
anti-immigrant sentiments. To native-born Americans, the newcomers often seemed
more alien and more transient, less skilled and less literate than earlier
groups of immigrants. Some strains of nativism rested on belief in the
superiority of Anglo-Americans or Nordic peoples over all others. Other types of
nativism reflected economic self-interest: Native-born workers feared
competition for jobs from new immigrants; they feared also that immigrants would
work for lower wages, which might mean less pay or even unemployment for
them.
Both types of nativism arose on the West
Coast, where immigration from China had been heavy since the 1850s. Responding
to anti-Chinese sentiment, especially among California workers, Congress passed
the Chinese Exclusion Act in 1882. The law barred Chinese immigration for ten
years, a ban that was subsequently extended indefinitely. Chinese entry slowed
to a trickle, although a small number of immigrants continued to arrive. In the 1890s, meanwhile, Congress tightened immigration
laws to exclude polygamists, contract laborers, and people with diseases.
Nativist groups such as the American Protective Association (1887) urged
immigration restriction.
D | Growth of Cities |
As immigration exploded, the urban
population surged from 6 million in 1860 to 42 million in 1910. Big cities got
bigger: Chicago tripled in size in the 1880s and 1890s. By 1900 three cities
contained more than a million people: New York (3.5 million), Chicago (1.7
million), and Philadelphia (1.3 million).
In the late 19th century, industry
invaded the cities. Previously, cities had served as commercial centers for
rural hinterlands and were frequently located on rivers, lakes, or oceans.
Manufacturing occurred outside their limits—usually near power sources, such as
streams, or natural resources, such as coal. As industry grew, cities changed.
Chicago, for instance, had been a railroad center that served the upper Midwest
as a shipping hub for lumber, meat, and grain; by 1870 it had taken the lead in
steel production as well as meatpacking. Post-Civil War Atlanta, another
railroad hub and commercial center, also developed a diverse manufacturing
sector. Cities quickly became identified with what they produced—Troy, New York,
made shirt collars; Birmingham, Alabama, manufactured steel; Minneapolis,
Minnesota, produced lumber; Paterson, New Jersey, wove silk; Toledo, Ohio, made
glass; Tulsa, Oklahoma, harbored the oil industry; and Houston, Texas, produced
railroad cars.
Population changes also transformed the
city. Urban growth reflected the geographic mobility of the industrial age;
people moved from city to city as well as within them. The new transience led to
diverse populations. Migrants from rural areas and newcomers from abroad mingled
with wealthy long-time residents and the middle class. Immigrants constituted
the fastest growing populations in big cities, where industry offered work.
Urban political machines helped immigrant communities by providing services in
exchange for votes. For immigrants, boss politics eased the way to jobs and
citizenship. Most, but not all, city machines were Democratic.
Just as industrialization and
immigration transformed the city, new technology reshaped it. Taller buildings
became possible with the introduction of elevators and construction using
cast-iron supports and, later, steel girders. The first steel-frame skyscraper,
ten stories high, arose in Chicago in 1885. In 1913 New York’s Woolworth
Building soared to a height of 60 stories. Taller buildings caused land values
in city centers to increase.
New forms of transportation stretched
cities out. First, horse-drawn trolleys jolted over bumpy rails and steam-driven cable
cars lugged passengers around. Then came electric streetcars, powered by
overhead wires. Electric streetcars and elevated railroads enabled cities to
expand, absorbing nearby towns and linking central cities with once-distant
suburbs. For intercity transport, huge railroad terminals—built like palaces,
with columns, arches, and towers—arose near crowded business hubs.
Late-19th-century cities were cauldrons
of change. In commerce, they became centers of merchandising with large
department stores, which developed in the 1860s and 1870s. As city populations
grew, the need for safe water, sanitation, fire control, and crime control also
grew. These needs led to new urban services—water reservoirs, sewer systems,
fire and police departments. Reformers attempted to enhance urban environments
with parks and to improve poor neighborhoods with urban missions. Urban
religious leaders of the 1880s promoted the Social Gospel, under which churches
concerned themselves with social problems such as poverty, vice, and injustice.
For more information, see United States (People): Urbanization of
America.
E | The New South |
Industrialization and urbanization also
affected the South. Southern merchants, manufacturers, and newspaper editors of
the 1880s led the campaign for a “New South,” where Southern industrialism would
break the cycle of rural poverty. States provided special breaks for new
businesses and promised cheap labor. Birmingham, Alabama, became a railroad and
steel center where mills hired black workers.
Southern textile mills opened in the
1880s in the Piedmont region from central Virginia to Alabama. Mill owners
depended on low-skilled, low-paid white labor, and their mills attracted workers
from rural areas. Workers settled in company towns where entire families worked
for the mill. The South replaced New England as the nation’s leading locale for
textile mills.
Overall, however, the campaign to
industrialize the South faltered. As late as 1900, only 5 percent of the
Southern labor force, most of it white, worked in industry. Furthermore,
Southern industry did not enrich the South. Except for the American Tobacco
Company, located in North Carolina, Southern industry was owned mainly by
Northern financiers.
For African Americans, the New South of
the late 19th century meant increased oppression; race relations deteriorated.
Black voting was not quickly extinguished; in the 1880s, some African Americans
continued to vote in the upper South and in pockets elsewhere, but black office
holders and voting majorities vanished, fraud and intimidation were common, and
black votes often fell under conservative control. Between 1890 and 1908,
starting in Mississippi, Southern states held constitutional conventions to
impose new voting regulations, such as literacy testing—regulations that
registrars could impose at will on blacks and not on whites. Southern states
also introduced a “grandfather clause,” which exempted from literacy testing all
those entitled to vote on January 1, 1867 (before Congress gave black men the
right to vote), and their male descendants. This enabled most illiterate whites
to go to the polls but stopped illiterate blacks from voting. Some states
imposed stringent property qualifications for voting or poll taxes, which meant
that each voter had to pay a tax in order to vote.
Increasingly, Southern blacks (the vast
majority of the nation’s African Americans) were relegated to subordinate roles
and segregated lives. Segregation laws, or Jim Crow laws as they were known,
kept blacks and whites apart in public places such as trains, stations,
streetcars, schools, parks, and cemeteries. The Supreme Court confirmed the
legitimacy of Jim Crow practices in Plessy v. Ferguson (1896),
which upheld segregation in railroad cars. In the 1890s, finally, the number of
lynchings of African Americans rose markedly. Between 1890 and 1900, more than
1,200 lynchings occurred, mainly in the Deep South. At the end of the century,
the New South remained an impoverished and racist region, with the nation’s
lowest income and educational levels.
F | Farmers’ Protests and Populism |
Beset by crop failures in the 1880s,
Midwestern farmers dealt with falling prices, scarce money, and debt. To cope
with these problems, farmers began forming farmers’ alliances, which multiplied
in the Great Plains and spread to the South, where white and black farmers
formed separate alliances. Working together in these cooperative organizations,
farmers hoped to lower costs by buying supplies at reduced prices, obtaining
loans at rates below those charged by banks, and building warehouses to store
crops until prices became favorable.
In 1889 the Southern and Northwestern
alliances merged and in 1890 became politically active. In the early 1890s,
alliance delegates formed a national party, the People’s Party, whose members
were called Populists, and decided to wage a third-party campaign. The delegates
nominated James B. Weaver as the party’s candidate for president in 1892.
Although he lost, the party won several governorships and legislative seats.
Populism inspired colorful leaders, such as lawyer Mary E. Lease of Kansas, a
powerful orator, and Tom Watson of Georgia, who urged cooperation among black
and white farmers.
Populists supported a slate of reforms.
These included calls for the government to issue more silver coins and paper
currency; such inflationary measures, Populists hoped, would raise farm prices
and enable farmers to pay off their debts. They wanted the government to
regulate closely or even to take over the railroads in the hope of lowering
farmers’ transportation costs. The Populists also supported a graduated income
tax to more equitably distribute the costs of government, as well as tariff
reduction, abolition of national banks, direct popular elections of U.S.
senators, and an eight-hour workday for wage earners.
Economic collapse in the 1890s increased
agrarian woes. The panic of 1893 was followed by a depression that lasted until
1897. Businesses went bankrupt, railroads failed, industrial unemployment rose,
and farm prices fell. The depression increased doubts about laissez-faire
economic policies.
The money question, an issue since the
1870s, dominated the election of 1896. Populists supported the Democratic
candidate, William Jennings Bryan, who called for free silver, or free and
unlimited coinage of silver. Bryan electrified the Democratic convention with a
powerful denunciation of the gold standard. But Republican William McKinley,
with a huge campaign chest and business support, won the election. With
McKinley, Republicans gained a majority of the electorate that lasted, with only
one interruption, until the New Deal in the 1930s.
The corporate elite was now empowered in
national politics. The influence of the Populist Party declined after the
election, but the massive protest stirred by Populists did not completely fail.
Many of the reforms that agrarian protesters endorsed were eventually enacted in
the Progressive Era. But Populists had been unable to turn back the clock to a
time when farmers had more autonomy, or to remedy the economic problems of the
new industrial society.
G | The Impact of Industrialization |
Three decades of industrial progress
transformed American life. By 1900 the United States had an advanced industrial
economy, dominated by big corporations. The corporation harnessed ingenuity,
created unprecedented wealth, and spurred the growth of new cities such as
Chicago, Atlanta, Minneapolis, and Dallas. It increased foreign trade. The value
of exports doubled from 1877 to 1900; imports rose, too, but less rapidly.
Industrial progress revolutionized the marketing of goods and transformed the
office world, now filled with clerical workers, bureaucrats, and middle
managers. It also transformed homes by introducing indoor plumbing, electric
lights, and household appliances. Overall, industrialization made available
labor-saving products, lower prices for manufactured goods, advances in
transportation, and higher living standards.
Industrialization had liabilities as
well. It brought about vast disparities of wealth and unreliable business
cycles, in which overproduction and depression alternated. The economy lurched
between boom and panic, as in the 1870s and 1890s; bankruptcy became a common
event, especially among indebted railroads that had overbuilt. For laborers,
industrialization meant competition for jobs, subsistence wages, insecurity, and
danger. Children worked in coal mines and cotton mills; women labored in
tenement sweatshops; workers faced the prospect of industrial accidents and
illnesses such as respiratory diseases.
Industrialization also exploited natural
resources and damaged the environment. Refiners and steel mills spewed oil into
rivers and smoke into the atmosphere. Finally, industrialization brought a
relentless drive for efficiency and profit that led to ever larger, more
powerful businesses and gave the corporate elite undue power in national
politics. In the 1890s business leaders’ need for yet larger markets led to
pressure on the United States to expand overseas.
XVI | IMPERIALISM |
The United States had a long tradition of
territorial expansion. Gains of adjacent territory in the 19th century—the
Louisiana Purchase of 1803, the areas won from Mexico in 1848, and U.S.
expansion across the continent—all enhanced American stature. More recently, the
defeat and removal of Native American tribes by federal troops had opened the
West to farms and ranches, speculators and corporations.
In the 1890s, several motives combined to
build pressure for expansion overseas. First, business leaders wanted overseas
markets. Products basic to the American economy—including cotton, wheat, iron,
steel, and agricultural equipment—already depended heavily on foreign sales.
Business leaders feared that if the United States failed to gain new markets
abroad, other nations would claim them, and these markets would be lost to U.S.
enterprise. Second, national prestige required the United States to join the
great European nations and Japan as imperial powers (nations with overseas
colonies). Alfred Thayer Mahan presented this position in The Influence of
Sea Power upon History, 1660-1783 (1890). In order to enter the race for
influence, Mahan contended, the United States had to expand its depleted
merchant marine, acquire overseas naval bases, build up a large navy, and find
markets abroad. Third, religious leaders supported efforts to spread
Christianity to foreign peoples. Finally, the United States seemed to be falling
behind in the race for empire; it had not acquired noncontiguous territory since
Secretary of State William H. Seward bought Alaska from Russia in 1867.
Imperial designs evoked criticism, too.
Some Americans opposed U.S. expansion and challenged the drive for an overseas
empire. The Anti-Imperialist League—a coalition of editors, academics,
reformers, and labor leaders—contended that the United States had no right to
impose its will on other people and that imperialism would lead to further
conflict. Foes of imperialism also protested that overseas territories would
bring nonwhite citizens into the United States. Still the economic crisis of the
1890s made overseas expansion seem imperative, especially to the business
community. At the century’s end, the United States began to send American forces
to Hawaii, Cuba, the Philippines, and East Asia.
A | Annexation of Hawaii |
In the 1880s a monarchy governed the
Hawaiian Islands, but western powers, including the United States, Britain, and
Germany, had significant influence in Hawaii’s economy and government. American
business interests dominated the lucrative sugar business. Angered by U.S.
domination, Hawaiian islanders in 1891 welcomed a native Hawaiian, Liliuokalani,
as queen. Liliuokalani attempted to impose a new constitution that strengthened
her power. American planters responded by deposing the queen in 1893.
Proclaiming Hawaii independent, the Americans requested U.S. annexation.
President Grover Cleveland stalled on the annexation treaty; his representative
on the islands reported that native Hawaiians objected to it. Under President
William McKinley, however, in 1898, Congress voted to annex the Hawaiian
Islands. In 1900 Hawaii became American territory.
B | The Spanish-American War: Cuba and the Philippines |
United States involvement in Cuba began
in 1895 when the Cubans rebelled against Spanish rule. The Cuban revolution of
1895 was savage on both sides. Americans learned of Spanish atrocities through
sensational press reports as well as from Cuban exiles who supported the rebels.
Humanitarians urged the United States to intervene in the revolution, and U.S.
businesses voiced concern about their large investments on the island. However,
President Cleveland sought to avoid entanglement in Cuba, as did President
McKinley, at first.
A well-publicized incident drew the
United States into the conflict. On February 15, 1898, an American battleship,
the Maine, exploded in Havana harbor, killing 266 people. Most Americans
blamed the Spanish, and “Remember the Maine” became a call to arms.
McKinley began negotiations with Spain for a settlement with Cuba. McKinley then
sent a message to Congress, which adopted a resolution recognizing Cuban
independence and renouncing any intent to annex the island, but Spain refused to
withdraw. In April 1898 Congress declared war on Spain, and the Spanish-American
War began.
The four-month war ended in August with
a victory for the United States. The first action occurred thousands of miles
away from Cuba in the Philippines, another Spanish colony. There Commodore
George Dewey surprised the Spanish fleet in Manila Bay and sank every vessel in
it.
Next, the United States sent an
expeditionary force to Cuba. The U.S. Navy blockaded the Spanish fleet, and the
Americans landed unopposed. After a bloody battle, in which Theodore Roosevelt
led a volunteer cavalry regiment known as the Rough Riders, the Americans
captured San Juan Hill outside the strategic city of Santiago de Cuba, and
Spanish land forces surrendered. American troops also occupied Puerto Rico and
Manila Harbor. In August 1898 the United States signed an armistice, and later
that year, a peace settlement.
The Senate narrowly ratified the peace
treaty with Spain in February 1899. The treaty provided that Spain would cede
the Philippines, Puerto Rico, and Guam to the United States; the United States
would pay Spain $20 million. In addition, Spain would surrender all claims to
Cuba and assume Cuba’s debt. No wonder the Spanish-American War struck Secretary
of State John Hay as a “splendid little war.” In a few months, the United States
had become a major world power with an overseas empire.
But the story of the “splendid little
war” was not yet complete. In February 1899 the Filipinos, led by Emilio
Aguinaldo, declared themselves independent and began a three-year struggle
against 120,000 U.S. troops. About 20,000 Filipinos were killed in combat, but
more than 200,000 Filipinos died during the insurrection overall, primarily
from a cholera epidemic that lasted from 1897 to 1903. Barbarities and atrocities occurred on
both sides before the United States captured Aguinaldo and suppressed the
insurrection.
The U.S. Army remained in Cuba until
1901, when the Cubans adopted a constitution that included the Platt Amendment.
The amendment pledged Cuba to allow the United States to intervene in Cuban
affairs whenever events threatened property, liberty, or Cuban independence. By
accepting the amendment, Cuba became in effect a protectorate of the United States.
In the election of 1900, William Jennings Bryan again challenged McKinley, this
time on an unsuccessful anti-imperialist platform.
C | Open Door Policy in China |
American trade with China increased in
the 1890s. The United States had long demanded an Open Door Policy for trading
in China, which was weak, in order to prevent other powers from carving up China
among them. But France, Russia, Britain, and Japan bit off pieces for themselves
by annexation or by establishing spheres of influence, where they exercised
economic privileges.
As its rivals made gains, the United
States feared it would be excluded from all trade in China. In 1899 Secretary of
State John Hay sent the European powers and Japan a series of “Open Door Notes,”
requesting agreement on three points. First, each power would respect the
trading rights of the others within each nation’s sphere of influence; second,
Chinese officials would collect import duties; and third, no nation would
discriminate against the others in matters of harbor duties or railroad rates
within each sphere of influence. Hay declared that the powers had accepted these
principles, although the claim was inaccurate: Russia, and later Japan, disagreed.
Not all the Chinese welcomed Western
penetration of their culture. In 1900 the Boxer Uprising broke out in China. The
Boxers—a sect of Chinese nationalists who opposed foreign influence in
China—rose up against foreign traders, officials, and missionaries, and
massacred many of them. The United States and the European powers intervened
with troops and put down the insurrection. The European powers seemed eager to
carve up China, but Hay persuaded them to accept compensation to cover their
losses. The United States returned part of its compensation to China. The
McKinley administration had stopped Europe from carving up China.
The quest for an overseas empire in the
late 1890s thus led to substantial American gains. The United States annexed
Hawaii in 1898, conquered the Philippines and Guam from Spain in 1899, turned
Cuba in effect into an American protectorate in 1901, and kept China open to
American traders and missionaries.
Meanwhile, in September 1901, an
anarchist shot President McKinley, and Vice President Theodore Roosevelt assumed
the presidency. The United States now entered the 20th century and an era of
reform.
XVII | PROGRESSIVISM AND REFORM |
The growth of industry and cities
created problems. A small number of people held a large proportion of the
nation’s wealth while others fell into poverty. Workers faced long hours,
dangerous conditions, poor pay, and an uncertain future. Big business became
closely allied with government, and political machines, which offered services
in return for votes, controlled some city governments. As the United States
entered the 20th century, demand arose to combat these ills.
Progressive reformers sought to remedy
the problems created by industrialization and urbanization. To progressives,
economic privilege and corrupt politics threatened democracy. Never a cohesive
movement, progressivism embraced many types of reform. Progressives strove,
variously, to curb corporate power, to end business monopolies, and to wipe out
political corruption. They also wanted to democratize electoral procedures,
protect working people, and bridge the gap between social classes. Progressives
turned to government to achieve their goals. National in scope, progressivism
included both Democrats and Republicans. From the 1890s to the 1910s,
progressive efforts affected local, state, and national politics. They also left
a mark on journalism, academic life, cultural life, and social justice
movements.
Crusading journalists helped shape a
climate favorable to reform. Known as muckrakers, these journalists revealed to
middle class readers the evils of economic privilege, political corruption, and
social injustice. Their articles appeared in McClure’s Magazine and other
reform periodicals. Some muckrakers focused on corporate abuses. Ida Tarbell,
for instance, exposed the activities of the Standard Oil Company. In The
Shame of the Cities (1904), Lincoln Steffens dissected corruption in city
government. In Following the Color Line (1908), Ray Stannard Baker
criticized race relations. Other muckrakers assailed the Senate, railroad
practices, insurance companies, and fraud in patent medicine.
Novelists, too, revealed corporate
injustices. Theodore Dreiser drew harsh portraits of a type of ruthless
businessman in The Financier (1912) and The Titan (1914). In
The Jungle (1906) Socialist Upton Sinclair repelled readers with
descriptions of Chicago’s meatpacking plants, and his work led to support for
remedial legislation. Leading intellectuals also shaped the progressive
mentality. In The Theory of the Leisure Class (1899), Thorstein Veblen
attacked the “conspicuous consumption” of the wealthy. Educator John Dewey
emphasized a child-centered philosophy of pedagogy, known as progressive
education, which affected schoolrooms for three generations.
A | Progressivism in the Cities and States |
As a political movement, progressivism
arose at the local and state levels in the 1890s. Urban reformers attacked
political machines run by corrupt bosses and monopolies in municipal services
such as electricity or gas. To address these problems, they promoted
professional city managers and advocated public ownership of utilities.
The social settlement movement, which
originated in cities in the 1890s, also became a force for progressive reform at
the local level. Settlement houses offered social services to the urban poor,
especially immigrants. Pioneering settlement houses, such as Hull House, founded
by Jane Addams and Ellen Gates Starr in 1889, provided nurseries, adult
education classes, and recreational opportunities for children and adults.
Settlements spread rapidly. There were 100 settlement houses in 1900, 200 in
1905, and 400 in 1910. Settlement leaders joined the battle against political
machines and endorsed many other progressive reforms.
At the state level, progressives
campaigned for electoral reforms to allow the people to play a more direct role
in the political process. Some Western states adopted practices that expanded
voter rights, including the initiative, the referendum, and the recall. Under
the initiative, citizens could sign petitions to force legislatures to vote on
particular bills. With the referendum, a proposal could be placed on the ballot
to be decided by a vote at election time. Using the recall, voters could
petition to oust officials from their jobs. Progressives also supported the 17th
Amendment, ratified in 1913, which provides for election of U.S. senators
directly by vote of the people, rather than indirectly by state
legislatures.
Progressive reformers used the states
as laboratories of reform. For instance, Wisconsin governor Robert La Follette,
who held office from 1901 to 1906, introduced progressive changes such as
establishing a commission to supervise railroad practices and raising state
taxes on corporations. Following Wisconsin’s example, one state after another
passed laws to regulate railroads and businesses.
Progressives also focused on labor
reform at the state level. They sought to eliminate (or at least regulate) child
labor, to cut workers’ hours, and to establish a minimum wage. By 1907
progressive efforts had led 30 states to abolish child labor. In Muller
v. Oregon (1908), the Supreme Court upheld a state law that limited women
working in factories and laundries to a ten-hour day, and many states began to regulate women’s
working hours. Progressives also endorsed workmen’s compensation (an insurance
plan to aid workers injured on the job) and an end to homework (piecework done
in tenements). In New York’s Triangle Fire of 1911, many women leapt to their
deaths from a burning shirtwaist factory. The tragedy reminded people of the
need for higher safety standards in factories and the need to protect workers
from unscrupulous employers.
Some progressive reformers supported
causes that had a coercive or repressive dimension, such as Prohibition, a
movement to ban the manufacture, sale, and transport of alcohol. The Woman’s
Christian Temperance Union (WCTU), founded in 1874, had long campaigned against
alcohol. In 1895 the Anti-Saloon League of America joined the crusade. Together
they worked to gain support for the 18th Amendment, which provided for
Prohibition. The amendment was ratified in 1919 and remained law until 1933,
when the 21st Amendment repealed it. Progressive moral fervor also emerged in
campaigns to combat prostitution and to censor films. Finally, some progressives
endorsed other restrictive causes, now seen as ungenerous or inhumane, such as a
campaign against immigration or support for eugenics, a movement to control
reproduction in order to improve the human race.
Progressive causes won support from a
broad section of the middle class—editors, teachers, professionals, and business
leaders—who shared common values. Progressive supporters appreciated order,
efficiency, and expertise; they championed investigation, experimentation, and
cooperation. Many, including some progressive employers, sought regulations to
make business practices more fair and break up monopolies. To regulate business,
however, progressives had to wield influence on the national level.
B | Progressivism at the National Level |
When progressives began to work for
reform at the national level, their major goal was government regulation of
business. Seeking antitrust laws to eliminate monopolies, they also supported
lower tariffs, a graduated income tax, and a system to control currency. They
found a spokesperson in President Theodore Roosevelt.
Regulation, Roosevelt believed, was
the only way to solve the problems caused by big business. A leading publicist
for progressive ideals, Roosevelt became known as a trustbuster. He revived the
Sherman Antitrust Act, vigorously enforcing it to break up large trusts that
reduced competition and controlled prices. He also pursued a railroad monopoly,
took on the meatpacking trust, and attacked oil, tobacco, and other monopolies.
In 1906 Roosevelt helped push through a meat inspection act, the Pure Food and
Drug Act, and the Hepburn Act. The Hepburn Act expanded the regulatory powers of
the Interstate Commerce Commission, the agency that regulated commercial
activity crossing state lines.
Roosevelt was also a leading nature
conservationist who wanted to preserve the nation’s natural resources. He
withdrew thousands of acres of forests, mineral lands, and waterpower sites from
the public domain to protect them from exploitation by private interests.
Roosevelt doubled the number of national parks and established many national
monuments and wildlife refuges. He also supported a 1902 law to provide
irrigation and hydroelectric development by building dams on some of the
nation’s rivers.
Roosevelt’s successor, William Howard
Taft, was more conservative, and domestic reforms slowed during his
administration. He reluctantly signed the Payne-Aldrich Tariff of 1909, which
lowered tariff rates only slightly, but he aggressively pursued twice as many
antitrust proceedings as Roosevelt had. Taft
won major victories against Standard Oil Company and American Tobacco Company,
which were ordered by the Supreme Court to break into smaller, competing firms.
Taft also signed laws for progressive measures such as raising corporation
taxes.
Taft lost support in 1912, however,
when Roosevelt, who disagreed with him on tariff policy and railroad regulation,
entered the presidential race as head of the new Progressive Party. Roosevelt’s
program of New Nationalism sought strong federal regulation of big business. New Jersey’s
progressive governor, Democrat Woodrow Wilson, envisioned more limited federal
power. Wilson supported an effort to destroy monopoly and aid small business
through tariff reduction, banking reform, and tightening of antitrust laws. His
program was known as the New Freedom.
Progressivism reached its peak during
Wilson’s first term as president. In 1913 Wilson signed the Underwood Tariff,
which reduced taxes on imported goods. The bill also included an income tax,
permitted by the new 16th Amendment to the Constitution of the United States.
Wilson supported the Federal Reserve Act of 1913, which created a centralized
banking system to act as a lender of last resort to forestall bank crises and to
permit a more elastic currency, one that could be readily expanded or contracted
to suit the national need.
To curb trusts, Wilson pushed through
Congress the Federal Trade Commission Act of 1914 (see Federal Trade
Commission). The law established a commission with authority to prevent business
practices that could lead to a monopoly. He also supported the Clayton Antitrust
Act of 1914, a statute intended to bolster the poorly enforced Sherman Act. The
new law banned interlocking directorates, in which a few people controlled an
industry by serving simultaneously as directors of related corporations. It also
exempted labor unions from the category of illegal combinations and gave workers
the right to strike. Finally, Wilson appointed Louis Brandeis, a leading critic
of big business, to the Supreme Court. Full of moral fervor, Wilson carried
progressive goals into World War I, which the United States entered in
1917.
C | African Americans in the Progressive Era |
Despite their zeal for reform, few
progressives made race relations a priority, and in the South, leading
progressives often endorsed racist policies. In 1900 more than two-thirds of the
nation’s 10 million African Americans lived in the South; most were sharecroppers and tenant
farmers. Rural or urban, Southern blacks faced poverty, discrimination, and
limited employment opportunities. At the end of the 19th century, Southern
legislatures passed Jim Crow laws that separated blacks and whites in public
places (see Segregation in the United States). Because blacks were
deprived of the right to vote by the grandfather clause, poll taxes, or other
means, their political participation was limited. Lynching increased, and a
steady stream of black migrants moved north. From 1890 to 1910, some 200,000
African Americans left the South, and even more moved out during World War I.
For more information, see United States (People): Major Migrations of the
U.S. Population: Black Migration.
As African Americans tried to combat
racism and avoid racial conflict, they clashed over strategies of accommodation
and resistance. Booker T. Washington, head of the Tuskegee Institute in Alabama,
urged blacks to be industrious and frugal, to learn manual skills, to become
farmers and artisans, to work their way up economically, and to win the respect
of whites. When blacks proved their economic value, Washington argued, racism
would decline. An agile politician, with appeal to both whites and blacks,
Washington urged African Americans to adjust to the status quo. In 1895, in a
speech that critics labeled the Atlanta Compromise, Washington contended that
blacks and whites could coexist in harmony with separate social lives but united
in efforts toward economic progress.
Northern intellectual W.E.B. Du Bois
challenged Washington’s policy. In The Souls of Black Folk (1903), Du
Bois deplored Washington’s call for patience and for cultivation of manual
skills. Instead he urged equal educational opportunities and the end of
discrimination. In 1909 Du Bois joined a group of progressives, black and white,
to form the National Association for the Advancement of Colored People (NAACP).
The NAACP strove to end the disfranchisement of black people, to abolish
segregation, and to promote black civil and political rights.
D | The Women’s Movement |
Middle-class women and progressive
reformers shared common goals. In the progressive era, women made great advances
in higher education, the professions, and women’s organizations. By 1910, for
instance, when only about 5 percent of college-age Americans attended college,
about 40 percent of those students were women. Activist women joined organizations such as the General
Federation of Women’s Clubs, a women’s volunteer service organization founded in
1890. The National Consumers’ League (1899) and the Women’s Trade Union League
(1903) spearheaded efforts to limit women’s work hours and to organize women in
unions. College students read Women and Economics (1898) by feminist
intellectual Charlotte Perkins Gilman; college graduates worked in settlement
houses; and homemakers joined women’s clubs to promote civic improvement.
Reformer Florence Kelley led the charge for child labor laws and other measures
to protect workers. On the left, anarchist Emma Goldman, birth control advocate
Margaret Sanger, and feminist Crystal Eastman promoted aspects of women’s
rights.
Settlement leaders, women’s clubs, and
temperance groups supported progressive measures. The woman suffrage movement,
in turn, won progressive support. Women had been fighting for the right to vote
since the 15th Amendment gave voting rights to black men. In 1869
two rival organizations formed to support voting rights for women on state and
federal levels. In 1890 the competing suffrage groups united to form the
National American Woman Suffrage Association, which pursued the battle in the
states. As late as 1909, women could vote in only four states (Wyoming, Utah,
Idaho, and Colorado), but momentum soon picked up. Suffragists adopted more
aggressive tactics, such as parades, rallies, and marches, and gained ground.
They won a key victory in 1917, when New York State granted women the vote,
which strengthened them for their final push during World War I.
E | Foreign Affairs |
Progressive presidents sought to
impose order on the world, and especially to find markets for American products.
For example, Roosevelt believed that a world power such as the United States was
obliged to maintain global peace. He brought Russia and Japan together to sign a
treaty in 1905 that ended the Russo-Japanese War and gave Japan rights in Korea.
Roosevelt also supported expansion of U.S. influence abroad.
Roosevelt intervened in Latin America
to build a canal connecting the Atlantic and Pacific oceans; the canal would
link U.S. East Coast ports with East Asia. The United States negotiated a treaty
with Colombia for rights to build a canal in Panama, at that time controlled by
Colombia. When the Colombian Congress rejected the treaty, Roosevelt encouraged
Panamanian desire for independence from Colombia. This tactic succeeded, and a
revolution occurred. The United States promptly recognized the new government of
Panama and negotiated a treaty that enabled Americans to build the Panama
Canal.
Latin Americans questioned Roosevelt’s
high-handed maneuver. They also objected to the Roosevelt Corollary to the
Monroe Doctrine. The Monroe Doctrine, announced in 1823, declared that the
United States had the right to exclude foreign powers from expanding in the
western hemisphere. It had protected weak 19th-century Latin American nations
from powerful European nations. The Roosevelt Corollary, in contrast, stated
that “chronic” wrongdoing on the part of Latin American nations entitled the
United States to intervene in the affairs of those nations. Most Latin Americans
saw Roosevelt’s policy as a form of imperialism.
Roosevelt applied his corollary first
to the Dominican Republic, which had trouble paying its debts to other nations.
Roosevelt feared that a European power might occupy the country to force
repayment of debts. The United States therefore ran the Dominican Republic’s
customs service for two years and used money collected there to pay the nation’s
debts.
Relations with Japan also became an
issue during Roosevelt’s administration. A conflict erupted in 1906 over
Japanese immigration to the United States. Prejudice against Japanese immigrants
caused a crisis when San Francisco forced Asian children into a separate school.
The Japanese government protested. In a “gentlemen’s agreement” in 1907, both
nations agreed to discourage immigration from Japan. In the Root-Takahira
agreement of 1908, Japan and the United States agreed to respect the territorial
integrity of China and the Open Door Policy.
Roosevelt’s successor, William Howard
Taft, adopted a policy that critics called dollar diplomacy; he encouraged U.S.
bankers and industrialists to invest abroad, especially in Latin America. He
hoped they would replace European lenders and build American influence in the
area. The policy, however, led the United States into unpopular military
ventures. For instance, the nation became involved in a civil war in Nicaragua,
where the United States in 1909 supported the overthrow of the country’s leader
and sustained a reactionary regime.
Woodrow Wilson, an idealist and
humanitarian, disliked imperialism and rejected dollar diplomacy. He hoped to
establish benevolent relations with other nations and wanted the United States
to serve as a force for good in the world. However, in 1913, the United States
landed marines in Nicaragua to ensure that its choice for Nicaraguan president
would remain in power. The Wilson administration then drew up a treaty with
Nicaragua that reduced the country to virtual dependency. In addition, U.S.
troops occupied Haiti in 1915 and the Dominican Republic in 1916. American
business interests continued to prevail in Latin America.
Finally, Wilson came close to
involving the United States in a war with Mexico. In 1913, two years after the
Mexican Revolution, Mexico’s new president was assassinated, and a reactionary
general, Victoriano Huerta, took control. Wilson refused to recognize Huerta’s
unjust regime. Many Mexicans who disliked Huerta, however, also resented
Wilson’s intervention in Mexican affairs. Both sides were poised to fight in
1914, when a confrontation between American sailors and Huerta’s forces broke
out at Veracruz. Wilson accepted the mediation of Argentina, Chile, and Brazil,
but then supported the revolutionary leader Francisco “Pancho” Villa, until
Villa crossed the border in 1916 and killed Americans in Columbus, New Mexico.
Wilson sent U.S. troops into Mexico to pursue Villa. The United States withdrew
them in 1917, which ended American involvement but left a legacy of distrust in
Mexico and Latin America.
Historians debate the impact of
progressivism at home and abroad. Some criticize the progressives’ desire for
order and control, their reluctance to criticize capitalism, and progressivism’s
coercive or restrictive side. Big business, critics contend, eluded progressive
regulations. Other historians applaud progressive initiatives and find in them
precedents for New Deal measures of the 1930s. According to more favorable
interpretations, progressivism expanded democracy, challenged the close alliance
of government and business, considered the public interest, and protected some
of the more vulnerable Americans.
Above all, progressivism changed
American attitudes toward the power of government. In 1917 Americans turned
their attention from domestic concerns to foreign affairs as the United States
became involved in World War I.
XVIII | AMERICA AND WORLD WAR I |
World War I broke out in Europe in the
summer of 1914. The war set Germany and Austria-Hungary (the Central Powers)
against the United Kingdom, France, and Russia (the Allied Powers), and
eventually involved many more nations. The United States declared itself a
neutral nation, but neutrality proved elusive. For three years, as Europeans
faced war on an unprecedented scale, the neutrality so popular in the United
States gradually slipped away.
At the outset, Germany and Britain each
sought to terminate U.S. trade with the other. Exploiting its naval advantage,
Britain gained the upper hand and almost ended U.S. trade with Germany.
Americans protested this interference, but when German submarines, known as
U-boats, began to use unrestricted submarine warfare in 1915, American public
opinion turned against Germany. Then on May 7, 1915, a German submarine sank
a British passenger liner, the Lusitania, killing more than a thousand
people, including 128 Americans. Washington condemned the attack, and Germany
briefly curtailed its submarine campaign. In the presidential race of 1916, President
Wilson won reelection on the campaign slogan “He Kept Us Out of War.”
In February 1917, however, Germany
reinstated the policy of unrestricted submarine warfare. Ending diplomatic ties
with Germany, Wilson still tried to keep the United States out of the war. But
Germany continued its attacks, and the United States found out about a secret
message, the Zimmermann telegram, in which the German government proposed an
alliance with Mexico and discussed the possibility of Mexico regaining territory
lost to the United States. Resentful that Germany was sinking American ships and
making overtures to Mexico, the United States declared war on Germany on April
6, 1917.
The United States entered World War I
with divided sentiments. Americans debated both whether to fight the war and
which side to support. Since the outbreak of war in Europe, pacifists and
reformers had deplored the drift toward conflict; financiers and industrialists,
however, promoted patriotism, “preparedness,” and arms buildup. Some Americans
felt affinities for France and Britain, but millions of citizens were of German
origin. To many Americans, finally, the war in Europe seemed a distant conflict
that reflected tangled European rivalries, not U.S. concerns.
But German aggression steered public
opinion from neutrality to engagement, and the United States prepared for
combat. The Selective Service Act, passed in May 1917, helped gradually increase
the size of America’s armed forces from 200,000 people to almost 4 million at
the war’s end.
A | Over There |
By the spring of 1917, World War I
had become a deadly war of attrition. Russia left the war that year, and after
the Bolsheviks assumed power in the Russian Revolution of 1917, Russia signed a
separate peace treaty with Germany in March 1918. Allied prospects looked grim.
With Russia out of the picture, Germany shifted its troops to the western front,
a north-south line across France, where a gruesome stalemate had developed. Dug
into trenches and shelled by artillery, great armies bogged down in a form of
siege warfare.
In June 1917 the American
Expeditionary Force, led by General John J. Pershing, began to arrive in France.
By March 1918, when Germany began a massive offensive, much of the American
force was in place. Reluctantly, the United States allowed American troops to be
integrated into Allied units under British and French commanders. These
reinforcements bolstered a much-weakened defense, and the Allies stopped the
German assault. In September 1918 American troops participated in a
counteroffensive in the area around Verdun. The Saint-Mihiel campaign succeeded,
as did the Allied Meuse-Argonne offensive, where both the Allies and the Germans
suffered heavy casualties. Facing what seemed to be a limitless influx of
American troops, Germany was forced to consider ending the war. Germany, the
last of the Central Powers still fighting, signed an armistice on November 11,
1918. Only the challenge of a peace treaty remained.
American manpower tipped the scales
in the Allies’ favor. At war for only 19 months, the United States suffered
relatively light casualties. The United States lost about 112,000 people, many
to disease, including the devastating influenza pandemic of 1918, which claimed
at least 20 million lives worldwide. European losses were far higher. According to some
estimates, World War I killed close to 10 million military personnel.
B | Over Here |
World War I wrought significant
changes on the American home front. First, the war created labor shortages.
Thousands of African Americans left the South for jobs in Northern steel mills,
munitions plants, and stockyards. The great migration of the World War I era
established large black communities in Northern cities such as New York,
Philadelphia, and Chicago. The influx, however, provoked racial tensions and
race riots in some cities, including East Saint Louis, Illinois, in July 1917
and Chicago in July 1919.
Labor shortages provided a variety of
jobs for women, who became streetcar conductors, railroad workers, and
shipbuilders. Women also volunteered for the war effort and sold war bonds.
Women mustered support for woman suffrage, a cause that finally achieved its
long-sought goal. The 19th Amendment, granting women the right to vote,
triumphed in Congress in 1919 and was ratified by the states in 1920.
The war greatly increased the
responsibilities of the federal government. New government agencies relied
mainly on persuasion and voluntary compliance. The War Industries Board urged
manufacturers to use mass production techniques and increase efficiency. The
Railroad Administration regulated rail traffic; the Fuel Administration
monitored coal supplies and regulated gasoline. The National War Labor Board
sought to resolve thousands of disputes between management and labor that
resulted from stagnant wages coupled with inflation. The Food Administration
urged families to observe “meatless Mondays,” “wheatless Wednesdays,” and other
measures to help the war effort. The Committee on Public Information organized
thousands of public speakers (“four-minute men”) to deliver patriotic addresses;
the organization also produced 75 million pamphlets promoting the war
effort.
Finally, to finance the war, the
United States developed new ways to generate revenue. The federal government
increased income and excise taxes, instituted a war-profit tax, and sold war
bonds.
War pressures evoked hostility and
suspicion in the United States. Antagonism toward immigrants, especially those
of German descent, grew. Schools stopped teaching German. Hamburgers and
sauerkraut became “Salisbury steak” and “liberty cabbage.” Fear of sabotage
spurred Congress to pass the Espionage Act of 1917 and the Sedition Act of 1918.
The laws imposed fines, jail sentences, or both for interfering with the draft,
obstructing the sale of war bonds, or saying anything disloyal, profane, or
abusive about the government or the war effort. These repressive laws, upheld by
the Supreme Court, resulted in 6,000 arrests and 1,500 convictions for antiwar
activities. The laws targeted people on the left, such as Socialist leader
Eugene V. Debs, who was imprisoned, and Emma Goldman, who was jailed and
deported. The arrests of 1917 reflected wartime concerns about dissent as well
as hostility toward the Russian Revolution of 1917.
C | Treaty of Versailles |
Even before the war ended, President
Wilson offered a plan for world peace, the Fourteen Points. The plan, announced
to Congress on January 8, 1918, would abolish secret diplomacy, guarantee
freedom of the seas, remove international trade barriers wherever possible,
reduce arms, and consider the interests of colonized peoples. Eight more points
addressed changes to specific boundaries based on the principle of
self-determination, or the right of nations to shape their own destinies.
Finally, Wilson’s points called for a League of Nations to arbitrate disputes
between nations and usher in an epoch of peace. High hopes for the Fourteen
Points prevailed at the time of the armistice but faded by June 1919, after the
leaders of the Big Four (the United States, France, Britain, and Italy) had
gathered at Versailles to determine the conditions of peace.
At Versailles, the Allies ignored
most of Wilson’s goals. During postwar negotiations, including the Treaty of
Versailles, they redrew the map of Europe and established nine new nations,
including Poland, Yugoslavia, and Czechoslovakia. Boundaries of other nations
were shifted, and out of the Ottoman Empire, which fought on the side of the
Central Powers during the war, four areas were carved: Iraq, Syria, Lebanon, and
Palestine. These areas were given to France and Britain as mandates, or
temporary colonies. The Treaty of Versailles demilitarized Germany, which lost
its air force and much of its army and navy. Germany also lost its colonies and
had to return to France the Alsace-Lorraine area, which Germany had annexed in
1871. Finally, forced to admit blame for the war, Germany was burdened with high
reparations for war damages.
A spirit of vindictiveness among the
Allies invalidated Wilson’s goals and led to a number of defects in the Treaty
of Versailles. First, Germany’s humiliation led to resentment, which festered
over the next decades. Second, the Big Four paid no attention to the interests
of the new Bolshevik government in Russia, which the treaty antagonized. Third,
in some instances, the treaty ignored the demands of colonized peoples to govern
themselves.
The Treaty of Versailles did include
a charter or covenant for the League of Nations, a point that embodied Woodrow
Wilson’s highest goal for world peace. However, the U.S. Senate rejected the
League of Nations and the entire treaty. Republicans who favored isolation (the
“irreconcilables”) spurned the treaty. Conservative Republicans known as
“reservationists,” led by Senator Henry Cabot Lodge, disliked the treaty’s
provisions for joint military actions against aggressors, even though such
action was voluntary. They demanded
modifications, but Wilson refused to compromise. Overestimating his prestige and
refusing to consider Republican reservations, Wilson remained adamant.
Uncompromising and exhausted, the president campaigned for the treaty until he
collapsed with a stroke. The United States never joined the League of Nations,
which began operations in 1920, and it signed a separate peace treaty with
Germany in 1921.
Ironically, after leading America to
victory in the war, President Wilson endured two significant disappointments.
First he compromised at Versailles; for instance, he agreed to the Allied
diplomats’ desire for high reparations against Germany. Second, Wilson refused
to compromise with the Senate, and thus he was unable to accomplish his
idealistic goals. His vision of spreading democracy around the world and of
ensuring world peace became a casualty of the peace process.
World War I left many legacies. The
American experience of the Great War, albeit brief and distant from the nation’s
shores, showed the United States how effectively it could mobilize its
industrial might and hold its own in world affairs. However, the war left
Germany shackled by the armistice and angered by the peace treaty. Postwar
Germany faced depression, unemployment, and desperate economic conditions, which
gave rise to fascist leadership in the 1930s. In addition, each of the areas
carved out by the Treaty of Versailles proved, in one way or another, to be
trouble spots in the decades ahead. In the United States, fears of radicalism,
horror at Soviet bolshevism, and the lingering effects of wartime hysteria led
to a second wave of attacks on radicals. In the Palmer Raids in January 1920, agents of
Attorney General A. Mitchell Palmer arrested thousands of people in 33 cities.
The postwar Red Scare abated, but suspicion of foreigners, dissenters, and
nonconformists continued in the 1920s.
XIX | AMERICA IN A NEW AGE |
World War I made the United States a
world power. While European nations tried to recover from the war, the United
States had overseas territories, access to markets, and plentiful raw materials.
Formerly in debt to European investors, the United States began to lend money
abroad. At home, the economy expanded. Assembly-line production, mass
consumption, easy credit, and advertising characterized the 1920s. As profits
soared, American zeal for reform waned, and business and government resumed
their long-term affinity. But not all Americans enjoyed the rewards of
prosperity. A mix of economic change, political conservatism, and cultural
conflict made the 1920s a decade of contradictions.
A | Productivity and Prosperity |
As war production ended, the economy
dipped, but only briefly; by 1922 the nation began a spectacular spurt of
growth. Auto production symbolized the new potential of industry (see
Automobile Industry). Annual car sales tripled from 1916 to 1929; 9 million
motorized vehicles on the road became 27 million by the end of the 1920s. At his
Michigan plant, Henry Ford oversaw the making of the popular black Model T. New
modes of production changed car manufacture. A moving assembly line brought
interchangeable parts to workers who performed specific tasks again and again.
Assembly-line techniques cut production costs, which made cars less expensive
and more available to average citizens.
The effect of auto production spread
beyond car factories. Auto building spurred industries that made steel, glass,
rubber, and petroleum. Exploration for oil led to new corporations, such as Gulf
Oil and Texaco. During the 1920s domestic oil production grew by 250 percent,
and oil imports rose as well.
State-funded programs to build roads
and highways changed the nation’s landscape. Previously isolated rural areas
filled with tourist cabins and gas stations. New suburbs with single-family
homes on small plots of land arose at the outskirts of cities; the construction
industry soared. For more information, see United States (Culture): Way
of Life; Living Patterns.
Finally, the car industry pioneered new
ways to distribute and sell products. Auto companies sold cars through networks
of dealers to customers who often used a new type of credit, the installment
plan. With this plan, the purchaser made an initial payment, or down payment,
and then agreed to pay the balance of the purchase price in a series of
payments.
Cars were just one growth sector of the
1920s. Energy use tripled, and electricity reached 60 percent of American homes.
Industry produced new home appliances such as refrigerators, washing machines,
and vacuum cleaners. As incomes rose, families spent larger portions of their
incomes to buy these durable goods; items previously considered luxuries now
became necessities. Chain stores, such as A&P, put local retailers out of
business; canned goods and commercial breads replaced homemade products. The
young advertising industry, which had appeared in the late 19th century, fed a
desire for consumer goods. Extensive credit abetted this desire, known as
consumerism.
During the decade, American
corporations became larger. Some grew by securing markets abroad, as did the
United Fruit Company in Latin America. Others grew through consolidation. Large
companies came to dominate many industries. By the end of the 1920s, 100
corporations controlled nearly half the nation’s business.
The vast growth of business in the
1920s transformed many areas of life, but failed to distribute benefits equally.
Industrial workers did not reap the profit of increased productivity. Wages rose
but not as fast as prices. Unions competed with company unions
(employer-established organizations) and battled the National Association of
Manufacturers, which sought to break union power. Union membership dropped from
about 5 million in 1920 to 3.4 million in 1930.
Agriculture suffered as well. Markets
for farm products declined after army purchases ended and European farming
revived. Farmers produced more, and prices continued to fall. The annual income
of farmers declined, and they fell further into debt. Like many other Americans,
rural families became mired in a web of credit and consumption.
B | Mass Culture |
Leisure industries, too, turned to mass
production. Amusements of bygone days—amateur theatricals, sleigh rides—gave way
to new industries in entertainment and culture. Rural or urban, Americans
nationwide read mass-circulation magazines, full of advertising, such as The
Saturday Evening Post, Reader’s Digest, or The Ladies’ Home
Journal. They listened on the radio to the same popular music, comedy shows,
and commercials, broadcast by new radio networks such as National Broadcasting
Company (NBC) and Columbia Broadcasting System (CBS). Motion pictures gained
vast urban audiences, and in 1927 Al Jolson’s film The Jazz Singer
introduced sound to movie audiences. Fans followed the careers of movie stars in
film magazines. The press also tracked other celebrities, such as Charles
Lindbergh, who made the first solo nonstop transatlantic flight in 1927, or
novelist F. Scott Fitzgerald and his wife, Zelda, who epitomized an icon of the
1920s, the flapper.
Young and uninhibited, the flapper
represented much of what typified the Jazz Age of the 1920s—youthful rebellion,
female independence, exhibitionism, competitiveness, and consumerism. Although a
symbol of liberation, the flapper was in fact the ultimate consumer, dependent
on a variety of products. With her bobbed hairdos, short skirts, makeup, and
cigarettes, she supported growth industries of the 1920s—the beauty parlor, the
ready-made clothing industry, cosmetic manufacture, and tobacco production.
Consumerism linked the carefree, adventurous mood of the Jazz Age with the
dominance of large corporations and their conservative values.
Among African Americans, the great
migration of Southern blacks to Northern jobs during the war created strong
African American communities. During the 1920s these communities were home to
cultural revivals, such as the Harlem Renaissance, where art, music, and
literature flourished. The “New Negro,” a term popularized by critic and
philosopher Alain Locke, represented a celebration of African American heritage
and racial identity. As black
creativity flourished, African Americans began to raise their voices for
equality. Interest also arose in black nationalism. Some African Americans
became followers of Jamaican black nationalist Marcus Garvey, who urged racial
pride, formed the Universal Negro Improvement Association (UNIA), and led a
“Back to Africa” movement. At its height the UNIA claimed more than 2 million
members. It declined after Garvey was convicted of mail fraud in 1923 and
deported to Jamaica in 1927.
C | Political Conservatism |
Many Americans of the 1920s endorsed
conservative values in politics and economics. Republican presidents stood for
these values, or what President Warren G. Harding called “normalcy … a regular
steady order of things.” Under presidents Harding and Calvin Coolidge, tariffs
reached new highs, income taxes fell for people who were most well off, and the
Supreme Court upset progressive measures, such as the minimum wage and federal
child labor laws. Both Harding and Coolidge tended to favor business. “The chief
business of the American people is business,” Coolidge declared.
Republican presidents shared
isolationist inclinations in foreign policy; the United States never joined the
League of Nations. Harding and Coolidge also endorsed disarmament and peace initiatives. In 1921
Harding organized the International Conference on Naval Limitation, known as the
Washington Conference, a pioneering effort to reduce arms and avoid an expensive
naval arms race. Attended by the United Kingdom, Japan, France, Italy, and other
countries, the conference proposed destruction of ships and a moratorium on new
construction. In 1928, under Coolidge, the United States and France cosponsored
the Kellogg-Briand Pact, which renounced aggression and called for the end of
war. As a practical instrument for preventing war, the treaty was useless.
However, it helped to establish the 20th-century concept of war as an outlaw act
by an aggressor state on a victim state.
While remaining aloof from
international concerns, the United States began to close its doors to
immigrants. Antiforeign sentiment fueled demands for immigration limits.
Protests against unrestricted immigration came from organized labor, which
feared the loss of jobs to newcomers, and from patriotic organizations, which
feared foreign radicalism.
Efforts to limit immigration led to the
National Origins Act, passed by Congress in 1924. The law set an annual quota on
immigration and limited the number of newcomers from each country to the
proportion of people of that national origin in the 1890 population. (In 1929
the basis for the quotas was revised to the 1920 population.) The law
discriminated against the most recent newcomers, southern and eastern Europeans,
and excluded Asian immigrants almost entirely. Latin American immigration,
however, was unlimited. Immigration from Mexico surged in the 1920s, abetted by
the Mexican Revolution and by the need of southwestern businesses for
agricultural labor. More than 1 million Mexicans (10 percent of the Mexican
population) arrived in the United States from 1910 to 1930.
What happened to more critical voices
in the conservative era? Radical political activism waned, dimmed by the Red
Scare of 1919. Social criticism appeared in literary magazines such as The
Masses; in newspapers such as the Baltimore Sun, where journalist H.
L. Mencken published biting commentary; and in popular fiction such as Sinclair
Lewis’s novel Babbitt (1922), an assault on provincial values. Some
intellectuals fled the United States and settled in Paris. Progressivism faded.
Its most enduring vestige, the post-suffrage women’s movement, faced its own
problems.
Enthused by winning the right to vote,
women of the 1920s pursued political roles as voters, candidates, national
committeewomen, and activists in voluntary groups. But the women’s movement
still encountered obstacles. Women’s organizations did not agree on supporting
the Equal Rights Amendment (ERA), first proposed in 1923. The amendment would
have made illegal all forms of discrimination based on sex. The National Woman’s
Party, led by Alice Paul, pressed for passage of the amendment, but most women’s
organizations, including the newly formed League of Women Voters, did not
support it, and the ERA made no progress.
Women reformers also suffered setbacks
in national politics. The Sheppard-Towner Act of 1921, a pioneering health-care
measure aimed at women voters, provided matching funds for prenatal and
baby-care centers in rural areas, but Congress let its funding lapse in 1929. Other
important goals of women reformers, such as a federal child labor law and the
minimum wage, failed as well.
D | Political Conflicts |
Political and cultural debates divided
Americans of the 1920s. Major issues of the decade reflected a split between
urban and rural, modern and traditional, radical and reactionary. Nativist,
anti-radical sentiments emerged in a 1921 trial, the Sacco-Vanzetti Case. Two
anarchists, Italian immigrants, were tried and convicted of murder. Many
believed that the men’s immigrant origins and political beliefs played a part in
their convictions. The case evoked protests from socialists, radicals, and
prominent intellectuals, and remained a source of conflict for decades. Nativism
also inspired the revival of the Ku Klux Klan in the 1920s. The new Klan
targeted Catholics, Jews, and immigrants, as well as African Americans. It
thrived in the Midwest and Far West, as well as in the South. With its women’s
auxiliary, the Women of the Klan, it raised millions of dollars and wielded
political power in several states, including Oklahoma, Oregon, and Indiana.
Conflict also arose over religious
fundamentalism. In 1925 John T. Scopes, a Tennessee schoolteacher, was tried for
breaking a state law that prohibited the teaching of the theory of evolution in
schools. This theory, its foes said, contradicted the account of creation in the
Bible. Scopes and the American Civil Liberties Union believed that the law
violated freedom of speech, an argument made by Scopes’s lawyer, Clarence
Darrow. Reporters converged on Dayton, Tennessee, to witness the courtroom
battle between traditionalism and modernism. Scopes was convicted, although the
verdict was later reversed on technical grounds (see Scopes Trial).
The battle over Prohibition, finally,
symbolized the divisive spirit of the 1920s. “Drys” favored Prohibition and
“wets” opposed it. The Volstead Act of 1919, which enforced the 18th Amendment,
prohibited the manufacture, sale, or transportation of alcoholic beverages, but
was riddled with loopholes. Organized crime entered the liquor business; rival
gangs and networks of speakeasies induced a crime wave. By the end of the 1920s,
Prohibition was discredited, and it was repealed in 1933.
Meanwhile, the conflict between “wets”
and “drys” played a role in the presidential election of 1928. The Democratic
candidate, Al Smith, governor of New York, was a machine politician and a “wet,”
who represented urban, immigrant constituencies. Republican Herbert Hoover, an
engineer from Iowa, was a “dry” who represented rural, traditional
constituencies. A foe of government intervention in the economy, Hoover
envisioned a rational economic order in which corporate leaders acted for the
public good. Promising voters “a chicken for every pot and a car in every
garage,” Hoover won a substantial majority of votes, except in the nation’s
largest cities. But he had the misfortune to assume office just before the
nation encountered economic collapse.
XX | THE GREAT DEPRESSION |
In 1929, Hoover’s first year as president,
the prosperity of the 1920s capsized. Stock prices climbed to unprecedented
heights, as investors speculated in the stock market. The speculative binge, in
which people bought and sold stocks for higher and higher prices, was fueled by
easy credit, which allowed purchasers to buy stock “on margin.” If the price of
the stock increased, the purchaser made money; if the price fell, the purchaser
had to find the money elsewhere to pay off the loan. More and more investors
poured money into stocks. Unrestrained buying and selling fed an upward spiral
that ended on October 29, 1929, when the stock market collapsed. The great crash
shattered the economy. Fortunes vanished in days. Consumers stopped buying,
businesses retrenched, banks cut off credit, and a downward spiral began. The
Great Depression that began in 1929 would last through the 1930s.
A | Causes of the Depression |
The stock market crash of 1929 did not
cause the Great Depression, but rather signaled its onset. The crash and the
depression sprang from the same cause: the weaknesses of the 1920s economy. An
unequal distribution of income meant that working people and farmers lacked
money to buy durable goods. Crisis prevailed in the agricultural sector, where
farmers produced more than they could sell, and prices fell. Easy credit,
meanwhile, left a debt burden that remained unpayable.
The crisis also crossed the Atlantic.
The economies of European nations collapsed because they were weakened by war
debts and by trade imbalances; most spent more on importing goods from the
United States than they earned by exporting. European nations amassed debts to
the United States that they were unable to repay. The prosperity of the 1920s
rested on a weak foundation.
B | Effects of the Depression |
After the crash, the economy raced
downhill. Unemployment, which affected 3 percent of the labor force in 1929,
reached 25 percent in 1933. With one out of four workers unemployed, people
stopped spending money. Demand for durable goods—housing, cars, appliances—and
luxuries declined, and production faltered. By 1932 the gross national product
had been cut by almost one-third. By 1933 over 5,000 banks had failed, and more
than 85,000 businesses had gone under.
The effects of the Great Depression were
devastating. People with jobs had to accept pay cuts, and they were lucky to
have work. In cities, the destitute slept in shanties that sprang up in parks or
on the outskirts of town, wrapped up in “Hoover blankets” (newspapers) and
displaying “Hoover flags” (empty pockets). On the Great Plains, exhausted land
combined with drought to ravage farms, destroy crops, and turn agricultural
families into migrant workers. An area encompassing parts of Kansas, Oklahoma,
Texas, New Mexico, and Colorado became known as the Dust Bowl. Family life
changed drastically. Marriage and birth rates fell, and divorce rates rose.
Unemployed breadwinners grew depressed; housewives struggled to make ends meet;
young adults relinquished career plans and took whatever work they could
get.
C | Relief Efforts |
Modest local welfare resources and
charities barely made a dent in the misery. In African American communities,
unemployment was disproportionately severe. In Chicago in 1931, 43.5 percent of
black men and 58.5 percent of black women were out of work, compared with 29.7
percent of white men and 19.1 percent of white women. As jobs vanished in the
Southwest, the federal government urged Mexican Americans to return to Mexico;
some 300,000 left or were deported.
On some occasions, the depression called
up a spirit of unity and cooperation. Families shared their resources with
relatives, and voluntary agencies offered what aid they could. Invariably, the
experience of living through the depression changed attitudes for life. “There
was one major goal in my life,” one woman recalled, “and that was never to be
poor again.”
President Hoover, known as a progressive
and humanitarian, responded to the calamity with modest remedies. At first, he
proposed voluntary agreements by businesses to maintain production and
employment; he also started small public works programs. Hoover feared that if
the government handed out welfare to people in need, it would weaken the moral
fiber of America.
Hoover finally sponsored a measure to
help businesses in the hope that benefits would “trickle down” to others. With
his support, Congress created the Reconstruction Finance Corporation in 1932,
which gave generous loans to banks, insurance companies, and railroads. But the
downward spiral of price decline and job loss continued. Hoover’s measures were
too few, too limited, and too late.
Hoover’s reputation suffered further
when war veterans marched on Washington to demand that Congress pay the bonuses
it owed them (see Bonus March). When legislators refused, much of the
Bonus Army dispersed, but a segment camped out near the Capitol and refused to
leave. Hoover ordered the army under General Douglas MacArthur to evict the
marchers and burn their settlement. This harsh response to veterans injured
Hoover in the landmark election of 1932, in which he faced Democrat Franklin Delano
Roosevelt. Roosevelt was New York’s governor and a consummate politician. He
defeated Hoover, winning 57 percent of the popular vote; the Democrats also took
control of both houses of Congress. Voters gave Roosevelt a mandate for
action.
D | The New Deal |
Roosevelt was a progressive who had been
a supporter of Woodrow Wilson. He believed in active government and
experimentation. His approach to the Great Depression changed the role of the
U.S. government by increasing its power in unprecedented ways.
Roosevelt gathered a “brain
trust”—professors, lawyers, business leaders, and social welfare proponents—to
advise him, especially on economic issues. He was also influenced by his
cabinet, which included Secretary of the Interior Harold Ickes, Secretary of
State Cordell Hull, Secretary of Agriculture Henry Wallace, and Labor Secretary
Frances Perkins, the first woman cabinet member. A final influence on Roosevelt
was his wife, Eleanor, whose activist philosophy had been shaped by the women’s
movement. With Eleanor Roosevelt in the White House, the disadvantaged gained an
advocate. Federal officials sought her attention, pressure groups pursued her,
journalists followed her, and constituents admired her.
D1 | The First New Deal |
Unlike Hoover, Roosevelt took strong
steps immediately to battle the depression and stimulate the U.S. economy. When
he assumed office in 1933, a banking crisis was in progress. More than 5,000
banks had failed, and many governors had curtailed banking operations. Roosevelt
closed the banks, and Congress passed an Emergency Banking Act, which saved
banks in sounder financial shape. After the “bank holiday,” people gradually
regained confidence in banks. The United States also abandoned the gold standard
and put more money into circulation.
Next, in what was known as the First
Hundred Days, Roosevelt and the Democratic Congress enacted a slew of measures
to combat the depression and prevent its recurrence. The measures of 1933
included: the Agricultural Adjustment Act, which paid farmers to curtail their
production (later upset by the Supreme Court); the National Industrial Recovery
Act (NIRA), which established codes of fair competition to regulate industry and
guaranteed labor’s right to collective bargaining (again, the law was overturned
in 1935); and the Public Works Administration, which constructed roads, dams,
and public buildings. Other acts of the First Hundred Days created the Federal
Deposit Insurance Corporation, which insured deposits in banks in case banks
failed, and the Tennessee Valley Authority (TVA), which provided electric power
to areas of the southeast. The government also set up work camps for the
unemployed, refinanced mortgages, provided emergency relief, and regulated the
stock market through the Securities and Exchange Commission.
The emergency measures raised
employment, but the New Deal evoked angry criticism. On the right, conservative
business leaders and politicians assailed New Deal programs. In popular radio
sermons, Father Charles Coughlin, once a supporter of Roosevelt, denounced the
administration’s policies and revealed nativist, anti-Semitic views. The Supreme
Court, appointed mainly by Republicans, was another staunch foe; it struck down
many pieces of New Deal legislation, such as the NIRA, farm mortgage relief, and
the minimum wage.
On the left, critics believed that
Roosevelt had not done enough and endorsed stronger measures. In California,
senior citizens rallied behind the Townsend Plan, which urged that everyone over
the age of 65 receive $200 a month from the government, provided that each
recipient spend the entire amount to boost the economy. The plan’s popularity
mobilized support for old-age pensions. In Louisiana, Democratic governor Huey
Long campaigned for “soak the rich” tax schemes that would outlaw large incomes
and inheritances, and for social programs that would “Share Our Wealth” among
all people. The growing Communist Party, finally, urged people to repudiate
capitalism and to allow the government to take over the means of
production.
D2 | The Second New Deal |
In 1935 the New Deal veered left with
further efforts to promote social welfare and exert federal control over
business enterprise. The Securities Exchange Act of 1934 enforced
honesty in issuing corporate securities. The Wagner Act of 1935 recognized
employees’ bargaining rights and established a National Labor Relations Board to
oversee relations between employers and employees. Finally, the Works Progress
Administration put unemployed people to work on short-term public projects.
New Dealers also enacted a series of
measures to regulate utilities, to increase taxes on corporations and citizens
with high incomes, and to empower the Federal Reserve Board to regulate the
economy. Finally, the administration proposed the Social Security Act of 1935,
which established a system of unemployment insurance, old-age pensions, and
federal grants to the states to aid the aged, the handicapped, and families with
dependent children. Largely an insurance program, Social Security was the
keystone of welfare policy for decades to come.
In the election of 1936, Roosevelt
defeated his Republican opponent, Alf Landon, in a landslide and carried every
state but Maine and Vermont. The election confirmed that many Americans accepted
and supported the New Deal. It also showed that the constituency of the
Democratic Party had changed. The vast Democratic majority reflected an amalgam
of groups called the New Deal coalition, which included organized labor,
farmers, new immigrants, city dwellers, African Americans (who switched their
allegiance from the party of Lincoln), and, finally, white Southern
Democrats.
At the start of Roosevelt’s second
term in 1937, some progress had been made against the depression; the gross
output of goods and services reached its 1929 level. But there were
difficulties in store for the New Deal. Republicans resented the
administration’s efforts to control the economy. Unemployment was still high,
and per capita income was less than in 1929. The economy plunged again in the
so-called Roosevelt recession of 1937, caused by reduced government spending and
the new Social Security taxes. To battle the recession and to stimulate the
economy, Roosevelt initiated a spending program. In 1938 New Dealers passed a
second Agricultural Adjustment Act to replace the first one, which the Supreme
Court had overturned, as well as the Wagner Housing Act, which funded
construction of low-cost housing.
Meanwhile, the president battled the
Supreme Court, which had upset several New Deal measures and was ready to
dismantle more. Roosevelt attacked indirectly; he asked Congress for power to
appoint an additional justice for each sitting justice over the age of 70. The
proposal threatened the Court’s conservative majority. In a blow to Roosevelt,
Congress rejected the so-called court-packing bill. But the Supreme Court
changed its stance and began to approve some New Deal measures, such as the
minimum wage in 1937.
During Roosevelt’s second term, the
labor movement made gains. Industrial unionism (unions that welcomed all the
workers in an industry) now challenged the older brand of craft unionism
(skilled workers in a particular trade), represented by the American Federation
of Labor (AFL). In 1936 John L. Lewis, head of the United Mine Workers of
America (UMWA), left the AFL to organize a labor federation based on industrial
unionism. He founded the Committee for Industrial Organizations, later known as
the Congress of Industrial Organizations (CIO). Industrial unionism spurred a
major sit-down strike in the auto industry in 1937. Next, violence erupted at a
steelworkers’ strike in Chicago, where police killed ten strikers. The auto and
steel industries, however, agreed to bargain collectively with workers, and
these labor victories led to a surge in union membership.
Finally, in 1938 Congress passed
another landmark law, the Fair Labor Standards Act (FLSA). It established
federal standards for maximum hours and minimum wages for workers in industries
involved in interstate commerce. At first the law affected only a minority of
workers, but gradually Congress extended it so that by 1970 it covered most
employees. In the 1930s, however, many New Deal measures, such as labor laws,
had a limited impact. African Americans, for instance, failed to benefit from
FLSA because they were engaged mainly in nonindustrial jobs, such as
agricultural or domestic work, which were not covered by the law. New Deal
relief programs also sometimes discriminated by race.
The New Deal never ended the Great
Depression, which continued until the United States’ entry into World War II
revived the economy. As late as 1940, 15 percent of the labor force was
unemployed. Nor did the New Deal redistribute wealth or challenge capitalism.
But in the short run, the New Deal averted disaster and alleviated misery, and
its long-term effects were profound.
One long-term effect was an activist
state that extended the powers of government in unprecedented ways, particularly
in the economy. The state now moderated wild swings of the business cycle, stood
between the citizen and sudden destitution, and recognized a level of
subsistence beneath which citizens should not fall.
The New Deal also realigned political
loyalties. A major legacy was the Democratic coalition, the diverse groups of
voters, including African Americans, union members, farmers, and immigrants, who
backed Roosevelt and continued to vote Democratic.
The New Deal’s most important legacy
was a new political philosophy, liberalism, to which many Americans remained
attached for decades to come. By the end of the 1930s, World War II had broken
out in Europe, and the country began to shift its focus from domestic reform to
foreign policy and defense.
XXI | AMERICA AND WORLD WAR II |
The roots of World War II can be found in
the debris of World War I, which left legacies of anger and hardship. After
World War I, the Treaty of Versailles imposed large reparations on Germany. The
reparations and wartime destruction caused severe economic problems in postwar
Germany. Other European nations grappled with war debts, hunger, homelessness,
and fear of economic collapse. Under these circumstances, totalitarianism
spread.
From 1922 to 1953 dictator Joseph Stalin
controlled the Union of Soviet Socialist Republics (USSR), which was formed
after the Russian Revolution of 1917. The USSR became a police state that
suppressed opponents and deprived citizens of rights. Elsewhere, militarism and
expansionism gained ground. In the 1930s the Japanese military won influence,
and Japan began to expand its territory. In 1931 Japan attacked the Chinese
province of Manchuria. Condemned by the League of Nations for its attack, Japan
quit the league. Italy turned to fascism, a strong centralized government headed
by a powerful dictator and rooted in nationalism. Fascist leader Benito
Mussolini seized power in Italy in 1922.
In Germany, the Nazi Party, led by Adolf
Hitler, came to power (see National Socialism). Hitler believed that
Aryans were a master race destined for world rule. He sought to form a great
German empire—one that gave the German people, in his words, “the land and the
soil to which they are entitled on this earth.” Global depression in the 1930s
helped bring the Nazis to power. In 1932, with 6 million Germans out of work,
the Nazis won more votes than any other party, and in 1933, just as Roosevelt
took office, Hitler became chancellor of Germany. Like Japan, Germany quit
the League of Nations.
Germany soon revealed its expansionist
goals. In 1933 Hitler began to build up the German military, in violation of the
Treaty of Versailles. In 1936 he sent troops into the Rhineland, a demilitarized
region in western Germany. The same year, Hitler and Mussolini signed an
alliance, the Rome-Berlin Axis Pact. In 1940 the alliance was extended to
include Japan. The three nations—Germany, Italy, and Japan—became the Axis
Powers. The start of World War II was near.
A | Isolationism vs. Internationalism |
Most Americans of the 1930s recoiled
from involvement in the European conflict; they favored U.S. isolationism, and
many supported pacifism. Some believed that “merchants of death” (bankers and
arms dealers) had lured the United States into World War I. The Roosevelt
administration, too, tried to maintain friendly foreign relations. Roosevelt
recognized the USSR in 1933 and set up a Good Neighbor Policy with Latin
America. No state, the United States said, had the right to intervene in the
affairs of another. Roosevelt also made progress toward lower tariffs and free
trade. In 1935 and 1936, Congress passed a group of neutrality acts to keep the
United States out of Europe’s troubles. The first two acts banned arms sales or
loans to nations at war. The third act, a response to the Spanish Civil War
(1936-1939), extended the ban to nations split by civil war.
But as conflict spread abroad,
Americans discarded their neutral stance. Many opposed fascist forces in the
civil war in Spain. There, democratic armies fell to dictator Francisco Franco,
who was supported by Hitler and Mussolini. Japan launched a new attack on China
in July 1937 to obtain more Chinese territory. It quickly overran northern
China. Hitler marched through Europe. Germany in 1938 annexed Austria and the
Sudetenland, then in March 1939 seized the rest of Czechoslovakia without
resistance. In August 1939 Hitler and Stalin
signed a nonaggression pact. On September 1, 1939, Hitler invaded Poland, which
led Britain and France to declare war on Germany. Americans increasingly doubted
that the United States could avoid becoming involved.
In September 1939 Roosevelt called
Congress into special session to revise the neutrality acts. The president
offered a plan known as cash-and-carry, which permitted Americans to sell
munitions to nations able to pay for them in cash and able to carry them away in
their own ships. Isolationists objected, but Congress passed the Neutrality Act
of 1939, which legitimized cash-and-carry and allowed Britain and France to buy
American arms. The war in Europe, meanwhile, grew more dire for the Allies. In
June 1940 Germany conquered France, and British troops that had been in France
retreated across the English Channel. Then German bombers began to pound
Britain.
In June 1940 the United States started
supplying Britain with “all aid short of war” to help the British defend
themselves against Germany. Roosevelt asked Congress for more funds for national
defense. Congress complied and began the first American peacetime military
draft, the Selective Training and Service Act, under which more than 16 million
men were registered. After the 1940 election, Roosevelt urged that the United
States become “the great arsenal of democracy.” In 1941 he and British prime
minister Winston Churchill announced the Atlantic Charter, which set forth
Allied goals for World War II and the postwar period. The two nations pledged to
respect “the right of all peoples to choose the form of government under which
they will live” and promised a free world without war “after the final
destruction of Nazi tyranny.” Isolationists criticized each move towards war;
however, the United States was still not actually at war.
In 1941 the conflict worsened. Despite
the nonaggression pact, German armies invaded the USSR. Meanwhile, as Japan
continued to invade areas in Asia, U.S. relations with Japan crumbled. On
December 7, 1941, Japan attacked a U.S. naval base at Pearl Harbor in Hawaii.
The next day it attacked the main American base in the Philippines. In response,
the United States declared war on Japan, although not on Germany; Hitler acted
first and declared war on the United States. The United States committed itself
to fighting the Axis powers as an ally of Britain and the other Allied nations.
B | The Nation at War |
Even before Pearl Harbor, the American
government had begun to mobilize for war. After the attack, the United States
focused its attention on the war effort. World War II transformed the federal
government, which mushroomed in size and power. The federal
budget skyrocketed, and the number of federal civilian employees tripled. The
war also made the United States a military and economic world power.
The armed forces expanded as volunteers
and draftees enrolled, growing to almost 12 million men and 260,000 women by
1945. Roosevelt formed the Joint Chiefs of Staff, a military advisory group, to
manage the huge military effort. New federal agencies multiplied. The Office of
Strategic Services gathered intelligence and conducted espionage, the War
Production Board distributed manufacturing contracts and curtailed manufacture
of civilian goods, and the War Manpower Commission supervised war industry,
agriculture, and the military. Other wartime agencies resolved disputes between
workers and management; battled inflation, set price controls, and imposed
rations on scarce items; turned out propaganda; and oversaw broadcasting and
publishing.
As the United States moved to a wartime
economy, the depression ended, and the U.S. economy came to life. Industry
swiftly shifted to war production, automakers began turning out tanks and
planes, and the United States became the world’s largest weapons manufacturer.
New industries emerged, such as synthetic rubber, which compensated for the loss
of rubber supplies when Japan seized the Dutch East Indies and Malaya. The war
economy brought new opportunities. Americans experienced virtually full
employment, longer work weeks, and (despite wage controls) higher earnings.
Unions gained members and negotiated unprecedented benefits. Farmers prospered,
too. Crop prices rose, production increased, and farm income tripled.
Labor scarcity drew women into the war
economy. During the depression, the federal government had urged women to cede
jobs to male breadwinners. However, when the war began, it sought women to work
in war production. More than 6 million women entered the work force in wartime;
women’s share of the labor force leaped from 25 percent in 1940 to 35 percent in
1945. Three-quarters of the new women workers were married, a majority were over
35, and over a third had children under 14. Many women held untraditional jobs
in the well-paid blue collar sector—in shipyards and in airplane plants, as
welders and crane operators. Women found new options in civilian vocations and
professions, too. Despite women’s gains in the workplace, many people retained
traditional convictions that women should not work outside the home. Government
propaganda promoted women’s war work as only a temporary response to an
emergency.
Members of minorities who had been out
of jobs in the 1930s also found work in the war economy. Hundreds of thousands
of African Americans migrated from the South to Northern industrial cities to
work in war industries. More than 1 million black people served in the armed
forces in segregated units; the government ended its policy of excluding blacks
from combat.
As Northern black urban populations
grew, racial violence sometimes erupted, as in the Detroit race riots of June
1943. African Americans linked the battle against Nazis abroad with the fight
for racial justice at home. Membership in the NAACP increased tenfold, and
another civil rights organization, the Congress of Racial Equality (CORE), began
in 1942. Early in 1941, labor leader A. Philip Randolph met with Roosevelt
administration officials to demand equal employment for blacks in industries
working under federal government defense contracts. Randolph threatened to lead
100,000 African Americans in a march on Washington, D.C., to protest job
discrimination. In response, Roosevelt issued a directive banning racial
discrimination in federal hiring practices and established the Fair Employment
Practices Commission. Like African Americans, Mexican Americans and Native
Americans had more job opportunities.
For all Americans, war changed the
quality of life. World War II inspired hard work, cooperation, and patriotism.
Citizens bought war bonds, saved scrap metal, and planted victory gardens. They
coped with rationing and housing shortages. The war also caused population
movement. Americans flocked to states with military bases and defense plants; 6
million migrants left for cities, many on the West Coast, where the defense
industry was concentrated. School enrollment sank as teenagers took jobs or
joined the armed services. People became more concerned about family life,
especially about working mothers, juvenile delinquency, and unruly
teenagers.
The United States began to receive
reports of the Holocaust—the Nazi effort to exterminate all of Europe’s Jews—in
1942, and the State Department recognized Hitler’s genocide by the end of that
year. However, the U.S. government gave precedence to other war matters and did
not found a War Refugee Board until 1944. The board aided in the rescue and
relocation of surviving Nazi victims, but its efforts came too late and were too
limited to save Europe’s Jews; approximately two-thirds of the Jewish population
in Europe was murdered during the war.
In the United States, civil liberties
were casualties of the war. In February 1942 the president authorized the
evacuation of all Japanese from the West Coast. The U.S. government interned
around 120,000 Japanese Americans, two-thirds of them native-born U.S. citizens,
in relocation centers run by the War Relocation Authority. The internment policy
reflected anti-Japanese sentiment on the West Coast that was rooted in economic
rivalry, racial prejudice, and fear of Japanese sabotage after Pearl Harbor.
(The policy affected only the mainland United States, not Hawaii, where more
than 150,000 residents of Japanese descent lived and where the United States
imposed martial law for almost three years.) Forced to sell their land and
homes, the West Coast internees ended up behind barbed wire in remote western
areas. In 1944 the Supreme Court ruled that the evacuation and internment were
constitutional in Korematsu v. United States. By then, however,
the government had started to release the internees. In 1988 Congress apologized
and voted to pay $20,000 in compensation to each of 60,000 surviving
internees.
C | Global War |
Ever since 1941, when Roosevelt and
Churchill issued the Atlantic Charter outlining war goals, the president had
considered the war’s conclusion. At wartime conferences, Allied leaders looked
ahead to the war’s end. In January 1943, for instance, Britain and the United
States met at Casablanca, Morocco, and agreed not to lay down arms until certain
conditions were met: Germany, Italy, and Japan had to surrender unconditionally,
give up all conquered territory, and renounce the ideologies that spurred
aggression. At subsequent meetings, the Allied leaders reiterated this pledge
and also considered postwar occupation plans and divisions of territory.
However, the Western powers and the USSR did not trust one another and disagreed
on the postwar future of nations on the Soviet border.
In 1944 the war in the European theater
reached a climax. On the eastern front, Soviet armies had pushed Germany out of
the USSR. A turning point had come in early 1943 at Stalingrad, where the German
Sixth Army surrendered to Soviet troops. The USSR then moved into Poland and the
Balkans, and pushed the Allies to open a second front in Western Europe. The
Allied armies, under General Dwight D. Eisenhower, prepared a huge invasion of
western France. On June 6, 1944, known as D-Day, thousands of vessels and
aircraft carrying British, Canadian, and American troops crossed the English Channel
and landed on the Normandy coast of France.
Allied armies, led by General George S.
Patton, smashed through German lines and started for Paris. Another Allied army
invaded southern France and pressed northward. On August 25, 1944, the Allied
forces liberated Paris after four years of Nazi rule. The Germans continued to
fight in eastern France. Hitler launched a last, desperate offensive, the Battle
of the Bulge, in December 1944. The offensive failed, and German armies were
forced to retreat. Allied armies entered Germany in March 1945, while the
Soviets moved toward Berlin from the east. Germany surrendered on May 8, 1945.
The war in Europe was over.
The treacherous Pacific war—a great
land, air, and sea battle—continued. After the attack on Pearl Harbor, Japan
conquered the Philippines, Malaya, the Dutch East Indies, and Burma. Troops from
the United States, Britain, Australia, and New Zealand tried to stop the
Japanese advance, which reached its peak in the spring of 1942. The turning
point of the Pacific war came in June 1942, at the Battle of Midway. The
American victory at Midway ended the Japanese navy’s hope of controlling the
Pacific. The United States then began a long counteroffensive and recaptured
Pacific islands that the Japanese had occupied. In October 1944 the United
States finally smashed the Japanese fleet in the Battle of Leyte Gulf in the
Philippines.
But Japan refused to surrender. The
United States wanted to end the war with unconditional surrender from Japan. It
also wanted to avoid more battles like those on Iwo Jima and Okinawa, where U.S.
casualties had been heavy. These factors spurred U.S. plans to use the atomic
bomb.
The United States in late 1941
established a secret program, which came to be known as the Manhattan Project,
to develop an atomic bomb, a powerful explosive nuclear weapon. The aim of the
project, directed by physicist J. Robert Oppenheimer, was to build an atom bomb
before Germany did. After Roosevelt’s death in April 1945, Harry S. Truman
became president and inherited the bomb-development program. At this point, the
new weapon had two purposes. First, it could be used to force Japan to
surrender. Second, possession of the bomb would enable the United States, and
not the USSR, to control postwar policy.
Should the United States use the bomb
to finally end the war with Japan? What were American options in 1945? One
option was to invade Japan, which Truman believed would cost half a million
American lives. Some historians have since estimated the likely loss of life at
25,000 to 46,000, although these figures probably cover just the first stage of
a projected November invasion. A second option was not to demand unconditional
surrender but to negotiate with Japan. A third alternative was to let a Soviet
invasion end the war against Japan, which would have diminished U.S. influence
in postwar policy. Scientists who developed the bomb debated what to do with it.
Some found it wrong to drop the bomb without warning and supported a
demonstration explosion to convince Japan to surrender. In Oppenheimer’s view,
this course of action was too uncertain and risky; only the shock of using the
bomb on a Japanese city would force Japan to surrender. President Truman
agreed.
On August 6, 1945, the United States
dropped an atomic bomb on Hiroshima. In minutes, half of the city vanished.
According to U.S. estimates, 60,000 to 70,000 people were killed or missing as a
result of the bomb, and deadly radiation affected more than 100,000 others. On August 8, the USSR
declared war on Japan. On August 9, the United States dropped an even more
powerful bomb on Nagasaki. According to U.S. estimates, 40,000 people were
killed or never found as a result of the second bomb. On September 2, the
Japanese government, which had seemed ready to fight to the death, surrendered
unconditionally.
Should the United States have used the
bomb? Critics of the decision decry the loss of life. They contend that any of
the alternatives was preferable. Others assert that only the bomb, used in the
way that it was, could have ended the war. Above all, they argue, it saved
countless American lives. American GIs, who had been shipped halfway around the
world to invade Japan after Germany surrendered, were elated. The bomb also
precluded a Soviet invasion of Japan and gave the United States the upper hand
in the postwar world. “Let there be no mistake about it,” Truman later wrote, “I
regarded the bomb as a military weapon and never had any doubt that it should be
used.”
D | Effects of the War |
After World War II ended, the use of
the atomic bomb changed the world in many ways. Nuclear power led to a
four-decade-long arms race between the United States and the USSR, and nuclear
annihilation continues to threaten the world today. At the same time, nuclear
power enabled scientists to develop new sources of energy.
During the war, other technological and
medical advances were developed that saved lives and improved living standards
in the decades ahead. Penicillin, a “miracle drug” first used to treat Allied
casualties, was used at home to defeat disease, reduce infant deaths, and extend
life expectancy. DDT, a colorless chemical pesticide, destroyed harmful insects
and prevented typhus and malaria. New fuel mixtures extended the range of
warplanes and later of civilian planes; jet propulsion planes transformed
transoceanic flights and were in commercial use by the late 1950s. Other facets
of technology developed during World War II included radar, semiconductors,
freeze-dried food, infrared technologies, and synthetic materials.
World War II ended Nazi barbarism and
vanquished totalitarian power that threatened to conquer the globe. The cost of
the war was immense. Allied military and civilian losses were 44 million; those
of the Axis, 11 million. The United States lost almost 300,000 people in battle
deaths, which was far less than the toll in Europe and Asia. At home, the war
quenched isolationism, ended the depression, provided unprecedented social and
economic mobility, fostered national unity, and vastly expanded the federal
government. The U.S. government spent more than $300 billion on the war effort,
which generated jobs and prosperity and renewed confidence. Finally, World War
II made the United States the world’s leading military and economic force. With
the Axis threat obliterated, the United States and the USSR became rivals for
global dominance.
XXII | THE COLD WAR |
At the end of World War II, the United
States and the USSR emerged as the world’s major powers. They also became
involved in the Cold War, a state of hostility (short of direct military
conflict) between the two nations. The clash had deep roots, going back to the
Russian Revolutions of 1917, when after the Bolshevik victory, the United
States, along with Britain, France, and Japan, sent troops to Russia to support
the anti-Communists. During World War II, the United States and the USSR were
tenuously allied, but they disagreed on tactics and on postwar plans. After the
war, relations deteriorated. The United States and the USSR had different
ideologies, and they mistrusted each other. The Soviet Union feared that the
United States, the leader of the capitalist world, sought the downfall of
Communism. The United States felt threatened by Soviet expansionism in Europe,
Asia, and the western hemisphere.
The United States and the Soviet Union
disagreed over postwar policy in central and eastern Europe. The USSR wanted to
demilitarize Germany to prevent another war; to control Poland to preclude any
future invasion from its west; and to dominate Eastern Europe. Stalin saw Soviet
domination of Eastern Europe as vital to Soviet security. Within months of the
war’s end, Stalin installed pro-Soviet governments in Bulgaria, Hungary, and
Romania. Independent Communist takeovers in Albania and Yugoslavia provided two
more “satellite nations.” Finally, the Soviets barred free elections in Poland
and suppressed political opposition. In March 1946 former British prime minister
Winston Churchill told a college audience in Fulton, Missouri, that a
Soviet-made “Iron Curtain” had descended across Europe.
President Harry S. Truman, enraged at
the USSR’s moves, at once assumed a combative stance. He believed that Soviet
expansion into Poland and Eastern Europe violated national self-determination,
or the right of people to choose their own form of government; betrayed
democratic principles; and threatened the rest of Europe. In contrast to the
USSR, the United States envisioned a united, peaceful Europe that included a
prosperous Germany. Truman became an architect of American Cold War policy. So
did State Department official George Kennan, then stationed in Moscow, who in
1946 warned of Soviet inflexibility. The United States, wrote Kennan, would have
to use “vigilant containment” to deter the USSR’s inherent expansionist
tendencies. The doctrine of containment became a principle of U.S. policy for
the next several decades.
Throughout 1946 a sequence of events
drew the United States and the USSR deeper into conflict. One area of conflict
was defeated Germany, which had been split after the war into four zones:
American, British, French, and Soviet. Stalin sealed off East Germany as a
Communist state. The two countries also encountered problems beyond Europe.
In 1945 and 1946, the Soviet Union
attempted to include Turkey within its sphere of influence and to gain control
of the Dardanelles, the strait in Turkey connecting the Aegean Sea and the Sea
of Marmara. Control of the Dardanelles would give the USSR a route from the
Black Sea to the Mediterranean. In response, Truman offered Turkey large-scale
aid, and the two countries entered a close military and economic alliance.
Meanwhile, an arms race began; each superpower rejected the other’s plans to
control nuclear arms, and the United States established the Atomic Energy
Commission to oversee nuclear development. Within the year, the Cold War was
under way.
A | The Truman Doctrine |
In 1947 the Cold War conflict centered
on Greece, where a Communist-led resistance movement, supported by the USSR and
Communist Yugoslavia, threatened to overthrow the Greek monarchical government,
supported by Britain. When the British declared that they were unable to aid the
imperiled Greek monarchists, the United States acted. In March 1947 the
president announced the Truman Doctrine: The United States would help stabilize
legal foreign governments threatened by revolutionary minorities and outside
pressures. Congress appropriated $400 million to support anti-Communist forces
in Turkey and Greece. By giving aid, the United States signaled that it would
bolster regimes that claimed to face Communist threats. As George Kennan
explained in an article in Foreign Affairs magazine in 1947,
“containment” meant using “unalterable counterforce at every point” until Soviet
power ended or faded.
In 1947 the United States further
pursued its Cold War goals in Europe, where shaky postwar economies seemed to
present opportunities for Communist gains. The American Marshall Plan, an
ambitious economic recovery program, sought to restore productivity and
prosperity to Europe and thereby prevent Communist inroads (see European
Recovery Program). The plan ultimately pumped more than $13 billion into western
European economies, including occupied Germany. Stalin responded to the new U.S.
policy in Europe by trying to force Britain, France, and the United States out
of Berlin. The city was split between the Western powers and the USSR, although
it was deep within the Soviet zone of Germany. The Soviets cut off all access to
Berlin from the parts of Germany controlled by the West. Truman, however, aided
West Berlin by airlifting supplies to the city from June 1948 to May 1949
(see Berlin Airlift).
B | NATO |
In 1949 the United States joined 11
other nations (Belgium, Britain, Canada, Denmark, France, Iceland, Italy,
Luxembourg, the Netherlands, Norway, and Portugal) to form the North Atlantic
Treaty Organization (NATO), a mutual defense pact. Members of NATO pledged that
an attack on one would be an attack on all. Stalin responded by uniting the
economies of Eastern Europe under the Council for Mutual Economic Assistance
(COMECON). Then late in 1949, Americans learned that the Soviets had
successfully exploded an atomic bomb in August. Finally, in February 1950,
Stalin signed an alliance with the People’s Republic of China, a Communist state
formed in 1949.
The doctrine of “containment” now
faced major challenges. To bolster the containment policy, U.S. officials proposed
in a secret 1950 document, NSC-68, to strengthen the nation’s alliances, to
quadruple defense spending, and to convince Americans to support the Cold War.
Truman ordered the Atomic Energy Commission to develop a hydrogen bomb many
times more destructive than an atomic bomb. In Europe, the United States
supported the independence of West Germany.
Finally, the United States took
important steps to contain Communism in Europe and Asia. In Europe, the United
States supported the rearmament of West Germany. In Asia in early 1950, the
United States offered assistance to France to save Vietnam (still French
Indochina) from Communist rule, and signed a peace treaty with Japan to ensure
the future of American military bases there. Responding to the threats in Asia,
Stalin endorsed a Communist reprisal in Korea, where fighting broke out between
Communist and non-Communist forces.
C | The Korean War |
Japan had occupied Korea during World
War II. After Japan’s defeat, Korea was divided along the 38th parallel into the
Communist Democratic People’s Republic of Korea in the north and the U.S.-backed
Republic of Korea in the south. After June 1949, when the United States withdrew
its army, South Korea was left vulnerable. A year later, North Korean troops
invaded South Korea. Truman reacted quickly. He committed U.S. forces to Korea,
sent General Douglas MacArthur there to command them, and asked the United
Nations to help protect South Korea from conquest.
MacArthur drove the North Koreans back
to the dividing line. Truman then ordered American troops to cross the 38th
parallel and press on to the Chinese border. China responded in November 1950
with a huge counterattack that decimated U.S. armies. MacArthur demanded
permission to invade mainland China, which Truman rejected, and then repeatedly
assailed the president’s decision. In 1951 Truman fired him for insubordination.
By then, the combatants had separated near the 38th parallel. The Korean War did
not officially end until 1953, when President Dwight Eisenhower imposed a
precarious armistice. Meanwhile, the Korean War had brought about rearmament,
hiked the U.S. military budget, and increased fears of Communist aggression
abroad and at home.
D | Cold War at Home |
As the Cold War intensified, it
affected domestic affairs. Many Americans feared not only Communism around the
world but also disloyalty at home. Suspicion about Communist infiltration of the
government forced Truman to act. In 1947 he sought to root out subversion
through the Federal Employee Loyalty Program. The program included a loyalty
review board to investigate government workers and fire those found to be
disloyal. The government dismissed hundreds of employees, and thousands more
felt compelled to resign. By the end of Truman’s term, 39 states had enacted
antisubversion laws and loyalty programs. In 1949 the Justice Department
prosecuted 11 leaders of the Communist Party, who were convicted and jailed
under the Smith Act of 1940. The law prohibited groups from conspiring to
advocate the violent overthrow of the government.
The Communist Party had reached the
peak of its strength in the United States during World War II, when it claimed
80,000 members. Some of these had indeed worked for the government, handled
classified material, or been part of spy networks. Although Communist party
membership had fallen to under 30,000 by the 1950s, suspicion about disloyalty
had grown. Concerned about the Sino-Soviet alliance and the USSR’s possession of
atomic weapons, many Americans feared Communist spies and Soviet penetration of
federal agencies.
Attention focused on two divisive
trials. In August 1948 Time magazine editor Whittaker Chambers, a former
Communist, accused former State Department official Alger Hiss of being a member
of the Communist Party and, subsequently, of espionage. Hiss sued Chambers for
slander, but Hiss was convicted of perjury in 1950 and jailed (see Hiss
Case). In 1951 Julius and Ethel Rosenberg were convicted of espionage for
stealing atomic secrets. They were executed two years later. Both of these
trials and convictions provoked decades of controversy. Half a century later,
the most recent evidence seems to support the convictions of Alger Hiss and
Julius Rosenberg.
Meanwhile, Congress began to
investigate suspicions of disloyalty. The House Un-American Activities Committee
(HUAC) sought to expose Communist influence in American life. Beginning in the
late 1940s, the committee called witnesses and investigated the entertainment
industry. Prominent film directors and screenwriters who refused to cooperate
were imprisoned on contempt charges. As a result of the HUAC investigations, the
entertainment industry blacklisted, or refused to hire, artists and writers
suspected of being Communists.
One of the most important figures of
this period was Senator Joseph McCarthy of Wisconsin, who gained power by
accusing others of subversion. In February 1950, a few months after the USSR
detonated its first atomic device, McCarthy claimed to have a list of Communists
who worked in the State Department. Although his accusations remained
unsupported and a Senate committee labeled them “a fraud and a hoax,” McCarthy
won a national following. Branding the Democrats as a party of treason, he
denounced his political foes as “soft on Communism” and called Truman’s loyal
secretary of state, Dean Acheson, the “Red Dean.” McCarthyism came to
mean false charges of disloyalty.
In September 1950, goaded by McCarthy,
Congress passed, over Truman’s veto, the McCarran Internal Security Act, which
established a Subversive Activities Control Board to monitor Communist influence
in the United States. A second McCarran act, the Immigration and Nationality Act
of 1952, also became law over Truman’s veto. It kept the quota system based on
national origin, although it ended the ban on Asian immigration, and required
elaborate security checks for foreigners visiting the United States.
The Cold War played a role in the
presidential contest of 1952 between Republican Dwight Eisenhower and Democrat
Adlai Stevenson. Many voters feared Soviet expansionism, Soviet atomic
explosions, and more conflicts like Korea. Eisenhower’s running mate, former
HUAC member Richard M. Nixon, charged that a Democratic victory would bring
“more Alger Hisses, more atomic spies.” Eisenhower’s soaring popularity led to
two terms as president.
McCarthy’s influence continued until
the Army-McCarthy hearings of 1954, when the Senate investigated McCarthy’s
inquiry into the army. The Senate censured him on December 2, 1954, for abusing
his colleagues, and his career collapsed. But fears of subversion continued.
Communities banned books; teachers, academics, civil servants, and entertainers
lost jobs; and unwarranted attacks ruined lives. Communists again dwindled in
number after 1956, when Stalin was revealed to have committed extensive crimes.
Meanwhile, by the end of the decade, new right-wing organizations such as the
John Birch Society condemned “creeping socialism” under Truman and Eisenhower.
McCarthyism left permanent scars.
E | The Cold War Under Eisenhower |
When Eisenhower took office in 1953,
he moved to end the war in Korea, where peace talks had been going on since
1951. Eisenhower’s veiled threat to use nuclear weapons broke the stalemate. An
armistice, signed in July 1953, set a boundary between the two Koreas near the
38th parallel. Eisenhower then reduced the federal budget and cut defense
spending. Still, he pursued the Cold War.
When Stalin died in 1953, the United
States and the USSR had an opportunity to ease tensions. However, the USSR
tested a nuclear bomb in 1954, and Eisenhower needed to appease Republicans who
urged more forceful efforts to defeat Communism. He relied on his secretary of
state, John Foster Dulles, who called for “liberation” of the captive peoples of
Eastern Europe and the end of Communism in China. Dulles was willing to bring
the world to “the brink of war” to intimidate the USSR. With reduced
conventional forces, Dulles’s diplomacy rested on threats of “massive
retaliation” and brinksmanship, a policy of never backing down in a
crisis even at the risk of war.
In 1955 the United States and the USSR met
in Geneva, Switzerland, to address mounting fears about radioactive fallout from
nuclear tests. Discussions of “peaceful coexistence” led the two nations to
suspend atmospheric tests of nuclear weapons. Still, the United States spent
more on nuclear weapons and less on conventional forces.
Dulles, meanwhile, negotiated pacts
around the world committing the United States to the defense of 43 nations. The
focus of the Cold War now shifted to the so-called Third World, where the
Central Intelligence Agency (CIA) represented U.S. interests. Established in
1947 to conduct espionage and assess information about foreign nations, the CIA
carried out covert operations against regimes believed to be Communist or
supported by Communist nations. In 1954, for example, the CIA helped bring down
a Guatemalan government that the United States believed was moving towards
Communism.
Finally, to stop the USSR from
spreading Communism, the United States became involved in Indochina and the
Middle East. In Vietnam, Ho Chi Minh, a nationalist and a Communist, led a
movement for independence from France. The Truman administration had aided
France, but in 1954 the French were defeated. An international peace conference
in Geneva divided Vietnam at the 17th parallel. The United States refused to
sign the Geneva Accords, which it believed conceded too much to the Communists.
Instead the United States sent economic aid and military advisers to South
Vietnam from 1954 to 1961. Although Eisenhower feared further involvement in
Vietnam, he supported what was called the domino theory: If Vietnam fell to
Communism, all of Southeast Asia might follow.
In the Middle East, the United States
promised a loan to Egypt’s new ruler, Gamal Abdel Nasser, to build the Aswan
High Dam on the Nile River. But when Nasser bought arms from Communist
Czechoslovakia, the United States canceled the loan. Nasser retaliated in July
1956 by nationalizing the Anglo-French Suez Canal, an artificial waterway across
the Isthmus of Suez in northeastern Egypt. Britain, France, and Israel (formed
in 1948) responded with force, which the United States condemned. The invaders
of Egypt withdrew, and the Suez crisis was defused.
In reaction to the Suez crisis, the
United States announced a new policy, the Eisenhower Doctrine: The United States
would intervene in the Middle East if necessary to protect the area against
Communism. In July 1958 the United States sent 14,000 marines to Lebanon during
a civil war that the United States feared would destabilize the region.
In the USSR, Stalin’s successor,
Nikita Khrushchev, did his part to keep the Cold War alive. He extended Soviet
influence by establishing relations with India and with other nations that were
not aligned with either side in the Cold War. In 1955 Khrushchev created the
Warsaw Pact, a military alliance of seven European Communist nations, to secure
the Soviet position in Europe. In 1956 he used force in Hungary and political
pressure in Poland to ensure continued Soviet control of those countries. He
increased Soviet power by developing a hydrogen bomb, and by launching the first
earth satellite in 1957. Finally, he formed an alliance with Cuba after Fidel
Castro led a successful revolution there in 1959.
At the end of Eisenhower’s second
term, the Cold War still dominated American foreign policy. United States
efforts around the world to quell Communist-inspired or nationalist insurgencies
sometimes caused anger. In 1958 angry crowds in Peru and Venezuela stoned Vice
President Nixon’s car. On May 1, 1960, the Soviets shot down a U-2 spy plane,
and plans for a second summit collapsed. When Eisenhower left office, he warned
against “unwarranted influence … by the military-industrial complex.” But the
nuclear arms race had intensified, and the Cold War seemed to be widening.
The Cold War brought divisiveness and
discord to the United States. Americans of the 1950s clashed over the extent of
the threat posed by Communism at home and abroad. Historians debate this
question, too, as well as the origins of the Cold War. Some contend that Soviet
aggression in the postwar era reflected valid concerns for security, and that a
series of hostile acts by the United States provoked the USSR to take
countermeasures. Others argue, variously, that Communism was inherently
expansionist; that Soviet aggression was a natural outgrowth of Communism; that
with Stalin in power, the Cold War was inevitable; that the USSR was bent on
establishing Communist regimes in every region where a power vacuum existed; and
that containment was a necessary and successful policy.
Starting in the early 1990s, scholars
have gained access to Soviet evidence that was previously unavailable. New
revelations from Russian archives, along with the declassification in 1995 and
1996 of U.S. intelligence files on intercepted Soviet spy cables (the Venona
decryptions), have made new scholarship on the Cold War era possible. Even so,
debates about U.S. Cold War policy are likely to continue.
XXIII | A WORLD OF PLENTY |
In the post-World War II decade, the
United States was the richest nation in the world. After a brief period of
postwar adjustment, the economy boomed. Consumers demanded goods and services.
Businesses produced more to meet this demand, and increased production led to
new jobs. Federal foreign aid programs, such as the Marshall Plan, provided
overseas markets for U.S. businesses. Finally, the government spent large
amounts of money by providing loans, fighting the Cold War, and funding social
programs. Government spending plus consumer demand led to an era of widespread
prosperity, rising living standards, and social mobility.
A | The Postwar Administrations |
As the nation demobilized, President
Harry S. Truman faced a political battle. A one-time courthouse politician who
owed his political success to the Democratic political machine of Kansas City,
Truman had been a liberal senator and loyal New Dealer. Assertive and
self-confident, he capably assumed the presidency after Roosevelt’s death at the
end of World War II. But in 1946 voters elected the Republican-dominated
80th Congress, the first time Republicans had won control of both houses since
1928.
In 1947 Congress passed the
Labor-Management Relations Act, known as the Taft-Hartley Act, over Truman’s
veto. The act was a restrictive labor law that handicapped labor and boosted
employer power. For instance, it banned closed shops, thereby enabling employers
to hire nonunion workers; it revived the labor injunction as a way to end
strikes and boycotts; and it allowed states to pass right-to-work laws that
forbade making union membership a condition of hiring.
Congress also rejected Truman’s
efforts to improve civil rights for African Americans. It refused to pass
federal antilynching laws or to abolish the poll tax. In 1948, however, Truman
integrated the armed forces by an executive order. He also ordered an end to
discrimination in the hiring of federal employees.
Southern Democrats never liked
Truman. At the Democratic convention of 1948, they withdrew from the party to
form a states’ rights party, the Dixiecrats. Truman also faced a challenge from
the left, when Henry Wallace ran as the presidential candidate of the
Progressive Party. Both of these challenges took Democratic votes from Truman,
and most observers expected that his Republican opponent, New York governor
Thomas E. Dewey, would defeat him. But the scrappy president won the election. The
81st Congress continued to reject his social and economic proposals,
known as the Fair Deal. Legislators defeated, for instance, a measure for
national compulsory health insurance. Still, Truman succeeded in raising the
minimum wage, extending social security coverage, and building low-income
housing.
Elected president by big margins in
1952 and 1956, Dwight D. Eisenhower enjoyed immense popularity. A pragmatic,
centrist Republican, Eisenhower believed in smaller government, fiscal
conservatism, and a businesslike administration.
Eisenhower continued some New Deal
policies. He expanded social security, raised the minimum wage, and backed a
huge public works program, the Federal Highway Act of 1956, which provided funds
for the Interstate Highway System. He also cut defense spending and presided
over an era of peace and prosperity.
In 1953 Eisenhower appointed Earl
Warren as chief justice of the Supreme Court, an appointment that began a new
era in judicial history. The Warren Court transformed the American legal system
by expanding civil rights and civil liberties. In the 1950s the Court broadened
the rights of the accused and overturned the 1949 convictions of Communist
leaders who had been tried under the Smith Act. Most important, in Brown
v. Board of Education of Topeka (1954), the Warren Court
declared that school segregation violated the equal protection clause of the
14th Amendment. Concluding that “separate educational facilities are inherently
unequal,” it declared segregated schools unconstitutional.
In 1955 the Court ordered the states
to desegregate schools “with all deliberate speed.” However, many people
resisted school integration. In 1957 the governor of Arkansas, Orval Faubus,
tried to block the enrollment of nine black students at Central High School in
Little Rock. In response, Eisenhower, never a strong civil rights supporter,
reluctantly sent federal troops to desegregate the school. The Brown
decision began a new era in civil rights.
The Eisenhower administration also
ushered in the age of modern space exploration. In 1958 Congress formed the
National Aeronautics and Space Administration (NASA) to oversee a civilian space
program. NASA’s birth reflected the Cold War competition between the United
States and the USSR for supremacy in space. The Soviets launched Sputnik 1 (an
artificial satellite) in October 1957. The United States followed with Explorer
1 in January 1958. In 1961 the Soviets sent the first human, cosmonaut Yuri
Gagarin, into orbit. The same year, Alan Shepard, one of seven American astronauts
trained in Project Mercury, went into space on a suborbital flight. In 1962 John
Glenn became the first American to orbit the earth.
B | The Prosperous Fifties |
Eisenhower oversaw a productive and
prosperous era. Government spending plus consumer demand boosted the gross
national product (GNP). With 6 percent of the world’s population, the United
States produced half the world’s goods. Technological advances, many achieved
with federal aid, ushered in new industries and sped up the pace of production
in old ones.
The nation’s five largest
industries—autos, oil, aircraft, chemicals, and electronics—illustrated a leap
in productivity. The auto industry, the nation’s largest, lowered labor costs by
using more automated machines. Oil replaced coal as the nation’s major energy
source. The aircraft industry profited from defense spending, space research,
and commercial airlines’ shift to jet propulsion. The chemical industry offered
new consumer goods, such as synthetic fibers, chemical fertilizers, and
plastics. Computers, too, began to have an effect in the business world. By the
mid-1960s, more than 30,000 mainframe computers were in use.
As productivity rose, the labor
market changed. Fewer people held blue-collar jobs, and more did white-collar
work. Employment grew rapidly in the service sector, which includes sales work,
office work, and government jobs. More American wage earners worked for large
corporations or for state or federal agencies than for small enterprises.
Businesses expanded by swallowing weaker competitors, as happened in the steel,
oil, chemical, and electrical machinery industries. Corporations formed huge new
conglomerates (mergers of companies in unrelated industries). In addition,
companies offering similar products or services in many locations, known as
franchises, increased; the first McDonald’s franchise opened in 1955.
Some big corporations established
overseas operations and became multinational. Producers in the United States
depended on world markets to buy the oil, iron, steel, and food they exported.
They also increased their overseas investments. Standard Oil (later Exxon), for
instance, developed oil resources in Venezuela and the Middle East. Coca-Cola
swept through Europe, where it set up bottling factories. New types of
bureaucrats ran the big businesses of postwar America. In The Organization
Man (1956), sociologist William H. Whyte wrote that employers sought
managers who would adapt to corporate culture, which rewarded teamwork and
conformity.
C | The Middle Class Expands |
Many factors converged to provide
unparalleled social mobility in postwar America. Most important, income rose.
Between 1945 and 1960, the median family income, adjusted for inflation, almost
doubled. Rising income doubled the size of the middle class. Before the Great
Depression of the 1930s only one-third of Americans qualified as middle class,
but in postwar America two-thirds did.
The growth of the middle class
reflected full employment, new opportunities, and federal spending, which
contributed mightily to widespread prosperity. During the war, for example, the
U.S. government built many new factories, which provided jobs. The federal
government also directly aided ambitious Americans. In 1944 Congress passed the
Servicemen’s Readjustment Act, known as the GI Bill of Rights. Under the law,
the government paid part of tuition for veterans and gave them unemployment
benefits while they sought jobs. It also provided low-interest loans to veterans
buying homes or farms, or starting businesses. The GI Bill and other federal
programs offered mortgages for home buyers.
New middle-class families of postwar
America became suburban families. Of 13 million new homes built in the 1950s, 85
percent were in the suburbs. By the early 1960s, suburbs surrounded every
city.
New families of the postwar era
created a baby boom. The birth rate soared from 1946 to 1964, and peaked in
1957, when a baby was born every 7 seconds. Overall, more than 76 million
Americans were part of the baby boom generation. The Common Sense Book of
Baby and Child Care (1946), by Dr. Benjamin Spock, sold a million copies a
year in the 1950s, and popular culture glorified suburban homemakers.
However, more and more women entered
the job market. New women workers were increasingly likely to be middle-aged and
middle class. By 1960 almost two out of five women with school-age children held
jobs. Some women workers supported households alone; many were wives whose
second incomes helped their families attain middle-class lifestyles.
As suburbs, generally without public
transportation, grew, cars became necessary and auto sales increased. Easy
credit facilitated the purchase of cars. The number of cars on the road leaped
from 40 million in 1950 to 60 million in 1960. The Federal Highway Act of 1956
created the Interstate Highway System, a 68,400-km (42,500-mi) network of
limited-access highways. This system spurred further suburban growth.
Middle-class families bought not only
homes and cars, but educational opportunities. Between 1940 and 1960, the
percentage of college-age Americans who attended college almost doubled. Again,
the federal government played a role. In 1958 Congress passed the National
Defense Education Act, which provided loans to college students and funds for
teacher training and instructional materials. Cold War enthusiasm for
technological advances also affected research. By 1960 one-third of scientists
and engineers in universities worked on government research, mainly defense
projects.
D | Consumers |
World War II limited the products
that consumers could buy, but at its end, consumer demand fueled the postwar
economy. By the end of the 1950s, three out of five families owned homes, and
three out of four owned cars. Consumers chose among a wealth of new products,
many developed from wartime innovations, including synthetic fabrics—rayon,
Dacron, Orlon—and new household appliances such as freezers, blenders, and
dishwashers. Manufacturers urged new models on consumers. Americans acquired
more private debt with the introduction of credit cards and installment plans.
Home mortgages increased the debt burden.
Businesses tried to increase consumer
spending by investing more money in advertising, especially in television ads.
Television played a pivotal role in consumption—both as a product to be bought
and a mode of selling more products. The first practical television system began
operating in the 1940s. Television reached 9 percent of homes in 1950 and almost
90 percent in 1960. Audiences stayed home to watch live productions of beloved
comedies, such as “I Love Lucy” (1951-1957), and the on-the-scene reporting of
Edward R. Murrow. TV Guide became one of the most popular magazines.
Television programming of the 1950s, which catered to potential consumers,
portrayed a middle-class, homogeneous society. But the less visible, less
prosperous parts of society were also an important facet of the postwar
era.
E | Other Americans |
The widespread prosperity of postwar
America failed to reach everyone. In The Other America: Poverty in the United
States (1962), political activist Michael Harrington revealed an economic
underworld of 40 million to 50 million Americans, who were excluded from
affluence and were socially invisible. At the end of the 1950s, nearly one-fifth
of the population lived below the poverty line. The poor included many groups:
the uninsured elderly, migrant farm workers, families in the Appalachian hills,
and residents of inner-city slums.
When many middle-class Americans left
the city for the suburbs, they left behind urban areas with antiquated schools
and deteriorating public facilities. They also left behind high concentrations
of poor people, which meant a dwindling tax base. Federal aid, which provided
the middle class with mortgages and highways, had less influence on the poor.
Federal housing programs, urban renewal efforts, and slum clearance projects
often did little more than move poor city dwellers from one ghetto to another.
What Harrington called the culture of poverty—that is, living without adequate
housing, food, education, medical care, job opportunities, or
hope—remained.
Poverty affected minority groups in
the 1950s. In the 1940s, when labor was scarce, the United States established
the Emergency Farm Labor Program, popularly known as the Bracero Program. Braceros,
whose name derived from the Spanish word brazo (arm), were Mexican manual
laborers allowed to enter the United States to replace American workers who
joined the armed forces. Many Mexicans who entered the United States under the
Bracero Program remained in the country illegally. To curb illegal immigration
from Mexico, the United States in 1954 began Operation Wetback, a program to
find illegal immigrants and return them to Mexico. During the 1950s, several
million Mexicans were deported. But illegal entrants continued to arrive, often
to become low-paid laborers. Most of the postwar Mexican American population
settled in cities, such as Los Angeles, Denver, El Paso, Phoenix, and San
Antonio. One-third of Mexican Americans in the 1950s lived below the poverty
line.
Federal policy toward Native
Americans underwent several reversals in the 20th century. In 1934 Congress
passed the Indian Reorganization Act, which granted Native Americans the right
to elect tribal councils to govern reservations. In 1953 the federal government
changed its position and adopted a “termination” policy. Congress passed a
resolution to end its responsibility for Native American tribes. The resolution
terminated Native American status as wards of the United States, granted Native
Americans citizenship, eliminated financial subsidies, discontinued the
reservation system, and distributed tribal lands among individual Native
Americans. This redistribution made thousands of acres of reservation land
available to non-Indians, such as real estate dealers. From 1954 to 1960, the
federal government initiated a voluntary relocation program to settle Native
Americans in urban areas. The new policies failed, and in 1963 the government
abandoned termination.
African Americans of the postwar era
continued their exodus from the South. Waves of black migrants, mainly young,
left the rural South for Northern cities. The introduction of new machinery,
such as the mechanical cotton-picker, reduced the need for field labor and
eliminated sharecropping as a way of life. From the end of World War II to 1960,
nearly 5 million blacks moved from the rural South to cities in the North. By
1950 one-third of blacks lived outside the South.
Simultaneously, the black population
moved within the South. By 1960 almost three out of five Southern blacks lived
in towns and cities, concentrated in large metropolitan areas such as Atlanta
and Washington, D.C. Large-scale migration to cities spurred rising aspirations,
soon evident in the postwar civil rights movement.
F | The Civil Rights Movement Begins |
In the 1940s and 1950s the NAACP
attacked race discrimination in the courts. It chipped away at Plessy v.
Ferguson (1896), a Supreme Court decision upholding segregationist laws.
The NAACP lawyers’ greatest success was the Brown v. Board of
Education of Topeka decision in 1954, in which the Supreme Court ordered
desegregation of schools. The decision struck a Chicago newspaper as a “second
emancipation proclamation.”
The Supreme Court’s implementation
order of 1955, designed to hasten compliance, ordered desegregation of schools
“with all deliberate speed,” but compliance was slow. When the governor of
Arkansas, Orval Faubus, tried to block the enrollment of nine black students
at Central High School in Little Rock in 1957, television showed the entire nation the
confrontation between National Guard troops and segregationists. Television news
helped make Little Rock’s problem a national one, and television crews continued
to cover civil rights protests.
In December 1955 the black community
in Montgomery, Alabama, organized a bus boycott after Rosa Parks was arrested
for refusing to give up her bus seat to a white man. A local minister, Martin
Luther King, Jr., helped organize the boycott. In 1957 ministers and civil
rights leaders formed the Southern Christian Leadership Conference (SCLC). The
SCLC, which adopted a policy of nonviolent civil disobedience, formed the
backbone of the civil rights movement in the United States.
The civil rights movement expanded on
February 1, 1960, when four black college students at North Carolina Agricultural
and Technical College in Greensboro began protesting racial segregation in
restaurants by sitting at
whites-only lunch counters and waiting to be served. Within days the sit-ins
spread throughout North Carolina, and within weeks they reached cities across
the South. To continue students’ efforts and to give them an independent voice
in the movement, college students in 1960 formed another civil rights group, the
Student Nonviolent Coordinating Committee (SNCC). Students and activists soon
adopted other methods of protesting segregation, such as freedom rides—bus trips
throughout the South in order to desegregate buses and bus stations. A powerful
civil rights movement was underway.
Postwar prosperity brought comfort
and social mobility to many Americans. Those who had grown up during the Great
Depression especially appreciated the good life of the postwar years.
Prosperity, however, eluded many citizens. The era, moreover, was hardly placid
and complacent, but eventful and divisive. Signs of change around 1960 included
the growing role of youth, the civil rights protests, and the simmering of
dissent.
XXIV | THE LIBERAL AGENDA AND DOMESTIC POLICY: THE 1960S |
In the 1960s, presidential initiatives,
judicial rulings, and social protest movements generated reform. The civil
rights movement, the women’s movement, the youth movement, and the environmental
movement changed people’s lives. They also created a climate of rebellion,
confrontation, and upheaval. For more information, see Protests in the
1960s.
Handsome, dynamic, and articulate, John
Kennedy defeated Richard Nixon in the presidential election of 1960—the first
election in which televised debates between presidential candidates played a
major role. When he accepted the Democratic nomination, Kennedy urged Americans
to meet the challenges of a “New Frontier.” The term New Frontier evoked the
spirit of exploration that Kennedy wanted to bring to his presidency. His youth
and vigor raised expectations. In practice, however, his actions were cautious
and pragmatic.
In his brief tenure, Kennedy continued
Cold War policies by broadening U.S. involvement in Southeast Asia, overseeing
an arms buildup, and hiking the defense budget. He also inaugurated a long era
of economic expansion, based largely on additional spending for missiles,
defense, and the space race. In 1961 he began the Peace Corps, an innovative
federal program that sent American volunteers to assist needy nations by
providing educational programs and helping communities build basic
infrastructures. After first evading civil rights issues, Kennedy responded to
the calls of civil rights advocates and proposed a comprehensive civil rights
bill. Congress, however, had not passed the bill when Kennedy was assassinated
in November 1963.
At Kennedy’s death, his vice president,
Lyndon Johnson, became president. A Texas politician since the New Deal and
majority leader of the Senate, Johnson seemed less likely than Kennedy to be an
innovative leader. But, as president, Johnson plunged ahead with domestic
reform. In July 1964 he proposed the landmark Civil Rights Act of 1964, enacted
in memory of Kennedy. The law prohibited segregation in public accommodations
and discrimination in education and employment. Johnson then declared a “War on
Poverty” in the United States. He promoted a billion-dollar campaign to end
poverty and racial injustice. In August 1964 Congress established an Office of
Economic Opportunity (OEO) to direct the Johnson administration’s War on Poverty
program and a Job Corps to train young people for the employment market. Johnson
also supported a volunteer program, Volunteers in Service to America (VISTA), a
domestic version of the Peace Corps; Project Head Start, to educate preschoolers
from disadvantaged families; and several other public works and job-training
programs.
In the 1964 presidential election,
Johnson won a landslide victory over conservative Arizona senator Barry
Goldwater. He then pressed legislators to add to his reform program, which he
labeled the “Great Society.” In 1965 Congress enlarged the War on Poverty by
enacting Medicare (a program of medical insurance for the elderly) and Medicaid
(a program of medical care for the needy), and funding urban development,
housing, and transit. Congress also passed the Voting Rights Act, which
protected the rights of minorities to register and vote. In addition it
established the National Foundation on the Arts and the Humanities to provide
funding for the arts, provided funds to school districts with children from
low-income families, passed the Clean Air Act, and enacted legislation to
protect endangered species and wilderness areas.
Finally, Johnson supported two policy
changes with unexpected future impact. The Immigration and Nationality Act of
1965 removed quotas based on race or nationality that had been in force since
the 1920s, and it paved the way for massive immigration from Asia, Latin
America, and elsewhere. Also in 1965, Johnson issued Executive Order 11246,
which required groups that did business with the federal government to take
“affirmative action” to remedy past discrimination against African Americans. As
Johnson told black leaders, his goals for racial progress meant “not just
equality as a right and a theory but equality as a fact and equality as a
result.” Over the next three decades, the federal government implemented
affirmative action policies to promote the hiring of women and minorities.
Stunning in its scope, Johnson’s
ambitious domestic agenda soon ran into problems. Within three years, the United
States was deeply involved in the Vietnam War; its expense and controversy
undercut many Great Society goals. But the civil rights revolution that Johnson
endorsed made unprecedented gains.
A | The Civil Rights Movement |
African Americans had been struggling
to gain equal rights for many decades. As the 1960s began, the civil rights
movement gained momentum. Individuals and civil rights organizations assailed
segregation in the South and discrimination everywhere. They protested with
marches, boycotts, and refusals to tolerate segregation. Many organizations
conducted their protests with nonviolent resistance. Civil rights protesters
often faced harsh confrontations with their opponents. These confrontations,
which appeared on network television, exposed the struggle for civil rights to a
large national audience.
In the spring of 1961, the Congress of
Racial Equality (CORE) started Freedom Rides to the South to desegregate bus
terminals and protest segregation in interstate transportation. The Freedom
Riders, black and white, challenged white supremacy and drew angry attacks.
In the fall of 1962, a federal court
ordered the University of Mississippi to enroll a black air force veteran, James
Meredith. To prevent his enrollment, white protesters rioted, and President
Kennedy sent federal troops to restore order. In Birmingham, Alabama, in 1963,
Martin Luther King, Jr., and the SCLC led a campaign of marches, sit-ins, and
prayer meetings to challenge segregation and racism. Clashes arose between black
protesters and city police, armed with dogs and cattle prods. News coverage
exposed the violence in Birmingham to people all over the world. Television news
next covered the University of Alabama, where Governor George Wallace in June
1963 barred two black students from entering.
Responding to African American calls
for action, Kennedy in June 1963 declared civil rights “a moral issue” and
proposed a comprehensive civil rights measure. Congress did not act on the bill,
but the civil rights movement intensified. In August 1963 more than 200,000
Americans marched on Washington, D.C., to demand equal rights. The audience
heard Martin Luther King, Jr., explain his dream of brotherhood, freedom,
justice, and nonviolence. In July 1964, at Johnson’s prompting, Congress passed
the Civil Rights Act, which outlawed segregation in public accommodations; gave
the federal government new power to integrate schools and enfranchise blacks;
and created the Equal Employment Opportunity Commission to stop job
discrimination based on race, religion, national origin, or gender. The law
heralded a new phase of activism.
Since 1961 civil rights activists had
worked on voter registration in Alabama, Georgia, and Mississippi. In the summer
of 1964, CORE and the Student Nonviolent Coordinating Committee (SNCC) organized
the Mississippi Freedom Summer Project. The project recruited over 1,000
Northern college students, teachers, artists, and clergy—both black and white—to
work in Mississippi. These volunteers, who helped blacks register to vote and
ran freedom schools, met harassment, firebombs, arrests, beatings, and even
murder. In August 1964 civil rights workers sent a delegation to the Democratic
National Convention to demand (in vain) the seating of delegates from the newly
formed Mississippi Freedom Democratic Party. Mass protests in Selma, Alabama, in
March 1965 again brought scenes of violent confrontation to television
news.
The voting rights campaign of the
mid-1960s had results. In 1965 Johnson signed the Voting Rights Act, which
authorized federal examiners to register voters and expanded black suffrage by
suspending literacy tests for voting. The 24th Amendment, ratified in 1964,
outlawed the poll tax in federal elections. A 1966 Supreme Court decision struck
down the poll tax in all elections. These measures more than tripled the number
of registered black voters in the South. Just as the federal government
responded—after almost a century of inaction—to civil rights demands, waves of
violence and disorder signaled a change in the civil rights movement.
In August 1965 frustrations with high
unemployment and poverty led to riots in the Watts section of Los Angeles, a
primarily black neighborhood. For six days, rioters looted, firebombed, and
sniped at police and National Guard troops. When the riots ended, 34 people were
dead and hundreds were injured. In the summers of 1966 and 1967, urban riots
occurred in the poorer neighborhoods of several Northern cities. The summer of
1967 saw 150 racial confrontations and 40 riots.
In April 1968 Martin Luther King, Jr.,
was assassinated; in the summer race riots broke out in over 100 cities. In the
wake of the riots, the president appointed the National Advisory Commission on
Civil Disorders, headed by Otto Kerner, the governor of Illinois. The Kerner
Commission blamed white racism for the outbreaks of violence. “Our nation is
moving toward two societies,” the Commission report warned, “one black, one
white—separate and unequal.” The report urged job creation, more public housing,
school integration, and “a national system of income supplementation.”
As the urban riots of the mid-1960s
voiced black rage, demands for Black Power changed the tone of the civil rights
movement. Stokely Carmichael, a civil rights activist and SNCC member, led SNCC
away from its commitment to nonviolence and integration. Carmichael popularized
the call for Black Power, a controversial term. To some, Black Power called for
racial dignity and self-reliance. For others, it meant that blacks should defend
themselves against white violence, instead of relying on nonviolence. Still
others believed that the Black Power movement called for black economic and
political independence.
Black Power advocates were influenced
by Malcolm X, a Nation of Islam minister who had been assassinated in early
1965. They admired Malcolm’s black nationalist philosophy, which emphasized
black separatism and self-sufficiency. They also appreciated Malcolm’s emphasis
on black pride and self-assertion.
Conflict soon arose between the older
civil rights organizations, such as the NAACP, and black power advocates, with
their aura of militancy and violence. Some blacks called for racial pride and
separatism instead of colorblindness and integration. Civil rights demands
shifted from colorblindness to color-consciousness.
By the end of the 1960s, the civil
rights movement had strongly influenced other groups, which adopted its protest
tactics. Native Americans had mobilized early in the decade and convened in
Washington in 1964 to press for inclusion in the War on Poverty. In 1968 Native
American leaders demanded Red Power in the form of preferential hiring
and reimbursement for lands that the government had taken from them in violation
of treaties. Mexican Americans supported César Chávez, president of the United
Farm Workers Organizing Committee. Chávez sought improved working conditions for
migrant workers and organized national consumer boycotts of grapes and other
products. The Hispanic movement also campaigned for bilingual and bicultural
education, and Chicano studies in colleges. Finally, the women’s movement of the
late 1960s and 1970s, especially, derived inspiration from the civil rights
precedent.
B | The Women’s Movement |
Like the civil rights movement, the
women’s movement used various means to end discrimination. Activists created
pressure groups, adopted confrontation tactics like sit-ins and marches, and
tried to capture media attention. By the end of the 1960s, feminists had created
an energetic campaign that called both for legal equity and for the
restructuring of gender roles and social institutions.
In 1961, Kennedy established the first
presidential Commission on the Status of Women. In 1963 the commission issued a
report citing employment discrimination, unequal pay, legal inequality, and
insufficient support services for working women. The same year, a new book by
journalist Betty Friedan, The Feminine Mystique, challenged the notion
that women could find fulfillment only as wives and mothers. A final catalyst of
the early 1960s was the Civil Rights Act of 1964, which banned race
discrimination in employment and set up the EEOC to enforce the law.
Unexpectedly, perhaps accidentally, and after heated debate, legislators amended
the bill to bar sex discrimination in employment as well. When the EEOC ignored
gender-based charges, women formed the National Organization for Women (NOW) in
1966. Betty Friedan led the new civil rights group, which urged equal
opportunity and an end to sex discrimination.
Meanwhile, another wing of feminism
developed. Young women who had been active in the civil rights and other protest
movements began to form small “consciousness-raising” groups, which rapidly
expanded in number. In these groups, women met to discuss the inequity of
“sexism,” a counterpart to racism; to strive for “women’s liberation”; and to
start feminist projects, such as health collectives or rape crisis centers.
The two wings of feminism often
clashed. NOW focused on legal change, and women’s liberation urged revolutionary
transformation. But the two factions served complementary functions and
sometimes joined forces, as in the Women’s Strike for Equality in August 1970.
With parades and marches, women celebrated the 50th anniversary of woman
suffrage and pressed for new causes—equal employment opportunity, an equal
rights amendment, and more liberal state abortion laws.
In the early 1970s, the women’s
movement achieved extensive results. In 1972 Congress passed the Equal Rights
Amendment (ERA) to provide for equality of the sexes under the law. However, the
states failed to ratify the amendment. Still, the fact that Congress passed the
ERA signified feminism’s new legitimacy. In Roe v. Wade (1973),
the Supreme Court legalized abortion. Finally, women made astounding gains in
education and employment.
Editors scoured elementary and high
school textbooks to remove sexist elements. In 1972 Congress passed Title IX of
the Education Amendments, which prohibits discrimination on the basis of sex in
any educational program receiving federal funds, including athletic programs. At
the college and university level, once all-male colleges and military academies
began to accept women students.
In employment, state and federal
courts overturned labor laws that curtailed opportunities for women, such as
laws that barred women from night work or overtime. The courts supported legal
actions against employers that discriminated against women in their hiring or
promotion policies. Women also entered new vocations. Some went into blue-collar
fields, such as construction; others found jobs in banking, finance, business,
and government. The proportions of women in the professions—as lawyers, doctors,
and engineers—increased as well.
One of the most enduring movements to
emerge in the 1960s, the women’s movement left strong institutional
legacies—pressure groups, professional organizations, and women’s studies
programs in colleges.
C | The Youth Movement |
As the baby boom generation came of
age, its members began to challenge the status quo. By the
mid-1960s nearly three out of four students finished high school, and about half
of those students went on to college. College campuses filled with young people
who had the freedom to question the moral and spiritual health of the
nation.
One facet of the youth movement was a
disaffected, apolitical counterculture, made up of people who were known as
hippies. These young people decried materialism, mocked convention, spurned
authority, joined communes, enjoyed rock music, and experimented with drugs and
sex. Often hippies asserted their rebellious attitude through elements of
personal style, such as long hair and tie-dyed clothes. In August 1969 hippies
gathered at the Woodstock Festival, a music festival where young people convened
to celebrate love and peace. Woodstock represented a high point in the
counterculture, but hippie lifestyles continued into the 1970s.
Another wing of the youth movement
included activists from political protest movements, such as the civil rights
movement. This wing was more visible on college campuses and more politically
conscious. In 1960 a small group of young people formed Students for a
Democratic Society (SDS) and adopted The Port Huron Statement, written by
student leader Tom Hayden. The manifesto urged participatory democracy, or the
idea that all Americans, not just a small elite, should decide major economic,
political, and social issues that shaped the nation. It also criticized American
society for its focus on career advancement, material possessions, military
strength, and racism. By 1968 some 100,000 young people around the nation had
joined SDS.
Student protesters denounced corporate
bureaucracy and campus administrators. Universities and colleges, they believed,
were dictatorial and exercised too much control over students. Students held
rallies and sit-ins to protest restrictions of their rights. In 1964 a coalition
of student groups at the University of California, Berkeley, claimed the right
to conduct political activities on campus; the coalition became known as the
Free Speech Movement. Political activism and protests spread to other campuses
in the 1960s.
The youth movement’s demonstrations
soon merged with the protests of students who opposed the Vietnam War. By the
spring of 1968, student protests had reached hundreds of campuses. At the 1968
Democratic National Convention in Chicago, antiwar demonstrators clashed with
the police, and the images of police beating students shocked television
audiences (see Chicago Convention of 1968). Violence peaked at an antiwar
protest at Ohio’s Kent State University in May 1970, when National Guard troops
shot and killed four students.
The political activities of the youth
movement had enduring effects. Colleges became less authoritarian, ending dress
codes and curfews and recruiting more minority students. Students also
contributed mightily to the movement against the war in Vietnam. Both the
counterculture and student activism, finally, fueled a backlash that blossomed
in the 1970s and 1980s.
D | The Environmental Movement |
A movement to preserve the environment
took root with the best-selling book Silent Spring (1962) by Rachel
Carson. The book attacked toxic pesticides like DDT. Carson described how DDT
threatened both animals and human beings. Her book raised Americans’ awareness
of threats to the environment and moved many to take action. Students and
teachers at over 1,500 colleges and universities and at over 10,000 schools held
teach-ins on the environment. Hundreds of thousands of other Americans staged
protests and rallies around the nation. These activists formed a number of
environmental groups, including the Environmental Defense Fund in 1967, Friends
of the Earth in 1968, Greenpeace in 1970, and the Sierra Club Legal Defense Fund
in 1971. In 1970 some 20 million Americans gathered for what organizers called
Earth Day to protest abuse of the environment.
In response to growing citizen
protests, Congress in 1970 passed the National Environmental Policy Act, which
required federal agencies to assess the environmental impact of their actions.
The same year, the Environmental Protection Agency (EPA) was established as an
independent agency responsible for protecting the environment and maintaining it
for future generations. Congress also enacted laws to curb pollution, preserve wilderness
areas, and protect endangered species. The Supreme Court allowed
conservationists to sue businesses for polluting the environment and government
agencies for failure to enforce the law.
Several events in the 1970s suggested
the danger of environmental threats. In 1978 residents of Love Canal in New
York, who had been experiencing high disease rates, were found to be living on a
former chemical waste dump; the area was evacuated. In 1979 an accident at the
nuclear power plant on Three Mile Island in Pennsylvania showed the potential
dangers of radioactive material in nuclear reactors.
As concern for the environment spread,
more Americans became involved in efforts to maintain forests, parks, and
wildlife refuges; prevent air and water pollution; conserve energy; and dispose
of hazardous waste safely. Environmentalists persisted in their efforts into the
1980s, although often challenged by conservatives who believed that
environmental regulations restricted property rights protected by the
Constitution.
E | The Warren Court |
Judicial activism (taking an active
role in shaping public policy) completed the liberal agenda of the 1960s. Ever
since Earl Warren’s appointment as chief justice in 1953, the Supreme Court had
enraged critics on the right, who pressed for Warren’s impeachment. In the 1950s
the Warren Court had integrated public schools in Brown v. Board of
Education of Topeka (1954). In the 1960s Kennedy and Johnson appointed four
Supreme Court justices, including Thurgood Marshall, the NAACP lawyer who had
argued the Brown case and the high court’s first African American
justice. With a liberal majority in place, the Warren Court handed down a series
of landmark cases that enhanced civil liberties and spurred or legitimized
social change.
The Warren Court of the 1960s declared
prayer in public schools unconstitutional, enabled Communists to obtain
passports, and limited communities’ power to censor books and movies (thus
making sexually explicit material available to adults). In Griswold v.
Connecticut (1965) the Court ruled that state bans on contraceptives were
unconstitutional. The Court also consistently upheld civil rights. It found
local ordinances upholding segregation in private businesses (such as lunch
counters) unconstitutional; reversed the convictions of black demonstrators who
had refused to disperse; upheld the civil rights laws of 1964 and 1965; declared
delays in school desegregation intolerable; and struck down a state law that
forbade marriage between persons of different races.
Warren Court decisions of the 1960s
affected electoral procedures, too. In Baker v. Carr (1962), the
Court opened the way to the principle of “one man, one vote,” which meant that
state legislatures had to be reapportioned on the basis of population. Finally, the
Court issued controversial decisions that transformed criminal justice. In
Gideon v. Wainwright (1963), the Court held that a poor person
charged with a felony had the right to be represented by a state-appointed
lawyer. In Miranda v. Arizona (1966) the Court declared that a
confession could not be introduced as evidence unless the defendant had been
informed of his or her rights, including the right to remain silent.
F | The 1960s in Retrospect |
The climate of reform that erupted in
the 1960s continued into the 1970s, when movements for change met different
fates. Feminism and environmentalism continued and prospered. The counterculture
peaked and faded, although drug use exploded. In civil rights, the early goals
of colorblindness ceded place to race consciousness and “identity politics,” or
jousting for place among contending ethnicities. Overall, few of the great
dreams that pervaded the fervent 1960s were achieved. Hopes for participatory democracy and
an end to racism and patriarchy eluded realization.
Still, in domestic policy, the 1960s
were an era of enduring change. Although the Vietnam War undercut the Great
Society, Johnson’s programs increased justice and fought poverty. The Warren
Court upheld individual rights. The civil rights movement ended legal
segregation, registered black voters, battled race discrimination, engendered
black pride, and vastly liberalized white attitudes. The spread of feminism
forced reexamination of gender roles. Overall, reform movements of the 1960s
expanded free expression, challenged tradition, blasted the placidity of the
1950s, and, for better or worse, dispelled the widespread respect for government
that had prevailed since World War II. Antiwar protest was a vital part of this
process. The Vietnam War of the 1960s and 1970s shattered Americans’ long-held
faith both in the wisdom of the state and in the Cold War policies of the
1950s.
XXV | FOREIGN POLICY, VIETNAM WAR, AND WATERGATE |
In the 1960s the United States remained
committed to Cold War goals and sought to stem the spread of Communism around
the globe. Continuing the policy of containment, the United States sent more and
more troops to Vietnam. There, bogged down in jungle fighting and bombing
campaigns, the United States became enmeshed in a long and costly war. When the
United States finally left the Vietnamese to determine their own fate in the
early 1970s, a near-impeachment crisis increased Americans’ mood of skepticism
and distrust of government.
A | Kennedy Administration and the Cold War |
In the early 1960s, President Kennedy
vigorously pursued the Cold War policy of containment. He expanded U.S. aid to
other nations, boosted the size of the armed forces, stockpiled missiles, and
strove to end Soviet influence in Cuba, just 90 miles off the tip of Florida. In
1959 a revolution in Cuba brought Fidel Castro, a leftist, to power. When Castro
took control, he implemented policies designed to eliminate differences between
social classes in Cuba. These policies included confiscating large land holdings
and seizing businesses that belonged to wealthy Cubans and U.S. firms. Concerned
about Communist influence, U.S. officials were wary of Castro. In 1961 a force
of Cuban exiles, trained and supplied by the United States, invaded Cuba in an
attempt to topple Castro. They failed, and the Bay of Pigs invasion was a
fiasco.
Tensions increased between the United
States and Cuba. To deter further U.S. interference in Cuba, Castro sought
economic and military assistance from the USSR. In 1962 the United States
discovered that Soviet premier Nikita Khrushchev had set up missile bases in
Cuba from which nuclear missiles could be launched. Kennedy faced a crisis: To destroy
the bases might lead to world war; to ignore them risked an attack on the United
States. In October 1962 Kennedy demanded that the USSR remove the missiles, and
after a few days of suspense, the Soviets agreed to do so.
The Cuban missile crisis was a close
call. Teetering on the brink of nuclear war, both superpowers leaped back in
alarm. Afterward, Kennedy and Khrushchev established a direct hot line between
Washington and Moscow, and in 1963 they signed the Limited Nuclear Test Ban
Treaty, which banned nuclear tests in the atmosphere, in outer space, and
underwater. But the Cold War rivalry continued. The
United States and the USSR now vied for control in Asia.
Cold War warriors in the United States
believed that Communist aggression posed a threat in Asia. They especially
feared a Communist takeover of Vietnam. If Vietnam fell, they believed,
Communism would engulf all of Southeast Asia. As early as the 1940s, President
Truman provided economic and military aid to prevent the growth of Communist
power in what was then French Indochina. When France withdrew from the area in
1954, the Geneva Accords divided Vietnam into two zones: North Vietnam, ruled
by the Communist Viet Minh; and South Vietnam, controlled by non-Communist
allies of the French.
The United States supported
non-Communist South Vietnam and in subsequent decades increased its commitment
to the region. Under Eisenhower, from 1955 to 1961, America sent economic aid to
South Vietnam. In 1960 Communists and nationalists in South Vietnam formed the
National Liberation Front (NLF), often referred to as the Viet Cong (a label
attached by its foes). The NLF was organized to challenge South Vietnam’s
president, Ngo Dinh Diem, who ruled from 1955 to 1963, and to foster
unification.
Kennedy continued Eisenhower’s efforts
in Vietnam by tripling American aid to South Vietnam and by expanding the number
of military advisers from about 700 to more than 16,000. In 1963 the United
States approved a coup led by South Vietnamese military officers to overthrow
Diem, who was killed. A few weeks later, Kennedy was assassinated and Lyndon B.
Johnson became president. Johnson inherited the problem of U.S. commitment to
South Vietnam, where Communist insurgents were gaining strength.
B | Johnson and Vietnam |
Johnson was in a dilemma. If he
increased American military aid to Vietnam, he would have to divert funds from
his Great Society programs, and he might prod China into war. If he withdrew
American aid, however, he risked the politically damaging charge that he was
“soft” on Communism. Most important, Johnson did not want to be the first
American president to lose a war. He enlarged the war in Vietnam.
After an allegedly unprovoked attack on
U.S. warships in the Gulf of Tonkin off North Vietnam in August 1964, Johnson
authorized limited bombing raids on North Vietnam. At the administration’s
request, Congress then offered an almost unanimous resolution, known as the Gulf
of Tonkin Resolution, that enabled the president to use military force in
Vietnam. In 1965, after a landslide victory in the 1964 election—when voters
endorsed his platform of domestic reform and peace abroad—Johnson again
escalated American involvement. By 1968 more than 500,000 troops were in
Vietnam, and the United States had begun heavy bombing of North Vietnam.
The United States never declared war on
North Vietnam or made a total commitment to winning the war. Vietnam remained a
limited war, one in which the United States purposely refrained from employing
all its military strength. The American commander, General William Westmoreland,
sought to inflict heavy losses on the North Vietnamese and to destroy their
morale. But the North Vietnamese were tenacious. In January 1968 they launched a
massive attack known as the Tet Offensive, which severely damaged U.S. forces
and reached the American embassy compound in the South Vietnamese capital of
Saigon. South Vietnam and the United States finally turned back the Tet
Offensive, but with heavy losses. Americans could not see an end to the war, and
its costs, both economic and human, rose alarmingly.
C | The Anti-Vietnam War Movement |
By the start of 1968, Johnson
encountered mounting opposition to the war. An antiwar movement had arisen in
1964 and 1965 as Johnson began to escalate American involvement in Vietnam. In
1965 students and teachers at the University of Michigan held one of the first
campus teach-ins to spread information about the war. Teach-ins soon were held
at many colleges and universities. Antiwar protests evoked massive support among
draft-age youth, half of them college students. Chanting activists disrupted
draft boards, burned draft cards, occupied campus buildings, and marched on the
Pentagon (see Anti-Vietnam War Movement).
The Johnson administration faced
political critics as well. Arkansas Senator J. William Fulbright began to hold
hearings that questioned why the United States was fighting in Vietnam.
Fulbright stopped supporting Johnson when he learned that the president had
exaggerated enemy aggression at the Gulf of Tonkin. Defense Secretary Robert
McNamara called the bombing campaign a failure and left his post in 1968.
European allies also criticized the American role in Vietnam.
At home, the war generated intense
debate. “Hawks” assailed the policy of limited war and favored an all-out effort
to defeat Communism in Vietnam. Some contended that politicians prevented the
military from winning the war, or that military leaders had no strategy for
victory. Others held that the antiwar movement stifled support for the war,
ruined morale, and undercut the military effort. “Doves,” in contrast, believed
that the United States should never have become involved in Vietnam. The
conflict, they argued, was essentially a civil war, and contrary to containment
doctrine, its outcome was irrelevant to American security. To some critics, the
war was unwinnable, and stalemate was the best foreseeable outcome. In any case,
doves argued, the United States should negotiate with North Vietnam to end the
war quickly.
By 1968 antiwar sentiment affected
electoral politics. Challenging Johnson for the Democratic presidential
nomination, Senator Eugene McCarthy of Minnesota campaigned against the war.
McCarthy roused fervent support among the young, and Vietnam swiftly became the
major issue of the 1968 presidential race. Reconsidering his earlier policies,
Johnson limited bombing in Southeast Asia and initiated peace talks with Hanoi
and the NLF. After McCarthy’s unexpectedly strong showing in the New Hampshire
primary, Johnson decided not to seek reelection and withdrew from the race. The
president became a political casualty of the Vietnam War.
In 1968 an aura of crisis grew with the
assassinations of Martin Luther King, Jr., in April and of Senator Robert F.
Kennedy in June. In Chicago during the summer of 1968, violence erupted when
police attacked antiwar protesters at the Democratic National Convention. In the
election that fall, Richard Nixon defeated Johnson’s vice president, Hubert
Humphrey, and third-party candidate George Wallace.
D | Nixon and Vietnam |
Under Nixon, American troop strength in
Vietnam contracted but the war effort expanded. Nixon began a program of
Vietnamization, which meant decreasing the number of U.S. troops, offering only
advice and assistance, and turning the war effort over to the South Vietnamese.
U.S. ground troops gradually returned from Vietnam, but the United States
increased its bombing of North Vietnam. Nixon also extended the war into
Cambodia and Laos, where he secretly authorized bombing to block enemy supply
routes on Vietnam’s border. Finally, Nixon sought a diplomatic escape from war.
He visited China and the USSR and sent Henry Kissinger, his national security
adviser, to secret talks in Paris with the North Vietnamese. Antiwar protests,
meanwhile, continued. In May 1970 Ohio National Guard troops killed four Kent
State University students during an antiwar protest, spurring widespread
outrage.
In 1973, as Nixon began a second term,
the United States and North Vietnam signed a peace treaty in Paris, which
provided for a cease-fire. Its terms included the withdrawal of all remaining
American forces from Vietnam, the return of American prisoners captured during
the war, and an end to all foreign military operations in Laos and Cambodia.
American troops left Vietnam, but the war between North
Vietnam and South Vietnam continued. South Vietnam finally fell in April 1975,
as North Vietnamese forces entered Saigon. More than 58,000 Americans were
killed in Vietnam, and over 300,000 were wounded. Even after the war’s end,
Americans continued to debate its purpose and the meaning of its failure.
E | The Impact of Vietnam |
The Vietnam War affected the United
States in many ways. Most immediately, it spurred policy changes. The United
States ended the military draft and switched to an all-volunteer army. Congress
passed the War Powers Resolution over Nixon’s veto in November 1973. The
resolution limited the president’s ability to send troops into combat without
congressional consent. Its passage reflected legislators’ desire to restrain
presidential power and to prevent U.S. involvement in a war like that in
Vietnam.
Beyond policy changes, the war in
Vietnam changed the attitudes of a generation. First, the war increased caution
about involvement in foreign affairs. After Vietnam, Americans more carefully
weighed the risks of intruding in another nation’s problems. Second, defeat in
the war diminished American confidence in U.S. superiority, both moral and
military. Defeat in Vietnam was a humiliating national experience.
Finally, the war increased mistrust of
government and its officials. A chain of events beginning in the 1960s—such as
the way Johnson obtained the Gulf of Tonkin Resolution, revelations of secret
bombings of Cambodia under Nixon, and the Kent State tragedy—shattered a faith
in the state that had prevailed since World War II. These events left citizens
with a sense of cynicism: Government leaders were no longer credible. The abrupt
end of Nixon’s presidency only confirmed this sentiment.
F | The Nixon Administration |
Since the 1930s presidential powers had
grown as presidents struggled to overcome the Great Depression, win a world war, and
avoid defeat in the Cold War. These powers continued to grow under Kennedy and
Johnson. Kennedy, for instance, launched covert operations at the Bay of Pigs,
and Johnson engaged the nation in war without congressional approval. President
Richard Nixon wielded more power than any peacetime president, and in the early
1970s the term Imperial Presidency became linked to his administration.
The term referred to a tendency to disregard the Constitution, to view politics
as warfare, to act in secret, to claim executive privilege, to subvert Congress,
and to rely excessively on White House aides.
F1 | The Imperial Presidency |
Long a controversial figure, Nixon
served as vice president for eight years under Eisenhower, lost a bid for
president in 1960 and a run for governor of California in 1962, and then worked
as a corporate lawyer. Elected in 1968 and again, resoundingly, in 1972, Nixon
claimed to represent a new majority that included former Democrats—ethnic
minorities, working-class people, and Southern whites—who were disgusted with
liberal policies. Nixon promised voters that he would restore law and order and
end the unpopular war in Vietnam.
During Nixon’s presidency, the
economy ran into trouble with inflation. In 1971 inflation leaped to 5 percent,
the stock market fell, and for the first time since the 19th century, the United
States had an overall trade deficit, which meant that it imported more goods
than it exported. To fight inflation, Nixon briefly imposed wage and price
controls, which slowed price increases only temporarily. Nixon also urged
welfare reform. In 1969 he proposed the Family
Assistance Plan, which would have provided a minimum income for poor families
and supplements for the working poor. The bill died in a Senate committee, but
one of its provisions, a food stamp program, became federal policy.
Nixon’s strength was foreign policy.
His Vietnamization program reduced American casualties and diminished American
involvement in the Vietnam War, although he widened the war by extending it to
Cambodia. Meanwhile, Henry Kissinger, whom Nixon appointed as secretary of state
in 1973, followed a new brand of diplomacy. Kissinger saw world power as divided
among the United States, the USSR, Japan, China, and Europe, and he attempted to
achieve first place for the United States among these major powers. In 1969
Nixon advanced the Nixon doctrine, which held that the United States would
continue to help Asian nations combat Communism but would no longer commit
troops to land wars in Asia.
Most important, Nixon opened
relations with China. In 1972 he made an official visit to China, the first
contact between the two nations since Communists took control of China in 1949.
Nixon also traveled to Moscow to
sign the Strategic Arms Limitation Treaty (SALT I) in 1972. The treaty limited
antiballistic missile defenses and froze the deployment of intercontinental
ballistic missiles.
Nixon, however, undercut his own achievements by abuses of power that came to
light in his second term.
F2 | Watergate |
On June 17, 1972, with a presidential
campaign in progress, police officers in Washington, D.C., arrested five men
caught breaking into the headquarters of the Democratic National Committee,
located in an office and apartment complex called the Watergate. The incident initially
attracted little attention, but two Washington Post reporters, Bob
Woodward and Carl Bernstein, began investigating the break-in. From their
articles and from Senate hearings, Americans learned that the president, his
aides, and campaign officials had conspired to sabotage Nixon’s political foes.
Nixon initially denied involvement in the scheme. But a series of special
prosecutors, whom the president was forced to appoint, investigated the scandal.
They soon determined that Nixon and his aides tried to cover up the president’s
link with the Watergate break-in and to obstruct the Watergate
investigation.
In July 1974 the Supreme Court
ordered Nixon to produce tape recordings that he had made of conversations in the
White House. The transcripts contained evidence that Nixon had broken the law
and knew about the cover-up. At the end of the month, the House Judiciary
Committee adopted three articles of impeachment that charged Nixon with abusing
power, obstructing justice, and defying Judiciary Committee subpoenas. Before
the House could vote on Nixon’s impeachment, the president resigned, on August
9, 1974. Vice President Gerald Ford assumed the presidency and pardoned Nixon in
September.
Watergate undermined presidential
authority and deepened citizens’ fear of excessive state power, already fueled
by abuses such as Nixon’s secret bombing of Cambodia and his wiretapping of
public officials and journalists.
Nixon’s resignation ended an era of turmoil and animosity. Two presidents in
succession, elected by vast majorities, had left office either diminished, in
Johnson’s case, or disgraced, in Nixon’s case. The Vietnam War eroded the
nation's self-confidence and left a legacy of skepticism. Watergate further
enlarged citizens’ suspicions of government. In the next few elections, voters
sought heads of state untainted by overexposure to power in Washington,
D.C.
XXVI | END OF THE 20TH CENTURY |
Since the mid-1970s, American domestic
politics have been affected by several major trends—the end of the Cold War, a
declining industrial sector, and the rise of a global economy. Meanwhile,
increased immigration changed the population, and life in a multicultural
society generated new conflicts. At the end of the 20th century, Americans faced
a major challenge: How to preserve national unity while respecting social
diversity.
A | Politics Since Watergate |
In the last quarter of the 20th
century, Americans voiced concern about many domestic issues such as the health
of the economy, the illegal use of drugs, a growing crime rate, the quality of
education, dependency on welfare, spiraling health costs, and contentious race
relations. They discussed controversial subjects such as how to fund public
schools, whether abortion should be available, whether homosexuals’ rights
should be protected, and what should be done about the welfare system. As they
had throughout the 20th century, Americans debated the merits of an activist
state and the degree to which the government should regulate free enterprise and
provide social services.
Meanwhile, voting patterns changed.
Many people moved to the Sunbelt, the southern and southwestern United
States, which gave that region of the nation new political clout. African
Americans achieved voting power in the South and within the Democratic Party.
Polls and elections measured a gender gap in political preferences, suggesting
that women were more likely than men to support Democratic policies. Finally,
starting in the 1970s, several significant changes realigned national
politics.
First, the Democratic coalition of the
New Deal era, which had been losing bits and pieces since the 1960s, continued
to falter. Southern Democrats deserted the party and joined the Republican
Party, which gained seats in state legislatures and in Congress. With the loss
of Southern constituents, a Democratic majority was no longer a certainty in the
South. Second, a revival of conservative political beliefs that had been in
progress since midcentury gained force in the 1980s. Third, many Americans began
to consider themselves political independents. Consequently, the outcomes of
national elections often depended on swing voters. Candidates started to cater
to these voters. Overall, the electorate became more suburban, more middle
class, and more likely to vote Republican; the political center expanded and
became more conservative.
Domestic politics during the
administrations of Gerald Ford, Jimmy Carter, Ronald Reagan, George Bush, and
Bill Clinton centered on economic issues. Under President Ford, who held office
from Nixon’s resignation in 1974 until January 1977, the nation confronted a mix of
inflation and recession called stagflation—a condition in which both
prices and unemployment rose. One major cause was the Arab oil embargo of 1973
and 1974, which generated a steep rise in oil prices. At odds with Congress,
Ford made little progress salvaging the economy and faced criticism from
political foes for his pardon of former president Nixon.
Formerly governor of Georgia, Jimmy
Carter campaigned as an outsider who had not been contaminated by the corruption
of Washington, D.C., politics. As president, Carter—like Ford—grappled with
rising inflation, an energy crisis, unemployment, and battles with Congress.
Admired for his advocacy of human rights abroad, Carter failed to turn the
economy around, and he lost the election of 1980 to former California governor
Ronald Reagan.
In 1980 the growth of conservatism led
to Ronald Reagan’s victory. Reagan vowed to stimulate the economy by
implementing what was called supply-side economics, a theory that tax cuts would
spur economic growth and therefore increase government revenues. Reagan also
promised to cut government. “Government is not the solution to our problem,” he
declared. “Government is the problem.”
In 1981 Congress reduced taxes, cut
social programs, and increased military spending. However, the increased
government revenues predicted by supply-side economics did not appear. Over
Reagan’s two terms in office, military costs rose, revenues failed to increase,
and a huge budget deficit developed. The United States had been a creditor
nation when Reagan was elected, but by the time he left office, the United
States had become the world’s largest debtor nation.
However, Reagan’s popularity remained
high. Inflation that had built up in the 1970s subsided, and unemployment went
down. But economic good times obscured uneven distribution of income and growing
poverty.
Reagan’s admirers were surprised
during his second term when in 1986 and 1987 the public learned about the
Iran-Contra Affair. Members of Reagan’s administration had secretly sold arms to
Iran in exchange for Iranian help in freeing Americans held hostage in
Lebanon. Reagan officials then used the profits to subsidize the Contras, a
rebel force that sought to overthrow the left-wing Sandinista regime in
Nicaragua. Congress had forbidden such aid. The scandal, however, did little to
diminish the Reagan legacy. The president’s supporters continued to value his
commitment to less government and lower taxes, and many of them shared his
conservative social positions, such as opposition to abortion.
George Herbert Walker Bush, Reagan’s
vice president and successor, inherited Reagan’s agenda and continued Reagan’s
policies. Bush won public approval for his management of the Persian Gulf War in
1991. But like his predecessors, Bush drew public disapproval for his handling
of the economy. First, the brief war caused oil prices to rise, and war costs put new
pressures on federal finances. Second, Bush had promised “no new taxes,” but in
fact agreed to raise taxes. Finally, the president clashed with Congress over
how to improve the economy and reduce the huge national deficit.
With dwindling support since the
1960s, the Democratic Party had trouble electing its presidential candidates. As
a moderate “New Democrat,” Bill Clinton in 1992 bucked the trend. He supported
centrist, middle-class goals such as efficient government, economic growth, a
balanced budget, and health-care reform. But Clinton’s most important goal—a
sweeping reform of the national health-care system—failed. In 1994 Democrats
lost control of Congress, a stunning defeat—the first Democratic loss of the
House in 40 years. With this loss, Clinton’s hope for significant health-care
reform vanished.
Clinton succeeded, however, in
achieving centrist measures such as welfare reform. In 1996 Congress abandoned
the welfare system in place since the New Deal; the Personal Responsibility and
Work Opportunity Reconciliation Act limited welfare recipients to five years of
benefits in a lifetime, and required that adult recipients work after two years.
It also denied some welfare programs and food stamps to legal immigrants.
Although the bill disappointed his more liberal Democratic base, Clinton managed
to hold the center. Riding a booming economy, he also succeeded in eliminating
the huge budget deficit left over from the Reagan years.
Clinton’s reputation suffered in 1998
with the revelation of an extramarital affair with a White House intern. The
affair was brought to the public’s attention by a special prosecutor originally
appointed to investigate aspects of the president’s past financial and
administrative dealings. The affair and a lack of presidential candor about it
led Congress to hold impeachment hearings. Angered by the zeal of the
prosecutor, Clinton’s supporters continued to endorse the president, as did many
respondents to opinion polls. The president’s detractors denounced him for lying
to the public, to his family, and to his advisers for many months. Impeached by
the House of Representatives in December 1998 on charges of lying under oath and
obstructing justice, Clinton was acquitted by the Senate in February 1999.
B | End of the Cold War |
In the decade after Watergate, the
United States continued its policy of détente, an easing of Cold War
tensions that began under Nixon and Kissinger. Under President Gerald Ford, in
1975, the United States joined the USSR and 33 other countries to sign the
Helsinki Accords, in which member nations vowed to respect boundaries and human
rights.
When Jimmy Carter became president in
1977, he took a firm stand on human rights and tried to combat rights abuses in
Chile, Argentina, Ethiopia, South Africa, and elsewhere. Carter also opened full
diplomatic relations with China in 1979. In 1978 he hosted a meeting between the
president of Egypt and the prime minister of Israel that produced the Camp David
Accords, and in March 1979 Israel and Egypt signed a peace treaty based on those
accords. Negotiating the settlement was one of Carter’s finest moments. But trouble lay
ahead. In January 1979 a revolution in Iran, led by Muslim clergyman Ayatollah
Ruhollah Khomeini, toppled the ruler of Iran, Muhammad Reza Shah Pahlavi. Fueled
by anti-American feelings, the revolutionaries seized more than 50 American
hostages in November 1979. They were not freed until Reagan assumed office in
1981.
Détente faltered under Reagan, who
revived Cold War antagonisms. The president restarted the arms race, denounced
the Soviet Union as an “evil empire,” and supplied funds and weapons to
anti-Communist forces in Latin America, notably in El Salvador and Nicaragua,
two poor nations beset by revolution. In 1982 the CIA organized and financed a
guerrilla army in Nicaragua; the trail of secret funds for this venture led to
the revelations of the Iran-Contra Affair. Reagan also advocated a huge military
buildup and supported plans for the Strategic Defense Initiative, known as Star
Wars, a multibillion-dollar missile defense system. Renewed Cold War tensions
prevailed through most of Reagan’s tenure.
But suddenly, after George H. W. Bush
took office in 1989, a series of revolutionary changes occurred. Within a short
time, from 1989 to 1990, the Communist Party in the USSR lost control of the
government, and Communists lost power in the Eastern European countries as well.
The Soviet revolution that dominated the 20th century ground to a halt. The Cold
War was over.
The first signs of the end of the Cold
War appeared during the Reagan administration, in 1985, when Mikhail Gorbachev
came to power in the Soviet Union. To reverse the process of economic decline in
the USSR that had been under way since the 1970s, Gorbachev declared a policy of
perestroika (economic restructuring) and glasnost (political
openness). Under Gorbachev, freedom increased but the economy deteriorated.
Blaming Communist Party bureaucrats for the economic problems, Gorbachev
replaced them with a freely elected legislature.
Then in 1989, Gorbachev refused to
send Soviet troops to bolster Communist regimes in Eastern Europe. Those regimes
began to crumble—first in Poland and Czechoslovakia, then in East Germany,
Bulgaria, and Romania. On November 9, 1989, exuberant Germans dismantled the
wall between East and West Berlin. The two Germanies, which had been separated
since the end of World War II, were reunited in 1990. The Baltic nations—Latvia,
Lithuania, and Estonia—declared their independence from the USSR, and other
republics in the USSR followed.
The thaw in Soviet-American relations
transformed world politics even before the Soviet Union collapsed. Reagan held
four major summit conferences with Gorbachev in three years. The two leaders
signed agreements to establish
scientific and cultural exchanges, to reduce strategic arms and conventional
forces, to improve the environment, and to destroy nuclear missiles that had
been placed in Europe. Seeking new opportunities, American businesses swiftly
made inroads in the former Soviet Union.
With the Cold War over and the Soviet
Union dismantled, the United States faced problems elsewhere. In August 1990
Saddam Hussein of Iraq invaded the neighboring nation of Kuwait. In February
1991, after extensive bombing of Iraqi forces, President Bush sent U.S. troops
into Kuwait and Iraq. They were part of a United Nations coalition led by the
United States.
The Persian Gulf War was brief; the UN
coalition swiftly retook Kuwait and defeated Iraq. The American victory failed
to satisfy critics who believed that Hussein should have been ousted from power.
After Hussein’s defeat in 1991 a UN special commission (UNSCOM) was appointed to
force Iraq to disarm. Beginning in 1998, after Iraq repeatedly refused to allow
UNSCOM to inspect its weapons sites, the United States again bombed Iraq on
several different occasions.
After the Persian Gulf War, the United
States assumed an active role in trying to preserve global peace. Many new
challenges emerged in the 1990s. The part of the world once dominated by the
USSR was in turmoil. Trouble arose in formerly Communist Yugoslavia, where
fierce battles erupted among Serbs, Croats, and Bosnian Muslims (see Wars
of Yugoslav Succession). In March 1999 NATO forces began bombing Serbia and
Serbian targets in Kosovo. Both Serbia and Kosovo were part of the Federal
Republic of Yugoslavia (FRY, now the republic of Serbia and Montenegro). The
attack sought to stop Serbian troops from “ethnic cleansing,” which drove
Albanian Kosovars out of the province to neighboring nations. In June 1999 NATO
and FRY military leaders approved an international peace plan for Kosovo, and
NATO suspended its bombing.
Disputes festered outside Europe as
well. In 1989 Chinese-U.S. relations faltered when China crushed a prodemocracy
movement in Tiananmen Square (see Tiananmen Square Protest). In Latin
America, the civil wars of the 1970s and 1980s waned, but other problems
endured: poverty, guerrilla warfare, and drug trafficking.
The Middle East remained another
insecure region. In late 1987 the intifada, a Palestinian uprising
against Israeli occupation, began in the Gaza Strip and spread to the West Bank.
Although Israel and the Palestinians signed agreements in 1993, 1995, and 1998,
peace remained elusive.
Also, throughout the 1980s and 1990s
the United States contended with a worldwide threat of terrorism, at home and
abroad. Terrorists bombed the World Trade Center in New York City in 1993. In
1995 a massive bomb exploded in a truck in front of the Alfred P. Murrah Federal
Building in Oklahoma City, killing 168 people and destroying much of the
building.
At the end of the 20th century,
Americans confronted a world divided between highly industrial societies and
underdeveloped ones. Industrial societies had high literacy rates, high living
standards, and stable birth rates. Underdeveloped societies had extensive
poverty, high rates of disease, high population growth, and low literacy.
Threats of annihilation no longer came primarily from the former Soviet Union,
whose successor states still held thousands of nuclear weapons, but from rogue
nations seeking nuclear weapons and from terrorists.
C | Toward a Global Economy |
In the last quarter of the 20th
century, the United States moved toward a postindustrial economy, one based more
on services and information processing than on manufacturing. In the 1970s
industrial production declined, most significantly in iron, steel, autos,
rubber, and textiles. Competition from abroad, such as auto and steel imports
from Japan and Germany, forced many Americans out of well-paying jobs, and the
manufacturing sector continued to shrink. See also Globalization.
The service sector, however, expanded.
Some service workers were highly paid, such as computer technicians, engineers,
and managers. Most service workers, however, worked in low-paying jobs, such as
retail sales, fast food, or custodial work. The decline of manufacturing and the
loss of jobs were closely tied to the development of a global economy. In a
global economy, capital and business relationships cross national and regional
boundaries.
Roots of the global economy reach back
to the late 19th century, when large businesses set up overseas operations. In
the 1950s major American companies sought facilities and markets in Europe. The
most recent wave of globalization began in the 1970s, led by the United States
and Japan. In both countries, large multinational corporations produced goods
and ran subsidiary units in other nations. By the 1990s computers and the
Internet (a worldwide network that links computers and provides instant
communication) enabled investors to move capital anywhere in the world
instantaneously, uncontrolled by government.
At the century’s end, Americans were
enmeshed in the global economy; tens of millions of American jobs depended on
world markets. Many U.S. companies set up operations abroad to reduce labor
costs and to ensure access to foreign markets. Consequently, Americans lost jobs
that moved overseas. Meanwhile, foreigners invested capital in U.S. banks,
businesses, and real estate. Japanese companies built auto plants in Tennessee
and Indiana that employed tens of thousands of American workers. The global
economy generated corporate profits, especially for the world’s largest
multinationals. Less fortunate consequences included a rising trade deficit
(Americans bought more in foreign goods than they sold to foreign nations). The
global economy also meant that events in markets around the world had a greater
effect on financial markets in the United States. Many American investors
discovered this effect in the fall of 1998, when stock prices, influenced by
markets in Japan, Europe, and around the globe, fluctuated wildly.
To attract investment, increase trade,
and regulate the global economy, the United States joined regional trade
organizations. In East Asia, the Asia-Pacific Economic Cooperation (APEC) linked
the United States and Asian nations. In 1988 the United States and Canada signed
a treaty to begin a transition to complete free trade, and in 1994 the
arrangement, the North American Free Trade Agreement (NAFTA), was extended to
include Mexico. The General Agreement on Tariffs and Trade (GATT) treaty, signed
by the United States in 1994, lowered trade barriers and established the World
Trade Organization (WTO) to resolve disputes.
Postindustrialization and the global
economy took a toll on American workers. Between 1979 and 1995, the United
States lost more than 43 million jobs. In the 1980s, as the Cold War wound down,
defense industries folded and left thousands unemployed. In the early 1990s,
major corporations laid off hundreds of thousands of employees. Such drastic
cuts, known as downsizing, were necessary, companies claimed, in order to
compete in the global economy. Workers who moved to new jobs in the service
sector usually earned less, and the ranks of temporary and part-time workers,
lacking benefits or prospects for advancement, grew. Unemployment declined at
the end of the 1990s as the economy soared and the federal deficit shrank.
Organized labor also saw hard times.
As blue-collar jobs vanished, union membership fell, and unions began to lose
leverage as a political pressure group. A turning point came in 1981 when
President Reagan broke an air traffic controllers strike. The strikers’ loss was
especially significant because in this instance, the employer was the U.S.
government. Thereafter, unions struggled to cope with the dual impact of a
postindustrial and global economy. At the same time, income inequality
increased. By the 1990s, chief executive officers (CEOs) earned several hundred
dollars for every dollar earned by the average factory worker.
As labor lost power, management gained
it. Surging insurance costs, for instance, boosted employer demands for managed
care, or health maintenance organizations (HMOs). Such organizations multiplied
after Congress failed to enact the Clinton administration’s health-care plan. By
the end of the 1990s HMOs enrolled more than 60 percent of the population. A
goal of employers—to lower insurance costs—thus transformed a sector of the
economy. But managed care evoked controversy. Proponents claimed that managed
care plans focused on preventing illnesses rather than just treating them.
Critics argued that managed care deprived doctors of authority, forced patients
to cede the right to choose their own doctors, and put cost control before
quality care.
Finally, women’s entry into the work
force in massive numbers changed the economy. In 1980, 51.5 percent of women age
16 and older had joined the labor force, where they made up 42.5 percent of
employed workers. By 1997, 59.8 percent of women were in the labor force,
representing 46.2 percent of all workers. Women brought to the workplace new
concerns—about wage inequality, quality childcare, and the integration of paid
work with family life. A growing industry developed to provide day care for
children, but the government rarely funded such facilities although some people
thought it should.
Despite their increased numbers in the
workplace, women generally received less pay than men. Women’s organizations
demanded pay equity, or comparable worth, an effort to raise pay levels in
occupations in which women predominate. Under a pay equity policy, for instance,
a female office worker and a male truck driver working for the same company
would receive comparable pay if their jobs were judged to be of comparable
value. Women managers complained about a “glass ceiling”
that limited their prospects for advancement. In offices, new technology
transformed day-to-day work but devalued the skills of clerical employees. Many
women joined the growing pool of less costly temporary or
contingent workers.
Women’s growing role in the work force
led to changes in public policy. One change was a family leave policy. In 1990
President Bush vetoed a bill that would have offered unpaid leave to workers
with family obligations. But President Clinton in 1993 signed a family leave law
that required companies with 50 or more workers to allow employees up to 12 weeks
of unpaid leave a year to cope with family concerns. The law enabled workers
with obligations related to childbirth, adoption, illness of a family member, or
an aged relative to take short periods of time off without fear of losing their
jobs.
Women’s concerns about economic
equality also led the federal government in the 1980s to develop policies
against sexual harassment in the workplace. Sexual harassment, according to the
Equal Employment Opportunity Commission, refers to behavior that makes sex a
condition of advancement, unreasonably interferes with an individual’s job
performance, or creates an “intimidating, hostile, or offensive working
environment.” Such behavior is a form of sex discrimination prohibited by the
civil rights law of 1964. Beginning in 1986, federal court decisions defined the
concept of sexual harassment and upheld rules against it. Sexual harassment
policy received extensive publicity in 1991 when law professor Anita Hill
testified before the Senate Judiciary Committee that Supreme Court nominee
Clarence Thomas had sexually harassed her; Thomas was confirmed nonetheless.
D | A Changing Population |
In the last quarter of the 20th
century, the United States underwent social changes as well as economic ones. By
the century's end, Americans were vastly more diverse. The new diversity
reflected rising immigration rates, a legacy of the Immigration and Nationality
Act of 1965. The act had abolished quotas based on national origins that favored
northern and western Europeans and imposed the first limits on immigrants from
the western hemisphere. According to the law, 170,000 migrants could enter
annually from the eastern hemisphere and 120,000 from the western hemisphere.
The act exempted spouses, children, and parents of U.S. citizens from these
numerical limits.
The new policy had unexpected results.
Legal immigration exceeded congressional limits, due mainly to the family
exemptions. Immigrants from Asia and Latin America quickly surpassed in number
those who came from Europe. In addition, illegal immigration soared. By 1998,
according to census data, immigrants accounted for 9.8 percent of the United
States population, compared with 4.8 percent in 1970.
In 1986 Congress passed the
Immigration Reform and Control Act, which outlawed the hiring of illegal aliens,
imposed penalties on employers who did so, toughened control of immigration on
the Mexican border, and offered amnesty to aliens who could prove that they had
lived in the country continuously since January 1, 1982. About 3 million undocumented
newcomers gained amnesty under the law. In 1990 Congress passed the most liberal
immigration statute of the post-World War II era. This law allowed 700,000
immigrants to enter the United States each year and provided asylum for
political refugees.
The new immigration of the late 20th
century differed from that of a century earlier. By the 1980s only 10 percent of
immigrants were Europeans. Over 40 percent were Asian—from China, Taiwan, Hong
Kong, Vietnam, Cambodia, Japan, the Philippines, and South Korea. Most of the
rest came from Mexico, other parts of Central America, the Caribbean, and South
America. Hispanic immigrants were the fastest-growing group. From 1970 to 1990
the number of Hispanics in the United States grew from 9 million to 22.4
million. Economic problems in Mexico spurred still more immigration, legal and
illegal. According to the Immigration and Naturalization Service, 5 million
illegal immigrants lived in the United States in 1996. The largest number of
illegal aliens in the 1990s came from Mexico, El Salvador, Guatemala, and Haiti.
Many others came from Canada, Poland, China, and Ireland. In 1996 Congress
passed the Illegal Immigration Reform and Immigrant Responsibility Act, which
made it easier for the government to deport aliens attempting to enter the
United States without proper documents.
Over 70 percent of immigrants who came
to the country in the 1990s settled in six states: California, Texas, Florida,
Illinois, New York, and New Jersey. By 1996 half the nation’s foreign-born
inhabitants lived in California, where protests arose against heavy immigration,
legal and illegal. In 1994 California voters passed Proposition 187, which
revoked the rights of illegal immigrants to state education, welfare, and health
services. Challenged in court, the main provisions of the law never took effect.
Conflict over increased immigration also led to incidents of racial bias, such
as black boycotts of Korean-owned grocery stores in New York in the 1980s and
confrontations between Asian Americans and African Americans in a Los Angeles
riot in 1992.
Critics of immigration policy
contended that lawmakers who passed immigration laws since the 1960s had
underestimated their effect. These critics believed that the new immigration
created more problems than benefits. They saw high immigration rates as
threatening America’s common culture, increasing competition for jobs, lowering
wages, profiting only employers, injuring labor, and especially harming those at
the bottom of the job market. Defenders of liberal immigration policies argued
that the United States had a long tradition as a nation of immigrants. They
stated that immigration boosted the economy, that the taxes paid by immigrants
exceeded the costs they incurred, and that newcomers took jobs no one else
wanted and contributed their skills and education to the U.S. economy. Proposals
to restrict immigration made no progress, but the increasing diversity of
American society led to new issues.
E | Unity and Diversity |
As the 20th century came to a close,
issues arose about whether group identity challenged national identity. Many
Americans wanted to preserve a sense of national unity while respecting social
diversity. They debated the pros and cons of bilingual education, the impact of
multiculturalism, and the merits of affirmative action policies in education and
employment.
Organizations representing
Spanish-speaking Americans began to demand bilingual education in the 1960s.
Mexican Americans in particular urged the use of Spanish in schools and the
teaching of Mexican American culture. In 1968 Congress passed the Bilingual
Education Act, which enabled non-English speakers to be educated in their own
language. The 1975 amendments to the Voting Rights Act enabled them to vote in their own
language.
In the 1980s opposition to
bilingualism grew; opponents sought to make English the official language of the
United States and to restrict bilingual school programs. Supporters of bilingual
education replied that forcing students to give up their native languages
weakened their distinctive cultures. Opponents contended that bilingual
education slowed the pace at which non-English-speaking students entered the
mainstream of society. In 1998 California voters passed Proposition 227, which
sought to replace bilingual education that had been in place for three decades
with a program of nearly all-English instruction. Arizona passed a similar law
in 2000.
Multiculturalism is a concept with
many meanings, but it often refers to acceptance of immigrant and minority
groups as distinct communities, distinguishable from the majority population.
Like bilingualism, multiculturalism provokes debate. Advocates of
multiculturalism believe that members of minority groups should enjoy equal
rights in American society without giving up their diverse ethnic cultures.
Multicultural education programs, for instance, strive to teach the content of
different cultures, to build tolerance of these cultures, and to eliminate
discrimination. The hope is to enable students to understand how other cultures
view the world. Multiculturalists reject the idea of a melting pot and
assimilation; they dismiss the idea that national identity must be based on a
common heritage and values.
Critics argue that multicultural
education creates conflict among groups more than it fosters tolerance of one
group for another. Cultural pluralism, critics contend, promotes rivalry and
divisions. Moreover, they assert, European traditions remain central to American
culture and institutions. Some critics find multiculturalism a token gesture
designed to hide continuing domination of American culture by the majority
group. Others argue that recognition of cultural differences and group
identities does not help address social and economic disadvantages.
The policy of affirmative action has
probably evoked the most widespread controversy. President Kennedy first used
the term in 1961 in an executive order requiring contractors that did business
with the government to take affirmative action to remedy past discrimination against
African Americans. President Johnson reissued the policy in another executive
order in 1965. During Nixon’s administration, the government increased
affirmative action regulations to include women and other minorities. To
supporters, affirmative action provides opportunities for members of groups that
suffered past discrimination. To opponents, it is reverse discrimination.
Opponents especially object to the use of quotas—the setting aside of a specific
number of college admission places or job slots—for members of minority
groups.
The Supreme Court dealt with this
controversial issue in Regents of the University of California v.
Bakke (1978). The Court upheld the claim of a white applicant that he had
been unconstitutionally denied admission to a state medical school solely on the
basis of race. However, Justice Lewis Powell, Jr., wrote in the controlling opinion
that racial preferences in determining admission are permissible if their
purpose is to improve racial diversity among students, and if they do not
stipulate fixed quotas but take race into account as one factor among many.
Soon affirmative action was extended
to employment, and the policy came before the courts in many subsequent cases.
The courts consistently sustained affirmative action policies. Businesses and
schools began to use such policies widely. But controversy persisted, and
affirmative action continued to be challenged at the polls and in the
courts.
In 1996 California voters approved
Proposition 209, an initiative that ended affirmative action throughout the
state in public hiring, purchasing, and other government business. The same year
the Fifth U.S. Circuit Court of Appeals barred the University of Texas Law School from any
consideration of race or ethnicity in its admissions decisions. The Supreme
Court chose not to review the case.
However, in 2003, the Supreme Court
reaffirmed its decision in Bakke in the case of Grutter v.
Bollinger. In a 5 to 4 decision, the Court reiterated its position on the
racial diversity principle, though it noted that racial preferences should not
be permanently justified. Civil rights organizations and education officials
hailed the decision. “In a society like our own,” wrote Justice Sandra Day
O’Connor in the majority opinion, “race unfortunately still matters. In order to
cultivate a set of leaders with legitimacy in the eyes of the citizenry, it is
necessary that the path to leadership be visibly open to talented and qualified
individuals of every race and ethnicity.”
XXVII | THE EARLY 21ST CENTURY |
By the end of the 20th century the Cold
War had ended, and the United States was riding a wave of unparalleled economic
prosperity. But Americans learned at the dawn of the 21st century that they were
not immune to the dangers posed by a volatile and turbulent world.
On September 11, 2001, terrorists
carried out a devastating attack on the World Trade Center in New York City and
the Pentagon in Washington, D.C. It was the first enemy action on American soil
since the Japanese attacked Pearl Harbor in 1941.
The country also faced an economic
recession beginning in 2001 in which more than a million jobs were lost. The
recession reminded the country that economic good times were not guaranteed to
last forever. While new realities spawned new fears, they also revealed reserves
of resilience and strength in the national character. Faced with unexpected
challenges, a resourceful and increasingly diverse country showed the world that
it could not be easily demoralized.
A | An Increasingly Diverse Population |
The United States had a larger, more
diverse population than ever as the 21st century began. According to the 2000
census, the population grew to more than 281 million people during the 1990s, an
increase of 32.7 million since the 1990 census. Hispanic Americans fueled much
of the population increase. The fastest growing minority group in the United
States, the Hispanic population grew from 22.4 million to 35.3 million, a 58
percent increase, from 1990 to 2000. The Asian American population grew by 48
percent in the 1990s. The census also showed that, for the first time since the
early 1930s, one out of every ten Americans was foreign-born. The country was
getting older as well. The median age in the United States rose to 35.3 years,
higher than ever. The fastest growing age group was a segment of the so-called
“baby-boom” generation—people between 45 and 54.
Most of the population growth took
place in the West and South in cities such as Denver, Colorado, and Atlanta,
Georgia. Big cities in the North and East such as Philadelphia, Pennsylvania,
and Detroit, Michigan, lost population in the 1990s. The nation’s midsection
also emptied out. Sixty percent of the counties in the Great Plains states
(Texas, Oklahoma, Iowa, Nebraska, Kansas, Minnesota, and North and South Dakota)
lost people. Nearly 2,300,000 sq km (900,000 sq mi) in this region met the
definition of frontier: land populated by six or fewer people per square mile
(2.3 people per square kilometer).
The American family also underwent
dramatic changes. Census data revealed that for the first time, married couples
with children represented less than a quarter of all U.S. households (23.5
percent, down from 38.8 percent in 1970). The number of single mothers, single
fathers, and unmarried couples grew sharply. However, the decline in the number
of so-called nuclear families—two adults and their children—did not necessarily
signal a breakdown in traditional families. Many married couples were simply
waiting longer to have children. And more couples were living longer after their
children left home. Two troubling trends, divorce and out-of-wedlock births,
slowed their growth in the 1990s.
B | The Bush Administration |
As President Clinton’s second term
came to an end, the country geared up for the 2000 presidential election. The
main candidates were Clinton’s vice president, Al Gore, and Texas governor
George W. Bush, the son of former president George Herbert Walker Bush. A
Democrat, Gore stressed protecting the environment and improving education.
Bush, the Republican candidate, campaigned as a “compassionate conservative,”
advocating a tax cut and conservative social policies.
The resulting vote was like no other
in U.S. history. For five weeks after the election, the outcome of the race
between Bush and Gore remained undecided. The critical state was Florida, where
Bush led by just a few hundred votes. A bitter legal dispute arose over the
recounting of some ballots in that state. After a tangled series of court
hearings and recounts in some areas of the state, the U.S. Supreme Court ruled 5
to 4 that the counting should end. The decision effectively awarded Florida’s
electoral votes and the election to Bush. Although Gore won the nation’s overall
popular vote by more than 500,000 votes out of 105 million cast, Bush captured
271 electoral votes to Gore’s 266, and thus the presidency. The extraordinary
closeness of the election reflected, at least to some extent, the public’s
doubts about whether either man was prepared to be president. It also showed
that the country remained deeply divided over which political party was best
able to address its problems. See Disputed Presidential Election of
2000.
Once in office Bush focused on tax
cuts, education reform, and an expanded role for church-based charities in
running social programs. In 2001 Congress approved Bush’s $1.35-trillion tax
cut, which took effect over a ten-year period, lowered income tax rates for all
taxpayers, and included a small refund to many taxpayers. In 2002 Bush signed
into law an education bill that established, among other things, performance
standards for public schools. A second round of tax cuts in 2003 provided
benefits for stock market investors by lowering the tax rate paid on dividends.
The 2001 tax cut also phased down the estate tax, eliminating it entirely by 2010.
C | Terrorist Attacks on the United States |
American life changed dramatically on
the morning of September 11, 2001. Terrorists hijacked four commercial
jetliners, crashing two into the World Trade Center towers in New York City,
which collapsed into smoldering rubble. Another hit the Pentagon in Arlington,
Virginia, while the fourth plane crashed in rural Pennsylvania after what was
believed to be a passenger uprising against the hijackers. About 3,000 people
died in the attacks. See also September 11 Attacks.
The government shut down all air
traffic for two days as fighter jets patrolled the skies. National Guard troops
were deployed on the streets in New York City and Washington, D.C. The major
stock exchanges were closed.
The event traumatized the nation.
Most Americans saw their country as virtually unassailable as the 21st century
began. With the Cold War over, America’s status as the world’s lone superpower
seemed secure. But as millions watched the catastrophe unfold on television, it
was clear that the country was vulnerable in ways that most people had not
imagined.
After the initial shock, the country
mobilized. Volunteers flooded blood banks and military recruiting stations.
Millions of dollars were raised for the families of victims. A new patriotic
sentiment surfaced as sales of American flags surged. Many people spoke of
simplifying their lives and of spending more time with family and friends.
The U.S. government quickly
identified the hijackers as members of al-Qaeda, an organization that, according
to U.S. officials, connected and coordinated fundamentalist Islamic terrorist
groups around the world. The government also believed that al-Qaeda was
responsible for other attacks, including the bombings of U.S. embassies in
Tanzania and Kenya in 1998 and the attack on the Navy ship U.S.S. Cole in
Yemen in 2000. Its leader, a wealthy Saudi businessman named Osama bin Laden,
had pledged jihad, or holy war, against the United States for its
activities in the Middle East. The group made its headquarters in Afghanistan,
where it was supported by the country’s rulers, an Islamic fundamentalist
movement known as the Taliban.
Instead of launching an immediate
attack, Bush spent the first days following the terrorist attacks consulting
with military leaders and assembling a coalition of nations to fight terrorism.
The coalition included countries in the North Atlantic Treaty Organization
(NATO) alliance, such as Britain.
Fears rose again in early October
when a powdered form of the bacterium known as anthrax began to appear in
letters in some places around the country. Anthrax lives in the soil and is most
often found in grass-eating animals such as cattle. It forms hard-to-kill spores
that, when inhaled or ingested, can cause serious and sometimes fatal infections. Over the
next few weeks, anthrax killed five people in Florida, New York, Connecticut,
and Washington, D.C. It also forced the temporary closure of two congressional
office buildings. At first some investigators thought that the outbreak was
another form of attack by al-Qaeda. As the investigation progressed, however,
some came to believe that someone inside the United States was responsible.
In early October the United States
went to war, bombing al-Qaeda training camps and missile installations in
Afghanistan. Within a few weeks, U.S. marines joined with Afghan opposition
groups to topple the Taliban. The U.S. forces killed or captured many al-Qaeda
fighters, but bin Laden remained at large.
On the home front, President Bush
signed the Patriot Act in 2001 to give the government expanded powers to monitor
terrorist suspects. Some critics, however, said the new law represented an
infringement on civil liberties. Bush also signed a law in 2002 that created a
new executive department, the Department of Homeland Security (DHS). The
department’s mission was to protect the United States against terrorist attacks,
reduce the country’s vulnerability to terrorism, and aid recovery in case of an
attack. The DHS combined dozens of federal agencies into one department, the
largest government reorganization since the Department of Defense was created in
1947. See also Civil Rights and Civil Liberties.
In 2003 a congressional inquiry
concluded in an 858-page report that the U.S. intelligence community “failed to
fully capitalize on available, and potentially important, information” that
could have helped prevent the September 11 attacks. The inquiry found that U.S.
intelligence agencies failed to share information with each other and failed to
take action based on the information they did have. Specifically, the report
cited the Central Intelligence Agency (CIA) for missing numerous opportunities
to notify the Federal Bureau of Investigation (FBI) that two men linked to
al-Qaeda could be in the United States. The two men were among the future
hijackers, and prior to September 11 had contact with an FBI informant in San
Diego, California. But because the FBI was unaware of their al-Qaeda link, the
bureau did not investigate them, missing what the congressional probe called the
“best chance to unravel the Sept. 11 plot.” Furthermore, the report found, the
CIA failed to put the two men on a watchlist used by the State Department, the
Immigration and Naturalization Service, and the Customs Service to deny
individuals entry into the United States.
The inquiry also found that FBI
headquarters failed to heed warnings from its Phoenix office about terrorist
suspects seeking to enroll in flight training schools or to act properly on a
request from its Minneapolis office to conduct a search of an alleged
conspirator in the terrorist attacks. Prepared by a joint committee of the House
and Senate Intelligence committees, the report disputed an FBI claim that none
of the hijackers had contacted any “known terrorist sympathizers,” finding
instead that five hijackers had contact with 14 persons who had been
investigated by the FBI for possible links to terrorism. The intelligence
community was aware as early as 1994 that terrorists might use aircraft in an
attack and knew as early as 1998 that bin Laden was planning an attack within
the United States, the report concluded.
In July 2004 an independent,
bipartisan commission formally known as the National Commission on Terrorist
Attacks Upon the United States issued its final report after a nearly two-year
investigation into the September 11 attacks. The commission, chaired by former
New Jersey governor Thomas H. Kean, found that neither the Clinton administration nor the Bush administration had, prior to September 11, grasped the gravity of the threat posed by al-Qaeda. The report said that “none
of the measures adopted by the U.S. government from 1998 to 2001 disturbed or
even delayed the progress of the al-Qaeda plot. Across the government, there
were failures of imagination, policy, capabilities, and management.” The
commission said its purpose was not to cast blame, but to make recommendations
for improving U.S. counterterrorist efforts, and it put forward several
proposals to unify U.S. intelligence agencies and place them under a national
intelligence director. Congress approved the creation of the office of a
national intelligence director in January 2005.
D | War with Iraq |
After the United States toppled the
Taliban in Afghanistan, the Bush administration turned its attention to Iraq.
Although a U.S.-led coalition had defeated Iraq in the Persian Gulf War in 1991,
Iraq’s leader, Saddam Hussein, remained in power. After that war ended, the
United Nations (UN) ordered Iraq to destroy its biological and chemical weapons.
Weapons inspectors were sent to Iraq to monitor its disarmament. However, in
1998 Iraq announced that it would no longer cooperate with the UN, and UN
weapons inspectors left the country.
In 2002 the Bush administration put a
renewed focus on Iraq as part of its war on terrorism. It claimed that Iraq
supported terrorist organizations and still had an arsenal of banned weapons.
The United States pressed the UN to force Iraq to allow weapons inspectors back
into the country. In October the U.S. Congress passed a resolution authorizing
the president to use military force against Iraq if Iraq did not cooperate with
the UN. The next month the UN passed a resolution cosponsored by the United
States and Britain ordering the immediate return of weapons inspectors to Iraq
and threatening “serious consequences” if Iraq did not disarm. Iraq agreed to
comply with the resolution, and inspectors began working in Iraq that same
month.
In early 2003 the United States and
Britain claimed that Iraq was not cooperating with UN weapons inspectors, and
they sought UN authorization of force against Iraq. However, some countries,
including France, Germany, Russia, and China, wanted to give the inspections
more time to proceed and opposed military action. After weeks of diplomatic
wrangling, the United States decided to forgo UN approval and pursue military
action against Iraq with a coalition of willing countries.
In March 2003 U.S.-led forces invaded
Iraq. By mid-April they had captured the capital city of Baghdād and other major
population centers and overthrown the regime of Saddam Hussein. In May President
Bush declared that major combat operations in Iraq had ended and that an ally of
al-Qaeda had been defeated. However, in the months that followed more U.S.
troops were killed by guerrilla insurgents than during the invasion itself. In
September Bush conceded that there was no evidence proving an al-Qaeda link to
the regime of Saddam Hussein. Unrest continued in Iraq, and even the capture of Saddam Hussein in December 2003 failed to end it. The insurgency was
concentrated mainly among Sunni Muslims and a segment of Shia Muslims opposed to
the U.S. occupation. See also U.S.-Iraq War.
The United States appointed a
25-member Iraqi Governing Council, consisting of the major ethnic and religious
groups in Iraq, but the council’s authority was subordinate to that of the U.S.
administrator in Iraq, L. Paul Bremer III. In March 2004 the council approved an
interim constitution, although the 12 Shia Muslim members of the council
objected to some of the constitution’s provisions. The constitution guaranteed a
broad array of democratic rights, including rights for women and the Kurdish
minority, and called for elections for a national assembly by January 1, 2005.
The Bush administration transferred sovereignty to an interim Iraqi government
at the end of June 2004, but it maintained about 130,000 troops in Iraq and
imposed a number of orders that introduced privatization to Iraq’s previously
state-run economy.
In the meantime the hunt for Iraq’s
alleged weapons of mass destruction proved fruitless. In October 2003 a team of
U.S. weapons inspectors reported that it had found no weapons of mass
destruction. In January 2004 the head of the group, David Kay, resigned and told
Congress that “we were all wrong, probably” about the existence of such weapons.
Kay urged an independent inquiry into the failure of U.S. intelligence. Kay said
the group not only could not find weapons of mass destruction but more
importantly could not discover any of the facilities needed to produce such
weapons on a large scale. A final report concluded that Hussein had ordered the
destruction of biological and chemical weapons and had discontinued a nuclear
weapons program but tried to keep these facts secret, fearing an attack by Iran.
The two countries had fought a nearly eight-year-long war (see Iran-Iraq
War).
In July 2004 the bipartisan
commission that investigated the September 11 attacks also concluded that there
had been no collaborative relationship between the Hussein regime and al-Qaeda.
Despite the undermining of the two principal reasons for invading Iraq, the Bush
administration maintained that the toppling of the Hussein regime had
nevertheless made the region more stable and more open to democracy.
E | The Faltering Economy |
After nearly a decade of
unprecedented expansion during the 1990s, the American economy began to show
signs of a slump at the beginning of the 21st century. In 2000 the so-called
dot-com bubble—the explosion of companies that sprouted up to take advantage of
the Internet—burst. Analysts cited many reasons for the failure of these
companies. Among them was that investors overestimated the extent to which
consumers were willing to buy goods and services online. When venture
capitalists—the people and companies that provide money to start-up
businesses—became reluctant to invest new funds, the collapse began.
As many Internet companies went out
of business, the stock prices of once high-flying companies such as Cisco
Systems, Inc., and Lucent Technologies began to plummet. Other large companies,
such as Microsoft Corporation and AOL Time Warner, Inc. (present-day Time Warner
Inc.), announced that they would not meet projected profits. And just as high-technology stocks had fueled the market’s rise, they now dragged it down.
Both the Dow Jones Industrial Average and The Nasdaq Stock Market ended 2000
with a loss.
Soon the rest of the economy started
to weaken. The National Bureau of Economic Research, a respected group of
economists, estimated that the U.S. economy actually stopped growing in March
2001. Manufacturing and employment began to decline. The big automobile
companies shut down plants and laid off thousands of workers. As businesspeople
traveled less, airlines began cutting back. By the end of 2001, corporate
profits had suffered one of their steepest drops in decades.
Many economists believe that the
terrorist attacks of September 11, 2001, made the country’s slumping economy
even worse. After remaining closed for several days after the terrorist attacks,
the stock market suffered a record plunge when it reopened, with anxious
investors selling off their holdings. Companies continued to trim workers,
accelerating a downsizing that would total more than 1 million jobs by the end
of 2001. The number of unemployed reached 8.3 million in December 2001, the highest level in seven years.
The federal government tried to
cushion the economic blows. Within two weeks of the terrorist attacks, Congress
approved $15 billion in aid for the devastated airline industry. But with
billions of additional dollars earmarked for defense spending and domestic
security in the wake of September 11, the government only had a limited ability
to cope with the faltering economy.
As 2002 began, however, the stock
market rebounded strongly, and the pace of corporate layoffs slowed. The Federal Reserve cut interest rates to record lows, which helped some sectors of the economy. Studies showed that even after the terrorist
attacks, American consumers continued to buy homes and cars in record numbers.
In 2003 the major stock indexes
recorded healthy gains, and other economic indicators were positive. The
recovery failed to replace the estimated 2.4 million jobs lost during the
downturn, however, and some economists characterized it as a “jobless recovery.”
Some corporations announced that they were hiring workers overseas to replace
workers in the United States, a practice known as outsourcing. As 2005 began,
concerns over inflation, motivated in part by a rise in petroleum prices, led
the Federal Reserve to begin raising interest rates at a faster pace than
previously anticipated.
F | Presidential Election Year |
The year 2004 began with the
Democratic Party’s Iowa caucus in January, the kickoff for the party’s
presidential nomination campaign in a presidential election year. By March
Senator John F. Kerry of Massachusetts had won enough delegates in the caucuses
and primaries to secure the nomination at the party’s convention in July.
President Bush ran unopposed in the Republican primaries and was nominated at
his party’s convention in New York City in August.
In the November elections, Bush
defeated Kerry, sweeping the South and the key swing state of Ohio to win both
the electoral college tally and the popular vote. Kerry won the Northeast, the
West Coast, and a number of Midwestern states.
Claiming a popular mandate from the
election, Bush began his second term by calling for a sweeping overhaul of
Social Security. His plan to replace guaranteed Social Security benefits with
private accounts invested in the stock market for younger workers met with
resistance, however.
G | Bush’s Second Term |
The failure of Bush’s Social Security
proposal seemed to set the stage for a series of mishaps for the Bush
administration that resulted in some of the lowest approval ratings for the
president since his election in 2000. Chief among these was the federal
government’s delayed response to Hurricane Katrina, an August 2005 disaster that
left tens of thousands of New Orleans residents, mostly poor and African
American, stranded in the flooded city. It was the costliest natural disaster in
U.S. history. A lobbying scandal involving Republican members of Congress, a
decision to lease some U.S. port operations to a company based in the United
Arab Emirates, and continued disorder in Iraq also contributed to popular
disapproval.
Nevertheless, Bush’s second term gave
him a historic opportunity to realign the Supreme Court in a more conservative
direction. With the death of Chief Justice William Rehnquist and the retirement
of Justice Sandra Day O’Connor, Bush succeeded in winning the Senate
confirmation of two conservative jurists, John Glover Roberts, Jr., who
succeeded Rehnquist as chief justice, and Samuel A. Alito, Jr., who succeeded
O’Connor. With Justice Anthony Kennedy often filling the role of a swing voter
and with some uncertainty about the judicial philosophies of the new appointees,
however, it was unclear if Bush’s new appointments would lead to the overturning
of significant precedents, such as Roe v. Wade.
The extent of presidential power in
relation to the U.S. system of checks and balances spurred controversy during
Bush’s second term. Throughout his prosecution of the wars in Afghanistan and
Iraq, Bush claimed that he had wide latitude as commander in chief to protect
national security. Those claims were the basis for denying Geneva Convention
protections to prisoners held at Guantánamo Bay, Cuba. In 2006 Bush claimed that
he had the authority as commander in chief and under the congressional
resolution that authorized military force in Afghanistan to order the National
Security Agency to eavesdrop on the overseas communications of U.S. citizens and
nationals. The secret program, begun after the September 2001 terrorist attacks,
was disclosed by a 2006 report in the New York Times. Some congressional
critics said the 1978 Foreign Intelligence Surveillance Act required judicial
review of electronic eavesdropping in such cases, making the program illegal.
See also Surveillance, Electronic.
Bush’s assertions of sweeping
presidential power were rejected by several Supreme Court decisions,
particularly Hamdan v. Rumsfeld (2006), which upheld the Geneva
Convention’s protections and struck down the administration’s plans to try
prisoners held at Guantánamo before special military tribunals. The
Republican-controlled U.S. Congress, however, addressed the Hamdan
decision by passing the Military Commissions Act of 2006, which gave Bush
administration officials immunity from prosecution for torture or inhuman
treatment of detainees and suspended habeas corpus for anyone declared an
illegal enemy combatant. Bush also frequently issued signing statements in which
he asserted a presidential prerogative to ignore provisions in legislation
passed by Congress if he deemed the provisions infringed on his alleged powers
as commander in chief. See also Civil Rights and Civil Liberties.
As the midterm elections approached,
Republicans were fearful that the president’s low popularity ratings could lead
to a Democratic takeover of one or both houses of Congress. But the Democratic
Party appeared to be disunited, particularly over the Iraq war, and uncertain as
to how to take advantage of the president’s unexpected setbacks. In the meantime
the U.S. economy showed considerable resilience, having recovered all the jobs
lost during the 2001 recession. Despite economic growth, however, the United
States continued to lose jobs in the manufacturing sector, and the two major
automakers, General Motors Corporation and Ford Motor Company, announced plans
for massive layoffs as their market shares dwindled.
Republican fears were validated by
the results of the 2006 midterm elections, which saw Democrats gain control of
both houses of Congress. The Democrats also took six state houses from the
Republicans, giving them a majority of the country’s governorships. President
Bush called the election results a “thumping,” and the day after the election he
asked for and received the resignation of Secretary of Defense Donald Rumsfeld,
the architect of the Iraq war planning. Polls showed that voters overwhelmingly
disapproved of Bush’s handling of the war. The congressional leadership of the
Democratic Party resisted calls from the left wing of the party for Bush’s
impeachment but pledged to hold hearings and investigations into the prewar
intelligence that led to the war, the administration’s handling of the war,
including allegations of torture and prisoner abuse, and the way in which Iraq
war funds had been spent.
As 2007 progressed Bush’s handling of
the Iraq war continued to come under intense criticism, particularly after Bush
largely ignored the recommendations of the bipartisan Iraq Study Group. The
conviction of Vice President Dick Cheney’s chief of staff in the Valerie Plame
Wilson affair, and the controversial firing of eight U.S. attorneys also
distracted the Bush administration from pursuing its second-term agenda.
The highly regarded Iraq Study Group,
made up of leading foreign policy experts from both parties, had issued its
final report in December 2006 and made 79 recommendations for how to wind down
the war and bring home American troops by 2008. Included among the
recommendations was a call for U.S. negotiations with Iran and Syria and a
renewal of the Israeli-Palestinian peace process with the aim of achieving a
region-wide peace settlement in the Middle East. In a January 2007 nationally
televised address, however, President Bush instead called for an additional
20,000 troops to be sent to Iraq. The Democratic-controlled Congress
characterized the proposed “troop surge” as an escalation and for the first time
passed legislation that called for a definite timetable for a U.S. troop
withdrawal. A number of Republican members of Congress began to take up the
troop withdrawal position, which President Bush characterized as “tying the
hands” of U.S. military commanders in Iraq. Bush vowed to veto any legislation
with a troop withdrawal deadline.
Meanwhile, the Bush administration
was distracted by the conviction in March of I. Lewis “Scooter” Libby, Cheney’s
former chief of staff, for perjury and obstruction of justice in the
investigation of who leaked the identity of covert Central Intelligence Agency
(CIA) agent Valerie Plame Wilson. In his concluding remarks to the jury, the
U.S. prosecutor said that Cheney himself was “under a cloud” for his role in the
affair, and many political observers believed that Cheney’s once-prominent role
in the administration was being sidelined. The same month congressional
investigations into the firing of eight federal prosecutors cast a cloud over
Attorney General Alberto Gonzales, who later resigned.
H | The Beginning of the 2008 Presidential Campaign |
President Bush’s lame-duck status and
the Democrats’ narrow control of Congress made it increasingly apparent that
little would be accomplished domestically during Bush’s last years in office.
The Democrats lacked enough votes to override presidential vetoes of their
legislation, and Bush lacked enough Republican votes in Congress to enact his
own program. As a result the race for the presidency in 2008 got off to an early
start, and it promised to be the most expensive election in U.S. history. By
April 2007 six declared Democratic candidates had raised $78 million, and five
Republican candidates had raised $53 million, for a combined total of $131
million, with many more months of fundraising to go. Among the leading
Democratic contenders were senators Hillary Rodham Clinton and Barack Obama and
former vice-presidential candidate John Edwards. Among the leading Republicans
were former governor of Massachusetts Mitt Romney, former New York City mayor
Rudy Giuliani, and Senator John McCain.
They were vying for leadership of a
country that reached a population of 300 million in 2006. They were contending
over a number of issues, ranging from the ongoing wars in Afghanistan and Iraq
to global warming, health insurance, immigration, nuclear weapons proliferation,
and the health of the U.S. economy.
By March 2008 the field of candidates
had narrowed to Clinton and Obama among the Democrats and McCain, who became the
presumptive Republican nominee after winning enough delegates in the Republican
primaries and caucuses to secure the nomination at the Republican convention in
September. The range of issues appeared to narrow as well as polls showed voters
increasingly focused on two issues: the state of the economy and the U.S.-Iraq
War.
As 2008 began the new chairman of the
Federal Reserve, Ben S. Bernanke, warned that the U.S. economy appeared to be
headed for a recession. A number of factors contributed to the economy’s growing
malaise, including a decline in housing prices, a sharp increase in oil prices,
a growing number of housing foreclosures and personal bankruptcies, a
diminishing savings rate, rising budget deficits, the growth of income
inequality, and a crisis in financial markets that led the Federal Reserve to
offer a $200-billion loan program to investment banks to offset their losses in
the mortgage market.
For several years Americans had, on
average, stopped saving money. Some economists became alarmed when the personal
savings rate in the United States reached negative territory—that is, Americans
on the whole were spending more money than they earned. Many observers
attributed the negative savings rate to the steady increases in housing prices.
Americans appeared to be banking on the value of their home for their
retirement, rather than setting aside money in savings accounts. Meanwhile,
other observers were warning that the housing market was a bubble that would
eventually burst, and in 2007 it did, as home prices declined nationwide by 8.9
percent, one of the sharpest drops in U.S. history.
A rise in housing foreclosures added
to the gloomy picture. Defaults on home mortgages reached an all-time high in
September 2007. Particularly hard-hit were people who had taken out adjustable
rate mortgages, arrangements that enabled them to pay monthly mortgages at a
relatively low interest rate. After the rates rose, many of these people could
no longer afford to make their monthly payments. Many of the foreclosures
affected people who received so-called subprime mortgages—that is, loans made to
people whose credit ratings or income would usually have disqualified them for a home purchase loan. A report by the U.S. Congress estimated that as many as 2 million
families with subprime mortgages would lose their homes due to their inability
to meet rising mortgage payments.
Further complicating the growing
housing crisis was the discovery that, due to banking deregulation, no one really
knew who held the mortgages that Americans were defaulting on. In March 2008 the
Federal Reserve announced a $200-billion loan program for about 20 large
investment banks to reassure investors worried about losses in the mortgage
market. The Fed also extended an additional $30 billion for J. P. Morgan Chase
& Co. to acquire a leading investment bank known as Bear Stearns, which was
threatened with bankruptcy due to losses in the mortgage market.
The first part of this article was
contributed by Paul E. Johnson. The second part, from Reconstruction to the
Early 21st Century, was contributed by Nancy Woloch.