America as Apocalypse
Addendum to Some Musings on Humanity
![](https://substackcdn.com/image/fetch/$s_!dCHe!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F30cbebc9-aeed-4bbb-896d-25a59b3fc6a0_476x635.jpeg)
America: A Cultural Analysis
I.
The United States today is arguably the most influential country in history, a position shaped by unprecedented economic, military, political, and cultural reach. This dominance is not inevitable, nor permanent, but its current scale is unmatched.
Economic and Global Influence
Economy: At roughly $25 trillion, the U.S. economy is the largest in the world. States like California, Texas, and New York would individually rank among the world’s top economies if treated as countries.
Trade and Currency: Roughly 80–90% of global trade is invoiced in dollars, making the U.S. dollar the de facto global standard. Even the euro, though widely used, depends in moments of crisis on Federal Reserve dollar swap lines.
Financial Markets: The U.S. accounts for 42% of global equity market capitalization with just 4.5% of the world’s population.
Resources: It is the world’s largest producer of oil, natural gas, food, and timber.
Manufacturing: The U.S. is the second-largest manufacturing nation after China.
Cultural and Soft Power
American films, music, video games, social media platforms, and technology dominate global cultural consumption.
English serves as the global language of science, trade, and aviation, amplifying U.S. influence worldwide.
Military Reach
U.S. defense spending exceeds that of the next ten highest-spending countries combined.
Military alliances integrate allied forces into U.S.-led networks, while hundreds of thousands of troops are stationed in over 30 countries, projecting a historically unprecedented global military presence.
America as a Nation in Motion
Unlike European countries with centuries of stable cultural cores, the U.S. has been defined by rapid expansion and population growth:
1770–1920: Population grew from 2 million to 105 million, while territory expanded from 410,000 to 3 million square miles.
States like California, Texas, Michigan, and Minnesota saw exponential population growth, often with incoming settlers creating entirely new cultural landscapes.
Immigration intensified the “double foreignness” of American life: newcomers had to adapt both to the U.S. and to newly settled regions.
From 1910 to 1940, the first Great African-American migration reshaped Northern cities like Chicago, further transforming cultural and economic patterns.
Until World War II, most of the United States was populated largely by strangers. Rapid territorial expansion and explosive population growth created societies without coherence or historical continuity. California, for example, had been a state for roughly a century by 1950, yet its population had multiplied fifty- to eighty-fold since its early days. Cities and towns—boom towns—sprang up overnight, their residents largely unfamiliar with each other or their surroundings.
The abundance of land defined the New World. Unlike Europe or parts of Asia, where land scarcity constrained opportunity, the U.S. offered vast, unclaimed territories. This opportunity attracted migrants seeking a new life, but it also obliterated Old World social patterns. European land tenure customs and communal norms became irrelevant; the geographic and social landscape demanded new cultural structures. Land speculation and settlement became central to the nation’s foundation, shaping its social and economic logic.
American society later explained this rapid cultural recomposition through the metaphor of the Melting Pot—a narrative of assimilation and unity. Yet the critical question is: what were immigrants assimilating into? There was nothing fixed to absorb them. Unlike tribal societies, where rituals and masks transform known individuals into symbolic figures, Americans had no preexisting cultural framework to emulate. In a land of strangers, identity had to be performed, communicated through behavior, attire, and language, rather than inherited from tradition.
English became the assumed language of American identity, even amid large non-English-speaking communities.
“Acting American” became a social necessity, masking the underlying cultural void.
Generationally, even locally born Americans often had no family roots in their towns, reinforcing the sense of impermanence and novelty.
This profound absence of historical or social patterns shaped a unique cultural psychology: identity is constructed rather than inherited. It fostered both adaptability and a persistent sense of rootlessness, explaining the American fascination with ancestry, cults, millennial movements, and ideological fervor. Where most societies inherit coherence, Americans often improvise it, seizing fragments of heritage or ideology to anchor themselves in a fundamentally fluid environment.
The Cold War that followed World War II provided the first broad sense of American identity: “We are not Communists.” This negative definition offered clarity for decades, underpinning cultural, political, and social cohesion. With the fall of the Berlin Wall in 1989, that organizing principle vanished, revealing the underlying absence of self-definition in American society. The cultural and political confusion of the past thirty years stems, in part, from this vacuum: Americans had long relied on external contrasts to define themselves, and now those contrasts no longer sufficed.
Across most of the U.S., particularly outside the historically stable East Coast and South, 90% of the culture evolved without deep historical or generational moorings. Rapid demographic shifts, territorial expansion, and immigration created societies without inherited patterns, forcing Americans to forge identity through action, performance, and assumption. This destruction of old forms without the production of fixed new ones remains central to understanding American culture, behavior, and worldview.
II.
Culture is generally understood as the patterns, behaviors, traditions, and habits of a people. In most of the world—China, India, Europe, Africa—these patterns develop over centuries, sustained by geographic and social continuity. Even when disrupted, societies retain a shared sense of history and belonging. The United States, by contrast, has largely lacked these stabilizing elements. Rapid population growth, mass immigration, and internal migration produced societies where most residents were new, strangers, or had no generational roots, making shared traditions almost impossible.
American history is unusual not simply because people are indifferent to it, but because the nation was structurally precluded from inheriting deep history. Unlike Europe or Asia, where cultural memory and social continuity provide coherence, the U.S. was settled in a landscape that had been systematically emptied of its preexisting cultures.
Pre-Columbian North America may have housed 5–50 million people; a conservative estimate is roughly 10 million.
Old World diseases—smallpox, measles, influenza—wiped out 50–80% of the population even before sustained colonial settlement.
By the time English colonists arrived, many Native societies were already destabilized or collapsed, with cities abandoned, agricultural systems disrupted, and social structures fractured.
This created a landscape where European settlers encountered a cultural vacuum, not a thriving, continuous civilization.
The colonial expansion that followed compounded this cultural void:
Early English settlements, from Virginia (1607) to Pennsylvania (1681), arrived in waves, often negotiating with Native tribes only to violate agreements when expansion became convenient.
Treaties—360+ signed between 1775 and 1871—were systematically broken to secure land for settlers. The government protected settlers, not Native populations, reinforcing the erasure of Native claims, rights, and history.
The Homestead Act and other land policies treated the land as uninhabited, reinforcing the illusion of “virgin territory” and justifying the displacement of Indigenous peoples.
This was not accidental. It was necessary to establish settler society in a vast, underpopulated landscape. The cultural result was profound: the deep history of North America was effectively written out of living memory for most Americans, leaving the country without inherited patterns of culture, social norms, or historical continuity.
Without access to sustained historical memory, Americans have constructed culture in a vacuum:
Native American history, though rich and abundant, is culturally segregated from the broader American identity.
Even when discovered, artifacts and human remains, like the Kennewick Man, are treated as subjects of scientific inquiry or tribal claims, not as shared national heritage, in sharp contrast with European examples such as Ireland’s bog bodies, which are embraced as part of collective identity.
The result is a society where culture must be improvised, identity performed, and history selectively acknowledged. The absence of deep continuity, traditions, and inhabitation is a defining feature of the United States and a key to understanding its distinct cultural psychology.
African-American history presents another profound dimension of the United States’ lack of historical continuity. The transatlantic slave trade, most active between 1720 and 1780, brought hundreds of thousands of Africans directly to the American colonies. A clause in the U.S. Constitution barred Congress from ending the importation of enslaved people before 1808, when the trade was finally outlawed; slavery itself persisted until 1865. Consequently, African-Americans often have deeper roots in the U.S. than most European immigrant groups, whose arrival peaked between 1850 and 1920.
Yet, despite this longer presence, African-American history has historically been excluded from mainstream narratives. States like Louisiana and Mississippi had populations that were roughly 50% African-American in 1880, and Virginia about 30%, yet this community’s contributions were systematically erased from the construction of “American” identity. Their history is often segregated—celebrated separately as African-American history but not integrated into the collective narrative of the nation.
Literature reflects this tension. Flannery O’Connor portrays the existential threat to white identity posed by African-American agency and visibility in works like Everything That Rises Must Converge and Revelation. Ralph Ellison’s _Battle Royal_ captures the experience of navigating a society that recognizes African-Americans only when they intersect with white institutions, such as Jackie Robinson’s breaking of baseball’s color barrier, while the rich history of African-American achievement outside those contexts remains marginalized.
Thus, African-American history is present yet culturally partitioned, reinforcing a broader pattern: the U.S. has consistently excluded or compartmentalized historical narratives to preserve certain identities and social hierarchies.
The United States also ignores much of its European colonial heritage:
France controlled regions from Canada to Louisiana, yet this history is rarely acknowledged as foundational to the U.S.
The Netherlands established New Amsterdam (now New York), yet Dutch roots are overlooked.
Sweden colonized parts of Delaware, Pennsylvania, and New York, largely erased from memory.
Russia maintained missions in Alaska, while Spain ruled vast territories in Arizona, Texas, New Mexico, and California, leaving behind Spanish missions and cultural legacies often ignored in education and public awareness.
Even European immigration history, celebrated in identity claims like “Irish-American” or “Italian-American,” ignores earlier colonial roots in favor of more recent arrivals. The United States, in effect, has deliberately severed connections to deep historical roots, allowing expansion and settlement without reckoning with preexisting claims or cultural legacies.
Oscar Wilde’s visit to San Francisco’s Chinatown captures the peculiar result of this historical erasure. Wilde reportedly described it as “the only civilized part of the United States”, recognizing the continuity of Chinese culture transplanted in a foreign land. The observation is striking: while Chinese immigrants maintained a deep, shared cultural framework, the rest of the U.S. lacked established, coherent patterns—cultural, historical, and social.
Chinatowns illustrate what the United States might have had: continuous, organized communities with traditions, language, and history. Instead, the broader society emerged in a vacuum, with patterns disrupted, histories ignored, and identity constructed by exclusion rather than continuity.
The United States’ unique history—marked by the erasure of Native American cultures, the marginalization of African-American contributions, and the neglect of European colonial legacies—has created a society without deep-rooted cultural continuity. Unlike most nations, where history informs identity and social cohesion, American culture has been built in the absence of inherited traditions, requiring improvisation, adaptation, and exclusion to establish a national identity.
This vacuum explains much about the distinctiveness, instability, and seeming peculiarity of American culture: it is a society whose history has been selectively erased, segmented, or ignored, producing a culture that is consciously and unconsciously aware of what it has deliberately left out.
III.
The distinctiveness of American culture can largely be traced to the absence of deep, intergenerational social and cultural structures. Native American history, African-American history, and the relatively short history of settlement in the Midwest and West mean that the U.S. emerged in a historical and social vacuum. This absence of continuity helps explain the centrality of American individualism, a trait repeatedly noted by observers of American society.
American individualism is not merely a philosophical abstraction; it is a practical response to a historical void. With little inherited social framework or deeply rooted traditions, Americans turned inward, constructing identity and meaning around the self rather than established institutions or communal history. While societies worldwide have debated the balance between individual and collective—through Plato, Aristotle, Rousseau, medieval monks, and Chinese philosophers—the American expression of individualism is uniquely radical.
The Heritage Society captures this vividly:
“Individualism has been the primary force of American civilization for three centuries… The American Pioneer is the epic expression of that individualism, and the Pioneer spirit is a response to the challenge of opportunity, to the challenge of nature, to the challenge of life, to the call to the frontier.”
Here, the American individual becomes the central agent of history and culture, responding to opportunity and challenge in the absence of inherited social structures.
Ralph Waldo Emerson and Henry David Thoreau articulate this extreme American individualism. In Self-Reliance, Emerson declares:
“Society everywhere is in conspiracy against the manhood of every one of its members… The virtue in most request is Conformity; self-reliance is its aversion.”
“Life only avails, not the having lived… only the present moment of becoming matters.”
“We must go alone.”
Emerson positions society, institutions, and inherited norms as obstacles to the individual, asserting that true power and divinity reside within the self. History, reputation, and collective morality are subordinate to the individual’s immediate and ongoing act of creation.
Thoreau enacts these principles at Walden Pond, embracing radical self-sufficiency and rejecting inherited constraints:
“Better if they had been born in the open pasture and suckled by a wolf… One generation abandons the enterprises of another like stranded vessels… Men are not so much the keepers of herds as herds are the keepers of men.”
For Thoreau, societal norms, property, and inherited obligations are shackles; freedom arises from living deliberately and independently, even at the cost of comfort or conformity.
American individualism, in its Emersonian-Thoreauvian form, is a response to historical absence, a culture-building mechanism in a society lacking the deep continuity found in Europe, China, or elsewhere. It prizes self-reliance, personal agency, and rejection of inherited authority over tradition, history, and social expectation. This mindset has profoundly shaped American politics, culture, and social imagination, creating a civilization unlike any other in its radical celebration of the individual.
American individualism is not merely the celebration of the self; it is a radical, often isolating worldview. Its contours can be understood through five key principles:
1. Society is the Enemy
In American thought, society is not a supportive framework but an obstacle. Institutions, traditions, and collective norms are repressive forces that constrain the individual. True self-realization requires resistance to societal pressures, because civilization and its structures are inherently hostile to personal genius.
2. The Past and Future are Lost
American individualism rejects continuity. Inheriting wealth, land, or status is seen as a limitation on personal freedom, while leaving such legacies to the next generation perpetuates constraint. The past is discarded, the future unplanned; each individual must carve their existence independently. Emerson and Thoreau emphasize rupture over inheritance, breaking with history to affirm the primacy of the present moment.
3. Other Men are Contemptible
In this framework, other people are rarely allies. They are viewed as a deluded, fearful, and conformist herd, incapable of realizing their own potential. The individual does not simply flourish alongside society; they must contend with the weakness and mediocrity of others, whose collective influence threatens autonomy.
4. Given Identity Does Not Count
All inherited or socially assigned identities—family, community, or local affiliation—are insufficient and often obstructive. One’s essence must emerge independently, unshaped by external definitions or social roles. Identity is not received; it must be constructed from nothing.
5. You Must Make Something of Yourself
With society, history, and inheritance stripped away, self-creation becomes imperative. Personal achievement is the sole measure of value. Emerson’s insistence that “doing” matters more than being underscores the American ethic: one’s worth is determined by what one accomplishes alone, often in isolation, against nature or circumstance. The frontier, in this sense, is both literal and metaphorical—a stage for the individual to assert independence.
This extreme individualism generates desperation for self-definition. Without inherited structures, Americans are compelled to act, produce, and assert themselves constantly. Failure to do so is equated with moral and social failure, producing a culture obsessed with productivity and personal achievement. Workforce participation and meritocratic ideals reflect this compulsion: one must work, grow, and create to validate one’s existence.
Unlike American individualism, Stoicism historically emphasized cultivating oneself to serve society effectively, preparing for civic life and contributing to the common good. Epicureanism, by contrast, promoted community and selective engagement. Neither philosophy envisioned a radical, isolated self detached from history, society, or responsibility—the hallmark of the American model. Modern American appropriations of these philosophies often invert their intent, emphasizing self-sufficiency detached from civic duty, mirroring the isolationist individualism that Emerson and Thoreau championed.
American individualism manifests in cultural narratives that exaggerate universal potential, exemplified by Malcolm Gladwell’s Outliers. The idea that anyone can become a Mozart with sufficient effort reflects a core tenet: talent and inheritance must be erased to validate the myth of radical self-creation. The reality, that abilities are unequal and shaped by history and environment, is suppressed to preserve the ideal of complete personal autonomy.
Ayn Rand epitomizes the most extreme articulation of American individualism. She asserts that:
“Every form of happiness is private. Our greatest moments are personal, self-motivated, and not to be touched.”
This claim is deeply flawed. Across human history, joy and fulfillment have often been collective and shared. Ancient Greek festivals, Confucian ceremonies, and communal experiences like attending a symphony create a happiness impossible to achieve alone. Beethoven’s Ninth Symphony exemplifies this: despite his deafness, his genius was realized through the shared collaboration of musicians and audience, demonstrating that even the solitary genius relies on communal interaction.
Rand’s vision denies this. In the American individualist ethos, victory and achievement are solitary. Heroes, from the cowboy to literary archetypes, are defined by their isolation—accomplishing feats alone and departing before forming lasting ties. Community, tradition, and shared experience are seen as obstacles to authentic self-expression.
American individualism imposes profound limitations on social cohesion and understanding of civilization:
Striving Above All: Identity must be earned; inherited wealth, status, or community ties are dismissed. One must constantly work, compete, and prove oneself, creating a culture obsessed with productivity and self-creation. This valorization of self-reliance often blinds Americans to the value other cultures place on community, family, or inherited tradition.
Resistance to Structure: Any guiding framework—from classical music to communal traditions—is treated as restrictive. The ideal is absolute autonomy, even at the expense of mastering craft or engaging with enduring cultural forms. Structure is perceived as constraint rather than a tool for growth.
Erosion of Shared Identity: By rejecting inherited and societal structures, American individualism strips people of stable sources of identity and belonging. Without community or historical continuity, individuals often experience a profound existential anxiety. This vacuum helps explain the rise of identity politics: people seek validation and meaning where traditional forms of belonging no longer provide it.
The ethos of radical individualism produces a nihilistic, competitive, and isolating culture. Success is framed as self-made; failure, as moral or personal inadequacy. This worldview, articulated in literature from Ayn Rand to Cormac McCarthy, imagines a world in which individuals confront an indifferent wilderness or society of “herd-like” mediocrity, with happiness rendered entirely private and unshared.
Even figures like Emerson and Thoreau, while more moderate, embody this tension between individual autonomy and social engagement. Thoreau’s retreat to Walden exemplifies the pursuit of selfhood in isolation, yet in practice he maintained regular contact and dialogue with his peers, highlighting how mythologized the extremes of the American individualist ideal really are.
The technological and industrial revolutions enabled a scale of individualist pursuit previously impossible. Where earlier generations were constrained by agricultural and social structures, modern Americans can more fully enact the ethos of radical self-reliance—sometimes to the point of unhealthy isolation and obsession with personal achievement.
IV.
American culture is often described as highly materialistic. Yet this is not merely a generic love of things—it is a particular form of materialism deeply intertwined with radical individualism. Unlike most societies, where material expression often supports tradition or communal identity, in the United States material possessions became a central marker of selfhood. Without traditional social structures to define belonging, Americans turned to personal wealth, property, and goods as the foundation of identity.
Several historical and structural factors made this uniquely American materialism possible:
Land and Opportunity:
The United States offered unprecedented access to arable land. By the late 1800s, the country possessed at least three times the farmland of the entire United Kingdom. Programs like the Homestead Act allowed individuals to claim land almost freely, creating a cultural ethos of material abundance and personal stake in the land. This was a stark contrast to Europe, where land was scarce and inherited, and wealth accumulation was constrained by centuries-old hierarchies.
Economic Growth and Industrialization:
Rapid industrialization and mechanization fueled material aspiration. The U.S. experienced roughly 4% annual GDP growth during the 19th century—unprecedented for the era. Combined with abundant natural resources, this created a society in which material acquisition was not only desirable but achievable, reinforcing the perception that wealth and possessions were both virtuous and inevitable.
Material Wealth as Universal Value:
Over time, material success became synonymous with competence and moral worth. Media coverage often elevates wealth as a proxy for insight or authority, reflecting the belief that money confers universal expertise. In this way, wealth is treated less as a tool and more as a cultural ideal, shaping societal values and aspirations.
World War II amplified the American material ethos. Even before direct involvement, the U.S. rapidly scaled production, demonstrating unparalleled industrial capacity and logistical mastery. For instance:
Aircraft production skyrocketed from 2,000 planes in 1936–1938 to 40,000 annually by 1944.
Aircraft carriers of all classes increased from 7 to nearly 100, the majority of them escort carriers.
Liberty ships were produced at the rate of two every three days, ensuring the Allies were consistently supplied.
This extraordinary output underscored the U.S.’s relentless material power, impressing both allies and adversaries alike. As Stalin noted, American industrial support was critical to Soviet survival. Material abundance was no longer just a cultural preference—it became a strategic and existential advantage.
American materialism, in combination with extreme individualism, creates what can be called nihilistic materialism. Unlike societies where wealth supports community or tradition, in the U.S., possessions often stand in for identity, meaning, and social structure. Material success is both a measure and a goal in itself, producing a culture where the pursuit of stuff defines value, status, and purpose.
This framework situates American materialism not as mere greed, but as a historically conditioned, culturally reinforced, and uniquely potent ideology—fueled by abundant resources, rapid growth, and a societal structure that prizes individual achievement above communal bonds.
After World War II, the United States emerged from isolationism as an unprecedented military and industrial superpower, while much of the world lay in ruins. With fascism discredited and communism increasingly unattractive, it became clear that material prosperity—the ability to produce and wield economic power—was the decisive measure of national strength and social success. The lesson of the war was unmistakable: those who could create and deliver material goods prevailed.
The 1950s and 1960s were marked by rapid economic growth and consumer abundance. Millions of homes were electrified, labor-saving appliances became widespread, and the mass production of cars, refrigerators, and other goods transformed daily life. This period reinforced the American belief in material prosperity as morally and culturally “right”, further accelerating a preexisting cultural investment in land, wealth, and individual acquisition.
While Americans had long valued material wealth, the postwar era introduced a subtle but powerful shift: from materialism to consumerism.
Consumerism is not merely owning goods; it is the pursuit of goods for the sake of acquisition itself.
Industries, particularly the automobile sector, exploited this psychology by producing annual “new models,” creating continuous demand for items that were not substantially improved.
This drove a society-wide emphasis on buying as an activity, reinforcing economic growth while making material possession an end in itself.
Storage practices, high consumption rates, and the ubiquity of retail chains like Walmart illustrate how acquisition became more important than use. Material abundance alone was insufficient; the act of consuming became a central cultural and economic imperative.
Consumerism naturally evolved into conspicuous consumption: the desire to acquire goods recognized and admired by others. In a society lacking traditional structures for identity and belonging, material possessions became a primary vehicle for self-expression.
Clothing, cars, and branded lifestyle products (e.g., Harley-Davidson) signal identity to others, not to the owner.
Logos, labels, and branded merchandise communicate social affiliation, status, and taste.
Popular culture amplifies this effect, as with movie-related merchandise: fans buy T-shirts and hats to visibly associate themselves with a shared cultural experience.
This phenomenon reflects the broader American pattern: identity, social signaling, and status are increasingly expressed through material ownership.
In contrast, many other societies do not embed consumerism so deeply. For example, some European theaters sell nothing beyond tickets, emphasizing the experience over material acquisition. Such practices underscore that America’s relentless focus on consumption is historically contingent and culturally specific, not a universal human trait.
So, why nihilism? Because when conspicuous consumption, consumerism, and materialism converge, they create an impossible promise: the idea that acquiring the right objects can confer meaning and self-worth. Material goods, in themselves, cannot fulfill this role. While making, selling, or using objects—whether instruments, sailboats, or furniture—is inherently valuable, the obsession with owning “the right” items transforms necessity into illusion.
People often equate material acquisition with life satisfaction: a bigger TV, a newer car, a more expensive guitar. The irony is that this focus on possession over experience undermines the very purpose of the objects:
Music stores emphasize selling guitars rather than providing space for lessons.
Sailboats and cars are purchased for status rather than use.
Motorcycles may sit unused, preserved to maintain resale value or social image rather than for enjoyment.
In essence, material goods are asked to fill a hole they cannot. True mastery, joy, or fulfillment arises from practice, experience, and engagement—not from accumulation. A piano does not teach you music; a sailboat does not teach you to sail. Ownership without participation is nihilistic: it substitutes appearance for substance.
Even renouncing consumption can become a form of conspicuous signaling. Wealthy figures like Warren Buffett craft an identity around frugality, but this is itself performative:
“I am wealthy, yet disciplined in my consumption.”
“My choices reflect my values and taste.”
The act of abstention becomes a form of consumption, reinforcing identity publicly rather than fulfilling personal need. Identity remains bound to material signaling, even in denial of materialism.
Humans require connection, purpose, and belonging. Attempting to satisfy these needs through objects alone is futile:
Communicating identity via possessions cannot overcome loneliness.
Shared interests do not automatically create meaningful relationships.
Buying goods cannot replace skill, experience, or engagement.
This mismatch between human need and material fulfillment leads to nihilism: a cycle of consumption without meaning. Evidence abounds in our society: storage units filled with unused possessions, multiple coffee makers, unopened items left behind by previous generations. The value of material goods has diminished; accumulation persists largely out of habit or compulsion.
Nihilism manifests subtly in cultural trends. For instance, 30% of Americans under 30 lack driver’s licenses, signaling a decline in the traditional car-as-freedom ideal. Consumption as an identity marker is losing potency, undermined by cultural fragmentation. When material acquisition no longer effectively communicates status or selfhood, long-standing consumerist structures face instability.
Brands that retain value, like Harley-Davidson or Michael Jordan’s shoes, do so because they encode a historical cultural moment still recognizable across generations. But as fragmentation deepens, the ability of objects to convey meaning diminishes for the majority, leaving society grappling with the limitations of materialism as a foundation for identity.
When material wealth and consumerism are mistaken for meaning, society encounters a profound nihilism: the illusion that things can confer self-worth, social belonging, or purpose. Fulfillment emerges not from acquisition, but from engagement, practice, and authentic experience. Recognizing this mismatch is critical to understanding the limits of consumer culture and the fragility of identity built on materialism alone.
V.
When exploring broad questions like Democracy in America or the cultural history of the United States, one pervasive influence emerges: Calvinism. Most Americans are shaped by Calvinist principles, often unknowingly. Even without attending church, many carry values rooted in Puritanical thought—discipline, moral vigilance, and a sense of communal responsibility. Calvinism’s influence is cultural, not merely religious.
Understanding Calvinism requires some theological context. Medieval Christian thought, exemplified by Thomas Aquinas, emphasized that the world is fallen and humans are inherently sinful. Salvation required divine grace—a concept central to both Catholicism and Protestantism, but applied differently.
The Protestant Reformation, led by Martin Luther, dramatically restructured European society. By challenging Catholic authority, Luther liberated individuals from the Church’s control over daily life, marriage, education, and legal matters. Yet this liberation introduced a profound spiritual uncertainty: individuals now had a direct relationship with God, without intermediaries, and no absolute certainty of salvation.
John Calvin expanded on Reformation ideas in Geneva, creating a tightly governed, religiously disciplined society. Calvinist thought rests on several pillars:
Total Depravity: Humans are inherently sinful; salvation is only possible through God’s grace.
Predestination: God has already chosen the “elect” who will be saved; most are not.
Personal Relationship with God: Each individual is accountable directly to God, without mediating clergy.
Calvin’s Geneva exemplified these principles: a highly structured, surveillance-driven society in which citizens monitored each other’s moral conduct. Obedience and communal virtue were enforced, not merely suggested.
Calvinist ideas arrived in the New World via the Puritans, who sought to establish “new Genevas.” They aimed to build communities of the elect, morally upright and visibly devoted to God. This gave rise to the idea of American exceptionalism: a “city on a hill” intended as a model of righteous living.
The Salem Witch Trials illustrate the extreme stakes of this worldview: deviance or perceived sin threatened the moral and spiritual integrity of the entire community. Punishing witches was not about individual justice but preserving the collective elect.
Calvinist principles shaped American political rhetoric and expansion:
Leaders like John Winthrop, JFK, and Ronald Reagan invoked the “city on a hill,” framing the United States as morally exemplary.
Westward expansion often mirrored Calvinist impulses: establishing communities where religious and moral ideals could be enacted, from Puritans to Mormons.
The emphasis on individual responsibility, moral discipline, and communal observation persists subtly in American culture, shaping notions of work ethic, civic duty, and national identity.
Even in secular contexts, the Cultural Calvinism of America—belief in moral responsibility, hard work, and social order—remains influential, guiding how Americans perceive society, governance, and their own role in the world.
Calvinism initially relied on surveillance communities with fixed borders and strict oversight to enforce moral conduct. Success required stability: outsiders and constant movement undermined communal discipline.
In the United States, this model eroded rapidly due to two factors:
Mobility: People could leave communities at will, undermining social enforcement and forcing leaders to appeal to popularity rather than strict control.
Strangers: Communities increasingly included outsiders, making surveillance ineffective and weakening traditional mechanisms of moral accountability.
Without the constant oversight of a Geneva-style society, Americans adapted. The emphasis on individual responsibility to God persisted, but the metric of moral fitness shifted: prosperity became proof of righteousness. Wealth was no longer mere economic success; it signaled that one was a member of the elect, visibly upright and morally trustworthy.
Unlike Europe, where wealth and virtue were often seen as separate, in America, material success became a marker of divine favor, reinforcing both individualism and social visibility. Community reinforcement now came through appearances—houses, possessions, and public displays—rather than surveillance or direct feedback from authorities. Social media extends this dynamic today, ensuring that moral and social validation is still performed publicly.
The American Calvinist ethos shaped politics and social norms across the spectrum:
Left-wing expressions: Emphasize inherited guilt and endless contrition, reflecting the belief in inherent human depravity. Social justice movements often echo this, targeting systemic inequities as manifestations of moral failure.
Right-wing expressions: Focus on moral threats to the community, often casting behaviors like sexual nonconformity as dangers to societal order—modern echoes of Puritan “witch hunts.”
Both sides share a preoccupation with purity, worthiness, and moral surveillance, an enduring legacy of Calvinist thought.
Calvinist principles shape ideas of deservedness in the distribution of resources:
The right tends to prioritize the worthy or elect, rewarding success and ability.
The left focuses on the deserving poor, those who, by circumstance, need support to achieve moral or social fulfillment.
This reflects the deep Calvinist concern with aligning material and spiritual fitness—ensuring that only the morally upright or chosen benefit from communal resources.
Calvinist influence also shaped cultural expression:
Catholic literary traditions—exemplified by Boccaccio and Chaucer—allow humor, moral ambiguity, and human folly because divine mediation absolves sin.
Puritanical culture suppresses such levity, emphasizing moral education, social conformity, and theological purpose in all creative works. Entertainment without moral or religious instruction was considered dangerous or corrupting.
In short, Calvinism in America evolved from rigid communal oversight into a system where individualism, prosperity, and public reputation substitute for direct moral surveillance, creating enduring social, political, and cultural patterns.
Many peculiar features of contemporary American culture—especially its radical individualism—trace back to Calvinist theology. Unlike classical Greek culture, which celebrated physical, moral, and intellectual vitality, Americans often struggle with widespread psychological and emotional unwellness. Where Greeks saw self-care and excellence as evidence of worth, Calvinism instills doubt and self-reproach: because the world is fallen and humans irredeemably sinful, self-trust is impossible. Pleasure and joy are often treated with suspicion; pain and struggle become proof of moral effort, a direct echo of Puritan teachings.
Without the communal surveillance of Geneva-style societies, Americans sought external validation to measure virtue. Prosperity, success, and visible achievement became markers of being among the elect. This creates a society where people willingly endure hardship, push themselves relentlessly, and cultivate status and wealth—less for personal satisfaction and more to demonstrate worthiness to the community.
This mindset transforms ordinary choices into existential stakes. Education, work, and self-discipline are not simply personal goals—they are civic and spiritual imperatives. A child who resists learning, or an adult who loses a job, is not merely falling short personally; in Calvinist terms, they threaten the moral fabric of the collective “city on a hill.”
American culture thus fuses individualism, material success, and moral vigilance. It amplifies the drive to achieve, to be seen achieving, and to endure hardship as evidence of virtue. Even absent explicit religious belief, these patterns shape behavior, creating a society that prizes struggle, performance, and visible proof of worth above intrinsic well-being.
VI.
The Great Transformation refers to the period roughly between 1933 and 1950, a relatively brief era in which the United States underwent a radical reconfiguration of its cultural, political, and economic patterns. Events of this period amplified the preexisting currents of Calvinism, individualism, and materialism, while also positioning the U.S. as a global power.
Before Franklin D. Roosevelt, the United States was largely continentally focused and isolationist, consolidating territory across North America rather than projecting power abroad. The federal government was financially weak, funded primarily through tariffs—for much of its history over 90% of federal receipts—with total revenue amounting to only 2–3% of GDP. Most Americans had little direct interaction with federal authority, and until the Seventeenth Amendment in 1913, Senators were chosen by state legislatures, reinforcing a decentralized, state-centered system.
Throughout U.S. history, federal authority expanded during emergencies and contracted afterward:
Revolutionary War: Congress borrowed extensively, but postwar, power returned to the states.
Civil War: The federal government temporarily centralized resources and troops, then demobilized afterward.
World War I: The U.S. entered late with limited military capacity; after the war, spending and military size returned to prewar levels.
Even leaders like Theodore Roosevelt sought limited engagement abroad, favoring pragmatic power dynamics over moral crusades. Woodrow Wilson, despite his idealistic vision of U.S. destiny, ultimately adhered to a pattern of temporary wartime centralization followed by postwar retrenchment.
The Great Depression (1929–1933) exposed the limits of federal revenue and capacity. With tariffs collapsing as global trade dried up, the government was forced to dramatically expand taxation and spending under FDR’s New Deal. Key developments included:
Expansion of the income tax: the share of the population paying it rose from about 5% to roughly 90% by the end of World War II.
Establishment of Social Security, federal relief programs, unemployment insurance, and regulatory frameworks such as the FDIC.
World War II further entrenched federal power, centralizing the military, economic production, and social programs. By war’s end, the U.S. emerged not only as a military victor but as a global superpower, a transformation cemented by deliberate postwar policy choices.
Calvinist moral frameworks amplified the perception of global conflicts in binary terms of good versus evil:
Nazi Germany’s atrocities and Japan’s attack on Pearl Harbor reinforced the narrative of moral clarity.
This framing legitimized permanent U.S. global engagement, contrasting sharply with previous cycles of withdrawal after crises.
Even when postwar realities—negotiated settlements, ambiguous alliances with the Soviet Union, and compromises over Eastern Europe—were complex, public discourse largely simplified the narrative into a moral victory, reinforcing American exceptionalism and the notion of a divinely sanctioned global role.
The Great Transformation represents the moment when temporary federal expansion, economic reform, and wartime mobilization became permanent structural features of the U.S. state. It institutionalized a powerful, centralized government capable of global influence, while embedding a cultural ethos of individual responsibility, moral vigilance, and prosperity as virtue. These developments set the stage for the United States’ postwar domestic and international identity.
The decisive factor in World War II was material and industrial capacity. The United States simply outproduced all other nations, leveraging intact infrastructure, massive federal investment, and European demand for goods. German tanks and Japanese ships were no match for this output: even if Axis forces destroyed dozens of U.S. tanks, the United States could build hundreds more. The lesson was clear: industrial productivity wins wars. This reinforced the American commitment to materialism, demonstrating that resources, manufacturing, and logistics—more than battlefield experience or military tradition—were decisive.
U.S. industrial might extended beyond weapons. Troops were well-fed, clothed, and supplied with vehicles and equipment—a stark contrast to German and Soviet forces, which often struggled with logistics and relied heavily on antiquated transport like horse-drawn carts. Victory thus validated a worldview in which material wealth, productivity, and organization were morally and strategically superior.
World War II also reinforced a moral framework rooted in Calvinist dualities: the United States as the force of good, the Axis powers as evil. Material success aligned with this moral clarity. Even the alliance with the Soviet Union was temporarily glossed over; postwar, the narrative shifted to highlight freedom and individual liberty as uniquely American values. This dual emphasis—on material capability and moral righteousness—became a defining cultural paradigm.
The war generated profound domestic changes:
Global exposure: Roughly 11 million Americans served overseas, broadening awareness of international affairs and fostering global interest.
The Great Migration: millions of African-Americans relocated from the South to Northern and Western industrial centers, reshaping the nation’s demographics.
The GI Bill: Nearly 8 million veterans, including about 1 million African-Americans, accessed higher education, fueling the rise of an educated middle class and laying foundations for the Civil Rights Movement.
These structural transformations reinforced the narrative: industrial capacity, individualism, and moral purpose define national strength.
The collapse of the Soviet Union left the United States as the sole superpower, inheriting the mantle of “good.” Yet the binary lens of good versus evil, which had simplified global understanding during World War II, no longer aligned with reality. Complex geopolitical challenges—from terrorism to China’s rise—cannot be reduced to moral absolutes. Attempts to cast modern conflicts in this framework often obscure underlying strategic, economic, and political realities, creating confusion both at home and abroad.
This crisis of interpretation is reflected in recurring debates over U.S. foreign policy—from Vietnam to Iraq—where simplistic moral narratives clash with intricate global dynamics. The cultural reliance on moral clarity, materialism, and individualism, forged during the Great Transformation, now struggles to account for a multipolar, interdependent world.
The material and cultural gains of 1933–1950 reshaped American identity. Prosperity, individual liberty, and moral certainty became foundational, but their unexamined continuation has left contemporary society baffled when these narratives fail to explain complex realities. The postwar superpower status, once a source of confidence, now confronts an era where clarity is elusive, and historical assumptions about morality, productivity, and national purpose are questioned.
VII.
Libertarianism in America can be seen as the extreme extension of core cultural traits—individualism, materialism, and the belief in personal liberty. Its rise helps illuminate broader patterns in U.S. history and cultural identity, even if the philosophy itself is often underdeveloped in popular discourse.
At its core, libertarianism emphasizes:
Individuality and Rights: The individual is paramount, and protecting individual rights is essential.
Spontaneous Order: Society functions best when people freely interact, usually guided by market mechanisms, without government interference.
Rule of Law and Property Rights: Legal structures, especially property rights, are vital to securing individual freedom.
Limited Government: Government is a trade-off: more government means less individual liberty.
Free Markets and Economic Autonomy: Individuals should freely create, exchange, and pursue self-interest, viewed as beneficial to society.
Productivity as Virtue: Success demonstrates proper use of personal liberty, serving both the individual and society.
Libertarianism closely aligns with Calvinist notions of individual responsibility and material achievement, taken to their logical extreme.
Friedrich Hayek (1899–1992) is central to libertarian thought. A product of early 20th-century Europe, Hayek witnessed the collapse of old regimes, the rise of totalitarianism, and the threat of centralized planning. He became a classical liberal, asking:
How can society maximize individual freedom?
In The Road to Serfdom, Hayek argued that centralized economic planning inevitably leads to tyranny, restricting individual liberty. He also acknowledged, however, that governments may legitimately provide for basic social needs—food, housing, healthcare—conceding that some state intervention is compatible with a free society.
Hayek’s ideas influenced the Chicago School of Economics, Milton Friedman, and the postwar American embrace of free markets and limited government, though many questions he raised about balancing freedom with social responsibility remain unresolved.
Modern libertarianism places the sovereign individual at its core: each person owns themselves, akin to property. While this concept extends personal liberty to its extreme, it raises philosophical and practical issues:
Inherited Obligations: Individuals emerge from families, communities, and cultures they did not choose. Libertarian self-ownership tends to ignore these social and historical debts.
Public Goods and Free Riders: Roads, schools, and infrastructure exist as shared resources. If everyone acted purely as sovereign property owners, society would face a free rider problem, exploiting communal benefits without contributing.
Coercion and Corporations: Libertarian rhetoric often criticizes government coercion but ignores the disproportionate power of corporations. True freedom requires protection against exploitation by organized entities, not merely the absence of government.
The Paradox of Property Rights: Absolute protection of property requires a robust legal and enforcement system—a strong, centralized government. Modern libertarianism paradoxically demands extensive government structures for a philosophy that opposes government.
Libertarianism resonates deeply in American culture because it embodies longstanding cultural assumptions about individuality, property, and self-interest. Its appeal is not purely philosophical—it reflects deeply held societal biases and priorities.
Popular discussions illustrate this clearly. For example, articles calculating the economic value of maternal labor reduce parenting to a property exchange, quantifying care, feeding, and household work as if the parent were a contractor selling labor. While monetization can measure economic contribution, it strips the human relationship of love, duty, and joy, reducing it to transactional terms. This framing mirrors libertarian ideals: isolated, self-owning individuals optimizing value for themselves rather than engaging in communal or relational obligations.
Libertarian thought assumes that individuals achieve maximal freedom alone, detached from social networks. This is visible in modern cultural movements:
Tiny House Movement: Individuals celebrate ownership of a small, self-contained space, emphasizing independence and debt-free living. While appealing, this isolates people from communal living and shared experiences, undermining the social and emotional benefits of collective life.
Minimalism and “Few Things” Movements: Simplifying life by reducing possessions focuses almost exclusively on personal utility and joy, rarely considering communal benefit. Objects are valued for how they serve the self, not society. Practices like Marie Kondo’s “does it spark joy?” test illustrate this inward orientation.
Urban design reflects similar tendencies. Western U.S. planning often prioritizes individual pods—tiny personal spaces or autonomous self-driving cars—over collective infrastructure like public transit. Even projects like single-occupancy dorms with minimal natural light reveal the extent to which libertarian ideals prioritize solitary autonomy over communal benefit.
Libertarianism thus functions less as a philosophy and more as a mirror of cultural biases: a clear articulation of the American preference for individualism and self-sovereignty, often at the expense of shared social goods.
Libertarianism equates freedom primarily with property ownership and individual autonomy, neglecting other essential forms of liberty:
Freedom from Fear: Ownership and self-defense (e.g., gun culture) may preserve autonomy but do not eliminate anxiety or insecurity.
Capacity and Education: True freedom requires access to knowledge and learning. Without communal institutions like schools, individuals cannot fully realize their potential.
Freedom of Movement: Mobility is essential to exercise choice and opportunity, yet it depends on shared infrastructure such as roads. Libertarian thought sometimes acknowledges this only in the narrowest terms, like support for free immigration.
Most liberties cannot be reduced to property or personal ownership. By focusing almost exclusively on the self as sovereign property, libertarianism ignores communal dimensions of freedom, such as collective security, shared resources, and cooperative achievement.
American libertarianism is compelling not for its solutions but for its clarity in exposing the assumptions underlying U.S. cultural life:
A focus on individual autonomy and property reflects a deep Calvinist-materialist heritage.
It emphasizes self-interest and isolation while marginalizing communal obligations.
It narrows the concept of freedom to material and transactional terms, often overlooking emotional, social, and collective dimensions of liberty.
Libertarian thought thus illuminates a paradox: it articulates a powerful vision of freedom while simultaneously revealing the cultural blind spots that constrain American society. Understanding libertarianism is less about adopting it as a philosophy than about understanding why American culture consistently values autonomy, property, and individualism above communal engagement.
VIII.
Neoliberalism has been arguably the most influential philosophical and economic movement of the late 20th century. While communism shaped the early 20th century, neoliberalism’s reach over the last 40–50 years is systematic, pervasive, and transformative, shaping not just economies but social and political structures worldwide.
At its core, neoliberalism emerges from a fusion of materialism, individualism, and Calvinist ethos, distilled into a global political and economic philosophy. Understanding this movement requires recognizing how these cultural roots—especially from the United States—have been projected onto the global stage.
Neoliberalism coincides with unprecedented reductions in global poverty. In 1970, over half the world’s population—approximately 1.8 billion people—lived on less than $1 per day. By 2019, fewer than 700 million lived in extreme poverty, despite the global population more than doubling to 8 billion. While multiple factors contributed—including the Green Revolution and economic liberalization in China and India—neoliberal policies played a central role in lifting hundreds of millions from extreme poverty.
Despite these gains, neoliberalism also reflects a particular cultural ideology, producing both prosperity and inequality:
Thatcher (UK), Reagan (US), and Deng Xiaoping (China) were pivotal in advancing neoliberal reforms.
These reforms transformed societies from predominantly rural poverty to urban and industrial growth, dramatically altering social structures.
However, neoliberal imposition often ignored local histories and social realities, leading to social dislocation, especially in countries adopting austerity policies to qualify for global financial integration.
Even well-intentioned leaders, like François Mitterrand, were constrained by global neoliberal pressures, illustrating the limits of national autonomy under this system. The European Union itself emerged partly to create scale and leverage within a neoliberal global economy.
Neoliberalism rests on a set of interconnected ideas, now so normalized that they appear self-evident:
Free Markets: Lower trade barriers to maximize efficiency and growth.
Privatization: Transfer state-run sectors to private control, assuming greater efficiency.
Deregulation: Reduce government intervention to facilitate commerce.
Globalization: Encourage cross-border trade, investment, and capital flows.
Individual Responsibility: Emphasize personal autonomy and decision-making in economic and social life.
Fiscal Austerity: Minimize government spending and debt to allow private sector growth.
The fall of the Soviet Union and U.S. global supremacy enabled neoliberalism to dominate international economic policy. Institutions like the IMF and World Bank imposed austerity and market reforms on developing nations, often disregarding local conditions. EU expansion similarly enforced neoliberal standards, from debt ratios to open markets, creating social strains in member states.
The dominance of neoliberalism faced profound challenges in the 21st century:
2008 Financial Crisis: Massive government bailouts of banks contrasted sharply with austerity for ordinary citizens, revealing the contradictions of “free markets.”
COVID-19 Pandemic: Global economies required unprecedented intervention, exposing the limits of minimal government ideology.
Rising Populism: Political responses, from Trump’s tariffs to Brexit, reflect resistance to perceived overreach by globalized neoliberal structures.
Neoliberalism concentrates power in a small, technocratic elite. Central bank decisions in the U.S. and EU—raising interest rates to curb inflation—affect global borrowing costs, infrastructure investment, and food prices, especially in developing countries. Yet the people making these decisions are largely unelected, demonstrating how neoliberal structures extend influence far beyond the democratic accountability of most affected populations.
As globalization and austerity face growing criticism, alternatives remain underdeveloped. Movements such as Modern Monetary Theory hint at new economic thinking, but no dominant global replacement has emerged. Meanwhile, trade barriers and nationalistic policies challenge the coherence of the neoliberal model. The world is witnessing the limits of a system once assumed inevitable, prompting debate about the next phase of global economic organization.
The driving—but often unspoken—goal of neoliberalism is to create a system that, in aggregate, is “best for everyone in the world.” In theory, free trade, globalization, and market liberalization should reduce poverty and maximize prosperity. While these policies have dramatically reduced extreme poverty, the objectives are rarely explicitly defined or debated, and the human costs are often obscured.
For instance, the free movement of people—a supposed complement to free trade—has caused political and cultural friction across the EU, fueling populist movements like Brexit. Similarly, trade agreements such as NAFTA boosted overall economic activity but displaced workers, creating sharp trade-offs for individuals who bore the costs without receiving commensurate benefits. These trade-offs are seldom acknowledged; neoliberal policies are often treated as inevitable, leaving little room to discuss who wins, who loses, and what “the good” truly means.
This global imposition reflects a Calvinist ethos embedded in U.S. culture: individual liberty, free markets, minimal government interference, and the belief that suffering signals moral or economic rectitude. Wealth and power are taken as proof that this model works, leading to its zealous export worldwide.
After the Soviet Union’s collapse, neoliberal specialists were sent abroad with a prescriptive approach: “Follow the book; it will be painful, but correct.” Local conditions, histories, and cultures were largely ignored. Suffering was framed as necessary for eventual prosperity—a hallmark of the Calvinist logic underlying U.S.-style neoliberalism.
Non-compliance with neoliberal orthodoxy is costly—not just economically, but politically and socially. Governments that deviate face punitive measures from international bond markets and other global financial actors. While the U.S. can borrow with impunity, smaller states are constrained by the need to conform or pay higher interest rates.
Even well-intentioned interventions, such as local housing regulations, clash with globalized capital flows. When property is treated as a global asset, local residents compete against international investors, driving prices beyond reach. Efforts to regulate foreign investment or stabilize markets often provoke resistance from entrenched financial interests, highlighting the tension between local welfare and global market logic.
Neoliberalism is not merely an economic model—it is a U.S.-rooted cultural worldview. Its core principles—individual liberty, material accumulation, and a Calvinist belief in the moral value of suffering—permeate global governance, financial markets, and policymaking. This worldview is alien to many societies, creating friction, confusion, and resistance.
Developing countries, EU states, and even superpowers like China face pressures from this pervasive system. Attempts to deviate—through alternative fiscal policies, protectionism, or welfare expansion—are frequently constrained or punished by globalized financial and market mechanisms.
As neoliberalism faces crises—the 2008 financial meltdown, the COVID-19 pandemic, and rising populism—its claims of inevitability are increasingly questioned. The tension between global markets and local social needs is growing, and the ideology’s cultural origins are more apparent than ever.
While a clear alternative has yet to emerge, the pressure for change is mounting. Intellectual and political ferment suggests the possibility of a new “reformation”: a rethinking of global economic organization and the assumptions underpinning neoliberal hegemony.
Neoliberalism, rooted in a particular cultural history, continues to shape the lives of nearly everyone on the planet, producing both profound gains and deep inequalities—often in ways that remain unexamined and poorly understood. Recognizing its origins, mechanisms, and hidden costs is essential to engaging with the global economic order critically and constructively.
IX.
American society is built on Calvinist roots, radical individualism, materialism, and a worldview of abundance, creating a culture that blends libertarian ideals with a nihilistic materialist outlook. Yet this ethos produces a profound tension for those living within it.
From childhood onward, the pervasive message is: you are on your own. Dependence is framed as weakness, while self-reliance is glorified. This narrative transcends politics; it is simply the cultural story Americans inherit. The traditional extended family has been pared down to an atomized nuclear model, leaving individuals socially isolated while being told both that independence is virtuous and that loneliness is inevitable.
Libertarianism and other movements—whether “back-to-nature” self-sufficiency or the pursuit of wealth—are attempts to reconcile this tension. They promise autonomy, yet fail to acknowledge the biological and social reality: humans are communal animals, entirely dependent on complex social, economic, and technological systems for survival and fulfillment. The result is persistent intellectual dissonance, as people strive for independence while living in one of the most interconnected societies in history.
Even for Americans who prize family, the societal model encourages detachment across generations. Children are expected to leave home, pursue their “own lives,” and interact with family mainly in occasional reunions. This redefinition of family fosters isolation: functional family ties—daily, intimate, communal connections—are largely absent. Decisions to remain near family are framed as sacrifices, reinforcing the narrative that true adulthood requires independence and detachment.
Paradoxically, American individualism generates insecurity and a sense of scarcity despite unprecedented material abundance. People are socially and economically enmeshed, yet are conditioned to believe they must compete fiercely for limited resources—prestigious schools, high-paying jobs, and social status—even when such scarcity is largely constructed.
This competitive framing extends to education: the U.S. has far more colleges and universities per capita than any other nation, yet students internalize a narrative of extreme competition, perceiving access to these abundant resources as extraordinarily scarce. Similarly, jobs are plentiful, yet the cultural message emphasizes struggle and self-reliance as if employment were a rare commodity.
The insistence on independence forces Americans to seek abstract, institutional validation for their worth. Isolated from familial and communal feedback, individuals rely on impersonal social, cultural, and economic structures to confirm that they are “doing well.” This reliance paradoxically undermines the very self-sufficiency that the culture celebrates.
In contrast, communal societies provide immediate, interpersonal feedback: daily interactions with family and neighbors offer tangible confirmation of moral and social contributions. In the American system, however, individuals are left to navigate loneliness, competition, and abstraction, perpetuating insecurity, overwork, and material accumulation as proxies for belonging and achievement.
American individualism celebrates autonomy, yet lives in deep contradiction with human social nature. The cultural ethos glorifies independence while people remain inextricably linked to communal networks and societal structures. Attempts to reconcile this tension—through wealth, libertarian ideals, or self-sufficiency—cannot succeed because they ignore the fundamental reality of human dependence and the need for immediate, personal feedback.
The result is a society where loneliness, insecurity, and competition dominate, even amid unparalleled material abundance, shaping the daily experience and psychology of virtually every American.
In American society, both the right and the left engage in performative signaling, revealing shared assumptions about morality and identity.
On the conservative side, people display symbols—flag stickers, guns, trucks, even specific cultural tastes—to signal, “I am a good person.” Popular country-western music repeatedly reinforces this message: material and cultural markers become proof of virtue, signaling alignment with the community’s moral expectations.
On the progressive side, similar dynamics appear in choices like hybrid and electric vehicles. Priuses, concentrated in certain neighborhoods, signal environmental virtue. The act of fulfilling a practical need—getting from A to B—becomes inseparable from signaling one’s goodness to strangers.
This obsession with signaling is rooted in Calvinist ideas of purity. Americans are perpetually judged against moral and social standards: either you conform, joining the community of the “elect,” or you fail and risk exclusion.
Consider professional conferences: in forestry, one gathering celebrates love of the environment, the other industrial forestry. Both are less about practical expertise and more about demonstrating moral alignment. Meeting these symbolic standards is central; failing them isolates individuals from the abstract, impersonal institutions that serve as arbiters of social approval.
The result is pervasive unease and insecurity. Wealth, status, or competence cannot insulate you from the constant pressure to prove yourself; the cultural narrative perpetually warns: something bad is coming, and you may not be prepared.
To navigate this isolation, Americans turn to materialism as a surrogate for belonging. Private golf clubs, luxury communities, or elite memberships attempt to create a sense of inclusion based on financial capacity rather than genuine social connection. Money substitutes for community, yet fails to satisfy the deeper need for recognition and belonging.
In contrast, cultures like Confucian China define morality through communal and familial obligations. Success is measured by fulfilling family duties rather than self-serving achievement. Social and moral expectations are clear and stable, reducing the constant anxiety present in American life.
American thought privileges quantitative abstraction over qualitative experience. Isaac Newton’s revolution, elevating measurement and objective law above subjective experience, laid the groundwork for this worldview. Success and value are judged against abstract principles, not lived experience.
In practice, this means everyday pleasures—friendship, leisure, or well-being—are subordinated to future-oriented, abstract goals: school, career, wealth accumulation. Individual experience is dismissed in favor of impersonal systems that claim to know what is right.
This mirrors the Calvinist moral framework: the standard is unknowable, judgment is remote, and failure is inevitable. Americans are conditioned to fear dependence, equating reliance on others with moral failure.
The result is a cultural trap: individuals are isolated, resources are framed as scarce, and validation must be sought from abstract, impersonal systems. Nihilistic materialism becomes a coping mechanism—acknowledging the futility yet relentlessly pursuing symbolic markers of worth.
The tension between independence and human social reality defines the American experience. One is perpetually caught in a cycle of competition, signaling, and material pursuit, knowing that no external measure can truly satisfy the deeper need for belonging. This aggressive, self-consuming tension shapes both the American worldview and the psychological experience of being an American individual.
X.
Having explored how American cultural and economic power shapes individual outlooks at home, we now turn to its influence abroad.
Plato’s final myth in The Republic illustrates a profound principle about choice and wisdom. Souls draw lots determining the order in which they choose the lives they will lead: heroes, tyrants, or ordinary people. Even the last chooser, Odysseus, finds contentment in the humble life of a quiet farmer.
The lesson is clear: wisdom allows one to select a life of quality despite circumstances. While multiple paths may lead to fulfillment, discernment determines which choices are beneficial and which are harmful.
In contrast, cultural narratives constrain freedom. In the United States, the ideal is extreme individualism: true freedom is the pursuit of the self, while communal ties are framed as a limitation. Even freely choosing to engage in family or community is culturally coded as a mistake.
The United States exercises unprecedented global economic influence. Its currency denominates the vast majority of global trade, and US Treasury securities serve as the foundation of the world financial system. The 2008 financial crisis revealed how dependent global markets are on US financial systems, creating a moment of universal uncertainty.
Countries integrating into this system, such as Hungary entering the EU in 2004, were required to privatize industries, liberalize markets, and adopt neoliberal policies. The promise of prosperity often collided with harsh reality: economic crises compounded debts, devalued currencies, and imposed social costs, fueling resistance movements and populist backlash.
Across the globe, the United States’ economic model sets the framework for development, often leaving nations bound to a system of continual adjustment and external validation.
US cultural influence rivals its economic power. Over the last 60–70 years, American film, music, and television have reached unparalleled global penetration:
Film: Nearly all top-grossing global films have American origins, spreading narratives and ideals to the world.
Music: Jazz, rock, and hip-hop celebrate individuality, rebellion, and personal achievement, contrasting with communal or traditional artistic values elsewhere.
Television: Popular shows project the American way of life, emphasizing career ambition, wealth pursuit, and urban individualism, often portraying alternative lifestyles or rural simplicity as inferior.
Through these media, American ideals of self-reliance, personal success, and material achievement are exported, influencing behaviors and aspirations worldwide.
The United States presents a narrow, prescriptive notion of what constitutes a “good life”: freedom is individual, success is material, and dependence is failure. Globally, this framework reshapes economic systems, cultural aspirations, and social norms, often creating tension when local traditions or communal structures collide with these imported ideals.
Across the globe, American cultural ideals and neoliberal economic policies operate in tandem, creating a mutually reinforcing system. Both tell the same story: individual ambition and market-driven competition define success.
For example, neoliberalism frames small and mid-sized farming as uncompetitive. Trade liberalization pits local farmers against global conglomerates with immense capital, subsidies, and efficiencies. The result: local agriculture is weakened or destroyed, making rural life economically precarious. Simultaneously, global media—television, film, and music—promotes the narrative that true success comes from leaving home, moving to a city, and achieving wealth and status as an individual.
This produces a stark choice for many:
Remain in a communal, rural life, sacrificing wealth and prestige.
Move to a metropolitan hub to pursue the global narrative of success.
Cities appear to offer opportunity because global economic flows favor those embedded in networks of capital, corporations, and financial markets. Local or regional economies, by contrast, are increasingly marginalized. The cultural message is clear: individual mobility and ambition are paramount, while local ties and communal engagement are undervalued.
Plato anticipated this tension: culture and circumstance shape our choices, but wisdom allows us to discern which lives will truly provide fulfillment. Yet American popular culture consistently mocks the choice of a quiet, rural, or communal life, favoring the pursuit of wealth and status.
Consider Louisiana and the Netherlands. Despite Louisiana’s greater oil and natural gas wealth per capita, the state suffers high poverty, low educational quality, high crime, and falling life expectancy, whereas the Netherlands excels on all these metrics.
The difference is cultural and political choices:
In the Netherlands, resource wealth is reinvested communally, supporting education, infrastructure, and social welfare.
In Louisiana, wealth is privatized, benefiting companies and individuals while public investment remains minimal. Environmental degradation compounds social challenges.
The lesson: wealth alone does not determine societal outcomes—how it is allocated and the cultural priorities surrounding it matter deeply.
Global economic and cultural systems reinforce each other:
Neoliberal economics encourages mobility, deregulation, and market-driven success.
American cultural narratives valorize individual achievement and wealth, presenting iconic figures like Steve Jobs or Jeff Bezos as aspirational models.
Together, they limit imagination and alternative ways of organizing society, emphasizing individualism, materialism, and competition over communal or regional approaches.
Several interwoven values underpin this global system:
Erasure of historical and regional identity—local traditions are marginalized in favor of global integration.
Individualism—success is defined through self-reliance and personal ambition.
Materialism—wealth and possessions are central to social validation.
Calvinist morality—a focus on worthiness, punishment for failure, and moralizing economic outcomes, evident in the obligations imposed on indebted or developing nations.
These forces are highly unified, though they often appear disparate. They shape both the global economy and the cultural imagination, producing widespread unease, dislocation, and competition.
To counter these pressures, society must reassess its core values. Communal, regional, and local priorities could offer alternatives to the dominant narratives of individualism and market-driven success. Breaking the cycle of materialist ambition and extreme mobility could allow societies—and individuals—to pursue flourishing lives defined by well-being, stability, and communal benefit.
The global tension we observe today arises not from scarcity or inevitability, but from the powerful interplay of cultural ideals and economic systems, and our limited capacity to imagine and enact alternatives. Recognizing these patterns is the first step toward reclaiming agency over how we live and organize society.
Capitalism and Degeneracy
I.
Although capitalism shapes nearly every aspect of modern life, public discourse treats it almost exclusively as an economic system. This obscures the fact that the economic domain is only one expression of a broader political order that makes capitalism possible.
Politics and economics are not separate worlds; economic institutions function only within political structures. To understand capitalism’s history and its future trajectory, we must therefore begin with its political foundations, especially the origins of private ownership—the element that truly distinguishes capitalism from previous economic arrangements.
Across diverse thinkers—Marx, Hayek, Locke—and across standard reference works, capitalism is consistently defined by three core features:
Private ownership of the means of production and distribution
Market exchange for profit, usually under competitive conditions
Limited government intervention
These features form the conceptual baseline. But the first—private ownership—is the essential and historically transformative component. The others arise from and depend upon it.
The means of production have existed throughout history: farms, workshops, foundries, trade networks. What is historically new is not the existence of production but the private character of ownership.
Modern people assume private property is natural—something as obvious as gravity. But for most of human history, private ownership in the modern sense did not exist. Land, the most important pre-industrial productive resource, was typically controlled by emperors, kings, or feudal lords. Individuals could hold land only contingently, through royal favor, feudal obligation, or communal tradition. At any moment, the sovereign could revoke these rights.
Throughout ancient and medieval societies, possession was always conditional, its security dependent on the ruler’s power, not on an independent legal right. Property could be—and often was—confiscated at a moment’s displeasure.
For ownership to be truly private, there must exist a counterweight strong enough to restrain the central authority. Historically, this has taken many forms:
State law, backed by courts and enforcement (e.g., Roman law)
Aristocratic and feudal networks, where lords and retainers formed power blocs that even kings had to respect
Local customary rights, binding communities through inherited obligations
In all such systems, control was defined by relationships, obligations, and negotiated rights, not absolute ownership. A duke or baron did not “own” land in the modern sense; land was embedded in a dense web of usage rights, hereditary claims, community norms, and feudal duties. A field might support peasants, widows, knights, and neighboring villages according to long-standing agreements. Such land could not be freely sold, subdivided, or transformed because its “ownership” was communal, historical, and distributed.
In these societies, economic activity was woven into a framework of obligations rather than market contracts. Property lacked the absolute, individualized quality we associate with modern ownership.
Capitalism becomes possible only when ownership becomes individualized, portable, alienable, and legally protected. This development—culminating in systems such as the Napoleonic Code—created the idea that:
“This is my land, and I may use, sell, or pass it on as I choose.”
That idea took centuries to emerge and required profound changes in legal institutions, political power, and social organization. Once established, however, it enabled an economic order defined not by feudal obligations or communal norms but by private rights, market exchange, and capital accumulation.
The rise of modern private property was neither universal nor inevitable. Many ancient civilizations—India, China, Egypt—developed complex markets and substantial wealth, yet never fully articulated a system of private ownership comparable to that which emerged in Europe. In India, home ownership and merchant wealth grew in certain late medieval cities, but the core productive assets—especially agricultural land—remained embedded in networks of feudal, religious, and customary obligations. China saw similar intermittent experiments, particularly under the Qin and during periods of intense trade, but these advances appeared in fragmented episodes and were often reversed.
In both cases, political authority retained decisive control over land. What never fully emerged was the European notion of ownership as individual, alienable, and legally protected.
A central insight is that private property develops only when something constrains the state’s power to claim everything as its own. In ancient political systems, from pharaonic Egypt to absolutist France, the ruler’s authority was considered ultimate. Land, wealth, and even lives were held at the ruler’s discretion. Confiscation was not viewed as an injustice but as an expression of sovereign right.
To defend “private” property against such authority, a countervailing force must exist. Historically, this took many forms:
Feudal nobility, with their own armed retainers
Religious institutions, such as the medieval Church, whose landholdings were protected by independent hierarchies
Chartered corporations, like the Dutch and British East India Companies, which maintained private armies to secure their claims
Commercial leagues, notably the Hanseatic League, which pooled economic and political power to defend local autonomy
Where such counterforces were strong, private ownership could take shape. Where they were weak, property defaulted to the state.
From the early modern period onward, Europe increasingly shifted the protection of property from military strength to law. Courts, codes, and legal procedures—whether derived from English common law or the Napoleonic Code—became mechanisms by which individuals could challenge even the government itself. This created a “third power” standing between the individual and the state.
Laws limiting forced quartering of soldiers illustrate this transformation. In earlier systems, governments routinely compelled citizens to house and feed troops. Modern legal reforms established the principle that the state could not intrude on one’s home without consent or compensation. Such protections signaled something new: a domain of ownership the state itself was barred from violating.
Private property also depended on determining who counted as a bearer of rights. English common law traditions, beginning with the Magna Carta, first restricted the king’s power over nobles and gradually extended protections to broader populations.
This expansion involved profound social conflict. The American Civil War, for example, centered on whether enslaved people could be treated as property. The pro-slavery argument insisted they could; the abolitionist argument insisted that one human being cannot be another’s property. Earlier societies had confronted similar dilemmas: Solon’s reforms in ancient Athens, for instance, curtailed debt slavery to preserve civic stability.
These struggles show that the boundaries of private ownership were historically constructed, contested, and revised—not naturally given.
Modern theories—especially those influenced by John Locke—portrayed private property as a natural right, something inherent to human existence. But historical evidence does not support this claim. Private property arises from political arrangements, often secured through conflict, negotiation, and institutional invention. It is therefore best understood as a political philosophy, not a natural or purely economic phenomenon.
The idea that property rights are “self-evident” obscures their actual origins in centuries of legal and political struggle. Once these rights are naturalized, questions about how they were created—and whose interests they originally served—tend to disappear.
Although capitalism is often defined by competitive markets, competition has historically been limited, controlled, or avoided. The Hanseatic League suppressed internal competition to protect member interests. Modern corporations regularly pursue strategies of consolidation—“rollups,” mergers, territorial dominance—designed to reduce competition, not encourage it. Firms from Walmart to private equity groups seek market positions that minimize rivals and maximize pricing power.
The ideal of widespread free competition has therefore been far less common than the ideal of protected advantage.
Capitalism is also described as requiring minimal state intervention. Yet the preservation of private property depends on intensive government involvement. Courts, police, and regulatory systems represent a vast machinery that upholds property rights, adjudicates disputes, and enforces contracts.
Corporations do not seek the absence of intervention; they seek intervention favorable to their interests. The state is therefore constantly involved—not in spite of capitalism, but because capitalism requires it.
Private property, competition, and legal structures did not arise from nature but from specific historical conditions. As those conditions change—politically, technologically, and globally—the system built upon them also changes. Today, many of the foundations that shaped early capitalism no longer resemble the contexts envisioned by Locke, Smith, or Hayek, even though their vocabulary and conceptual frameworks endure.
This mismatch produces confusion about the present and uncertainty about the future. Understanding capitalism today requires recognizing that it arose from historical contingencies that may no longer exist.
Conclusion
Capitalism rests on private ownership, competition, and state-enforced legal protections.
Private property is the central and historically distinctive element.
Its development required robust forces capable of limiting state power.
Competition has been irregular and often suppressed.
Government involvement in capitalism is extensive and foundational.
The notion of property as a “natural right” obscures its political origins.
To understand where capitalism is going, we must understand how its defining structures were created—and how radically the conditions sustaining them have changed.
II.
The concept of private property has always depended on the presence of a force capable of limiting the state’s power. Historically, as governments became more centralized, greater independent institutional strength was required to preserve any meaningful sphere of private ownership. In Europe, this counterweight increasingly took the form of courts and judicial systems. Judicial independence—articulated both in English common law and later in the Napoleonic Code—became the mechanism through which property could be defended even against the state itself.
Thinkers such as Locke and Adam Smith built their theories upon these developments. Locke argued that the state exists chiefly to protect natural rights, including private property, though he never fully explained how such rights originate. This exposes a recurring philosophical problem: if private property requires prior protection, how does it arise in the first place? The origins of “the private” are far less self-evident than later political philosophy assumed.
Capitalism, in this sense, is best understood as a political philosophy with economic consequences, not merely an economic system. Its economic features—markets, trade, prices—rest on political assumptions about ownership, the individual, and the limits of sovereign power. These assumptions are often overlooked, even though they form the system’s foundation.
Central to this political philosophy is a particular understanding of the individual. Enlightenment thinkers—Locke, Smith, and others—treated the individual as the holder of property and the basic unit of social order. They assumed human beings are rational, capable of self-governance, morally accountable, and primarily motivated by self-interest. Smith famously extended this into economics, arguing that individuals pursuing their own perceived advantage inadvertently promote the common good through an “invisible hand.”
Yet this view takes for granted what earlier philosophical traditions doubted. Classical thinkers such as Plato and Socrates believed that most people lacked the knowledge to act wisely on their own and required guidance rather than unfettered freedom. The modern presumption that individuals left to their own devices will produce optimal outcomes is therefore historically unusual and philosophically contested.
Moreover, the assumption that self-interest expresses itself as the pursuit of personal wealth is a product of societies already structured by private property. For most of human history, people understood their interests in terms of communal obligations, social standing, and shared resources. Wealth accumulation made little sense in systems where land and productive assets were not individually owned. Even today, daily behavior shows that people often prioritize leisure, relationships, or status over material gain. The modern association of self-interest with economic accumulation reflects a particular institutional environment, not an inherent human tendency.
Smith’s examples—the butcher, the baker, the urban trader—reinforced this urban, monetized perspective. But these cases represented a small minority of the population in early modern Europe, where most people lived in rural subsistence economies. Theories built on urban cash-based interactions were therefore applying a narrow model to a predominantly agrarian world. Not surprisingly, this helped generate deep tensions between urban commercial values and rural communal traditions.
The English enclosure movement illustrates how these philosophical assumptions, legal innovations, and economic changes converged. Beginning in the sixteenth century and continuing for centuries, landholders consolidated common fields, expelled long-standing peasant communities, and fenced off shared resources for private use. Enclosure created the material conditions for a society built on private property, wage labor, and market exchange. It also marked the transition from communal subsistence arrangements to a world in which economic life was reorganized around individual ownership and profit.
What emerged—modern capitalism—was not the natural unfolding of human behavior, but the outcome of specific political ideas about ownership, the individual, and the limits of state authority. These ideas reshaped legal structures, transformed social relations, and redefined what people came to see as their own “self-interest.”
From the sixteenth century through the early nineteenth, England transformed its property regime through a long series of parliamentary acts and court decisions. The beneficiaries of these legal battles were overwhelmingly members of the gentry, who gradually converted customary communal rights—grazing, wood gathering, seasonal uses of fields—into exclusive private holdings. Peasants had exercised these rights for centuries, but because they rarely possessed formal documentation and had limited access to courts, they lost case after case to landowners whose claims were increasingly supported by written deeds and a growing bureaucratic state.
This process, known as enclosure, was neither swift nor uniform, but its cumulative effect was profound. Land that had long been used cooperatively was fenced, legally redefined, and thereafter defended by state power. Courts and, when necessary, armed force were deployed to protect the newly consolidated estates against local resistance. The shift was not unique to England: analogous developments occurred in France under the Napoleonic legal order, in parts of China, and across settler colonies such as the United States, where Indigenous land was repeatedly appropriated through treaty violations backed by state enforcement.
The social consequences were dramatic. In the Scottish Highlands, for example, enclosure and related clearances reduced the population by around 70% between the early eighteenth and nineteenth centuries, and the region has never regained its earlier numbers. Across Britain, peasants displaced from their land were compelled to migrate to cities and enter a labor market they had historically avoided. For most of human history, people oriented their lives around subsistence, community ties, customary obligations, and non-monetary forms of value. Wage labor and the pursuit of private accumulation were marginal activities, not central motivations.
Enclosure changed this by creating conditions in which people had little choice but to earn money. Similar pressures emerged throughout colonial contexts, where populations reluctant to enter market economies were subjected to taxes payable only in cash or to other forms of coercion that forced participation in wage labor and commodity production. What was framed as the emergence of a “free market” was, in practice, a systematically engineered transformation of social life that restricted alternatives and compelled entry into new economic relations.
Underlying these developments was a political philosophy that linked private property, individual autonomy, and market exchange. Thinkers such as Locke and Smith imagined individuals as rational, self-interested actors whose pursuit of personal goals—especially in urban, commercial settings—would generate the greatest social good. Their theories assumed that labor is a form of private property the individual may freely sell, and that state interference should be minimized except to protect ownership and enforce contracts.
Historically, these assumptions described only a small segment of society, yet they came to justify sweeping reorganizations of land, labor, and governance. The modern capitalist order emerged from the conjunction of two conceptual shifts: the construction of legally protected private property, and a normative theory of human behavior that defined individuals as agents whose best life is achieved through market participation. Together, these principles reshaped economies, displaced traditional ways of living, and established practices—wage labor, private accumulation, market dependence—that now appear self-evident but are, in fact, recent and culturally specific innovations.
A recognition of this history highlights how unusual the contemporary economic model is when viewed against the broader record of human societies, and how many alternative arrangements have existed—and could exist—beyond the assumptions embedded in modern capitalism.
III.
The emergence of capitalism rests on two interlocking philosophical innovations: a new conception of the private and a new conception of the individual. Only when these ideas crystallized together—an individuated person endowed with rights, and a sphere of property belonging exclusively to that person—did the modern capitalist order become thinkable.
Seen from this perspective, capitalism is more than a system of markets, factories, or financial instruments. It represents the victory of a particular vision of human life, one that reconfigures social relations around personal autonomy, property, and self-directed economic activity. This transformation ran so deep that it reshaped the basic structure of society over the past three centuries, a change often overlooked in discussions focused solely on economic performance or industrial development.
Historically, privacy in the modern sense barely existed. As late as the early twentieth century, rural Europe still built one-room houses in which large families lived communally, shared nearly all possessions, and had virtually no individualized space. Even where land was nominally private, usage remained partly communal—grazing rights, hunting rights, and shared pastures persisted well into the modern period.
This pattern held globally. Classical Chinese family compounds, for example, housed multiple generations, servants, and relatives within a single interdependent structure. Private rooms existed, but the overarching system was one of shared obligations, distributed authority, and collective ownership rather than personal property in the modern sense. Across the ancient world, architecture itself confirms the rarity of spaces meant for solitary occupation or individuated ownership.
The concept of private property as an individual’s exclusive domain—material, spatial, and legal—is therefore a recent historical development.
Equally new is the idea of the individual as a rights-bearing subject, possessing inherent and universal entitlements independent of ruler, community, or religious authority. Thinkers such as Locke, Hobbes, Rousseau, and Montesquieu articulated this view, which later appeared in political documents like the Declaration of Independence and the Declaration of the Rights of Man and of the Citizen.
This doctrine claimed that rights such as life, liberty, and property were innate, not bestowed by kings or negotiated through status or kinship. Historically, however, rights were always conditional—conferred by sovereigns, embedded in hierarchies, and shaped by obligations rather than universal principles. The modern theory of inalienable rights is thus revolutionary, not an inheritance from antiquity.
Importantly, one of these newly asserted rights was the right to private property, linking the individuated self to the emerging domain of individually owned goods, land, and capital. Together, these ideas supplied the philosophical foundation for capitalism.
Still, philosophical shifts alone do not explain why rulers tolerated—and eventually fostered—merchant activity that earlier civilizations had viewed with deep suspicion. The decisive force was the revenue crisis created by standing armies.
Early modern European states needed unprecedented amounts of money to fund continuous warfare. In the mid-sixteenth century, France could collect only around 4% of its national income as revenue. By the mid-eighteenth century, this had risen to 20–30%, with the majority spent on the military. Even tenfold increases in tax revenue proved insufficient.
To expand their fiscal capacity, states had to encourage commerce, banking, and overseas trade—activities they had long restricted. Merchants who had once been marginal or mistrusted became crucial to state survival, and economic innovation became a geopolitical necessity.
This shift marked a sharp break from earlier eras. Across the ancient world, merchants were viewed with pervasive suspicion. Long-distance trade functioned through chains of small, trust-based exchanges, not through integrated market networks. Caravans were often detained outside cities, subjected to taxes, scrutiny, and sometimes extortion. Relationships, not abstract institutions, governed trade; anonymity was dangerous, not normal.
Commerce was tolerated but rarely esteemed. Its destabilizing potential—foreign goods, unfamiliar wealth, itinerant traders—made it a perennial target of regulation and hostility.
The convergence of new philosophical ideals (private property and individual rights) with new fiscal demands (the need to finance modern states) produced a system in which markets, merchants, and capital moved from the margins to the center of social life. Together, these forces generated the economic and political order now called capitalism—a system impossible without the conceptual pairing of private and individual.
From the mid-sixteenth to the late eighteenth century, European states faced a fundamental contradiction: their political structures depended on restricting commerce, yet their survival increasingly required expanding it. Traditional regulations—limited market days, controlled fairs, and guild-dominated cities—kept trade intentionally small. These rules preserved existing hierarchies by preventing merchants from accumulating too much wealth or influence.
As these restrictions eroded, merchants began asserting new claims: protection of their goods, freedom to trade, and legal recognition of their activity. States responded by authorizing regular markets and loosening commercial controls, primarily because such activity generated much-needed tax revenue. Fiscal pressure was immense. Standing armies consumed the majority of state budgets—often 60–90%—forcing governments to expand monetary taxation. This shift required peasants and artisans to enter a money economy whether they wished to or not. Taxes could no longer be paid in produce; they demanded coin. The result was a sweeping monetization of everyday life.
This fiscal transformation elevated merchants from marginal figures to essential partners in state finance. Wealthy trading families—such as the Fuggers and Rothschilds—became powerful precisely because rulers needed access to capital, even as those rulers recognized the political danger of empowering non-noble elites. The process was gradual and contentious but reached a decisive turning point by the French Revolution. Napoleon’s ability to mobilize France’s economy for war demonstrated that military survival now depended on industrial capacity, infrastructure, scientific development, and national participation. The granting of “inalienable rights” functioned as part of this mobilization strategy, binding citizens to the state and supporting the new fiscal-military order.
Within this system, marginalized groups sometimes occupied paradoxical roles. Jewish bankers, for example, became disproportionately represented in finance not because Jews dominated the profession—most did not—but because rulers found them politically easy to control. Their restricted legal status meant they lacked the communal power that protected Christian nobles or guild elites. States could borrow from them while retaining the option—often exploited—of coercion or expulsion rather than repayment. This dynamic illuminated a broader structural reality: whenever new financial mechanisms empowered individuals, they simultaneously threatened older centers of authority.
Scientific and administrative reforms emerged within the same pressure system. Court officials such as Galileo were employed not to pursue abstract science but to increase mining output, improve taxation, and strengthen state capacity. This demand for expertise fostered new educational institutions, broader circulation of technical knowledge, and expanding professional classes. To secure investment for industrial projects—mines, manufactories, early mechanized production—states granted long-term privileges, exemptions, and monopolies. Public assets were privatized or leased, borders became more permeable for commerce, and merchant privileges accumulated into durable legal rights.
These continuous carve-outs weakened traditional aristocratic power while embedding a new logic: that private property, private enterprise, and private profit were indispensable to state functioning. By the late eighteenth century, economic advisors and fiscal administrators wielded influence approaching that of the old nobility. The balance of authority had shifted decisively toward those capable of generating revenue.
This environment also laid the foundations for early multinational corporations. Chartered companies like the Dutch and British East India Companies operated across borders with powers previously reserved for states—raising private armies, negotiating treaties, and establishing trade monopolies. Rulers recognized these entities as potential threats but tolerated them because they produced vital income and resources. Their emergence followed directly from the expanding rights of individuals to own, transport, and deploy property beyond the reach of any single government.
The larger pattern is clear: Europe’s fragmented geopolitical landscape generated intense competition that pushed states toward innovation, commercialization, and the liberalization of economic life. Other large empires—such as the Ottoman and Qing—faced fewer immediate pressures and often resisted the social disruption such changes entailed. In Europe, however, the combination of fiscal desperation, military rivalry, and philosophical developments concerning individual rights fused into a new system. The modern capitalist order emerged from this convergence: a monetized economy; legally empowered individuals; expanding markets; increasingly standardized banking; and commercial actors operating across multiple jurisdictions.
In this crucible, private property and the private individual became mutually reinforcing pillars of state power. Once joined and institutionalized, they supplied capitalism with its enduring momentum.
IV.
The long arc of global capitalism can be understood as an attempt to command forces that ultimately exceed their summoners. Once invoked, these economic and political powers proved extraordinarily difficult to restrain.
A major turning point in this transformation was the emergence of a new conception of the individual and a new model of enterprise—mutually reinforcing developments that crystallized in early modern Europe. Unlike earlier civilizations, which anchored moral life in one’s fixed social role, early modern Europe began to imagine individuals as possessing inherent rights and personal autonomy.
In traditions such as those expressed in the Bhagavad Gita, moral action was defined by fulfilling one’s caste-determined duties. Confucian thought similarly emphasized the moral primacy of one’s familial and social roles. Even in ancient Greece, questions of virtue were framed within obligations to the gods and the polis. Across these cultures, rights were granted by divine, royal, or aristocratic authority and were earned through service and obedience.
The early modern period—particularly the Reformation and the political upheavals that culminated in the articulation of the “Rights of Man”—introduced a radically different idea: that individuals are born with inherent rights, independent of church or crown. This conceptual shift eroded traditional hierarchies and created a new space for personal and economic self-assertion.
It was within this intellectual and political environment that the Dutch East India Company (VOC) was founded in 1602. The Dutch, a small but trade-dependent Protestant federation surrounded by larger Catholic powers, sought a way to secure economic strength and political survival. They consolidated competing merchant ventures into a single entity and granted it an unprecedented charter.
The VOC received not only a trade monopoly in the East Indies but also powers normally reserved for sovereign states: the authority to wage war, negotiate treaties, build forts, administer justice, impose taxes, and mint currency. It effectively operated as a semi-autonomous state pursuing commercial profit on behalf of the Dutch Republic.
Domestically, the VOC introduced another innovation: tradable shares. Investors could distribute risk by collectively financing voyages whose losses were frequent but whose successful returns were spectacular. A single profitable expedition could multiply an investor’s capital a hundredfold, creating a new class of merchants whose wealth rivaled that of Europe’s nobility. This economic power quickly translated into political influence.
The VOC’s success—arguably making it the most valuable company in history—reshaped European thinking about statecraft, commerce, and law. Other states took notice. Some sought to seize Dutch wealth by force; many instead attempted to replicate the Dutch model, redesigning financial, legal, and tax structures to cultivate their own merchant elites and overseas commercial empires.
What emerged was a feedback loop in which new economic institutions produced new kinds of individuals—autonomous, rights-bearing, profit-seeking—and these individuals, in turn, drove the expansion of capitalist structures. The result was a global system whose power, once called into being, could not easily be controlled.
The rise of the Dutch East India Company represented the emergence of a new kind of power in world history. Earlier civilizations had relied on extensive trade systems, but these were always administered directly by the state. In ancient Rome, for example, the massive grain supply required to feed the city’s population was managed by government officials because securing food for the populace was considered an essential state responsibility. The Greeks developed colonies, China and Japan periodically opened and shut their borders, and merchants existed everywhere—but nowhere were commercial enterprises entrusted with sovereign authority or framed as engines of autonomous profit-seeking.
The Dutch changed this. Drawing on existing legal and commercial practices, they assembled them into a new institutional form: a corporation endowed with state-like powers yet operating independently of direct state control. Once established, the VOC was simply told to “go”—to wage war, extract resources, build fortifications, and return profit. The state provided protection but refrained from interference. In doing so, it created a novel center of power: neither religious, nor military, nor governmental, but incorporating aspects of all three.
This structure required and fostered a new type of individual—entrepreneurs willing to take immense risks, investors comfortable with abstract shares, and merchants whose ambitions became models for economic behavior. Unlike the archetypal heroes of ancient Greece or the moral exemplars of Confucian and classical Chinese thought, these figures defined themselves not through aristocratic virtue, spiritual service, or hierarchical duty but through commercial initiative and individual autonomy. Historically, no tradition had held merchants up as paradigms of moral or civic identity; in early modern Europe, that began to change.
The founding of the VOC in 1602 preceded the high Enlightenment, but many of the philosophical ideals later articulated by Locke, Montesquieu, Rousseau, and Smith echoed social dynamics already unfolding. The emergence of corporations, the widening sphere of individual liberty, and the loosening of religious and educational constraints formed a reinforcing cycle. New legal and economic possibilities encouraged more expansive understandings of personal freedom, while Enlightenment thinkers interpreted these developments as evidence that unshackled individuals could generate social and economic progress.
This intellectual and political climate also shaped movements like the Scottish Enlightenment, where Nonconformist universities encouraged experimental inquiry and intellectual independence. The broader effect was a reconfiguration of power: individuals were increasingly imagined as rights-bearing agents, and corporations as entities capable of shaping global events. These ideas—now so familiar—were historically rare before 1800.
By the time European states sought to emulate the Dutch model, especially through the British East India Company, the implications were clear: once unleashed, these corporate powers became difficult to control. They shaped not only economies and states but also the values and expectations of the individuals who participated in them. Over the following centuries, these forces became central drivers of European and global history.
V.
The history I am outlining may sound almost too neatly aligned with my argument, yet its components are well established—though rarely connected. My goal is to present them as broadly and coherently as possible, not merely as retrospective curiosities but as elements that clarify otherwise opaque features of capitalism’s development.
A crucial transformation occurred in early modern Europe: the emergence of a new conception of the individual as an autonomous, rights-bearing proprietor, articulated by thinkers such as Locke, Hume, Hobbes, and Rousseau. Simultaneously, states—exhausted by centuries of continuous warfare—developed an acute need for revenue. This fiscal pressure encouraged innovations in finance, property relations, and market practices. These developments were not linear or unilateral; they reinforced one another. The new individualist ideology grew alongside new financial institutions, private property norms, and an increasingly assertive commercial class whose economic interests demanded legal protection and political influence. In effect, a third actor appeared between rulers and ruled: corporate entities seeking security, markets, and resources through military, legal, and diplomatic means.
This synergy helps explain why Europe became the earliest and most aggressive incubator of capitalist practices. The combination of evolving ideas about the individual and persistent fiscal strain created a fertile environment for experimentation in private markets, joint-stock companies, and state-supported trade. Over time, these experiments coalesced into what we now identify as the origins of capitalism.
This dynamic also illuminates why the United States emerged as capitalism’s most emphatic expression. Many of the earliest English and Dutch settlements in North America—Virginia, Plymouth, Massachusetts Bay, Newfoundland, New Netherland—were founded in the 1600s as joint-stock corporations. Their explicit purpose was revenue generation for investors and sponsoring governments. Colonization itself was not new; what was new was the systematic organization of colonies as profit-seeking enterprises. Settlers were not simply agents of imperial expansion or military occupation. They were participants in commercial ventures designed to produce returns, employ labor (free, indentured, and enslaved), and generate taxable trade. This corporate logic shaped both the institutions governing the colonies and the expectations of those who lived within them.
By contrast, Spain largely extended its empire through direct royal administration, viceroys, and officials tasked with enforcing the crown’s authority. Individuals were not typically conceptualized as independent proprietors with economic autonomy. France occupied an intermediate position. These structural differences created fundamentally different colonial cultures. In the English and Dutch cases, colonies were semi-autonomous economic systems intended to sustain themselves and produce profit; thus, their eventual assertion of political independence was not anomalous but consistent with their original design.
The same model appeared across the globe in the chartered corporations of the era, such as the British East India Company, founded in 1600. Granted monopolies and trade privileges, these companies expanded rapidly and became central pillars of national revenue. By the time of the American Revolution, foreign trade supplied roughly two-thirds of Britain’s income, whereas France derived perhaps a quarter to a third from overseas commerce. This level of integration helps explain why the Napoleonic Wars became genuinely global conflicts: closing ports and breaking trade networks threatened the fiscal foundations of entire states.
Yet the corporate system generated crises of its own. The British East India Company suffered chronic mismanagement and corruption, and when it neared collapse, the British government intervened—unsurprising, given that a significant fraction of Parliament owned company stock. The Tea Act of 1773 was part of this rescue effort. By granting the Company the right to ship tea directly to the American colonies and sell it at reduced prices, the government aimed to stabilize the corporation and increase tax revenue. Colonial merchants, however, saw this as an attack on their economic liberties and an infringement of what they understood as natural rights: the right to control their own markets and resist monopolistic manipulation from afar.
The ensuing conflict was not merely a dispute over taxation but a clash between two incompatible visions of economic order. The British government viewed trade monopolies as tools of statecraft; colonial merchants understood themselves as autonomous market actors whose rights precluded such interference. The American Revolution, therefore, was also a revolution in defense of a particular market structure—one rooted in the corporate, profit-oriented character of the colonies from their inception.
American colonists insisted that neither taxation nor territorial restriction could be imposed upon them without their consent. They argued—however self-serving the logic—that Native Americans lacked legitimate landownership because they did not “use” the land in a manner Europeans recognized. Under this rationale, Indigenous territory was considered free for the taking; once settled or cultivated, it became private property within the new legal order. This logic mirrored the English enclosure movement, but on a continental scale.
The United States did not emerge merely alongside the rise of capitalism; it emerged from it. The new ideals of natural rights, self-ownership, and private property were not incidental influences on American political thought—they were the ideological foundations of the Constitution itself. What appeared to be a revolt against “tyranny” was, in reality, a rejection of the older, intricate web of mutual obligations, communal norms, and inherited legal constraints that characterized premodern societies. The new republic was built as an expression of the capitalist conception of the individual: autonomous, proprietorial, and unencumbered.
A third factor amplified this trajectory: unlike Europe, the new nation faced no entrenched institutional or cultural counterweights. Indigenous societies were treated as external to any binding order and therefore as barriers to be removed rather than communities with whom obligations existed. The absence of historical restraints—no feudal residues, no ancient corporate bodies, no centuries-old negotiated rights—created what many settlers explicitly described as a blank slate. America, in their own language, was a place to “begin again,” free from the inherited limitations that slowed capitalist development elsewhere.
This combination proved decisive. Those who settled the colonies already operated within the emerging capitalist mindset; the legal and institutional structures of settlement were themselves products of capitalist innovation; and the North American continent, as Europeans imagined it, offered almost no institutional resistance to these forces. Fertile land, oceans, and natural resources mattered, but they were hardly unique advantages—Europe had them too. What distinguished the United States was the absence of brakes: the dismantling of old social systems, the wholesale transfer of land, and the embedding of profit-seeking and proprietorship into the very logic of national expansion.
As a result, a marginal colonial experiment in the late eighteenth century became a major economic power within a century, and the dominant global economy within another. This early outpouring of capitalist energy also shaped American cultural memory. A society built on rupture from the past—on self-invention and economic freedom—does not easily cultivate a strong sense of historical continuity; it defines itself by what it left behind.
These dynamics unfolded just as Europe was forced to confront similar economic transformations. The American Revolution and the Napoleonic Wars exposed the degree to which commercial systems, trade networks, and corporate power had become central to state survival. Napoleon’s continental blockade illustrated how deeply Europe had come to rely on global commerce, and its economic disruptions forced statesmen at the Congress of Vienna to confront questions about trade, navigation, and market access as core elements of political reconstruction. By the early nineteenth century, the fusion of new legal conceptions of the individual with powerful commercial institutions had become a defining global force.
In this environment, the foundations were laid for the first fully multinational corporations of the nineteenth century, which would accelerate the global reach of these intertwined economic and political transformations.
VI.
The rise of capitalism fundamentally reshaped how individuals perceive themselves, society, and the economy. It is not merely an economic phenomenon; it is inseparable from transformations in political authority and conceptions of the individual. Capitalism emerged with a new understanding of human agency: individuals as autonomous agents capable of owning themselves, making choices freely, and participating in markets. These three dimensions—selfhood, governance, and economic exchange—are mutually constitutive, not sequentially causal.
In the United States, this transformation was accelerated by the absence of historical and institutional constraints. With Indigenous populations forcibly removed, settlers could implement new economic and legal norms without interference from entrenched social systems. The American experiment, therefore, became a kind of fast-forwarded demonstration of the capitalist ideal: a society built around individual property, self-ownership, and market freedom.
This conception of the individual extended into fundamental legal debates. The American Civil War, often framed as a moral conflict over slavery, was centrally a dispute about property rights: whose ownership claims trumped whose? Slaves were legally treated as property, and the conflict revolved around whether individuals could own themselves or be owned by others. The post-war amendments—the 13th, 14th, and 15th—redefined citizenship and rights in terms of self-ownership and legal recognition, formalizing the capitalist conception of the individual as both autonomous and legally protected.
Capitalism also transformed everyday cognition. Individuals internalize vast networks of abstract valuations—prices for goods, services, and labor—allowing them to navigate complex markets efficiently. Modern consumers can instantly assess the value of thousands of items, an ability that would have been unimaginable in premodern subsistence economies, where goods were exchanged rarely, locally, and relationally. This abstraction replaces concrete, narrative-based social relations with a universal numerical logic: value is encoded in price, not in personal ties or historical context.
The abstraction of value extends to moral indifference toward production conditions. For example, British consumers and officials in the early nineteenth century interdicted the slave trade even as they profited from slave-produced cotton, illustrating how markets reduce social and ethical considerations to price signals. Price, in the capitalist framework, becomes the primary measure of worth; all other contexts—historical, moral, or personal—are secondary or irrelevant. This abstraction is not incidental; it is central to the system’s function. Markets operate through price as a signal, allowing complex societies to coordinate without attending to the particulars of individual transactions.
Cultural and moral values continue to shape social behavior, but under capitalism, these signals are often subordinated to economic imperatives. In France, for example, Sunday remains largely a day of rest: most stores close, reflecting a centuries-old moral framework prioritizing religious observance and human well-being over commerce. In contrast, in the United States, commercial activity proceeds uninterrupted every day of the week. Here, economic efficiency and profit dominate; all other values are secondary or expendable.
This shift extends deeply into education. Historically, education was a pursuit of free individuals—liberally educated for intellectual and civic development rather than economic gain. Modern capitalism, particularly in the post-war United States, has transformed education into an investment, measured almost exclusively by its financial return. Tuition costs have skyrocketed, and state support has declined, reflecting a system that values economic efficiency over personal or societal enrichment. Similarly, professional vocations, once often tied to ethical or religious service, are now primarily remunerative. Becoming a doctor, for example, is largely framed as a path to wealth rather than a commitment to public service.
Even everyday decisions and relationships are framed economically. Individuals evaluate friendships, family ties, and career choices through implicit cost-benefit calculations: how much is a relationship worth compared to a higher-paying opportunity? Markets and prices shape social priorities, often reducing personal and moral considerations to transactions. The result is a worldview in which everything—time, labor, education, even human interaction—is quantified and subjected to economic logic.
Society is structured around economic imperatives, often leaving little room for alternative values. From education to career choice, we are conditioned to evaluate decisions primarily through financial logic. Students are pressured to attend “the best” schools—often defined by prestige or earning potential—rather than by fit, friendships, or meaningful engagement. Yet research on job satisfaction repeatedly highlights two factors above all: feeling useful and working with people one likes. Financial reward rarely compensates for the absence of these intrinsic values.
This tension is evident across daily life. Efforts to introduce moral or ethical considerations—organic food, child safety, environmental stewardship—frequently clash with price-driven behavior. Consumers may desire healthier or ethically sourced food, but if the price feels “too high,” economic reasoning overrides these values. The same logic shapes views on labor and care: the work of raising children or providing healthcare is often measured solely in monetary terms, sidelining contributions that are socially and morally vital.
Modern capitalism reflects and reinforces these priorities. Corporations, markets, and pricing structures are not external forces acting on individuals; they are expressions of our own conditioned values. The rise of corporate power, the framing of education as investment, and the monetization of professional vocations all mirror the way society has internalized economic calculation as the primary metric of worth.
Even politics is shaped by this orientation. Historical struggles, such as the abolition of slavery, were codified in terms of property and legal rights rather than moral or humanistic arguments, illustrating how economic and legal frameworks dominate the interpretation of justice. Ultimately, capitalism is not a distant force—it is an extension of our collective priorities, reflecting the values we have internalized and normalized. Recognizing this is essential to understanding why social and ethical considerations struggle to compete with economic logic.
VII.
We now arrive at the present: where we were, where we are, and where we are heading. A crucial point is that our modern values—shaped by capitalism and individualism—are difficult to recognize historically, because they differ fundamentally from those that guided previous generations. This disconnect affects both our governmental and economic systems.
On the corporate side, governments created new entities—companies—that introduced entirely new forces into the world. These corporations, though born from older value systems, quickly developed their own objectives, often diverging from the goals of the states that spawned them. The Dutch and British East India Companies exemplify this: initially intended to serve the state, they instead prioritized the profits of their members. Their actions—wars, exploitation, corruption—often conflicted with the interests of the governments that chartered them. The companies were instruments of wealth creation, but for individuals rather than for the state. Governments frequently had to intervene to control their excesses, highlighting the tension between political goals and corporate priorities.
Early political and economic thinkers—Locke, Hobbes, the physiocrats—tried to conceptualize corporations through familiar models, like Adam Smith’s butcher and baker. The analogy is misleading: small-scale trades cannot capture the novel dynamics of global corporations, whose creation introduced unprecedented forces shaping politics, economies, and societies.
The evolution of corporations accelerated with multinational expansion. The Singer Sewing Machine Company in 1867 marked one of the first examples of a corporation operating across borders, establishing a new paradigm: companies no longer existed primarily for a single nation’s benefit. Their global operations created wealth independently of national interests, challenging traditional assumptions about trade, loyalty, and governance. Governments soon realized that corporations could pursue profit without regard for national consequences—illustrated vividly in wartime contexts, where multinational operations sometimes indirectly aided enemy states, simply because profit remained the sole guiding principle.
Historically, the educated elite of the British Empire were trained in Latin, Greek, philosophy, and history, and were expected to adhere to the Anglican faith. This classical education instilled a specific value system, emphasizing civic duty, moral reasoning, and communal responsibility. Fast forward to the mid-20th century, and business education in the United States had transformed completely: core subjects were accounting, finance, economics, marketing, management, and business law. The shift reflects a profound change in values—from cultivating civic and ethical understanding to the single-minded pursuit of profit.
Modern corporations are not merely larger or more efficient versions of traditional businesses—they embody a fundamentally new worldview. Multinational in scope, these entities operate outside local or national loyalties, and their primary—and often sole—goal is maximizing return for shareholders. Local culture, history, and ethical considerations exist only insofar as they serve that goal. Corporate structures naturally select for individuals who internalize this profit-driven logic, reinforcing the system over generations. Unlike historical rulers, who had to balance multiple responsibilities—protecting citizens, managing resources, maintaining alliances—corporations are unconcerned with broader social or political obligations.
This divergence has profound implications. Where colonial governments once imposed constraints on exploitation, corporations often pursued profit independently, generating wealth for members but frequently at the expense of local populations or even state interests. The withdrawal of colonial authorities sometimes removed moderating forces, leaving corporate power unchecked, demonstrating that exploitative outcomes are often rooted in institutional structures rather than explicit state policies.
The modern corporate paradigm—global, multicommunal, and profit-maximizing—has reshaped both political systems and individual expectations. Wealth has become an abstract ideal, celebrated culturally and embedded in narratives from the American Dream to hip-hop, reinforcing the notion that success is measured primarily by financial achievement. Unlike a local baker or merchant, who serves a community while earning a livelihood, corporations pursue comparative advantage, growth, and dominance, redefining the scale and nature of economic competition.
This rapid transformation—from state-oriented commerce to multinational corporations with autonomous objectives—remains recent and poorly understood. Legal frameworks allowing cross-border corporate operations only emerged in the mid-19th century, and generalized free trade agreements were not established until after World War II. In just over 150 years, corporations evolved from national instruments to global entities, fundamentally reshaping political and economic relationships. The challenge today is that society is still adjusting to these new structures, often misinterpreting their origins and purposes.
In short, corporations are new forces with distinct value systems, separate from the governments that created them. They were designed to generate wealth—but for individuals and shareholders, not states—and this distinction continues to define our modern economic landscape. We live in a world dominated by corporate entities that answer primarily to their internal logics rather than the governments or societies that created them. This new structure has transformed social, political, and economic life, creating a tension between enduring human values and the imperatives of multinational capitalism—a dynamic that continues to shape the modern world.
VIII.
We have now arrived at the contemporary moment, shaped by the consolidation of a neoliberal global order. This system—semi-liberal, semi-democratic—prioritizes free markets and individual freedom, operating under the principle that wealth generates freedom, which in turn generates more wealth and stability. The logic is seductive: remove barriers, allow global trade to flourish, empower individuals economically, and society will naturally prosper. This is the culmination of Enlightenment thought, extended through modern libertarian philosophy: the ideal of the autonomous individual exercising agency through market participation.
The collapse of the Soviet Union seemed to confirm this narrative. Global trade, financial integration, and the dominance of the dollar-based system accelerated dramatically. The assumption was that the world had reached an era of unprecedented prosperity and liberty. Yet this vision was never accurate, because it misread the values underlying human life. Corporations do not operate under human-centered ethics; they pursue profit as an absolute, often indifferent to social, cultural, or environmental consequences.
Two illustrative examples clarify this disjunction. The first is the 2013 Xbox One announcement, in which Microsoft proposed restrictive digital rights management: consumers would not be able to freely share, resell, or even transport games they had purchased. From a human perspective, this violated natural social behavior: sharing, gifting, and communal enjoyment. From a corporate perspective, it was necessary to maximize revenue. Public backlash forced Microsoft to reverse the policy before launch, but the ethical conflict remains stark: corporate imperatives can override longstanding social norms, even in trivial domains like gaming.
The second, far graver, example is the Green Revolution. Following World War II, returning American soldiers, empowered by the GI Bill, applied scientific knowledge to agriculture with the goal of ending famine. They dramatically increased global food production, reducing starvation and poverty—a profound human achievement. Yet, from the 1970s onward, the drive to feed the world was co-opted by multinational agribusinesses such as Monsanto, Archer Daniels Midland, and Cargill. Their goals were no longer humanitarian. Innovations like terminator seeds or genetically engineered Bt cotton prioritized profit over survival. Farmers became dependent on expensive seeds, debt increased, yields failed to meet expectations, and in some cases, livelihoods were destroyed. The rhetoric of feeding the world masked the singular corporate objective: maximizing return.
These examples illustrate a fundamental shift. The world’s most powerful actors—global corporations—operate outside the moral and cultural frameworks that historically guided human societies. Wealth, efficiency, and market dominance supersede human-centered values. While markets are assumed to reflect freedom and well-being, in practice they often erode them, exploiting both natural and social systems to extract maximum profit. The result is a world in which human welfare can be subordinated to abstract financial imperatives, creating both ethical dilemmas and tangible suffering.
The debate over genetically modified crops often misses the central issue. Scientific consensus may suggest that genetic modifications are safe, but the critical question is control: who owns the seed, who sets the price, and who decides who has access. Companies like Monsanto claim ownership through intellectual property rights, ensuring that farmers cannot save, share, or replant seeds freely. Even if the crops themselves are beneficial, the system is structured to maximize corporate profit rather than address human needs.
Consider promises of innovations like saltwater-tolerant rice. While it could theoretically feed millions, in practice, these seeds are expensive and accessible primarily to large agribusinesses. The resulting harvest is sold to the highest bidder, not necessarily those who need it most. Large-scale industrial agriculture often diverts food to more profitable uses, such as ethanol production, rather than lowering prices or feeding the hungry. The aim is never to feed the world; it is to maximize profit. Food abundance exists, yet systemic priorities ensure that wealth, not human welfare, dictates distribution.
This pattern extends across global agriculture. As noted above, the Green Revolution began as a humanitarian project and succeeded dramatically, only to be co-opted by corporate interests. Terminator seeds, Bt cotton, and similar technologies prioritize continuous sales over sustainability, often harming small farmers and undermining local food security. Agribusinesses such as Monsanto, Archer Daniels Midland, and Cargill pursue maximum revenue regardless of human consequence; their public rhetoric of “feeding the world” masks a singular commitment to profit.
This divergence reflects a broader historical trend. Corporations, especially multinational entities, operate independently of any national government, guided solely by profit. Governments and societies initially assumed these institutions would align with broader social goals, but this assumption is increasingly untenable. Free-market expansion has created unprecedented wealth, yet it also entrenches values that prioritize corporate gain above human welfare. Cultural resistance in societies with different value systems—such as certain Islamic cultures or historically state-directed economies like Japan, South Korea, or China—illustrates that the global adoption of these corporate values is neither natural nor inevitable. Economic structures can be deliberately shaped to achieve goals other than wealth maximization.
Recent crises have exposed the fragility and artificiality of the neoliberal narrative. The 2008 financial collapse revealed that markets are inseparable from government intervention, undermining the myth of self-regulating capitalism. COVID-19 further highlighted the flexibility of state power, as governments implemented emergency fiscal measures previously deemed impossible. Both events challenged the assumption that profit-maximizing corporations naturally serve public good, forcing a reevaluation of systemic values.
Meanwhile, geopolitical realities complicate the global picture. Leaders like Xi Jinping pursue state-directed economic and political objectives, subordinating market logic to political control. Profit and free-market principles are tolerated only insofar as they align with these broader ends. The presumed universality of neoliberalism—free trade, wealth accumulation, and market-driven freedom—faces both structural and ideological limits.
China’s housing crisis illustrates how political priorities can diverge radically from economic norms. Xi Jinping’s approach—allowing the housing market to collapse so homes become more affordable—shocks many observers accustomed to profit-driven reasoning. Wealth destruction and bankruptcies mattered little; the goal was access to housing, not the accumulation of capital. Similarly, when Ant Group’s planned stock offering clashed with regulators, billions of dollars in corporate value vanished overnight, demonstrating that Chinese corporate and political priorities operate under a fundamentally different logic than Western free-market assumptions.
Russia offers another contrast. Vladimir Putin’s invasion of Ukraine confounded predictions based on economic reasoning. Traditional analyses assumed nations act to maximize financial gain, yet Putin’s primary goal is power, not wealth. Sanctions and economic restrictions aimed at curbing Russian influence underestimated the fact that monetary loss is irrelevant when the pursuit of geopolitical dominance takes precedence. Such cases reveal the limits of a worldview that equates profit with motivation and power with progress.
These divergences challenge deeply ingrained narratives about global economics and governance. For much of the modern era, Western assumptions have held that corporations and governments naturally operate to maximize wealth, efficiency, and, implicitly, human welfare. This mindset has conditioned societies to equate success with monetary gain, encouraging the belief that personal and societal freedom is attainable through wealth accumulation. Crises like the 2008 financial collapse and COVID-19 exposed the fragility of these assumptions, showing that markets rely heavily on state intervention and that wealth alone does not ensure security or freedom.
Moreover, cultural and systemic diversity underscores that profit-maximization is not a universal human value. Islamic finance, for example, integrates ethical, legal, and religious principles that prioritize social equity over unbounded profit. Many societies historically—and still today—reject the assumption that individual wealth accumulation should dominate public policy or social organization. These alternative frameworks remind us that economic logic is culturally contingent, not universal.
Finally, the corporate mindset itself often necessitates extreme abstraction and moral detachment. Executives may make decisions purely by numerical metrics, treating human suffering, environmental harm, or social inequity as data points rather than ethical considerations. In contrast, most individuals, when given resources, would pursue collective benefit rather than exploit vulnerability for profit. Recognizing this dissonance is critical for understanding why contemporary global systems often appear misaligned with human welfare.
In sum, the world is in a state of profound narrative disruption. The Enlightenment-era assumption—that the right combination of governance, economics, and human agency naturally produces stability, prosperity, and freedom—is no longer functional. Crises, geopolitical realities, and cultural divergence have exposed the fragility of this worldview, creating a vacuum in our understanding of global order. We now face the task of interpreting and navigating a world where the old rules no longer suffice, and where power, values, and incentives operate under multiple, often conflicting logics.
IX.
Recent events—from housing crises under Xi to geopolitical moves by Putin—have prompted societies and thought leaders to question whether the neoliberal world order is functioning as promised. Broadly, the consensus is clear: it isn’t. Yet recognizing failure is not the same as knowing what works, leaving us in a state of uncertainty.
This crisis of confidence is partly generational. Over decades, local, regional, and national perspectives were largely supplanted by a global corporate worldview, in which profit and centralized corporate power became the dominant values. Traditional concerns for social goods, communal welfare, and local stability were displaced. GDP growth, rising stock markets, and global economic expansion were presented as universal solutions, even as homelessness, unaffordable healthcare, and housing shortages persisted. The simplification—“wealth generation solves all problems”—has long justified policies that overlook social consequences, from disrupting agrarian societies to mass urban migration in China and beyond.
Global corporations, as independent actors with unprecedented power, further complicated this picture. Unlike nation-states, these entities operate on global scales, unbound by local obligations, with motivations centered entirely on profit. While early phases of globalization sometimes brought broad benefits, over time corporate activity increasingly became a structural exercise of power, often at the expense of human welfare. The notorious case of Nestlé’s marketing of powdered infant formula in developing countries exemplifies this: in pursuit of profit, companies deliberately undermined breastfeeding, risking infant malnutrition and death. International regulations were created to mitigate these harms, yet violations persist, reflecting a corporate logic indifferent to human life.
The resulting societal response has been contradictory. On one hand, individuals—especially younger generations—have adapted by internalizing capitalist lessons, emphasizing personal gain and strategic opportunism. This manifests in job-hopping, demanding conditions, and exploiting technological tools to bypass institutional norms, reflecting a pragmatic, sometimes ethically fraught response to systemic inequities. On the other hand, many adopt deep cynicism, assuming that all actions are driven by monetary gain, dismissing social or ethical intentions as irrelevant.
Both responses underscore a broader disillusionment: the old social and economic narratives—loyalty, communal responsibility, and the promise that wealth generation aligns with human flourishing—are losing credibility. This dissonance destabilizes institutions, corrodes trust, and reshapes human behavior in ways that are simultaneously adaptive and disruptive.
We live in a world dominated by systems that feel fundamentally exploitative, yet no viable alternatives exist. People sense that the current system is failing—whether it’s young workers navigating precarious labor markets or global citizens observing corporate excess—but we are unable to articulate solutions. Even highly educated, globally informed analysts often interpret the wrong metrics as indicators of success: Russia’s GDP growth during wartime or Ukraine’s rebound is celebrated as economic vitality, despite massive human suffering. The obsession with GDP and stock markets is essentially ritualistic, like reading omens in ancient Greek sacrifices: we mistake numerical growth for actual well-being.
This misalignment obscures real problems. In the United States, housing prices exemplify systemic dysfunction. In some small towns, prices per square meter surpass those in Paris. Relative to 1970, housing would need to fall by roughly 65% to return to affordability. The causes are largely policy-driven: repeated capital gains deductions, subsidized mortgages through government-backed entities, and tax breaks for investment properties inflate demand, encouraging speculation rather than providing homes to live in. Addressing this requires deliberate “wealth destruction”—deflating asset values to restore affordability—a solution that is politically unthinkable despite its rational necessity. It illustrates how deeply entrenched our attachment to wealth, rather than to social goods like housing, shapes our decisions.
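The back-of-envelope behind that figure can be made explicit. The sketch below assumes, purely for illustration, that real house prices stand at roughly three times their 1970 level; the multiple is an assumption for the sake of the arithmetic, not a sourced statistic:

```python
# Back-of-envelope affordability arithmetic.
# price_multiple is today's price divided by the 1970 baseline price.
# The 3x figure used below is an illustrative assumption, not sourced data.

def required_drop(price_multiple: float) -> float:
    """Fractional fall needed to return a price to its baseline level."""
    return 1 - 1 / price_multiple

# If real prices are ~3x their 1970 level, restoring the baseline
# means giving up about two-thirds of today's value.
print(round(required_drop(3.0) * 100))  # prints 67 (percent)
```

Under that assumed multiple, the required fall is about 67%, in the same range as the roughly 65% cited above.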
This pattern repeats globally. Centrist parties tinker with minor reforms, offering marginal improvements that fail to tackle structural issues, while populist movements gain traction by at least acknowledging systemic failure, even if their proposed solutions are fantastical or ideologically driven. The public instinctively gravitates toward those who recognize the problem—“the house is on fire”—even if the proposed remedies are imperfect. The core issue, however, remains unaddressed: our inability to conceive, discuss, and implement economic and social policies that genuinely prioritize human welfare over abstract metrics of wealth.
In short, we are stuck in a state of cynical awareness. We see exploitation, inequality, and mismanagement, yet lack the frameworks, courage, or imagination to propose actionable alternatives. The metrics we rely on are largely symbolic, policies serve vested interests, and public discourse fails to envision radical but rational solutions. Our collective confusion is the defining feature of the current phase of global capitalism: we know the system is broken, but we cannot yet imagine a better one.
X.
History offers a vivid lens on our modern predicament, the housing crisis included. Consider Girolamo Savonarola, the 15th-century Florentine preacher who came to dominate his city by denouncing the corruption of the Church. Although the Church’s failings were widely acknowledged, acting on that critique proved fatal: Savonarola was excommunicated and executed in 1498. The parallel today is striking: we understand the systemic failures of wealth, housing, and corporate power, yet anyone proposing decisive solutions—like reducing asset values to restore housing affordability—is immediately dismissed, ridiculed, or feared. Ideology, like the Church in Savonarola’s time, provides a narrative framework that makes alternative visions seem impossible. It organizes our understanding of the world but also blinds us to solutions.
This ideological blindness appears globally. A decade ago, the French politician and former finance minister Jean Arthuis argued that Europe’s economic problems could be solved through GDP growth and regulatory harmonization, a conclusion seemingly predetermined by allegiance to the ideology of growth. Today, similar arguments emerge in the U.S., where authors advocate “building more houses” to solve the housing crisis. Yet even when new construction occurs, speculative investment, REITs, and inflated prices prevent affordability. The logic is circular: growth and wealth are assumed to be inherently beneficial, yet the material realities for ordinary people remain unaddressed. Even policies that could modestly improve access take decades to implement and are rarely considered politically viable.
This pattern extends beyond economics. In Afghanistan, Carter Malkasian demonstrates that despite massive investment in infrastructure, education, and military support, cultural and religious dynamics rendered much of it ineffective. Local communities were not motivated by wealth or stability alone; external material abundance often exacerbated resentment rather than resolving conflict. Here, too, ideological assumptions—that prosperity automatically produces cooperation—blinded policymakers to deeper realities.
No amount of wealth—whether at Google, Apple, or any other corporation—solves the problems we face. Even with unprecedented profits, these companies will continue to prioritize growth, efficiency, and shareholder returns over human well-being. Employees may be squeezed, laid off, or pressured, regardless of corporate earnings. Similarly, housing production or wealth generation does not inherently create affordability or equity; in fact, they often exacerbate existing problems. Excess wealth and profit, rather than scarcity, are frequently part of the issue.
This dynamic extends beyond corporations. Consider U.S. infrastructure: we excel at building new projects but fail at maintenance, because investment in construction generates visible growth, jobs, and economic activity, whereas maintenance does not. Bridges decay, roads crumble, and public safety suffers—not for lack of money, but because the system rewards creation over preservation. A more effective solution—halting new projects and redirecting resources to upkeep—remains politically unthinkable.
The same principle applies to broader economic policy. Conventional wisdom insists that increasing corporate profits or GDP automatically solves social problems. Yet higher earnings rarely translate into more employment, better housing, or improved services. Policies assume that wealth generation is the path to societal improvement, but the evidence consistently undermines this assumption. Ideology, deeply embedded in our capitalist framework, blinds us to alternatives, framing growth and profit as inherently beneficial while rendering any deviation inconceivable.
This ideological constraint shapes debates across contexts. In France, for example, retirement age reforms are argued in terms of fiscal necessity—too many retirees, too much government expenditure—without considering solutions that redistribute time rather than money, such as granting more vacation in exchange for longer working lives, or other incentive-based approaches. Likewise, housing affordability is often treated solely as a matter of supply, ignoring structural incentives that inflate costs and concentrate wealth.
Adopting a different mindset can reveal overlooked solutions. Drawing on the Daoist principle of wu wei, one asks not “What can we do?” but “What can we stop doing?” Many societal problems arise not from too little action, but from excessive or misdirected effort. Redirecting resources, curbing speculative incentives, and questioning default assumptions about growth and profit can yield practical, humane solutions without additional spending.
The challenge is systemic: our economic, political, and social frameworks constrain imagination. Well-meaning experts, even highly educated ones, are often trapped by ideology, producing analyses that are technically sophisticated yet fundamentally blind to alternative possibilities. Recognizing this ideological grip—seeing problems not as matters of scarcity but of excess, misalignment, or perverse incentives—is the first step toward meaningful change. Understanding this does not provide immediate solutions, but it clarifies the landscape: conventional approaches alone are insufficient, and true progress requires rethinking the assumptions that structure our world.
XI.
It is reasonable to ask how we might anticipate the future, yet certainty is impossible. History, however, offers analogies: systems once dominant can persist for centuries and then gradually transform. Consider the Reformation, whose reshaping of European culture and religion has unfolded over five centuries, or the influence of Zoroastrianism on Islamic Persia, where cultural practices endured long after political conversion. In Japan, the Meiji Restoration and modernization transformed traditional society, yet writers like Jun’ichirō Tanizaki mourned what was lost, revealing the slow imprint of systemic change.
These examples illustrate that profound societal shifts rarely happen overnight. Revolutions—whether political, economic, or cultural—are historically rare, costly, and often violent, as seen in the French Revolution or Soviet collectivization, which required immense human suffering to enforce change. Even when power structures are challenged, old patterns persist. The Catholic Church, despite losing half of Europe during the Reformation, remains globally influential five centuries later. Transformation is incremental, shaped across generations, not instantaneously achieved by top-down decrees.
Capitalism exemplifies this complexity. While flawed in distribution—healthcare and housing are evident failures—it functions extraordinarily well in efficiency and production. Any replacement must contend with what capitalism already does effectively, or risk losing functional systems entirely. As a result, change tends to be gradual, layered, and partial, blending old structures with emerging practices.
Current examples include Europe’s energy transition. Governments are promoting renewables not purely for economic gain but to address climate risk and reduce dependence on geopolitically volatile fossil fuels. This represents a subtle yet significant shift: interventions occur despite potential GDP slowdown, increased costs, and lower corporate profits. Similarly, in agriculture, organic and ecological approaches are increasingly justified for intrinsic value—beauty, biodiversity—not just efficiency or yield.
An old system shows signs of failure when its critiques can be accepted—or even ignored—without resistance. Consider a historical counterfactual: if the Catholic Church had threatened Florence with excommunication over Savonarola, and the population had responded, “Fine, we’ll manage,” it would have revealed a system losing its grip. The capacity to enforce belief or obedience wanes before an ideology collapses entirely.
We are witnessing a similar process today. Change rarely comes as a single movement or dramatic revolution; it is gradual, emerging from multiple directions and challenging previously dominant systems. The hegemony of capitalism, for instance, is not absolute. Slowly, cracks appear—values that were once subordinate to profit are now being prioritized for their intrinsic worth: environmental protection, sustainability, biodiversity, intellectual enrichment, and human freedom. This reflects a shift from a single metric of success—GDP or corporate profit—to a more complex valuation of society.
Education illustrates this transformation. Traditional debates fixate on financial return, yet students and institutions increasingly question what education is truly for. Its greatest power lies not in immediate economic gain, but in expanding awareness, understanding opportunities, and fostering the capacity to define one’s own conception of success. This shift—from instrumental to intrinsic value—signals a loosening of the old ideological monoculture.
History offers precedents for managing diversity under overarching structures. The Roman and Ottoman empires governed vast, heterogeneous populations through loose central authority, creating coherence amid diversity. Likewise, the future is likely to be shaped by multiple value systems coexisting under emergent overarching narratives, which have yet to fully crystallize. Environmentalism may provide one such framework, accommodating various approaches and perspectives while fostering coherence.
We are living in a period of confusion rather than sudden revolution. Existing systems will persist, though increasingly constrained and moderated by competing values. Over time, this evolution will expand freedom of thought and action, while compelling society to search for new narratives capable of unifying diverse values and guiding collective life beyond the old capitalist framework.
XII.
Often discussed purely as an economic system, capitalism is better understood as a political, social, and cultural philosophy: a comprehensive worldview that frames individuals, governments, and societies in terms of markets, private ownership, and monetary exchange. It is a philosophical argument about how we should live, measure value, and understand ourselves.
Capitalism encourages individual freedom, yet this freedom is largely defined through consumption and the pursuit of personal happiness. It privileges the present over both past and future. Historical examples illustrate an alternative vision: Capability Brown, an 18th-century landscape designer in England, planted estates meant to mature over a century, benefiting generations long after his death. This long-term perspective—valuing legacy and intergenerational stewardship—stands in stark contrast to the short-termism inherent in capitalist thinking, where immediate utility and consumption dominate.
This temporal narrowing mirrors the spatial and social narrowing produced by capitalism’s logic of enclosure. Modern experiences, such as Euro Disney, epitomize this: spaces of wonder and beauty are walled off, accessible only to those who can pay. By contrast, historic urban landscapes like Paris, with its gardens, museums, and streets, offer shared beauty at minimal cost, reflecting a communal, intergenerational ethic absent in modern commercialized spaces.
Capitalism’s emphasis on privatization extends from wealth into personal life and space. Success is measured not only by acquiring resources but by creating environments that insulate individuals from one another: custom-designed homes, walls, and private amenities become tools of separation, while identity is constructed around consumption. The system promises freedom, yet it often delivers loneliness, insecurity, and a sense of lack even amid abundance. The same dynamic operates globally: corporations privatize essential resources—water, food, land—forcing individuals to buy what was once communal. Subsistence farmers are displaced, their land consolidated into large-scale enterprises that sell products back to the very communities that once produced them.
These developments are not the result of a master plan, but of historical and cultural evolution. Capitalism emerged and solidified in the context of global power struggles: nations that prevailed in war exported and imposed this philosophy, with the United States rising as the dominant exemplar. Its success shaped global acceptance, leaving few alternatives beyond the ideological extremes of communism or liberal democracy.
The consequences of this worldview are evident in contemporary politics and policy. Debates in England over heating subsidies reveal the limits of governmental intervention in a system dominated by private interests. Energy companies, generating billions in profit, are expected to shoulder social responsibility, yet the ideological commitment to market logic often prevents this. Similarly, housing crises—misattributed to supply shortages—demonstrate how cultural assumptions and systemic structures, rather than economic scarcity alone, determine outcomes.
Capitalism’s philosophical framing blinds societies to alternatives. It naturalizes privatization, enforces short-term thinking, and redefines individual success in terms of consumption and isolation. Reflecting on personal and collective values—how resources are shared, what constitutes a worthwhile life, and what should be preserved for future generations—can illuminate tensions and frustrations inherent in the system.
Ultimately, capitalism is not merely an economic system, but a cultural, intellectual, and historical movement. Understanding it requires examining how it shapes our perception of ourselves and our world. By deconstructing these assumptions, we can better evaluate both the system and the possibilities for change, recognizing that while capitalism persists, it is neither inevitable nor immutable.
End