Mark Antonio Menaldo is Department Head and Associate Professor of Liberal Studies at Texas A&M University-Commerce.
From heady academic categories to buzzword nomenclatures, “culture” applies to almost anything. The dominant way of thinking about “culture” today is as a “social phenomenon”: widespread, amorphous, and possessed of a life of its own. When we use “culture” this way, we assume that social abstractions have an all-encompassing effect on our cognitive perspectives. I embraced this perspective as an undergraduate after being influenced by postmodern philosophy. In my sophomore year, an eccentric professor assigned Martin Heidegger’s Being and Time (1927) in an Introduction to Existentialism course. Naturally, this heady tome was over my head. Still, that class set me on a course of study in German and French postmodernism that lasted well into graduate school. After years of reading and teaching steeped in ancient and modern texts, I came to understand how a reductionist understanding of “culture” became our prevailing wisdom and a catchall word. In this essay, I consider the etymology and philosophical history of “culture” and the consequences that a new cultural orthodoxy imposes on our thinking, motives, and behavior. This new orthodoxy tells us that all human experience is reducible to “culture” and, simultaneously, that we must respect cultural differences. It is placid on the surface, but its undercurrent pulls us down into the depths of political and social strife.
In the fifteenth century, the word “culture” appeared in English, denoting a cultivated piece of land, derived from the Latin word cultura stemming from colere, meaning “to tend, cultivate.” Agriculture combines this idea of cultivation with a field (agri), so “culture” was a practice tied to a place. From the sixteenth to the nineteenth centuries, “culture” also meant refined manners and minds. This second meaning adopted “culture” as a metaphor for human conduct but maintained the original sense of tending fertile ground. The aristocratic class tilled and managed the arts and sciences to cultivate the person’s intellectual, moral, and aesthetic excellencies. As Alexis de Tocqueville described in Democracy in America (1835), this sovereign class shared two essential understandings: culture was composed of studied enjoyments and refined pleasures, and all human societies shaped themselves into either civilization or barbarism. This definition of “culture” coincided with the rise of the French Salon, in which aristocrats devoted their time and effort to the art of civility, a refined code of manners, and lively conversation.
Nowadays, we are suspicious of the claim that any one culture is superior to another, and for good reason. Yet “culture” retains its original meaning. We say someone of “high culture” has a taste for arts, literature, and knowledge. However, “culture” also signifies a “way of life,” and, according to the Cambridge Dictionary, it describes “the attitudes, behavior, opinions, etc. of a particular group of people within society.” This second meaning is a modern and generic term for many things that exist traditionally, socially, politically, and fleetingly. Like a large woven basket, “culture” allows our intellect to grasp many things simultaneously, including cultural transmission and change. What brought about this change in the meaning of “culture” from the circumscribed cultivation of a few higher things to our expansive modern notion that it encompasses almost anything? I offer a meat-and-potatoes answer to this question. I contend that German thinkers subjected philosophy to “culture,” thereby inverting the Platonic view that “culture” is merely a shadow of reality.
In the late eighteenth century, a relatively unknown philosopher named Johann Gottfried von Herder wrote a dense, sprawling, unfinished work titled Outlines of a Philosophy of the History of Man (1784-91). Although Herder retained the language of civilization and cultivation, he claimed no difference existed between transmitting culture from the ground (indigenously) and by the light (deliberate learning). Herder thought that the word “culture” was indeterminate and that human cultivation was happening organically and everywhere in how people transmit their way of life. Civilization was a matter of degree and dependent upon “the point of view, from which we examine it.” Not only did Herder anticipate linguistics and anthropology, but also Friedrich Nietzsche’s groundbreaking idea of “perspectivism.” Nietzsche was critical of the posture of the impartial philosopher contemplating objective reality. In On the Genealogy of Morals (1887), he wrote, “there is only a perspectival seeing and only a perspectival knowing.” Nietzsche prepared the way for overturning the dogmatic schools of philosophy that had stood for over two millennia.
History swallowed up Western philosophy in the next century. No one worked harder to overturn traditional philosophy than Martin Heidegger. In Being and Time, he claimed that transcendental categories, such as the knowing subject and objectivity, were just expressions of particular horizons. Not only did Heidegger join the Nazi party, but he also became Rector of Freiburg University, hoping to bring it in line with Hitler’s racist ideology. Heidegger’s ambition was a grand refounding of the University through a reformulation of philosophy. He argued that traditional philosophy had buried the authentic person and the world around him under notions of the knowing subject and substance. Heidegger claimed that “moods” were superior to analytic thought, death was the door to understanding life, and the human experience was marked by being thrown into history, much like a foundling abandoned at the doorstep of an orphanage. Heidegger offered a quasi-mystical path to secular salvation through the embrace of “resoluteness” and “authentic death,” made possible in the “historizing” circumstances of one’s “heritage.” Heidegger’s ideology amounts to this: being authentic requires a radical openness to the “inner truth and greatness” of Nazism. He justified his politics through his intellectual project, which substituted feelings for thought and the community for the individual. Not everyone idealized the mythical “Aryan” race; Jewish scientists and scholars fled Germany, Albert Einstein notable among them.
The natural sciences bristled at the notion that their methods and achievements were historically and culturally contingent. These disciplines held to the discovery of scientific facts by posing new hypotheses and refuting existing theories through logical contradiction and empirical testing. With the sciences thus gone their own way, the next generation of continental thinkers, from Hannah Arendt to Jacques Derrida, searching for their philosophical voices, accepted Heidegger’s historicity as their point of departure. The primary drivers of continental philosophy from the 1960s onward became community before the individual, language before consciousness, and radical otherness before individual responsibility. The new goal of philosophy became cultural criticism, an instrument of social change.
What was decided by the new nobility (the café society of Paris) was exported to American universities. Armed with French Theory, American scholars, with pragmatic insistence, put theory to work. Using terms like “problematizing” and “deconstruction,” these scholars would bang their gavels, condemning ancient history for insidious forms of power and old books for being storehouses of subjugated meanings. Under the spell of Heidegger and French postmodernism, my undergraduate work was full of fancy neologisms and incomprehensible insights. In its heyday, this movement of American intellectuals promoted multiculturalism, which did have beneficial effects. My peers and I took courses in African and African American philosophy, and white and non-white students believed that we were capable of mutual understanding. My generation had not yet reached the current cultural conviction that whites need unending expiation for their predecessors’ sins, inherited through structural racism.
Unlike the visionary explorers who embarked on uncharted voyages, my professors were a new priestly caste that taught my generation to see the waymarks and signposts, once part of the journey of traditional liberal education, as expressions of a dominant Western culture. It was all the rage to deface a revered text with the “dead white male” moniker and move on to the next. As anti-foundationalists, we breathed the rarefied air of deconstruction and cheered the leveling of high and low, original and imitation, truth and falsehood, good and evil. We felt comfortably superior to the thinkers of the past. The only thing left standing was the novel cultural orthodoxy with a democratic twist. Nothing stands before or lies beyond “culture.” And all cultures are equal, but some are more equal than others. As students, we saw ourselves not as soil tillers but as torrents moving violently through “the canon,” exposing all its inconsistencies and ambiguities. We understood the authors better than they understood themselves, or so we thought.
In this century, we walk along the rubble-strewn path of what began as an esoteric European philosophical flood. Google’s Books Ngram Viewer shows how often a word or phrase appears in scanned books over time. Search for the word “civilization” and then “culture,” and you will find an inverse relationship, especially from the mid-twentieth century to today: “civilization” declines while “culture” rises quickly, peaks around 2005, and holds steady. Merriam-Webster announced that culture was its word of the year for 2014. As an idea, “culture” is not something cultivated; it simply is. It is the meaning of meaning, so it lacks formal, logical, and temporal stability. It is also a vapid, banal filler word that turns the concrete into the abstract. It is amorphous on the one hand but all-pervasive and controlling on the other, because it imposes itself on the individual as a group identity.
“Culture” comes before human thought; it predetermines and encodes itself into the psyche or human soul. “Culture” is amoral and offers no directions if you feel lost and seek meaning outside of “culture.” Instead, “culture” is like those whimsical directional signposts that point to many places, real (Sherwood Forest) and fictional (The Enchanted Forest). Such signposts do not guide travelers, so the sojourner wanders in the wild. How does this lack of direction affect the practical judgments of weary travelers?
The American roadside gas station, brought to perfection by the Texas company Buc-ee’s, is as worthy as the Great Sphinx of Egypt of being deemed a cultural monument. Some groups believe that certain cultures should be banished, so they pull down monuments. Other cultures that most of us will never encounter, like the pygmies, are watched like spotted owls for fear that they will go extinct. These divergent actions pose a problem for people who claim to be anti-foundational, since they must ask themselves, “On what grounds can anyone agree that one culture is worth preserving and another worth purging?” The only honest answer, it seems, is that a sizable group feels strongly enough about saving or casting out a culture and that this feeling takes the shape of a value. Yet these momentary consensuses raise the question: what happens when feelings suddenly shift in opposite or altogether new directions?
Moreover, what happens when oppressive swathes of emotion and sentiment encircle individuals who do not share them with the larger group? The majority never condescends to individuals but demands their obedience. As Tocqueville observed, this passionate majoritarianism lacks the stabilizing forces of tradition and is made worse by the rapid changes democracy brings.
In America, where fortunes are won and lost in a day, and almost anything can be bought and sold, “culture” changes hands quickly among a mass of people who want things fast, cheap, and easy. American popular culture is expressed, consumed, and transmitted through osmosis. Those who look down their noses at mass culture are in no position to defend civilizational boundaries since they are, after all, Americans. Their critics cast them as cultural elites, a stodgy and dying breed who still frequent country clubs, go to the opera, and attend social parties. Coastal upstarts who drink almond milk, attend branded elite universities, and listen to NPR podcasts at cafes are replacing this sham aristocracy. The new cultural orthodoxy tells these elites to practice cultural sensitivity when encountering different cultures. This task is not very burdensome; all it takes is a bit of moral exhibitionism. Buy the right brands, swear off the wrong ones, and whitewash unpleasant thoughts. In this easygoing moral landscape, people justify their comfort and self-satisfaction.
Organizations also peddle the new cultural orthodoxy using office speak and bland buzzwords. For example, everyone welcomes a culture of transparency in the workplace as an antidote to what people call “toxic work culture.” If honesty were the best policy, the organization would replace the word toxic with lies or corruption. But no one is that blunt. It’s easier for the bosses to say that they are fostering an environment of greater transparency than to promise to tell the plain truth. These so-called changes in institutional culture aim to appear good without being good. They add tinsel to rotted trees.
The new cultural orthodoxy encourages people to identify with collectivities over rare personages. When twentieth-century literary criticism turned the heroic genre into social commentary, generational cultures became the main characters of our age. As Tocqueville noted, in democracies, the individual melts into the crowd, and society takes the place of the community. Gertrude Stein invented the “lost generation” for those who served in World War I. The “lost generation” was followed by the “greatest generation,” the baby boomers, and Generation X. Whatever the merits of these generational labels, such arbitrary naming and cut-offs promote herd mentality and monolithic behavior of the worst kind. The obsession with generations encourages fatalistic empty talk about an anonymous society in which young people see themselves as cultural bystanders. For example, if the popular press is to be believed, the stereotypical millennial cannot drive a car, buy a home, get married, or have children. No one cares to do anything about these foggy observations except to begin groaning about how Generation Z grew up with a bottle in one hand and a smartphone in the other.
We also speak of “culture” as war, an intergroup conflict over meaning and values. As James Davison Hunter observed in 1991, the culture wars in America were the province of the clash between the orthodox right and liberal secular society. Today, everything finds its way into the vortex of the culture wars. The right speaks stridently and hammers its opponents. Politicians fearmonger about cultural mongrels invading our lands and infecting American civic culture. A rag-tag contingent takes the fight head-on. What was once old-fashioned good ol’ boy bigotry is repurposed as white identity politics and economic populism. This ethnocentric cultural group colonizes the Republican Party and exiles anyone who dares defend the Constitution before their Dear Leader, Donald Trump. Trump’s offspring see themselves as the victims of “cancel culture” and call for a “national divorce” between blue and red states. Not to be outdone, the governors of Texas and Florida strut their tail feathers by “owning the libs” and banning acronyms they don’t understand, such as CRT (Critical Race Theory). Anxious about their impending demographic defeat, many cultural conservatives beat a hasty retreat and hermetically seal themselves off from “the culture.”
These days the left wields a more subtle and complex cultural sword through identity politics. It conquers “culture” through amorphous values, such as inclusion, and defends its new acquisitions by policing insensitivity, appropriation, bias, erasure, racism, and sexism. It dispenses punishments by shaming, de-platforming, and exiling people for cultural misconduct. American universities are in the midst of this kind of cultural revolution. On campuses nationwide, students shout down speakers, and administrators cower by disinviting guests and canceling events that offend the dogmas of progressive identity clans. Students target faculty for what they say, write, and teach. The shock troops push stalwarts of free thinking to the right. Surveys suggest that some 80 percent of college students self-censor on campus, and so do faculty. Once the outspoken leaders of dissent, the left has ushered in its own form of Foucauldian power. Liberal education, which promotes reasoned discourse for the sake of public deliberation, is consigned to self-indulgent nostalgia.
The new cultural tribes transmit their rhetoric beyond the classroom, into the Twitter-sphere and other forms of social media. They make no distinction between the sacred and the profane. All chase their cultural prejudices in this condition of anarchy, brutalizing each other as if it were a blood sport. The battle lines drawn, these cultural combatants rush each other with their lances poised to attack. Depending on your opponent’s tactics, you may be called racist or woke. They are engaged in a cultural civil war for the grand prize of promoting variable meanings. Life in the state of culture is, as Hobbes said, solitary, poor, nasty, brutish, and short. Like all wars, this one is waged by the ambitious, who want to get their hands on the steering wheels of “culture” to help them capture this century’s “commanding heights” of government, education, tech, and media. They use cultural voids and confusion to feed people’s fear and anger. The old political faith, liberalism, is boarded up and left derelict.
What image can help us consider this descent from our original faith in first principles and liberalism into this cultural pit? A great mountain peak once acted as a water tower and fed many tributaries. In the distant past, a few people lived at the foothills. They would go up, drink its crystalline waters, come back down, and tell others about what they saw. The people who followed the mountaineers were to spread their message to the thirsty people in the plains and valleys. But there has been a long drought, so people stopped climbing the mountain and began fighting for the remaining water. The people no longer remember the mountaineers nor cast their gaze to the peak. Here is no water but only rock.
Begin again. What could “culture” mean for us in the twenty-first century? Is it a wild and sprawling garden, a natural haven belonging to all? This image expresses a good and noble goal, but I don’t think our reality fits this Romantic depiction. What I have described in this essay is more like the vegetation of abandoned and decaying city lots, which is uncultured. While the foliage is spontaneous and sustainable, this uncared-for growth is ugly. No one wants to stop and look at the weeds and rubbish. Mine is a dour outlook, I admit, so I hark back to the word’s original meaning. A culture needs tending: feeding the soil, watering efficiently, pruning, and weeding. A gardener has a purpose, cultivates his garden, and sows his seed in good soil. Even wild landscapes benefit from stewards.
“Culture” is shapeless when viewed from afar. This distance weakens us and gives the impression that impersonal forces “structure” us. I do not think this tendency toward abstraction is a healthy philosophy; it makes ordinary people indifferent to human cruelty and personal suffering. In the words of Voltaire, “We must cultivate our garden.” In our current circumstances, we should use language responsibly wherever we use it: the classroom, the boardroom, books, essays, blogs, and tweets. Whether we use language to obfuscate and hide the truth or to express ourselves clearly is a choice. Despite our very real limits, I advocate a deliberate use of language, honest interpretations of enduring texts, and an acceptance that logos (reasonable speech) is the outward form by which we express our inward thoughts.