
How Do You Think You Are?

We all take as a given, like fish in water, a cultural and historical preoccupation with the “self” as something to nurture, protect, express, develop, and share. Yet this is a historically recent cultural phenomenon, not rooted in our biology, emerging only in the late 16th and early 17th centuries. Moreover, as the anthropologist Clifford Geertz once observed, however incorrigible it may seem to us, it is a “rather peculiar” idea in the context of the world’s cultures. So it is not only historically recent, but largely restricted to what cultural psychology calls W.E.I.R.D. cultures: Western, Educated, Industrialized, Rich, and Democratic.

In the West, we often trace much of our intellectual and psychological heritage to classical Greece. Even here, of course, we need to remember that the idea of a nation is modern: early modern Italy was a collection of little kingdoms; Germany, well past the Reformation, was a collection of principalities; and Greece, not really a “nation” until the 19th century, was a collection of autonomous and quite distinct city-states. Nevertheless, while some of our cultural ideals may be so traceable, I think we would have found much of the Greek psyche (sometimes translated as “soul,” and certainly the root of “psychology”) alien to our ideas of selfhood.

The warrior ethos of ancient Greece valued physical strength, the closeness of comrades, and the virtue of glory in battle or public greatness. For Homeric Greeks, this level of achievement was a rare virtue (arete) which could be kept from you by fate (tyche). Virtue was an achievement, not a state of the psyche as it became for the later Stoics and Christians. It was not something that abides quietly in private, but something short, glorious, and very public. Even in the recent (2004) film Troy, we see Achilles leaving his mother’s side, and the promise of a long life, of children and family, to make history. When the boy who fetches him to fight a battle for Agamemnon says he does not have that kind of courage, Achilles replies, “That is why no one will remember your name.” When Achilles leads his Myrmidons onto the beach at Troy, he tells them that is where they will find immortality: “Seize it.”

Psyche is the breath of life; like the Hebrew word nephesh, also translated as “soul,” it is more than mere breath, but less than the whole individual mind or soul. It can wander around in sleep or in a swoon, and certainly leaves at death, but it is not the cause of behavior or awareness. Actions are directed by the phrenes, located in the diaphragm and associated with rational planning; by the thymos, the heart and emotion; and by the nous, the organ of knowledge, of accurate perception and clear cognition (which could be clouded by thymos), none of which survive death. But even the psyche in the afterlife is a bit of a mental cripple, looking like the body at death, including its wounds -- which is why Achilles’ mutilation of Hector’s body is an atrocity. Even motivation may be less internally caused than an act of one of the gods, as when Achilles, holding fast behind his shield, attributes victory to Ares. Much of the action in the Iliad originates not intrapsychically but with the gods. Even the immortality of a dualistic existence among the Platonic Forms, later co-opted by the Christian heaven, or of the Active Mind of Aristotle, is abstract: not individuated, not changed by life, not equipped with mortal memory.

Part of our debt to Hellenic culture is a nascent individuality, a possibility of action autonomous both from the power of a dominance hierarchy and from the smothering conformity of tribalism and unquestioned group loyalty. Yet the tensions that autonomous individuality presents are far from resolved, or even resolvable. Without the 300 Spartiatai dispatched to stand and die at Thermopylae in 480 BCE, a Hellenic tradition of reason and democracy might never have survived. But these were not lone souls. These homoioi, these peers, these equals, bore with them a tradition, inculcated since childhood, in which exercises in the management of fear were designed over generations to promote an esoterike harmonia, a state of self-composure, of self-mastery, which might produce exoterike harmonia, the symphony which delights the ears of the gods. It is the latter which unites lover to lover, husband to wife, and guides the phalanx to move and strike as one. In politics, it produces a polis of concord and unity to which each individual donates the noblest expression of character. These Hellenes were free individuals, but two fought together as a dyas, and in the phalanx each man’s shield was as important in protecting his comrade as himself. Indeed, the loss of a helmet or breastplate in battle, carried for one’s own protection, was excused without penalty. Discarding one’s shield, carried for the safety of the whole line, was punished with the loss of all citizenship rights.

The Spartan advantage in battle was due to the glue that held fellow warriors together, a love believed to surpass all others but the love of a mother for her child. Their contempt of death was not for glory, nor for the selfish self that looks to its own preservation, but for the one way the gods permitted mortals to surpass them: to give all one possesses, one’s life, to spend one’s substance for one’s comrades. It never ceases to bother me that what would seem primary even to a Christian ethic, to give oneself to others, to sacrifice even one’s very life so that they might be redeemed, is belied by what is putatively the very motivation for producing moral behavior. Doing good appears to require not its intrinsic reward but that it be instrumental for obtaining something decidedly less, even self-serving: that the sacrifice is not really a sacrifice, that one gets one’s own flawed soul, even one’s own flawed body, back for eternity (though here it is actually one’s “more perfect body,” whatever that might be). Is redemption a property that inheres in an individual, or is it a communal healing? For the Greeks, to abandon one’s comrades, or to prove oneself unworthy and thus to endow others with courage, was the worst fear. One conquered fear through its opposite: to forget all else and act for the one who stood at your shoulder, for love. But it was the men who had it easy. It was the women, bearing up under their grief, who provided the model for Greece to stand.

The Greeks understood the debt that freedom owed to the dyas, the phalanx, and the context of culture, socialization, and lifelong emotional shaping that made it possible. Yet we moderns seem to be left with a world of opposed alternatives, which Morris Berman, in Dark Ages America, summarizes as “secular” versus “tribal.” He argues that countries like Israel, Iran, and Turkey, as well as the “red/blue” divide in the U.S., represent such an opposition, reflected in a spectrum that runs from near-obliviousness to other human beings to the claustrophobia of tribal or traditional cultures where relationships always trump individual achievement.

We take pride in the Faustian progress of Western Civilization (Gandhi, asked what he thought of Western civilization, reportedly replied: “I think it would be a good idea.”). We take pride in the victories of reason over blind faith, of individual rights and freedom of thought over community authority or theocracy, of enlightenment over inquisition. But has the historical development of autonomous individuality, from the Greeks to the present, led to a universe of abstract bureaucracy, of the hegemony of non-human corporate entities, of a moral vacuum and unacknowledged spiritual despair? Are our ideas of our unique and autonomous individuality responsible for evident social decay, increases in interpersonal violence and family breakdown, and our desperate loneliness?

The Greeks of the archaic era, attributing their freedom to the coordination of the phalanx and the homonoia of the polis, had an ethos of economic equality (including rules for the sameness of appearance). Someone showing off their wealth was a “fish-eater.” One who ate alone was called monophagos (one who eats alone). Since it was the polis that made it possible for everyone to achieve virtue (arete), one who failed to participate in the public life of the polis and surrendered to the demands of the oikos, to live a quiet life at home, was considered an idiot (Greek idiotes), an individualist whom Pericles called “worthless.” The greatest virtue was sophrosyne: self-control and self-knowledge, with nothing in excess, including any rejection of the world or of physical pleasure. One enjoyed pleasure but was not controlled by it; one was not a slave to one’s passions.

Most important, however, both to philosophy and to the eventual development of scientia, was the critical tradition, best exemplified by Socratic dialogue: the questioning of assumptions and received dogma. A closed system of thought, like religious orthodoxy, where critics are heretics or infidels, is defended by attacking one’s critics. My colleague Phil Cary argues that mature religious traditions have a critical component (p. 17), but “political correctness” certainly operates in this closed way, whether opponents are branded “running dog lackeys of the capitalist warmongers” in a communist “re-education” camp, or shamed by accusation on American college campuses, where the best defense against critical challenge is simply to wax indignant at its being morally “offensive.” Open systems of thought, like a genuinely democratic system built to be altered by its own amendments, thrive on systematic criticism. One of the Vietnam-era speeches given by my father, a Lutheran Campus Pastor at Purdue University, argued that a genuine patriot is one who welcomes the value of dissent.

The purpose of systematic criticism is to improve ideas, a process traceable to Thales of Miletus (585 BCE), the first teacher to say to his students, “This is how I see things -- how I believe things are. Try to improve upon my teaching.” A genuinely critical attitude requires overcoming intellectual laziness and a natural hostility to one’s critics. One must separate a critic’s character (and intelligence) from the quality of their ideas, or else be reduced to name-calling and the heresy-hunting more appropriate to an inquisition than to a quest for knowledge. Yes, this can be hard to do, especially with one’s foundational beliefs, even when those beliefs are merely “what I have always been told,” which is why colleges regularly assert that critical thinking is one of their central educational tenets (though empirical research suggests that most students show little improvement and, indeed, even faculty can be rather thin-skinned). Critical thinking means finding flaws not just in another’s arguments, but in one’s own. As Oliver Cromwell once said, “I beseech you, in the bowels of Christ, think it possible you may be mistaken.”

People use their intellects to seek out knowledge. But they also use them to protect their identities, and when push comes to shove, the latter often trumps the former. The problem is that in “identity protection mode,” providing more information, however clear, does not persuade people; it backfires, and polarizes people even more. The problem is magnified in the modern era when, as Roy Baumeister pointed out in his classic review Identity: Cultural Change and the Struggle for Self (1986), the “self” becomes an obsession. The word “self” doesn’t appear in the Oxford English Dictionary until 1595. Our current conception of self begins to emerge only in the late 16th and early 17th centuries. In agrarian and traditional societies, identity is conferred by social status and lineage. The obsession with self and identity does not become a problem until the emergence of middle-class Western industrial society after about 1800, when one has choices about occupation and ideology (though fewer than we are currently taught from childhood), is able to explore alternatives, and can create a unique self. The problems this leaves us with are those of constructing some continuity over time and of differentiating ourselves from others. In a culture where we are taught to define ourselves not by what we share with others but by what makes us different from them, the cost is alienation from both self and other, social fragmentation, isolation, and loneliness, magnified by communication technology.

According to Baumeister, while identity in the Medieval era was assigned by location, class, lineage, and gender, there was an emphasis on individual judgment and individual participation in the Christian Church. As Ariès and Duby make clear in Volume II of their masterful collection on The History of Private Life, ideas about privacy lead to quite different understandings of the self. Historically, “privacy” meant being separate from public life, not personal separation: privacy simply meant the familial or domestic. Indeed, even in the classical era, the polis of participating citizens ended at the threshold of domestic space, where the master’s rule was law for all. If one’s home is no more than a single room, where the master’s bed is shared with children and even servants, it is hard to imagine how one might develop a sense of an independent self. Modern views of the self are often constructed upon a metaphor of private space, yet seating was commonly on shared benches until well into modernity; only with a growing sense of privacy and comfort came a greater segmentation and privatization of space, and the growing self-consciousness that might be imagined therein. One was rarely physically alone, and one traveled in groups; a lone traveler might be considered dangerous or deranged. Even the mansions of early modernity often didn’t separate rooms with hallways or, indeed, have separate bathing facilities, something hard to imagine for a contemporary generation for whom shared locker rooms are uncomfortable, and for whom even beach changing facilities increasingly have individual booths. It is only in the last decade or so that single-room dormitories have become common on college campuses, and the number of people living alone has increased precipitously, to a majority of households in many European and Anglo-American cities.

It was with the Protestant Reformation, at least for the educated, that religious belief came to be more of a choice, even as capitalism enhanced material well-being. Individualistic attitudes were on the rise, as were desires for privacy and a greater self-awareness. With changes in both church and state, including the waning power of the Church and the questioning of political legitimacy in the era of revolutions, the creativity, passion, and cultivation of an inner self in the Romantic era came to be seen as determined not by God but by the individual. In the absence of social consensus, one needed a personal ideology. In the Romantic literature of the 19th century, heroes struggle against social constraint, and by the 20th century there was a proliferation of occupational and ideological choices, along with a concomitant alienation from authoritative institutions. Individuals felt increasingly overwhelmed and helpless in the struggle for continuity and differentiation of self. With the decline of traditional religious guidance and the industrial revolution, with urbanization and the rise of capitalism, with the de-mystification and disenchantment of the world by science, the discovery of the unconscious, and the horrors of war, identity became the spiritual problem of our time.

The contemporary Western self is personal, private, chosen, bounded from others, and defended against violation. While virtue once included an awareness that even withholding truth is a form of dishonesty, such a belief is now well-buffered by notions of privacy. There are some questions for which “I’d rather not say” or “that’s none of your business” are appropriate responses, and many for which the very asking can be considered an offensive intrusion, boundaries which fluctuate historically (indeed, I have heard complaints about the offensiveness of even discussing this fact). In the early 21st century, reality often risks being overtaken by illusion, which becomes preferable to it, or even defines a limited domain of experience. Edward Sampson, the critical social psychologist, has said that it is now fashionable to advocate some degree of self-deception or distortion as an effective defensive screen against the complex, brutal reality of the modern world. There is no longer understood to be a privileged reality against which to make judgments; or rather, we understand that a politically privileged view is not an ontologically privileged one, and can therefore be hedged.

One of the fundamentals of evolutionary biology is that the extended childhood and extensive enculturation of Homo sapiens mean that our adaptations can be learned rather than be direct products of biological evolution. Our greatest strength is that we can change historically far more quickly than we could by genetic variation and selection. But it also means that much of who and what we think we are, and how we think we are, is not a biological given but a product of history. The rather peculiar Western view of the self is not a natural object, and there is no “real person” outside of history and culture.

While I did ultimately choose to become a psychologist rather than, say, an anthropologist or an historian, it has been one of my lifelong frustrations with academic psychology that, while child development was always a concern, and attention to neuroscience and biology has given evolutionary psychology disciplinary purchase, the discipline has all but pretended that history didn’t exist. Indeed, one of the gems of my undergraduate career was Kenneth Gergen’s 1973 classic article “Social Psychology as History,” wherein he reminds us that research psychology only studies the kind of individual that exists at this particular time in history (and, largely, in the context of Western culture). Since one cannot do controlled experiments in which history is a manipulated variable, any generalizations are necessarily restricted to contemporary Homo sapiens (and again, largely to the culture in which the research is done), a boundary beyond which the field has little idea anyway. When I tried to talk to colleagues about Ariès and Duby’s wonderful five-volume History of Private Life, I was referred to the history department, whose denizens, largely interested in political and military history, referred me back to psychology. The history of the psyche drops through the cracks. Research psychology tends to ignore what its methods do not give easy access to and, for all theoretical purposes, largely pretends that it does not exist, leaving a blinkered view of a species with millennia of history and wide cultural variation.

Happily, there is a tradition of ethnopsychology that runs back at least a generation, and, indeed, there is a prominent group of psychologists specializing in indigenous psychologies, specifically addressing the anthropology of the self. All cultures distinguish between self and other, but they differ on the boundaries. Contemporary Westerners focus on “what makes me different,” and so have a sharp self/not-self boundary, tending to the egocentric, whereas Asian cultures tend to have more fluid boundaries, with a sociocentric self defined in and through others: not by how I am distinct from them, but through who I am with or who I am like. For contemporary Westerners, the self consists primarily of a private interior, the more well-bounded the better. It is interesting how easily the metaphorical nature of privacy and interiority is forgotten in the literalism of tracing it all to brain function. We make sense of behavior through a vocabulary of an individual’s “inner world,” whereas, say, the Baining of New Guinea focus on external behavior and relationships. While we may justify anger as an infringement of personal rights, like violating one’s personal property, a Pacific atoll culture like the Ifaluk justifies anger in terms of an infringement on group well-being, like failing to share. Among the Ifaluk, even the use of “I” is considered excessively egocentric, and they tend to think more in terms of a collective “we.” For us, proper behavior consists in acting in ways that are consistent with our inner feelings; for the Javanese, acting on such feelings would be considered kasar: impolite, uncivilized, or vulgar. For the Balinese, it is the performance rather than the performer which is more important; performers are considered transient, so their individuality is muted. For us, a “performance” is a pretense that disguises real feeling, the real self. For the Ilongot of the Philippines, there is no gap between presentation and self, as there is no recognition of an autonomous self apart from outward behavior. Even modern, industrialized, and prosperous Japanese tend to have a more inclusive and harmonious sense of self; the notion of an independent actor bounded by and opposed to their environment is foreign to them.

As Rom Harré has said, “to be a self is not to be a certain kind of being, but to be in possession of a certain kind of theory.” In the contemporary West, a person is envisioned to have a richly furnished interior, is expected to behave consistently with their inner feelings, and forms clear boundaries between self and other. We have a kind of “possessive individualism,” in which we see ourselves, our capacities, and our feelings as possessions to be disposed of at will; we are even encouraged to “own” them. Native Americans are more likely to see even the self not so much as a private possession but as part of something greater, to be given care and nurtured.

The toxic nature of contemporary individuality is something about which I have written several scholarly publications, but who reads those? Still, one of the best accounts I ever read was published in 1990 by Philip Cushman in the flagship journal of the American Psychological Association, the American Psychologist: “Why the Self is Empty: Toward a Historically Situated Psychology.” He presents an account of the contemporary Western self as bounded and masterful, but empty, constantly needing to be “filled up” with material goods, experiences, identifications, and relationships. Lacking community, tradition, and shared meaning, the self experiences these absences “interiorly” as a lack of personal conviction and worth, and as a chronic, undifferentiated emotional hunger. Popular culture emphasizes consumption, soothing, and charisma instead of critical thought, and there is a nationwide difficulty in maintaining personal relationships. Not only is this concomitant with problems of social and even individual fragmentation, but the decline of social units leaves the self as the locus of salvation, with consumption and therapy as the cure and, I suppose, distraction as the symptomatic relief. The central cultural paradox is that the capacity for self-soothing that such an autonomous individual requires comes largely from parents increasingly less able to provide it, suffering from narcissistic wounds of their own. The result is the common feeling of wearing a mask, of presenting a false or exaggerated self increasingly at odds with interior experience and feeling. Unfortunately, psychotherapy itself contributes to this myth of autonomous individuality, belied by a world in which we are interdependent within a massive historical network.

Take any object in front of you, or even one worn on your body, and try thinking of it not as a commodity, with meaning and value intrinsic to the object itself. Think about how it was produced and distributed to you. I think of a pen: of the history of language and literacy, of the history and social distribution of writing implements, of the scientific research behind the plastics, the viscosity of the ink, and the ergonomics of its design, of the factories in which pens are made, of the capital which sustains them, of the people working inside them, of distribution, marketing, and sales, and all of the people involved in each of these steps, of my own development and learning, and of the biography of my personal use of this object. Everything around you is like this, including the systems of housing and shelter, of heating on a cold winter’s day, and the public goods of roads, communication, and education behind all these things. In an era when even the self has become commodified, it is hard to get our minds around just how much we are all in this together and depend on each other, even as we sustain a myth of far greater autonomy than we have ever had or will ever have. It’s not just who you think you are, but how you think you are, and the worlds within you that you have not invented and can never sustain alone. You don’t have to. You never did. As Robyn Dawes once said, “Our own cognitive limitations may lead us to confuse the cumulative technological advances of our society with the power of a single human mind. The fact that a lot of us, over a long period of time, with the aid of a printing press, [electronic] and verbal communication can create an H-bomb [antibiotics, a national highway system, the Internet, or a smart-phone, take your pick] does not mean that any of us singly can think very straight.” We are each, in fact, part of something much larger, the history of our species and its future on this planet, and we each have a role to play. Your contribution to what is larger than you, and what of yourself you give to it, is all the meaning your life will ever have, and it is finally not about you.
