
Tropic Discourse

I suppose I have been interested in metaphor and other tropes since I went to a conference on “Metaphor and Thought” back when I was in grad school in the 1970s. No doubt it goes back further, to creative writing in high school, or to the first college classes I ever took, in poetry and in creative writing. I know I spent some time in grad school with a colleague from the AI program at UC San Diego, trying to figure out how one might start to develop language comprehension programs that could understand metaphor. For various reasons, among them my education, my work as a research psychologist, and my sense of myself as something of a literary person, I believe metaphor is central to both language and thinking.

I remember an undergraduate friend who, for whatever reasons, thought she wanted to study clinical psychology at Ohio University in Athens, Ohio (not the Ohio State University, with its football consciousness). I think maybe her boyfriend was going there. But in her interview there she was told by two young faculty that if she wanted to become a psychiatrist, she should get a medical degree first, but that if she really wanted to understand human minds and emotions, she should read Kafka and Dostoevsky. That was long before the whole subfield of Emotion became the hot area of study it has been in the last ten or twenty years, a neglect that seems odd only if you think people go to see a shrink because they feel bad, not because they can’t think properly. I recently discovered that no less a figure than Noam Chomsky, one of my intellectual heroes back in the days when he was still a linguist inventing Transformational Generative Grammar and I was studying psycholinguistics, has also said that reading literature is really the best route to understanding human mental life. I also had a young colleague in the European Society for the Study of Science and Religion who won an early scholarly award for a paper explaining her motivation for leaving a graduate program in neuroscience. She had come to believe that the operationalization of many of the concepts needed to understand consciousness (operationalization being how we define concepts to make them more easily measurable, one of the hallmarks of empirical research) led to such brutal oversimplifications as to render much of the research trivial, and that reading Kierkegaard would make for a substantially more enriched and fulfilling life. Indeed, part of the motivation for my early retirement from academic psychology was that I thought understanding the narrative that defines emotion, mind, self, and even soul might be better served outside it.

I always liked teaching about Freudian psychoanalytic theory in a course on personality, as Freud, along with his protégé Carl Jung, was willing to think about the overall structure and history of human minds rather than doing the infinitely fragmented piecework of too much empirical research. Who was ever going to do the synthetic work of pulling all the pieces together to present a picture of a whole, integrated human life? I did manage to get my department to make the course in Theories of Personality one of our “integrative” junior-level courses, to do exactly that. There are plenty of empirical problems with depth psychology, of course, which may mean not so much that it is untestable or unfalsifiable as that it is difficult to formulate in testable or falsifiable hypotheses, not that one cannot. But one of the annoyingly simplistic critiques included arguing that Freud’s ideas, say about fixation on the anal stage, are not literally true.

Freud’s psychosexual stages are defined by different “border crossings” between what is inside and outside one’s body, consisting of the orifices by which such commerce occurs, which are both lubricated by mucous membranes and rich with neural receptors. While it is empirically true that severity of toilet training is neither necessary nor sufficient to produce the concatenation of personality characteristics described as an “anal” personality, this particular set of characteristics, of being hyper-organized, overcontrolling, and obsessed with cleanliness and order, has been empirically shown to co-occur. But why should its origin be restricted to toilet training? Certainly the anus is one of those orifices, lined with mucous membranes and rich with neural receptors, that is likely to be, at some point, one of the foci of psychosexual development. But wait, isn’t this really just a synecdoche, one of those interesting tropes wherein a part is taken to represent the whole? Indeed, isn’t this whole stage of psychosexual development about how the growing child learns to control his body in socially prescribed ways? Certainly different cultures have different rules and expectations about the elimination of waste, but there is a huge range of “body-control” behaviors that a child is also learning: to sit still in church, not to bite one’s sister, that there are circumstances in which there are different expectations about vocal volume, that falling down screaming in a grocery store aisle to get one’s way is not likely to be effective. And aren’t those “anal stage” characteristics really, as another Freudian protégé (actually of Freud’s daughter Anna), Erik Erikson, put it, a psychosocial stage of “autonomy vs. shame and doubt”? Please, class, would anyone care to speculate as to why Sigmund Freud won the Goethe Prize for literature rather than a Nobel for science (or at least medicine, if Freud is to be identified as a neurologist)? Because much of his understanding of human development is decidedly not literal! So if a critique takes its use of tropic discourse literally, and then critiques it for not being literally true, is the problem with the theory or with the critique?

There are larger problems in our culture of text-based electronic communication, some of which I have addressed in this blog before. I’ve recently begun to participate in several area “story slams,” where one is given a topic or theme and asked to tell a story about one’s life that is “true to the best of one’s recollection.” Unfortunately, even here, where one expects more literary types to be, well, a bit less literal and more likely to use various alternative tropes, I was surprised to find that so many of the stories about “scars” were about actual physical scars, rather than more interesting psychological or emotional ones. We identify people by their scars, and despite the truism that what doesn’t kill you can make you stronger, it can also leave you crippled in any number of unhealthy ways. Even a slam on “skeletons in the closet” included a number of stories that were literally about skeletal remains.

It is certainly the case that contemporary college students, especially in the transition from Millennials to the iGeneration, seem to have increasing difficulty in comprehending nonliteral discourse. (Or at least tropic discourse. When I regularly asked students how likely it was for one of their peers to actually do something they said they would do, the average answer was “about 50%.”) I once got called on the administrative carpet over complaints about a dramatic rendition I did, during an account of the Freudian psychosexual stages, of a homophobe’s overwhelming disgust response to a question about homosexual practices. To be fair, of course, despite our pornography-laden culture’s inclusion of sodomy within the normative repertoire, some of my more uninhibited gay friends are willing to talk about the written guidance available even for its practitioners to overcome their disgust responses. For whatever reasons, the gay students in the back row of the class, apparently perfectly well understanding my brutal sarcasm, found it hilarious. But the innocent coeds in the front row, either inattentive or unaware of my dramatic tendencies and the uses of ironic discourse, lodged complaints that my presentation “might” be offensive to gays. One wonders what stereotypic understandings might have led to this judgment, and whose prejudices were actually on display. When I refused my Dean of Faculty’s suggestion that I might consult with some senior member of my department (being myself some 25 years post-tenure, decades into a senior professorship, and a recipient of “senior merit”), and he threatened to “interview the class,” I told him to go ahead. Months later the complaints were adjudged “unfounded,” though I was cautioned that I might better “frame” instances of nonliteral discourse.

In a culture of text-based communication, you have to frame your use of tropes, removing all subtlety of discourse, the “what if” possibilities of tropic discourse, by making explicit that “it’s really not.” What if the atom were really an orbital system? How far can we push the model? Are billboards really warts on the landscape? The meta-historian Hayden White’s Tropics of Discourse organizes historical development, Freudian defenses, and even literary genres according to a sequence of tropes. It is true that in spoken discourse there are a host of nonverbal cues to a wide range of nonliteral uses, cues absent from or more difficult to discern in text-based communication. And it is about interpretation, and the necessity for interpretation, in almost all symbolic discourse. It is interpreted whether you think so or not. As with “original language” interpretations of the Constitution, I once had a colleague insist, in a judicial committee adjudication, that her interpretation of a bylaw was the only legitimate one. Happily, as chair of the committee, I could overrule her by pointing out that the majority of the committee thought otherwise. But one does wonder about the political and cultural consequences of “dead metaphors.” If we accept that our years be enumerated using the “Common Era” designation (CE) instead of Anno Domini (AD), are we being more inclusive of non-Christian cultures, or merely codifying an acquiescence to colonialism?

In Tropics of Discourse, as Hayden White points out: “Tropics is the shadow from which all realistic discourse tries to flee. This flight, however, is futile; for tropics is the process by which all discourse constitutes the objects which it pretends only to describe realistically and analyze objectively” (p. 2). The conventional technique, and the one upon which the “critical thinking” of a liberal arts education putatively depends for civil discourse, assesses validity by checking an argument’s fidelity to the facts, and then its adherence to logical consistency. Indeed, the myth is that it is on the basis of such arguments that we hold or assent to some position or other, that they are the grounds on which it is to be defended, and that this is the only really justifiable method by which education is supposed to occur. The deepest flaw with this presumption, both in higher education and in public discourse more generally, is that this isn’t really how people form or sustain their most important beliefs. Arden Schorr, of the Cultural Cognition Project at Yale, provides evidence that the intellect not only seeks out knowledge but also works to protect identity, and that it is the latter that has hegemony in cases of conflict. In such cases, providing more information only polarizes positions further. Not only can this be seen in much of our current public discourse but, unfortunately, even in the identity politics that has overcome most “critical thinking” in the “political correctness” of social intercourse even at the college and university level. Another rationale for my leaving it. As Hayden White argues:

“This critical technique manifestly flies in the face of the practice of discourse, if not some theory of it, because that discourse is intended to constitute the ground whereon to decide what shall count as fact in the matters under consideration and to determine what mode of comprehension is best suited to the understanding of the facts thus constituted” (p. 3).

Indeed, White argues that the very process of understanding is, of necessity, tropological in nature, since it involves turning the unfamiliar into the familiar, generally by using figurative tropes. He further asserts that there is an archetypal plot to all discursive formations, which moves from an original metaphoric characterization of a domain of experience, through metonymic deconstructions of its elements, to synecdochic representations of the relations between superficial and essential attributes, and finally to the contrasts or oppositions that can be discerned from those representations. White suggests that these “diataxes” are not only mirrored in our very processes of consciousness, but also underlie all of our efforts to endow our world with meaning. I find it fascinating that the last phase so clearly illustrates the “poetic logic” of the relationship between irony, seeing something as other than what it seems to be, and the very processes of critical thinking found, say, in Piaget’s stage of formal operations, specifically in looking for and reasoning with counterfactuals.

Formal operational adolescents can understand that the way one acts in one situation need not match how one acts in another. They understand the difference between the posing of a persona and a full actual self, even if the personae of electronic communication are a facile escape from the vulnerabilities of genuine, face-to-face interaction, where one cannot control all the ways one’s being may utter forth. The crystallization of this archetypal tropic plot in young adults gives them a power of thought that is both conscious and self-conscious, making them capable not only of logic, but of irony, of being able to say one thing and mean another, or to mean one thing and say it in a host of alternative, even mutually exclusive or illogical ways. So it is imaginative or even poetic thinking, “the most shocking metaphorical transfer, the most paradoxical catachresis, the most contradictory oxymoron, even the most banal pun” (p. 21), which can give us the capacity to bring even logical thought under criticism and questioning, whether in its presuppositions, its structure, or its adequacy to an existentially satisfying relationship to reality. What might otherwise be a regression in service of the ego may, however, become not merely a regression but an intellectual collapse, a deep failure of comprehension, when we find a metaphor being taken literally.

I’ve always had a problem with so-called Biblical “literalists.” Never mind that the first few centuries of Christianity only showed a proto-orthodoxy emerging around what any number of alternative, gnostic sects effectively called “the literalist heresy.” I say “effectively,” because there were not yet clear distinctions between objective history and mythologized versions of the past until well into the classical era, and then only among the literate and educated. Even a distinction between orthodoxy and heresy would have been historically dependent upon the emergence of an orthodoxy, an idea particular to Christianity that there is a “right belief” enforced by a priesthood that could not exist without it. The religious scholar Huston Smith said that it may have been this orthodoxy which gave Christianity any historical purchase. If “everyman” can experience the presence of the Christ, as Saint Paul did, using the same language to refer to this experience as the apostles who actually knew the man or could have had any “literal” experience of a risen Christ, then what need is there for a priesthood or the orthodoxy it enforces? Indeed, I have heard it argued that the Fourth Gospel, attributed to John, was constructed at least in part to counter some of the dangerous “gnostic tendencies” of early Christianity. I do love the poetry of this Gospel, “In the beginning was the Word” (Logos), but it is also the only Gospel with the story of Doubting Thomas (Thomas also being the name of the fifth, ultimately “noncanonical” Gospel, which consists of sayings of Christ, many of them pretty gnostic in character). If there is just one truth, then why multiple stories of so many events, all the way back to the two stories of creation in Genesis, the many midrashes of stories from earlier books in later ones, particularly in the Christian corpus of the “New Testament” drawing on the Torah, and at least four gospels? A midrash, by the way, is just an updated or revised version of an earlier story, common among the Hebrews, like Leonard Bernstein’s midrash on Shakespeare’s Romeo and Juliet, called West Side Story (which was originally penned as an East Side Story, a love story between a Jew and a Christian, adjudged too radical even for 1960s Americans, who would hear Martin Luther King, Jr.’s casting of himself as Moses in his famous “I have been to the mountaintop” speech). There may have been as many as 70 gospels in the first centuries Anno Domini, but even the existence of the four canonical ones might lead one to respond to someone declaiming the Gospel Truth, “which one?” But why in God’s name would a good believer in Christianity, whose best-known Incarnation taught largely in parables, insist on Biblical literalism, never mind the translational variations produced over the millennia? When Jesus talks about removing the log from your own eye before worrying about the speck in your neighbor’s, was he just talking about pieces of wood in people’s eyes? Perhaps you have to think that if you bear such a large metaphorical one in your own. Wouldn’t that be ironic?

I do feel pressed to say something further about “orthodoxy” itself. Christianity and Islam seem to me to be the only well-known world religions based on holding to a particular set of beliefs, rather than to a “right” set of practices, an orthopraxy. Generations of Hebrew Talmudic scholarship suggest a pretty wide range of interpretation, though there are also Jewish “orthodoxies.” But something here rankles pretty deeply for anyone raised in the Western educational tradition, particularly in its recent incarnation (yea, that’s a metaphor) as scientific reason. We have been taught (I used to teach this rather explicitly) that there is a difference between an opinion, which might have its origins anywhere from the unquestioned authority of “what you were always told” down to hearsay, gossip, or personal idiosyncrasy, and a position, putatively formed on the basis of logic and evidence, and defensible in terms of them. Sadly, it is psychology that has much to say about the origins of opinions, and about the vociferousness or even violence with which they can be held in this era of identity politics, a whole world of research that would seem to belie the liberal arts presumption of higher education (that you can actually change people’s beliefs on the basis of logic and evidence) before it veers off in the direction of the “right thinking” of Political Correctness. Still, on too many college campuses today, one of the most common defenses against having to think any further about a particular position is to deem it “offensive.” And in particularly sensitive areas, even documented justification in the name of academic freedom is no defense; the standard is no longer some objective one, or even a “reasonable person,” but, quite literally, any time there is more than one complainant, even if the complaints are incommensurable and asynchronous. But never mind the tamer forms of shaming and social disapproval, behind whatever administrative closed doors, and upon whatever administrative carpets, for whatever opaque motivations.

Historically, both in the Christian tradition and even within ideological ones with no religious claims whatever, one could be forced, under whatever kind of duress, up to and including unspeakable violations of both psychological and bodily integrity, to profess the “correct beliefs,” or face serious sanction, “re-education,” or even the cleansing death of the “secular” punishment following an auto-da-fé. Once put to the question, the answer is already known. Lacking “due process,” or any presumption of innocence, without any opportunity to face one’s accusers, to cross-examine, or even to rebut, the inquisition can grind out its feasance of terror. Even if, under torture, I am perfectly willing to say whatever you want me to say, it may take me a while to figure out what it is you want me to say (since it is obvious to you, as it should be to anyone actually holding the correct beliefs), and then to convince you that I really believe it, rather than that I am only saying it in order to avoid further torture. To do so, I may need to twist myself into acts of self-deception so arcane as to appear justifiable even by the logic or evidence that I would normally require to justify them. But then, not all of my beliefs have ever been so held anyway, so why these? It seems to me that it requires a logic bent to the point of self-contradiction that I should be able to profess something, however meet and right it might be to do so, without thinking it true.

Ultimately any critical tradition has to be able to justify itself, and there are plenty of arguments that any mature tradition becomes critical of necessity (Phil Cary talked about this at a 2016 IRAS conference on “How We Can Know,” in his talk on “Knowing Traditions: Self-Critical Rationality,” later published in Zygon: Journal of Religion and Science). In his 2009 book Reason, Faith, and Revolution, Terry Eagleton suggests that there might be a more nuanced view of religion than one that reduces it to a flawed explanatory system based on unsupported beliefs about a supernatural agent, a view that might have value in providing noncircular justifications of rationalism and in tempering the political self-contradictions of an ideology of tolerance and diversity. Most important is Eagleton’s critique of self-origination, self-authorship, and self-sufficiency, which presume, as Stanley Fish’s review put it, “to pull progress and eventual perfection out of our own entrails.”

Here below the Tropic of Cancer, one becomes far less concerned with either progress or perfection, and more often with grace, or forgiveness, or simple generosity, or even one’s own consciousness. Attitude does change with latitude. “Well, mon, we be jammin’. Don’t worry, be happy.”
