From an existential point of view ... there can be no authentic (true) or inauthentic (false) self, but only an authentic way of Being-in-the-world.
~Miriam Donaghy, psychotherapist
At its best, authenticity can lead us to be more sincere and straightforward in our dealings with other people. At its worst, it can lead to the naïve belief that we will become our best selves without the hard labour of decisive personal growth. Plagued by the burden of relationships, infinite choice, and our own guilt, we have become hooked on “authenticity” as a source of faux salvation. But surrendering to some false “true self” may foster a new kind of self-complacency. It’s time to reexamine authenticity from an existential perspective: less as a destination than as our life’s work as responsible adults.
Besides limiting our potential growth, popular authenticity appears to feed us several false promises. Chief amongst these are the promise of absolution in the form of the “true” self, the promise of freedom through resurrection of one’s “inner child,” the promise of individualism through living the “unconventional” life, and the promise of social validation through public displays of vulnerability.
Absolution in the form of the “true” self
The tendency to split ourselves into “true” and “false” parts is as common within the realm of mental health as it is within authenticity culture. Belief in this basic duality underlies the work of highly influential psychologists like Donald Winnicott and R.D. Laing, for whom the “true self” was that spontaneous part of us which refuses to submit to society’s demands.
When taken to the extreme, this construct serves to sever the integrity of the self, and even to absolve us of responsibility for certain socially destructive or “maladaptive” behaviours. Compassion seems to dictate that we should neither be judged on the basis of these “false” parts, nor asked to own them, since they represent the masks we have had to adopt in order to survive a hostile environment. Instead, we are given to understand that we should be judged on the basis of a buried “true” self, whose virtue, creativity, and spontaneity would have almost certainly been brought to light had it been supplied with more fertile soil. There are, however, some rather dangerous consequences of this selective ownership of self-parts, which naturally excludes those regarded as less socially desirable.
While I am familiar with this type of splitting of the self from my psychotherapy training, I was first introduced to it during my early teenage years, when I attended an eating disorder clinic. There, discussion of our disordered eating patterns was governed by a set of linguistic conventions, which were uncritically adopted by the patients and their families. Most notably, we found ourselves using the subject I in front of any “normal” (i.e., prosocial, non-anorexic) thoughts, motives, and actions, while The Condition was used in front of those which were considered to be “abnormal” (i.e., antisocial, anorexic). The refrain “You are not your Condition” reverberated throughout those cramped, low-ceilinged rooms, cushioning our fragile bodies. How soothing it was in those moments to believe that this might have been the truth.
Except that it wasn’t—not entirely. The truth is that I was my condition. I was just as much the owner of my “good behaviours” (consuming an entire boiled egg without retching) as I was my “bad” ones (burying balls of food in my cheeks to spit out later). I would love to cast off those parts of myself for which I have contempt: she who was hell-bent on self-immolation, and she who dreams of a hermitic existence comprised solely of reading, writing ruminative essays, and watching Married at First Sight without guilt. Having dismissed these less desirable parts as alien intruders, it would be so much easier to love the remainder—nicely sanitized and fit for public consumption. But such consoling thought experiments can only be stretched so far before they collapse, bringing about painful disillusion. Regardless of my most cunning attempts at evasion, the fact remains that both the “good” and the “bad” are contained within one whole yet conflicted self—a self which belongs categorically to me.
Nonetheless, I do not begrudge the therapists at the clinic their misuse of language. Owning the darker parts of yourself demands that you finally acknowledge the threat they pose to your existence. I did not want my anorexia to be so real that I had to identify with it, and yet it was real enough to cost me the strength of my bones, the health of my heart, and the functionality of my relationships. While being shielded from the authenticity or “realness” of my own destructive potential alleviated my anxiety in the short term, it ultimately precipitated a much deeper crisis. Without being helped to take ownership of my tendency to starve myself, I was never fully persuaded of my own power to influence it.
While I was not and will never be responsible for certain “givens,” like my genes (my biological sister also developed anorexia despite having been raised by different parents) or the culture into which I was born (which appears to worship at the altar of the thin, hairless, and angular), I am responsible for how I choose to relate to these givens. On the surface, this realization often feels like a curse, because it is hard to distinguish between responsibility and blame. But upon digging deeper, there is the possibility of discovering your own freedom. And freedom, according to Sartre, is “what you do with what’s been done to you.”
Freedom through resurrection of the “inner child”
The clinic’s commitment to splitting the self into the categories of I and The Condition works on an assumption central to the humanistic model of the person. According to this model, we are each defined by an indestructible benign human “essence” or nature. This serves not only as a benchmark for the “truth” or “authenticity” of our motives but also as the source of all “positive” pro-social development. The founders of humanistic psychology, who include Abraham Maslow and Carl Rogers, believed that under the right facilitating conditions, we are primed for such growth, which is fuelled by the so-called “actualizing tendency”—an intrinsic human drive towards self-fulfilment, or in humanistic terms, “self-actualization.”
According to Rogers, listening to the spontaneous promptings of the gut (a procedure which he called the “organismic valuing process”) tends to guide us out of our self-alienation towards this preordained authentic self. Our current cultural fixation with finding, soothing, and cultivating our “inner child” likely has its origins in this psychological construct, which locates the “true” self in the unconstrained (i.e., not yet socially conditioned) human. If we are to subscribe to a simplified version of this particular narrative, our best bet at being authentic is to return to the inherent innocence and spontaneity of the child, who is uncorrupted thus far by the demands of culture and society.
And so, it is hardly surprising that we call for the liberation of self-expression from the shackles of normative expectations and apparently arbitrary cultural rules. We dream of getting rid of our profound self-consciousness—the apparent source of our “inauthenticity”—by being more like a child or like my grandmother who has dementia: each can be seen to be beautifully in touch with their subterranean cauldrons of unrestrained impulse. Each appears to be free of the almighty burden of dissonance: that dreadful feeling that arises whenever we speak or act against our internal thoughts, values, and inclinations. This feeling, which surfaces every day, multiple times a day, reminds us of our endless infidelity to our “true” selves: You’ve just cheated on me … again!
As romantic as the idea of uninhibited self-expression can be, the reality is distinctly less attractive. One might even wonder whether the benefits of this kind of “authenticity” are not far outweighed by the near-guaranteed social rejection it invites. Infants—still under the spell of their own magical omnipotence—demand what they want without hesitation or self-restraint. They are, some might think, little authenticity machines worth emulating. But reality proves otherwise. While the often inappropriate observations of children are usually met with affection and laughter, my grandmother’s remarks about people’s wonky teeth failed to elicit the same positive response.
And yet, it can feel surprisingly good to believe that we each have inside of us an “inner child” or a “true self” just waiting to be released, not unlike the embryonic xenomorph in Alien’s iconic chest-bursting scene. It feels good because it supplies us with a simple, tangible solution to a much more complex existential reality: reverse your own social conditioning (i.e., let your “inner child” out of its box) and you won’t feel cheated any longer. There is, however, one major problem with this popular formula. What we have rejected as our “inauthenticity,” our “infidelity” to the true self, or our estrangement from our “inner child” is one of the inescapable conditions of being a social animal.
Freud’s dynamic model of the human mind—flawed though revolutionary for its time—held that this very conflict between the demands of one’s interior and exterior lives is central to the human condition. For the rest of our days, we will be haunted by inner impulses, feelings, and thoughts which we will not, for a multitude of reasons, be in a position to act upon or openly communicate. This feeling of tension speaks not only of stifling social constraints but of our deep human sensitivity to the expectations and requirements of cooperative living. Since this sensitivity is crucial to the functioning of our society, the ensuing discomfort or sense of dissonance cannot and should not be fully eliminated.
Individualism through the unconventional life
Rogers was not the first person to think that our social and cultural conditioning was responsible for “perverting” our authentic nature. He was preceded by Rousseau and Hegel, both of whom advocated being “true to oneself” even at the cost of violating the social code. From this perspective—which seems to have influenced contemporary ideals of individualism—obedience to power and adherence to social roles are the worst kinds of self-debasement. However, our tendency to turn against anything that might be called “normative” and towards self-imposed guidelines presents us with a rather unsettling dilemma: In order to be an authentic individual, you must prove yourself to be unconventional.
It could be argued that today’s most reliable route to unconventionality is social nonconformity and rebellion, including the rejection of social roles, norms, and language in favour of new, self-ascribed identities. More generally, this involves taking a firm and highly conspicuous stand against the “status quo.”
Social nonconformity—absolutely necessary and morally right under certain circumstances—is physically, psychologically, and spiritually exhausting when chosen as a way of life. Demands for new, subversive, and ever more vulnerable bouts of “authentic” self-expression can foster states of perpetual rage and dissatisfaction—not just with other human beings but with oneself. Such demands can make us afraid. Afraid of being encroached upon. Afraid of being seen as an imposter. Afraid of being inappropriately “assigned” by others who will remain forever guilty of misunderstanding “who we really are.”
Problems can arise when the aim of our social nonconformity is not constructive social change but rather the right to be and to express our “authentic” self—a vague concept which is becoming increasingly hegemonic. Under such conditions, authenticity can become a social asset and a goal in and of itself, as opposed to the by-product of real social engagement. Many of us promote, publicize, and trade on authenticity as though it were a pair of designer sneakers. Needless to say, there is something inherently inauthentic about our self-declared pursuit of authenticity.
A serious limitation of this movement towards the unconventional (non-conforming) life is that, not surprisingly, the more followers the trend gains, the less unconventional it becomes. Thus, rebellion against conformity in the name of authenticity has the potential to become just a different, shinier vehicle for a new conformity. One has to wonder whether, in spite of our desire to be uniquely set apart, we experience an inherent and irresistible pull back towards the herd.
In a 2020 Scientific American article, psychology professor Jennifer Beer reveals that our feeling of authenticity conflicts with our mental concept of authenticity as a state of nonconforming individualism. According to Beer, this feeling is determined less by our perceived degree of alignment with our “true” selves than by whether we display certain socially prized patterns of behaviour. Research has shown that we feel most authentic when we are “being extroverted, emotionally stable, conscientious, intellectual and agreeable,” all of which constitute socially desirable and conforming qualities. Beer writes:
…when it comes time to actually make a judgment about our own authenticity, we may use criteria that are closer to how we judge the authenticity of an object such as food. A passion fruit tiramisu may be unique, but the authenticity of tiramisu is judged by its conformity to a conventional recipe. Similarly, it appears that the more we conform to social conventions about how a person should act, the more authentic we feel.
This shows that while we are eager to be seen as authentic, we remain afraid of the inevitable isolation that this entails. We long to be rebels and nonconformists—successful, self-reliant and “true to who we are”—but only to the extent that it feels comfortable. Perhaps we have settled on a happy compromise in flocking to subtler types of social conformity that can at least masquerade as individualism.
Self-worth through public displays of “vulnerability”
Woody Allen’s parody of the neurotic Jew living in 1970s-era New York is arguably one of the wittiest portrayals of the dangers of becoming “authentic” to the point of self-obsession. Although Allen’s protagonist spends decades mulling over the rather obscure, clichéd interpretations of his analyst, his own pseudo-introspection never quite translates into real self-awareness. This is most likely because he shows very little interest in or genuine love for other people, who appear to perform the sole function of listening to his interminable, self-indulgent monologues. Such a habit of excessive self-disclosure or public “vulnerability” is far less entertaining when taken out of the comedy context and into the real world.
Take for example the “personal essay,” the massive popularity of which appears to stem from our voyeuristic enjoyment of deeply raw, often unprocessed material. Take the carefully curated snapshots of staged connection that litter our social media feeds: spontaneous marriage proposals that just happen to have been filmed, intimate moments between couples captured by carefully poised phone cameras, selfies of individuals with pursed lips staring off into the distance. Amid all this, we cannot forget the influencer who succumbs to the cultural pressure to live her “authentic life,” or at least to broadcast the intention to do so to an army of devout, inspiration-hungry followers, who respond with virtual party poppers and clapping hands. In such cases as these, cultivating one’s public image and garnering social approval are placed before real authenticity, which can only be seen to arise out of genuine human connection and immersion in life.
Perhaps we would be wise to remain alive to our increasing fascination with mawkish confessionals and the swapping of pseudo-insights that smack less of self-understanding than narcissism. Our culture’s growing taste for conspicuous authenticity, coupled with our own unmet needs for validation, are a recipe for a highly addictive kind of solipsism.
Revival of an older “authenticity”: a word for people not objects
Our commodification of authenticity has the potential to draw attention away from real personal growth, the kind that happens slowly and effortfully behind closed doors. While “staying true to who we are” might free us from the cultural pressure to be “better versions” of ourselves, we also run the risk of restricting our development to the realm of the comfortable, the familiar, or worse, the fashionable.
The early existentialists—were they to be resurrected—would barely recognize our pop psychology definition of authenticity, which boils down to “letting go of who you think you’re supposed to be and embracing who you are.” Mostly, that is because they entertained an entirely different concept of the “self.” Philosophers like Heidegger and Sartre believed that existence precedes essence. From this perspective, the self is not some fixed nature or substance waiting to be discovered; rather, the self—ever evolving and unfolding—is a continuous process of becoming.
When viewing the self as a dynamic process, many of the problems that arise out of our assumption of its constancy seem to evaporate. We are, for example, no longer plagued by the impossibility of measuring absolute authenticity when the characteristics, wants, desires, and motives of an individual vary across time. When authenticity is understood as a state which can never be fully arrived at, we no longer expect that what was “authentic” at the age of 21 will continue to be so at the ripe age of 50. Under these conditions, we might not have a fixed substantial self to embrace, but what we can embrace is the certainty of our own evolution.
For those of us incubated within the cosiness of entitlement culture, authenticity of the existential variety can feel like a bitter pill to swallow. It does, after all, replace the carefree child with the responsible adult, destiny with the burden of free choice, and safe distractions with the “ultimate concerns” of life. And yet, it appears that this pill is one we cannot afford to refuse, since it brings with it the opportunity to be the uncontested author of one’s own life. Cosiness aside, that certainly beats playing the minor part of a curious bystander.
The “inner child” is no match for the responsible adult
Self-insights alone are not enough for authenticity or therapeutic growth, both of which depend upon our ability to relate genuinely to others and to sustain mutually enriching relationships. In addition to social engagement, authenticity demands that we not only have the desire but the will to make our own choices. In Existential Psychotherapy, psychiatrist Irvin Yalom posits that the role of the therapist is to “remove encumbrances from the bound, stifled will” of the patient, thereby freeing up their ability to make decisions.
From an existential perspective, the true self is created not discovered, and the means of this self-creation is choice. In a Frontiers article published last month, clinical psychologist Per-Einar Binder proposes that what is important is not “the external characteristics in what is chosen” but “the quality of awareness and ownership in how one chooses.” This explains why living the “unconventional” life is not necessarily a straightforward route to authenticity. According to Binder, “choosing to live according to conventional ideals can be done with authenticity” but that “obeying conventionalism as an outer demand”—which leads to disownment of one’s own choices—cannot.
Authorship of our own self-narratives, our own sets of values, and our own overarching life projects implies responsibility—a notion which, in today’s world, has become extremely unpopular. Contrary to what one might like to think, authenticity emerges not through our resurrection of the unselfconscious and uninhibited “inner child,” but through our willingness to act like the responsible adult. More specifically, authenticity looks like the adult who does not run away from choice.
Thus, one could say that we are authentic to the extent that we are responsible. We are authentic insofar as we own (and actively resist the temptation to disown) our free choices, which, in large part, determine the trajectory of our lives. (Note that I neither wrote that our choices fully determine this trajectory—since there still exist external factors outside of our control—nor that our wishes and/or choices will lead to the “manifestation” of the perfect life.) Since, from the Heideggerian perspective, our individual identities are constantly in question, the act of choosing can be thought of as taking a stand on who we are and who we will become.
Free choice trumps the illusion of destiny
One of the most seductive ways of avoiding responsibility is to uncritically accept and commit to the “givens” of our existence—our careers, certain stubborn personality traits, the past, our dysfunctional marriages, our financial status—without ever seeking to transcend them. Believing that we are powerless to be anything other than the cards we have been dealt can prove more comforting than taking on the burden of introducing new cards, or at a minimum, reshuffling them. To Sartre, this denial of our own “transcendence” (i.e., our freedom to transform the “givens” of our lives through new interpretations, responses, and choices) was equal to living in “bad faith.” In other words, I am inauthentic when I refuse to see myself as anything other than an immutable object, incapable of choice and hence fundamentally unfree.
Sartre proposed that in order to be authentic, we must honour the multidimensionality of our being and its potential, rather than limiting it to a single prespecified function. When I convince myself, for example, that I am nothing but my career, then the whole of who I am and who I could become is subsumed by my professional role. While this might alleviate the burden of my freedom—which hands me responsibility for my own self-creation—it undermines my will (i.e., the source of my decision-making) and severely restricts all present and future possibilities. In Being and Nothingness, Sartre described this particular variety of inauthenticity as demonstrated by the working people of the day:
Their condition is wholly one of ceremony. The public demands of them that they realize it as a ceremony; there is the dance of the grocer, of the tailor, of the auctioneer, by which they endeavour to persuade their clientele that they are nothing but a grocer, an auctioneer, a tailor. A grocer who dreams is offensive to the buyer because such a grocer is not wholly a grocer. Society demands that he limit himself to his function as a grocer. … There are indeed many precautions to imprison a man in what he is, as if we lived in perpetual fear that he might escape from it, that he might break away and suddenly elude his condition.
Existential guilt says that we are falling short of the people whom we might have become. Sole identification with any single facet of our being—even if that facet is one’s highly praised, indefatigable drive to succeed—not only creates the illusion of choicelessness but imparts a narrow view of our potential. Within this context, Heidegger reminds us of the debt (Schuld) we owe to ourselves. Personally, I remained blissfully unaware of this debt until the age of 24, when, having developed an energy-limiting chronic illness, I faced one of the most difficult decisions of my life. Barely able to be out of my bed for more than a couple of hours at a time, I chose to leave a fully funded organic chemistry doctorate at Oxford University.
It was only possible to make this decision by deliberately stepping outside of the role that I had assigned to myself. And having worked frenetically to satisfy its requirements for almost a decade, turning away was no easy task. It was simply absurd to one day discover that I was too weak to lift solvent bottles or to stand long enough to perform purifications of my chemical products. I had no idea who I was outside of my functions as an organic chemist—all those deeply ingrained movements and mystical rituals. It would have been the easiest thing in the world for me to deny myself the possibility of being anything else.
At least I was partly shielded from my loss by a storm of numbing confusion in which all the pieces of my life were uprooted and flung mercilessly through the air, producing the internal sense of being alive inside a snow globe. When eventually all the pieces settled again, I painstakingly set about creating a new pattern amidst the wreckage. I decided on a different life-project: one which I thought would be sustainable inside my new, markedly less dependable body. Taking all that I had learned about the absurdity of suffering and the gift of human resilience, I entered the field of psychotherapy.
It is rare that we examine the darker side of choice: the inevitable limitation of possibilities. Yalom reminds us that this limitation brings us closer to finitude and death, which Heidegger defined as “the impossibility of farther possibility.” Even great artists and intellectual thinkers—whom we presume to be “at home” with such existential concerns—design the craftiest ploys and most elaborate psychological devices in order to circumnavigate the dreadfulness of choice. In his 1915 poem “The Road Not Taken,” Robert Frost remembers “two roads diverged in a yellow wood.” The poem is an exercise in persuading himself that the road he chose was far more attractive than the unchosen way, and that in fact, there was no real opportunity cost. Upon writing the final lines, he is quite sure in retrospect that he embarked upon the unconventional (“less travelled”) life path and that this has “made all the difference.”
Just as Frost ultimately renounced the “road not taken,” I ended up renouncing the academic career that was no longer on my horizon. Looking back over my shoulder at the doctoral students whom I was forced to leave behind, I refused to see a set of high-achieving individuals climbing steadily towards the apogee of their academic careers. Instead, I saw a bunch of dry, shrivelled souls who had confused meaningless accolades and academic success with the source of life’s true wealth: understanding the mysteries of the human mind. Of course, this was a wealth reserved only for those who became psychotherapists! Thus, it appears that even when we must choose, a comfortable compromise lies in devaluing the unchosen alternative.
Even more comfortable than devaluation is believing that we have no choice at all. I was struck throughout my education by a tendency among certain high achievers—who built the most impressive castles out of their own success—to feel haunted by a sense of confinement. With their castles came captivity. You could just about detect it in the postdoctoral student who, despite her misery, clung to the fallacy of sunk costs and refused to abandon her research. To carry on was, in her own words, the “path of least resistance.”
If you did not happen to notice it in the student, you were sure to glimpse it in the untenured assistant professor who slept in her office. When I informed her that I was leaving to go do something else, she looked at me wide-eyed with horror. She said that she could never see herself doing anything but science because it was all she had ever done and that she had no other talents. It was plain to see that she had stepped inside academia with the ease of a hand slipping inside a glove. She fit there neatly, not with the confidence of someone who has just purchased the glove for herself, but with the well-concealed anxiety of someone who was once told that she had to wear it by some mysterious outside force. Perhaps the simplest definition of inauthenticity is this: believing that there wasn’t a choice when there was one.
Distractions are a poor substitute for facing life’s “ultimate concerns”
Therapist Miriam Donaghy describes Heideggerian authenticity…
…not as being true to one’s self, but as being true to existence. In other words, authenticity is being open to, or facing, the “givens” of existence, including our “thrownness” [the fact of our having been “thrown” into certain conditions which have not been freely chosen] and inevitable death, whilst inauthenticity equals turning away from or denying them.
In a broad sense, authenticity asks us to confront the various facets of our human condition, including those from which we would feel most comfortable fleeing. Both Heidegger and Sartre recognized our human tendency to try to escape freedom, which leaves us riddled with anxiety. We are anxious because freedom demands that we create our own subjective reality inside an indifferent world. A world that lacks any pregiven meaning structure.
Despite our best efforts to eradicate it, anxiety is integral to the human condition. From an existential viewpoint, it arises from our awareness of what Yalom calls the “ultimate concerns of life”: death, freedom, isolation, and meaninglessness. And yet, terrifying as the confrontation of these givens is, it will not kill us. The same, however, cannot necessarily be said for our endless, futile attempts at denial and avoidance (take it from a recovered anorexic who skirted death).
Within the related fields of psychology, psychiatry, and psychotherapy, it is generally agreed that psychopathology results not from our confrontation of the anxiety itself but from the rigid psychological mechanisms we employ to defend ourselves against it. In Existential Counselling in Action, psychologist Emmy van Deurzen describes this anxiety—the kind that emanates from looking deep inside our human condition—not as “evidence of pathology” but rather as “the essential reminder of our vibrant and dangerous aliveness.” Rather than trying to eliminate the ineliminable, perhaps we should invest our energies in developing the resilience needed to survive it.
One could argue that the contemporary ideal of authenticity is seductive precisely because it offers us a strategy of avoidance against the “ultimate concerns” of our lives, while the existential concept is less attractive because it asks that we confront them. While existential authenticity hands us the burden of our own freedom by asking us to take ownership of our self-defining choices, contemporary authenticity shields us from this burden by asking that we simply surrender to a fixed, preordained self. When we accept the latter version, we apply to our living selves a word that is best kept for the world of objects.
We want to believe that authenticity is a final destination: the zenith of self-understanding or the growth fanatic’s nirvana. We want to believe that our true selves are waiting to be found, lurking in some forgotten interior place. We want to believe that the bad times are nothing more than predestined pit stops on the journey to self-fulfilment. And oh how comforting it would be to know that this journey was purposefully designed to be long and tortuous for the sake of supplying us with fortune-cookie life lessons at each hairpin bend. But despite the strength of our longings, it is far more likely that our true selves are in the process of being continually created rather than found.
In the absence of any grand external design, we steer our ships, often blindly and haphazardly, towards unknown ports. It is terrifying to realize that we are personally responsible for the direction in which we have projected our future selves: not just for the growth that has been actualized but for the potentials that have been neglected. And yet, it is empowering to find that there is always the possibility for transformation by changing our basic commitments. It is empowering to find that among all this, there is the freedom to be something different.