The Anatomy of the Cliché; or, Do We Live In A Culture of Authenticity?

Kerwin Fjøl
50 min read · Jan 23, 2023


Including some light-hearted discussion on the following whimsical topics: Method acting — Gena Rowlands vs. Adam Driver vs. Eminem. Second-order observation: seeing through the eyes of the mass. We understood pro wrestling better when we thought it was real. The writer’s audience is always a fiction. The printing press and the lens. What is a cliché? Stereotype and archetype. Mozart’s starling bird funeral. TikTokers crying about dead pets. Performance and schemata. Byung-Chul Han is wrong — authenticity and individualism. Works cited. Moeller and D’Ambrosio are wrong — authenticity and iconicity. The meaning of cringe. From pick-up artistry to inceldom. POV: You don’t know what POV means. The decline of voice and the rise of pose. Fellini’s Ginger e Fred. Media neither “cool” nor “hot.” Carnal audience and virtual audience. Outsideness and exteriority. The death of the beautiful soul.

I. Film

“You’re sick. You’re sick. You need a rest. Your eyes are terrible. Lie down. Lie down. Here, you’re awful! You’re sick! Get a doctor for him! He’s a crazy man! Get him! Get him! No! No!”

That’s what Mabel (Gena Rowlands) says to the family doctor in John Cassavetes’s A Woman Under the Influence (1974). Her husband (Peter Falk) has just arranged a group intervention in which he informs her that she will be committed to a mental institution, and so she panics in response, insisting that the doctor is the one who ought to be locked up. In this scene, Mabel opens her eyes widely. She lurches her jaw. She hunches over, keeping her forearms tightly wound. She moves in staccato rhythm. She makes the sign of the cross with two fingers, as if warding away a vampire. She looks like she has snapped.

In interviews, Cassavetes always denied that the character had truly gone crazy, but the telltale signs are there in the acting. The clichés. And while Rowlands’s performance has been widely praised by critics and moviegoers alike, some recent viewers have found those moments of extremity off-putting. A critic from Time Out London tells us, “Rowlands unfortunately overdoes the manic psychosis at times, and lapses into a melodramatic style which is unconvincing and unsympathetic.” Another review, from a fellow called Josh Larsen, says “everything Rowlands does is for the camera’s benefit, whether she’s unleashing a tirade front and center or off in the background, bugging her eyes, flapping her hands, and inhaling a cigarette as if it’s been rolled with ground-up chili peppers.” A user review on IMDb from someone named “Rockwell_Cronenberg” says, “I wanted so badly to adore her […] but it’s her more manic displays where she totally lost me. In all of her ticks and eccentric behavior, Rowlands felt so calculated, none of it coming off naturally or with any sense of believability.” And a gentleman from EFilmCritic tells us, “Rowlands is simply ridiculous.”

These more recent criticisms are onto something. Rowlands is a phenomenal actress throughout the film, but in those moments of extremity, the demeanor she expertly crafts for her character falls apart and she becomes a stereotype of a crazy person. According to Cassavetes, Rowlands approached the character Mabel through method acting. She became Mabel in day-to-day life. He tells one interviewer, “Gena put so much of herself into the film that as the picture went along the problems became more and more hers. By the time the picture was over she had so thoroughly understood the investigation that she had made that she had become thoroughly proficient in those aspects of life. She became like Mabel in a sense, after the picture was done.” How disappointing it would be for some, then, to watch the Rowlands of those understated, genuinely idiosyncratic moments give way to the Rowlands who behaves like a cartoon character. Yet here is the conundrum, and it is quite a conundrum indeed, something that perhaps ought to put the very concept of method acting into question. Anyone behaving in an extreme fashion — undergoing a “limit-experience,” as some fancy people say — becomes a cliché, whether she wants to or not.

Think, for just a moment, about Eminem’s acting job in 8 Mile. Watch it (actually, don’t) and count how many times he explodes with rage throughout the film. It’s quite a few. He does this even in inappropriate situations. Before the film ends, the strategy becomes fairly obvious: he is not a great actor, and exploding with rage is another skill entirely, yet it looks superficially like what we might cite as “good acting” if asked for examples. Crying and screaming do, too, and these things are also not quite acting.

We don’t like to acknowledge that most performances of intense emotional experiences, like pain, are typically indistinguishable from the real thing. Around twenty years ago, it was common for educated people to mock pro wrestling because of the fakeness of the sport. Even when they would see a wrestler falling from a twenty-foot ladder to the barely-padded concrete floor outside of the ring, they would think to themselves, “Ah! If this sport were real, then that would be painful. But since it is fake, that gentleman is perfectly fine.” As they apparently forgot that the laws of gravity still obtained, their awareness of the sport’s inauthenticity harmed their comprehension of what was happening more than if they had assumed it was all real. But even then, no one considered wrestling to be acting, nor should they have, because the ability to simulate an extreme emotion does not demonstrate the ability to convincingly portray a unique psychological disposition, the bread and butter of method acting. The great “Rowdy” Roddy Piper said it best: “wrestling is explosion; acting is implosion.”

Back to A Woman Under the Influence. Our question is, why now? Why are more recent critics starting to find fault in Gena Rowlands’s acting in this film? Virtually no critics attacked her performance in 1974. The New Yorker’s own Pauline Kael hated the film (one of the few critics who did), but she had no problems with Rowlands’s acting. It then becomes reasonable to wonder about the cultural shift in perception. I think the answer is this: the mode of realism Rowlands inhabits has gone out of fashion. Its faithful mimesis no longer seems as lifelike as it once did, as the understated moments of truly skilled acting can only hang loosely alongside the more clichéd moments of intense emotional experience.

Let me put it this way. When you are dealing with someone in the midst of a crazed outburst, you are experiencing him raw. There is no time for reflection about whether or not he would look “real” to someone watching from the outside, a voyeur leering in and gazing upon your shared intimate moment. But if time opened up and the possibility for such a consideration crept in, you would think to yourself, “How can this be real? It just looks so silly.” And perhaps, then, your response would change. The ability to understand how something would look from the third-person perspective — to a general, theoretical observer watching on — is the ability to gauge the second-order observation of an event. Incidentally, this problem of second-order observation was one of the reasons people criticized Christian Bale’s portrayal of Batman in the Christopher Nolan films. To them, his voice sounded silly when he growled his lines. But, of course, if he were growling them directly at you in the same situations as in the films, ones in which you find yourself lost in the stakes of the moment, you would not possess the distance within yourself to find it so absurd. People thought, “This looks absurd to the average person in a movie theater,” not, “This would look absurd to the guy Batman is talking to.”

For comparison, consider the argument scene in Noah Baumbach’s Marriage Story (2019), an example of a performance that all the critics seem to enjoy. The actors Adam Driver and Scarlett Johansson do a fine job, giving us plenty of recognizable visual cues to signal extreme anger and frustration, but the dramatic weight of the scene rests entirely on the dialogue. There is not a single moment of verbal ambiguity that a viewer’s close attention to body language or facial expression might resolve. The language is novelistic, and its delivery simply perfect, like an unblemished, smoothly rounded pearl. Not a single moment of wordless exasperation, misplaced diction, searching for which words to say (even in long and complex sentences), false starts or finishes, or interruption of the other’s speech. This isn’t how arguments tend to go in real life, even among the highly educated, yet most educated viewers find themselves impressed with the film’s realism. They do not detect any disjunction between the immaculate dialogue and the frazzled expressions on the faces delivering it. The gesticulations and facial expressions are stereotyped, but the lines are the sorts of lines that we educated people would like to imagine ourselves saying. Because everything is idealized, this film looks “real” according to the standards of second-order observing.

In the age of digital cellular phone technology, we more readily conceive of everything via second-order observation. The anticipated reaction of the audience is itself a part of the product, while the question of how something will look to a given audience has become a major aesthetic consideration. When we watch videos on streaming web sites, it is common for us to check what the user comments have to say, even as the video plays. One reason is that consuming media through the cell phone is depersonalized, far from immersive, and invites the user to tinker and fiddle with the device itself. But another reason is that we are also in some sense conditioned by social media web sites to see ourselves and our most intimate friends and family members through second-order observation. When we observe what’s happening on streaming video, we would prefer to see it through the perspective of a rather more generalized audience, and this preference carries over into other media, film and literature alike.

II. Literature

Whatever the medium, the audience is always a fiction. The artist, writer, musician, producer team, etc., have all calibrated a set of expectations into which their anticipated audience will fit, whether it chooses to play along or not. The audience is molded via negative space, and its outline is always right there in the piece. When the piece becomes popular, you know the strategy has worked, and an audience has chosen to fit the mold. There is a reason people often accuse popular works of appealing to the “lowest common denominator.” But this fact of audience-making can itself be a source of creativity. Walter Ong analyzes it in literature, taking Hemingway as his leading example in an essay aptly titled “The Writer’s Audience Is Always a Fiction” (1975). As Ong observes, Hemingway is always writing sentences like this: “In the late summer of that year we lived in a house in a village that looked across the river and the plain to the mountains” — the opening words of A Farewell to Arms. People will notice the stripped-down diction, but the more important feature here is found in the content to which Hemingway only vaguely alludes. What, after all, is “that” year, and why “the” river? Which year, which river, which plain, and which mountains? Why the lack of specificity? It’s because he’s assuming prior familiarity between author and reader, and this assumption implies an intimate role that the reader must actively play if he wants to appreciate the story Hemingway will tell. Otherwise, its meaning is lost.

Books are strange things because when we write them, we don’t quite know who will read them, and so we never know precisely for whom we’re writing. To solve this problem of the audience, writers have often used pragmatic contrivances. For instance, during the Enlightenment, epistolary novels such as Samuel Richardson’s Pamela; or, Virtue Rewarded (1740) were wildly popular among the bourgeoisie. The bourgeoisie wrote plenty of letters; in fact, it was common for your typical middle-class professional to spend about two hours a day writing them to his colleagues. The culture of letter-writing is why scholarly biographers are now able to paint such rich, detail-laden pictures of men like Goethe and Rousseau, relying less on their own memoirs than on correspondence. It is not hard to understand, then, why members of the reading public would feel comfortable placing themselves in the position of someone receiving a letter, even assuming the vantage point from both ends of the dialogue.

The daily practice of letter-writing is itself important because the letter has an audience of just one. Imagine spending two hours of your day immersing yourself in the specific and unique concerns of one person at a time, as opposed to announcing your thoughts and feelings to a more generalized audience on social media. Under such conditions, the decisions one might make in conveying an authentically felt emotion came from a different set of assumptions and considerations than the ones we have now. The emotions would be real (of course!) but would take on an entirely different set of conventions.

This was the age of two key technologies. The first was print media: printing meant that books could be copied in indefinitely large quantities and travel very far distances. But it was also the age of the lens, as the telescope and microscope were relatively recent inventions. The lens offered the same sense of third-person objectivity that today’s camera lens does, and it was made to be looked through by one person at a time. The non-epistolary novel in some sense responded to these technological conditions by carving out a diegetic vantage point from which the private reader could not only observe action taking place in the moment, playing the role of the “fly on the wall,” but also gain access to one character’s private thoughts at a time, which the author could simply narrate from an omniscient perspective. So not only was interpersonal communication individual-oriented, but so was the third-person view thereof.

Literature was not always quite this way. Before private reading was the norm — that is, before the age of print — vernacular fiction authors anticipated a physically present audience to whom they, or perhaps some acquaintance, would read their texts aloud. The conditions were a little bit closer to how things work today. For one thing, the author didn’t feel the need to encode a distinct “voice” in the text, because his voice was quite physically, corporeally there. There was no need to build up a personal ethos through the writing, because a degree of intimacy and prior familiarity was assumed to begin with. Additionally, second-order observation was part of the presentation of one’s writing, because all crowds ipso facto involve second-order observation. If you see the crowd being pleased, you feel pleased, too. If you see them get angry, you will feel emboldened to become angry yourself. We will address some major differences between the internet and the pre-print era later, but these similarities must be understood.

Even before print was invented, however, authors were growing increasingly aware of the indeterminacy of a text’s audience. In the century before the Gutenberg press, the fourteenth, Boccaccio and Chaucer built framing devices into the Decameron and the Canterbury Tales, respectively. Both men were aware of the book’s potential as a vehicle for transmitting disembodied information to an alien readership, and so they needed some way to simulate an audience to replace the living, real-world one to whom the authors of a previous age would have spoken aloud themselves (Ong discusses this in his essay). The pilgrims of the Canterbury Tales make up a “general audience” because they contain representatives of each social class, but it is also an intimate audience, because they’re all there in the story, hanging out.

Psychological depth, judged by modern standards, cannot be sustained under conditions like those. The audience is not an individualized role that the reader must “step up” to assume, as in Hemingway, but is itself a cliché, or collection of clichés. The characters, too, are clichés. And the narrator, the characters, and the anticipated audience are all tinctured by the unique voice of the author, the one distinct and irrepressible element of the fiction (for even if his own writing is plain and generic, he was still standing there once upon a time, talking).

For the better part of a century, too much ink was spilled by scholars speculating about the Wife of Bath’s psychological nuances and ideological complexities. But those nuances and complexities were always illusory. Chaucer’s brilliance was in taking a couple of female stock characters dating from late classical satire (the “cheating young woman” and the “horny old woman”) and putting them into one woman’s story, then having her narrate various stages of her life involving both roles, spicing them up with various arguments about the nature of marriage and womanhood, not to mention some glib theology aimed at St. Jerome. The fusing together of these various literary figures and stereotyped viewpoints allowed the modern reader to try and smooth over the rough-hewn edges through a sophisticated and altogether modern interpretive framework (cf. what T.S. Eliot says of the imagined psychological nuances of Hamlet). But the Wife of Bath is more like a character from the commedia dell’arte than one from a Flaubert novel, and her personality — to the extent that she really has one — is just Chaucer himself wearing drag.

Cliché and stereotype are words that originate in the technology of the printing press: they are the solid plates of type metal from which many identical paper copies are made. But metaphorically, they mean something more. One of Marshall McLuhan’s neglected works, From Cliché to Archetype (1970), notes that the “cliché” need not refer only to overused features that we encounter in speech or in art, but to our very perceptions themselves. What we search for when we extend our senses outward is inextricably connected to what we retrieve from the transcendent world. A cliché is a probe. And some probes prove themselves to have timeless significance — those, we call archetypes. We see the little man in the three-prong Edison socket or the Makapansgat pebble because we summon him into being. The characters we discover in literature can be clichés or archetypes, and time will render the judgment as to which is which. If we find them convincingly timeless, we instantly declare them archetypal.

The internet has pervaded the culture as a new cliché that privileges second-order observation, just as ancient cultures did. And second-order observation carries its own priorities that influence our sense of what constitutes a good acting performance; it shapes our very understanding of the psychological. When we look at something, we’re increasingly aware of the crowd’s reaction to it. This compels us to revisit various things, like, say, racist Bugs Bunny cartoons, with a new set of eyes that has its own refined priorities. These eyes might grasp more intuitively the manner in which the crowd forms a part of professional wrestling, whereas before it was taken as a separate consideration altogether. And these new eyes might see new things in the frenetic acting style of the German expressionist films, while the shifts between expressionism and realism in a Gena Rowlands performance might seem jarring and unresolved. This is not to say that our aesthetic preferences have altogether returned to a pre-modern era — quite the contrary, there are stark differences between this age and the ancient world. But our heightened awareness of an outside perspective certainly gives us different expectations for the portrayal of reality.

III. The Internet

In 1787, Wolfgang Amadeus Mozart’s pet starling bird died, and Mozart held a funeral for him. His friends attended, veiled in black just as he had instructed, and they sang dirges. Mozart wrote a funeral poem for the occasion, which has been translated like this:

Here rests a bird called Starling,
A foolish little Darling.
He was still in his prime
When he ran out of time,
And my sweet little friend
Came to a bitter end,
Creating a terrible smart
Deep in my heart.
Gentle Reader! Shed a tear,
For he was dear,
Sometimes a bit too jolly
And, at times, quite folly,
But nevermore
A bore. [etc.]

On the one hand, this was a farcical event, as good an example of Mozart’s sense of humor as any. On the other hand, Mozart’s emotions were quite sincere. His often overbearing father had died just a week or so earlier, and Mozart did not attend the funeral. Biographers are probably onto something when they speculate that this was a nice way for Mozart to transpose his mixed feelings about his father onto an unrelated event. But also, everyone loves a starling bird and would feel awful if their pet starling died. They’re quite pleasant!

The audience for this funeral all knew Mozart, and so we can assume that they could disentangle the layers of emotional complexity at play to grasp the event’s overriding sentiment. Had they not known him, its meaning would have been entirely different. This all happened, by the way, just a few years before Novalis and Schlegel would start to get philosophical about irony, treating it as the poetic antidote to philosophy’s fruitless attempts to systematize everything. But the sort of complex and multi-layered irony at work in the starling’s funeral was nothing new. Mozart may have arranged this funeral during the age of print, but his juxtaposition of light humor and genuine sorrow, with no contradiction between them, was downright ancient. What people now call “meta-irony” was understood then simply as irony, and in the age before print, there was no name for it at all. There didn’t need to be one.

Now, contrast the intimacy of such a scene with the performances of grief over dead pets that we occasionally see on the internet. These are performances of intense emotional experiences, the mode of self-expression that demands emotional intensity while requiring little acting skill. In one social media video that went viral, a girl holds her massive dead yellow snake and cries loudly, bellowing out her pain and agony in between sobs, while a popular sad song plays. The strangeness of the situation was not lost on some critical commentators: this girl had to mount the camera, hold her snake’s corpse in a particular pose, and cry loudly after hitting the “record” button. And then, once the recording ended, she had to choose a specific song under the “emotional music” category on her cell phone app’s music menu. How real, they asked, could her performance of emotion be?

Or, consider another similar situation regarding grief over a pet. A mother and social media influencer accidentally uploaded an outtake video of her and her eight-year-old son crying in their car over their puppy, who had just been diagnosed with a terminal disease. In the video, the mom tells her son to “come closer” to cry on cue, and then she proceeds to coach him on how to do it convincingly. At one point, the son says, “No, mom, I’m actually seriously crying” in between sobs. She then says, “No, I know, but go like this. For the video. Go like this, put one hand up. Like this. No, go like this. Put your hand like this. But let them see your mouth. Let them see your mouth!” Then they continue to cry together in various poses. Again, the behavior prompts the question: for this mom, how real could the performance of emotion be?

The answer is, probably quite real. The expression of emotion under such circumstances cannot be subtle, as it was at the funeral for Mozart’s starling. When you have an audience of mostly strangers, nothing can simply go without saying. In the same way some of us write poems to exorcise our emotions, so, too, do some people make social media videos, and their creation requires both planning and preparation. More interesting than these videos, however, is the hostility with which they were greeted. A surprising number of commentators felt that this mother was committing some sort of child abuse (she wasn’t, by any legal standard) and decided that Child Protective Services ought to confiscate her children and break up their family.

Most people, when expressing sincere emotions in these web videos, anticipate such hostile responses, so they will often express themselves with a noticeable element of self-awareness. There is now an entire subgenre of such videos in which people scream angrily about politics in their car, but they usually give the viewer some hint that they are “in on the joke,” so they don’t seem too crazy. One woman, for instance, recorded these words before the 2020 election:

Listen, kiddo. I get it. I don’t like the two-party system. I think our country is corrupt, and, quite frankly, I don’t want to vote for Biden. It feels like voting for a Republican. But I’m gonna do it. You want to know why? Because the alternative is a fucking fascist. A fascist. It’s a fascist. Maybe we can have the conversation about dismantling the two-party system when a fascist isn’t running. Maybe we can do that later, kiddo. Champ. Chief. Maybe we can talk about it later.

Although her political enemies declared her crazy, it is pretty obvious that she is attempting to be funny even while signaling real anger. The script should make it obvious, even though she belts out her lines with the same intensity as Lewis Black or Sam Kinison, comedians known for screaming loudly. Another, perhaps more worrisome video from the same year features a woman screaming the following:

Holy fucking shit, you guys. I’m driving a car, but I just got a notification that Ruth Bader Ginsburg died. Fuck. Could this year get any fucking worse? Ruth: you just had to make it to 2021. Waahhhhh.

She seems quite a bit angrier, but there is still a hint of self-awareness in how she speaks directly to Ruth with disappointment, as though the late Supreme Court justice were some sort of football player who has let down the rest of the team. Nonetheless, her political enemies still declared her mentally ill, with even more assurance than in the previous example.

We ought to try and understand the creative process of these people making videos of themselves in their most emotionally vulnerable moments, since some weighty value judgments are apparently at stake. In Art and Illusion, E.H. Gombrich argues that visual artists never simply “copy down what they see,” because the transmission from the natural world to the paper or canvas requires far too many changes to account for what is essentially nontransmissible. The artist’s head never remains completely steady, so the perspective constantly changes; the brightness and darkness of natural light can never be adequately rendered through paint or graphite; there is always some artifice involved in the depiction of depth because of its subtleties; and so on. Therefore, the artist must constantly negotiate between the raw information his eye gives him and some schema that he has mentally constructed based on what is visually familiar to him. A schema is a psychological aid that helps the artist filter and map out the prominent visual features of the object he has encountered as he renders it onto his medium. The schema the artist keeps in his thoughts is a cliché, or a stereotype.

The medieval artists who drew manuscript illuminations were infamously bad at drawing animals, often amusingly so, as the tumblr blog “Discarding Images” demonstrates. But why? The likely answer is that these illustrators were often tasked with drawing animals they had never seen, so they would rely on a schema corresponding to a different animal altogether. The sketchbook of Villard de Honnecourt contains plenty of these schemata, which usually take the form of simple shapes meant to help him reconstruct various things, including animals and types of faces. The schema thus allows him, and us, to substitute the familiar for the unknown in order to trudge along and let the picture materialize.

Now, these internet videos are not visual art but rather performance art. In many ways they represent the exact opposite of method acting. The performers are not “imploding,” but “exploding,” as Hot Rod would have it. They do not need to plunge themselves deliberately into a soup of buried emotions to exhume for some scripted contrivance, as with method acting, for their emotions are absolutely real and in need of pointed articulation. But just as visual artists must use visual schemata to negotiate between what they see and what they can actually put on paper, so too do the emotion-performers rely on theatrical schemata to negotiate between the unique and complicated feelings they feel and what they can actually convey on the video screen.

In this way, the theatrical vocabulary of the entertainment industry serves as the conduit through which the performers can direct their emotional displays, whether they are sincerely crying about dead pets or half-jokingly screaming about politics. Halfway between their complex emotions and the unknown audience that awaits them lies the moment of performance, and that is when the compromise is made. They will, from the moment they hit “record” to the moment they hit “stop,” make themselves into clichés — all in the service of authenticity!

There is a certain irony regarding the woman who coached her son on effectively mourning a dead pet for the camera. Had people never seen the outtake and only watched the finished product, their understanding of the situation would in fact have been more accurate than what they concluded when they saw her preparing. They weren’t ready to confront the reality of how such preparation looks, even if it is done to convey the spirit of real emotion. If some part of it didn’t match the viewer’s expectations, then of course (so they think) all of it must be a contrivance: she didn’t feel any real emotions about her dead pet, ever! Now throw her in jail for child abuse!

But despite the pushback some of these videos get, they work more often than not. There are plenty of pet-mourning videos on TikTok that have garnered huge numbers of views and elicited all the right emotions, and it is no stretch to say that political outrage has become the dominant strand of online public discourse during our time. The success of these videos depends on the audience, who in viewing them must negotiate in its own way. If it is going to be sympathetic, the audience will share the same schemata as the performers, and it will use them as antennae to sense what the performer is attempting to convey. The performer is not performing for everyone in the whole world (how could that be possible?), but rather for the very cliché that she becomes, and the audience in turn steps up to join her in the role of this cliché. The performance adheres to a deep grammar that joins performer and audience together as one.

IV. Theory

It is tempting to form value judgments about all of this online emotion-sharing, and surely we can see that it has something to do with authenticity. What, exactly, is not so clear. The perennially bummed-out social theorist Byung-Chul Han starts off a chapter in a recent book by asserting that the performance of extreme emotion is entirely done in the service of authenticity. As he puts it, “The society of authenticity is a performance society. All members perform themselves. All produce themselves.” His argument is that today’s “neoliberal” society is different from previous ones, which relied on rituals, manners, and other displays of formality as a way of mediating the relationship between the self and its surrounding community. The laws of the marketplace, by contrast, cater to the individual’s inmost whims and desires, and so people are encouraged to perform their raw emotions in order to display their uniqueness and individuality for the consumption of others. Look:

“The culture of authenticity goes hand in hand with the distrust of ritualized forms of interaction. Only spontaneous emotion, that is, a subjective state, is authentic. Behaviour that has been formed in some way is denigrated as inauthentic or superficial. In the society of authenticity, actions are guided internally, motivated psychologically, whereas in ritual societies actions are determined by externalized forms of interaction.”

But there’s more: aiming for authenticity causes one’s uniqueness to disappear, and everyone starts to become the same. In a separate interview with the Spanish newspaper El País, Han clarifies that authenticity and uniqueness are synonymous: “Everyone today wants to be authentic, that is, different from others. We are constantly comparing ourselves with others. It is precisely this comparison that makes us all the same. In other words: the obligation to be authentic leads to the hell of sameness.”

To be sure, the main claim of Han’s book is correct. The decline of rituals in everyday life is bad. And some of what he says about authenticity is right: he recognizes that people perform versions of themselves and strive to create the appearance of unfettered subjectivity, and so our culture is indeed one of self-performance in this way. His problem lies in assuming that authenticity and individuality are still linked.

Pulling from Charles Taylor’s The Ethics of Authenticity (1992), Han examines Taylor’s call to disentangle the ideal of authenticity from that of individuality and instead reconcile it with communitarian aims. Han declares bluntly in response that authenticity and individualism are inseparable: “Authenticity is in fact the enemy of community. The narcissism of authenticity undermines community.” He never quite argues these claims but merely asserts them, and in doing so gives his own definition of authenticity, updating it from where it was during Taylor’s time, removing much of its nuance. I suspect that his definition is at odds with that of most people.

Taylor’s book is essentially a follow-up to his much more ambitious Sources of the Self: The Making of the Modern Identity (1989), and when he wrote both, authenticity still strongly connoted individualism. Taylor’s aim in the 1992 book was to find a way to promote an ethic of authenticity that succumbs to neither irrational optimism nor pessimism regarding modern life and that instead allows people to live responsibly and well. Although his work is now somewhat dated thirty years after publication, he does a reasonable job of examining authenticity as a concept and trying to explore its potential. But Han, in his haste to assign all of the blame for modernity’s problems to the excesses of individualism and market forces, fails to recognize that since Taylor wrote that book, people have spoken less and less of “authenticity” as a mark of individual uniqueness. The two concepts have been almost entirely decoupled.

Arguably, the common man’s understanding of authenticity today is less problematic than ever before. During the Enlightenment, authenticity suggested the attainment of a life unhindered by the constraints of societal pressures, and it indeed had much to do with individuality and nonconformism. Authenticity was about being true to oneself and ignoring the pressures of bourgeois life, and it relied upon an understanding of human nature as beneficent and pro-social. When the age of decadence came along and superseded its romantic predecessor, two major shifts occurred: first, human nature (and nature in general) stopped seeming so pleasant and started to seem frightening; and second, artists and philosophers started knowingly to treat extreme sensations and emotions as the best conduit through which authenticity could be expressed, since such feelings suggest little societal mediation and thus little artificiality (an idea that carried into modernity via Bataille and others). As soon as those developments occurred, the slow decoupling of authenticity from individualism had already begun.

The older, more benign notion of “authenticity” still persists to some extent, but it has more to do with one’s product purchasing choices than anything else. In self-help books and New Age literature, gurus like Dr. Phil and Deepak Chopra still appeal to their audience’s individual uniqueness, often telling them directly “you are unique,” but their audience is mostly of an older generation. For most younger people, “being yourself” means finding a certain pose to strike, one that can be fully realized through the assumption of a distinct identity. I don’t want this claim to seem flippant. When people attest that their consumer habits, for instance, have helped them realize their true selves, we probably ought to take such claims seriously. They may actually understand something intuitively that the academics don’t: namely, that the construction of the self can only arrive via negotiation with the different social spheres available to the subject, like the commercial or the political. Trying to tell people that they don’t understand what authenticity truly means would be pointless.

Alongside the benign form of authenticity, there is also the more intense and dangerous one that developed alongside the disenchantment with romanticism’s sunny view of nature. When we talk about the philosophical history of a given concept, it will typically hover above the average person’s understanding of it — the philosopher’s usage will have some connection to common usage, but sometimes only a peripheral one. Yet it is not difficult to see that average people perceive extreme sensations and emotions, even antisocial behavior, as signs of authenticity. When things are “getting real,” they are often getting potentially violent or dangerous. Authenticity is typically lumped in with the unusual and the aberrant. When we “perform” this kind of authenticity, we stray from what’s typical in day-to-day life.

But none of these associations, you’ll notice, have anything to do with uniqueness per se, nor could they. As I’ve discussed above, the extremity of the limit-experience provides for us a common grammar, and many find it easy to imitate. The more extreme we get, the more clichéd. The typical state of consciousness in everyday life is rather where the person’s uniqueness will shine through, for it is far less imitable. And most importantly of all, people seem to be aware of all this.

When aspiring rappers, for instance, want to be authentic, they will rap about selling drugs or committing violent crimes. Ideally, they will have already done those things, but if not, then they will soon start lest their peers mock them for their inauthenticity. There is nothing unique or individualistic about what they’re doing: rap songs about selling drugs are as formulaic as it gets, and nowhere will any listener find pretense to originality. In fact, the lack of such pretense is one of trap rap’s charms! Now, of course this kind of performance art does involve The Self (what doesn’t?), but in this model of self-realization, the rapper presents the cliché to which he affixes himself. Then, if he really wants to be authentic, he modifies himself to be more compatible with it. He assumes, in other words, not an individual but a corporate identity.

But if that example is a bit too niche, let’s consider one of Generation Z’s favorite new pastimes: finding mental disorders and/or pre-established sexual identities to declare for themselves. This is something young people love to do on the internet, and it tends to concern the sex drive, man’s most primal faculty. As the quote attributed to Kierkegaard goes, “Once you label me, you negate me.” Well, nowadays, few seem to agree. Instead of resisting society’s attempts to put them in a box, these youngsters spend hour after hour searching for some sort of box in which to place themselves. Rather than allowing their egos to simply be, resisting classifications and labels, these teens instead go hunting for them like ravenous wolverines. Now, consider. Is this behavior self-absorbed? Sure. Narcissistic? Without a doubt. A solitary, navel-gazing pastime? Natch. But it isn’t done in the service of individualism, nor is there any claim to uniqueness. People who place themselves on the “asexual spectrum,” for instance, may identify as “demisexual,” “ignotasexual,” or “grey-A,” but not because they want to be “different from others.” It’s because they want to belong to a type; or, again, a corporate identity.

V. Articles and Books Consulted

Byung-Chul Han, The Disappearance of Rituals: A Topology of the Present. Polity, 2020.

Charles Taylor, The Ethics of Authenticity. Harvard UP, 1992.

E.H. Gombrich, Art and Illusion. Princeton UP, 1960.

Hans-Georg Moeller and Paul J. D’Ambrosio, You and Your Profile: Identity after Authenticity. Columbia UP, 2021.

Marshall McLuhan and Wilfred Watson, From Cliché to Archetype. Viking Press, 1970.

Niklas Luhmann, “Deconstruction as Second-Order Observing.” New Literary History 24.4, 1993.

Walter Ong, “The Writer’s Audience Is Always a Fiction.” PMLA 90.1, 1975.

IV. Theory

On the other side of the discussion, there’s another school of thought that recognizes these developments in identitarian activity and argues that instead of having reached peak authenticity, we’re actually living in a post-authenticity world. An excellent recent example is You and Your Profile: Identity After Authenticity by Hans-Georg Moeller and Paul J. D’Ambrosio. The book is rife with example after example of online identity-formation, and the authors ought to be congratulated for taking the question of identity in the digital age seriously and resisting the urge to make emotional value judgments. Unfortunately for their theory, however, authenticity isn’t going anywhere, and it will become more important to people as time goes on.

Moeller and D’Ambrosio (henceforth M&D) agree with Charles Taylor when he describes modernity as an age of authenticity, but they believe it is in the process of coming to an end. They themselves would admit to being long on theoretical reasoning and short on empirical evidence. When you look up the word “authenticity” in the Google Ngram Viewer, you’ll find that in printed publications the word peaked in 1800, reached a low point around 1920, and is currently climbing back to where it was before. And as they point out, modern advertising is more filled with “authenticity speech” than ever. For M&D, this is because people are “putting authenticity in the service of” what the authors call “profilicity,” the new paradigm that is supplanting authenticity.

[Ngram chart of “authenticity”: authenticity keeps chugging along]

Profilicity, for M&D, is what happens when the social environment becomes saturated in second-order observation. People stop presenting themselves to others on an individual-by-individual basis, and instead begin curating presentations of themselves for the approval of what the authors call “the general peer.” Once people put together a profile for themselves, they need to continuously update and develop it so that it can receive constant validation from this “general peer.” Eventually, the individual’s sense of himself becomes inextricably linked to this profile that he curates, and the notion that he has a “true self” underneath it becomes nonsensical, since his very being is so suffused in the practice of profile-curation. But since the social value we assign to “authenticity” is still hanging on like a bad bronchial infection, we use the language of authenticity even while creating a generalized self-presentation. Eventually, though, this will all go away because we will become accustomed to our new conditions of widespread second-order observation and thus profilicity.

There are some problems with the premises beneath the analysis that I’ll try to briefly elucidate. First, there is no such thing as a “general peer” besides God. No one who creates a social media profile is speaking to everyone in the world, even though their audience could be potentially limitless. They are imagining a certain audience and shooting for its approval, and the cliché’s cold probe is how they determine its shape and contour. Second, the phenomenon of second-order observing is nothing new. As I showed in Parts I and II, anytime you observe the reaction of others, you are making your own judgment in light of second-order observation. The print era was, if anything, a break from it, since published information generally linked an individual observer to another individual’s thoughts rather than to the opinion of society in general. And third, most importantly, when we accuse someone of inauthenticity today, we are often not even concerned with the person’s inner state of consciousness at all.

This last claim requires some explanation. When we see people present something online, we evaluate the presentation through a “pars pro toto” criterion: the fragmented part must resemble the whole as it stands in three-dimensional space. We want to be able to think of it as if we were there with it in person. If someone shows photos of his guest room on an online room-and-board profile, it needs to at least somewhat resemble what a guest will find if he chooses to stay there; otherwise the host’s rating will suffer. If a lady presents photos of herself on a dating profile, a gentleman sees them and goes on a date with her, and it turns out that she is one hundred pounds heavier than she made herself seem, there will likely be no second date. In these instances, the profiles will be taken as inauthentic, and few would argue that the renter’s or bachelor’s expectations were misplaced. Even in situations in which people present themselves emoting on camera, we often evaluate the footage on the basis of whether or not that person would behave similarly were the camera not there. In an increasing number of situations, “inauthentic” is not a judgment about one’s inmost being. It is an evaluation of a presentation’s success as an iconic sign, i.e., a sign that points to its referent by way of direct sensory resemblance rather than abstract symbolism.

It is strange that M&D all but completely ignore this dimension of authenticity and instead go right for heady discussions about the making of the self. But it isn’t entirely their fault. The blame would have to go to their primary inspiration, Niklas Luhmann, who aligned his social complexity theory with poststructuralist semiotics (pseudo-semiotics), as in one late-career essay where he argued for the compatibility of his notion of second-order observing with Jacques Derrida’s deconstruction. As M&D openly state (pp. 131–133), their observations are compatible with thinkers influenced by Ferdinand de Saussure, like Derrida and Baudrillard, since Saussure argued that the meaning of linguistic signs is rooted in conventional difference, and profilicity too is characterized merely by such difference. In other words, just as the word “cat” doesn’t have to have a motivated connection to its referent, a cat, neither does an online profile need to have a motivated connection with the person whom it supposedly represents. As they put it, “Profilicity … functions ‘postrepresentationally.’ […] With profilicity, identity consists not in representation of a role or a true self but in having a profile that is different from other profiles.”

At this point, I really have to wonder: do the authors even believe their own nonsense? People are constantly wondering, especially with anonymous profiles, whether the e-personality being curated actually resembles the person who curates it. When a man with a “GigaChad” avatar on a social media profile gets doxed, people will howl with laughter at his dumpy, beta-male appearance and its stark contrast with how he chose to represent himself. But if he looks tough, the laughter won’t be there. Even if we ignore pictures and just focus on writing, people expect the language and verbal tone of a social media profile to approximate how the person who typed them would sound in a corresponding real-life situation. Though we allow for exaggeration and a bit of cartoonishness, we nonetheless evaluate the language of a profile on a “pars pro toto” basis just as we do the pictures. As soon as the opportunity arises, we look to see if that language is a natural, uncontrived extension of the person we’re encountering. If his soul speaks a different tongue to us, his profile will be judged inauthentic. It isn’t about who he really is deep down inside; it’s about how we receive him and, in the case of language, evaluate the transference from written to oral speech, or planned speech to improvised speech. The smoothness of the transference from one mode of representation to another is everything. This is why Saussurean semiology, which lacks the important concept of iconicity, is simply inadequate for understanding what digital self-representation is all about.

The truth is, authenticity remains a serious preoccupation in society, and it isn’t going anywhere. Rather than effacing it, the online world has become its baptismal font. But at some point, its meaning indeed changed. It is no longer about the search for the unique self and has instead turned into the search for the corporate self. The tribal self. The clichéd self. The exaggerations we place upon a given profile link us to others incorporating the same exaggerations through the bond of a common grammar. The tone we use, the stylistic choices we make, the memes we prefer, the race or ethnicity to which we belong — all of these place us in categories, and these categories prepare for us our destinies. Authenticity isn’t going away, but the pretense of individualism sure is.

III. The Internet

There’s a popular word online: “cringe,” short for cringe-inducing or cringeworthy. If you see a group of teens calling something cringe, you know they find it especially off-putting. As with most internet words, its core meaning didn’t last long; it now just means “bad” in any vague way imaginable. (Undoubtedly, this happened because middle-aged adults started to appropriate it in an attempt to preserve their youth, thus ruining it forever. Other ruined words include “cope,” “based,” “midwit,” and “LARP.” Truly a pity.) But when it meant something, it meant that a certain display was so awkward in execution that it caused the viewer to cringe.

Now, why would you cringe? You might cringe if you remember yourself doing something embarrassing in the past. But to do it out of embarrassment when watching someone else acting lame or awkward is another matter entirely. It’s not mere embarrassment, but empathic embarrassment. The viewer places himself in the position of the person being cringe and has a sensation that approximates how he’d feel if it were him. And many things are cringe nowadays, depending on the cringer and cringee. The Vox web site has recently told us that Harry Potter fandom, the Hamilton musical, and other Obama-era cultural artifacts are now cringe, for instance. But anyone outside of their readership knows that plenty of people were cringing at all of that right when it was happening. The cosmopolitan sophisticate of 2022 can cringe in self-remembrance, but the 4chan teens of 2012 were cringing in the embarrassment of what they could perhaps one day become.

Cringe is the exposure of a cliché’s artifice. It’s often the uncanny valley between the archetype and the cliché — the manner in which the grammar of a certain performance suggests depth and universality yet fails to meet the mark. Moreover, its success or failure in the social setting rarely matters. An awkwardly executed marriage proposal met by rejection is cringe. One met by an awkward acceptance might be even more cringe. In fact, any moment of social awkwardness is cringe, even if at the time it happened no one saw it other than the eye of the camera (there always has to be a camera). Gena Rowlands may have put on a realistic acting performance in A Woman Under the Influence, but when she was being crazy, she was being cringe. The TikToker wailing about her pet snake was being cringe. A meme shared among a small group of friends isn’t cringe, but having the meme become mainstream just might make it so.

The interesting thing about today’s cultural analysts is that whether or not they agree that this culture is characterized by a persistent concern with authenticity, they do seem to agree that it is pervaded with feelings of alienation. The conclusion is inescapable, even for those who favor a pose of emotional detachment. Despite the similarities to the past, such as the return of second-order observing, today’s identity formation belongs to a different category from the identity formation of yesteryear. It simply isn’t the way it was during Boccaccio or Chaucer’s time. The explosion of cringe gives a hint as to why. Cringe happens in the hindsight of observation once three-dimensional space has lapsed into two dimensions. It’s the consequence of recording and surveillance; the self-consciousness of disembodied observation. It happens because you’re sitting there watching the event unfold on a screen. Isolated. Alone with your own thoughts.

Man is a semiotic animal in that he doesn’t merely interpret signs; he can do so while consciously knowing that they’re signs. But in the embodied world of social relations, man’s role as sign-interpreter is mercifully easy to forget. Even as he interprets the signs around him, everything feels natural and spontaneous. In private viewing, however, the schemata of daily life reveal themselves and the artifice of being there with others is exposed. Basic interactions become proceduralized. A couple decades ago, a group who called themselves pick-up artists observed the manner in which romantically successful men flirt with women, and these pick-up artists then reduced their flirtations to a series of social techniques to practice and refine. Awkward, socially maladjusted men everywhere tried to put these techniques into practice, and some were successful. But others couldn’t shake the sensation of self-consciousness. They couldn’t help but picture themselves through the cyclopean eye of the camera lens, and understandably they felt ridiculous. Thus, the incels were born.

In romance, what once seemed natural is now cliché, right down to its subtlest motions. Even the initiation of a mere kiss brings to mind a million electronic media topoi that each participant has internalized through the reflection of a distorted glass, so that when life has resumed, the participants are themselves distorted. It is thus not hard to understand why so many have turned away from courtship altogether, overburdened by the weight of cliché and schemata. The second-order observing of yesteryear took place in the public performance of the priest or bard, and thus everyone in the crowd was implicated in mankind’s folly and absurdity, or alternatively his virtue and heroism. But today, second-order observing is usually done privately on a personal computer or through a tiny phone screen, so we are given the privilege of convincing ourselves that we stand outside of everything. Cringe is born of this illusory outsideness.

The explosion of media schemata has thus produced contrary reactions. Those people screaming, crying, and carrying on through TikTok have decided to internalize the schemata and convey themselves through them, occupying a strange aporia in which self-awareness and unselfconsciousness clash. They live their lives as the stars of movies that take place in their own minds. I’ll give one more example. There is a kind of meme that reads “POV: You are [such and such],” and increasingly it has not been using the first-person point-of-view camera angle at all; it simply shows its intended subject from a third-person shot. Others have remarked on the prevalence of this mistake. But it is not really an error if you think about it. The true point-of-view in the digital age is of the subject watching herself through the screen that represents her. Those who have thrown themselves into the battleground of clichés and have found their authenticity through just the right combination will move through life cultivating a variety of interesting backdrops to accompany that point-of-view. But those who feel the weight of isolation accompanying that point-of-view will stand askance and, well, cringe.

II. Literature

In fiction writing, the practice of developing a unique literary voice is dying. One recent essay argues that it has been replaced by the cultivation of “the pose,” a series of vague but evocative gestures that seem to correspond to the cultivation of the corporate self that I’ve been describing. I have no desire to read new fiction, so I’ll just lazily allow this essay to make my point for me in block quotes:

The writing of the pose is, first and foremost, about being correct, both in terms of style and content. Its foremost goal is not to make any mistakes. Its foremost gesture is erasure and its foremost subject is social anxiety and self-presentation. One never loses oneself in the writing. Rather, one admires, at a slight remove, the precision of the undertaking.

[…]

The first thing a young poet needs to be heard today is not mastery of language nor the calling of a muse. It’s a look. The writing of the pose is the literary product of the MFA system and of Instagram in equal measure — it brings writing into the ordinary grueling business of the curation of the self which dominates advanced capitalist culture today.

We are dealing with sensibilities, inevitably mushy around the edges, but they are in conflict, in irreconcilable conflict, exactly because they’re so vague, because they underlie assumptions and fundamental approaches. So much of contemporary discourse generally, not just in literature, is people talking past each other. The Voice and the Pose cannot comprehend one another; they occupy different spheres of meaning. The loathing and the fury that define the political struggle over the use of language — interchangeably referred to as “political correctness” or “cancel culture” — emanate in large part from that chasm.

[…]

Almost all major writers of the Pose attended elite schools and creative writing programs — even an exception like Tao Lin took a journalism degree at NYU. They have jumped through hoops. They have not screwed up. Perhaps for that reason, the definitive gesture of their writing is erasure. Ottessa Moshfegh’s My Year of Rest and Relaxation constitutes a series of experiments in non-existence; no work has ever given such rich, sustained attention to oblivion. It is a masterpiece. Moshfegh’s contempt, like [Sally] Rooney’s, embraces the world and then herself: The narrator decides to sleep for a year rather than commit suicide. Suicide would be a meaningful statement. If she killed herself, what would people say?

The author emphasizes the culture of striving, competition, and obedience to the teacher as the main contributor to this new approach, but he also rightly acknowledges that the idiosyncrasy of a distinctly cultivated literary voice can’t be adequately captured across multiple media. It doesn’t work for Instagram, which is just as important to the new style as the graduate school program. These books lack uniqueness, instead opting for a generic sort of weariness at the conditions that have allowed them to exist in the first place. At once, they’re both the TikToker expressing herself through a combination of well-refined stereotypes and the Reddit user watching her video and calling it cringe.

The workshop model of critiquing and preparing literary writing is essentially a miniature version of what social media feedback does to the process of self-representation. But the demands of one’s peers in the classroom don’t necessarily match those of the online world. It is thus easy to understand why, as that essay suggests, aloofness and detachment are used in constructing a literary stance to balance out the author’s real-life headlong rush into careerism and the desire to please. The MFA writer still lives in the situation Marshall McLuhan described as common to the age of print, wherein the heart and head are at odds with one another. It is not clear how long the contradiction will be able to sustain itself, but as fiction becomes less and less profitable and occupies an increasingly marginal part of the culture, the tension can probably afford to play itself out on taxpayer funding and endowments for a little while longer.

In genre fiction, which sells better, no such feigned detachment is necessary. It can gladly bathe in the cliché and make its home there. A recent viral tweet expressed giddy enjoyment at the opening sentences of a female-authored potboiler. They go like this:

Slayyy, queen

III. Film

Much of the stress we feel today comes from the clash of different media as we awkwardly apply our old content to new forms. We rely on schemata from television and film as a compass to help us navigate the intricacies of video production in the smartphone era. We really haven’t progressed much beyond the late 90s, when we conceived of “email” as “just like sending a letter but on a computer” and used a picture of a mailbox as the icon to access it. As Marshall McLuhan has pointed out, there’s always a delay before a population realizes the potential of a new medium because we look toward the future by staring into a rearview mirror. Rarely do we know how to be in the present.

Artists, according to this line of thinking, constitute a big exception, since they are comfortable with the present. Sometimes they are too comfortable, and few can grasp what they are attempting to say. One such artist was Federico Fellini, whose films were championed first by aficionados, then by the entirety of the literary avant-garde through much of the 1960s, until they became polarizing in the 1970s, when they lost clear narrative continuity and came to have more in common with (if anything) burlesque theater, fumetto, and verse poetry than with the more “dignified” medium of the novel. It is fitting that in Woody Allen’s Annie Hall (1977), McLuhan himself makes a cameo to humiliate a film snob who begins the scene by opining loudly about his dissatisfaction with the recent work of Fellini.

In one of his lesser-known films, Ginger e Fred (1986), Fellini directly addresses the alienation that arises from the dizzying clash of media forms, and he uses the televised variety show as the vehicle to do it. The film’s story is this: two tap dancers, Amelia and Pippo (Giulietta Masina and Marcello Mastroianni), who had specialized in Ginger Rogers and Fred Astaire dance sequences for live variety shows and vaudeville-style performances, are now reuniting thirty years later for a one-time appearance on a TV variety show. Pippo has become an eccentric loner who spent some time in a mental hospital after his wife left him, while Amelia has become a recently widowed but successful businesswoman. Both are single and would seem to have much to discuss, but they are overwhelmed by the absurdity that surrounds them in the television studio, which often interferes with their attempts to catch up with one another. Grotesque exaggerations of stock characters common in Fellini films parade around both the TV set and the nearby hotel in which everyone is staying: midgets, transvestites, oiled-up bodybuilders, fat women, magicians, line dancers, men inexplicably wearing helmets, women with blue hair, Catholic clergymen, etc. And throughout the film, a television can often be seen in the background displaying sequences (often advertisements for imaginary products) that resemble parodic imitations of Fellini’s filmmaking style.

The film’s storytelling is more conventional than most of Fellini’s late work, but in a word, it is cluttered. Many critics found this off-putting because they felt Fellini was lazily using the premise as an excuse to cram as many “Felliniesque” things into it as possible. Maybe, but in doing so he was making a point. As the film proceeds, it becomes increasingly apparent that the garishness of the televisual spectacle effaces the possibility that any authentic core might emerge from the main characters. Fellini himself acknowledged this, stating that his depiction of TV becomes a play of mirrors, in which imitations reflect images of other imitations seemingly without end. Perhaps putting too fine a point on it, the most genuine moment of honesty and warmth between Pippo and Amelia occurs when they are finally dancing together on live television and a storm causes the power to cut out, leaving the studio in darkness, untainted by the intrusion of electronic media such as the TV camera or even the light bulb.

At the most obvious level, Ginger e Fred seems to be contrasting cinema with television — the comparative innocence of Hollywood under siege from the bombardment of the unrestrained visual schemata of TV. But on closer inspection, its message is not so simple. The film appears to be something of an aborted love story, as evinced by a scene in which Pippo explains to a journalist, right in front of Amelia, that tap dance originated as a means of communication under slavery. As he explains to them, it was the Morse code of the black slaves, who would tell each other things like, “Watch out, there’s a guard, I have a knife, they’re doing him in… or rather… [at this point, he pauses dramatically] I love you. [He pauses again.] And I, too.” Though Amelia finds all of this fascinating, it doesn’t seem to occur to her that Pippo may be trying to say, in this explanation, that all the time they were dancing he was expressing his true love for her. Only in the moment of darkness, in which their dance is interrupted and they speak frankly with each other, does this possibility emerge.

Additionally, the dance they choose to perform is significant in its own right. It begins with Pippo (playing Fred Astaire) standing off in the distance, pantomime-smoking an invisible cigarette as a ship’s horn blows, indicating that he is about to board this mysterious ship and leave Amelia (playing Ginger) forever. “Adieu!” she calls. But then, right as Fred is about to leave, she changes her mind and calls out, “Fred!” prompting him to approach her as they begin their dance. As the variety show’s announcer puts it when he introduces the segment, “They are hugging and promising not to be separated again. The music … surrounds the two partners in work and life, and they’re still dancing together.”

The dance sequence couldn’t possibly have been from a real Ginger and Fred film, since the two always portrayed characters other than themselves. It was planned by Pippo and Amelia, imagining the real Ginger and Fred living out a performance much like those in their films. And given that Ginger and Fred movies were sometimes about dancers having their lives changed by their own performances, the idea, while tacky, is not altogether outlandish. The music accompanying the dance is a medley of three different numbers composed by Irving Berlin: two from Top Hat (1935) and, importantly, one from Follow the Fleet (1936). In Follow the Fleet, Astaire and Rogers play dancing partners who reunite after several years’ absence for a benefit concert. The act of their dancing together (in a melodramatic dance sequence that has its own internal storyline) resolves the plot, causes the dance partners to realize their love for one another, and prompts someone from Broadway to offer them a show, which they accept amid their newfound love. Wouldn’t it be nice, then, if the real Ginger Rogers and Fred Astaire could have perhaps declared their love for one another, right as they were about to part?

Essentially, Pippo and Amelia’s dance sequence is fan fiction about real people, much like the way X-Files fans a decade later would fantasize about David Duchovny and Gillian Anderson getting married. But by the end of the film, we realize that this fan fiction has become a wish-fulfillment fantasy that these two dancers are performing about their own lives. Through the performance of these romantic schemata, Amelia and Pippo have gotten to know each other and developed real feelings that they otherwise would not have had, but it is precisely these same schemata that prevent them from seizing upon those feelings, leaving them alienated from one another. The last scene of the film makes it clear: right when Amelia is leaving Pippo to board her train, Pippo calls out “Amelia!” and blows out an imitation of a ship’s horn through his hand, suggesting the possibility that she might rush toward him, just as they had done on stage. Amelia then smiles and imitates the exaggerated gesture of her calling out “Fred!” just like in their dance, but then shrugs, waves, and walks off toward her train. Of course, she could have actually behaved just as their dance suggested she should, but the familiarity of the convention would have rendered the gesture barren. The very last shot of the film shows an antenna salesman talking about how Rome can get 66 cable channels while a television plays a pasta commercial in the background.

Such a situation, the crux of the story, shows why Fellini is not making the straightforward and clunky attack on television that some audiences assumed. Fellini’s films often have a pattern: they start out satirizing something (e.g. fascism in Amarcord, Casanova in Casanova) only to turn around and gradually build a defense for their original target. Ginger e Fred is no exception. Yes, the televisual spectacle is alienating for Pippo and Amelia, but it is only by virtue of the mirror it holds up to their lives that they become aware of what their previous performances from so many years ago had held locked within. Only the new cliché of television could reveal what the old cliché of film had been concealing all the time. Had they never met for that gaudy and tasteless televised reunion show, the dancers never could have explored their own authentic feelings for one another. And yet that same encounter shows how their feelings, authentic though they were, had always been infused inextricably with the clichés of cinema. Pippo wants to ride the clichés and let them carry their love home, but Amelia can’t abide it. Only by rising across the aureate sky on wings of “cringe” could she have solved the problem — but it was not to be!

There’s still another level of media transference at play besides the switch from film to television, though, one even more pertinent to the concept of authenticity. Ginger and Fred may have been Hollywood stars, but Pippo and Amelia never were; they were variety show performers. Fellini stated that while he loved live variety shows on stage (one is featured in Roma), he hated televised ones, and this opinion clearly informs much of the film. We never get a taste of what Amelia and Pippo’s live variety shows may have been like. Their memory lingers in the shadows of the film, and we are left only to piece together how they could have been performed through the distortion of their televised presentation. This was undoubtedly a deliberate choice on Fellini’s part.

The variety show is a modern art form, but at first it went unrecorded. The live audience was not understood to be part of the show, though it certainly was; a theatergoer belonged to the audience rather than observing one on a television screen. A televised variety show also has a live audience, but its behavior is highly artificial, and the same kind of audience can still be seen on late-night talk shows and other productions today. For quite some time, the feeling of an in-studio audience was considered essential: since the real audience would be sitting at home, TV producers had to contrive the Platonic ideal of what an audience ought to be and present that to the viewers. Through the latter half of the 20th century, even shows with no live audience at all, cartoons included, used canned laugh tracks to simulate one. A show designed for a carnal audience lets that audience partake in the spectacle and even change its direction. But a show designed for a virtual audience, a television show, must use its live audience as an adhesive as it grasps into the darkness, trying to seize whichever audience it can find. Such an audience possesses no inner will; it responds passively to electronic “applause” signs and stage directions from the producer.

The carnal audience’s transition to the virtual audience is the chief difference between the pre-print world and the electronic one. If the clichés of what we encounter in the electronic world are weighing upon us too heavily for us to simply live life, it is only because we have the opportunity to view observation itself as an object of consideration rather than something to which we belong. Pippo and Amelia’s television performance is for a thoroughly virtual audience, and the artificiality of the live audience there in the studio, a mere prop, helps them realize how they really feel about themselves and each other. In the moment of the power outage, when the two characters finally have a quiet moment together, they plan to run off and ditch the show. It is no trifling detail that Pippo makes an obscene gesture at the audience right before the power is restored and their escape plan is foiled. The TV studio gives them a glimpse into “mass man” against which they can recognize their own uniqueness.

When Marshall McLuhan called TV a “cool” medium and film a “hot” one, he was thinking mainly of how the audience was situated before the production. Film is “hot” because the screen is really big, the audience is forced to sit there in the dark, and the picture and sound quality are high. TV is “cool” because the screen is small, you can do other things while the show plays, and the picture and sound are pretty crappy. But since then, digital video has taken over, theater attendance has dropped, televisions have become marketed as “home theaters,” high-definition screens have become the norm, and the number of channels has grown beyond counting. The distinction between film and TV has become meaningless. The erosion of a substantive difference began with the invention of the Betamax tape, which slowly rendered irrelevant the common space binding theatergoers together. Since then, the audience for the movies has gradually become as virtual as the one for TV, and it’s now just an assortment of clichés (we often call these “demographics”). In the old-time theaters, people were congregated together as a mass, and any live production would have to respect the kind of second-order observation that can only subsist in three-dimensional space. The kind in which the crowd thinks and acts as its own sovereign unit, with its own distinct, inimitable quality that can only emerge once and never again, and for which the ability to improvise was paramount. But TV targets increasingly small, isolated carnal audiences in order to reach big virtual ones. For the internet, the erosion of three-dimensional space has been even greater. Every new innovation in electronic media has been a defeat for the carnal and a triumph for the virtual. And in the virtual world, you’re merely two ears, a mouth, and a floating eye. You have a voice but no memory.

One last brief word. And it’s just for you. That’s right — you. Perhaps you’ve noticed that within any one of these disembodied audiences, audiences for just about anything, the individual members’ similarities are trifling and superficial. Such people would never understand or appreciate one another on an intimate level if given the chance. Their allegiance to one another is fickle, their resolve weak. The internet has blessed you with the ability to see their inconstancy for yourself, but that inconstancy has been around for much longer. Just look at fandoms and political movements (but I repeat myself). Observe how they build up their communities online and loudly proclaim their loyalty toward one another, only to sever it promptly, tearing themselves apart after the slightest disagreement about some minor ideological dispute, semantic quibble, attention-seeking social gesture, or perhaps a transient blip in the news cycle.

And if you can perceive all this hollowness while you are by yourself, and if you know you’re being lured in with impotent bravado and false promises, and if you’re encountering some particular piece of its content and you feel yourself becoming the cartoon character that it needs you to be, then it’s probable that your instincts will violently resist the transformation. Your feeling of absolute exteriority will intensify, and you’ll instead play the part of the beautiful soul, refusing to let yourself be tarnished by the tainted world. You’ll be distant and solitary, walking the earth in spite of it. Absorbing everything, returning nothing. Steadfast in your role as a discarnate being. At least, that is… at least, until you can find others who feel the way you do. You can befriend them, forge bonds with them, and then maybe you’ll have your own cliché to which you can belong. One that quickens your resolve and compels you onward. One through which you’ll view the physical world surrounding you anew. One that convinces you that you belong to something more. One through which you’ll achieve authenticity. What splendor! What joy! And then, my friend, love wins, and the world is yours.
