Readers today believe that they are living through unprecedented changes in how ...
The post Public Thinker: Leah Price on Books, Book Tech, and Book Tattoos appeared first on Public Books.
For many people, Halloween means it’s time to throw on a classic teen slasher like Halloween or Friday the 13th. Today, we often look back on those movies as festivals of gore and cleavage designed to appeal to teen boys. But, as film historian Richard Nowell writes, the most coveted audience for these movies at the time was teenage girls.
Nowell writes that teen slashers emerged in the wake of 1970s horror films aimed at adults. Starting with the success of Rosemary’s Baby in 1968, many moviemakers had centered scary supernatural plots on strong female characters. In contrast to the horror movies of earlier eras, these films generally avoided the trope of cowering, half-dressed women. For example, in 1973, the theatrical trailer for The Exorcist, and the film itself, focused on the working single mother of the possessed girl.
By the late ‘70s, adult horror audiences were on the decline. Overall, market research found, half of U.S. theatergoers were between 12 and 20, with a fairly even gender balance. Many went to the movies with dates, and industry professionals generally believed that teen girls usually chose which movie to see on a date with a boy.
To sell movies to a teen audience, writers and directors took special care with their depictions of teen girls. Debra Hill, cowriter of 1978’s Halloween, later said she wanted young women to be able to “see themselves” in the female leads, who spend significant time talking about schoolwork, dating, and babysitting.
While later commentary has often assumed that the sex in teen slashers was gratuitous and promiscuous, Nowell writes that films like Friday the 13th (1980) actually spent a lot of screen time showing couples’ sexual relationships as emotionally intense and romantic. Following on the heels of non-horror teen films like Grease, studio executives had discovered that young love and platonic teen relationships were strong assets for marketing a movie. Lobby cards for Friday the 13th featured few moments of horror or titillating shots of female leads. Instead, they showed romantic moments, platonic friendships, and even a female character showing a young man how to change a light bulb.
“Taken as a whole, Paramount’s lobby cards marketed Friday the 13th as female-youth-friendly entertainment,” Nowell writes.
The marketing apparently worked. Forty-five percent of the theater audience for Halloween and Friday the 13th was under 17, and, of those young viewers, 55 percent were girls.
Following in the footsteps of those hits, a flood of teen slasher movies showed up in theaters in 1981, including My Bloody Valentine, The Burning, and Friday the 13th Part II. These movies followed the newfound convention of mixing romance with horror, leading New York Times critic Vincent Canby to refer to the genre as “teen-age love-and-meat-cleaver films.” The heroines of these movies were traditionally feminine, tough, and sexually confident.
So, if you’re inclined to throw on something scary this Halloween while also celebrating empowered young women, it turns out there are a lot of options.
I was recently honored by the Electronic Frontier Foundation. Alongside Oakland Privacy and William Gibson, I received a 2019 Barlow/Pioneer Award. I was asked to give a speech. As I reflected on what got me to this place, I realized I needed to reckon with how I have benefited from men whose actions have helped uphold a patriarchal system that has hurt so many people. I needed to face my past in order to find a way to create space to move forward.
This is the speech I gave in accepting the award. I hope sharing it can help others who are struggling to make sense of current events, and those who want to make the tech industry do better.
I cannot begin to express how honored I am to receive this award. My awe of the Electronic Frontier Foundation dates back to my teenage years. EFF has always inspired me to think deeply about what values should shape the internet. And so I want to talk about values tonight, and what happens when those values are lost, or violated, as we have seen recently in our industry and institutions.
But before I begin, I would like to ask you to join me in a moment of silence out of respect to all of those who have been raped, trafficked, harassed, and abused. For those of you who have been there, take this moment to breathe. For those who haven’t, take a moment to reflect on how the work that you do has enabled the harm of others, even when you never meant to.
The story of how I got to be standing here is rife with pain and I need to expose part of my story in order to make visible why we need to have a Great Reckoning in the tech industry. This award may be about me, but it’s also not. It should be about all of the women and other minorities who have been excluded from tech by people who thought they were helping.
The first blog post I ever wrote was about my own sexual assault. It was 1997 and my audience was two people. I didn’t even know what I was doing would be called blogging. Years later, when many more people started reading my blog, I erased many of those early blog posts because I didn’t want strangers to have to respond to those vulnerable posts. I obfuscated my history to make others more comfortable.
I was at the MIT Media Lab from 1999–2002. At the incoming student orientation dinner, an older faculty member sat down next to me. He looked at me and asked if love existed. I raised my eyebrow as he talked about how love was a mirage, but that sex and pleasure were real. That was my introduction to Marvin Minsky and to my new institutional home.
My time at the Media Lab was full of contradictions. I have so many positive memories of people and conversations. I can close my eyes and flash back to laughter and late night conversations. But my time there was also excruciating. I couldn’t afford my rent and did some things that still bother me in order to make it all work. I grew numb to the worst parts of the Demo or Die culture. I witnessed so much harassment, so much bullying that it all started to feel normal. Senior leaders told me that “students need to learn their place” and that “we don’t pay you to read, we don’t pay you to think, we pay you to do.” The final straw for me was when I was pressured to work with the Department of Defense to track terrorists in 2002.
After leaving the Lab, I channeled my energy into V-Day, an organization best known for producing “The Vagina Monologues,” but whose daily work is focused on ending violence against women and girls. I found solace in helping build online networks of feminists who were trying to help combat sexual assault and a culture of abuse. To this day, I work on issues like trafficking and combating the distribution of images depicting the commercial sexual abuse of minors on social media.
By 2003, I was in San Francisco, where I started meeting tech luminaries, people I had admired so deeply from afar. One told me that I was “kinda smart for a chick.” Others propositioned me. But some were really kind and supportive. Joi Ito became a dear friend and mentor. He was that guy who made sure I got home OK. He was also that guy who took being called-in seriously, changing his behavior in profound ways when I challenged him to reflect on the cost of his actions. That made me deeply respect him.
I also met John Perry Barlow around the same time. We became good friends and spent lots of time together. Here was another tech luminary who had my back when I needed him to. A few years later, he asked me to forgive a friend of his, a friend whose sexual predation I had witnessed first hand. He told me it was in the past and he wanted everyone to get along. I refused, unable to convey to him just how much his ask hurt me. Our relationship frayed and we only talked a few times in the last few years of his life.
So here we are… I’m receiving this award, named after Barlow, less than a week after Joi resigned from an institution that nearly destroyed me after he socialized with and took money from a known pedophile. Let me be clear — this is deeply destabilizing for me. I am here today in-no-small-part because I benefited from the generosity of men who tolerated and, in effect, enabled unethical, immoral, and criminal men. And because of that privilege, I managed to keep moving forward even as the collateral damage of patriarchy stifled the voices of so many others around me. I am angry and sad, horrified and disturbed because I know all too well that this world is not meritocratic. I am also complicit in helping uphold these systems.
What’s happening at the Media Lab right now is emblematic of a broader set of issues plaguing the tech industry and society more generally. Tech prides itself in being better than other sectors. But often it’s not. As an employee of Google in 2004, I watched my male colleagues ogle women coming to the cafeteria in our building from the second floor, making lewd comments. When I first visited TheFacebook in Palo Alto, I was greeted by a hyper-sexualized mural and a knowing look from the admin, one of the only women around. So many small moments seared into my brain, building up to a story of normalized misogyny. Fast forward fifteen years and there are countless stories of executive misconduct and purposeful suppression of the voices of women and sooooo many others whose bodies and experiences exclude them from the powerful elite. These are the toxic logics that have infested the tech industry. And, as an industry obsessed with scale, these are the toxic logics that the tech industry has amplified and normalized. The human costs of these logics continue to grow. Why are we tolerating sexual predators and sexual harassers in our industry? That’s not what inclusion means.
I am here today because I learned how to survive and thrive in a man’s world, to use my tongue wisely, watch my back, and dodge bullets. I am being honored because I figured out how to remove a few bricks in those fortified walls so that others could look in. But this isn’t enough.
I am grateful to EFF for this honor, but there are so many underrepresented and under-acknowledged voices out there trying to be heard who have been silenced. And they need to be here tonight and they need to be at tech’s tables. Around the world, they are asking for those in Silicon Valley to take their moral responsibilities seriously. They are asking everyone in the tech sector to take stock of their own complicity in what is unfolding and actively invite others in.
And so, if my recognition means anything, I need it to be a call to arms. We need to all stand up together and challenge the status quo. The tech industry must start to face The Great Reckoning head-on. My experiences are all-too common for women and other marginalized peoples in tech. And it is also all too common for well-meaning guys to do shitty things that make it worse for those that they believe they’re trying to support.
If change is going to happen, values and ethics need to have a seat in the boardroom. Corporate governance goes beyond protecting the interests of capitalism. Change also means that the ideas and concerns of all people need to be a part of the design phase and the auditing of systems, even if this slows down the process. We need to bring back and reinvigorate the profession of quality assurance so that products are not launched without systematic consideration of the harms that might occur. Call it security or call it safety, but it requires focusing on inclusion. After all, whether we like it or not, the tech industry is now in the business of global governance.
“Move fast and break things” is an abomination if your goal is to create a healthy society. Taking short-cuts may be financially profitable in the short-term, but the cost to society is too great to be justified. In a healthy society, we accommodate differently abled people through accessibility standards, not because it’s financially prudent but because it’s the right thing to do. In a healthy society, we make certain that the vulnerable amongst us are not harassed into silence because that is not the value behind free speech. In a healthy society, we strategically design to increase social cohesion because binaries are machine logic not human logic.
The Great Reckoning is in front of us. How we respond to the calls for justice will shape the future of technology and society. We must hold accountable all who perpetuate, amplify, and enable hate, harm, and cruelty. But accountability without transformation is simply spectacle. We owe it to ourselves and to all of those who have been hurt to focus on the root of the problem. We also owe it to them to actively seek to not build certain technologies because the human cost is too great.
My ask of you is to honor me and my story by stepping back and reckoning with your own contributions to the current state of affairs. No one in tech — not you, not me — is an innocent bystander. We have all enabled this current state of affairs in one way or another. Thus, it is our responsibility to take action. How can you personally amplify underrepresented voices? How can you intentionally take time to listen to those who have been injured and understand their perspective? How can you personally stand up to injustice so that structural inequities aren’t further calcified? The goal shouldn’t be to avoid being evil; it should be to actively do good. But it’s not enough to say that we’re going to do good; we need to collectively define — and hold each other to — shared values and standards.
People can change. Institutions can change. But doing so requires all who harmed — and all who benefited from harm — to come forward, admit their mistakes, and actively take steps to change the power dynamics. It requires everyone to hold each other accountable, but also to aim for reconciliation not simply retribution. So as we leave here tonight, let’s stop designing the technologies envisioned in dystopian novels. We need to heed the warnings of artists, not race head-on into their nightmares. Let’s focus on hearing the voices and experiences of those who have been harmed because of the technologies that made this industry so powerful. And let’s collaborate with and design alongside those communities to fix these wrongs, to build just and empowering technologies rather than those that reify the status quo.
Many of us are aghast to learn that a pedophile had this much influence in tech, science, and academia, but so many more people face the personal and professional harm of exclusion, the emotional burden of never-ending subtle misogyny, the exhaustion from dodging daggers, and the nagging feeling that you’re going crazy as you try to get through each day. Let’s change the norms. Please help me.
we’re all taught how to justify history as it passes by
and it’s your world that comes crashing down
when the big boys decide to throw their weight around
but he said just roll with it baby make it your career
keep the home fires burning till america is in the clear
i think my body is as restless as my mind
and i’m not gonna roll with it this time
no, i’m not gonna roll with it this time
— Ani DiFranco
Whenever I’m grumpy about an update to a technology I use, I try to perform a self-audit examining why I’m unhappy about this change. It’s a helpful exercise since we are all by nature resistant to even minor alterations to the technologies we use every day (which is why website redesign is now a synonym for bare-knuckle boxing), and this feeling only increases with age. Sometimes the grumpiness is justified, since one of your tools has become duller or less useful in a way you can clearly articulate; other times, well, welcome to middle age.
The New York Times recently changed their iPad app to emphasize three main tabs, Top Stories, For You, and Sections. The first is the app version of their chockablock website home page, which contains not only the main headlines and breaking news stories, but also an editor-picked mixture of stories and features from across the paper. For You is a new personalized zone that is algorithmically generated by looking at the stories and sections you have most frequently visited, or that you select to include by clicking on blue buttons that appear near specific columns and topics. The last tab is Sections, that holdover word from the print newspaper, with distinct parts that are folded and nested within each other, such as Metro, Business, Arts, and Sports.
Currently my For You tab looks as if it was designed for a hypochondriacal runner who wishes to live in outer space, but not too far away, since he still needs to acquire new books and follow the Red Sox. I shall not comment about the success of the New York Times algorithm here, other than to say that I almost never visit the For You tab, for reasons I will explain shortly. For now, suffice it to say that For You is not for me.
But the Sections tab I do visit, every day, and this is the real source of my grumpiness. At the same time that the New York Times launched those three premier tabs, they also removed the ability to swipe, simply and quickly, between sections of the newspaper. You used to be able to start your morning news consumption with the headlines and then browse through articles in different sections from left to right. Now you have to tap on Sections, which reveals a menu, from which you select another section, from which you select an article, over and over. It’s like going back to the table of contents every time you finish a chapter of a book, rather than just turning the page to the next chapter.
Sure, it seems relatively minor, and I suspect the change was made because confused people would accidentally swipe between sections, but paired with For You it subtly but firmly discourages the encounter with many of the newspaper’s sections. The assumption in this design is that if you’re a space runner, why would you want to slog through the International news section or the Arts section on the way to orbital bliss in the Science and Health sections?
* * *
When I was growing up in Boston, my first newspaper love was the sports section of the Boston Globe. I would get the paper in the morning and pull out that section and read it from cover to cover, all of the columns and game summaries and box scores. Somewhere along the way, I started briefly checking out adjacent sections, Metro and Business and Arts, and then the front section itself, with the latest news of the day and reports from around the country and world. The technology and design of the paper encouraged this sampling, as the unpacked paper was literally scattered in front of me on the table. Were many of these stories and columns boring to my young self? Undoubtedly. But for some reason—the same reason many of those reading this post will recognize—I slowly ended up paging through the whole thing from cover to cover, still focusing on the Sox, but diving into stories from various sections and broadly getting a sense of numerous fields and pursuits.
This kind of interface and user experience is now threatened because who needs to scan through seemingly irrelevant items when you can have constant go-go engagement, that holy grail of digital media. The Times, likely recognizing their analog past (which is still the present for a dwindling number of print subscribers), tries to replicate some of the old newspaper serendipity with Top Stories, which is more like A Bunch of Interesting Things after the top headlines. But I fear they have contradicted themselves in this new promotion of For You and the commensurate demotion of Sections.
The engagement of For You—which joins the countless For Yous that now dominate our online media landscape—is the enemy of serendipity, which is the chance encounter that leads to a longer, richer interaction with a topic or idea. It’s the way that a metalhead bumps into opera in a record store, or how a young kid becomes interested in history because of the book reviews that follow the box scores. It’s the way that a course taken on a whim in college leads, unexpectedly, to a new lifelong pursuit. Engagement through algorithmically personalized feeds isn’t a form of serendipity; it’s the repeated satisfaction of Present You, with your myopically current loves and interests, at the expense of Future You, who will want new curiosities, hobbies, and experiences.
Yes, it’s understandable for conservatives to worry that if Silicon Valley censors the likes of Molyneux, it will end up censoring them. It’s sensible for them to join parts of the left in worrying about the concentrated power over information that the stewards of social-media platforms enjoy. And it’s necessary for them to recognize that the influence of redpillers and white-identitarians reflects their own failure, across the decades of movement-conservative institution building, to create something that seems more compelling to fugitives from liberalism than the Spirit of the Reddit Thread.
With all that said, though, a humane conservatism should still be able to thrive in a world where white nationalists have trouble monetizing their extremism, in which YouTube algorithms are built to maximize something other than addiction.
I’m not sure what Ross means in the last sentence I’ve quoted by “should.” Does he mean that “humane conservatism” is likely to thrive, or that if the system is fair it ought to be able to do so? I doubt the first and doubt the conditional of the second.
Here’s the situation as I see it. First, as Alexis Madrigal has recently written, the big social media companies will from now on find it harder to take refuge in the claim that they are “merely platforms”:
These companies are continuing to make their platform arguments, but every day brings more conflicts that they seem unprepared to resolve. The platform defense used to shut down the why questions: Why should YouTube host conspiracy content? Why should Facebook host provably false information? Facebook, YouTube, and their kin keep trying to answer, We’re platforms! But activists and legislators are now saying, So what? “I think they have proven — by not taking down something they know is false — that they were willing enablers of the Russian interference in our election,” Nancy Pelosi said in the wake of the altered-video fracas.
If you can’t plead platform neutrality, what do you do? Well, these companies being what they are, they’ll write algorithms to try to filter content. But the algorithms will often fail — after all, they can’t tell the difference between sites that promote hatred and sites that seek to combat it.
Where does that leave you? As Will Oremus points out, it leaves you with mob rule:
What should be clear to both sides, by now, is the extent to which these massive corporations are making up the rules of online speech as they go along. In the absence of any independent standards or accountability, public opinion has become an essential part of the process by which their moderation policies evolve.
Sure, online platforms have policies and terms of service that run thousands of words, which they enforce on a mass scale via software and a bureaucratic review process. But those rules have been stitched together piecemeal and ad hoc over the years to serve the companies’ own needs — which is why they tend to collapse as soon as a high-profile controversy subjects them to public scrutiny. Caving to pressure is a bad look, but it’s an inevitable feature of a system with policies that weren’t designed to withstand pressure in the first place.
Whatever should happen to humane conservatism on the internet, I don’t know what will, but as a person who is somewhat conservative and who would like to be humane, I wish I knew. In light of all the above, one thing seems nearly certain to me: If I were on a major social media service and a vocal group of that site’s users started calling me homophobic or transphobic or a white supremacist and demanded that I be banned, I would be banned.
It seems like every few months, a new movie remake populates movie theaters. These reinventions often adjust stories to fit today’s trends and values. But what happens when an entire genre is “remade”?
In her piece “Shall We Dance?: Feminist Cinema Remakes the Musical,” film scholar Lucy Fischer explores some of the ways in which female filmmakers such as Chantal Akerman flip the narrative of the traditional Hollywood musical, a form traditionally dominated by male and heteronormative direction.
The movie musical is a genre in which audiences suspend disbelief in order to enter a world where singing drives the story. A song often emerges from a quotidian moment, in an effort to display a character’s feelings or motivations. Gender is frequently at play. Fischer quotes film critic Rick Altman, who says that within musicals we
alternate between the male focus and the female focus, working our way through a prepackaged love story whose dynamic principle remains the difference between male and female.
Akerman’s The Eighties takes place entirely in a shopping center and follows its protagonist Jeanne as she finds herself caught up in multiple romantic entanglements. In a way, it’s a standard set-up for a musical, with a love triangle at its center. The first act of the movie features snippets of a play in rehearsal, which eventually culminate in that play’s performance in the second act.
Akerman creatively inserts herself through several stand-ins, such as the rehearsal director and conductor. Fischer writes, “As though to underscore the traditional power of male discourse (both on screen and off), when the first male actor speaks he does so assertively and no directorial voice comments on his delivery.”
Fischer also explains how the setting of the shopping center parodies the dominant culture of consumerism. She notes that traditionally, women in Hollywood musicals are seen as decoration. The Eighties, however, puts women in the center, with songs and a score that reveal the characters’ emotions.
Akerman dissects the melodrama that is traditionally part of a musical, using repetition to the point of absurdity. For example, the first scene of the film features a line about grief, “At your age, grief wears off,” that is reiterated multiple times throughout. According to Fischer, “The radical use of repetition underscores the redundancy of certain cliches in the melodramatic repertoire and foregrounds their endless replay in real women’s lives.” This theme of repetition extends to the use of multiple actors for a single role, which makes it more challenging for audiences to identify any particular character.
The second act of the film finally places the previous rehearsal scenes into context for the audience as the play is performed in full. Once in context, it is clear that the play is a parody of the musical romance. As a woman sings a romantic, explicitly sexual song, the lyrics become literal: she begins to make love with her partner while background singers poke their heads into the frame. Fischer notes the ridiculousness of these over-the-top musical sequences.
The film doubles as homage to and parody of the traditional musical romance, but Akerman revises the genre with her feminist sensibilities. According to Fischer:
Through her “re-make,” Akerman also engages what literary theorist Mikhail Bakhtin describes as the rhetoric of parody, wherein “the dominant discourse is reflected as something more or less bounded, typical and characteristic of a particular era, aging, dying, ripe for change and renewal.”
Get off of Twitter.
You can’t criticize Twitter on Twitter. It just doesn’t work. The medium is the message.
There’s an old joke where one fish says to the other, “How’s the water today?” And the fish responds, “What’s water?” On Twitter, you might ask, “How’s the outrage today?” (The answer, of course, is “I hate it! I’m so outraged about it!”)
Get off of Twitter.
Wait, have I said this before? Maybe only two or three hundred times.
But here’s why I keep saying it: The decision to be on Twitter (or Facebook, etc.) is not simply a personal choice. It has run-on effects for you but also for others. When you use the big social media platforms you contribute to their power and influence, and you deplete the energy and value of the open web. You make things worse for everyone. I truly believe that. Which is why I’m so obnoxiously repetitive on this point.
Just give it a try: suspend your Big Social Media accounts and devote some time to the open web, to a blog of your own — maybe to micro.blog as an easy, simple way in. Give it a try and see if you’re not happier. I know I am.
Hollywood director Frank Capra was born in Sicily as Francesco Rosario Capra on May 18th, 1897. He settled in Los Angeles with his immigrant family at five years old, and reached his height of Hollywood influence during the 1930s.
There are two main critical views on Capra’s work. One is that his films are “Capracorn,” kitschy stories about American small town heroes who conquer evil forces through grit and determination. Ronald Reagan, for one, saw Capra’s films as offering an optimistic view of American life, extolling the power of the individual.
Another view takes Capra’s most signature films—Mr. Deeds Goes to Town, Mr. Smith Goes to Washington, Meet John Doe and It’s a Wonderful Life—as dark messages about American life, pictures of a society overrun by corrupt elites with sinister motives, a place where the poor and working class struggle to little gain.
In the years between 1936 and 1946, Capra was given relatively free rein after Columbia Pictures reaped massive box-office earnings from his comedic 1934 hit, It Happened One Night. That was when he made those four signature films, which feature idealistic lone individuals set against corporate and political elites. American Studies scholar Glenn Alan Phelps argues that Capra’s darker vision of American life, exemplified in those films, is under-recognized. “I would sing the songs of the working stiff, of the short-changed Joes, the born poor…I would gamble with those pushed around because of race or birth. Above all, I will fight for their causes on the screens of the world,” Capra once said.
The four films share similar dramatic arcs: an unassuming young man from small town America is confronted with the power of large institutions catering to a secretive, corrupt elite. Idealistic values eventually prevail. “His America is quite simply a plutocracy,” writes Phelps. Capra’s wrath is often aimed at the media. The reporters in Mr. Smith accept political corruption. In Meet John Doe, a manipulative newspaper magnate changes the motto of his paper from “A Free Press of a Free People,” to “A Streamlined Newspaper for a Streamlined Era.”
It’s a Wonderful Life is now considered pleasant holiday television fare. But at the time it was a box office failure. Critics noted that its depiction of small town bankers as corrupt exploiters of regular people ran afoul of post-war American optimism.
Mr. Smith fared better at the box office. But at its initial screening held for Washington luminaries, many power brokers objected, including Congressman Sam Rayburn and Senator Alben Barkley. Joseph Kennedy, then ambassador to Great Britain, urged that the film not be shown in Europe, for fear that its negative portrayals of American politics would disillusion allies as war loomed. But Franklin D. Roosevelt liked the film; he related to its hero breaking from political bosses, and shared its negative views of newspaper publishers.
Phelps notes that the “Capracorn” view probably comes from the upbeat endings of Capra’s films. The individual hero always prevails, despite daunting obstacles, including thoughts of suicide. He blames the endings on the Hays Code, which governed Hollywood at the time. The Code called for any portrayed evil to be punished at the end. Capra also knew that the public was unlikely to accept dark conclusions.
Capra himself retired early from Hollywood, failing to connect reliably at the box office. His films live on, popular with subsequent generations at ease with more sinister portrayals of American life.
The post Frank Capra’s Not-So-Sunny Vision of American Life appeared first on JSTOR Daily.
What happened when the Beat Generation bumped up against the popular culture they were rebelling against? Historian Stephen Petrus writes about how the youth subculture was turned into a commodity.
“From the end of 1958 through 1960, popular magazines, newspapers, television shows, and even comic strips bombarded Americans with images of the Beat Generation,” writes Petrus. But these images weren’t so much of writers like Allen Ginsberg, William S. Burroughs, Jack Kerouac, and Gregory Corso, the Beat Generation’s leading lights. They were of beatniks, followers of the bohemian lifestyle popularized by “Howl” and On the Road.
Petrus defines the original Beats thus:
To contemporary scholars the term “Beat Generation” refers to a group of post-World War II novelists and poets disenchanted with what they viewed to be an excessively repressive, materialistic, and conformist society, who sought spiritual regeneration through sensual experiences. This band of writers includes Allen Ginsberg, Jack Kerouac, and William Burroughs, who originally met in 1944 in New York City to form the core of this literary movement.
“Beatnik,” on the other hand, was a term coined by San Francisco Chronicle columnist Herb Caen in April 1958. It was a play on “Sputnik,” Earth’s first artificial satellite, which had gone up in October 1957. Caen used the term to refer to “over 250 bearded cats and kits” “slopping up” free booze at a party sponsored by Look magazine. Like the more popular Life magazine, Look was obsessed with photographic spreads of beatniks. (Playboy had a different kind of beatnik photo spread in its July 1959 issue.)
The years between 1957 and 1960 marked “the acceptance of the beatnik dissent and the emergence of a fad: a cultural protest transformed into a commodity,” writes Petrus. There was fashion: loose sweaters, leotards, tight black pants, berets, and sunglasses were all the rage. There were spaces: coffee houses, cellar nightclubs, and espresso shops opened to meet the new demand. New York City even had a “Rent-A-Beatnik” service, where you could order up a poetry-reading/music-playing cool cat or cool chick for your event; sandals and bongos were available options.
The popular cultural responses to the beatniks ran from denunciation to tolerance to imitation. The 1960 Republican Convention featured J. Edgar Hoover proclaiming that “Communists, Eggheads, and Beatniks” were the country’s great enemies. Some Americans associated beatniks with drugs, delinquency, and un-Americanism.
Then again, as Petrus writes, “the original beatniks themselves became a tourist attraction” in San Francisco’s North Beach. There was even an American Beat Party, which nominated a Presidential candidate in 1960. Words like “cool,” “crazy,” “dig,” and “like,” entered the general American lexicon.
In the media, beatniks were mostly portrayed as “innocuous and silly figures, causing Americans to laugh at them and embrace them.” TV’s favorite beatnik was Maynard G. Krebs, a goofy, harmless man-child who shivered at the thought of work and began many a sentence with “like.” Wearing a chin beard and playing the bongo drums, Krebs ultimately edged out the title character of The Many Loves of Dobie Gillis (1959-1963) in popularity.
But, as Petrus writes, the original Beats were more serious than that.
Shaped by the effect of the senseless murder of World War II and the knowledge of a possible instant death by an atomic explosion or a slow deterioration by the cancerous force of conformity, the hipster responded to his situation by detaching himself from society and rebelling.
Then again, dissenters have a way of popping up again, elsewhere. The year 1960 was defined by a wave of civil rights sit-ins across the South, the birth of Students for a Democratic Society, and protests against the House Committee on Un-American Activities in Berkeley. What rough beast came slouching towards Berkeley to be born? Like, the Sixties had arrived.
Last Sunday afternoon, in the aftermath of the first game of the NBA playoff series between the Houston Rockets and the Golden State Warriors, there was much online huffing and puffing about whether the game’s referees had failed to call fouls against the Rockets’ James Harden and Chris Paul.
But something important was overlooked in said huffing and puffing: the fact that, whether Harden and Paul were fouled or not, they were desperately trying to get fouls called against their opponents. And that makes the last few seconds of that game a kind of parable of our cultural moment.
It’s possible that the Warriors’ Draymond Green grazed James Harden as Harden came to earth after shooting — after, that is, missing a shot quite badly, possibly because he was thinking less about making the shot than about getting the ref to believe that Green had fouled him, which he did by falling, completely unnecessarily, to the ground. The ball ended up in Chris Paul’s hands, and Paul charged into the Warriors’ Klay Thompson while flailing his arms wildly, determined to force a call. (He did not get the call, and in his rage shouldered the referee, which has earned him a fine.)
This kind of thing has, of course, long been the bane of soccer: players who might have a legitimate chance to score a goal, or at least got off a shot on goal, fling themselves to the ground and roll about in feigned agony hoping that they will get a penalty called or a yellow card assigned to the opponent.
I have come to believe that this is what almost all of our culture is about now: working the refs. Trying to get the refs, whoever the refs might be in any given instance, to make calls in our favor — to rule against our enemies and for us, and therefore justify us before the whole world.
What are students doing when they try to get speakers disinvited from their campus? Or when Twitter users try to get other Twitter users banned from the platform? Or when people try to get executives or members of some board of directors fired from their jobs? In each case, it’s an appeal to the refs. These people are not trying to persuade through reasoned argument or to attract public opinion to their side through the charm of their personality. They’re demanding that the designated arbitrators arbitrate in their favor. (Sometimes, as in the case of the college admissions scandal, they just bribe the refs.)
And it’s easy to see why people would think this way: If I assume the point of view underlying this habit, it means that nothing that goes wrong is ever my fault. If anything that I want to go my way doesn’t go my way, it’s because the referees didn’t make the right call. It’s never because I made any dumb mistakes, or indeed had any shortcomings of any kind. Things didn’t go my way because, whether through incompetence or bias, the refs suck. I would’ve won if it hadn’t been for the stupid refs.
I think this is a particularly attractive strategy in our current moment, especially on social media. As I wrote a couple of years ago,
When a society rejects the Christian account of who we are, it doesn’t become less moralistic but far more so, because it retains an inchoate sense of justice but has no means of offering and receiving forgiveness. The great moral crisis of our time is not, as many of my fellow Christians believe, sexual licentiousness, but rather vindictiveness. Social media serve as crack for moralists: there’s no high like the high you get from punishing malefactors. But like every addiction, this one suffers from the inexorable law of diminishing returns. The mania for punishment will therefore get worse before it gets better.
Call-out culture has many, many mechanisms of enforcement but none of forgiveness or restoration. A culture that knows only how to punish creates an environment in which, as Freddie deBoer has said, “everyone’s a cop”; but it simultaneously creates disincentives for people to admit that they might themselves need policing. Because who wants to apply the single-sanction one-strike-and-you’re-out criterion to themselves?
These reflections might help to explain a phenomenon that Michael Lewis describes on his new podcast “Against the Rules”: that the NBA is dealing with unprecedented levels of complaint about its officials at the moment when the league gives those very officials unprecedented levels of scrutiny, and unprecedented levels of training, and unprecedented opportunities to review and correct bad calls.
If refs are doing their job better than ever and simultaneously catching more grief for their errors, that just might be a result of our expecting more of them than is reasonable. In the NBA, and also in society at large, we do better when we try to solve problems ourselves rather than try to manipulate the refs into solving them all for us. I hope the Rockets get swept by the Warriors. (And that the Warriors get swept in the next round, because their moaning and bitching are almost as bad.)
UPDATE: I realized something right after I posted this — that’s always how it happens, isn’t it? — which is that by circling back to the NBA at the end of the post I elided a major distinction: The NBA refs may be “doing their job better than ever,” but that doesn’t mean that the same can be said for all our society’s referees. Indeed, many of them are doing a very bad job indeed. More on that in another post. (This is also what I get for writing a short post about an issue that needs to be treated at length.)
More and more often, mental health professionals, religious leaders, educators, and even tech professionals are exhorting us to “unplug”—to disconnect from the internet and our beloved devices on a regular basis. But the more we’re pushed to unplug, the more we need to ask some very fundamental questions about this newly trendy form of self-care. Here are the four questions to ask whenever someone tells you to put your phone on ice.
Distraction. Addiction. Compulsion. A constant craving for attention.
Whatever your tech-related problem, someone will tell you that the solution is to simply unplug. In this respect, the off-switch resembles nothing so much as the humble leech, long used as a go-to treatment for a mind-boggling assortment of maladies.
As Margaret Modig recounts in “The Strange Lore of Leeches,” “The basis for the widespread use of leeches in medicine stems from the idea that all diseases are caused by tainted or impure blood that has to be removed from the body.”
We make a similar error when it comes to the psychological and cognitive problems of contemporary life. As Ian Marcus Corbin writes in “Time To Log Off,” “It goes without saying—everyone knows it now, even if they can’t say why—that things like social media are bad for us, that many of us are clinically addicted to our phones, that life online brings out the worst in many, and probably in most of us.”
But we’re well within our rights to ask for a more precise diagnosis before submitting to the surgical removal of our devices, particularly since the relationship between digital causes and human effects is still so murky. “With only observational data, and no experimental controls, it is notoriously difficult to make causal inferences,” Paul Resnick, Eytan Adar, and Cliff Lampe write in “What Social Media Data We Are Missing and How to Get It.” “For example, if we observe a positive correlation between posting frequency and loneliness, it could be that loneliness causes posting rather than the other way round.”
While we’re still trying to figure out which of society’s problems can be attributed to technology use, we can at least get clear on which of our personal or professional problems we’re trying to address by unplugging. Unless you know exactly what ails you online and what you hope to gain by taking a break, it’s probably safe to keep on surfing, texting, and tweeting.
Listen to the unplugging evangelists, and you’d be forgiven for thinking that all that stands between you and a day full of enriching activity is the evil, seductive phone in your pocket. In his American Libraries column about the time he spends with his grandchildren, Will Manley writes that “I endure the inevitable screams of protest when I snatch their iPads, turn off the high-def big screen TV, and hide the video game console. That leaves a soccer ball, a set of watercolors, and a glue-and-paper book.”
Uh, yeah….all that and a grandfather who is a professional librarian. Not all of us have a spare, well-educated family member available to pick up the slack on childcare. (Admittedly my own personal childcare backup is a retired classics professor, but you get my point.)
When we think about taking screens away from our kids, it’s worth thinking about what else they’ll do instead. Is it better for a kid to spend her afternoon learning to build her own video game, or lying in bed reading Archie comics? Is it better for a teen to stay home playing Fortnite, or to hang out smoking weed at a friend’s house? (I do not have a definitive answer to either of these questions.)
We should be equally honest about the opportunity costs of our own online time. It’s comforting to imagine that without the lure of Facebook and Twitter, I’d have written three novels, filled the freezer with meals, and refinished my hardwood floors. But I remember life before social media: my freezer was just as empty, and my floors just as scuffed.
If anything, our time online may be displacing less productive activities: In “Clash of the Titans: Does Internet Use Reduce Television Viewing,” Stan J. Liebowitz and Alejandro Zentner found that “the Internet has reduced television viewing for individuals with Internet connections,” particularly for those “who have grown up since the personal computer was developed.”
Whether that online time represents a better choice than an hour of TV depends on what you’d be watching, of course—but it also depends on exactly what you’d be doing online. Before we turn to unplugging as a way of reclaiming our “wasted” time online, let’s get very clear about which ennobling and regenerative activities we plan to do offline (especially if we can’t use Yelp or Eventbrite to find them).
Unplugging is far from costless—and I’m not just talking about the hours of therapy required to separate me from my computer. There’s a massive literature about digital exclusion and the “digital divide”: the gap between people (and especially young people) with significant technology access, and those whose access to tech and tech skills is more limited (or non-existent).
As Paul A. Longley and Alexander D. Singleton write in “Linking Social Deprivation and Digital Exclusion in England,” “There is increasing awareness that the failure of individuals, households, and communities to engage with new information communications technologies has negative consequences in both the private (for example, purchasing behaviour) and public (for example, accessing services) domains.” By mapping data from a national Index of Multiple Deprivation (IMD) against data classifying information and communications technology (ICT) usage, the authors were able to conclude that “nation-wide patterns of digital exclusion and material deprivation are linked, and that high levels of material deprivation are generally associated with low levels of engagement with ICTs and vice versa.”
Research proves the negative economic and social impact of digital exclusion; it therefore takes a position of relative privilege to see unplugging as a chosen step towards self-actualization, rather than an externally imposed denial of opportunity. In “Revisiting the Digital Divide in the Context of a ‘Flattening’ World,” Deepak Prem Subramony notes that “social/economic/cultural groups that find themselves on the right side of the Digital Divide—namely, those who comprise the technological haves-knowers-doers—can be seen as largely oblivious beneficiaries of a vast matrix of privileges.” Precisely because we are awash in the many benefits of ubiquitous technology, we have the luxury of underestimating the cost of switching off.
But it’s essential for all of us to think not only in terms of the online annoyances we want to escape from, but also of the proven benefits of internet access that we turn our backs on when we turn off the screen. Each person faces somewhat different trade-offs, as we see in Vaughan Bell, Dorothy V. M. Bishop, and Andrew K. Przybylski’s summary of “The debate over digital technology and young people”:
With regard to social interaction and empathy, adolescents’ use of social networking sites has been found to enhance existing friendships and the quality of relationships, although some individuals benefit more than others. The general finding is that those who use social networks to avoid social difficulties have reduced wellbeing, while use of social networks to deal with social challenges improves outcomes.
The cost-benefit analysis of unplugging will differ for each and every one of us, depending on what we get (or lose) from our time online. But we need to spend just as much time weighing the benefits of technology—like the friendships and support we get through social networks—as we do disparaging its costs.
Of all the mysteries around the current vogue for unplugging, nothing mystifies me more than how it’s supposed to help us live in this actual reality—you know, the world in which most of us depend on our devices for little things like work, food, and transportation. In “Comte Unplugged: Using a ‘Technology Fast’ to Teach Sociological Theory,” Katrina C. Hoop writes about challenging a class of sociology students to spend seventy-two hours offline. As one of her students noted, “I had to tell my family and friends about this to prepare. Ironically, the fastest and most efficient way to do this was through a [Facebook] post. My post simply said: ‘wish me luck, no cell/phone/internet for 72 hours.'”
This student was hardly unusual: I’m exhausted by all the friends and colleagues who’ve announced their digital detoxes or Facebook breaks on Facebook or Twitter, and even more exhausted by the breathless social media posts about the experience. If a digital fast is not a turning away from the internet, but rather, some kind of tech vacation, what is its value in preparing us for the networks to which we must ultimately return?
In Hoop’s account of her students’ experiments, there are plenty of reflections on how hard it is to go without the internet, but no examples of how the experience actually affected student behavior or tech choices once they got back online. That represents a missed opportunity for the digital fasters: I actually think digital breaks can be useful, if you’re clear about what you want to learn from the experience, and how you might change your online behavior as a result. Too often, that reflection is left out of the process, so there’s no clear way for short-term unplugging to reshape long-term time online.
But all this undirected unplugging has a bigger cost, too. In framing the internet as something each one of us can check in or out of whenever we need to refresh our over-networked brains, it frames the perils of technology as a matter of personal choice rather than a social, political, or business problem. And as L.M. Sacasas writes, this is not “The Tech Backlash We Really Need“:
The tech backlash, emerging as it has within this centuries-old trajectory, will not achieve the perspective necessary to offer a substantive evaluation of our technological disorders. The critique emanates from within the system, assumes the overall beneficence of the system, and serves only to maximize the system’s power and efficiency by working out its bugs. Meanwhile, the big tech companies can rest ever more assured of their ability to withstand an occasional public battering and emerge unscathed so long as the bribe remains sufficiently enticing. So far, the tech backlash seems likely not to weaken the tech industry but to strengthen it, enhancing the power of the present techno-social configuration.
If you’re really concerned about the social, psychological, or cognitive impact of the internet, in other words, unplugging is both an ineffective and a selfish choice. Better to stay plugged in, and get ever-smarter about using mobile and social technologies to their full potential.
Yes, the devices and platforms of our new digital world deserve serious scrutiny, and require us to ask some tough questions about what we want from our online lives. But we should ask just as many questions of those who would have us unplug from technology, however briefly.
After all, unplugging is hardly a moral imperative. It’s a tool: a tool we can only use wisely when we are really clear about what we gain—and what we lose—by switching off.
Music has a way of bringing us back in time. A song from when you were a teenager can take you back to the school dance, or you might have a soft spot for that one pop song that reminds you of childhood. Curiously enough, however, many people feel a particular warmth for the music of not just their childhood, but that of their parents.
In 2013, Cornell University psychologists Carol Lynne Krumhansl and Justin Adam Zupnick conducted a study and found that “music transmitted from generation to generation shapes autobiographical memories, preferences, and emotional responses, a phenomenon we call ‘cascading reminiscence bumps.’” They found, somewhat unexpectedly, that it’s not just the music we listened to during the first two decades of our lives that provides us with fond memories, but also the music our parents listened to in their young adulthood.
The study’s participants were a group of sixty young adults with an average age of 20.1 years. They listened to Billboard Top 100 songs spanning the decades since the 1950s and were asked which songs brought about the most emotions or memories and whether those memories involved their parents or peers. The researchers played snippets of songs ranging from “Surfin’ USA” by The Beach Boys to “Irreplaceable” by Beyoncé, and the participants were asked to describe how those songs made them feel. The participants enjoyed the contemporary songs, since they were more nostalgic for their recent past. However, participants also recognized and enjoyed songs from previous decades, from the late ’60s to the early ’80s.
The researchers write that “these participants exhibited something like a reminiscence bump for music released in two time periods before they were born.” It was natural for the strongest emotions to be for recently popular songs, but they also found intergenerational influences when it comes to music. Those “reminiscence bumps” occurred with the participants as a result of listening to music from the 60s and 80s, the periods of time when their parents and grandparents were most likely forming their musical tastes as 20- to 25-year-olds.
Krumhansl and Zupnick write, “One assumes, therefore, that this music was played during parents’ child-rearing years, and made an imprint on our listeners when they were children.” The memories associated with the songs from decades past would often involve participants’ families and cause these nostalgic “reminiscence bumps” of enjoyment. The researchers ultimately reported that the relationship between songs and memories was “closely related to whether they made participants feel happy or energized.”
So while some may consider the music of yesteryear better than what’s on the radio now, that may simply be a result of what our parents listened to.
Bleary-eyed and snug under the covers, I end most nights in bed with my laptop propped on my lap as I squeeze in one more episode of Russian Doll. I binge, you binge, we all binge. But at this point, six years after Netflix began releasing full seasons of original television content all at once, what have we really learned about the binge watch? Experts researching our modern viewing practices have focused on the “flow” of streaming and how these design choices have helped bring us to the binge model, and on the (not unassociated) consequences of the algorithmic control these platforms have over our habits of consumption. The roots of these practices, though, lie further back than the arrival of streaming.
In a 1954 article for The Quarterly of Film, Radio and Television called “How to Look at Television,” Theodor W. Adorno wrote about his concerns over how little was understood about how television worked, and the dangers it posed:
By exposing the socio-psychological implications and mechanisms of television, often operating under the guise of fake realism, not only may the shows be improved, but, more important possibly, the public at large may be sensitized to the nefarious effect of some of these mechanisms.
While his focus is largely on the “imagery” of television and its attendant morality, Adorno presciently points out how significant the mechanisms of television would be to its reception and uses as an artistic medium, and the ways in which the audience would take on ever more importance in evaluating television and its cultural impact. These “mechanisms” are the specifics of how the medium works, the means television producers employ to keep viewers glued to the screen.
Scholars and consumers alike will put pressure on any new medium to account for what differentiates it from others. But this process takes place alongside a struggle to understand how the medium works and what effects it has on us. When the cinema came along, we wondered what it meant to attach images together sequentially, and later, with the release of home television sets, what it meant for an audiovisual machine to become a mass medium. More recently, video games have been subjected to the same lines of inquiry, with an emphasis on the effects of interactive play.
The media and communications scholar Arthur Asa Berger, writing in the Journal of the University Film Association in 1978, quotes the academic George Gerbner, who was speaking in testimony to the National Commission on the Causes and Prevention of Violence in 1968:
In only two decades of massive national experience television has transformed the political life of the nation, has changed the daily habits of our people, has molded the style of the generation, made overnight global phenomena of local happenings, redirected the flow of information and values from traditional channels into centralized networks reaching into every home. In other words, it has profoundly affected what we call the process of socialization, the process by which members of our species become human.
In the streaming age, we should remember this historical context. With the modern binge model of consumption, these concerns have intensified and evolved. The concept of television “flow” is not new, traditionally describing the strategies networks have used to keep viewers glued to their screens from program to program. The Marxist theorist Raymond Williams called it “the defining characteristic of broadcasting, simultaneously as a technology and as a cultural form.”
Television’s traditional linearity has been upended via home video, on-demand, and most significantly, streaming, with a greater emphasis put on “viewer choice,” such as it is. James Bennett, writing in Cinema Journal in 2008, said:
Television studies should pay close scrutiny to both the interfaces and applications that structure our access to “television” content and manage user flows, as well as the discourses and aesthetics of such content/programming itself.
Netflix, Hulu, Amazon, and the rest are deeply invested in these structures (each interface may look similar, but works according to distinct inner logics) and in creating the ideal consumer flow, intended to keep us watching for as long as possible. Berger commented: “Television, in many cases, creates the very dependencies we use it to overcome, so we become caught in a vicious cycle.” Watching television might soothe or entertain us, but the shows are written to draw us back in.
These moralistic (and vaguely health-related) worries over watching too much TV have existed since TV became available to regular families, and the contemporary bingeing affliction seems like a hyper-indulgent progression. It’s built around different systems, with better technology (you’ll probably notice Netflix introducing you to shows via a “Binge-worthy Shows For You” segment on your home page).
We’re also more aware than ever before of the 24-hour news cycle, with folks now claiming to binge-watch CNN, MSNBC, or Fox News. In an article from Spring 1984 in The Public Opinion Quarterly called “Political Correlates of Television Viewing,” the authors claimed that “television’s impact on the political front is not limited to news and specifically ‘political’ programming, but that the entire dramatic structure of network television may contribute to viewers’ underlying political orientations.” Interestingly, they claim that those who watch the most news and political programming tend to identify as moderates. Without an updated version of the study, I’d wager that the accelerated style of our news cycle and the drive to maintain eyeballs, along with many other factors, suggest that the opposite is now true, and we’re stuck in the muck of hyper-polarization partly as a result of these dramatic structures of network news television.
Compounded by the relative skill of platform algorithms, this all feeds into bingeable TV lives of personalization, each viewer experiencing their own individualized interface and consuming their own niche segment of the culture. A friend may tell you about their favorite show that’s popular now on Netflix, but if it’s not “on brand for you,” you likely haven’t even seen it appear in your app. Nothing is dominant anymore within a medium that was once thought of as the ultimate mass medium, or what the geographer Paul C. Adams in 1992 called “television as gathering place.”
With all this in mind, the nine critical propositions on binge-watching put forward by Tanya Horeck, Mareike Jenner, and Tina Kendall in a recent issue of Critical Studies in Television offer an instructive account of the unprecedented changes in viewer behavior we’ve seen in recent years. What we consider to be “bingeing” changes—it’s not just about the number of episodes you watch; it’s about the choice of a single show. Netflix’s full TV season drops have created an insular flow: “Rather than going back to the home page and making a deliberate choice… the post-play function takes us directly to the next episode. The ‘skip intro’ function even allows us to make the narrative flow feel more seamless.” Netflix tells us at every moment how to watch it.
The authors offer some directives for further work. One of these is to question the extent to which binge-watching has impacted biopolitical production. In its advertising campaigns, Netflix has made light of how sleep and the necessity to eat or leave your apartment can get in the way of your binge. Horeck and colleagues point out how this colonization of sleep is a core tenet of the company’s business model. As Netflix’s CEO put it in 2017: “We’re competing with sleep.”
Not unrelated, I think, is their proposition regarding low-intensity consumption. Much of the binge discourse has been about the high-intensity versions—long nights, deeply felt fandom. Concurrent with those practices is the opportunity to zone out through consumption: “in a binge culture, the banal pleasures of zoning out coexist with—and play an increasingly strategic role in maintaining—the intensive pleasures of switched-on viewing.” This complicates our immediate notion of what it is to binge by bringing attention to how Netflix and other services recreate and rearticulate the televisual experience. Nodding to the medium’s history of being left on as background noise, these low-intensity binges are a natural outgrowth of the auto-play function and the easy-to-digest content Netflix in particular is increasingly producing: cheap reality shows, cooking bonanzas, stand-up specials, and a wealth of assorted documentary content.
These authors also suggest that we consider how bingeing repurposes gender and racial dynamics, often via well-worn tropes. Netflix's role as a transnational broadcaster (everyone around the world can watch and discuss after a series drops) and the binge model's role in "shaping viewing protocols in relation to a wider media ecosystem concerned with the generation of 'clicks, likes and shares'" are both worthy pursuits. But what becomes clear is that bingeing in the streaming age is, right now, transforming how we watch, and we need to be aware of these changes and watchful of their consequences. As the media scholar Elana Levine wrote for Cinema Journal in 2011:
The medium need not speak in a single voice to be a factor in the exercise of dominant interests, nor do its audiences need to engage in a single experience of television to make their negotiations with it central to current social, cultural, and political debates.
Television and its flows remain hugely influential. We can’t just let ourselves binge.
Legendary filmmaker Barbara Hammer (May 15, 1939 – March 16, 2019) was known as one of the first lesbian experimental filmmakers. In the 1970s, she shook up the establishment by exploring queer life and feminism in her films.
Hammer wasn’t only a filmmaker, however; she was also a teacher. In 1984, she wrote the article “The Artist as Teacher: Problems and Experiments” as a way of documenting how she taught her students about art while being an artist.
Hammer wrote, “One of the ways I keep my ‘artist-self’ alive while teaching is to teach in new ways, to use my imagination in shaping or developing a class.” It was a welcome challenge to experience her students’ differing perspectives on the topics that she covered in film. Preparing for class required tremendous research in libraries, where she found little recorded history of women in film.
As a director, Hammer felt an inherent need to be in control of her artistic projects. Despite this instinct, Hammer wrote, “my chosen role as an art teacher is nondirective. I try to be a nonauthoritarian teacher of art who supports the unique development of students on a feminist basis of equal and shared learning.” In this way, she encouraged the artist within the student without imposing her own artistic ideas and strategies on them. When she found that a student was inspired by her work, she found it difficult—but crucial—to “hold back” and allow them to discover their style on their own.
It is evident from Hammer’s piece that she invested as much time and energy in her students as she did in her own work. “I no longer think it is necessary to create personal work while teaching full time. In fact, I suggest the opposite. When teaching, teach; when making art, don’t teach,” she wrote.
Hammer interrogated her identity as a teacher, which often challenged her identity as an artist. By collaborating with her students, however, she found the gender, power, and racial dynamics that she aimed to address in her films were also present in her classes. As Hammer wrote, “In creative teaching, the whole body, mind, spirit, physicality of the teacher is used; all her resources and capabilities are at play.”
This spring, the Brooklyn Academy of Music will present a retrospective dedicated to the critically acclaimed French filmmaker Claire Denis, called “Strange Desire: The Films of Claire Denis.” Denis is part of a group of contemporary French filmmakers who push the physical body to its limits. The genre is known as corporeal cinema, or cinema of the body and senses.
Film scholar Tim Palmer writes of Denis’s filmography: “Forcible and transgressive, this is a cinema of brutal intimacy.” Palmer includes in this category the work of Denis’s contemporaries Bruno Dumont and Gaspar Noé. While cinema is often a place of refuge, where viewers go to witness some sort of satisfaction or gratification missing from their own lives, Denis and her contemporaries create films that do just the opposite.
Palmer writes that their filmography
offers incisive social critiques, portraying contemporary society as isolating, unpredictably horrific and threatening, a nightmarish series of encounters in which personal relationships…disintegrate and fail, often violently.
To create this cinema of the body, the filmmakers craft stark visual and auditory experiences, making films that feel visceral to the viewer. For example, Palmer writes that Denis’s 2001 film Trouble Every Day features “carnal appetites [that] literally consume others.” In the controversial art house film, gore and lust are taken to the extreme when two former lovers share an interest in cannibalism. Palmer notes the use of parallel editing, or cross-cutting, to depict the slow descent from psychological decay into cannibalism. Additionally, Denis, along with her cinematographer, Agnès Godard, chose to shoot the most crucial scenes in the story at night to contrast with the harshness of day.
Denis extends silences, sometimes for up to twenty minutes, to create dramatic tension that leaves the audience restless. A disturbing soundtrack builds “from auditory claustrophobia rather than structured vocal interactions.” Both audio and visual elements are heightened during sexual encounters; Denis’s long takes make the scenes all the more excruciating for the viewer.
Because Denis often avoids light in the most intense scenes, it is typically the darkest scenes that illuminate the story’s narrative. This experimental, art-house, body-horror kind of film makes for an uneasy viewing experience, but that’s part of the point. Denis’s intent is always to explore the truth, even and especially when it disturbs the audience.
After being away from Twitter for a few months, I have two thoughts.
The first is that I wish I had departed years ago.
The second is that when I peek at Twitter, the whole enterprise seems so weird. It’s not that it seems worse than I had remembered, nastier or stupider; rather, the fact that people spend time on that platform now strikes me as absurd, inexplicable. And I was tweeting for eleven years before I departed! It’s remarkable how quickly my mind has reset itself to the pre-Twitter norm.