I. Crabs and chicken
The most radical thing I ever learned about writing I learned from an editor at Baltimore magazine, where I worked one summer before my senior year of college, when it was so swampy and hot I boiled like stew. One morning I showed up to the newsroom at 9, nauseated by the crab smell already wafting up from the fancy restaurant next door, when Jane, the food and dining editor, called me to her office. She suggested I write a story on chicken, about how local chefs were turning something once considered cheap and pedestrian into an order full of sophistication and technique. Sure, I agreed. I was more than happy to write about something other than crabs. That whole city is obsessed with crabs. I returned to Jane’s office 30 minutes later with a list of dubious evidence to support the claim that chicken was indeed making a comeback. She stopped me: You know you don’t have to prove it’s already a thing. You can just write it, and then it’ll become a thing.
Creativity, in this way, went against much of what journalism school had taught me: that proving a trend was nascent but demonstrable was a prerequisite to writing a story. Writing about phenomena people already wanted to read about was the very essence of all journalism bar breaking news, certainly the essence of service journalism and largely of cultural journalism, including food and dining.
While I have my doubts about how many valuable, lasting habits I picked up in my formal journalism education (any lack of which, I admit, falls mostly on my being too nervous and self-involved between ages 18 and 22 to move past trying to do assignments correctly and ever learn how to do them well), one idea I certainly internalized was that there exists a general cycle of cultural communication: It begins with 1) something happening, often or noticeably enough for people to perceive it; then 2) people talk about it; after which 3) beat reporters report the facts plainly, inverted-pyramid style with the five “W”s; and finally, if enough public or commercial interest remains in the story, 4) a different crop of journalists, whom we called “cultural critics,” analyzes (breaks down), explains, contextualizes, and re-synthesizes (names, frames) the phenomenon, which then influences the way people perceive the next something when it occurs. The cycle starts over, and so it goes.
Jane’s advice put a crack in the cycle: Creation, then, was shoehorned in between 1) people’s perceiving something and 4) journalists’ “critique” of such a thing as a valid cultural feature. It totally bypassed the need for a critical threshold of popular recognition. To just write “it,” and, in doing so, to create an “it” where before there was none, was revolutionary to the point of irresponsible: We couldn’t just have people running around publishing things previously unsupported, could we? Would that not spread misinformation, confuse people, kill democracy in America? Journalism schools in the mid- to late 2010s instructed their students to fear exactly this, for the sake of their own reputations and of the basic structure of society. And yet publishing something that said “Chicken is in!” suggested nothing to fear; there were no false facts, just the professional validation (as in, it was written about in a magazine) of something readers might consider random. With the chicken story, Jane’s Rule was, at least, sensible and harmless; at most, creative and wonderful.
I think about Jane’s Rule all the time. Even Johny, who that summer was studying political science in Cochabamba, remembers when I called him from my borrowed room in the northeast suburbs of Baltimore to tell him I’d just learned the most amazing thing at the magazine. My professors at Medill, who made us cite our sources into oblivion, would never have let us in on such dangerous information. We had to pay our dues first, get on a beat and publish hundreds of dry stories, live many years, and develop a sensitivity for when it’s worth skipping ahead in the cycle to stick our fingers in the cultural pot. The irony is that after all that, the thing worth spending your professional capital on might be as silly as a story about chicken. Having a little fun is serious business.
II. Alienation
The problem now is that that serious business, the literal one, the journalism industry, is a shell of its old self, even of what I saw of it in ‘18 and ‘19, which was already looking haggard. Still, as long as they can keep the lights on, magazines and newspapers remain vital instruments for discovery: Editors decide what articles and advertisements fill an issue, just as bookstore owners carefully stock their shelves and museum curators select which installations visitors will see while inside the building. Some god-play is always necessary in the negotiation of space and attention. When fallible people make such choices, the results are imperfect and unfair, but earthbound and reasonable. One reads the local newspaper and can feel the presence of a human soul.
But when space is virtually unlimited, as it is online (except, we forget, that the internet is actually not immaterial but is stored physically in gigantic data centers, which take massive amounts of energy to cool and maintain—but I digress), the amount of information that comes in to fill it becomes overwhelming and un-sortable by people alone. Algorithms and ranking functions, therefore, are a rational solution. No sane person wants to spend their life selecting which videos to send to a person’s endless For You Page based on their user profile—more than just tedious, it would be ethically heart-wrenching. Anyway, we know objectivity is a desert mirage. It is impossible for a person to make a perfect evaluation of anything to do with beauty, truth, or freedom. It’s no wonder, then, that it seems smart to have algorithms set bail; but then again, it’s no wonder that, in the end, they’ve brought us nowhere closer to answering the question of justice, but instead have simply passed the power of judgment to something we can’t reason with.
Aside from that, the internet is of course a force of globalization. Remotely immersive experiences of faraway places are uncanny, and mutual exposure to certain media creates a global common ground. This is good, to a certain degree, insofar as it fosters sympathy for any Other and helps us recognize our great human sameness. But when algorithms, spiritually unaware of the meaning of place and culture, replace localized editors, an inflated sense of global consciousness emerges, and the difference between our actual experiences and the ones we see online seems negligible. “I’ve never had an original thought” is a popular thing to comment.
So consider all that, and then consider the effect the near-total collapse of the institution of journalism has had on cultural writing: Without the authority of publications to back them up, writers find themselves in a precarious situation. Funneled onto unsound, decentralized highways like Substack self-publishing or intellectual-as-influencer branding on their quest for commercial success, publication-less writers cull from all their random ideas and unique observations only what will fit their online brand or perform well for SEO. Everything must be relevant, current, obvious in its service to the reader.
So, the cycle: perceiving, thinking, talking, reporting, analyzing, creating. The current state of media has revealed that that old pattern—like everything one assumes is part of a reliable world order, which seems totally fixed until suddenly it’s not—was more fragile than I thought: It was held together on the condition that certain places (journalism schools, newsrooms) and professions (beat reporters, editors) also remained solid. When those pillars fall away, so does the reporting step of the cycle. (That goes elsewhere, removed to a different plane, wherever the last bastions of news are, at the AP, Reuters, Politico, and local newspapers while they still exist.) Couple that with how fast things move online, how frequently a modern writer must now post/publish to keep their “cultural critic” brand relevant, and see the perception and thinking steps fall away, too. Hannah Arendt wrote in an epic essay called Thinking, “All thinking demands a stop-and-think… It interrupts any doing, any ordinary activities, no matter what they happen to be.” And that includes being online, listening to takes, staying on top of the zeitgeist. I’ve never had an original thought. Thinking, processing, and committing to an argument is risky business. It’s safer to write what other people are already talking about, to analyze whatever’s already buzzy, to create nothing new, and to estrange ourselves from the soul.
It’s not that nothing released recently has heart. Some things do, absolutely. You know them when you find them; they taste different. But there is a steepening trade-off between craftsmanship and exposure, and it’s exacerbating the homogenization of not just the content we see but the feelings we have about it, and so too the content we go on to create. We’re getting stuck on a dead-end road, with “people talking” on one end and “cultural criticism” on the other, where information, prepared for the palate of machines, bounces back and forth faster and faster until it becomes one monotonous flatline called culture. Browse your recommended content on TikTok, YouTube, Substack, whatever, and hear a stainless engine whirring behind it. Where is the creativity, the courage? The chicken?
III. Recovery
The 24 bus in San Francisco runs south down Castro Street, climbing to summit Dolores Heights before careening back down into Noe Valley. The sun had already set by the time the 24 was jerking down the hill, and Johny and I sat inside, falling into each other on the decline. We had decided that evening that Alice’s on Sanchez and 29th was our favorite Chinese restaurant in the city because of how fresh the lo mein always is. (The last time we’d dined in was about a year ago, when the waitress asked if we needed a bigger table because we’d ordered too much food, which was, to me, a horrifying remark—and yet, the lo mein.) So that’s where we were headed, by then deep in a conversation about the whereabouts of creativity in modern media.
Johny was the first of us two to call it the soul. That real and unknowable thing, absent in so much art in 2024. It’d taken us most of the bus ride to Alice’s to name it. But there are other names from the past: Emma Goldman, in her essay Jealousy: Causes and a Possible Cure, which I’d found in a dry red zine at the anarchist bookstore on Haight Street, calls it personality. In that essay she argues that there are no innately jealous people, only those who forget, temporarily, “that all others who are unlike ourselves, who are different” have, like we do, “the right to oneself, to one’s personality.” In 1910, I suppose everyone was sure enough of their own inner world; only the recognition of such a depth in others was required for the sake “of understanding, of sympathy, and of generosity of feeling.” For the sake of freedom.
Then, of course, there is Joan Didion, who called it character in her essay Self-respect: Its Source, Its Power, published in Vogue in 1961. “To have that sense of one’s intrinsic worth” is to recognize our own unique scoop of universe, which is the very foundation of creativity, of art that is not just interesting but wonderful. A decade later, Hannah Arendt remarked in The Life of the Mind (published posthumously in 1978, and which also includes Thinking) that this thing, “sometimes called personality or character,” is the manifestation of the mind, the organ of thinking, willing, and judging, which itself is the instrument of the soul, the chaotic center of impulse and passion.
It seems that every 50-odd years we must reorient ourselves to it, whatever it’s called. Emma tells us to remember it exists within others; Joan tells us to remember it exists within ourselves. Now we must remember it exists at all. Perhaps the millennial snowflake discourse has buried the truth of individuality under the arguments against runaway ego and unchecked tenderness, but there are consequences for quietly tipping the scales so far in favor of strategy and detachment. There must be a balance between science and art. They are not the same.
IV. Fortune-telling
Taken from a certain perspective, what I think will happen next is optimistic, which is surprising even to me; the state of things is looking hairy, and to my own detriment I’ve never been an optimist. But some attempt at a mass rebellion against tech-made culture, against handing over the reins of our lives to something that has no sense of preciousness or love, seems, frankly, more likely than entire societies waltzing happily into a Wall-E situation for the sake of mere convenience and comfort. I can’t imagine the mind withers away completely without us noticing or caring. The human need to create and work and fight is more deeply embedded than our relationship with social media and its externalities. If everyone’s a sellout now, like they say, but only some people are actually getting things sold, then what we can expect, I believe, is a new wave of artists who decide not to play the game.
Equipped with time to think and unburdened of comment sections, I’ll wager these artists end up publishing pieces that are more specific, argumentative, and, likely, free, or at least sold for chump change compared to what an intellectual-influencer could get. That said, I think the “get that bag” pendulum is going to swing. (So they will probably have day jobs until, I guess, they organize and rebuild, say, journalism and publishing houses from hyperlocal epicenters.) And because the medium is the message, etc., I think we will see a widening difference between art made for and of social media, and art that is not; both can be good, but they must be made in different ways, at different stations. Crabs and chicken. It’s tempting to look around and think the internet will eat everything eventually, but if I am sure I have a soul, I must, per Ms. Goldman, be sure everyone else does too. If I feel a revolt brewing, I cannot be the only one.
We finished our dinner and the fortune cookies came. I really stomped my feet and shook my fists at the moon once we got back to the bus stop, when I realized I’d left my fortune on the tray with our receipt. After our whole conversation that night, that fortune had hit so hard it felt like a message from God, and I’d meant to take a picture for proof. It’s okay, Johny laughed at me, isn’t that the whole point, not to need evidence? You’ll know it was true. Argh. Yes, yes, you’re right, and I went home kicking myself. All I can do now is tape the memory of it onto the page in my mind where back in Baltimore I wrote down Jane’s Rule, and when I need the kind of kick in the ass that can only come from some true and encouraging cheese, reread it: You are capable of tremendous creativity.