Platforms are over

Platforms are on life support. Alternative AI interfaces are on the rise. Meta is shifting emphasis away from Facebook to AR- and VR-enabled portals for interaction. Mastodon is emerging as a friendlier, smaller-scale (for now) antidote to the mass interaction most platforms foster. Twitter has gone from serving as the PR instrument of President Trump to being the pet project of a billionaire. People have begun to exit platforms en masse, leaving behind zombie accounts with many followers and no activity. They download their content and lock up their accounts. It almost feels like they're locking up the house and leaving hostile territory, hoping to return when things are normal again, whatever that may mean. The people are leaving; the bots keep gaining ground.

Where does that leave journalism?

It’s time for journalists to rethink their relationships to platforms. Platforms are not neutral; they never were. They are technology, and per Kranzberg’s famous first law, technology is neither good nor bad, nor is it neutral. Platforms are human-made and reflect the biases of their makers — in particular, of their owners. If journalists want to maintain their commitment to democracy, they must rethink their relationship to platforms that do little to strengthen democracy.

I’m not suggesting that journalists abandon platforms as a site of research and inquiry. However, if news institutions want to rebuild public trust around their mission, they’ll have to think critically about the places through which they route their business, and their readers.

News organizations rely on platforms to distribute content and drive clicks back to their sites. Leaving platforms is a complex decision for them, one with economic repercussions. Staying on certain platforms, however, also has democratic consequences. Is it ethical for organizations that carry the work of democracy to maintain an affiliation with platforms that don’t?

Community, trust, and authenticity do not scale up easily, if at all. As platforms expand, they lose the authenticity that rendered them unique. This isn’t inevitable: Responsible scaling can help platforms grow larger in a manner that preserves the affect that originally drew people to them. Contextual curation, consistent moderation, socialization to a platform, and etiquette are some practices that can help maintain the original atmosphere of interaction a platform afforded. They can help preserve the sense of place, what Joshua Meyrowitz presciently described as the right balance between public and private that draws people in and fosters community and trust. But then again, community and trust aren’t things we create instantaneously or share in volume. We don’t trust everyone. We don’t feel close to everyone. We create our own places within larger spaces, and in doing so we create the closeness that can foster community.

As platforms continue to scale up, people’s connections to them will continue to thin out. Platforms will instead offer a Rolodex of contacts; a login to other spaces; a zombie account that collects dust like an abandoned house. They will become more vulnerable to content manipulation, engineered to serve the whims of venture capital and stock market shorting. At present, the world watches as Elon Musk tweets content that seems tailor-made to test its effect on stock valuation. Musk follows a strategy of generating noise, betting that it will maintain or increase perceptions of the platform’s value. And he mocks news organizations for criticizing his practices while remaining on his platform.

Why stay? Does the economic benefit really outweigh the reputational cost? The time seems opportune to leave and make a statement in so doing. What might shock the system more than all news institutions joining forces and leaving a platform like Twitter together?

If that seems like a lot, I’ll offer an alternative proposal.

In writing this piece, I asked ChatGPT to write me a manifesto for journalism. It offered a formulaic yet accurate treatise on fairness, objectivity, and democracy. The intelligence we create is tuned to give us the responses we trained it to give; does the world we live in fit that description? No. But what if news organizations trained their own conversational agents to engage in different modalities of news storytelling, ones that build on slower forms, like podcasts, that hold promise for building trust? I’m not suggesting that ChatGPT is immune to manipulation, nor that we substitute conversational models for human interaction. I recommend that we optimize language models, like ChatGPT, to complement and augment our abilities rather than substitute for them; to help news institutions become more engaged in building platforms for sharing the news, with a long-term investment in rebuilding trust rather than a short-term interest in profit. Journalists can work together with social scientists and engineers to give these infrastructures the right architecture: the kind that turns a space into a place; the form that fosters trust, community, and accuracy. It’s not a prediction, but it is a challenge and an opportunity for the coming year.

Zizi Papacharissi is a professor of communication and political science at the University of Illinois Chicago.