Last week was rough for Facebook.
Further information came out that Russian agents had used the service to organise rallies in support of United States President Donald Trump and to buy pro-Trump ads. A ProPublica investigation showed it was possible to search for "Jew haters" and target them with ads for Nazi memorabilia.
These incidents no longer seem like accidents, but what's the right framework for thinking about the underlying failings of Facebook? I have a nomination: For all the wonders of contemporary technology, it is not so good at producing social context.
Let's consider the Russian organisation of pro-Trump rallies. In older times, you could imagine a Russian guy on an American street corner, trying to hand out leaflets, speaking to passers-by in his accent, trying to recruit them for Marxist causes. It probably wouldn't go well, in large part because the surrounding social context would make it clear what was going on, namely a clumsy attempt to boost a foreign cause for self-serving reasons.
Today, the same Russian could place an ad on Facebook, whether for Trump or for a fascist cause. With a minimum of professional effort, it can look just as good as the ads for more legitimate groups and not betray its origins. Low transaction costs lead to more good stuff, such as your nice Facebook posts, but they also enable more bad communications. When surrounding social cues are stripped away, we don't always know how to interpret - or dismiss - the bad messages on the site.
Unfortunately for Facebook, there is a permanent record of such dealings and screenshots can be made of the offending ads. Facebook is no more immoral than the phone company that allows pro-Nazi conversations to take place over its wires, but Facebook is more easily caught in the act. Viewers can be outraged by screenshots of the ads, which in turn go viral through Facebook and other social media.
So how do these recent incidents tie into the longstanding complaints from the tech critics? Arguably, Facebook is making it too easy for us to be superficially sociable, at the expense of deeper social and cultural context. That's hard to prove, but it's a framework for interpreting the growing pile of circumstantial evidence that something is indeed wrong with Facebook.
Consider how social networks have taken a lot of the power away from popular music. Formerly, young people used music to signal who they were and to which social circles they wanted to belong. If you were a feminist in the late 1990s, you might listen to Indigo Girls and trade Sarah McLachlan CDs and go to Lilith Fair concerts. But today, you can just make a few clicks to show your views with a Planned Parenthood support banner over your Facebook profile photo.
People have hardly stopped listening to music, but music is less moored to our social attachments, and it doesn't seem to have the cultural force or social influence or political meaning of earlier times. Pop music has been in the ascendancy, and, outside of rap, protest music is less important. From the charts you would hardly know we are living in the time of Trump.
Listening to music, or for that matter making social connections, is a lot easier and more seamless than ever before. But what we've done is strip away a lot of the social context and broader meaning surrounding those connections, in part because we no longer need music to signal our aspirations and our social standing. Musical forms such as punk, heavy metal, indie rock and folk music run the risk of being turned into historical artefacts, mostly disconnected from their original roles in bringing people together and marking the formation of common cultural bonds. As with the Russian propaganda, that too is a problem of missing social context.
When social context was front and centre, as in the older world of mainstream media, fake news was harder to pull off. For all their flaws, major, well-funded newspapers and somewhat boring television networks helped knit Americans together, and most people had a sense of the borders of what kind of reporting lapses might be possible or not. When Facebook brings you directly to "the news", without much cultural intermediation, the risk of outright lies rises, and it is less clear which pieces of reporting have been through credible external scrutiny. In essence, Facebook makes it too easy for us to communicate without the background social production of context.
Frankly, I don't see what we have to lose from spending less time on Facebook: Research indicates that social media use is often a kind of addiction. It doesn't make most people happier; rather, it causes them to feel alienated.
At least at current margins, I say hooray for cumbersome intermediaries and thick cultural textures. They keep our connections and our creativity vital, and so much the better if they also help limit Russian influence.