The US supreme court case that could bring the tech giants to their knees | John Naughton

Two weeks ago, the US supreme court decided that it would hear Gonzalez v Google, a landmark case that is giving certain social-media moguls sleepless nights for the very good reason that it could blow a large hole in their fabulously lucrative business models. Since this might be good news for democracy, it’s also a reason for the rest of us to sit up and pay attention.

First, some background. In 1996, two US lawmakers, Chris Cox of California and Ron Wyden of Oregon, inserted a clause into the sprawling telecommunications bill that was then on its way through Congress. The clause eventually became section 230 of the Communications Decency Act and read: “No provider or user of an interactive computer service shall be treated as the publisher or speaker of any information provided by another information content provider.”

The motives of the two politicians were honourable: they had seen how providers of early web-hosting services had been held liable for damage caused by content posted by users over whom they had no control. It’s worth remembering that those were early days for the internet, and Cox and Wyden feared that if lawyers henceforth had to crawl over everything hosted on the medium, the growth of a powerful new technology would be crippled more or less from birth. And in that sense they were right.

What they couldn’t have foreseen, though, was that section 230 would turn into a get-out-of-jail-free card for some of the most profitable companies on the planet – such as Google, Facebook and Twitter, which built platforms enabling their users to publish anything and everything without the owners incurring legal liability for it. So far-reaching was the Cox-Wyden clause that a law professor, Jeff Kosseff, eventually wrote a whole book about it, The Twenty-Six Words That Created the Internet. A bit hyperbolic, perhaps, but you get the idea.

Now spool forward to November 2015 when Nohemi Gonzalez, a young American studying in Paris, was gunned down in a restaurant by the Islamic State terrorists who murdered 129 other people that night. Her family sued Google, arguing that its YouTube subsidiary had used algorithms to push IS videos to impressionable viewers, using the information that the company had collected about them. Their petition seeking a supreme court review argues that “videos that users viewed on YouTube were the central manner in which IS enlisted support and recruits from areas outside the portions of Syria and Iraq which it controlled”.

The key thing about the Gonzalez suit, though, is not that YouTube should not be hosting IS videos (section 230 allows that) but that its machine-learning “recommendation” algorithms, which may push other, perhaps more radicalising, videos, render it liable for the resulting damage. Or, to put it crudely, while YouTube may have legal protection for hosting whatever its users post on it, it does not – and should not – have protection for an algorithm that determines what they should view next.

This is dynamite for the social-media platforms because recommendation engines are the key to their prosperity. They are the power tools that increase user “engagement”, keeping people on the platform to leave the digital trails (viewing, sharing, liking, retweeting, purchasing, etc) that enable the companies to continually refine user profiles for targeted advertising, and to make unconscionable profits from doing so. If the supreme court were to decide that these engines did not enjoy section 230 protection, then social media firms would suddenly find the world a much colder place. And stock-market analysts might be changing their advice to clients from “hold” to “sell”.

Legal scholars have been arguing for decades that section 230 needs revision. Freedom of speech fanatics see it as a keystone of liberty and regard repealing it as flipping the “kill switch” of the web. Donald Trump made threatening noises about it. Tech critics (such as this columnist) regard it as an enabler of corporate hypocrisy and irresponsibility. However you look at it, though, it’s more than a quarter of a century since it became law, which is about 350 years in internet time. Having such a statute to regulate the contemporary networked world seems a bit like having a man with a red flag walking in front of a driverless car. (Though, come to think of it, that might not be such a bad idea.)

Versions of the question posed by the Gonzalez suit – whether section 230 immunises internet platforms when they make targeted recommendations of content posted by other users – have been put to US courts over the last few years. To date, five court of appeals judges have concluded that the section does provide such immunity. Three appeals judges have ruled that it does not, while one other has concluded only that legal precedent precludes liability for recommendation engines. There’s no legal consensus here, in other words. It’s high time that the supreme court decided the matter. After all, isn’t that what it’s there for?

What I’ve been reading

Voice of reason

A Locus of Care is Justin EH Smith’s tribute on his blog to his late colleague, the French philosopher Bruno Latour, who died this month.

Ways of seeing

How Photographers in the 1970s Redefined the Medium is a nice anniversary essay in Aperture by Geoff Dyer.

Crippling debt

If you think Bitcoin Spews Carbon, Wait Till You Hear About…Banking is an interesting revelation by Bill McKibben on his blog about finance’s carbon footprint.

