“For the foreseeable future, I’m fine, because I did buy crypto at the right time.” So said Frances Haugen, the former Product Manager at Facebook turned whistleblower.
Described as “one of the greatest sources of the century,” she shared documents on how Facebook executives handled politicized lies, with The New York Times reporting that “the company chose to let misinformation spread widely, to keep more people logging on. The series also noted the lengths that Facebook went to in its desperation to hang on to its audience as young people drifted away from its platforms.”
She mentioned crypto following allegations that Pierre Omidyar, the eBay co-founder, financially supported her.
Omidyar’s many nonprofits supported her “only for travel and similar expenses,” she said. Her own crypto holdings apparently cover the rest of her financial requirements.
Omidyar’s global philanthropic organization Luminate is handling Haugen’s press and government relations in Europe, and his foundation last year gave $150,000 to Whistleblower Aid, the nonprofit organization that is providing Haugen’s legal representation and advice.
Bill Burton, Barack Obama’s former spokesperson, runs public affairs for the nonprofit Center for Humane Technology (CHT), which receives funding from Omidyar. Haugen appeared on a Center for Humane Technology podcast earlier this month.
CHT focuses on “the systemic harms of the attention economy, which it says include internet addiction, mental health issues, political extremism, political polarization, and misinformation.”
“I don’t want to give the impression that Pierre was involved for months and secretly funding this behind the scenes,” said a Politico source who requested anonymity to speak candidly. “It is the case that he funded lots of work around big tech and democracy — lots of different organizations for several years. And when the Haugen disclosures became public, we leaned in and said, ‘How can we help?’”
The Politics of Censorship
This is just the latest development in what has for a few years now been a largely one-sided ‘debate’ over how public discourse should be handled and facilitated, one that has reached the stage where the Draft Online Safety Bill is expected to be introduced in the British Parliament, perhaps later this year.
The bill as currently drafted gives the Secretary of State “power to direct OFCOM to modify its codes of practice to bring them in line with government policy,” according to the Carnegie Trust.
Ofcom, which currently regulates broadcast TV, would become the regulator of “user to user” platforms, that is, platforms where users generate the content, as well as search engines.
It would have the power to shut down such platforms or search engines, or fine them up to £18 million or 10% of their global revenue, if they do not comply with the removal of legal, but ‘harmful,’ speech.
The regulation is limited to platforms where users primarily generate the content. This website, for example, is excluded even though users can comment, as it falls under limited user-generated content.
However, this publisher, and indeed any newspaper, would not fall under the ‘recognized news publisher’ exemption, as that is defined to cover news publishers required to register with Ofcom, namely broadcast TV news stations.
They try to include other news publishers within the exemption if they meet Ofcom-level requirements, but it is not clear why there should be such an exemption at all, placing such entities outside the ‘illegal and safety’ requirements. You’d think such publishers wouldn’t publish such content in the first place, and if they do, why should they be exempt?
The answer presumably betrays a certain something: this is going too far. Publishers, however, won their rights in the 1800s, when newspapers won the right not to be regulated at all or to require a license. So that right is transported here, perhaps even unconsciously. Yet that was at a time when not quite anyone could publish, due to the constraints of physical resources. Nowadays, isn’t anyone and everyone who makes any content whatever, including a comment, a publisher?
The distinction between the public’s participation in public platforms and journalistic participation in such platforms effectively creates two classes in a fairly unprincipled way, because anyone has the right to be a journalist and anyone has the right to be a news publisher. Thus the public’s freedom of speech and the freedom of the press, or indeed their constraints, are one and the same.
This is crucial, because any distinction would not be between journalists and the public, but between the public and government-sanctioned content producers, which effectively translates into a lesser right to publish for the public and a greater right to do so for the government.
That is especially so where these Ofcom-licensed publishers are excluded from a very wide definition of effectively prohibited content, as the draft act puts it:
“Content is within this subsection if the provider of the service has reasonable grounds to believe that the nature of the content is such that there is a material risk of the content having, or indirectly having, a significant adverse physical or psychological impact on an adult of ordinary sensibilities.”
Indirectly having an ‘impact.’ What that means exactly is for Ofcom to lay down in a Code of Conduct, which the Secretary of State can ask to be modified to bring it in line with government policy, or on grounds of national security and public safety.
Interestingly, financial harm is excluded from counting as an impact. So calling for someone to be deplatformed, which obviously has quite an impact on any adult of ordinary sensibilities, is fine. However, suggesting that ‘two weeks to flatten the curve’ is a plandemic that continues with boosters, and may go on for decades, with a bad flu season exaggerated completely out of proportion to serve the purpose of population control, as the two decades of the fake war on fake terror did, may well give ‘reasonable grounds to believe’ that people might not vax, and so be having a physical impact, and so should be censored. That is even though suggesting something has been exaggerated for agendas and suggesting it doesn’t exist at all are two different things, so the vax itself is a completely separate matter.
Yet platforms are doing this censoring even before the bill has come forth, and any adult of ordinary sensibilities would have reasonable grounds to believe they are doing it not of their own free will, but because they are or have been ordered to do so by the government, perhaps under the threat of compulsory legislation.
This specific example, however, concerns something ‘light,’ as whether exaggerated or not, it is at worst a very inconvenient imposition of government power and control over the populace. What is the cost, as these new state powers of censorship through a state regulator gradually develop, when the public debate is not over a bad flu or ‘aaaa,’ but over a more existential question, like whether to go to war?
Currently, thankfully, we are at peace, but this generation has now twice seen the very imposing government media machinery and utterly impactful propaganda that is somewhat effective at regimenting the public, with dissenting views effectively becoming prohibited.
It is then, when speech effectively becomes prohibited, that such public platforms are most needed to break the ice, break the mold, and break the propaganda, so that the public has a say and reinstates itself as the ruler, peacefully, through speech.
The legislation itself attempts to provide a safe harbor for political speech by creating two main obligations for these platforms and their users. One is a negative obligation, whereby users are prohibited from posting this ‘adversely impactful’ content; the other is a positive obligation, whereby users have the right not to be removed, or have action taken against them, for content that contributes to political debate.
That’s a very loose definition, but they’re basically saying Trump shouldn’t have been censored by Twitter. And/or that the ‘bad flu’ suggestion shouldn’t be censored, or is that not ‘political?’ Does the ‘safety’ or the ‘debate’ come first? Is ‘political’ limited to commentary on what Boris Johnson says, or is something like Qanon political and so shouldn’t be censored? There was a Jeffrey Epstein after all.
In addition, sites like 4chan would be banned by this bill, which demands mechanisms for the censorship of non-illegal content.
How does something like Imgur, which deals with images, comply with what is effectively an offense of taking offense?
And more generally, what exactly has gone wrong in the past two decades that the status quo where social media is concerned deserves the state’s hammer? That Trump won?
The draft legislation does, however, have a loophole, as it demands that some entity or some person has control over who can access a platform or publish upon it.
This may be bad for society, therefore, but good for bitcoin, as with smart contracts you can design a Facebook where no user or entity has control, making it exempt from any requirement to curate non-illegal speech.
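That claim can be made concrete with a toy model. The sketch below, in Python rather than an actual smart contract language, and with all names hypothetical, illustrates the property being described: an append-only post log that simply has no owner role and no moderation method, so there is no entity ‘in control’ for the bill’s definition to attach to.

```python
# Illustrative sketch only, not a real smart contract: it models the
# property the text describes, a post log that no owner, moderator,
# or user can edit or delete once deployed.

from dataclasses import dataclass


@dataclass(frozen=True)
class Post:
    author: str
    content: str


class OwnerlessLog:
    """An append-only log with no admin role: anyone can add posts,
    but nothing can ever be edited or removed, mimicking a contract
    deployed without an owner key."""

    def __init__(self):
        self._posts: list[Post] = []

    def publish(self, author: str, content: str) -> int:
        self._posts.append(Post(author, content))
        return len(self._posts) - 1  # the index serves as a post id

    def read(self, post_id: int) -> Post:
        return self._posts[post_id]

    # Deliberately no delete() or moderate() method: once deployed,
    # no entity has the control the draft bill presumes a platform has.


log = OwnerlessLog()
pid = log.publish("alice", "hello world")
print(log.read(pid).content)  # prints "hello world"
```

On an actual chain the same design choice would mean deploying the contract without retaining any privileged key, so that even its author could not censor or curate what users publish.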
This is thus more a piece of web2 legislation. For web3, they recognize it can’t apply. Either way, Mark Zuckerberg running for president might be a referendum of sorts on whether any such platform licenses should be required at all.
There’s been speculation he might want to run. He has the experience of running a global company worth some 5% of US GDP. He may also bring some tech to the civil service.
So he could potentially even win if there is no real competition, and neither Biden nor Trump would be such competition.
Legislators therefore need to bear in mind what the public actually thinks, and not what they, or some consultants, think the public thinks, because outside of some echo chambers, the suggestion of licensing requirements for publishing platforms is radical.
The bill does impose a right to speech, however, so Reddit presumably would have to reinstate The_Donald, or NoNewNormal (why did they even ban the latter?).
But that’s small comfort for what is effectively a censorship infrastructure, as market forces and users themselves can exert pressure on the platforms they patronize, while if the state is involved, there is no longer quite a choice in how a platform strikes the right balance.