Biohacking, a cultural movement that aims to democratize science by bringing ivory tower knowledge and tools to the masses, can make for fascinating viewing.
So rather than tuning into one of the politics-obsessed news stations, you might go to YouTube to watch a TED talk on biohacking.
Left with a thirst to learn more, you keenly await the next video YouTube will load. The automatically recommended video, as happens far too often, turns out to have nothing to do with biohacking.
It is perhaps a talk on microbes, or maybe on body language, or really on anything but what you were looking for. The other recommended videos on the sidebar are probably all music you’ve previously listened to. Again, nothing to do with the video you just watched.
Personalization, as they call it, turns out to be more of a broken tape: constantly feeding you the past, constantly repeating the same thing, and constantly bringing you back to older stuff as you try to learn of new things.
One can, of course, manually search for something, but people are lazy. So, to your enjoyment, the videos will often move from Trump to Trump to Trump, and then a bit more Trump.
Who cares about flying cars, or robots, or science, or art, when all roads eventually lead to Trump? Which Trump you get, of course, depends on whether you lean left or right. The former will probably end up at some talk show that tries to be funny but really amounts to just screeching anger. The latter may end up at one of Trump’s latest rallies or at Fox News.
Independents, on the other hand, may end up at whatever they’re into. If you like the gym, then even if you’re watching the moon landing, the next video will probably be some workout.
Here, at least, all is visible. We know the recommending bot is being dumb. We know it has given us the wrong video. We may feel a bit annoyed that we could not easily find more content just as interesting as what we watched, but the harm is minuscule compared to that of other personalization bots.
Personalization now extends to Google searches themselves. The idea is to rank content based on what is “relevant” to you. This happens subtly: Google’s secret algorithm weighs many factors, one of them effectively being to reflect your own bias back at you.
Google, of course, is a bot, so it doesn’t quite know what your bias is, but it knows whether you spent one minute or ten minutes on a website. It knows if you quickly left an article, or if you read it thoroughly. It knows many other signals that allow it to reach conclusions on what you “like” and what you don’t, with the aim of filling the first page of your search results primarily with left-only, or right-only, or gym-only content.
Google hopes this pleases you, so that you come back to Google or spend more time on the website it sent you to, a site that probably runs Google ads, meaning you are more likely to click on one.
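To make that concrete, here is a toy sketch of how engagement signals like dwell time might feed a personalization score. Everything in it — the signal names, the weights, the per-topic scores — is an illustrative assumption, not Google’s actual (secret) algorithm.

```python
from collections import defaultdict

# Toy personalization model: boost results on topics the user has
# previously engaged with. The signals and weights are illustrative
# guesses, not Google's actual (secret) ranking factors.

def update_preferences(prefs, topic, dwell_seconds, bounced):
    """Accumulate a per-topic preference score from engagement signals."""
    score = dwell_seconds / 60.0   # longer reads count for more
    if bounced:                    # quickly leaving an article counts against
        score -= 1.0
    prefs[topic] += score
    return prefs

def rank_results(results, prefs):
    """Order results by base relevance plus the user's accumulated topic bias."""
    return sorted(results,
                  key=lambda r: r["relevance"] + prefs[r["topic"]],
                  reverse=True)

prefs = defaultdict(float)
update_preferences(prefs, "left_politics", dwell_seconds=600, bounced=False)
update_preferences(prefs, "right_politics", dwell_seconds=20, bounced=True)

results = [
    {"title": "Left-leaning article",  "topic": "left_politics",  "relevance": 1.0},
    {"title": "Right-leaning article", "topic": "right_politics", "relevance": 1.2},
]
# The slightly more relevant right-leaning article gets pushed down,
# because the user's history favours the other camp.
print([r["title"] for r in rank_results(results, prefs)])
```

Even this crude model shows the broken-tape effect: once your history tilts one way, the more “relevant” result from the other camp never makes it to the top.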
Facebook has its own broken tape that keeps feeding you the same type of content. The end result is different camps with their own reality and their own “facts.” According to a new survey:
“Nearly eight-in-ten Americans say that when it comes to important issues facing the country, most Republican and Democratic voters not only disagree over plans and policies, but also cannot agree on basic facts.
Ironically, Republicans and Democrats do agree that partisan disagreements extend to the basic facts of issues.”
That’s probably primarily because Republicans never hear of Democrats’ “facts” and vice versa, with this online broken tape mirrored offline, where Fox and CNN reflect their own viewers’ biases back at them.
Thus, in contrast to, say, a library, where you can roam around and maybe stumble on a very interesting book, roaming around online is now nearly a dead art.
The closest thing to it is Reddit or 4chan, but the latter is hardly a welcoming place, while the former facilitates censorship, as clearly shown during the 2016 election when r/politics effectively became r/hillary.
That means online fora lack a public gathering space where left, right, artists, and scientists intermingle to have non-subject-specific discussions, like in the coffee houses of old.
All of which, conceptually, creates a dangerous potential weapon of grand-scale manipulation. For example, could YouTube really not find another biohacking video? Is this really the bot being dumb, or is it perhaps the algorithmic biases of the executives who hired the coders? Do they subtly brush away challenging content, or wrongthink? Are they, in effect, secretly manipulating the public?
Instead of giving right-leaning content to a Republican, YouTube could, for example, show him or her one or two right-leaning videos and then switch to left-leaning ones, or vice versa.
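Such a mixing policy is trivially simple to express in code. Here is a minimal sketch, assuming hypothetical camp-labelled video lists and the roughly one-in-three ratio suggested above:

```python
import itertools

# Toy diversity policy: after every couple of videos matching the
# user's lean, deliberately surface one from the opposite camp.
# The labels and the 2:1 ratio are illustrative assumptions.

def diversified_feed(own_camp, other_camp, own_per_cycle=2):
    """Interleave videos: own_per_cycle from the user's camp, then one from the other."""
    own, other = iter(own_camp), iter(other_camp)
    feed = []
    while True:
        batch = list(itertools.islice(own, own_per_cycle))
        if not batch:
            break
        feed.extend(batch)
        feed.extend(itertools.islice(other, 1))  # one opposing video per cycle
    return feed

right = ["rally clip", "fox segment", "debate recap"]
left = ["talk show bit", "town hall"]
print(diversified_feed(right, left))
```

The point is not this particular ratio, but that a recommender choosing to puncture the bubble — rather than reinforce it — is a one-line policy decision, not a technical impossibility.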
Is this actually happening? Well, we don’t know, and we can’t know, because the algorithms are secret. Google will never say how it determines page ranking because it claims people would then be able to game the rules.
The more probable reason is that they don’t want their competitors to copy it, because an entire industry has sprung up around Google that can often guess what the rules are. A for-profit industry that is more focused on getting its clients to the front page than on ringing any alarm bell over Google potentially being politically biased in its rankings.
Nor do we know that they are, but that’s the problem. When it is obvious that someone is manipulating you, that someone no longer has any manipulative power. When it is done in secret, well, you might be led to believe it is your own conclusion.
Which is why we should perhaps try to get rid of these secret algorithms by offering open alternatives that run on public networks like Ethereum’s blockchain.
The code would then be open, so at the very least we’d know what its biases are. Full transparency would further mean that no one could change the algorithm in secret.
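The transparency argument boils down to this: if the ranking is open code and a deterministic function of its inputs, anyone can re-run it and verify the output. Here is a minimal illustration, with an invented placeholder scoring rule:

```python
import hashlib
import json

# If a ranking algorithm is open and deterministic, anyone can recompute
# the results and check that no hidden tweak changed them. The scoring
# rule below is an invented placeholder; the point is reproducibility.

def score(item):
    """A simple, fully public scoring rule: upvotes minus downvotes."""
    return item["upvotes"] - item["downvotes"]

def rank(items):
    """Deterministic ranking; ties broken alphabetically by title."""
    return sorted(items, key=lambda i: (-score(i), i["title"]))

def ranking_digest(items):
    """A hash of the ranked output that independent parties can compare."""
    titles = [i["title"] for i in rank(items)]
    return hashlib.sha256(json.dumps(titles).encode()).hexdigest()

items = [
    {"title": "Biohacking talk", "upvotes": 10, "downvotes": 2},
    {"title": "Workout video", "upvotes": 7, "downvotes": 1},
]
# Two independent runs over the same data yield the same digest
# regardless of input order, so any secret change to the ranking
# would be detectable by anyone re-running the open code.
print(ranking_digest(items) == ranking_digest(list(reversed(items))))
```

That is roughly what publishing the algorithm buys you: the biases may still be there, but they are written down where everyone can read them, and no one can quietly swap them out.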
In contrast, the infrastructure currently in place allows for a totalitarian power-grab, whereby the FBI, or whoever, can go to Google’s headquarters, or wherever, with a gag order to request, say, no video recommendations for biohacking, or one video of the führer for every two or three videos watched.
The government has already gotten involved in other aspects, pressuring Google and Facebook to ban crypto ads. In difficult times it is easy to see it “pressuring” for certain algorithmic biases.
That makes open, blockchain-based versions of Google or Facebook not just desirable but a necessity if we are to preserve free speech and freedom of thought, which in many ways are currently under attack by the new corporate media.
Such potentially disruptive blockchain-based projects are already in development, but it may of course take a bit more time for them to be fully refined.
Until then, perhaps there should be independent audits of these algorithms, and/or they should be open sourced. That is something they won’t do willingly, but blockchain projects may force them to through competitive pressure. And if they can’t open their algorithms because their entire business model relies on this secrecy, then maybe it is time for the disrupters to be disrupted.