Humanity is a vast network of small and large societies numbering 7.8 billion people. Almost half of them (3.6 billion) use social media. But when and how exactly did we get here? For tens of thousands of years we lived as foragers and hunters, and change came slowly.
But when we began cultivating the land and living in permanent settlements, about 12,000 years ago, everything changed. Now, everything suggests we stand at a similarly decisive crossroads.
That is the argument of an interdisciplinary paper recently published in the Proceedings of the National Academy of Sciences of the United States. The 17 researchers behind it come from entirely different fields, but they agree on one key point: we need to recognize the dangers of new technologies and social media before it is too late.
“There is a huge problem, and we have to understand its context; it is vital for the future,” Carl Bergstrom, a biologist and co-author of the paper, titled “Stewardship of Global Collective Behavior,” told Recode.
Social media, the center of the problem
For the scientific team, the critical issue is to treat the collective behavior of people not just as a fascinating scientific field but as a vital tool for the survival of the human species. But what does all this mean?
The paper is written in accessible language, with no attempt to confuse readers; quite the opposite. It tries to shed light on something obvious that we have so far done nothing about: new technologies and, above all, social media are galloping ahead while we remain unprepared.
The digital age has brought cataclysmic changes to our societies without our grasping the consequences for how they function. A few years ago, anyone who said that social networks often fuel racist hatred was treated as a crank. Now we know that cyberbullying is commonplace worldwide, and there have been more than a few cases where a hoax acted as the spark for ethnic cleansing. Sometimes a single tweet is enough.
It is worth understanding that, far more often than we realize, we operate as groups and not as individuals. Just as locusts strip vast stretches of green land as if following a plan, we too make mass decisions without often understanding why. The way we emptied supermarket shelves of toilet paper during the first lockdown speaks for itself.
Usually, we do not take seriously the extent to which digital developments affect us. By contrast, critical issues such as overpopulation are quite clear in our minds: it threatens the natural environment, the planet, and the human species. But those belong to the physical world.
But what is happening in the digital world, and how is it affecting us en masse? The scientists behind the new paper believe we do not know what is happening to us. We cannot answer even the most straightforward questions: does an algorithm that suggests new friends on a social network work for or against misinformation? We have no idea, because we lack the appropriate data and the necessary statistics.
Even in the 20th century, walls stood between societies: the Western world and the Eastern Bloc, different languages, overseas distances that were hard to cross. Now we live in more or less the same world (China excepted), translation technologies do the work in seconds, and flights covering New York to London in three hours may become a reality within a few years.
At the same time, what frightens me even more is the speed at which information travels. News that has not been cross-checked, or worse, fake news, can go around the world and trigger chain reactions. It takes only a few minutes for a story to leap from the coast of Miami to a Siberian village.
The way fake news spreads is reminiscent of how Covid-19 spread: quickly, everywhere, without discrimination by age, gender, or social class. Algorithms do their job, but we are not sure they do it well, at least not on social media.
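The epidemic analogy can be made concrete with a toy contagion model. The sketch below is purely illustrative: the function name, parameters, and probabilities are invented for this example, not taken from the paper; it only shows why a story shared person-to-person grows explosively at first and then saturates, much like an epidemic curve.

```python
import random

def simulate_spread(n_people=1000, contacts_per_step=8,
                    share_prob=0.3, steps=20, seed=42):
    """Toy contagion-style simulation of a story spreading online.

    Each step, every person who has seen the story shows it to a few
    random contacts, each of whom passes it on with some probability.
    All parameter values are illustrative assumptions, not measurements.
    """
    rng = random.Random(seed)
    seen = {0}                # "patient zero": one account posts the story
    history = [len(seen)]
    for _ in range(steps):
        newly_seen = set()
        for person in seen:
            for _ in range(contacts_per_step):
                contact = rng.randrange(n_people)
                if contact not in seen and rng.random() < share_prob:
                    newly_seen.add(contact)
        seen |= newly_seen
        history.append(len(seen))
    return history

reach = simulate_spread()
print(reach)  # rapid early growth, then saturation as most people have seen it
```

With an average of 8 contacts and a 30% share probability, each "infected" reader passes the story to roughly 2.4 new people per step, which is why the early growth is exponential.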
Yes, fields like medicine, and problems like road traffic, find ready solutions in algorithms. But information is a different matter. How easy is it for Facebook’s algorithm to create an illusion of reality for each of us? It serves us the news we want, the accounts that match our own, the comments we need.
After all, social media algorithms are not tuned to produce the most objective journalism. What they see are reactions and likes. So, setting aside the extreme voices of hatred and racism (which nowadays do get banned) and nudity, the currency of social networks comes down to one thing: engagement, not truth.
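The point can be shown in a few lines. This is a minimal, hypothetical sketch, not any platform's real ranking code: the posts, the scoring weights, and the `accurate` field are all invented for illustration. The key observation is that a feed ranked purely on engagement signals never consults accuracy at all.

```python
# Hypothetical posts with engagement signals and a ground-truth flag.
posts = [
    {"text": "Carefully verified report", "likes": 40,  "shares": 5,   "accurate": True},
    {"text": "Outrage-bait rumour",       "likes": 900, "shares": 300, "accurate": False},
    {"text": "Dry official statistics",   "likes": 12,  "shares": 1,   "accurate": True},
]

def engagement_score(post):
    # The ranking sees only reactions; the "accurate" field is never read.
    return post["likes"] + 3 * post["shares"]

# The feed surfaces whatever provokes the most reactions.
feed = sorted(posts, key=engagement_score, reverse=True)
for post in feed:
    print(post["text"], engagement_score(post))
```

Run it and the inaccurate rumour tops the feed, simply because outrage generates more reactions than careful reporting.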
The situation that has been created is not at all innocent. Most worrying of all, no straightforward solution appears on the horizon. The paper underlines how crucial it is to put the issue on the table, since offering solutions without even defining the problem is an even greater risk.
In any case, science must find the right way to regulate the context in which social media should operate. This is not a job for algorithm designers alone; it needs everyone from biologists and psychologists to social scientists, computer scientists, and philosophers.
But it must be done quickly, because the margins are narrowing. Even if the ideal solution is found, it must be embraced by political leaders. After all, the resistance of the big-tech giants will not be easy to overcome.