Two years ago, when I was still a Communication student, I attended a conference on media censorship and asked how we can guarantee the veracity of information circulating on the Internet, and how we can punish or regulate acts of defamation and data manipulation.
The speaker replied: ‘Unfortunately, we cannot do that yet. I’m sorry, but I have to say I can’t talk about that.’ And he was right. Fortunately, Facebook, for example, has finally become aware of the amount of false information promoted on its platform and is now going to try to stop it.
The CUNY Graduate School of Journalism in New York, the London School of Economics in the UK, Sciences Po in France, and others will participate in this project as well.
The goal is to find ways to improve public conversation and to understand why information is produced and promoted the way it is, in order to help people analyze the news they read and share online.
This has succeeded before. In Europe, Facebook teamed up with BuzzFeed and Google to fight this problem during the French presidential election season. The method was simple: users could see trustworthiness ratings for media outlets.
In fact, the BBC published an article about how Facebook may have influenced American voters in this way: if I’m interested in Donald Trump, I’ll only get news about him and his political plans, and vice versa for Hillary Clinton. That’s why the power Facebook (and social networks in general) holds over its users’ information feeds is questionable.
We tend to think that ‘if it’s online, it’s true’, when in fact the Internet is one of the easiest places to manipulate information, especially when it’s text, a photo, or a video from a remote place where we can’t be present to confirm or deny what is being said.
We hope this helps you think a little more carefully about how you handle so much data online, and about how much you might be feeding a chain of false information. Remember, we always want to read your comments, especially when we discuss topics that affect us all.