Facebook and Google Are Struggling with Privacy Issues

Let’s talk about the front-line workers at Google and Facebook during the pandemic: the content moderators who keep these sites running day in and day out. Like most stories about content moderators, this is a tale of difficult tradeoffs. And the actions YouTube and Facebook have taken over the past few days will have significant implications for the future of the business.

First, let’s look at the history.

Initially, content moderation on social networks was a business problem: letting in users and content that would collapse the community, such as Nazis and nudity, was bad for growth. Later, it became a regulatory and legal problem. Despite the protections Section 230 affords, companies now have a legal obligation to remove child abuse imagery, terrorist propaganda, and other forms of content. And as services like Facebook and YouTube grew their user bases into the billions, content moderation became a problem of scale.

The solution was to outsource the job to large consulting companies. The wake of the 2016 election revealed a shortage of content moderators at all the big social networks, so tech companies hired tens of thousands of moderators around the world through firms including Genpact, Cognizant, and Accenture. That, however, created a privacy problem. When your moderators work in house, you can apply strict controls to their computers and monitor their access to user data. When they work for third parties, user data is at a much higher risk of leaking to the outside world.

The Difficulty for Facebook

The privacy issues surrounding the hiring of outside moderators have not gotten much attention from journalists, but inside tech companies the fear of data leaks is strong. For Facebook in particular, the post-2016 election backlash arose partly over privacy concerns: when the world learned how Cambridge Analytica intended to use information harvested from people’s Facebook activity, trust in the company plunged precipitously.

That is why outsourced content moderation sites for YouTube and Facebook were designed as secure rooms. Employees can work only on designated production floors that they have to badge in and out of, and they can’t bring in any personal devices, lest they take covert photos or attempt to smuggle out data some other way. This creates havoc for the workers: they are often fired for inadvertently bringing phones onto the production floor, and many complain that the divide separates them from their support networks during the day. Nevertheless, no company wants to relax those restrictions, for fear of the public-relations crisis a high-profile data leak might spark.

Today the pandemic is spreading around the world at frightening speed. We need at least as many moderators policing social networks as before, if not more, because usage is surging. Bring them onto the production floor to work as usual and you will almost certainly contribute to the spread of the coronavirus. On the flip side, let them work from home and you invite a privacy disaster at a time when people are hyper-sensitive about the misuse of their data.