
Facebook makes profit at the expense of our security: Frances Haugen

Frances Haugen, who has spoken out against Facebook, said in written testimony to the US Senate that “at this time, Facebook decides at its own discretion what information to show to billions of people and what to withhold.” This, she argued, is how people’s perception of reality is shaped.

Haugen, a data scientist and former Facebook employee, leaked internal documents to The Wall Street Journal and US regulators and accused the technology giant of knowing that, by prioritizing profits over safety, its products spread hatred and harm children’s mental health.

Even people who do not use Facebook are affected by the extremism it fosters among its users.

“The company that controls our deepest thoughts, feelings and attitudes needs real oversight”, she said.

Haugen testified before the US Senate that the “immoral” company is pursuing harmful policies.

Earlier, The Wall Street Journal had published a critical series of reports accusing Facebook of ignoring its own research on the harmful effects of its Instagram app on teenagers’ mental health.

Facebook then quietly released the internal research, after which the US Senate questioned the multinational tech company at length in a hearing.

Lawmakers and news outlets such as The Washington Post and Bloomberg have called this Facebook’s “Big Tobacco” moment. Since the Senate hearing, a number of US news organizations have been publishing related reports and articles, collectively known as the Facebook Papers.

Haugen worked at Facebook but left when she found that the company, despite having the means, was failing to address key issues such as the spread of false information for profit.

She demanded that the company be regulated. “Facebook makes a profit at the expense of our security,” Haugen said.

Haugen also explained that Facebook presents content to users in a way that keeps them engaged, so that posts attract more comments, likes and shares. “Facebook makes more money when you consume more content,” she added.
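
To illustrate what engagement-based ranking means in practice, here is a minimal sketch that scores posts by a weighted sum of likes, comments and shares and orders a feed accordingly. The weights, field names and sample posts are hypothetical choices for this example, not Facebook’s actual ranking formula.

```python
# Minimal sketch of engagement-based feed ranking.
# The weights and post data are hypothetical, not Facebook's real formula.
from dataclasses import dataclass

@dataclass
class Post:
    post_id: str
    likes: int
    comments: int
    shares: int

# Assumed weights: interactions that take more effort count for more.
WEIGHTS = {"likes": 1.0, "comments": 3.0, "shares": 5.0}

def engagement_score(post: Post) -> float:
    """Weighted sum of interaction counts used to order the feed."""
    return (WEIGHTS["likes"] * post.likes
            + WEIGHTS["comments"] * post.comments
            + WEIGHTS["shares"] * post.shares)

def rank_feed(posts: list[Post]) -> list[Post]:
    """Show the most engaging posts first."""
    return sorted(posts, key=engagement_score, reverse=True)

if __name__ == "__main__":
    feed = [
        Post("calm_news", likes=120, comments=4, shares=2),
        Post("outrage_bait", likes=80, comments=60, shares=40),
        Post("cat_photo", likes=200, comments=10, shares=5),
    ]
    for p in rank_feed(feed):
        print(p.post_id, engagement_score(p))
```

Under these assumed weights, a post that provokes many comments and shares outranks one that merely collects likes, which is the dynamic Haugen describes.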

Social networking sites do not hesitate to use emotions to their advantage in order to increase engagement.

In 2012, Facebook conducted a controversial human research study in which it changed its algorithm to alter the News Feeds of 689,003 users and assess how emotionally charged posts affected their mood. “The more provocative the content shown to users, the more time they spend on it,” Haugen said.
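
As a rough sketch of how such an experiment could be structured, the snippet below assigns users to conditions whose feeds filter out positive or negative posts and then measures a simple outcome. The sentiment labels, conditions and metric are invented for illustration and do not reproduce Facebook’s 2012 study.

```python
# Hypothetical sketch of an emotional-contagion style feed experiment.
# Sentiment labels, conditions and the outcome metric are invented.
import random

POSTS = [
    {"text": "Had a wonderful day!", "sentiment": "positive"},
    {"text": "Everything is going wrong.", "sentiment": "negative"},
    {"text": "Just finished lunch.", "sentiment": "neutral"},
    {"text": "So proud of my team!", "sentiment": "positive"},
    {"text": "This traffic is infuriating.", "sentiment": "negative"},
]

def assign_condition(user_id: int) -> str:
    """Round-robin assignment standing in for random assignment."""
    return ["control", "reduce_positive", "reduce_negative"][user_id % 3]

def build_feed(condition: str, size: int = 10) -> list:
    """Filter one sentiment out of the pool for the treatment groups."""
    if condition == "control":
        pool = POSTS
    else:
        drop = "positive" if condition == "reduce_positive" else "negative"
        pool = [p for p in POSTS if p["sentiment"] != drop]
    return [random.choice(pool) for _ in range(size)]

if __name__ == "__main__":
    random.seed(0)
    for user_id in range(3):
        condition = assign_condition(user_id)
        feed = build_feed(condition)
        negative_share = sum(p["sentiment"] == "negative" for p in feed) / len(feed)
        print(user_id, condition, f"share of negative posts: {negative_share:.1f}")
```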

According to her, the company changed its content policies ahead of the 2020 US election and tried to curb the spread of false information by giving political content lower priority in the News Feed.

However, soon after the election, the old algorithm, which prioritizes user engagement, was restored, and the riots at the US Capitol followed.

Haugen says, “Because Facebook wanted the same growth after the election, the company returned to its old methods. This is a very serious issue for me.”

Muhammad Umair, a research and development engineer based in Sweden, told EOS that most social media apps, including the Facebook News Feed, use basic algorithmic recommendation systems.

Websites typically use cookies to track their users’ activities and past behavior.

Machine learning (algorithms that improve with data and experience, regarded as a branch of artificial intelligence) is then used to gauge each user’s preferences and interests.

These systems were originally designed so that users would not have to sift through an overwhelming number of options online, but they now also serve to increase user engagement on websites and to maximize sales of various products.
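
The snippet below sketches the kind of basic recommendation logic Umair describes: tracked interactions are turned into a simple interest profile, and candidate items are scored against it. The categories, interaction log and scoring rule are assumptions made for this example, not any platform’s real system.

```python
# Minimal sketch of preference-based recommendation from tracked behavior.
# Categories, the interaction log and the scoring rule are hypothetical.
from collections import Counter

# Hypothetical log of items a user has clicked, with content categories.
interaction_log = [
    {"item": "phone_case", "categories": ["electronics", "accessories"]},
    {"item": "usb_cable",  "categories": ["electronics"]},
    {"item": "novel",      "categories": ["books"]},
]

# Candidate items the system could recommend next.
catalogue = {
    "headphones":  ["electronics", "audio"],
    "cookbook":    ["books", "cooking"],
    "power_bank":  ["electronics", "accessories"],
    "garden_hose": ["home", "garden"],
}

def build_profile(log: list) -> Counter:
    """Count how often each category appears in the user's history."""
    profile = Counter()
    for entry in log:
        profile.update(entry["categories"])
    return profile

def score(item_categories: list, profile: Counter) -> int:
    """Score an item by how much it overlaps with the user's interests."""
    return sum(profile[c] for c in item_categories)

if __name__ == "__main__":
    profile = build_profile(interaction_log)
    ranked = sorted(catalogue.items(),
                    key=lambda kv: score(kv[1], profile), reverse=True)
    for item, cats in ranked:
        print(item, score(cats, profile))
```

A production system would learn these preferences with trained models rather than simple counts, but the principle of profiling past behavior and scoring candidates against it is the same.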

Muhammad Umair says, “For example, when you search for a specific item on an online shopping portal such as Amazon, you start to see substitutes and related products. When you watch a movie on Netflix, you are recommended other movies of a similar nature.”

In this way, the apps keep users engaged.
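
A minimal way to picture “people who bought this also bought that” suggestions is co-occurrence counting over purchase histories, as sketched below; the baskets are invented, and real services such as Amazon and Netflix use far more elaborate models.

```python
# Sketch of item-to-item recommendation via co-occurrence counting.
# The purchase histories are invented; real systems are far more complex.
from collections import defaultdict
from itertools import combinations

# Hypothetical baskets: items bought together by different users.
baskets = [
    {"camera", "sd_card", "tripod"},
    {"camera", "sd_card"},
    {"laptop", "mouse", "sd_card"},
    {"camera", "tripod"},
]

# Count how often each pair of items appears in the same basket.
co_occurrence = defaultdict(int)
for basket in baskets:
    for a, b in combinations(sorted(basket), 2):
        co_occurrence[(a, b)] += 1

def related_items(item: str, top_n: int = 3) -> list:
    """Return the items most frequently bought alongside `item`."""
    counts = defaultdict(int)
    for (a, b), n in co_occurrence.items():
        if a == item:
            counts[b] += n
        elif b == item:
            counts[a] += n
    return sorted(counts.items(), key=lambda kv: kv[1], reverse=True)[:top_n]

if __name__ == "__main__":
    print(related_items("camera"))  # e.g. [('sd_card', 2), ('tripod', 2)]
```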

Noman Khalid, founder, partner and chief data architect at Love For Data, a consulting firm that applies artificial intelligence and statistical learning to data science and decision making, says: “Social media as a whole is biased. When you use search engines, the results list gradually adjusts to show you what you want to see.”
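
Khalid’s point about search results adapting to the user can be pictured as a simple re-ranking step: a base relevance score is boosted by how often the user has previously clicked results from the same source. All sources, scores and click counts below are hypothetical.

```python
# Hypothetical sketch of personalized re-ranking of search results.
# Base scores, sources and click history are invented for illustration.

# Candidate results with a base relevance score from the search index.
results = [
    {"title": "Neutral encyclopedia entry", "source": "encyclopedia.org", "base": 0.90},
    {"title": "Opinion piece you agree with", "source": "favorite-blog.com", "base": 0.70},
    {"title": "Opposing viewpoint", "source": "other-blog.com", "base": 0.75},
]

# How often this user has clicked results from each source in the past.
click_history = {"favorite-blog.com": 12, "other-blog.com": 1}

def personalized_score(result: dict, history: dict, boost: float = 0.02) -> float:
    """Add a small boost per past click on the result's source."""
    return result["base"] + boost * history.get(result["source"], 0)

if __name__ == "__main__":
    reranked = sorted(results,
                      key=lambda r: personalized_score(r, click_history),
                      reverse=True)
    for r in reranked:
        print(f'{personalized_score(r, click_history):.2f}  {r["title"]}')
```

With enough past clicks on a favored source, it overtakes more neutral results, which is the feedback loop Khalid describes.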
