Facebook has often been under fire for deleting contentious content from the platform.
This time, the company has been accused of deleting posts from Rohingya activists, both inside and outside Myanmar (also known as Burma), who report on human rights violations and the persecution of the Rohingya people in the country.
Mohammad Anwar, a Rohingya activist who writes on the site RohingyaBlogger.com, told Mashable that Facebook has pulled several posts about the latest flare-up of violence in Myanmar's northern Rakhine state.
"Facebook has logged me out several times saying 'session expired'," he said.
Screengrabs that Anwar shared with Mashable indeed show that Facebook removed posts because they allegedly didn't follow the Facebook Community Standards.
One post from 28 August showed Myanmar military choppers flying over Rohingya villages in the Maungdaw province. Facebook removed it and sent a message saying: "We removed the post below because it doesn't follow the Facebook Community Standards." (He uses the name Anwar S. Mohammed on Facebook.)
Similar posts detailing Rohingya hamlets being burned down by the military have also been taken down.
At one point, Facebook temporarily disabled Anwar's account. "This temporary block will last for 7 days, and you won't be able to post on Facebook until it's finished. Please bear in mind that people who repeatedly post things that aren't allowed on Facebook may have their accounts permanently disabled," it said in a message.
"Currently, Facebook has disabled all the features in my Facebook account from liking, commenting, sharing, posting to messaging," said Anwar, whose Facebook account has been active since 2010.
"I can login only to see what others post or comment. So, I have deactivated my account in frustration."
The activist, who is now on Twitter, believes Facebook began actively deleting his posts after Zaw Htay, the director general of Myanmar's President's Office, announced that the company was collaborating with the state against terrorists and terrorist sympathisers.
Mark Farmaner, director at Burma Campaign UK, suspects that Facebook is getting a deluge of coordinated complaints from Myanmar's anti-Rohingya groups.
"It's not Facebook actively taking posts and accounts down," he said."Racists in Burma have coordinated people making complaints on Facebook about people and posts knowing it triggers Facebook automatic systems that remove posts or suspend accounts."
Farmaner said the same thing happened to him a few years back. "My account was down for five months. Facebook are pretty hopeless at responding when you use their system to try to get your posts or accounts reinstated after they have been targeted by racists like this."
The Daily Beast, which first reported on Facebook silencing Rohingya reports of ethnic cleansing, quoted a Rohingya man living in Burma who said Facebook deleted several of his accounts, posts about the violence, and even a poem he published on the platform about Rohingya refugees.
The Rohingya are a stateless Muslim minority group which for years has faced systematic discrimination and persecution in Myanmar's Rakhine state at the hands of the country's powerful military and vigilante mobs.
In the past three weeks, more than 370,000 Rohingya have been forced to flee across the border to Bangladesh as Myanmar's security forces are "burning down entire Rohingya villages and shooting people at random as they try to flee," according to Amnesty International.
Zeid Ra’ad Al Hussein, the United Nations' top human rights official, said the situation in the country "seems a textbook example of ethnic cleansing".
A Facebook spokesperson told Mashable:
“We allow people to use Facebook to challenge ideas and raise awareness about important issues, but we will remove content that violates our Community Standards.
These include hate speech, fake accounts, and dangerous organizations. Anyone can report content to us if they think it violates our standards and it doesn’t matter how many times a piece of content is reported, it will be treated the same.
Sometimes we will allow content if newsworthy, significant or important to the public interest - even if it might otherwise violate our standards.
In response to the situation in Myanmar, we are only removing graphic content when it is shared to celebrate the violence, versus raising awareness and condemning the action.
We are carefully reviewing content against our Community Standards and, when alerted to errors, quickly resolving them and working to prevent them from happening again.”