Another Israeli Exploit Developer Caught Selling Malware To Blacklisted Countries
from the quite-the-cottage-industry-you-got-there dept
Maybe it’s time for the Israeli government to put a moratorium on Mossad-based startups. Israeli intelligence services have been the petri dishes for a particular strain of techbro — ones who have the smarts to create zero-click exploits but none of the common sense needed to cull baddies from their customer lists.
The Israeli government is partly to blame. It worked closely with NSO Group (and presumably others in the same business) to broker deals with human rights abusers: diplomacy via malware sales.
Months of negative press got NSO blacklisted by the US government. It also got the company investigated in its homeland, finally resulting in the Israeli government (reluctantly) limiting who NSO could sell to.
NSO isn’t the only malware merchant with Israeli roots. Candiru — another recipient of US sanctions — calls Israel home. So does Cytrox, yet another exploit developer with ties to Israeli intelligence services. Cytrox was at the center of a recent domestic spying scandal in Greece, with its malware being used to target opposition leaders and journalists. This culminated in Greek police forces raiding Cytrox’s local office, presumably as part of the ongoing investigation.
Now there’s another Israeli spyware maker making the wrong kind of headlines, as Fanny Potkin and Poppy McPherson report for Reuters.
Israel’s Cognyte Software Ltd won a tender to sell intercept spyware to a Myanmar state-backed telecommunications firm a month before the Asian nation’s February 2021 military coup, according to documents reviewed by Reuters.
No matter who’s running the Myanmar government, they shouldn’t be trusted with powerful spyware. For most of the past 60 years, the country has been run by some form of military dictatorship. The 2021 coup simply reshuffled a bit of the military dictatorship organizational chart. Throughout this period, residents (especially Muslim residents) have been on the receiving end of intense oppression. For Myanmar’s Muslims, oppression means death: ethnic cleansing.
Given that any malware sold to the Myanmar government was likely to be abused to target critics and political opponents, Cognyte never should have agreed to sell the government its products. That’s a decision the company should have made on its own, simply because it’s the responsible thing to do.
But there’s another reason Cognyte shouldn’t have done it: it had to violate the law to complete the sale.
The deal was made even though Israel has claimed it stopped defence technology transfers to Myanmar following a 2017 ruling by Israel’s Supreme Court, according to a legal complaint recently filed with Israel’s attorney general and disclosed on Sunday.
According to the documents seen by Reuters, the sale was finalized at the end of 2020, apparently with the assistance of state-backed telecom Myanmar Posts and Telecommunications (MPT). Given the sale’s proximity to the beginning of the coup, it seems the spyware was deliberately acquired for use by the military government, which chose to contest the election it lost in November 2020 by overthrowing the democratically elected government three months later.
The fact that this sale occurred after the point at which the government swears it stopped permitting sales to Myanmar presents two possibilities. Neither is good.
Either the government never stopped handing out export licenses to tech companies hoping to sell to Myanmar’s government, or Cognyte ignored the restriction and made the sale without the required export license. Given that the documents show Cognyte as the winning bidder, the company didn’t even bother to try to launder its illegal export through a middleman. Or maybe it was both: a “don’t ask, don’t tell” policy for malware sales to human rights abusers.
Whatever the case, it’s another black eye for the Israeli government — one that has done little to prevent local companies from selling powerful tech to bad people. It’s also an indictment of its intelligence services, which seem capable of attracting extremely skilled people who somehow decide that the logical extension of the lessons they’ve learned securing their nation is abandoning any remaining morality or ethics once they hit the private sector.
Filed Under: israel, malware, myanmar, spyware
Companies: cognyte
[UPDATED]: Myanmar's Military Junta Sentences American Journalist To Eleven Years In Prison
from the and-it's-coming-back-to-take-away-the-rest-of-his-life dept
[UPDATE]: Well, that was quick. Fenster has been released, which hopefully indicates Myanmar’s unelected government is discovering it’s a bad idea to pick fights with most of the rest of the world. However, I’m sure it will continue to brutalize its own citizens, since those advocating for their rights on a local level won’t have the leverage of the US State Department. Here’s the statement from US Secretary of State Antony Blinken celebrating Fenster’s unexpected release:
We welcome the release of American journalist Daniel Fenster from prison in Burma, where he was wrongfully detained for almost six months. I commend Ambassador Tom Vajda and his team at U.S. Embassy Rangoon, Special Presidential Envoy for Hostage Affairs Roger Carstens, the expertise of Consular Affairs and the dedicated partners, including Governor Bill Richardson, who helped facilitate Danny’s release.
We are glad that Danny will soon be reunited with his family as we continue to call for the release of others who remain unjustly imprisoned in Burma.
[Original article continues below:]
An American journalist is just one of many victims of a coup that overthrew Myanmar’s actual elected government and replaced it with the country’s military, which claimed the election its favored party lost had, in effect, been stolen. No election irregularities were discovered, but that didn’t matter much to the military, which had the might (but not the right) to seize power.
Along with the new government came new rules. Many of those targeted opponents and critics of the unelected government. Plenty of those targeted were journalists. Newspapers that had been at least tolerated under the previous regime were now deemed illegal operations.
One of those caught in the new government’s net was American-born journalist Danny Fenster. Fenster wrote for a news outlet the coup perpetrators declared illegal shortly after they took power. Thumbing its nose at sanctions imposed on it by dozens of countries, the government hauled Fenster into its kangaroo court, where the prosecution decided the actual facts were too inconvenient to be given any credence.
Much of the prosecution’s case appeared to hinge on his being employed by Myanmar Now, another online news site, that was ordered closed this year. But Fenster left his job at Myanmar Now in July last year, joining Frontier Myanmar the following month.
Prosecution witnesses testified that they were informed by a letter from the Information Ministry that its records showed that Fenster continued to be employed this year by Myanmar Now.
Both Myanmar Now and Frontier Myanmar issued public statements that Fenster had left the former publication last year, and his lawyer said defense testimony, as well as income tax receipts, established that he works for Frontier Myanmar.
The prosecution was going to get what it wanted. It threw everything it could at him and got all of it to stick.
The court found him guilty on Friday of spreading false or inflammatory information, contacting illegal organizations and violating visa regulations, lawyer Than Zaw Aung said.
Those charges alone will lock this journalist up for eleven years in a hard labor prison. But the newly non-elected government of Myanmar isn’t through making its point to critics, journalists, dissidents, and members of opposition parties. The government wants to put Danny Fenster away for life.
Fenster, the managing editor of the online magazine Frontier Myanmar, is still facing additional terrorism and treason charges under which he could receive up to life in prison.
The US State Department has condemned this prosecution by Myanmar’s military government. We’ll see how much that condemnation will actually matter in the coming weeks.
The Burmese military regime’s sentencing of U.S. journalist Danny Fenster is an unjust conviction of an innocent person. The United States condemns this decision. We are closely monitoring Danny’s situation and will continue to work for his immediate release. We will do so until Danny returns home safely to his family.
That’s where Fenster was headed when he was arrested — at the airport, hoping to fly home and see his family. Now he’s being used as a test case for the newly installed government’s power. It’s staring down most of the rest of the world at this point and waiting to see who will blink first.
It’s still dangerous everywhere for journalists, especially those who put their own lives and liberty at risk to report from areas where governments directly control press outlets and subject those outlets they don’t control to massive amounts of oppression.
Filed Under: danny fenster, free speech, journalism, myanmar
Bizarre Magistrate Judge Ruling Says That If Facebook Deletes An Account, It No Longer Needs To Keep Details Private
from the that-doesn't-make-any-sense dept
There have been a bunch of slightly wacky court rulings of late, and this recent one from magistrate judge Zia Faruqui is definitely up there on the list of rulings that make you scratch your head. The case involves the Republic of Gambia seeking information on Facebook accounts that were accused of contributing to the ethnic genocide of the Rohingya in Myanmar. This situation was — quite obviously — horrible, and it tends to be the go-to story for anyone who wants to show that Facebook is evil (though I’m often confused about why people seem more focused on blaming Facebook for the situation than the Myanmar government, which actually carried out the genocide…). Either way, the Republic of Gambia is seeking information from Facebook regarding the accounts that played a role in the genocide, as part of its case at the International Court of Justice.
Facebook, which (way too late in the process) did shut down a bunch of accounts in Myanmar, resisted demands from Gambia to hand over information on those accounts, noting, correctly, that the Stored Communications Act likely forbids it from handing over such private information. The SCA is actually pretty important in protecting the privacy of email and messages, and is one of the rare US laws on the books that is actually (for the most part) privacy protecting. That’s not to say it doesn’t have its own issues, but the SCA has been useful in protecting privacy in the past.
The ruling here more or less upends interpretations of the SCA by saying that once an account is deleted, it’s no longer covered by the SCA. That’s… worrisome. The full ruling is worth a read; you’ll know you’re in for something of a journey when it starts out:
I come to praise Facebook, not to bury it.
Not quite what you expect from a judicial order. The order lays out the unfortunately gory details of the genocide in Myanmar, as well as Facebook’s role in enabling the Myanmar government to push out propaganda and rally support for its ethnic cleansing. But the real question is how all of this impacts the SCA. As the judge notes, the SCA was written in 1986, so it certainly didn’t predict today’s modern social media or the questions related to content moderation, making this a new issue for the court to decide. But… still. The court decides that because an account is disabled… that means the communications are no longer “stored.” Because [reasons].
The Problem Of Content Moderation
At the time of enactment, Congress viewed ECS and RCS providers as mail/package delivery services. See Cong. Rsch. Serv., R46662, Social Media: Misinformation and Content Moderation Issues for Congress (2021), https://crsreports.congress.gov/product/pdf/R/R46662. This view failed to consider content moderation; mail/package delivery services have neither the ability nor the responsibility to search the contents of every package. Yet after disinformation on social media has fed a series of catastrophic harms, major providers have responded by taking on the de facto responsibility of content moderation. See id. “The question of how social media platforms can respect the freedom of expression rights of users while also protecting [users] from harm is one of the most pressing challenges of our time.” …
This Court is the first to consider the question of what happens after a provider acts on its content moderation responsibility. Is content deleted from the platform but retained by the provider in “backup storage”? It is not.
That obviously seems like a stretch to me. If the company still retains the information, then it is clearly in storage. Otherwise, you’ve just created a massive loophole: any platform can expose someone’s private communications simply by disabling their account first.
The court’s reasoning, though, gets at the heart of the language of the SCA, which protects both “any temporary, intermediate storage of a wire or electronic communication incidental to the electronic transmission thereof” and “any storage of such communication by an electronic communication service for purposes of backup protection of such communication.” The court says the first prong can’t apply because these communications had reached their “final destination” and were no longer temporary. And the second can’t apply because the original content had been deleted, so there was nothing left for a “backup” to back up.
Congress’s conception of “‘backup’ necessarily presupposes the existence of another copy to which this [backup record] would serve as a substitute or support.” Id. Without an original, there is nothing to back up. Indeed “the lifespan of a backup is necessarily tied to that of the underlying message. Where the underlying message has expired . . . , any copy is no longer performing any backup function. An [ECS] that kept permanent copies of [deleted] messages could not fairly be described as ‘backing up’ those messages.”
But… I think that’s just wrong. Facebook retaining this data (while blocking the users from accessing it themselves) is clearly a “backup.” It’s a backup in case there is a reason why, at some future date, the content needs to be restored. Under the judge’s own interpretation, if you back up your hard drive and the drive then crashes, your backup is no longer a backup, because there’s no original. But… that’s completely nonsensical.
The judge relies (not surprisingly) on a case in which the DOJ twisted and stretched the limits of the SCA to get access to private communications:
Nearly all “backup storage” litigation relates to delivered, undeleted content. That case law informs and supports the Court’s decision here. “Although there is no binding circuit precedent, it appears that a clear majority of courts have held that emails opened by the intended recipient (but kept on a web-based server like Gmail) do not meet the [backup protection] definition of ‘electronic storage.’” Sartori v. Schrodt, 424 F. Supp. 3d 1121, 1132 (N.D. Fla. 2019) (collecting cases). The Department of Justice adopted this view, finding that backup protection “does not include post-transmission storage of communications.” U.S. Dep’t of Just., Searching and Seizing Computers and Obtaining Electronic Evidence in Criminal Investigations, 123 (2009), https://www.justice.gov/sites/default/files/criminal-ccips/legacy/2015/01/14/ssmanual2009.pdf. The Gambia argues for following the majority view’s limited definition of backup storage. See Sartori, 424 F. Supp. 3d at 1132; ECF No. 16 (Pet’r’s Resp. to Surreply) at 5-6. If undeleted content retained by the user is not in backup storage, it would defy logic for deleted content to which the user has no access to be in backup storage.
As for Facebook’s argument (which makes sense to me) that the entire reason for retaining the accounts shows that they’re backups, the judge just doesn’t buy it.
Facebook argues that because the provider-deleted content remains on Facebook servers in proximity to where active content on the platform is stored, both sets of content should be protected as backup storage. See Conf. Tr. at 76. However, the question is not where the records are stored but why they are stored. See Theofel, 359 F.3d at 1070. Facebook claims it kept the instant records as part of an autopsy of its role in the Rohingya genocide. See Conf. Tr. at 80-81. While admirable, that is storage for self-reflection, not for backup.
The judge also brushes aside the idea that there are serious privacy concerns with this result, mainly because the judge doesn’t believe Facebook cares about privacy. That, alone, is kind of a weird way to rule on this issue.
Finally, Facebook advances a policy argument, opining that this Court’s holding will “have sweeping privacy implications — every time a service provider deactivates a user’s account for any reason, the contents of the user’s communications would become available for disclosure to anyone, including the U.S. government.” … Facebook taking up the mantle of privacy rights is rich with irony. News sites have entire sections dedicated to Facebook’s sordid history of privacy scandals.
So… because Facebook doesn’t have a great history of protecting privacy… we can make it easier for Facebook to expose private communications? What? Even if it’s true that Facebook has made problematic privacy decisions in the past, that’s wholly separate from the question of whether it has a legal obligation to protect the privacy of messages now.
Furthermore, the judge insists that even if there are privacy concerns, they are “minimal”:
The privacy implications here are minimal given the narrow category of requested content. Content urging the murder of the Rohingya still permeates social media. See Stecklow, supra (documenting “more than 1,000 examples . . . of posts, comments, images and videos attacking the Rohingya or other Myanmar Muslims that were on Facebook” even after Facebook apologized for its services being “used to amplify hate or exacerbate harm against the Rohingya”). Such content, however vile, is protected by the SCA while it remains on the platform. The parade of horribles is limited to a single float: the loss of privacy protections for de-platformed content. And even that could be mitigated by users joining sites that do not de-platform content.
Yes. In this case. But this could set a precedent for accessing a ton of other private communications as well, and that’s what’s worrying. It’s absolutely bizarre and distressing that the judge doesn’t bother to think through the implications of this ruling beyond just this one case.
Prof. Orin Kerr, one of the foremost experts on ECPA and the SCA, notes that this is both an “astonishing interpretation” and “stunning.”
Also, it's a stunning interpretation in its consequences. Under the op, the most fundamental rule of Internet privacy — that your e-mails and messages are protected from disclosure — is largely meaningless. A provider can just delete your account and hand out your messages.
— Orin Kerr (@OrinKerr) September 24, 2021
The entire ruling is concerning — and feels like yet another situation where someone’s general disdain for Facebook and its policies (a totally reasonable position to take!) colored the analysis of the law. And the end result is a lot more dangerous for everyone.
Filed Under: backup, deleted profiles, ecpa, gambia, myanmar, privacy, sca, stored communications act, zia faruqui
Companies: facebook
Facebook Oversight Board's First Decisions… Seem To Confirm Everyone's Opinions Of The Board
from the take-a-deep-breath dept
Last week, the Oversight Board — the official name the body formerly known as the Facebook Oversight Board wants you to use — announced decisions on the first five cases it has heard. It overturned four Facebook content moderation decisions and upheld one. Following the announcement, Facebook announced that (as promised) it followed all of the Oversight Board’s decisions and reinstated the content in the overturned cases (in one case, involving the takedown of a breast cancer ad deemed to violate the “no nudity” policy, Facebook had actually reinstated the content last year, after the Board announced it was reviewing that decision). If you don’t want to wade into the details, NPR’s write-up of the decisions and policy recommendations is quite well done and easily digestible.
If you want a more detailed and thoughtful analysis of the decisions and what this all means, I highly recommend Evelyn Douek’s detailed analysis of the key takeaways from the rulings.
What I’m going to discuss, however, is how the decisions seem to have only reinforced… absolutely everyone’s opinions of the Oversight Board. I’ve said before that I think the Oversight Board is a worthwhile experiment, and one worth watching, but it is just one experiment. And, as such, it is bound to make mistakes and adapt over time. I can understand the reasoning behind each of the five decisions, though I’m not sure I would have ruled the same way.
What’s more interesting to me, though, is how so many people are completely locked in to their original view of the board, and how insistent they are that the first decisions only confirm their position. It’s no secret that many people absolutely hate Facebook and view absolutely everything the company does as unquestionably evil. I’m certainly not a fan of many of the company’s practices, and don’t think that the Oversight Board is as important as some make it out to be, but that doesn’t mean it’s not worth paying attention to.
But I saw a few recurring responses to the first rulings, which struck me as amusing, since the positions are simply not disprovable:
1. The Oversight Board is just here to rubberstamp Facebook’s decisions and make it look like there’s some level of review.
This narrative is slightly contradicted by the fact that the Oversight Board overturned four decisions. However, people who believe this view retort that “well, of course the initial decisions have to do this to pretend to be independent.” Which… I guess? But seems like a lot of effort for no real purpose. To me, at least, the first five decisions are not enough to make a judgment call on this point either way. Let’s see what happens over a longer time frame.
2. The Oversight Board is just a way for Facebook and Zuckerberg not to take real responsibility
I don’t see how this one is supportable. It’s kind of a no-win situation either way. Every other company in the world that does content moderation has the final say on those decisions, because it’s their website. Facebook is basically the first and only site so far to hand off those decisions to a third party — and it did so after a ton of people whined that Facebook had too much power. And the fact that this body is now pushing back on Facebook’s decisions suggests that there’s at least some initial evidence that the Board might force Zuckerberg to take more responsibility. Indeed, the policy recommendations (not just the decisions directly on content moderation) suggest that the Board is taking somewhat seriously its role as an independent watchdog over how Facebook operates. But, again, it’s perhaps too early to tell, and this will be a point worth watching.
3. The Oversight Board has no real power, so it doesn’t matter what they do.
The thing is, while this may be technically true, I’m not sure it matters. If Facebook actually does follow through and abide by the Board’s rulings, and the Board continues down the path it has set of being fairly critical of Facebook’s practices, then for all intents and purposes the Board does have real power. Sometimes the power comes just from the fact that Facebook may feel generally committed to following through, rather than from any kind of actual enforcement mechanism.
4. The Oversight Board is only reviewing a tiny number of cases, so who cares?
This is clearly true, but again, the question is how it will matter in the long run. At least from the initial set of decisions, it’s clear that the Oversight Board is not just taking a look at the specific cases in front of it, but thinking through the larger principles at stake, and making recommendations back to Facebook about how to implement better policies. That could have a very big impact on how Facebook operates over time.
As for my take on all of this? As mentioned up top, I think this is a worthwhile experiment, though I’ve long doubted it would have that big of an impact on Facebook itself. I see no reason to change my opinion on that yet, but I am surprised at the thoroughness of these initial decisions and how far they go in pushing back on certain Facebook policies. I guess I’d update my opinion to say I’ve moved from thinking the Oversight Board had a 20% chance of having a meaningful impact, to now it being maybe 25 to 30% likely. Some will cynically argue that this is all for show, and the first cases had to be like that. And perhaps that’s true. I guess that’s why no one is forced to set their opinion in stone just yet, and we’ll have plenty of time to adjust as more decisions come out.
Filed Under: appeals, breast cancer, content moderation, free speech, myanmar, nudity, review
Companies: facebook, oversight board
Digital Technology As Accelerant: Growth And Genocide In Myanmar
from the broader,-collaborative-view dept
Every person in Myanmar above the age of 10 has lived part, if not most, of their life under a military dictatorship characterized by an obsession with achieving autonomy from international influences. Before the economic and political reforms of the past decade, Myanmar was one of the most isolated nations in the world. The digital revolution that has reshaped nearly every aspect of human life over the past half-century was something the average Myanmar person had no personal experience with.
Recent reforms brought an explosion of high hopes and technological access, and Myanmar underwent a digital leapfrog, with internet access jumping from nearly zero percent in 2015 to over 40 percent in 2020. At 27 years old, I can remember living in a Yangon where having a refrigerator was considered high tech; now there are 10-year-olds making videos on TikTok.
Everyone was excited for Myanmar’s digital revolution to spur the economic and social changes needed to transform the country from a pariah state into the next economic frontier. Tourists, development aid, and economic investment poured into the country. The cost of SIM cards dropped from around 1,000 US dollars in 2013 to a little over 1 dollar today.
This dramatic price drop was paired with a glut of relatively affordable smartphones and phone carriers offering data packages that made social media platforms like Facebook free, or nearly free, to use. This led to the current situation, where about 21 million of the 22 million people using the internet are on Facebook. Facebook became the main conduit through which people accessed the internet, and it is now used for nearly every online activity, from selling livestock and watching porn to reading the news and discussing politics.
Then, following the exodus of over 700,000 Rohingya people from Myanmar’s war-torn Rakhine State, Facebook was accused of enabling a genocide.
The ongoing civil wars in the country and the state violence against the Rohingya, characterized by the UN as ethnic cleansing with genocidal intent, put a spotlight on the potential for harm brought on by digital connectivity. Given its market dominance, Facebook has faced great scrutiny in Myanmar for the role social media has played in normalizing, promoting, and facilitating violence against minority groups.
Facebook was, and continues to be, the favored tool for disseminating hate speech and misinformation against the Rohingya people, Muslims in general, and other marginalized communities. Despite repeated warnings from civil society organizations in the country, Facebook failed to address the new challenges with the urgency and level of resources needed during the Rohingya crisis, and failed to even enforce its own community standards in many cases.
To be sure, there have been improvements in recent years, with the social media giant appointing a Myanmar focused team, expanding their number of Myanmar language content reviewers, adding minority language content reviewers, establishing more regular contact with civil society, and devoting resources and tools focused on limiting disinformation during Myanmar’s upcoming election. The company also removed the accounts of Myanmar military officials and dozens of pages on Facebook and Instagram linked to the military for engaging in “coordinated inauthentic behavior.” The company defines “inauthentic behavior” as “engag[ing] in behaviors designed to enable other violations under our Community Standards,” through tactics such as the use of fake accounts and bots.
Recognizing the seriousness of this issue, everyone from the EU to telecommunications companies to civil society organizations has poured resources into digital literacy programs, anti-hate-speech campaigns, social media monitoring, and advocacy to try to address it. Overall, the focus of much of this programming is on what Myanmar and the people of Myanmar lack—rule of law, laws protecting free speech, digital literacy, knowledge of what constitutes hate speech, and resources to fund and execute the programming that is needed.
In the frenzy of the desperate firefighting by organizations on the ground, less attention has been given to larger systemic issues that are contributing to the fire.
There is a need to pay greater attention to those coordinated groups that are working to spread conspiracy theories, false information, and hatred to understand who they are, who is funding them, and how their work can be disrupted—and, if necessary, penalized.
There is a need to reevaluate how social media platforms are designed in a way that incentivizes and rewards bad behavior.
There is also a need to question how much blame we want to assign to social media companies, and whether it is to the overall good to give them the responsibility, and therefore power, to determine what is and isn’t acceptable speech.
Finally, there is a need to ask ourselves about alternatives we can build, when many governments have proven themselves more than willing to surveil and prosecute netizens under the guise of health, security, and penalizing hate speech.
It is dangerous to give private, profit-driven multinational corporations the power to draw the line between hate speech and free speech, just as it is dangerous to give that same power to governments, especially in this time of rising ethno-nationalist sentiment around the globe and the increasing willingness of governments to overtly and covertly gather as much data as possible to use against those they govern. The ongoing legal proceedings against Myanmar in international courts regarding the Rohingya and other ethnic minorities, along with statements from UN investigative bodies on Myanmar that Facebook has failed to release evidence of serious international crimes to them, show that neither company policies nor national laws are enough to ensure safety, justice, and dignity for vulnerable populations.
The solution to all this, as unsexy as it sounds, is a multifaceted, multi-stakeholder, long-term effort to build strong legal and cultural institutions that disperse the power and the responsibility to create and maintain safe and inclusive online spaces among governments, individuals, the private sector, and civil society.
Aye Min Thant is the Tech for Peace Manager at Phandeeyar, an innovation lab which promotes safer and more inclusive digital spaces in Myanmar. Formerly, she was a Pulitzer Prize-winning journalist who covered business, politics, and ethno-religious conflicts in Myanmar for Reuters. You can follow her on Twitter @ma_ayeminthant.
This article was developed as part of a series of papers by the Wikimedia/Yale Law School Initiative on Intermediaries and Information to capture perspectives on the global impacts of online platforms’ content moderation decisions. You can read all of the articles in the series here, or on their Twitter feed @YaleISP_WIII.
Filed Under: content moderation, myanmar
Companies: facebook
UN Says Facebook Is Complicit In The Spread Of Anti-Muslim Hate In Myanmar
from the sort-of-right,-but-approaching-the-problem-the-wrong-way dept
The UN has decided it’s possibly Facebook’s fault things are going so badly in Myanmar. Muslims have been fleeing the country in droves thanks to Myanmar security forces engaging in widespread acts of violence (including rape) against them, urged on by hardline nationalist monks.
For all intents and purposes, Facebook is Myanmar’s internet. Loosening of restrictions on social media access has resulted in a large portion of the population getting all their news (along with all the hate speech the UN is complaining about) via the social media giant. The UN is looking into genocide accusations but has decided to speak up against Facebook first.
Marzuki Darusman, chairman of the UN Independent International Fact-Finding Mission on Myanmar, told reporters that social media had played a “determining role” in Myanmar.
“It has … substantively contributed to the level of acrimony and dissension and conflict, if you will, within the public. Hate speech is certainly of course a part of that. As far as the Myanmar situation is concerned, social media is Facebook, and Facebook is social media,” he said.
The UN Myanmar investigator Yanghee Lee said Facebook was a huge part of public, civil and private life, and the government used it to disseminate information to the public.
When there’s only one main pipeline of info, everything flows through it, whether it’s the official government narrative or government-supported hate speech targeting Myanmar Muslims. The UN feels Facebook has contributed to the violence by not doing enough to remove hate speech.
If these are the UN’s conclusions, it’s severely late to the party. Last fall, The Daily Beast reported Facebook was instrumental in removing reports of anti-Muslim violence the Myanmar government didn’t approve of.
Rohingya activists—in Burma and in Western countries—tell The Daily Beast that Facebook has been removing their posts documenting the ethnic cleansing of Rohingya people in Burma (also known as Myanmar). They said their accounts are frequently suspended or taken down.
The Rohingya people are a Muslim ethnic minority group in Burma. They face extraordinary persecution and violence from the Burmese military; military personnel torch villages, murder refugees, and force hundreds of thousands of people to flee their homes.
Facebook promised to do better after being confronted with this evidence. But it offered no good reason why activists’ posts detailing government atrocities were frequently removed and the accounts posting them locked or suspended. The company did not specifically say whether it was responding to government requests for content removal, but its transparency report shows almost no activity related to Myanmar’s government. If this was solely the result of horrendous judgment calls by Facebook moderators, the end result of those calls has been the loss of human lives.
Even as Muslims are being silenced, the Myanmar government has used Facebook to push its own narrative to a largely captive audience. Shuna Mia, a Rohingya man who spoke to reporters about government-backed rape and murder, was found floating headless in a nearby river the following day. According to this Guardian report, the Myanmar government immediately began rewriting history using Facebook as its soapbox.
The day after Shuna was found dead, someone representing the state counsellor of Myanmar (Aung San Suu Kyi’s official title) posted a photo of a headless body on the office’s Facebook page, stamped with the words “Truth teller BEHEADED”. The post claimed Shuna had told the media that security forces had not committed rape or arson, and suggested he was killed by “Muslim insurgents” in retaliation. That directly contradicted local reports, activists and Shuna’s family, who believe he was abducted and beheaded by security forces for speaking to journalists.
This was just part of the government’s efforts to discredit Rohingya people. On the same day, the same Facebook account posted photos of Rohingya women who said they had been raped by security forces. The label “FAKE RAPE” dismissed the countless reports of sexual violence.
Facebook has always had issues with moderation. Its policies may seem internally consistent, but the way they play out in the messiness of everyday life leaves a lot to be desired. The content it removed may have somehow violated policies or local laws, but posts are apparently viewed in a vacuum, removed from political and social context. This isn’t necessarily an oversight by Facebook. It’s merely reflective of the reality the company deals with: more than a billion users scattered across the globe, operating under a patchwork of speech laws that cannot be applied across all posts from all people.
But the end result of this impossible task is Facebook’s inadvertent participation in the spread of anti-Muslim hate that is linked to suspected ethnic cleansing. Unfortunately, the UN’s public criticism of Facebook isn’t going to help. If Facebook sees more regulation or international pressure headed its way, it’s likely to double down on moderation, resulting in even more suppression of anti-government sentiment. Things will get worse for Myanmar’s Muslims, thanks to Facebook’s inadvertent stranglehold on news distribution.
This isn’t to say no one should be speaking up about Facebook’s contribution to an international problem. It’s just that they shouldn’t expect things to get better simply because they’re loudly complaining about it. The problem is the Myanmar government, and that’s where the UN should focus its efforts. Facebook’s contribution is a symptom of the underlying problem, not a root cause.
Filed Under: hate speech, intermediary liability, myanmar, un, violence
Companies: facebook
Myanmar Internet Shut Down, But We Can Still Watch From Space
from the it's-a-small-world dept
As previously reported, the pro-democracy rallies in Myanmar have been closely covered by regular reports coming out of the embattled nation via cellphone, email and even YouTube. The government’s attempts to pollute the web with its own propaganda must not have worked, since on Friday morning, the government shut off Internet access, cut phone lines and confiscated mobile phones in an attempt to control the outflow of information about the rallies. Though this may have slowed reports, it’s very difficult for the government to completely clamp down, so some news is still getting out through mobile phones and a few satellite uplinks to the Internet. Even if the junta is able to completely shut things down, events can still be monitored from satellites, which are providing evidence of potential human rights abuses conducted by the government. Now that their next actions are playing out under a vigilant global eye, hopefully Myanmar’s officials will make the right choices in the coming days.
Filed Under: myanmar
Myanmar Protests Reported by Citizen Journalists, And Possibly Government Journalists As Well
from the information-is-power dept
As Myanmar struggles towards democracy after 40 years under a military junta, the Internet is playing a crucial role in the fight. News of Monday’s protest was reported within a few hours of it starting, due in large part to thousands of citizen journalists who sent their stories, photos and videos to global news sites. This is in stark contrast to the days it took for news to break about the 8888 uprising in 1988, in which 3,000 civilians were killed. Now, armed with cameraphones and email, citizens post coverage of the events in Myanmar immediately to blogs and news sites, forcing the junta to play out this week’s events under the scrutiny of global eyes. Well, perhaps the government has started to take notice — false reports are being sent out as well, presumably by Burmese authorities looking to undermine those reporting the news or to spread government propaganda. However, regardless of how the medium is used, the most important thing is that the Internet has made it easier for information to be free, which presumably will make it more difficult for totalitarian regimes to hang on to the reins of control.
Filed Under: citizen journalism, myanmar