Content Moderation Case Studies: Facebook Suspends Account For Showing Topless Aboriginal Women (2016)
from the double-standards? dept
Summary: Facebook’s challenges in dealing with content moderation around “nudity” have been covered many times, but part of the reason the discussion comes up so often is that there are so many scenarios to consider that it is difficult to write policies that cover them all.
In March of 2016, activist Celeste Liddle delivered the Queen Victoria Women’s Centre’s annual International Women’s Day keynote address. The speech covered many aspects of the challenges facing Aboriginal women in Australia, and mentioned in passing that Liddle’s Facebook account had been repeatedly suspended for posting images of topless Aboriginal women shown in a trailer for a TV show.
“I don’t know if people remember, but last year the Indigenous comedy show 8MMM was released on ABC. I was very much looking forward to this show, particularly since it was based in Alice and therefore I knew quite a few people involved.
“Yet there was controversy because, when 8MMM released a promotional trailer for the show prior to it going to air, this trailer was banned by Facebook because it featured topless desert women painted up for ceremony engaging in traditional dance.
“Facebook saw these topless women as “indecent” and in violation of their no nudity clause. On hearing this, I was outraged that Arrernte woman undertaking ceremony could ever be seen in this way so I posted the trailer up on my own page stating as such.
“What I didn’t count on was a group of narrow-minded little white men deciding to troll my page so each time I posted it, I not only got reported by them but I also got locked out and the video got removed.” — Celeste Liddle
The publication New Matilda published a transcript of the entire speech, which Liddle then linked to herself, leading her account to be suspended for 24 hours and New Matilda’s post to be removed — highlighting the very point Liddle was making. As she told New Matilda in a follow-up article about the removal and the suspension:
“My ban is because I’ve previously published images of nudity… I’m apparently a ‘repeat nudity poster offender’…
“I feel decidedly smug this morning, because everything I spoke about in my speech on this particular topic just seems to have been proven completely true…
“It’s actually a highly amusing outcome.” — Celeste Liddle
Facebook’s notice to New Matilda claimed that the post was restricted for containing “nudity” and said that the policy has an exception for content posted for “educational, humorous or satirical purposes,” but did not give New Matilda a way to argue that the usage in the article was “educational.”
Many publications, starting with New Matilda, highlighted the contrast: the same day Liddle gave her speech (International Women’s Day), Esquire released a cover story about Kim Kardashian featuring an image of her naked but partially painted. Both images, then, involved topless women with their skin partially painted. However, those posting images of the Aboriginal women faced bans from Facebook, while the Kardashian image not only remained up, but went viral.
Company Considerations:
- How can policies regarding nudity be written to take into account cultural and regional differences?
- Is there a way to adequately determine if nudity falls into one of the qualified exemptions, such as “educational” use?
- What would be an effective and scalable way to run an appeals process that would allow users like Liddle to inform Facebook of the nature of the content that resulted in her temporary suspension?
Issue Considerations:
- Questions about moderating “nudity” have been challenging for many websites. Are there reasonable and scalable policies that can be put in place that adequately take context into account?
- Many websites start out with a “no nudity” policy to avoid having to deal with adult material on their websites. What other factors should any website consider regarding why a more nuanced policy may make more sense?
Resolution: After this story got some attention, Liddle launched a Change.org petition asking Facebook to recognize that Aboriginal women “practicing culture are not offensive.”
“Facebook’s standards are a joke. They are blatantly racist, sexist and offensive. They show a complete lack of respect for the oldest continuing culture in the world. They also show that Facebook continually fails to address their own shortfalls in knowledge. Finally, they show that Facebook is more than willing to allow scurrilous bullying to continue rather than educate themselves.” — Celeste Liddle
New Matilda requested comment from Facebook regarding the removal of the link to its story and was told that even if the sharing was for an “awareness campaign,” Facebook still believed it should be removed because some audiences in Facebook’s “global community” would be “sensitive” to such content. The company also noted that in order to allow its content moderators to apply rules “uniformly,” they sometimes need to be “more blunt than we would like.”
“We are aware that people sometimes share content containing nudity for reasons like awareness campaigns, artistic projects or cultural investigations. The reason we restrict the display of nudity is because some audiences within our global community may be sensitive to this type of content – particularly because of cultural background or age. In order to treat people fairly and respond to reports quickly, it is essential that we have policies in place that our global teams can apply uniformly and easily when reviewing content. As a result, our policies can sometimes be more blunt than we would like, and restrict content shared for legitimate purposes. We encourage people to share Celeste Liddle’s speech on Facebook by simply removing the image before posting it.”
Originally posted to the Trust & Safety Foundation website.
Filed Under: australia, celeste liddle, content moderation, double standards, nudity
Companies: facebook
As Prudes Drive Social Media Takedowns, Museums Embrace… OnlyFans?
from the didn't-see-that-one-coming dept
Over the last few years, we’ve seen more and more focus on using content moderation efforts to stamp out anything even remotely upsetting to certain loud interest groups. In particular, we’ve seen NCOSE, formerly “Morality in Media,” spending the past few years whipping up a frenzy about “pornography” online. They were one of the key campaigners for FOSTA, which they flat-out admitted was step one in their plan to ban all pornography online. Recently, we’ve discussed how Mastercard had put in place ridiculous new rules that were making life difficult for tons of websites. Some of those websites noted that Mastercard told them it was taking direction from… NCOSE. Perhaps not surprisingly, just recently, NCOSE gave Mastercard its “Corporate Leadership Award” and praised the company for cracking down on pornography (which NCOSE considers the same as sex trafficking or child sexual abuse).
Of course, all of this has some real world impact. We’ve talked about how eBay, pressured to remove such content because of FOSTA and its payment processors, has been erasing LGBTQ history (something, it seems, NCOSE is happy about). And, of course, just recently, OnlyFans came close to prohibiting all sexually explicit material following threats from its financial partners — only to eventually work out a deal to make sure it could continue hosting adult content.
But all of this online prudishness has other consequences. Scott Nover, over at Quartz, has an amazing story about how museums in Vienna are finding that images of classic paintings are being removed from all over the internet. They’ve come up with a somewhat creative (and surprising) solution, though: the museums are setting up OnlyFans accounts, since the company is one of the few remaining platforms able to host nude images without running afoul of content moderation rules. Incredibly, the effort is being run by Vienna’s Tourist Board.
The Vienna Tourist Board said its museums have faced a litany of online challenges. After the Natural History Museum Vienna posted images of the Venus of Willendorf, a 25,000-year-old Paleolithic limestone figurine, Facebook deleted the images and called them pornographic. The Albertina Museum had its TikTok account suspended in July for showing nudes from the Japanese artist and photographer Nobuyoshi Araki, CNN reported. And the Leopold Museum, which houses modern Austrian art, has struggled to advertise on social media because of the bans on nudity.
Even advertising the new OnlyFans account on other social media proved difficult, the board said. Twitter rejected links to the board’s website because it linked out to the OnlyFans account. (Twitter allows nudity on its platform as long as the account and images are labeled as such.) Facebook and Instagram only allowed ads featuring the Venus of Willendorf and a nude painting by Amedeo Modigliani after the tourist board explained the context to the platforms, but other images by artists Egon Schiele and Peter Paul Rubens were rejected.
This is all kind of ridiculous, but it certainly belongs in the Masnick’s Impossibility Theorem collection on the impossibility of content moderation at scale. It also recalls the case in France where Facebook took down a classic 1866 oil painting by Gustave Courbet, and a court initially ruled that Facebook could not take down the image. Facebook has (for many years now) had exceptions to its nudity rule for “art,” but figuring out how to enforce that kind of thing is notoriously difficult.
And when you have prudish, moralizing busybodies like NCOSE pressuring companies to wipe out any and all nudity, it’s no surprise that this kind of thing is the result. But, really, all of this seems likely to backfire in the end. Cordoning off even artistic nudity into sites like OnlyFans… also means that more and more people may be introduced to OnlyFans “for the paintings,” only to discover what else is available there.
Filed Under: content moderation, museums, nudity, paintings, pornography, prudes, social media, vienna, vienna tourist board
Companies: onlyfans
Facebook Oversight Board's First Decisions… Seem To Confirm Everyone's Opinions Of The Board
from the take-a-deep-breath dept
Last week, the Oversight Board — the official name by which the body formerly known as the Facebook Oversight Board wants to be called — announced decisions on the first five cases it has heard. It overturned four Facebook content moderation decisions and upheld one. Following the announcement, Facebook announced that (as it had promised) it followed all of the Oversight Board’s decisions and reinstated the content in the overturned cases (in one case, involving the takedown of a breast cancer ad that had been deemed to violate the “no nudity” policy, Facebook actually reinstated the content last year, after the Board announced it was reviewing that decision). If you don’t want to wade into the details, NPR’s write-up of the decisions and policy recommendations is quite well done and easily digestible.
If you want a more detailed and thoughtful analysis of the decisions and what this all means, I highly recommend Evelyn Douek’s detailed analysis of the key takeaways from the rulings.
What I’m going to discuss, however, is how the decisions seem to have only reinforced… absolutely everyone’s opinions of the Oversight Board. I’ve said before that I think the Oversight Board is a worthwhile experiment, and one worth watching, but it is just one experiment. And, as such, it is bound to make mistakes and adapt over time. I can understand the reasoning behind each of the five decisions, though I’m not sure I would have ruled the same way.
What’s more interesting to me, though, is how so many people are completely locked in to their original view of the board, and how insistent they are that the first decisions only confirm their position. It’s no secret that many people absolutely hate Facebook and view absolutely everything the company does as unquestionably evil. I’m certainly not a fan of many of the company’s practices, and don’t think that the Oversight Board is as important as some make it out to be, but that doesn’t mean it’s not worth paying attention to.
But I’ve seen a few different responses to the first rulings, which struck me as amusing, since those positions are simply not disprovable:
1. The Oversight Board is just here to rubberstamp Facebook’s decisions and make it look like there’s some level of review.
This narrative is slightly contradicted by the fact that the Oversight Board overturned four decisions. However, people who believe this view retort that “well, of course the initial decisions have to do this to pretend to be independent.” Which… I guess? But seems like a lot of effort for no real purpose. To me, at least, the first five decisions are not enough to make a judgment call on this point either way. Let’s see what happens over a longer time frame.
2. The Oversight Board is just a way for Facebook and Zuckerberg not to take real responsibility.
I don’t see how this one is supportable. It’s kind of a no-win situation either way. Every other company in the world that does content moderation has the final say on its decisions, because it’s their own website. Facebook is basically the first and only site so far to hand off those decisions to a third party — and it did so after a ton of people whined that Facebook had too much power. And the fact that this body is now pushing back on Facebook’s decisions suggests that there’s at least some initial evidence that the Board might force Zuckerberg to take more responsibility. Indeed, the policy recommendations (not just the decisions directly on content moderation) suggest that the Board is taking its role as an independent watchdog over how Facebook operates somewhat seriously. But, again, it’s perhaps too early to tell, and this will be a point worth watching.
3. The Oversight Board has no real power, so it doesn’t matter what they do.
The thing is, while this may be technically true, I’m not sure it matters. If Facebook actually does follow through and agree to abide by the Board’s rulings, and the Board continues the initial path it’s set of being fairly critical of Facebook’s practices, then for all intents and purposes it does have real power. Sometimes, the power comes just from the fact that Facebook may feel generally committed to following through, rather than through any kind of actual enforcement mechanism.
4. The Oversight Board is only reviewing a tiny number of cases, so who cares?
This is clearly true, but again, the question is how it will matter in the long run. At least from the initial set of decisions, it’s clear that the Oversight Board is not just taking a look at the specific cases in front of it, but thinking through the larger principles at stake, and making recommendations back to Facebook about how to implement better policies. That could have a very big impact on how Facebook operates over time.
As for my take on all of this? As mentioned up top, I think this is a worthwhile experiment, though I’ve long doubted it would have that big of an impact on Facebook itself. I see no reason to change my opinion on that yet, but I am surprised at the thoroughness of these initial decisions and how far they go in pushing back on certain Facebook policies. I guess I’d update my opinion to say I’ve moved from thinking the Oversight Board had a 20% chance of having a meaningful impact, to now it being maybe 25 to 30% likely. Some will cynically argue that this is all for show, and the first cases had to be like that. And perhaps that’s true. I guess that’s why no one is forced to set their opinion in stone just yet, and we’ll have plenty of time to adjust as more decisions come out.
Filed Under: appeals, breast cancer, content moderation, free speech, myanmar, nudity, review
Companies: facebook, oversight board
Content Moderation Case Study: Facebook's AI Continues To Struggle With Identifying Nudity (2020)
from the ai-is-not-the-answer dept
Summary: Since its inception, Facebook has attempted to be more “family-friendly” than other social media services. Its hardline stance on nudity, however, has often proved problematic, as its AI (and its human moderators) have flagged accounts for harmless images and/or failed to consider context when removing images or locking accounts.
The latest example of Facebook’s AI failing to properly moderate nudity involves garden vegetables. A seed business in Newfoundland, Canada, was notified that its image of onions had been removed for violating the terms of service. Its picture of onions apparently set off the auto-moderation system, which flagged the image for containing “products with overtly sexual positioning.” A follow-up message noted the picture of a handful of onions in a wicker basket was “sexually suggestive.”
Facebook’s nudity policy has been inconsistent since its inception. Male breasts are treated differently than female breasts, resulting in some questionable decisions by the platform. Its policy has also caused problems for definitively non-sexual content, like photos and other content posted by breastfeeding groups and breast cancer awareness videos. In this case, the round shape and flesh tones of the onions appear to have tricked the AI into thinking garden vegetables were overtly sexual content, showing the AI still has a lot to learn about human anatomy and sexual positioning.
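The failure mode here is easy to picture. Automated moderation typically reduces an image to a single confidence score and compares it against fixed thresholds; anything scoring above the removal cutoff may never reach a human. Here is a minimal sketch of that kind of pipeline — the scores, threshold values, and function names are entirely hypothetical stand-ins, not Facebook's actual system:

```python
# A minimal sketch of threshold-based image moderation. The scores and
# thresholds here are hypothetical, not Facebook's real values.

REMOVE_THRESHOLD = 0.90  # above this, content is removed automatically
REVIEW_THRESHOLD = 0.60  # between the two, a human moderator decides

def nudity_score(description: str) -> float:
    """Stand-in for a real image classifier. The fake scores mimic how
    round shapes and flesh tones can inflate a nudity prediction."""
    fake_scores = {
        "onions in a wicker basket": 0.93,  # flesh tones + round shapes
        "landscape photo": 0.05,
        "classical nude statue": 0.88,      # context the model can't see
    }
    return fake_scores.get(description, 0.0)

def moderate(description: str) -> str:
    score = nudity_score(description)
    if score >= REMOVE_THRESHOLD:
        return "auto-removed (no human review)"
    if score >= REVIEW_THRESHOLD:
        return "queued for human review"
    return "allowed"

for item in ("onions in a wicker basket",
             "classical nude statue",
             "landscape photo"):
    print(f"{item}: {moderate(item)}")
```

In a setup like this, the onion photo clears the automatic-removal bar and nobody looks at it until the owner complains; lowering that cutoff so humans see more borderline images is exactly the over-blocking-versus-labor-cost trade-off raised in the questions that follow.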
Decisions to be made by Facebook:
- Should more automated nudity/sexual content decisions be backstopped by human moderators?
- Is the possibility of over-blocking worth the reduction in labor costs?
- Is over-blocking preferable to under-blocking when it comes to moderating content?
- Is Facebook large enough to comfortably absorb any damage to its reputation or user goodwill when its moderation decisions affect content that doesn’t actually violate its policies?
- Is it even possible for a platform of Facebook’s size to accurately moderate content and/or provide better options for challenging content removals?
Questions and policy implications to consider:
- Is the handling of nudity in accordance with the United States’ historically more Puritanical views really the best way to moderate content submitted by users all over the world?
- Would it be more useful to users if content were hidden — but not deleted — when it appears to violate Facebook’s terms of service, allowing posters and readers to access the content if they choose to after being notified of its potential violation?
- Would a more transparent appeals process allow for quicker reversals of incorrect moderation decisions?
Resolution: The seed company’s ad was reinstated shortly after Facebook moderators were informed of the mistake. A statement from Facebook raised at least one more question, as its spokesperson did not clarify exactly what the AI thought the onions actually were, leaving users to speculate about what the spokesperson meant, as well as how the AI would react to future posts it mistook for, “well, you know.”
“We use automated technology to keep nudity off our apps,” wrote Meg Sinclair, Facebook Canada’s head of communications. “But sometimes it doesn’t know a walla walla onion from a, well, you know. We restored the ad and are sorry for the business’ trouble.”
Originally posted at the Trust & Safety Foundation website.
Filed Under: ai, content moderation, nudity
Companies: facebook
Content Moderation Case Study: Facebook Attracts International Attention When It Removes A Historic Vietnam War Photo Posted By The Editor-in-Chief Of Norway's Biggest Newspaper (2016)
from the the-terror-of-content-moderation dept
Summary: Tom Egeland, a Norwegian author of a number of best-selling fiction books, posted the famous photo “The Terror of War” to Facebook. The historic photograph (taken by Vietnamese-American photographer Nick Ut) depicts a naked Vietnamese girl running from a napalm attack during the Vietnam War.
Ut’s iconic photo brought the horrors of the war in Vietnam to viewers around the world. But it was not without controversy. Given the full-frontal nudity of the child depicted in the image, the Associated Press pushed back against Ut, citing its policy against publishing nudity. In this case, the nudity of the child resulted in more resistance than usual. Ultimately, the AP decided to run the photo, and Ut won a Pulitzer Prize for it in 1973.
Despite the photo’s historical significance, Facebook decided to suspend Tom Egeland’s account. It also deleted his post.
Facebook’s decision was based on its terms of service. While the photo was undeniably a historical artifact, moderation efforts by the platform were not attuned to the history.
A notice sent to Egeland pointed out that any displayed genitalia would result in moderation. Also, given the platform’s obligation to report Child Sexual Abuse Material (CSAM) to the government, leaving up a photo of a naked prepubescent child posed problems the algorithms couldn’t necessarily handle on their own.
The decision to remove the post and suspend the author’s account prompted an open letter from Norwegian journalist Espen Hansen. The letter — addressed to Facebook founder and CEO Mark Zuckerberg — asked what negative effects moderation efforts like these would have on a “democratic society.”
Decisions to be made by Facebook:
- Should automatic moderation that aids law enforcement be overridden when context shows posts are not attempting to sidestep rules put in place to prevent Facebook users from being subjected to abusive content?
- What value is placed on context-considerate moderation? Does it add or subtract from financial obligations to shareholders?
- Does it serve users better to be more responsive — and helpful — when context is a primary consideration?
Questions and policy implications to consider:
- Is the collateral damage of negative press like this offset by Facebook’s willingness to be proactive when removing questionable content?
- Is it more important to serve private users than the numerous governments making moderation demands?
- Do inexact or seemingly-incoherent responses to controversial content raise the risk of government intervention?
Resolution: Despite the letter from a prominent Norwegian journalist, Facebook refused to reinstate the photo. Instead, it offered boilerplate stating its objection to “nude genitalia.” While it said it did make “allowances” for “educational, humorous, and satirical purposes,” Ut’s photo apparently did not make the cut. Facebook asked Aftenposten, Egeland, and/or Hansen to “pixelate” the iconic photo before reposting it. Aftenposten’s Hansen responded with his own modified version of the photo.
Unfortunately, Facebook did not see the pointed humor of Hansen’s modification. Facebook’s deletion of the original — as well as its suspension of author Tom Egeland’s account — remained in force. While public shaming has had some effect on moderation efforts by social media companies, Facebook’s stance on nudity — especially the nudity of minors — prevented it from backing down in the face of negative publicity.
Filed Under: content moderation, csam, historic photos, nudity, photo, tom egeland, vietnam
Companies: facebook
Content Moderation Case Study: Facebook Nudity Filter Blocks Historical Content And News Reports About The Error (June 2020)
from the content-moderation-is-hard dept
Summary: Though social media networks take a wide variety of evolving approaches to their content policies, most have long maintained relatively broad bans on nudity and sexual content, and have heavily employed automated takedown systems to enforce these bans. Many controversies have arisen from this, leading some networks to adopt exceptions in recent years: Facebook now allows images of breastfeeding, child-birth, post-mastectomy scars, and post-gender-reassignment surgery photos, while Facebook-owned Instagram is still developing its exception for nudity in artistic works. However, even with exceptions in place, the heavy reliance on imperfect automated filters can obstruct political and social conversations, and block the sharing of relevant news reports.
One such instance occurred on June 11, 2020, following controversial comments by Australian Prime Minister Scott Morrison, who stated in a radio interview that “there was no slavery in Australia”. This sparked widespread condemnation and rebuttals from both the public and the press, pointing to the long history of enslavement of Australian Aboriginals and Pacific Islanders in the country. One Australian Facebook user posted a late 19th century photo from the state library of Western Australia, depicting Aboriginal men chained together by their necks, along with a statement:
Kidnapped, ripped from the arms of their loved ones and forced into back-breaking labour: The brutal reality of life as a Kanaka worker – but Scott Morrison claims “there was no slavery in Australia”
Facebook removed the post and image for violation of their policy against nudity, although no genitals are visible, and restricted the user’s account. The Guardian Australia contacted Facebook to determine if this decision was made in error and, the following day, Facebook restored the post and apologized to the user, explaining that it was an erroneous takedown caused by a false positive in the automated nudity filter. However, at the same time, Facebook continued to block posts that included The Guardian’s news story about the incident, which featured the same photo, and placed 30-day suspensions on some users who attempted to share it. Facebook’s community standards report shows that in the first three months of 2020, 39.5 million pieces of content were removed for nudity or sexual activity, over 99% of those takedowns were automated, 2.5 million appeals were filed, and 613,000 of the takedowns were reversed.
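For a sense of scale, it’s worth doing the arithmetic on those reported figures. This is a rough back-of-the-envelope sketch using the rounded numbers quoted above, not Facebook’s own analysis:

```python
# Back-of-the-envelope rates from the Q1 2020 figures quoted above.
takedowns = 39_500_000  # removals for nudity or sexual activity
appeals   = 2_500_000   # appeals filed
reversals = 613_000     # takedowns ultimately reversed

print(f"share of takedowns appealed: {appeals / takedowns:.1%}")    # ~6.3%
print(f"share of takedowns reversed: {reversals / takedowns:.1%}")  # ~1.6%
print(f"appeal success rate:         {reversals / appeals:.1%}")    # ~24.5%
```

Roughly one in four appealed takedowns was reversed, which suggests the error rate among contested automated decisions is far from negligible.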
Decisions to be made by Facebook:
- Can nudity filters be improved to result in fewer false-positives, and/or is more human review required?
- For appeals of automated takedowns, what is an adequate review and response time?
- Should automated nudity filters be applied to the sharing of content from major journalistic sources such as The Guardian?
- Should questions about content takedowns from major news organizations be prioritized over those from regular users?
- Should 30-day suspensions and similar account restrictions be manually reviewed only if the user files an appeal?
Questions and policy implications to consider:
- Should automated filter systems be able to trigger account suspensions and restrictions without human review?
- Should content that has been restored in one instance be exempted from takedown, or flagged for automatic review, when it is shared again in future in different contexts?
- How quickly can erroneous takedowns be reviewed and reversed, and is this sufficient when dealing with current, rapidly-developing political conversations?
- Should nudity policies include exemptions for historical material, even when such material does include visible genitals, such as occurred in a related 2016 controversy over a Vietnam War photo?
- Should these policies take into account the source of the content?
- Should these policies take into account the associated messaging?
Resolution: Facebook’s restoration of the original post was undermined by its simultaneous blocking of The Guardian’s news reporting on the issue. After receiving dozens of reports from its readers that they were blocked from sharing the article and in some cases suspended for trying, The Guardian reached out to Facebook again and, by Monday, June 15, 2020, users were able to share the article without restriction. The difference in response times between the original incident and the blocking of posts is possibly attributable to the fact that the latter came to a head over a weekend, but it meant that critical reporting on an unfolding political issue was blocked for several days while the subject was being widely discussed online.
Photo Credit (for first photo): State Library of Western Australia
Filed Under: case study, consistency, content moderation, historical content, nudity, reporting
Companies: facebook
Facebook Censors Art Historian's Photo Of Neptune's Statue-Penis
from the pics-or-gtfo dept
It’s probably time for Facebook to give up trying to be the morality police, because it isn’t working. While nobody expects the social media giant to be perfect at policing its site for images and posts deemed “offensive”, it’s shown itself time and time again to be utterly incapable of getting this right at even the most basic level. After all, when the censors are removing iconic historical photos, tirades against prejudice, forms of pure parody, and images of a nude bronze statue in the name of some kind of corporate puritanism, it should be clear that something is amiss.
Yet the armies of the absurd march on, it seems. Facebook managed to kick off the new year by demanding that an Italian art historian remove an image of a penis from her Facebook page. Not just any penis, mind you. It was a picture of a godly penis. Specifically, this godly penis.
That, should you not be an Italian art historian yourself, is a picture of a statue of the god Neptune. In the statue, which adorns the public streets of Bologna, Neptune is depicted with his heavenly member hanging out, because gods have no time for clothes, of course. Yet this carved piece of art somehow triggered a Facebook notice to the photographer, Elisa Barbari.
According to the Telegraph, Barbari got the following notification from Facebook: “The use of the image was not approved because it violates Facebook’s guidelines on advertising. It presents an image with content that is explicitly sexual and which shows to an excessive degree the body, concentrating unnecessarily on body parts. The use of images or video of nude bodies or plunging necklines is not allowed, even if the use is for artistic or educational reasons.”
Even were I to be on board with a Facebook policy banning nudity and, sigh, “plunging necklines” even in the interest of education or art — which I most certainly am not on board with — the claim that the image is explicitly sexual and focused on “body parts” is laughably insane. There’s nothing sexual about the depiction of Neptune at all, unless we are to believe that all nudity is sexual, which simply isn’t true. Also, the depiction focuses not on one body part, but on the entire statue. Nothing about this makes sense.
And that’s likely because Facebook is relying on some kind of algorithm to automatically generate these notices. Confusingly, the site’s own community standards page makes an exception for art, despite the notice Barbari received claiming otherwise.
Strangely, an exception is made for art. “We also allow photographs of paintings, sculptures, and other art that depicts nude figures.”
Except when it doesn’t, that is. Look, again, nobody is expecting Facebook to be perfect at this. But the site has a responsibility, if it is going to play censor at all, to at least be good enough at it not to censor statues of art in the name of prohibiting too much skin.
Filed Under: art, automation, censorship, neptune, nudity, social media
Companies: facebook
Apple Threatens To Kick Out Comic Book App Over 'Adult' Content, Forcing Publisher To Pull 40% Of Its 4,000 Titles
from the option-iPrude-accessory-gouges-eyes-out,-removes-personal-responsibility dept
Apple’s worked very hard cultivating its walled garden and it isn’t going to let a bunch of creators ruin its pristine utopia with nudity, depictions of sweatshops, nudity (again), swearing, topical commentary, competitive apps and the ancient art of intricate lovemaking.
Once again, Apple has decided to arbitrarily boot more content out of its garden, expressing its concern that things might be getting a little too sexy for its apparent target audience of schoolchildren who have never browsed the internet.
Reports are coming in that the digital comics distributor Izneo has had to radically prune their catalog or face banishment from iTunes.
Izneo has been selling digital comics on the iPad since they released an iPad app in mid-2010, and they successfully built a catalog of over 4000 French and Belgian titles.
Everything was going fine until late Friday night when one of Apple’s censors noticed that Izneo sold adult comics. And since Apple clearly cannot allow their precious iPad to be sullied by salacious content, the censor gave Izneo 30 hours to remove all adult comics.
Like other Apple takedown requests, this one arrived with no warning and no clear indication as to what content Apple felt was inappropriate and should be removed.
IDBoox broke the story earlier today, and they report that Izneo had absolutely no warning that there was a problem or guidance as to which titles needed to be removed. All they were reportedly told by Apple was that the adult content had to go, so Izneo drastically pruned any comic that showed a breast, cleavage, and even ones with characters evoking a suggestive gesture.
In order to comply with this incredibly vague request, Izneo immediately pulled 2,800 of its 4,000 titles. After a more in-depth review of its content, Izneo restored about half of what it had dumped, bringing it back up to 2,500. That’s still 1,500 titles pulled because Apple said, “Jump,” and couldn’t even be bothered to specify how high.
Now, Izneo is stuck in a bit of a bind. It can abide by Apple’s ethereal “guidelines” and hope that it doesn’t need to remove even more titles. Or, it can start looking at a few options to get around the walled garden while still remaining somewhat ensconced. Nate Hoffelder suggests it switch to an HTML5 reading app, or better yet, simply stop selling titles from within the app. This would allow Izneo to avoid Apple’s app censoring while also bypassing the “opportunity” to toss 30% of every in-app purchase Apple’s way.
As long as Apple is going to continue to behave like a stern parent in need of mood stabilizers, app developers and content creators are going to find themselves on the receiving end of vague missives like these. Apple is, of course, welcome to run its business however it sees fit, but every story like this serves as a warning to developers: if you want to play in Apple’s garden, you’ll have to abide by the nebulous, arbitrary rules. Apple has stated that if game developers want to handle serious issues (like the Syrian War), they should write a book instead. What is it going to tell comic book creators whose artwork veers into adult areas? Fire up the keyboard and turn those pictures into 1,000 words?
Filed Under: apps, arbitrary decisions, comic books, ios, nudity, walled garden
Companies: apple, izneo
No Nudity: Playboy's iPhone App To Test Men's 'For The Articles' Excuse
from the seriously?--no-boobies? dept
Looking back, I think I saw my first Playboy magazine when I was roughly ten years old or so. That would put us somewhere in the early ’90s. My friends and I stopped on our way to school and huddled around each other, all trying to get a glimpse of the in-depth article on Operation Desert Storm and its far-reaching implications for the Middle East, American foreign policy, and the rest of the world. No…wait…now I remember. We wanted to see the naked girls, because these were the days before wide internet adoption would put roughly all the porn at everyone’s fingertips, and President Bush's name still made us giggle (it kind of still does, actually). That said, amongst older generations, you would occasionally hear the laughable excuse from men that they wanted their Playboy magazines so they could read the articles, I suppose because Time Magazine, The New Yorker and Newsweek didn’t exist (psst! They did!).
Well, now it appears we’ll get something of a test of that excuse: Playboy is releasing a mobile app for Apple’s app store, which of course meant nixing all the nipples and vaginas to get it past the tech company’s Quaker-like regulators.
This winter, the company, long barred from Apple’s digital storefronts because of its pornographic associations, will package a nudity-free version of its content together for the launch of its first iPhone app, featuring lifestyle tips, articles from the magazine and, of course, photos of beautiful women.
Those beautiful women will be clad in lingerie, per Apple’s strict no-boobies policy. Now, here’s why this probably won’t work. Nobody is going to download this app to see women in lingerie. There are a couple of reasons for this. First, we’ve long been able to get that elsewhere. Victoria’s Secret has an iOS app, after all. Also, there’s that handy browser option for viewing all the images one could want on the internet. As for the articles, we have a couple of problems. Jumping into the news content business this late in the game and having success would require really compelling articles. The good news is that Playboy still has those. The bad news is that all those people who claim their allegiance to Playboy for the articles are full of crap. As the article summarizes:
So, mobile readers will have to actually read Playboy for the articles, with a little lingerie on the side. This could totally work. What could go wrong?
The answer, of course, is everything.
Filed Under: apps, nudity
Companies: apple, playboy
Apparently Stripping Nude To Protest TSA Search Is Protected By The First Amendment
from the in-oregon-at-least dept
A few months ago, you may have heard about John Brennan, who felt that the TSA screening procedures at Portland International Airport amounted to harassment. In protest, he stripped naked… and was promptly arrested for disorderly conduct and indecent exposure. However, a court has now acquitted Brennan, ruling that the stripping was an act of public protest, and thus protected by the First Amendment. The judge pointed out that there’s already state precedent in Oregon that anti-nudity laws “do not apply in cases of protest.”
“It is the speech itself that the state is seeking to punish, and that it cannot do,” Circuit Judge David Rees said.
The DA who prosecuted the case is complaining that now anyone arrested for indecent exposure can just claim that it’s a protest.
Deputy District Attorney Joel Petersen argued that Brennan only spoke of a protest minutes later. Petersen urged the judge to recognize that distinction, “otherwise any other person who is ever naked will be able to state after the fact” that it was done in protest.
Of course, this now raises the troubling (or appealing, depending on your nature) idea that stripping at the front of the TSA line may become more popular. That said, if you’re now… er… itching to disrobe in front of the TSA, it’s worth noting that this ruling is specific to Oregon, and who knows how other states might deal with the same issue.
Filed Under: free speech, nudity, protest, search, tsa