Intel Wants To Add Unproven ‘Emotion Detection’ AI To Distance Learning Tech

from the and-what-does-disabling-the-camera-denote? dept

Last week, Zoom announced its plans to add emotion-detecting tech to its virtual meeting platform, something it apparently felt would facilitate the art of the deal. Here’s Kate Kaye, breaking the news for Protocol.

Virtual sales meetings have made it tougher than ever for salespeople to read the room. So, some well funded tech providers are stepping in with a bold sales pitch of their own: that AI can not only help sellers communicate better, but detect the “emotional state” of a deal — and the people they’re selling to.

In fact, while AI researchers have attempted to instill human emotion into otherwise cold and calculating robotic machines for decades, sales and customer service software companies including Uniphore and Sybill are building products that use AI in an attempt to help humans understand and respond to human emotion. Virtual meeting powerhouse Zoom also plans to provide similar features in the future.

Advocates for this AI, a group that includes companies like Zoom, claim this unproven tech could make it easier to “build rapport” during virtual meetings or, at the very least, give those performing pitches a heads-up when they’re losing their audiences.

That’s all well and good when we’re talking about a bunch of consenting adults playing sales pitch poker while attempting to Voight-Kampff their way into a competitive edge. Any advantage should be exploited, even if it means subjecting potential customers to AI with no proven track record. It’s unclear how consent to be emotionally analyzed is obtained (or if it’s even sought), but, again, we’re dealing with adults in a sales situation where this sort of manipulation is considered normal behavior.

The problem is that Intel and Classroom Technologies think this same tech should be inflicted on non-consenting minors. Again, it’s Kate Kaye with the news for Protocol.

Rather than simply allow instructors to draw inferences from student facial expressions and behavior, a couple of companies think they can make teachers better by throwing more tech (and surveillance) at their students.

Intel and Classroom Technologies, which sells virtual school software called Class, think there might be a better way. The companies have partnered to integrate an AI-based technology developed by Intel with Class, which runs on top of Zoom. Intel claims its system can detect whether students are bored, distracted or confused by assessing their facial expressions and how they’re interacting with educational content.

“We can give the teacher additional insights to allow them to better communicate,” said Michael Chasen, co-founder and CEO of Classroom Technologies, who said teachers have had trouble engaging with students in virtual classroom environments throughout the pandemic.

This means the cameras always need to be on, even though some instructors are capable of teaching classes without expecting students to open up a window to their home lives via laptop cameras. IM services, microphones, and texting seem to fill the face-to-face void quite capably. The ability to strip things back to text-only communication allows students without access to speedy internet connections to stay connected without exceeding the bandwidth they’re allotted or burning up their data if they’re operating under a cap.

The business version requires always-on cameras to record footage that can then be processed by the emotion detection AI, providing customers with insights into the detected mood swings of their sales pitch recipients. Presumably, the school version will operate the same way until Intel and Classroom Technologies feel the AI has learned enough to go live.

The end result is always-on surveillance of students, with the stated goals of better instruction and more student engagement.

“We are trying to enable one-on-one tutoring at scale,” said [Intel research scientist Sinem] Aslan, adding that the system is intended to help teachers recognize when students need help and to inform how they might alter educational materials based on how students interact with the educational content.

At this point, the product is still in the testing phase. To become a full-fledged offering, it will need significant buy-in from educational institutions. That’s the sort of thing that often happens without consulting the stakeholders most affected by the addition of new in-home surveillance tech: the students who will serve as its testing ground.

Even if the reps for both companies are to be believed when they say the product is intended to help teachers better reach their students, the potential for misuse (or deliberate abuse) is omnipresent. On top of that, most humans are incapable of accurately reading the emotions of others, despite having a lifetime of experience and far better innate learning systems than any AI. Add to that the fact that not all cultures use the same expressions or body language to signal mood shifts, and you’ve got a product with the potential to generate a ton of useless or counterproductive data.

Filed Under: ai, classrooms, emotion detection, surveillance, virtual learning
Companies: classroom technologies, intel, zoom