There were times when grown-ups looked forward to the announcement of new technology the way they once looked forward to Christmas. Nowadays one encounters this almost childlike enthusiasm less and less. In an age of technical surveillance, descriptions of upcoming products almost sound like a threat. The latest example comes from Amazon, where a gadget is being developed that is supposed to recognize human emotions.
According to internal documents, the device is meant to be worn on the wrist. Its software will use microphones to evaluate the wearer's emotional state. The technology is supposed to help people interact "more effectively", whatever that means in Amazon's reading. It distinguishes between "joy, anger, sadness, fear, boredom" and half a dozen other feelings. The corresponding patent application, filed last year, shows that Amazon could tailor its product recommendations to the information provided by "Dylan". If you feel joy, you might get an offer for pop music or a trampoline; in the case of worry or sadness, over-the-counter mood enhancers would probably top the list.
It is still unclear how far the project, codenamed "Dylan," has progressed, or whether it will ever come to market. At Amazon, developers are given plenty of room for thought experiments, and every idea that doesn't fizzle out immediately seems to end up as a patent filing. In recent years, the list has included sketches of autonomous drone swarms, tunnel-based delivery systems, and underwater logistics centers. The submissions range somewhere between routine Silicon Valley hubris and 007-supervillain territory.
Computers decoding people's feelings from video or voice recordings is a popular motif in science fiction. In reality, the field is called affective computing, and it appears to be equal parts psychology, artificial intelligence, and voodoo.
So far, the discipline has primarily been the domain of ambitious start-ups rather than tech monopolists. By now, however, entrepreneurs and investors are expecting billions in business. One company that has made a name for itself in recent years is, fittingly, called Affectiva. It recently announced the development of new software called Human Perception AI, which, according to its developers, aims to give technology "emotional intelligence".
The goal is not just basic states like fear or anger but more complex sensations; in addition to speech, facial expressions are also analyzed. The PR departments in charge boldly manage to portray this as an act of humanization: in order to become even more useful, technology must be able to understand its users on a deeply human level. The ethical questions this raises will be dealt with at some later date.
Unlike Amazon, Affectiva isn't aiming merely at banal consumer decisions but at the total decoding of human feelings. Thanks to the omnipresence of the corresponding sensors, a smartphone could recognize whether its user is really concentrating, just as cars already monitor their drivers and warn them when they are tired. If, as seems likely, that car is one day an autonomous vehicle, it would also know whether its driving style is making the occupants nauseous.