Adam Higginbotham reports on advances in computers and biosensors that can read emotion. The technology is being used to monitor autistic children and head off meltdowns before they begin, and also in advertising. Higginbotham plays guinea pig:
After a camera check, as the machine makes sure my face fills the frame and is well enough lit to be analysed, a commercial for Doritos begins: two hefty frat-boy types in their living room; one complains that the other has eaten all the crisps. "Relax, bro-chaco," he replies, "this new phone I got will get us anything we want." He demonstrates, by asking the phone to send more Doritos, and then a sombrero, which magically plink into existence around him. His friend takes over: "Send three hot, wild girls." "Sending three Rottweilers," replies the phone. Uh-oh.

After the punch line — three women in low-cut outfits left in the suddenly deserted room, asking, "So… why are we here again?" — there's another pause while the machine transfers the video of my face into the cloud for processing, inferring emotional state from my expressions. It then presents its analysis of my reaction on a five-layer graph mapping a video strip of the ad against fluctuating emotion tracks: smile, surprise, confusion/dislike, attention and valence, or the intensity of feeling. My response is apparently close to the global average: a slowly rising track of smile and surprise, peaking with the appearance of the barking dogs; broadly, the ad is a success.