posted by S. Abbas Raza in 3 Quarks Daily
Richard Johnson in IEEE Spectrum:
Affective computing systems are being developed to recognize, interpret, and process human experiences and emotions. They all rely on extensive human behavioral data, captured by various kinds of hardware and processed by sophisticated machine-learning software.
AI-based software lies at the heart of each system’s ability to interpret and act on users’ emotional cues. These systems identify nuances in behavioral data and link them to the associated emotions.
The most obvious types of hardware for collecting behavioral data are cameras and other scanning devices that monitor facial expressions, eye movements, gestures, and postures. This data can be processed to identify subtle microexpressions that human observers might struggle to spot consistently.
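To make the facial-data step concrete, here is a minimal sketch of how such a pipeline might map landmark-derived measurements to emotion labels. Everything in it is an assumption for illustration: the feature names (brow raise, mouth curvature, eye openness), the three-emotion label set, and the synthetic training data all stand in for the proprietary camera pipelines and datasets these systems actually use.

```python
# Illustrative sketch only: maps hand-crafted facial-landmark features
# (assumed to be extracted upstream by a camera pipeline) to emotion labels.
# Feature names, labels, and synthetic data are hypothetical, not from the article.
import numpy as np
from sklearn.ensemble import RandomForestClassifier

rng = np.random.default_rng(0)
EMOTIONS = ["neutral", "happy", "surprised"]  # assumed label set

# Synthetic stand-in for real training data: each row is
# [brow_raise, mouth_curve, eye_openness] in normalized units.
X_train = rng.normal(size=(300, 3))
# Toy labeling rule so the classifier has structure to learn:
# raised brows + wide eyes -> "surprised"; upturned mouth -> "happy".
y_train = np.where(
    (X_train[:, 0] > 0.5) & (X_train[:, 2] > 0.5), 2,
    np.where(X_train[:, 1] > 0.5, 1, 0),
)

clf = RandomForestClassifier(n_estimators=100, random_state=0)
clf.fit(X_train, y_train)

# Features from a single incoming video frame, as a real system
# might receive them frame by frame.
frame_features = np.array([[0.9, -0.1, 0.8]])  # high brow raise, wide eyes
print(EMOTIONS[clf.predict(frame_features)[0]])  # -> "surprised"
```

In a deployed system the classifier would of course be trained on labeled recordings rather than a toy rule, but the shape of the problem is the same: per-frame feature vectors in, an emotion label out.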
What’s more, high-end audio equipment records variances and textures in users’ voices. Some insurance companies are experimenting with call voice analytics that can detect whether a caller is lying to their claims handlers.
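As a rough sketch of what “voice analytics” might compute at the feature level, the snippet below extracts two basic per-frame acoustic measures, short-time energy and zero-crossing rate, of the kind commonly fed to downstream classifiers. The synthetic tone, sample rate, and window sizes are assumptions for the demo; a real deployment would analyze recorded call audio with far richer features such as pitch contours and spectral texture.

```python
# Illustrative sketch: basic per-frame acoustic features of the kind a
# voice-analytics pipeline might feed into a classifier. The waveform here
# is a synthetic tone; real systems would analyze recorded call audio.
import numpy as np

SR = 16_000   # assumed sample rate (Hz)
FRAME = 400   # 25 ms analysis window
HOP = 160     # 10 ms hop between windows

t = np.arange(SR) / SR                    # one second of audio
wave = 0.5 * np.sin(2 * np.pi * 220 * t)  # synthetic 220 Hz tone
wave += 0.01 * np.random.default_rng(0).normal(size=wave.shape)  # mild noise

def frame_features(x, frame=FRAME, hop=HOP):
    """Short-time energy and zero-crossing rate per analysis window."""
    feats = []
    for start in range(0, len(x) - frame + 1, hop):
        w = x[start:start + frame]
        energy = float(np.mean(w ** 2))
        zcr = float(np.mean(np.abs(np.diff(np.sign(w))) > 0))
        feats.append((energy, zcr))
    return np.array(feats)

feats = frame_features(wave)
print(f"{len(feats)} frames; mean energy {feats[:, 0].mean():.4f}, "
      f"mean ZCR {feats[:, 1].mean():.3f}")
```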
More here. [Thanks to Brian Whitney.]