HP Innovation Issue 17: Spring 2021 | Page 24

FROM THE LABS
HP OMNICEPT

“COGNITIVE LOAD IS REALLY JUST OUR FIRST INFERENCE. WE ASPIRE TO DO A LOT MORE.”

— Tico Ballagas, senior manager, HP Labs


WEI WAS THEN leading a project called “Emotion AI” to train computers to understand and react to users’ emotions. He was trying to understand how to interpret people’s emotions from their biological responses in order to craft an experience that adapts to them.
Ballagas and Wei found common ground in their visions and teamed up to tackle a central problem: We could understand machines, but they couldn’t understand us terribly well. For all their incredible computing power, machines can’t anticipate our feelings, or recognize when our attention is wandering or when we’re upset or tired or anxious. “We saw an opportunity to change that with VR,” Ballagas says.
Wei and Ballagas demonstrated their pilot results to HP’s business unit, which expressed strong interest, and they assembled a crack team, including Mithra Vankipuram, a specialist in data science and user experience; Kevin Smathers, a software engineer; and nearly a dozen other talented scientists.
GETTING INSIDE OUR HEADS

The team brainstormed for hours about how to create a machine that could work with humans. But how could they get inside someone’s head? They needed to measure emotions in real time, and they couldn’t stick people who were at work inside an MRI machine. Instead, they had to rely on proxy measures — indications from the body about emotional and psychological states, such as pupil dilation, which can signal mental strain or arousal, or increased heart rate, which can indicate stress.
“We had a lot of failure stories,” like skin conductivity sensors that didn’t work and EEG sensors that didn’t get clean data, says Wei, the team’s machine-learning maestro.

FOCUSING ON COGNITIVE LOAD

AFTER READING the work of Nobel Prize–winning behavioral economist Daniel Kahneman, Ballagas realized that the team didn’t need all that extra information from brain waves and skin conductivity. They could get great data from just four biometric sensors that tracked changes in pupil dilation, heart and respiratory rates, and head movement. And they could use those measurements to make important inferences about one specific, but very telling, mental state: a person’s cognitive load.
Just like the RAM in a computer, humans can only hold and process so much information at any given time. As we start to do more and increasingly difficult tasks, remember more details, or recall more facts, our brains have to work harder and our cognitive load grows. When our working memory can’t hold any more, we start to forget things, have a hard time focusing, and struggle to keep up with new information coming at us.
Everyone from air-traffic controllers to fighter pilots to surgeons has to manage a careful balance called the Goldilocks Zone: They’re engaged enough to stay focused, but not so challenged and overloaded with information that they get overwhelmed or burn out.
Ballagas reasoned that knowing someone’s cognitive load could help them train more efficiently. The Omnicept could measure cognitive load and see where people were struggling and needed more time and practice, tailoring that training to their unique needs.
But Ballagas also kept an eye on the future, including a camera that captures facial expressions, which could eventually be used to create avatars that mimic users’ faces in real time. “Cognitive load is really just our first inference,” says Ballagas. “We aspire to do a lot more.”