Kingston, ON, April 23, 2003 – With increasing numbers of digital devices vying for our attention and time, researchers from the Human Media Lab (HML) at Queen’s University have developed a new concept that allows computers to pay attention to their users’ needs.
HML researchers at the university’s School of Computing are tackling the barrage of messages people receive from large numbers of digital appliances. Their Attentive User Interface (AUI) is a new paradigm for interacting with groups of computers that moves beyond the traditional desktop interface.
Current computers are generally designed to act in isolation, producing distracting interruptions without considering what the user is doing. As a result, today’s users struggle to keep up with the volume of email, instant messages, phone calls and appointment notifications.
"Today’s digital lifestyle has the unfortunate side effect of bombarding people with messages from many devices all the time, regardless of whether they’re willing, or able to respond," says Dr Roel Vertegaal, HML director. "Like spam [unsolicited e-mail], this problem needs to be addressed." The HML team is designing devices that determine the level of user attention and the importance of each message relative to what the user is doing. Then the computer decides whether to ‘take a turn’ to deliver the message.
In early April, Dr Vertegaal and students Jeffrey Shell, Alexander Skaburskis and Connor Dickie presented their findings at the Association for Computing Machinery’s CHI 2003 Conference on Human Factors in Computing Systems in Fort Lauderdale, Florida.
"The way that we use computers has fundamentally changed," says Dr Vertegaal. "There has been a shift over the past four decades from many users sharing a single mainframe computer, to a single user with a single PC, to many people using many portable, networked devices.
"We now need computers that sense when we are busy, when we are available for interruption, and know when to wait their turn just as we do in human-to-human interactions," he says. "We’re moving computers from the realm of being merely tools, to being sociable appliances that can recognize and respond to some of the non-verbal cues humans use in group conversation."
Many of the Queen’s team’s discoveries are rooted in their research into the function of eye contact in managing human group conversation. One of the main underlying technologies they developed is an eye contact sensor that allows each device to determine whether a user is present and, further, whether that user is looking at the device. This lets devices establish what the user is attending to, and decide whether, when, and how to contact the user.
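The announcement does not detail the sensor’s interface. As a rough sketch only, a device-side loop gated on the sensor’s two readings (presence and gaze) could look like this, with read_eye_contact_sensor and try_notify as hypothetical stand-ins rather than HML’s actual code:

```python
# Illustrative sketch of gating a notification on an eye contact sensor.
# The sensor API shown here is a hypothetical stand-in.

import time

def read_eye_contact_sensor():
    """Stand-in for the sensor: returns (user_present, looking_at_device)."""
    return True, False  # e.g. user at their desk, eyes elsewhere

def try_notify(message: str, timeout_s: float = 10.0) -> bool:
    """Deliver the message only once the user glances at this device;
    give up quietly after a timeout so the device never nags."""
    deadline = time.monotonic() + timeout_s
    while time.monotonic() < deadline:
        present, looking = read_eye_contact_sensor()
        if not present:
            return False        # nobody here: stay silent
        if looking:
            print(message)      # user attended to us: take our turn
            return True
        time.sleep(0.2)         # user busy elsewhere: wait politely
    return False
```

The point of the design is that the device, not the user, absorbs the cost of waiting: it only speaks when looked at, mirroring the turn-taking cues of human conversation.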
Funding support for the Human Media Lab comes from the Premier’s Research Excellence Awards, the Natural Sciences and Engineering Research Council, the Institute for Robotics and Intelligent Systems, Communications and Information Technology Ontario, and Microsoft Research.
HML works in collaboration with the IBM Almaden Research Center in San Jose, California, and Microsoft Research in Redmond, Washington.