In this post, I’m going to briefly discuss the potential benefits of exploring affective computing technologies in smart homes and smart environments in general, especially when it comes to providing health and social services. Today’s smart home technologies come with some familiar problems: they annoy residents with false alarms and unwanted reactions. When an elderly person or a person with a disability lives in a smart home, one would expect that home to recognize which situations really pose a danger or threat to them and act accordingly. Sensor technologies have advanced a great deal in detecting potential dangers and threats, but there is one thing they still cannot account for: human behavior. Behavior outside of what the sensor or the machine has learned can lead to false alarms, unwanted disturbances and, ultimately, rejection of such solutions by elderly people and people with disabilities. All of this could be avoided if sensor signals were reinforced with further information about the resident’s state at the moment: what was their emotional state, what were they doing when the alarm went off, how are they reacting to what happened? Much of this information could be gathered by affective computing solutions.
At Universidad Carlos III de Madrid, scientists developed a machine able to recognize a person’s feelings based on their speech interaction with it. It can recognize anger, doubt and boredom and adjust the conversation accordingly. At Binghamton University, scientist Lijun Yin is working with psychologist Peter Gerhardstein, collecting a large set of facial expression images and mapping them to emotions in order to produce software that will recognize emotion from a person’s facial expression. The EU has also funded research in the area, with HUMAINE being one of the projects in that direction. The HUMAINE project aims to lay the foundations for European development of systems that can register, model and/or influence human emotional and emotion-related states – ’emotion-oriented systems’. In general, the area of affective computing is showing vast potential for recognizing people’s emotions in the future.
One of the areas where affective computing solutions can be used is smart home environments for health care monitoring services. Such environments could in the future detect a person’s emotions and respond accordingly: if residents feel confident with what they are doing, the environment could reduce the assistance provided, and if they show signs of anxiety, doubt or other distressing feelings, it could come up with appropriate responses. Smart homes of the future are expected to understand implicit behavioral cues such as facial expressions, movements and voice, and react accordingly. This technology could also give monitoring and care services additional parameters for evaluating the importance of a situation when something unexpected happens to the person receiving care.
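To make the idea concrete, here is a minimal sketch of how such an adjustment could work. Everything here is hypothetical: the emotion labels, the confidence threshold and the `assistance_level` function are illustrative stand-ins, not an existing API, and a real system would use far richer state than a single label.

```python
# Hypothetical sketch: mapping a detected emotional state to an
# assistance strategy. Labels and the 0.5 threshold are illustrative.

def assistance_level(emotion: str, confidence: float) -> str:
    """Choose an assistance strategy from a detected emotion."""
    if confidence < 0.5:
        return "observe"          # detection too uncertain to act on
    if emotion in ("confident", "calm"):
        return "reduce_prompts"   # resident is coping well; step back
    if emotion in ("anxiety", "doubt", "frustration"):
        return "offer_help"       # proactively offer guidance
    return "maintain"             # no change for neutral states

print(assistance_level("anxiety", 0.8))    # offer_help
print(assistance_level("confident", 0.9))  # reduce_prompts
```

The point of the sketch is the shape of the decision, not the particular rules: the affective channel acts as a modulator on top of whatever services the smart home already provides.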
In such an environment, for example, when a person wakes up and stays in bed because they are in pain, the appropriate alarms and reactions will be triggered, while nothing will happen if the person woke up and is just enjoying a few more moments of relaxation in bed, looking out the window. Sensors in the bed will cooperate with wake-up alarm clocks and cameras or microphones in the room to detect which is the case. Further research in affective computing will boost the applications of smart home environments and beyond; this technology can be applied across many aspects of living apart from health-related services. A number of research issues should first be addressed, including privacy concerns, acceptance by users and the integration of different emotion detection technologies, but we’ll come back to those in another post.
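The wake-up scenario above can be sketched as a small sensor-fusion rule. Again, this is only an illustration under assumed inputs: the `Observation` fields, the 30-minute grace period and the emotion labels are all hypothetical choices, not part of any real product.

```python
# Hypothetical sketch: combining a bed pressure sensor, the wake-up
# alarm clock and an affect-recognition channel (camera/microphone)
# before deciding whether to raise an alert.
from dataclasses import dataclass

@dataclass
class Observation:
    in_bed: bool             # bed pressure sensor reading
    minutes_since_alarm: int # time elapsed since the wake-up alarm
    emotion: str             # label from affect recognition

def should_alert(obs: Observation) -> bool:
    """Alert only when sensor data and affective cues agree."""
    if not obs.in_bed:
        return False
    if obs.minutes_since_alarm < 30:
        return False  # a short lie-in is not an emergency
    # Staying in bed long after the alarm only matters if the
    # affective channel suggests something is wrong.
    return obs.emotion in ("pain", "distress")

print(should_alert(Observation(True, 45, "relaxed")))  # False
print(should_alert(Observation(True, 45, "pain")))     # True
```

Without the affective channel, the first case above would be a classic false alarm; with it, the same bed-sensor reading leads to two different outcomes.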
Until then, be safe and don’t play with the fire alarm sensor!