STRETCH Infrastructure and Activity Recognition

After evaluating various off-the-shelf and specially developed sensors, this month we created a shortlist of technologies that will form the basis of our pilot studies.

With the help of colleagues on the Sensor Platform for Healthcare in a Residential Environment (SPHERE) project in Bristol, we were able to deploy their sensor technology in Blaine’s home. This provides us with both environmental sensing (temperature, humidity, luminosity) and presence sensing. We compared this technology with various off-the-shelf Z-Wave sensors, which we ultimately found to be unreliable.

We hope to use these sensors to detect activities performed by our participants. For example, kitchen presence combined with a rise in temperature and humidity could indicate that the participant is cooking a meal.
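As a rough illustration of this kind of sensor fusion, the rule could look like the sketch below. The thresholds and function name are illustrative assumptions, not values we have settled on:

```python
def cooking_likely(kitchen_presence, temp_rise_c, humidity_rise_pct,
                   temp_threshold=2.0, humidity_threshold=5.0):
    """Flag probable cooking: kitchen presence plus a simultaneous rise
    in temperature and humidity over a short observation window.

    Thresholds are placeholders; real values would be tuned per home.
    """
    return (kitchen_presence
            and temp_rise_c >= temp_threshold
            and humidity_rise_pct >= humidity_threshold)
```

In practice the window length and thresholds would be calibrated against ground-truth observations from the pilot homes.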

To aid activity recognition, we have experimented with off-the-shelf energy-monitoring kits to investigate how different activities can be detected by analysing energy usage. For example, detecting that a participant has boiled the kettle can indicate that tea is being made, giving an indirect measure of hydration.
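A kettle is a good target for this kind of analysis because it draws an unusually large, sustained load (typically 2–3 kW for one to a few minutes). A minimal sketch of how such events might be picked out of a regularly sampled whole-home power trace, with illustrative thresholds of our own choosing:

```python
def detect_kettle_events(power_watts, interval_s=10,
                         power_threshold=1800,
                         min_duration_s=60, max_duration_s=300):
    """Scan a power trace (one reading per `interval_s` seconds) for
    sustained high-draw events consistent with a kettle boil.

    Returns a list of (start_time_s, duration_s) tuples. Thresholds
    are placeholder assumptions, not measured values.
    """
    events = []
    start = None
    for i, p in enumerate(list(power_watts) + [0]):  # sentinel closes a trailing event
        if p >= power_threshold:
            if start is None:
                start = i
        elif start is not None:
            duration = (i - start) * interval_s
            if min_duration_s <= duration <= max_duration_s:
                events.append((start * interval_s, duration))
            start = None
    return events
```

A real deployment would also need to disambiguate the kettle from other high-draw appliances (toasters, ovens), for example by duration or by per-appliance plug monitors.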

All of these sensors communicate with our central servers via a “home hub”, a Raspberry Pi computer that receives and processes the sensor data.

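The hub’s core job is to take raw sensor messages, annotate them, and queue them for upload. The message schema and function below are illustrative assumptions rather than our actual implementation:

```python
import json
import queue
import time

def process_reading(raw: bytes, outbox: "queue.Queue") -> None:
    """Parse one JSON sensor message, stamp it with the hub's receive
    time, and queue it for upload to the central server.

    The (sensor_id, type, value) schema is a hypothetical example.
    """
    reading = json.loads(raw)
    reading["received_at"] = time.time()
    outbox.put(reading)
```

Buffering readings in a queue on the hub also gives some resilience to intermittent home broadband outages before upload.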
We have also been analysing data from the SPHERE project’s wrist-worn sensor to recognise basic activities such as walking, walking up and down stairs, sitting, standing, and lying down. The algorithms we developed recognised these activities with high accuracy. We will combine this data with other sensors (such as motion sensors) to further improve activity recognition accuracy.
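Pipelines of this kind typically extract summary features from fixed-length windows of tri-axial accelerometer data and feed them to a classifier; low-variance windows suggest static postures, while walking and stairs produce larger variance. The feature set below is a generic sketch of that idea, not the SPHERE algorithm itself:

```python
import math

def window_features(ax, ay, az):
    """Summary features for one fixed-length window of tri-axial
    accelerometer samples (equal-length lists). These would feed a
    classifier; feature choice here is illustrative only.
    """
    mags = [math.sqrt(x * x + y * y + z * z) for x, y, z in zip(ax, ay, az)]
    n = len(mags)
    mean = sum(mags) / n
    var = sum((m - mean) ** 2 for m in mags) / n
    return {"mean_mag": mean,
            "std_mag": math.sqrt(var),
            "range_mag": max(mags) - min(mags)}
```

On top of such features, distinguishing sitting from standing usually needs orientation information as well as magnitude, which is one reason fusing in other sensors helps.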

We will then move towards recognising more complex activities, especially kitchen activities such as preparing drinks and meals. At this first stage, recognition will rely on data streamed from the environmental and power sensors, but new sensors may be deployed if needed to improve accuracy.