Research in Humanoid Robotics and Artificial Intelligence
Project no.2: Euclid
Euclid is a fully autonomous social robot with automatic speech recognition (ASR), computer vision, a human-modelled AI personality, skin sensors, and a unique robotic mouth that uses machine learning to generate visemes. The robot is designed for social and assistive interactions with older adults.
Project no.3: Machine Learning for Robotic Lipsync
Recently included in BBC Science Focus's list of the top 13 scientific moments of 2021, my research in developing a machine learning tool to extract visemes from synthesised speech to control robotic lip synchronisation attracted considerable attention in the robotics community.
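To make the viseme idea concrete, the sketch below shows a deliberately simplified, rule-based phoneme-to-viseme lookup. This is not the machine learning tool described above, and the phoneme labels and viseme names are illustrative assumptions only; real systems learn this mapping from data and use much richer viseme inventories.

```python
# Assumed minimal phoneme-to-viseme lookup for illustration only.
PHONEME_TO_VISEME = {
    "p": "bilabial", "b": "bilabial", "m": "bilabial",
    "f": "labiodental", "v": "labiodental",
    "aa": "open", "iy": "spread", "uw": "rounded",
}

def visemes_for(phonemes):
    """Map a phoneme sequence to viseme targets, merging adjacent repeats
    so the robotic mouth holds one pose across identical neighbours."""
    targets = []
    for p in phonemes:
        v = PHONEME_TO_VISEME.get(p, "neutral")  # fall back to a rest pose
        if not targets or targets[-1] != v:
            targets.append(v)
    return targets
```

For example, `visemes_for(["p", "b", "aa"])` collapses the two bilabial stops into a single mouth pose followed by an open vowel pose.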
Project no.4: Robotic Eyes that Dilate to Light and Emotion
When the robot is interacting with someone, a camera feeds machine-learning software that predicts their emotional state from their facial expressions. The prediction is classified as positive or negative, and the robot's pupils dilate or constrict accordingly. Similarly, in light mode, the robot's pupils dilate in darkness and constrict in brighter conditions. This research featured on the front cover of the December 2021 HCI special issue of the MDPI journal Informatics.
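The two modes described above can be sketched as simple mappings from an input signal to a pupil scale. This is a minimal illustration, not the project's actual controller: the scale bounds and emotion values below are assumed for the example.

```python
def pupil_scale_from_light(brightness: float) -> float:
    """Light mode: dilate in darkness, constrict in bright conditions.

    brightness is normalised to [0.0, 1.0]; the pupil scale is a linear
    interpolation between assumed minimum and maximum sizes.
    """
    MIN_SCALE, MAX_SCALE = 0.3, 1.0  # assumed pupil size limits
    b = min(max(brightness, 0.0), 1.0)  # clamp sensor noise into range
    return MAX_SCALE - b * (MAX_SCALE - MIN_SCALE)

def pupil_scale_from_emotion(valence: str) -> float:
    """Emotion mode: dilate for a positive prediction, constrict for a
    negative one; unknown labels fall back to a neutral size."""
    return {"positive": 0.9, "negative": 0.4}.get(valence, 0.65)
```

In darkness (`brightness = 0.0`) the light-mode function returns the full dilation of 1.0, while a positive emotion prediction yields a larger pupil than a negative one.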
Project no.5: Multimodal Turing Test
A video introduction by Staffordshire University to my research in developing a multimodal Turing test to evaluate the authenticity of humanoid robots with embodied artificial intelligence. The test has recently been employed by leading roboticists at the Intelligent Robotics Laboratory at Osaka University, Japan.