WIT students have turned a domestic lamp into a robotic assistant for the home.
Students at Waterford Institute of Technology (WIT) created Lampbot, a robotic home assistant and ‘friend’, to show how static smart speakers embedded with intelligent assistants might one day become a more animated part of the household.
Lampbot is so called because it is a domestic lamp that has been kitted out with linear actuators, vision systems, a lit-up face, and speech recognition and chatbot technology from Google. Much like a Google Home device, Lampbot can be an answer-giving personal assistant, but the students claim it can also read human emotions and expressions, responding accordingly with its own body language, eyes and headdress.
“What if we wanted to bring more than a robotic voice into our home? … What if we want some movement, maybe even some emotions from our robot?” challenged Jason Berry, a lecturer in WIT’s School of Engineering who runs the institute’s Applied Robotics Lab.
“One of the most powerful ways for humans to communicate with each other is with eye contact. The students leveraged this human trait in the design of the robot, helping the Lampbot to make more of an emotional connection with its humans,” he explained.
Staking their claim as makers of the “world’s first six-axis robotic lamp assistant”, the student team designed and built Lampbot over the course of a few months, tackling the challenge of creating a robotic assistant suited to the home environment rather than a factory or lab.
Ashraf Mustafa, a French student in the second year of a BSc in applied computing for the internet of things, was charged with giving the robot emotional literacy by transferring data from the facial recognition system in the Lampbot’s eyes. Robert Solomon, from the same course, took on the task of adding sound-source tracking to the robot.
“This means that the robot would be able to detect where a sound source was coming from and be able to face the direction of where it came from, giving it a more human-to-robot feel,” Solomon explained.
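The students’ code has not been published, but a common way to achieve this with a pair of microphones is time difference of arrival (TDOA): the lag that maximises the cross-correlation between the two channels reveals the bearing of the source. A minimal Python sketch, assuming a hypothetical microphone spacing and sample rate:

```python
import numpy as np

SPEED_OF_SOUND = 343.0  # m/s in room-temperature air
MIC_SPACING = 0.15      # metres between the two microphones (assumed)
SAMPLE_RATE = 44100     # Hz (assumed)

def direction_of_arrival(left: np.ndarray, right: np.ndarray) -> float:
    """Estimate the bearing of a sound source, in degrees, from two mic channels."""
    # The lag that maximises the cross-correlation tells us how many samples
    # earlier the sound reached one microphone than the other.
    corr = np.correlate(left, right, mode="full")
    lag = np.argmax(corr) - (len(right) - 1)
    delay = lag / SAMPLE_RATE
    # Convert the delay into an angle; clamp to the physically valid range.
    ratio = np.clip(delay * SPEED_OF_SOUND / MIC_SPACING, -1.0, 1.0)
    return float(np.degrees(np.arcsin(ratio)))
```

A robot using this estimate would simply command its pan actuator toward the returned bearing.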
Michael Vereker from Waterford was on the two-person team that gave the lamp movement. “Actuator motors were attached to the lamp and wired back to motor controllers. Software was written to control the lamp’s movement using data sent from vision software,” he explained.
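A minimal sketch of how vision data might drive that movement, assuming a simple proportional controller and a hypothetical face-detection output giving the face’s pixel position in the camera frame:

```python
def face_to_pan_tilt(face_x: float, face_y: float,
                     frame_w: int = 640, frame_h: int = 480,
                     gain: float = 0.5) -> tuple[float, float]:
    """Map a detected face's position in the frame to pan/tilt commands."""
    # Normalised error: how far the face sits from the frame centre, in [-1, 1].
    err_pan = (face_x - frame_w / 2) / (frame_w / 2)
    err_tilt = (face_y - frame_h / 2) / (frame_h / 2)
    # A proportional controller nudges the actuators toward centring the face;
    # on real hardware the gain would be tuned by hand.
    return gain * err_pan, gain * err_tilt

# Example: a face detected left of centre yields a negative (leftward) pan command.
pan_cmd, tilt_cmd = face_to_pan_tilt(face_x=200, face_y=240)
```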
“I enjoyed seeing the final project come together; everybody had their own part to complete (movement, vision, eyes, speech etc) and when it was all connected at the end the lamp really came to life,” Vereker added.
As well as helping design Lampbot’s fetching headdress, BEng graduate Brian Prendergast designed the current display unit, along with future units that could enhance the facial expressions on the Lampbot in later iterations.
He started with the overall design of a microcontroller PCB (printed circuit board), which was built and used to enable communication within a controller area network (CAN).
“The initial PCB implementation included the design and construction of the hardware. This design was then used to communicate with the Lampbot by using programming languages that instructed the various sections of the Lampbot to perform,” Prendergast explained.
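The firmware itself is not described, but from a host computer a CAN exchange of the kind Prendergast outlines might look like the following sketch using the python-can library; the arbitration ID and expression code here are invented for illustration:

```python
import can  # python-can library

# Open the SocketCAN interface the Lampbot's nodes would share.
bus = can.interface.Bus(channel="can0", bustype="socketcan")

EXPRESSION_FRAME_ID = 0x200  # hypothetical arbitration ID for the display node
HAPPY = 0x01                 # hypothetical expression code

# Tell the display node which facial expression to show.
msg = can.Message(arbitration_id=EXPRESSION_FRAME_ID,
                  data=[HAPPY], is_extended_id=False)
bus.send(msg)

# Listen for a status frame coming back from another node on the bus.
reply = bus.recv(timeout=1.0)
if reply is not None:
    print(f"Node 0x{reply.arbitration_id:03X} replied: {reply.data.hex()}")
```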
Prendergast believes the project has a bright future, with the option to enhance Lampbot further with added applications to target specific users, such as children with sensory processing disorders.
“In relation to the sensory processing disorders, this is part of a wider spectrum of autism which was mentioned between my colleagues and myself,” he explained. “My view was to include beneficial stimulation with this project to cater for every child no matter their role in society. There is no personal connection with this topic, but inclusion of the masses is always the best approach to take.”