iCub Tutorial
We've brought our iCub humanoid robot from the Italian Institute of Technology. This iCub comes equipped with two ATIS neuromorphic vision sensors, tactile skin over the arms and torso, an IMU in the head, and encoders reporting the position of each joint. We can control the robot in 6-DOF for the head and eyes, and with even more degrees of freedom for the arms and torso. All of the sensor values, and the commands to move the robot, can be easily accessed by connecting to the robot network and installing the YARP middleware. Software can be written in C++ or Python, and we can interface to a 48-chip SpiNNaker board. It should also be possible to bridge connections to other neuromorphic chips - but we'll need to discuss it!
This workgroup is a single tutorial to show the robot and explain how to interface to its sensors and motors. Please come along if you:
1. Are just interested in seeing the robot and what it can do.
2. Have a cool network that might be able to interface to the robot.
3. Are interested in some of the projects we plan to do with it (visual attention mechanisms, auditory attention mechanisms, head-direction networks).
For anyone interested, we can continue the tutorial as a hands-on session on installing the required software to talk to the robot. Please also ask me at any time if you have questions about the iCub robot!
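To give a taste of what the hands-on session covers, here is a minimal Python sketch of what "talking to the robot" over YARP looks like: reading the head encoders and sending a position command. It assumes the YARP Python bindings are installed and uses the standard iCub port names (`/icub/head/state:o`, `/icub/head`); the exact names on our setup may differ, so treat it as a sketch rather than a copy-paste recipe.

```python
import yarp

# Connect to the YARP network (a yarpserver must be reachable on the robot network)
yarp.Network.init()

# --- Read the head joint encoders (port name assumed; check with `yarp name list`) ---
state_port = yarp.BufferedPortBottle()
state_port.open("/tutorial/head_state:i")
yarp.Network.connect("/icub/head/state:o", "/tutorial/head_state:i")

bottle = state_port.read()  # blocking read of one encoder sample
if bottle is not None:
    angles = [bottle.get(i).asFloat64() for i in range(bottle.size())]
    print("Head joint angles [deg]:", angles)

# --- Send a position command to the head via the remote control board ---
options = yarp.Property()
options.put("device", "remote_controlboard")
options.put("local", "/tutorial/head")   # our client-side port prefix
options.put("remote", "/icub/head")      # robot-side control board (assumed name)
driver = yarp.PolyDriver(options)

ipos = driver.viewIPositionControl()
if ipos is not None:
    ipos.positionMove(0, 10.0)           # move joint 0 (neck pitch) to 10 degrees

driver.close()
state_port.close()
yarp.Network.fini()
```

The same ports can of course be read or written from C++, and the event streams from the ATIS sensors are exposed as YARP ports in just the same way.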
Arren.
Timetable
| Day | Time | Location |
|---|---|---|
| Wed, 24.04.2019 | 21:30 - 22:00 | Disco |