Visual control of dynamic actuators
by Admin
This student project is a collaboration with Hugo Marques (ailab UZH, BIRL ETHZ), whose research focuses on the control of bio-inspired robots and their actuators.
Standard humanoid robots mimic the human form, but the mechanisms used in such robots are very different from those in humans, and the characteristics of the robots reflect this. This places severe limitations on the kinds of interactions such robots can engage in, on the knowledge they can acquire of their environment, and therefore on the nature of their cognitive engagement with the environment.
Therefore, a new kind of robot is being developed – the anthropomimetic robot (e.g. the ECCE robot). Instead of merely copying the outward form of a human, it copies the inner structures and mechanisms – bones, joints, muscles, and tendons – and thus has the potential for human-like action and interaction in the world.
Due to the highly non-linear nature of such actuators and the lack of adequate sensing, they are very hard to control. This project aims to partially overcome this problem by using visual feedback from a Dynamic Vision Sensor (DVS). The goal is to construct a self-organizing algorithm that learns the relations between the visual input and the actuator output in a model system.
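To illustrate the idea of self-organized learning of a visual-to-motor relation, here is a minimal sketch: a linear map from visual feedback to motor commands is learned online with an error-correcting (LMS-style) update during random motor babbling. Everything here is an illustrative assumption – the toy linear "plant" stands in for the real tendon-driven actuator and DVS pipeline, which are far more complex and non-linear.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy stand-in for the unknown actuator-to-vision relation (assumption:
# the real system is highly non-linear; a linear map keeps the sketch short).
true_map = np.array([[0.8, -0.3],
                     [0.2,  0.5]])

W = np.zeros((2, 2))   # learned mapping from visual features to motor commands
eta = 0.1              # learning rate

for _ in range(2000):
    u = rng.uniform(-1, 1, size=2)     # random motor babbling command
    v = true_map @ u                   # observed visual feedback for that command
    u_hat = W @ v                      # command the model would infer from vision
    W += eta * np.outer(u - u_hat, v)  # error-correcting (LMS) update

# After training, W approximately inverts the plant,
# i.e. W @ (true_map @ u) is close to u for new commands.
u_test = np.array([0.5, -0.4])
recovered = W @ (true_map @ u_test)
```

In this realizable linear setting the update converges to the inverse of the plant map; for the actual robot a non-linear, locally adaptive model (and richer DVS features than a two-dimensional vector) would be needed, but the babbling-and-correction loop stays the same.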
Contact: Christian