Today's robots must be precisely programmed for each step of a given task, but the move towards autonomous systems will see robots reacting intelligently to their surroundings and performing tasks largely independently. To do this they will need to rely on their own sensory perceptions. In harsh environments, however, where fumes, dust, water, high temperatures or low visibility overwhelm conventional sensors, new senses are called for – perhaps even sensory organs that humans lack. Researchers have fitted an underwater robot with an artificial sensory organ, inspired by the lateral line system found in fish and some amphibians, that lets it orient itself in murky waters.
One of the biggest challenges facing robotics is teaching machines to perceive their surroundings and make sense of what they see. Duplicating the full complexity of human perception is next to impossible, so researchers at Cognition for Technical Systems (CoTeSys) in Munich are instead using a 'flight simulator' to study how blowflies process images. Despite having a brain the size of a pinhead, a fly can process and interpret 100 discrete images per second – four times as many as humans can.