Applying Bionics to Robotics

  • Services

    Industrial Automation


As a purveyor of motion-control technologies, Festo is unsurprisingly fixated on industrial motion. What differentiates Festo from other industrial motion technology companies is its focus on the biological underpinnings of motion.

Over the past decade we’ve seen Festo spotlight a number of robotic devices designed to mimic various forms of biological motion, such as those of seagulls, penguins, elephant trunks, ants, kangaroos and octopuses. Festo’s research into these kinds of biological motion is not done solely to show off the capabilities of its R&D team, but to inform its ongoing design of industrial motion technologies. The company’s recent announcement that it has developed a BionicCobot (collaborative robot) highlights how that bionic research is now being applied to real-world industrial products.


Festo says the movement patterns of the BionicCobot are modelled on the human arm, from the shoulder via the upper arm, elbow, radius and ulna down to its gripping hand. “Each of its seven joints makes use of the natural operating mechanism of the biceps and triceps—the efficient interplay of flexor and extensor muscles,” the company says in its announcement of the BionicCobot.
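The flexor/extensor pairing Festo describes is a classic antagonistic actuation scheme: the difference between the two opposing actuators' efforts sets the net joint torque, while their sum sets the joint's stiffness. The sketch below illustrates that idea with a simple linear model; the function names, gains and pressure values are illustrative assumptions, not Festo's actual control model or API.

```python
# Illustrative antagonistic-actuation model for one joint, loosely based on
# the biceps/triceps interplay described in the article. All names and
# constants here are hypothetical.

def joint_torque(p_flexor: float, p_extensor: float, gain: float = 0.5) -> float:
    """Net joint torque from two opposing pneumatic chambers (pressures in bar)."""
    return gain * (p_flexor - p_extensor)

def joint_stiffness(p_flexor: float, p_extensor: float, k: float = 0.2) -> float:
    """Co-contraction: pressurising both sides raises stiffness, not torque."""
    return k * (p_flexor + p_extensor)

# Two commands with the same net torque but very different compliance:
soft  = (joint_torque(2.0, 1.0), joint_stiffness(2.0, 1.0))  # readily yielding
stiff = (joint_torque(5.0, 4.0), joint_stiffness(5.0, 4.0))  # powerful, rigid
```

This decoupling of torque from stiffness is what lets an antagonistic joint be either "powerful and dynamic" or "sensitive and readily yielding" without changing its pose.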

The appeal of cobots is their ability to work alongside humans, rather than in a caged or heavily sensor-monitored environment that halts robot motion whenever a human is nearby. To deliver on the BionicCobot’s ability to work alongside humans, Festo says the movements of the cobot can be “finely regulated so as to be either powerful and dynamic, or sensitive and readily yielding; the system therefore cannot endanger humans even in the case of a collision. This is made possible by the Festo Motion Terminal, a pneumatic automation platform that unites high-precision mechanics, sensors and complex control and measuring technology within a very small space.”
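One common way to realise this kind of collision safety is a governor that caps actuator effort and yields when contact force exceeds a limit. The sketch below shows that pattern in minimal form; the thresholds and function names are made-up assumptions, not details of the Motion Terminal.

```python
# Hypothetical safety governor for a pneumatic cobot joint: limit commanded
# pressure in normal operation, and vent (go limp) on excessive contact force.
# Limits below are illustrative, not Festo specifications.

def safe_pressure(commanded: float, measured_force: float,
                  force_limit: float = 50.0, max_pressure: float = 6.0) -> float:
    """Return a pressure command that yields instead of pushing through contact."""
    if measured_force > force_limit:
        return 0.0  # vent the chamber: the arm yields rather than injures
    return min(commanded, max_pressure)  # never exceed the rated pressure
```

A pneumatic arm has a safety advantage here: venting a chamber removes the driving force almost immediately, so yielding is a passive property of the actuator rather than something a motor controller must actively compute.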

The robot interface developed by Festo lets users teach the BionicCobot the actions it should perform and parameterise them. The defined work steps can then be arranged in any order via drag-and-drop on a sequencer track; the resulting movement sequence is depicted and simultaneously simulated.
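The teach-then-sequence workflow described above can be sketched as a small data model: taught actions accumulate on a track, the track can be freely reordered (the drag-and-drop step), and running it replays the steps in order. All class and parameter names below are hypothetical, invented purely to illustrate the workflow.

```python
# Toy model of a teach-and-sequence interface. None of these names reflect
# Festo's actual software; they only mirror the workflow in the article.

from dataclasses import dataclass, field

@dataclass
class Action:
    name: str       # e.g. a taught motion such as "pick" or "place"
    params: dict    # user-set parameters for that action

@dataclass
class Sequencer:
    track: list = field(default_factory=list)

    def teach(self, action: Action) -> None:
        """Record a taught action at the end of the track."""
        self.track.append(action)

    def reorder(self, order: list) -> None:
        """Mimic drag-and-drop: rearrange taught steps by index."""
        self.track = [self.track[i] for i in order]

    def run(self) -> list:
        """Simulate the sequence, returning executed step names in order."""
        return [a.name for a in self.track]

seq = Sequencer()
seq.teach(Action("pick", {"grip_force": 0.3}))
seq.teach(Action("move", {"to": "bin"}))
seq.teach(Action("place", {}))
seq.reorder([2, 0, 1])  # drag "place" to the front
```

Separating the taught actions from their ordering is what makes the "implementation in any order" claim cheap: rearranging the track never requires re-teaching a step.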