- Scientists have created a robot that can play Jenga by sensing when to push and pull at specific blocks
- The machine, developed by MIT engineers, is equipped with a force-sensing cuff
- It’s also endowed with an external camera, which helps it to ‘learn’ the game
- Scientists say it is just as skilful at the game as human players are
A robotic arm capable of playing the popular game Jenga has been built by American engineers.
The machine, developed by MIT engineers, is equipped with a soft-pronged gripper, a force-sensing wrist cuff and an external camera.
This enables it to see and feel the movement of the tower and adjust for each individual block.
It monitors feedback from the blocks and makes subtle adjustments to avoid toppling the tower and losing the game.
A computer takes in visual and tactile feedback via the camera and cuff, and compares these measurements to moves that the robot previously made.
In real-time, the robot then ‘learns’ whether to keep pushing or move to a new block, in order to keep the tower from falling.
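That push-or-retract decision can be illustrated with a minimal sketch. To be clear, the function names, measurement values and nearest-neighbour rule below are assumptions for illustration only, not the MIT team's actual code:

```python
# Illustrative sketch only: a robot compares a new force/displacement
# reading against remembered outcomes from past pushes and decides
# whether to keep pushing or retract. All values are invented.

def classify_block(force, displacement, history):
    """Pick the most similar past measurement (nearest neighbour) and
    reuse its outcome to decide the next action."""
    nearest = min(
        history,
        key=lambda m: (m["force"] - force) ** 2
                      + (m["displacement"] - displacement) ** 2,
    )
    # If a similar push previously shifted the tower, back off and
    # try a different block; otherwise it is safe to keep pushing.
    return "keep_pushing" if nearest["outcome"] == "safe" else "retract"

# Past measurements the robot 'remembers' (fabricated example data).
history = [
    {"force": 0.2, "displacement": 5.0, "outcome": "safe"},        # loose block
    {"force": 1.5, "displacement": 0.5, "outcome": "tower_moved"}, # stuck block
]

print(classify_block(0.3, 4.0, history))  # low force, block slides easily
print(classify_block(1.4, 0.6, history))  # high force, block barely moving
```

The real system learns from far richer data, but the core idea is the same: physical outcomes from earlier pushes guide each new decision in real time.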
Details of the Jenga-playing robot are published in the journal Science Robotics.
Professor Alberto Rodriguez, from the Department of Mechanical Engineering at MIT, says the robot demonstrates the ability to quickly learn, not just from visual cues, but also from tactile, physical interactions.
Previous systems have struggled to master this.
‘Unlike in more purely cognitive tasks or games such as chess or Go, playing the game of Jenga also requires mastery of physical skills such as probing, pushing, pulling, placing, and aligning pieces,’ Professor Rodriguez says.
‘It requires interactive perception and manipulation, where you have to go and touch the tower to learn how and when to move blocks,’ he says.
‘This is very difficult to simulate, so the robot has to learn in the real world, by interacting with the real Jenga tower.
‘The key challenge is to learn from a relatively small number of experiments by exploiting common sense about objects and physics.’
He says the learning system can be used in tasks that require careful physical interaction such as separating recyclable objects from landfill trash and assembling consumer products.
‘In a cellphone assembly line, in almost every single step, the feeling of a snap-fit, or a threaded screw, is coming from force and touch rather than vision,’ he says.
‘Learning models for those actions is prime real-estate for this kind of technology.’
The team carried out a few informal trials, comparing the robot’s performance against that of several human volunteers, and found the difference was minimal.
For now, however, the team is less interested in developing a robotic Jenga champion, and more focused on applying the robot’s new skills to other areas.
‘There are many tasks that we do with our hands where the feeling of doing it “the right way” comes in the language of forces and tactile cues,’ Professor Rodriguez says.
‘For tasks like these, a similar approach to ours could figure it out.’
HOW WILL ROBOTS CHANGE THE WORKPLACE BY 2022?
The World Economic Forum has unveiled its latest predictions for the future of jobs.
Its 2018 report surveyed executives representing 15 million employees in 20 economies.
The non-profit expects robots, AI and other forms of automation to drastically change the workplace within the next four years.
Jobs predicted to be displaced: 75 million
Jobs predicted to be created: 133 million
Share of workforce requiring re-/upskilling: 54 per cent
Companies expecting to cut permanent workforce: 50 per cent
Companies expecting to hire specialist contractors: 48 per cent
Companies expecting to grow workforce: 38 per cent
Companies expecting automation to grow workforce: 28 per cent