Let's Build Robots That Are as Smart as Babies

29 Oct 2019
Let's face it: Robots are dumb. At best they are idiot savants, ideal for doing one thing very well. Typically, even those robots need carefully controlled environments in which to do that one thing. This is why autonomous cars and robots for home health care are so hard to build: they need to react to an uncountable number of situations, and they need a generalised understanding of the world in order to navigate them all.
 
Children as young as two months already understand that an unsupported object will fall, while five-month-old babies know that materials like sand and water pour from a container rather than plopping out as a single chunk. Robots lack these understandings, which hinders them as they try to navigate the world without a prescribed task and prescribed movements.
 
Yet we could soon see robots with a generalised understanding of the world (and the processing power required to wield it), thanks to the video-game industry. Researchers are bringing physics engines — the software that provides real-time physical interactions in complex video-game worlds — to robotics. The goal is to develop robots' understanding so that they can learn about the world in the same way babies do.
 
Giving robots a baby's sense of physics helps them navigate the real world and can even save on computing power, according to Lochlainn Wilson, the CEO of SE4, a Japanese company building robots that could operate on Mars. SE4 plans to sidestep the latency caused by the distance between Earth and Mars by building robots that can operate independently for a few hours before receiving more instructions from Earth.
 
Wilson says his company uses simple physics engines such as PhysX to help build more-independent robots. He adds that if you can tie a physics engine to a coprocessor on the robot, these real-time basic physics intuitions won't take compute cycles away from the robot's principal processor, which will often be focused on a more complicated task.
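As a rough, purely illustrative sketch of that division of labour (the function names below are hypothetical, and this is not SE4's code), a physics prediction can be pushed onto a separate worker process so the main control loop keeps its own cycles:

from concurrent.futures import ProcessPoolExecutor
import time

def predict_outcome(action_id):
    # Stand-in for a physics-engine rollout of one candidate action.
    time.sleep(0.05)                        # pretend to step a simulation
    return action_id, "predicted stable"    # e.g. the object will not tip over

def main_control_step(step):
    # Stand-in for the robot's primary, latency-sensitive task.
    return "control step %d done" % step

if __name__ == "__main__":
    with ProcessPoolExecutor(max_workers=1) as physics_worker:
        pending = physics_worker.submit(predict_outcome, 7)
        for step in range(3):               # the control loop keeps running meanwhile
            print(main_control_step(step))
        print("physics says:", pending.result())

On a real robot the worker would be pinned to the dedicated coprocessor Wilson describes rather than a spare CPU core, but the scheduling idea is the same.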
 
Wilson's firm sometimes still turns to a traditional graphics engine, such as Unity or the Unreal Engine, to handle the demands of a robot's movement. In some instances, however, such as when a robot must account for friction or understand force, you really need a robust physics engine, Wilson says, not a graphics engine that simply renders a virtual environment. For his projects, he often turns to the open-source Bullet Physics engine built by Erwin Coumans, who is now an employee at Google.
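To make that concrete, here is a minimal sketch using Bullet's Python bindings (pybullet) to give a robot the "unsupported objects fall" intuition mentioned earlier; the scene, the small cube model, and the threshold are illustrative assumptions, not details from SE4 or Google:

import pybullet as p
import pybullet_data

def unsupported_cube_falls(drop_height=0.5, sim_seconds=2.0):
    # Headless simulation: no GUI, just stepping the physics world.
    client = p.connect(p.DIRECT)
    p.setAdditionalSearchPath(pybullet_data.getDataPath())
    p.setGravity(0, 0, -9.81)
    p.loadURDF("plane.urdf")                                   # ground plane
    cube = p.loadURDF("cube_small.urdf", basePosition=[0, 0, drop_height])

    for _ in range(int(sim_seconds * 240)):                    # default step is 1/240 s
        p.stepSimulation()

    (_, _, z_final), _ = p.getBasePositionAndOrientation(cube)
    p.disconnect(client)
    return z_final < drop_height - 0.05                        # it ended up well below its start

print("Unsupported cube falls:", unsupported_cube_falls())     # expected: True

The same rollout-and-check pattern extends to pouring, stacking, or friction: simulate the candidate action first, then act only if the predicted outcome looks safe.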
 
Bullet is a popular physics-engine option, but it isn't the only one out there. Nvidia Corp., for example, has realized that its gaming and physics engines are well placed to handle the computing demands of robots. In a lab in Seattle, Nvidia is working with teams from the University of Washington to build kitchen robots, fully articulated robot hands, and more, all equipped with Nvidia's tech.
 
Such a robot could also understand that less pressure is needed to grasp something like a cardboard box of Cheez-It crackers than something more rigid like an aluminum can of tomato soup. Nvidia's silicon has already helped advance the fields of artificial intelligence and computer vision by making it possible to process multiple decisions in parallel. It's possible that the company's new focus on virtual worlds will help advance the field of robotics and teach robots to think like babies.
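A toy illustration of that grasp-force idea (purely hypothetical; not Nvidia's software, and the material classes and newton values are assumptions) could be as simple as a lookup from perceived material to a force cap:

# Hypothetical mapping from a perceived material class to a grasp-force cap in newtons.
GRIP_FORCE_N = {
    "cardboard": 5.0,      # a cracker box: light touch so it does not crush
    "aluminum_can": 20.0,  # a soup can: a firmer grip is safe
}

def grip_force_for(material, default=8.0):
    # Fall back to a moderate default when the perceived material is unrecognised.
    return GRIP_FORCE_N.get(material, default)

print(grip_force_for("cardboard"))     # 5.0
print(grip_force_for("aluminum_can"))  # 20.0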
 
