Drawing on insights from canine companions, researchers have enabled robots to interpret both language and gesture when fetching the right objects.
The approach, inspired by how dogs read their owners' cues, uses a POMDP (partially observable Markov decision process), an AI framework that lets robots combine human gestures and language to find the requested object with 89% accuracy.
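At the core of a POMDP-based fetcher is a belief, a probability distribution over which object the person wants, that is updated as each new cue arrives. A minimal sketch of that idea, assuming illustrative object names and made-up likelihood values rather than the authors' actual model:

```python
# Hypothetical sketch of POMDP-style belief updating over candidate
# objects, fusing a noisy language cue and a noisy pointing gesture.
# Object names and likelihood numbers are illustrative assumptions.

def update_belief(belief, likelihoods):
    """Bayes' rule: posterior is proportional to prior times likelihood."""
    posterior = {obj: belief[obj] * likelihoods.get(obj, 1e-9) for obj in belief}
    total = sum(posterior.values())
    return {obj: p / total for obj, p in posterior.items()}

# Uniform prior over three candidate objects on the table.
belief = {"mug": 1 / 3, "bowl": 1 / 3, "spoon": 1 / 3}

# Language cue: speech recognizer heard something like "the mug".
language_lik = {"mug": 0.7, "bowl": 0.2, "spoon": 0.1}
belief = update_belief(belief, language_lik)

# Gesture cue: the pointing ray passes closest to the mug.
gesture_lik = {"mug": 0.6, "bowl": 0.3, "spoon": 0.1}
belief = update_belief(belief, gesture_lik)

# Act only when confident; otherwise a POMDP policy could instead
# choose an information-gathering action, such as asking a question.
best = max(belief, key=belief.get)
if belief[best] > 0.8:
    print(f"fetch {best}")  # prints "fetch mug"
```

Combining the two weak cues pushes the belief in the mug above the confidence threshold, which is the intuition behind fusing language and gesture rather than relying on either alone.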