XOTAR develops advanced robot perception and control technologies, empowering robots with the visual-motor intelligence required to perform valuable autonomous missions that address challenges facing businesses, consumers and governments. A differentiating foundation of XOTAR's systems is a technology approach that enables our robots to acquire their perception and control behaviors through experimentation, observation of human behavior and optimization, rather than explicit human programming. The company engineers embedded software and electronics, electro-opto-mechanical systems, and complete robotic systems, and will bring products to market in 2013.
Today's robots and autonomous systems largely fall into two categories: remotely controlled platforms, and systems with highly task-specific programming. Both approaches significantly limit the scale at which robots can be deployed:
Remotely controlled robots require a human operator trained both in the robot's sensing and control functions and in the tasks the robot is to perform. This is feasible for specialized task domains such as hazardous environments, defense systems and robotic surgery; however, deployments are limited to applications that can afford the full-time human required for sensing and control. Often, several trained operators are needed to remotely supervise a single robot on a single mission. This era of robotics technology does not scale well.
Autonomous systems with task-specific programming are limited by the ability of engineers to predict the environments and situations the robot will encounter during missions. The result is either greatly simplified missions that limit the robot's value, or robots that fail to meet mission objectives or require human intervention, thereby limiting their autonomy.
The key to unlocking economic feasibility is intelligence, and the core of that intelligence is visual perception. With visual intelligence established as a foundation, XOTAR is developing the capability for robots to acquire their perception and control behaviors automatically. In effect, we are engineering the means for robots to acquire and refine their own behavioral intelligence.