Bootstrapping bilinear models of simple Vehicles 2013-10-02

Andrea Censi and Richard M. Murray. Bootstrapping bilinear models of Simple Vehicles. International Journal of Robotics Research, 34:1087–1113, July 2015. pdf · doi · video · supp. material · slides · bibtex

Abstract: Learning and adaptivity will play a large role in robotics in the future, as robots transition from structured to unstructured environments that cannot be fully predicted or understood by the designer. Two open questions are how much it is possible to learn, and how much we should learn. The goal of bootstrapping is to create agents that are able to learn "everything" from scratch, including a torque-to-pixels model of their robotic body. Systems with such capabilities will be advantaged in terms of being resilient to unforeseen changes and deviations from prior assumptions. The robotics domain is a challenging one for learning, due to the presence of high-dimensional signals, various nonlinearities, and hidden states. There are no formal results, in the spirit of control theory, that could give guarantees. This paper considers the bootstrapping problem for a subset of the set of all robots. The Vehicles, inspired by Braitenberg's work, are idealizations of mobile robots equipped with a set of "canonical" exteroceptive sensors (camera; range-finder; field-sampler). Their sensel-level dynamics are derived and shown to be surprisingly close. We define the class of BDS models, which assume an instantaneous bilinear dynamics between observations and commands, and derive streaming-based bilinear strategies for them. We show in what sense the BDS dynamics approximates the set of Vehicles to guarantee success in the task of generalized servoing: driving the observations to a given goal snapshot. Simulations and experiments substantiate the theoretical results. This is the first instance of a bootstrapping agent that can learn the dynamics of a relatively large universe of systems, and use the models to solve well-defined tasks, with no parameter tuning or hand-designed features.
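The abstract's core idea — a bilinear (BDS) model linking observation derivatives to commands, learned from streaming correlations during motor babbling — can be sketched as follows. This is a minimal illustration, not the paper's implementation: the dynamics ẏ = Σ_s u_s (M_s y), the white-noise babbling distribution, and all dimensions here are assumptions chosen for the demo.

```python
import numpy as np

# Minimal sketch of BDS-style streaming tensor learning (assumed formulation):
# observations y evolve as ydot = sum_s u_s * (M_s @ y); the agent estimates
# the tensors M_s from streaming correlations of (ydot, y, u) gathered during
# motor babbling with white commands.

rng = np.random.default_rng(0)
n, k, N = 3, 2, 50000                  # observation dim, command dim, samples
M_true = rng.normal(size=(k, n, n))    # hypothetical ground-truth bilinear tensors

T = np.zeros((k, n, n))                # streaming correlation E[u_s * ydot_i * y_j]
P = np.zeros((n, n))                   # observation covariance E[y_i * y_j]

for _ in range(N):
    y = rng.normal(size=n)             # babbling: white observations and commands
    u = rng.normal(size=k)
    ydot = np.einsum('s,sij,j->i', u, M_true, y)   # bilinear dynamics
    T += u[:, None, None] * np.outer(ydot, y)
    P += np.outer(y, y)

# For unit-variance white u, E[u_s * ydot * y^T] = M_s E[y y^T], so the model
# is recovered by right-multiplying with the inverse observation covariance:
M_hat = (T / N) @ np.linalg.inv(P / N)
print(np.max(np.abs(M_hat - M_true)))  # small estimation error
```

With enough samples the streaming estimate converges to the true tensors, which is the sense in which such an agent "learns the dynamics" with no hand-designed features: only correlations between raw signals are accumulated.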

Additional materials

Video with experimental data

Video snippets

These are some of the "video snippets" used in the main video, available for viewing and separate download.

Exploration with motor babbling

Download mp4

Tensor learning animations

One range-finder (YHL)

Download mp4

Two range-finders (YHLR)

Download mp4

1D Camera (YVS)

Download mp4

Field sampler (YFS)

Download mp4