Friday, May 18, 2012

Entropy will find the way

The most awesome measure in Information Theory is entropy. It is widely used in pattern recognition [Escolano, Suau, Bonev 2009] because it quantifies the expected value of the information contained in a message.
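For a discrete source with symbol probabilities p(x), that expected value is the usual Shannon entropy:

H(X) = -\sum_{x} p(x) \log_2 p(x)

It is maximal when all symbols are equally likely and zero when the outcome is certain, so a high-entropy region is, roughly, one where "a lot is going on" in the data.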
As far as I remember, the first application I found for entropy was helping a robot find its way in a semi-structured environment using only vision [Bonev, Cazorla, Escolano 2007]. Not in terms of high-level knowledge, but simply by heading toward where there seem to be things in the distance. When people take a walk they don't get stuck trying to walk into a building; instead, they see something at the end of a street and start walking that way.
In an image representing 360° of the environment, that "something at the end of a street" is visually perceived as a more entropic region:
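As a rough illustration (not the exact pipeline of the paper), the entropy profile of such a panoramic image can be estimated per angular sector from a plain gray-level histogram; the sector width, bin count and 8-bit range below are arbitrary choices of mine:

```python
import numpy as np

def sector_entropy(panorama, n_sectors=36, n_bins=32):
    """Gray-level Shannon entropy (bits) of each angular sector of a panorama.

    panorama: 2-D array (rows x columns), columns spanning 0..360 degrees,
    assumed to hold 8-bit gray-level values.
    """
    h, w = panorama.shape
    edges = np.linspace(0, w, n_sectors + 1, dtype=int)
    entropies = np.empty(n_sectors)
    for i in range(n_sectors):
        sector = panorama[:, edges[i]:edges[i + 1]]
        hist, _ = np.histogram(sector, bins=n_bins, range=(0, 255))
        p = hist / hist.sum()
        p = p[p > 0]                      # drop empty bins (0 log 0 = 0)
        entropies[i] = -np.sum(p * np.log2(p))
    return entropies
```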
To avoid ambiguities I forced the representation to keep only the two most entropic regions at a time: the two ends of a corridor or a street. A Fourier approximation results in the following map: for each moment in the timeline there are only two hot regions along the angle axis. The robot should head toward one of them: the one which agrees with its current heading.
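A minimal sketch of that step, assuming the entropy-vs-angle signal is smoothed by keeping only its lowest Fourier harmonics and the target is the smoothed peak closest to the current heading (the number of harmonics and the peak test are my assumptions):

```python
import numpy as np

def pick_heading(entropies, current_heading_deg, n_harmonics=2):
    """Smooth the circular entropy profile with a low-order Fourier
    approximation and return the peak angle (degrees) closest to the
    robot's current heading."""
    n = len(entropies)
    spectrum = np.fft.rfft(entropies)
    spectrum[n_harmonics + 1:] = 0          # keep DC + first harmonics only
    smooth = np.fft.irfft(spectrum, n)
    # Local maxima of the smoothed profile (indices wrap around 360 degrees)
    peaks = [i for i in range(n)
             if smooth[i] >= smooth[i - 1] and smooth[i] >= smooth[(i + 1) % n]]
    angles = np.array(peaks) * 360.0 / n
    # Circular angular distance to the current heading
    diff = np.abs((angles - current_heading_deg + 180) % 360 - 180)
    return angles[np.argmin(diff)]
```

With only two retained harmonics the smoothed profile can have at most two maxima per revolution, which matches the "two hot regions" constraint above.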
This simplistic approach had to be aided by another vision-based mechanism to avoid obstacles. I refer to it as visual sonars; still, no range sensors and no GPS are used in the following video, only vision:


[Escolano, Suau, Bonev 2009] F. Escolano, P. Suau, B. Bonev. "Information Theory in Computer Vision and Pattern Recognition". Springer, 2009.
[Bonev, Cazorla, Escolano 2007] B. Bonev, M. A. Cazorla, F. Escolano. "Robot Navigation Behaviors based on Omnidirectional Vision and Information Theory". Journal of Physical Agents, September 2007.

1 comment:

  1. Or maybe "just entropy" won't find the way. From S. Soatto's publication:

    Despite its pervasive reach today, Shannon’s notion of information had early critics, among those James J. Gibson, who wrote "My theory of the available information in ambient light is radically different from [that of] Shannon. [...] My notion is that information consists of invariants underlying change". Already in the fifties he was convinced that data is not information, and the value of data should depend on what one can do with it, i.e. the task.

    http://link.springer.com/chapter/10.1007/978-3-642-28661-2_2
