Active Visual Search

Object search is the task of efficiently locating a given 3D object in a given 3D environment by an agent equipped with a camera for target detection and, when the environment configuration is not known in advance, with a means of measuring depth, such as stereo vision or a laser range finder.


Sensor planning for object search refers to the task of selecting the sensing parameters so as to bring the target into the field of view of the camera and to make the image of the target easily detectable by the available recognition algorithms.  In the doctoral thesis of Yiming Ye, sensor planning for object search is formulated as an optimization problem.  This problem is proved to be NP-Complete, so an approximate solution employing a one-step look-ahead strategy is proposed; under certain conditions this approximation is equivalent to the optimal solution.  The search region is characterized by a probability distribution of the target's presence, and the goal is to find the desired object reliably with minimum effort.  The control of the sensing parameters depends on the current state of the search region and on the detecting ability of the recognition algorithm.  The huge space of possible sensing actions is decomposed into a finite set of actions that must be considered.  To represent the environment surrounding the camera and to determine the sensing parameters efficiently over time, a concept called the sensed sphere is proposed, and its construction using a laser range finder is derived.
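The one-step look-ahead strategy can be sketched in a few lines. The sketch below is an illustrative reconstruction, not the thesis code: the search region is assumed to be discretized into cells, each candidate action carries a per-cell detection probability, and each action has a cost; the cell names, data structures, and the renormalization over region cells (the thesis also tracks probability mass outside the region) are simplifying assumptions.

```python
# Illustrative sketch of greedy one-step look-ahead action selection for
# object search (an assumption-laden reconstruction, not the thesis code).

def select_action(p, actions, cost):
    """Pick the action maximizing expected detection benefit per unit cost.

    p[c]          : current probability that the target is in cell c
    actions[a][c] : detection probability for cell c under action a -- the
                    chance the recognizer detects the target with action a
                    if it is in c (cells outside the field of view omitted)
    cost[a]       : time/effort cost of executing action a
    """
    def expected_detection(a):
        return sum(p[c] * b for c, b in actions[a].items())
    return max(actions, key=lambda a: expected_detection(a) / cost[a])

def update_after_failure(p, action):
    """Bayes update of the target distribution after an action detects nothing.

    Cells the action observed well lose probability mass; for simplicity we
    renormalize over the region's cells only.
    """
    posterior = {c: p[c] * (1.0 - action.get(c, 0.0)) for c in p}
    norm = sum(posterior.values())
    return {c: q / norm for c, q in posterior.items()}

# Toy usage: three cells, two candidate sensing actions of equal cost.
p = {"c1": 0.5, "c2": 0.3, "c3": 0.2}
actions = {"a1": {"c1": 0.9}, "a2": {"c2": 0.8, "c3": 0.8}}
cost = {"a1": 1.0, "a2": 1.0}
a = select_action(p, actions, cost)   # "a1": 0.45 expected vs 0.40 for "a2"
p = update_after_failure(p, actions[a])  # mass shifts away from "c1"
```

Each iteration greedily picks the single best next view rather than planning a full sequence, which is what makes the NP-Complete problem tractable in practice.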

This strategy was tested on a mobile robot, shown here: a Cybermotion Navmaster platform equipped with Laser-Eye, sonar, video, and infrared sensors.

The recognition algorithm used was quite primitive; the point was to test the planning method, which worked quite well.



A subsequent implementation was undertaken, this time on a different platform and with far better recognition algorithms. Ksenia Shubina used a Pioneer 3 robot outfitted with a Point Grey Research Bumblebee stereo camera on a Directed Perception pan-tilt unit, together with the Triclops StereoVision SDK. The papers cited below provide further detail.








Her system was ported to our autonomous wheelchair, Playbot, where it underwent a successful series of tests. In all, the three separate implementations testify to the robustness of the overall strategy.

     






An example, using the Pioneer platform, appears in the movie below.

A simple target is sought within a room of our lab. The robot knows nothing about the room or its contents other than its exterior walls and size. The robot starts in the middle of the room, facing away from the target. The sequence of images shows the search process. To interpret these images, it is recommended that you read the paper: Shubina, K., Tsotsos, J.K., Visual Search for an Object in a 3D Environment Using a Mobile Robot, Departmental Technical Report CSE-2008-02, April 28, 2008.


        



References


  1. Ye, Y., Tsotsos, J.K., A Complexity Level Analysis of the Sensor Planning Task for Object Search, Computational Intelligence, Vol. 17, No. 4, pp. 605-620, Nov. 2001.

  2. Ye, Y., Tsotsos, J.K., Sensor Planning for Object Search, Computer Vision and Image Understanding, Vol. 73, No. 2, pp. 145-168, 1999.

  3. Ye, Y., Tsotsos, J.K., Knowledge Difference and its Influence on a Search Agent, 1st International Conference on Autonomous Agents, Marina del Rey, CA, February 1997.

  4. Ye, Y., Tsotsos, J.K., Sensor Planning in 3D Object Search, Int. Symposium on Intelligent Robotic Systems, Lisbon, July 1996.

  5. Ye, Y., Tsotsos, J.K., 3D Sensor Planning: Its Formulation and Complexity, International Symposium on Artificial Intelligence and Mathematics, January 1996.

  6. Ye, Y., Tsotsos, J.K., Where to Look Next in 3D Object Search, IEEE International Symposium on Computer Vision, Coral Gables, Florida, November 1995, pp. 539-544.

  7. Ye, Y., Tsotsos, J.K., The Detection Function in Object Search, The Fourth International Conference for Young Computer Scientists (ICYCS'95), July 19-21, 1995, Beijing.

  8. Shubina, K., Tsotsos, J.K., Visual Search for an Object in a 3D Environment Using a Mobile Robot, Departmental Technical Report CSE-2008-02, April 28, 2008.

  9. Tsotsos, J.K., Shubina, K., Attention and Visual Search: Active Robotic Vision Systems that Search, The 5th International Conference on Computer Vision Systems, Bielefeld, March 21-24, 2007.