The Playbot Project


This project was conceived in early 1991, when I was watching Saturday morning television while taking care of my toddler son. I was flipping channels hoping to find a cartoon we could watch together when I happened on a TVOntario program about a researcher in Vancouver who was developing a robot for a disabled child. I did not recognize the name, so I left it on for a bit (it was a system developed by Gary Birch of the Neil Squire Foundation). Watching the disabled little boy juxtaposed with my healthy son caused a strong emotional response. This is my area, and we should be able to do better, I thought. And Playbot was born!



From York U Magazine, Summer 2007, p. 7.


The project began in earnest around 1992, when the first funding for it was obtained from the Network of Centres of Excellence IRIS (Institute for Robotics and Intelligent Systems). Our initial focus was not on navigation but on how to ease the task of instructing a robot and having that robot carry out a search-and-grasp task. This differs from most other projects, where the focus is on navigation. Add in my lab's focus on natural vision, and the result is a purely vision-based robot.

The first major paper on our efforts was:


Tsotsos, J.K., Verghese, G., Dickinson, S., Jenkin, M., Jepson, A., Milios, E., Nuflo, F., Stevenson, S., Black, M., Metaxas, D., Culhane, S., Ye, Y., Mann, R., PLAYBOT: A visually-guided robot to assist physically disabled children in play, Image & Vision Computing Journal 16, Special Issue on Vision for the Disabled, pp. 275-292, April 1998.



More recently, funding from the Canada Foundation for Innovation allowed us to build a serious prototype, pictured below.

The hardware setup includes:

- Chairman Entra wheelchair base (Permobil Inc.)

- cameras: Bumblebee and Flea (Point Grey)

- 6+2 d.o.f. robotic manipulator: MANUS (Exact Dynamics)

- a suite of on-board (Toshiba touchpad, Macintosh laptops) and off-board computers

- motion controller (RoboteQ Inc.)

- custom control electronics using a Motorola HCS12 microcontroller


The current visual behaviors Playbot can execute include:

- find, approach, open, and pass through a door

- visual SLAM

- visual user monitor

- obstacle detection

- visual free space / anomaly detection

- visual search for an object within a room


The following papers give a good summary and details of these behaviors:


Rotenstein, A., Andreopoulos, A., Fazl, E., Jacob, D., Robinson, M., Shubina, K., Zhu, Y., Tsotsos, J.K., Towards the dream of intelligent, visually-guided wheelchairs, Proc. 2nd International Conference on Technology and Aging, Toronto, Canada, June 2007.


Andreopoulos, A., Tsotsos, J.K., A Framework for Door Localization and Door Opening Using a Robotic Wheelchair for People Living with Mobility Impairments, RSS 2007 Manipulation Workshop: Sensing and Adapting to the Real World, Atlanta, June 30, 2007.