Real-time appearance-based Monte Carlo localization
A new vision-processing technique is presented that enables a mobile robot equipped with an omnidirectional camera to perform appearance-based global localization in real time. The technique is applied directly to the omnidirectional camera images and produces low-dimensional, rotation-invariant feature vectors without any training or set-up phase. Using these feature vectors, particle filters can accurately estimate the location of a continuously moving real robot while processing 5000 simultaneous localization hypotheses online. Estimated body positions overlap the actual ones in over 95% of the time steps, and the feature vectors degrade gracefully under increasing levels of simulated noise and occlusion.
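As a rough illustration of the approach summarized above, the sketch below runs one cycle of a simplified particle filter that weights 5000 position hypotheses by the distance between a rotation-invariant image feature vector and pre-recorded reference features. The histogram-based feature extractor, the map arrays, and all parameter values are illustrative assumptions, not the method described in the paper.

```python
# Minimal sketch of appearance-based Monte Carlo localization.
# The feature extractor, map representation, and parameters are assumptions
# made for illustration; they are not the paper's actual implementation.
import numpy as np

N_PARTICLES = 5000  # number of simultaneous localization hypotheses


def rotation_invariant_features(omni_image, n_bins=16):
    """Hypothetical feature extractor: a normalized intensity histogram of the
    omnidirectional image, which is invariant to rotations about the camera axis."""
    hist, _ = np.histogram(omni_image, bins=n_bins, range=(0, 255))
    return hist / max(hist.sum(), 1)


def mcl_step(particles, odometry, image, map_poses, map_features,
             motion_noise=0.05, sigma=0.1):
    """One particle-filter cycle: predict with odometry, weight each particle by
    feature similarity to the reference image nearest to it, then resample.

    particles:    (N, 2) candidate robot positions
    map_poses:    (M, 2) positions where reference images were recorded
    map_features: (M, n_bins) feature vectors of those reference images
    """
    # Predict: move every particle by the odometry estimate plus Gaussian noise.
    particles = particles + odometry + np.random.normal(0, motion_noise, particles.shape)

    # Update: compare the current feature vector with the reference feature
    # stored closest to each particle.
    z = rotation_invariant_features(image)
    nearest = np.argmin(
        np.linalg.norm(map_poses[None, :, :] - particles[:, None, :], axis=2), axis=1)
    dists = np.linalg.norm(map_features[nearest] - z, axis=1)
    weights = np.exp(-0.5 * (dists / sigma) ** 2)
    weights /= weights.sum()

    # Resample particles in proportion to their weights.
    idx = np.random.choice(len(particles), size=len(particles), p=weights)
    return particles[idx]
```

In this simplified form only the 2D position is tracked; because the feature vectors are rotation invariant, the robot's heading does not affect the measurement weighting, which is one reason such features are convenient for appearance-based localization.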