
Self-localization in ubiquitous computing using sensor fusion /

by Zampieron, Jeffrey Michael

Abstract (Summary)
"The widespread availability of small and inexpensive mobile computing devices and the desire to connect them at any time in any place has driven the need to develop an accurate means of self-localization. Devices that typically operate outdoors use GPS for localization. However, most mobile computing devices operate not only outdoors but indoors where GPS is typically unavailable. Therefore, other localization techniques must be used. Currently, there are several commercially available indoor localization systems. However, most of these systems rely on specialized hardware which must be installed in the mobile device as well as the building of operation. The deployment of this additional infrastructure may be unfeasible or costly. This work addresses the problem of indoor self-localization of mobile devices without the use of specialized infrastructure. We aim to leverage existing assets rather than deploy new infrastructure. The problem of self-localization utilizing single and dual sensor systems has been well studied. Typically, dual sensor systems are used when the limitations of a single sensor prevent it from functioning with the required level of performance and accuracy. A second sensor is often used to complement and improve the measurements of the first one. Sometimes it is better to use more than two sensors. In this work the use of three sensors with complementary characteristics was explored. The three sensor system that was developed included a positional sensor, an inertial sensor and a visual sensor. Positional information was obtained via radio localization. Acceleration information was obtained via an accelerometer and visual object identification was performed with a video camera. This system was selected as representative of typical ubiquitous computing devices that will be capable of developing an awareness of their environment in order to provide users with contextually relevant information. 
As a part of this research a prototype system consisting of a video camera, an accelerometer and an 802.11g receiver was built. The specific sensors were chosen for their low cost and ubiquitous nature, and for their ability to complement each other in a self-localization task using existing infrastructure. A discrete Kalman filter was designed to fuse the sensor information in an effort to obtain the best possible estimate of the system position. Experimental results showed that the system could, when provided with a reasonable initial position estimate, determine its position with an average error of 8.26 meters"--Abstract.
Bibliographical Information:

Source Type: Master's Thesis

Keywords: ubiquitous computing, mobile, multisensor data fusion

© 2009 OpenThesis.org. All Rights Reserved.