Dept. of Electrical Engineering & Computer Science
Coby has just completed his second year at Queen’s University, where he is majoring in Applied Mathematics and Computer Engineering, and is spending the summer conducting research at York University. He will be working in the Vision, Graphics, and Robotics Lab under the supervision of Professor Michael Jenkin. Specifically, Coby will use existing robots in the lab to implement a Simultaneous Localization and Mapping (SLAM) algorithm in conjunction with a known directional landmark used for localization. He will develop an appropriate probabilistic model for incorporating “almost certain” localization information within the SLAM framework, and will test this model in simulation. This work is important because, under non-ideal conditions, SLAM systems are prone to failure; with a single known landmark, the localization problem can be resolved more reliably than with existing SLAM and localization solutions alone.
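The “almost certain” localization information described above can be thought of as a measurement with very low variance fed into the filter’s update step. As a minimal illustration (not the project’s actual model), a one-dimensional Kalman measurement update shows how a near-certain landmark observation collapses a drifted estimate; the variance values here are hypothetical placeholders.

```python
# Minimal 1-D sketch of fusing an "almost certain" landmark observation
# into a Kalman-style state estimate. All numeric values are illustrative.

def kalman_update(mean, var, z, var_z):
    """Standard scalar Kalman measurement update."""
    k = var / (var + var_z)           # Kalman gain
    new_mean = mean + k * (z - mean)  # corrected estimate
    new_var = (1.0 - k) * var         # reduced uncertainty
    return new_mean, new_var

# Robot's drifted belief about its position: x ~ N(5.0, 4.0)
mean, var = 5.0, 4.0

# Observing the unique landmark gives a near-certain fix at x = 3.2.
# A tiny (but non-zero) variance keeps the filter numerically well behaved.
mean, var = kalman_update(mean, var, z=3.2, var_z=1e-6)
print(mean, var)  # estimate collapses to ~3.2 with near-zero variance
```

Because the landmark measurement variance is several orders of magnitude below the prior variance, the gain is effectively 1 and the posterior snaps to the observed position, which is the intuition behind treating the landmark as “almost certain” rather than perfectly certain.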
Localization and SLAM with a Unique Landmark
The ability of an autonomous vehicle to navigate an unknown space using various sensors has many useful applications. A key issue for such navigation is the development of spatial representations (maps) of the space within which the robot will operate. Current Simultaneous Localization and Mapping (SLAM) solutions exist for sufficiently well-conditioned environments, robots, and sensor systems. As environments become more complex, existing solutions can fail, which is unacceptable for industrial deployment. The likelihood of failure can be reduced if there exists a unique oriented landmark in the space being traversed and mapped. My project utilizes an existing algorithm, the Multi-Camera Parallel Tracking and Mapping (MCPTAM) algorithm, augmented with the presence of a unique oriented landmark co-localized with the robot at its initialization. My solution uses a differential drive robot with two wide-angle cameras mounted on it, allowing 360-degree video capture of the environment around the robot. The Robot Operating System (ROS) is the main mechanism that coordinates mapping, localization, and control of the robot. When the uncertainty in the position of the robot exceeds a maximal threshold, the robot moves to a position where it can view the unique landmark and collapses the algorithm’s position estimate to the correct location. This technology has commercial applications in areas that require the use of autonomous vehicles, including welcoming and guiding people in commercial stores and buildings.