Abstract:
We present a demonstrated, commercially viable self-tracker that uses robust software to fuse data from inertial and vision sensors. Compared with infrastructure-based trackers, self-trackers have the advantage that objects can be tracked over an extremely wide area without the prohibitive cost of an extensive network of sensors or emitters. So far, most AR research has focused on the long-term goal of a purely vision-based tracker that can operate in arbitrary unprepared environments, even outdoors. We instead chose to start with artificial fiducials in order to quickly develop the first self-tracker that is small enough to wear on a belt, low in cost, easy to install and self-calibrate, and low enough in latency to achieve AR registration. We also present a roadmap for how we plan to migrate from artificial fiducials to natural ones. By designing to the requirements of AR, our system can easily handle the less challenging applications of wearable VR systems and robot navigation.
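The core idea of inertial-vision fusion mentioned above can be illustrated with a minimal one-dimensional complementary-filter sketch. This is an illustrative assumption only, not the paper's actual algorithm: the function name `fuse`, the gain `alpha`, the sample period `dt`, and the signals are all hypothetical, and a production tracker would use a full state estimator (e.g., a Kalman filter) in three dimensions.

```python
def fuse(gyro_rates, vision_angles, dt=0.01, alpha=0.98):
    """Blend high-rate (but drifting) gyro data with low-rate,
    drift-free vision fixes using a simple complementary filter.

    Illustrative sketch only; parameters and structure are assumptions.
    """
    angle = vision_angles[0]  # initialize from the first vision measurement
    fused = []
    for rate, vis in zip(gyro_rates, vision_angles):
        predicted = angle + rate * dt                  # inertial prediction (drifts over time)
        angle = alpha * predicted + (1 - alpha) * vis  # vision measurement pulls drift back
        fused.append(angle)
    return fused
```

The inertial term supplies low-latency updates between vision measurements, while the vision term bounds long-term drift; the gain `alpha` trades responsiveness against drift suppression.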
Keywords:
augmented reality; image sensors; optical tracking; sensor fusion; wearable computers; AR registration; VIS-Tracker; artificial fiducials; self-tracking; inertial sensors; vision-based tracking; wearable VR systems; robot navigation; wearable vision-inertial self-tracker