In past manned lunar landing missions such as Apollo 14, spatial disorientation substantially reduced astronaut productivity and jeopardized both safety and mission success. The GPS-denied lunar environment has a reduced-gravity field and lacks both the spatial recognition cues and the familiar reference objects that the human biological sensors involved in spatial recognition (e.g., the eyes) rely on. Such an environment causes misperception of the locations of astronauts and targets and of their spatial relations, as well as misperception of astronauts' heading directions and travel distances. These spatial disorientation effects can reduce productivity and endanger lives in manned lunar missions. A navigation system capable of locating astronauts and tracking their movements on the lunar surface is therefore critical for future manned lunar missions, in which multiple astronauts, assisted by roving vehicles, will traverse more than 100 km from the lander or base station and will need real-time navigation support for effective collaboration.
Our earlier research addressed these problems by developing a precise, flexible, and reliable Lunar Astronaut Spatial Orientation and Information System (LASOIS) capable of delivering real-time navigation information to astronauts on the lunar surface. The LASOIS hardware was a sensor network composed of orbital, ground, and on-suit sensors: the Lunar Reconnaissance Orbiter Camera (LROC), radio beacons, on-suit cameras, and a shoe-mounted Inertial Measurement Unit (IMU). The LASOIS software included efficient and robust algorithms for estimating the trajectory from IMU signals, deriving heading information from imagery acquired by the on-suit cameras, and an Extended Kalman Filter (EKF) based approach for integrating these spatial information components into an astronaut trajectory with meter-level accuracy. Moreover, LASOIS emphasized multi-modal sensors to improve the flexibility and robustness of the system.
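The EKF-based integration of step measurements and camera-derived heading can be illustrated with a minimal 2-D dead-reckoning sketch. The three-element state [x, y, heading], the motion model, and all noise values below are hypothetical illustrations, not the actual LASOIS formulation.

```python
import numpy as np

# Hypothetical 2-D pedestrian dead-reckoning EKF. State: [x, y, heading].
# Step lengths come from the shoe-mounted IMU; heading corrections come
# from the on-suit camera. All models and noise values are illustrative.

def predict(x, P, step_len, d_heading, Q):
    """Propagate the state with one IMU-derived step and heading increment."""
    theta = x[2] + d_heading
    x_new = np.array([x[0] + step_len * np.cos(theta),
                      x[1] + step_len * np.sin(theta),
                      theta])
    # Jacobian of the motion model with respect to the state
    F = np.array([[1.0, 0.0, -step_len * np.sin(theta)],
                  [0.0, 1.0,  step_len * np.cos(theta)],
                  [0.0, 0.0,  1.0]])
    return x_new, F @ P @ F.T + Q

def update_heading(x, P, z_heading, R):
    """Correct the state with a camera-derived heading measurement."""
    H = np.array([[0.0, 0.0, 1.0]])      # we observe the heading only
    y = np.array([z_heading - x[2]])     # innovation
    S = H @ P @ H.T + R
    K = P @ H.T @ np.linalg.inv(S)       # Kalman gain
    x_new = x + (K @ y).ravel()
    P_new = (np.eye(3) - K @ H) @ P
    return x_new, P_new

x = np.zeros(3)
P = np.eye(3) * 0.1
Q = np.eye(3) * 1e-3
R = np.array([[1e-2]])

# Ten ~0.7 m steps; the gyro reports no turning while the camera
# repeatedly measures a heading of 0.1 rad, so the filter pulls the
# heading estimate toward the camera measurement.
for _ in range(10):
    x, P = predict(x, P, step_len=0.7, d_heading=0.0, Q=Q)
    x, P = update_heading(x, P, z_heading=0.1, R=R)
```

After the loop, the position has advanced roughly seven meters along the walking direction and the heading estimate has converged close to the camera-measured 0.1 rad.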
The experimental results from three LASOIS field tests showed that most of the errors in the image processing algorithm were caused by incorrect feature tracking. This dissertation addresses the problem of tracking features in image sequences acquired from cameras. Despite many alternatives, the iterative least-squares solution of the optical flow equation has remained the most popular approach in the field. This dissertation builds on these earlier efforts by introducing a view-geometric constraint into the tracking problem, which enables collaboration among features. In contrast to alternative geometry-based methods, the proposed approach provides an online solution to optical flow estimation in a collaborative fashion, exploiting Horn-Schunck flow estimation regularized by view-geometric constraints. The proposed collaborative tracker estimates the motion of a feature based on the geometry of the scene and on how the other features are moving. As an alternative to this approach, a new closed-form solution to tracking that combines image appearance with view geometry is also introduced. In particular, we use invariants in projective coordinates and conjecture that the traditional appearance-based solution can be significantly improved using view geometry. The geometric constraint is introduced by defining a new optical flow equation that exploits the scene geometry estimated from a set of tracked features. At the end of each tracking loop, the quality of the tracked features is judged using both appearance similarity and geometric consistency. Our experiments demonstrate robust tracking performance even when features are occluded or undergo appearance changes due to projective deformation of the template.
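As a reference point for these extensions, the baseline iterative least-squares solution of the optical flow equation (the Lucas-Kanade scheme underlying KLT-style trackers) can be sketched for a single feature window. The window size, the synthetic Gaussian-blob images, and the stopping tolerance below are illustrative choices, not the dissertation's settings.

```python
import numpy as np

# Iterative least-squares (Lucas-Kanade style) solution of the optical flow
# equation for one feature window. Illustrative sketch: window size, test
# images, and stopping rule are arbitrary choices for the demonstration.

def sample_window(img, cx, cy, h):
    """Bilinearly sample a (2h+1)x(2h+1) window centered at subpixel (cx, cy)."""
    ys = np.arange(-h, h + 1) + cy
    xs = np.arange(-h, h + 1) + cx
    y0, x0 = np.floor(ys).astype(int), np.floor(xs).astype(int)
    fy, fx = (ys - y0)[:, None], (xs - x0)[None, :]
    g = lambda r, c: img[r][:, c]           # gather rows r, columns c
    return ((1 - fy) * (1 - fx) * g(y0, x0) + (1 - fy) * fx * g(y0, x0 + 1) +
            fy * (1 - fx) * g(y0 + 1, x0) + fy * fx * g(y0 + 1, x0 + 1))

def track_feature(I, J, x, y, h=3, iters=20):
    """Estimate the displacement of the feature at (x, y) from frame I to J."""
    Iy, Ix = np.gradient(I)                 # spatial gradients of frame I
    T = sample_window(I, x, y, h)           # fixed template window
    A = np.stack([sample_window(Ix, x, y, h).ravel(),
                  sample_window(Iy, x, y, h).ravel()], axis=1)
    AtA = A.T @ A                           # 2x2 normal-equation matrix
    d = np.zeros(2)                         # running displacement (dx, dy)
    for _ in range(iters):
        W = sample_window(J, x + d[0], y + d[1], h)
        # Solve the linearized optical flow equation in least squares
        step = np.linalg.solve(AtA, A.T @ (T - W).ravel())
        d += step
        if np.linalg.norm(step) < 1e-4:     # converged
            break
    return d

# Synthetic check: a smooth blob translated by exactly (3, 2) pixels
yy, xx = np.mgrid[0:64, 0:64].astype(float)
I = np.exp(-((xx - 30.0) ** 2 + (yy - 30.0) ** 2) / 20.0)
J = np.exp(-((xx - 33.0) ** 2 + (yy - 32.0) ** 2) / 20.0)
d = track_feature(I, J, 30, 30)             # recovers approximately (3, 2)
```

Note that each feature is solved independently here; it is exactly this independence that the collaborative tracker removes by coupling the per-feature solutions through the view geometry of the scene.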
The proposed collaborative tracking method was also substituted for the original KLT tracker in the visual navigation algorithm of the LASOIS system and evaluated on the experimental data from Moses Lake. The experimental analysis shows that the collaborative tracking approach significantly improved the accuracy of the navigation solution.