Daisy Chaining
Introduction
The Euclidean position and orientation (i.e., pose) of an unmanned ground vehicle (UGV) is typically required for autonomous navigation and control. A collaborative visual servo controller is developed with the objective of regulating a UGV to a desired pose using feedback from a moving airborne monocular camera system. In contrast to typical camera configurations used in visual servo control problems, the controller in this paper is developed for a moving camera (mounted on board the airborne vehicle) viewing a moving target. Multi-view photogrammetric methods are used to develop relationships between the different camera frames and UGV coordinate systems, and Lyapunov-based methods are used to prove asymptotic regulation.

The results are further extended to a cooperative visual servo tracking controller, whose objective is to enable a UGV to follow a desired trajectory encoded as a sequence of images (i.e., a prerecorded video) using image feedback from the moving airborne monocular camera. An adaptive Lyapunov-based control strategy is employed to actively compensate for the lack of depth measurements and the lack of an object model. The range of operation of the UGV can be extended using a multi-reference daisy chaining controller, provided that reference targets are reseeded along the route. By fusing geometric reconstruction techniques with daisy chaining, simultaneous localization and mapping (SLAM) of the UGV can be achieved, with applications to path planning, real-time trajectory generation, obstacle avoidance, multi-vehicle coordination control, and task assignment.
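The core geometric idea behind daisy chaining is that the moving camera frame can be eliminated by composing transformations: if the airborne camera simultaneously observes a static reference target and the UGV, the UGV pose relative to the reference is obtained by chaining the two camera-relative poses. The sketch below illustrates this with hypothetical homogeneous transforms (the frame names C, F, B and the numeric poses are illustrative assumptions, not values from the paper; in practice the camera-relative poses would come from photogrammetric reconstruction, e.g., homography decomposition).

```python
import numpy as np

def rot_z(theta):
    """Planar rotation about the z-axis (the UGV operates on the ground plane)."""
    c, s = np.cos(theta), np.sin(theta)
    return np.array([[c, -s, 0.0], [s, c, 0.0], [0.0, 0.0, 1.0]])

def se3(R, t):
    """Build a 4x4 homogeneous transform from a rotation R and translation t."""
    T = np.eye(4)
    T[:3, :3] = R
    T[:3, 3] = t
    return T

# Hypothetical poses measured by the moving airborne camera (frame C), which
# sees both a static reference target (frame F) and the UGV body (frame B).
T_C_F = se3(rot_z(0.3), np.array([1.0, 2.0, -5.0]))   # reference target in camera frame
T_C_B = se3(rot_z(1.1), np.array([-0.5, 0.8, -5.0]))  # UGV in camera frame

# Daisy chaining: eliminate the moving camera frame to express the UGV pose
# relative to the static reference, T_F_B = inv(T_C_F) @ T_C_B. The result is
# independent of where the airborne camera happens to be.
T_F_B = np.linalg.inv(T_C_F) @ T_C_B
```

Because the camera frame cancels out of the composition, the chained pose `T_F_B` stays valid as the airborne camera moves, which is what allows references to be handed off (daisy chained) as the UGV travels.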
A simulation of the daisy chaining controller is presented below.
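As a minimal stand-in for the regulation objective, the sketch below simulates a kinematic unicycle-model UGV driven to a desired pose at the origin. A standard polar-coordinate kinematic controller is used here purely for illustration; it is not the paper's adaptive image-feedback law, and the gains and initial pose are assumed values chosen to satisfy the usual stability conditions (k_rho > 0, k_beta < 0, k_alpha > k_rho).

```python
import numpy as np

def regulate_pose(x, y, theta, k_rho=1.0, k_alpha=3.0, k_beta=-0.5,
                  dt=0.01, steps=2000):
    """Drive a unicycle-model UGV from pose (x, y, theta) toward the origin.

    Illustrative polar-coordinate controller (not the paper's visual servo
    law): v acts on the distance error, w on the heading/orientation errors.
    """
    for _ in range(steps):
        rho = np.hypot(x, y)                      # distance to the goal
        if rho < 1e-4:
            break
        alpha = np.arctan2(-y, -x) - theta        # heading error toward goal
        alpha = np.arctan2(np.sin(alpha), np.cos(alpha))   # wrap to [-pi, pi]
        beta = -theta - alpha                     # final-orientation error
        beta = np.arctan2(np.sin(beta), np.cos(beta))
        v = k_rho * rho                           # linear velocity command
        w = k_alpha * alpha + k_beta * beta       # angular velocity command
        # Euler integration of the unicycle kinematics.
        x += v * np.cos(theta) * dt
        y += v * np.sin(theta) * dt
        theta += w * dt
    return x, y, theta

# Assumed initial pose: 2 m behind and 1 m to the side of the goal, facing it.
xf, yf, thf = regulate_pose(-2.0, -1.0, 0.0)
```

In the full daisy chaining scheme, the pose errors fed to such a controller would come from the chained image-based reconstruction rather than direct state measurement, with the adaptive terms compensating for the unknown depth.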