
Vision-based navigation and guidance of a sensorless missile. (English) Zbl 1395.93447

Summary: The objective of this paper is to develop a vision-based terminal guidance system for sensorless missiles. Specifically, monocular vision-based relative navigation and robust control methods are developed for a sensorless missile to intercept a ground target maneuvering with unknown time-varying velocity. A mobile wireless sensor and actor network is considered, wherein a moving airborne monocular camera (e.g., attached to an aircraft) provides image measurements of the missile (actor) while another moving monocular camera (e.g., attached to a small UAV) tracks a ground target. The challenge is to express the unknown time-varying target position in the time-varying missile frame using image feedback from cameras moving with unknown trajectories. In a novel relative navigation approach, assuming knowledge of a single geometric length on the missile, the time-varying target position is obtained by fusing the daisy-chained image measurements of the missile and the target into a homography-based Euclidean reconstruction method. The three-dimensional interception problem is posed in the pursuit guidance, proportional navigation, and proposed hybrid guidance frameworks. Interestingly, it is shown that, by appropriately defining the error system, a single control structure can be maintained across all of the above guidance methods. The control problem is formulated in terms of the target dynamics in a 'virtual' camera mounted on the missile, which enables the design of an adaptive nonlinear visual servo controller that compensates for the unknown time-varying missile-target relative velocity. Stability and zero-miss-distance analyses of the proposed controller are presented, and a high-fidelity numerical simulation verifies the performance of the guidance laws.
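Two classical ingredients mentioned in the summary can be sketched concretely: recovering metric scale from a single known geometric length, and the proportional navigation law against which the proposed guidance is posed. The sketch below is illustrative only and is not the paper's method; the function names are ours, and the scale-recovery routine makes a simplifying equal-depth assumption that the paper's full homography-based reconstruction does not require.

```python
def depth_from_known_length(p1, p2, L):
    """Metric depth from one known length L between two tracked features.

    Simplifying assumption (ours, not the paper's): both features lie at
    approximately the same depth z, so for normalized image coordinates
    p = (X/Z, Y/Z) the projected separation satisfies |p1 - p2| = L / z.
    """
    d = ((p1[0] - p2[0]) ** 2 + (p1[1] - p2[1]) ** 2) ** 0.5
    return L / d


def _cross(a, b):
    # 3D cross product on plain tuples.
    return (a[1] * b[2] - a[2] * b[1],
            a[2] * b[0] - a[0] * b[2],
            a[0] * b[1] - a[1] * b[0])


def _dot(a, b):
    return sum(x * y for x, y in zip(a, b))


def pn_accel(r, v, N=3.0):
    """Classical true proportional navigation in vector form.

    r = target position minus missile position, v = relative velocity,
    N = navigation constant. With line-of-sight rate
    omega = (r x v) / |r|^2 and closing speed Vc = -(r . v) / |r|,
    the commanded acceleration a = N * Vc * (omega x r_hat) is
    perpendicular to the line of sight.
    """
    r2 = _dot(r, r)
    rn = r2 ** 0.5
    omega = tuple(c / r2 for c in _cross(r, v))
    r_hat = tuple(c / rn for c in r)
    Vc = -_dot(r, v) / rn
    return tuple(N * Vc * c for c in _cross(omega, r_hat))
```

On a collision course (relative velocity aligned with the line of sight) the LOS rate vanishes and `pn_accel` commands zero acceleration; any lateral drift of the target produces a correcting acceleration perpendicular to the line of sight, which is the behavior the paper's hybrid framework builds on.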

MSC:

93D05 Lyapunov and other classical stabilities (Lagrange, Poisson, \(L^p, l^p\), etc.) in control theory
93C15 Control/observation systems governed by ordinary differential equations
93C85 Automated systems (robots, etc.) in control theory
93B52 Feedback control
68U10 Computing methodologies for image processing
93B51 Design techniques (robust design, computer-aided design, etc.)

References:

[1] Carnahan, B.; Luther, H.; Wilkes, J., Applied numerical methods, (1969), John Wiley & Sons Inc. New York · Zbl 0195.44701
[2] Chen, H.; Chang, K.; Agate, C. S., UAV path planning with tangent-plus-Lyapunov vector field guidance and obstacle avoidance, IEEE Trans. Aerosp. Electron. Syst., 49, 840-856, (2013)
[3] Corke, P.; Hutchinson, S., A new partitioned approach to image-based visual servo control, IEEE Trans. Robotics Autom., 17, 507-515, (2001)
[4] Corless, M.; Leitmann, G., Continuous state feedback guaranteeing uniform ultimate boundedness for uncertain dynamic systems, IEEE Trans. Autom. Control, 26, 1139-1144, (1981) · Zbl 0473.93056
[5] Dixon, W. E., Adaptive regulation of amplitude limited robot manipulators with uncertain kinematics and dynamics, IEEE Trans. Autom. Control, 52, 488-493, (2007) · Zbl 1366.93402
[6] Ergezer, H.; Leblebicioglu, K., Path planning for UAVs for maximum information collection, IEEE Trans. Aerosp. Electron. Syst., 49, 502-520, (2013)
[7] Faugeras, O., Three-dimensional computer vision: A geometric viewpoint, (1993), MIT Press Cambridge, MA
[8] Faugeras, O.; Lustman, F., Motion and structure from motion in a piecewise planar environment, Int. J. Pattern Recognit. Artif. Intell., 2, 485-508, (1988)
[9] N.R. Gans, A. Dani, W.E. Dixon, Visual servoing to an arbitrary pose with respect to an object given a single known length, in: Proceedings of the American Control Conference, Seattle, WA, USA, 2008, pp. 1261-1267.
[10] Garnell, P.; East, D. J., Guided weapon control systems, (1977), Pergamon Press Oxford
[11] Kim, S.; Oh, H.; Tsourdos, A., Nonlinear model predictive coordinated standoff tracking of a moving ground vehicle, J. Guid. Control Dyn., 1-10, (2013)
[12] Malis, E.; Chaumette, F., 2 1/2 D visual servoing with respect to unknown objects through a new estimation scheme of camera displacement, Int. J. Comput. Vis., 37, 79-97, (2000) · Zbl 0996.68237
[13] Malyavej, V.; Manchester, I. R.; Savkin, A. V., Precision missile guidance using radar/multiple-video sensor fusion via communication channels with bit-rate constraints, Automatica, 42, 763-769, (2006) · Zbl 1137.93414
[14] I. Manchester, A. Savkin, F. Faruqi, Optical-flow based precision missile guidance inspired by honeybee navigation, in: Proceedings of the 42nd IEEE Conference on Decision and Control, 2003, pp. 5444-5449.
[15] Manchester, I. R.; Savkin, A. V., Circular-navigation-guidance law for precision missile/target engagements, J. Guid. Control Dyn., 29, 314-320, (2006)
[16] S. Mehta, W.E. Dixon, D. MacArthur, C.D. Crane, Visual servo control of an unmanned ground vehicle via a moving airborne monocular camera, in: Proceedings of the American Control Conference, Minneapolis, Minnesota, 2006, pp. 5276-5281.
[17] S. Mehta, K. Kaiser, N. Gans, W.E. Dixon, Homography-based coordinate relationships for unmanned air vehicle regulation, in: Proceedings of the AIAA Guidance, Navigation, and Control Conference, Keystone, Colorado, 2006, AIAA 2006-6718.
[18] S.S. Mehta, W. MacKunis, J.W. Curtis, Adaptive vision-based missile guidance in the presence of evasive target maneuvers, in: 18th IFAC World Congress, 2011, pp. 5471-5476.
[19] S.S. Mehta, W. MacKunis, E.L. Pasiliao, J.W. Curtis, Adaptive image-based visual servo control of an uncertain missile airframe, in: Proceedings of the AIAA Guidance, Navigation, and Control Conference, 2012, AIAA 2012-4901.
[20] Miwa, S.; Imado, F.; Kuroda, T., Clutter effect on the miss distance of a radar homing missile, J. Guid. Control Dyn., 11, 336-342, (1988)
[21] Nesline, F. W.; Zarchan, P., Missile guidance design tradeoffs for high-altitude air defense, J. Guid. Control Dyn., 6, 207-212, (1983)
[22] Pagilla, P.; Yu, B., An experimental study of planar impact of a robot manipulator, IEEE/ASME Trans. Mechatron., 9, 123-128, (2004)
[23] P.N. Pathirana, A.V. Savkin, Sensor fusion based missile guidance, in: Proceedings of the 6th International Conference on Information Fusion, 2003, pp. 253-260. · Zbl 1040.68137
[24] Shaferman, V.; Shima, T., Unmanned aerial vehicles cooperative tracking of moving ground target in urban environments, J. Guid. Control Dyn., 31, 1360-1371, (2008)
[25] Siouris, G., Missile guidance and control systems, (2004), Springer Science & Business Media New York
[26] Stepanyan, V.; Hovakimyan, N., Adaptive disturbance rejection controller for visual tracking of a maneuvering target, J. Guid. Control Dyn., 30, 1090-1106, (2007)
[27] Summers, T. H.; Akella, M. R.; Mears, M. J., Coordinated standoff tracking of moving targets: control laws and information architectures, J. Guid. Control Dyn., 32, 56-69, (2009)
[28] Tian, Y.; Li, Y.; Ren, Z., Vision-based adaptive guidance law for intercepting a manoeuvring target, IET Control Theory Appl., 5, 421-428, (2011)
[29] Uhrmeister, B., Kalman filters for a missile with radar and/or imaging sensor, J. Guid. Control Dyn., 17, 1339-1344, (1994)
[30] Vepretsky, V. A., Prospects for aerial reconnaissance in combat and operations, Mil. Thought, 21, 107-112, (2012)
[31] Yanushevsky, R., Modern missile guidance, (2008), CRC Press Boca Raton
[32] Zarchan, P., Tactical and Strategic Missile Guidance. Progress in Astronautics and Aeronautics, vol. 176, (1998), AIAA New York
[33] Zergeroglu, E.; Dixon, W. E.; Behal, A.; Dawson, D. M., Adaptive set-point control of robotic manipulators with amplitude-limited control inputs, Robotica, 18, 171-181, (2000)
[34] Zhang, Z., On the optimization criteria used in two-view motion analysis, IEEE Trans. Pattern Anal. Mach. Intell., 20, 717-729, (1998)
[35] Z. Zhang, A. Hanson, Scaled Euclidean 3D reconstruction based on externally uncalibrated cameras, in: Proceedings of the IEEE International Symposium on Computer Vision, 1995, pp. 37-42.