
An Integrated Navigation Method Based on VINS/FINS

Journal of Nanjing Normal University (Engineering and Technology Edition) [ISSN:1006-6977/CN:61-1281/TN]

Issue:
No. 3, 2021
Page:
42-48
Research Field:
Control Science and Engineering
Publishing date:

Info

Title:
An Integrated Navigation Method Based on VINS/FINS
Author(s):
Yuan Shan, Wan You, Meng Jiajie, Wang Yuting, Qian Weixing, Gu Cuihong
NARI School of Electrical and Automation Engineering, Nanjing Normal University, Nanjing 210023, China
Keywords:
integrated navigation; biped walking robot; visual-inertial navigation system; zero-velocity update; information bidirectional fusion
PACS:
TP391.41
DOI:
10.3969/j.issn.1672-1292.2021.03.006
Abstract:
Aiming at the low accuracy of the micro inertial devices in a visual-inertial navigation system (VINS) and the poor observability of the heading-angle error of a foot-mounted inertial navigation system (FINS), a navigation and positioning scheme combining the two systems is studied. The system consists of two parts: a VINS mounted on the trunk of a biped walking robot and a FINS mounted on its foot. The VINS obtains a relatively accurate heading angle through visual SLAM data fusion, while the FINS uses position information obtained after zero-velocity update to correct, in real time, the errors of the low-precision inertial devices in the VINS, thereby forming an integrated navigation structure with bidirectional fusion of visual and inertial information. Experimental results show that the integrated navigation scheme effectively improves the indoor navigation and positioning accuracy of a biped walking robot.
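The paper itself does not reproduce its algorithms here, but the zero-velocity update (ZUPT) that the abstract attributes to the foot-mounted FINS can be illustrated with a minimal sketch: during the stance phase of gait the foot is momentarily stationary, so detected stance samples let the filter clamp integrated velocity and suppress drift. The threshold values and the simple magnitude-based stance detector below are illustrative assumptions, not the authors' method.

```python
import numpy as np

GRAVITY = 9.81          # m/s^2
ACC_THRESHOLD = 0.3     # m/s^2, hypothetical stance-detection threshold

def detect_stance(acc_samples):
    """Flag samples whose specific-force magnitude is close to gravity.

    acc_samples: (N, 3) accelerometer readings in m/s^2.
    Returns a boolean array; True marks a likely zero-velocity (stance) sample.
    """
    mag = np.linalg.norm(acc_samples, axis=1)
    return np.abs(mag - GRAVITY) < ACC_THRESHOLD

def zupt_integrate(acc_samples, dt):
    """Integrate acceleration to velocity, resetting velocity to zero
    whenever a stance phase is detected (the zero-velocity update)."""
    stance = detect_stance(acc_samples)
    vel = np.zeros((len(acc_samples), 3))
    for k in range(1, len(acc_samples)):
        if stance[k]:
            vel[k] = 0.0                              # ZUPT: clamp drift during stance
        else:
            a = acc_samples[k] - np.array([0.0, 0.0, GRAVITY])
            vel[k] = vel[k - 1] + a * dt              # free integration during swing
    return vel
```

In the full system described by the abstract, the position corrected this way on the foot would then feed back to the trunk-mounted VINS, while the VINS heading aids the FINS, giving the bidirectional fusion the paper proposes.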


Last Update: 2021-09-30