Fingertip Data Fusion of Kinect v2 and Leap Motion in Unity

Bo Li, Chao Zhang, Cheng Han, Baoxing Bai

Changchun University of Science and Technology, No.7186, Weixing Road, Changchun, 130022 China

College of Optical and Electronic Information, Changchun University of Science and Technology, No.333, Xueli Road, Changchun, 130114 China

Corresponding author email: zhangchao@cust.edu.cn
Pages: 143-159 | DOI: https://doi.org/10.3166/ISI.23.6.143-159
Published: 31 December 2018

OPEN ACCESS

Abstract: 

This paper describes the data fusion and application of Kinect v2 and Leap Motion in Unity3D. First, a method for extracting fingertip positions from Kinect v2 depth images is implemented. Then, Kinect v2 and Leap Motion, mounted in two different orientations, are jointly calibrated in two steps: a preliminary calibration using a one-dimensional calibration rod algorithm, followed by a fine calibration that iteratively approximates the true transformation. Finally, the data from the two devices are fused in Unity3D, enabling human-computer interaction with virtual objects in the Unity3D virtual space. Experiments show that the proposed method extends the hand tracking range and improves the accuracy of collisions between the human hand and virtual objects.
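The fusion step outlined above can be sketched in a few lines: a fingertip measured by Leap Motion is mapped into the Kinect v2 coordinate frame through the rigid transform obtained by the joint calibration, then blended with the Kinect measurement. This is a minimal illustration only; the rotation `R`, translation `t`, and confidence weight `w_leap` below are illustrative stand-ins, not values or an API from the paper.

```python
import numpy as np

def fuse_fingertip(p_kinect, p_leap, R, t, w_leap=0.7):
    """Blend one fingertip position measured by both sensors.

    p_kinect : (3,) fingertip from Kinect v2, in Kinect coordinates
    p_leap   : (3,) fingertip from Leap Motion, in Leap coordinates
    R, t     : rotation (3x3) and translation (3,) from the joint calibration
    w_leap   : weight for Leap Motion (assumed more accurate near-field)
    """
    # Bring the Leap Motion point into the Kinect coordinate frame.
    p_leap_in_kinect = R @ p_leap + t
    # Weighted average of the two measurements in a common frame.
    return w_leap * p_leap_in_kinect + (1.0 - w_leap) * p_kinect

# Placeholder example with an identity calibration: both sensors agree,
# so the fused point equals the common measurement.
R = np.eye(3)
t = np.zeros(3)
fused = fuse_fingertip(np.array([0.1, 0.2, 0.5]),
                       np.array([0.1, 0.2, 0.5]), R, t)
```

In practice the weight could vary per finger or with distance from each sensor, since Leap Motion is accurate only within its short tracking range while Kinect v2 covers a wider volume.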

Keywords: 

fingertip recognition, joint calibration, data fusion, natural human-computer interaction, Leap Motion, Kinect v2

1. Introduction
2. Acquisition of fingertip data from depth images
3. Joint calibration of Kinect and Leap Motion
4. Data fusion
5. Experiments and results
6. Conclusion and outlook
Acknowledgments
References
