Lulu Filterized Lin’s Correlative Theil–Sen Regression-Based Fully Connected Deep Multilayer Perceptive Neural Network for Eye Gaze Pattern Recognition

Authors

K. Rathi and K. Srinivasan

DOI:

https://doi.org/10.2298/YJOR240215027R

Keywords:

Gaze estimation, fully connected deep multilayer perceptive neural network, LULU nonlinear smoothing filtering technique, polar coordinate system-based eye-gaze point estimation, Lin's concordance correlative Theil–Sen regression

Abstract

Gaze estimation is the process of finding the point of gaze along the visual axis of the eye. Gaze-tracking schemes are mainly employed in human–computer interaction (HCI) and in the study of visual scanning patterns. Traditional tracking schemes usually require an accurate per-person calibration procedure to estimate subject-specific eye metrics. To improve gaze-estimation accuracy, a Lulu Filterized Lin's Correlative Theil–Sen Regression-based Fully Connected Deep Multilayer Perceptive Neural Network (LFLCTR-FCDMPNN) is designed for accurate gaze-pattern identification with reduced time consumption. The fully connected deep multilayer perceptive neural network comprises an input layer, three hidden layers, and an output layer. The input layer collects the gaze images. A LULU nonlinear smoothing filter is then applied in the first hidden layer to remove noise and enhance image quality. In the second hidden layer, eye-gaze points are estimated using a polar coordinate system. Finally, gaze-pattern matching is carried out in the third hidden layer using Lin's concordance correlative Theil–Sen regression: the estimated gaze points are arranged on the gaze plane to form gaze patterns, and pattern matching is performed with Lin's concordance correlation coefficient. In this way, eye-gaze patterns are correctly recognized at the output layer. An experimental evaluation demonstrates the performance of the LFLCTR-FCDMPNN technique on metrics such as gaze-pattern recognition accuracy, gaze-pattern recognition time, and false-positive rate for varying numbers of eye images. The results illustrate that the LFLCTR-FCDMPNN method improves gaze-pattern recognition accuracy and reduces time consumption compared with conventional prediction methods. On the SynthesEyes dataset, the false-positive rate of the proposed LFLCTR-FCDMPNN improved by 63% over existing methods.
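
The sketch below is a minimal illustration of the three building blocks named in the abstract, not the authors' implementation. It assumes Python with NumPy and SciPy; all function names and the synthetic data are illustrative assumptions. It uses the standard equivalence of Rohwer's LULU operators L_n and U_n with a morphological opening and closing (here a square structuring element as a 2-D analogue), fits a Theil–Sen line as the median of pairwise slopes, and scores pattern agreement with Lin's concordance correlation coefficient (CCC).

import numpy as np
from scipy.ndimage import grey_opening, grey_closing

def lulu_smooth(img, n=1):
    """LULU smoothing U_n(L_n(x)): L_n equals a morphological opening
    and U_n a closing, each with a flat structuring element of size
    n+1, so impulses of width <= n are removed while edges are kept.
    (Illustrative 2-D analogue of the 1-D LULU operators.)"""
    size = n + 1
    return grey_closing(grey_opening(img, size=size), size=size)

def to_polar(points, center):
    """Express gaze points relative to the eye center in polar form."""
    d = np.asarray(points, float) - np.asarray(center, float)
    return np.hypot(d[:, 0], d[:, 1]), np.arctan2(d[:, 1], d[:, 0])

def theil_sen(x, y):
    """Theil-Sen estimator: median of all pairwise slopes, then a
    median intercept; robust against outlying gaze points."""
    i, j = np.triu_indices(len(x), k=1)
    dx = x[j] - x[i]
    keep = dx != 0
    slope = np.median((y[j] - y[i])[keep] / dx[keep])
    return slope, np.median(y - slope * x)

def lins_ccc(a, b):
    """Lin's concordance correlation coefficient between two
    equal-length sequences (1 = perfect agreement)."""
    ma, mb = a.mean(), b.mean()
    cov = ((a - ma) * (b - mb)).mean()  # population covariance
    return 2.0 * cov / (a.var() + b.var() + (ma - mb) ** 2)

# Toy end-to-end check on synthetic data (illustrative only).
rng = np.random.default_rng(0)
img = rng.normal(0.5, 0.05, (32, 32))
img[16, 16] = 5.0                      # impulse noise
smooth = lulu_smooth(img, n=1)         # impulse suppressed

t = np.arange(20, dtype=float)
template = 0.8 * t + 2.0               # stored gaze pattern
observed = template + rng.normal(0, 0.3, 20)
observed[5] += 8.0                     # one outlier fixation
slope, intercept = theil_sen(t, observed)
print("CCC vs template:", round(lins_ccc(slope * t + intercept, template), 3))

Under this sketch, a gaze pattern would be declared a match when its CCC against a stored template exceeds a chosen threshold; the Theil–Sen fit keeps that comparison robust to stray fixations such as the injected outlier above.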

Published

2024-07-08

How to Cite

Rathi, K., & Srinivasan, K. (2024). Lulu Filterized Lin’s Correlative Theil–Sen Regression-Based Fully Connected Deep Multilayer Perceptive Neural Network for Eye Gaze Pattern Recognition. Yugoslav Journal of Operations Research, 34(4), 687–706. https://doi.org/10.2298/YJOR240215027R

Issue

Vol. 34 No. 4 (2024)

Section

Special Issue