
Intelligent visual acuity estimation system with hand motion recognition

The main goal of this article is to propose a faster algorithm for estimating the visual acuity of human eyes while retaining the same accuracy as conventional visual acuity tests.


Fig. 1. Schematic of usage scenarios for the proposed iVAE system.

Technology Overview
The VAML-L scheme accounts for the correctness of the subject's responses by characterizing the probability of the subject seeing each optotype size via a logistic function. The VAML-N method adopts a neural network (NN) as a machine-learning technique to enhance estimation performance; both the correctness of the subject's response and the reaction time are considered in the learning process. Furthermore, the VAML-C scheme combines the likelihood functions of the VAML-L and VAML-N methods to balance the tradeoff between adopting a prespecified logistic function and a learning mechanism. Meanwhile, the V-HMR scheme automates the vision testing process by recognizing the subject's hand motion while exhibiting feasible estimation accuracy.
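The logistic-function idea behind the VAML-L scheme can be illustrated with a small sketch: model the probability of a correct response at each optotype size with a logistic psychometric function (floored at the chance level for four directions), then pick the visual-acuity threshold that maximizes the likelihood of the observed trial outcomes. The slope, chance level, and logMAR grid below are illustrative assumptions, not the paper's actual parameters.

```python
import math

def p_correct(logmar_size, va_threshold, slope=10.0, guess=0.25):
    """Probability of correctly identifying an optotype of a given logMAR
    size, modeled as a logistic psychometric function. guess = 0.25 is the
    chance level for four optotype directions (illustrative values)."""
    p_see = 1.0 / (1.0 + math.exp(-slope * (logmar_size - va_threshold)))
    return guess + (1.0 - guess) * p_see

def log_likelihood(va_threshold, trials):
    """Log-likelihood of a candidate VA threshold given a list of
    (optotype_size, was_correct) trial outcomes."""
    ll = 0.0
    for size, correct in trials:
        p = p_correct(size, va_threshold)
        ll += math.log(p if correct else 1.0 - p)
    return ll

def estimate_va(trials, grid=None):
    """Maximum-likelihood VA estimate over a logMAR grid (-0.3 to 1.0)."""
    if grid is None:
        grid = [i / 100.0 for i in range(-30, 101)]
    return max(grid, key=lambda va: log_likelihood(va, trials))

# Example: subject fails small optotypes and passes large ones, so the
# ML estimate lands between the failed and passed sizes.
trials = [(0.0, False), (0.1, False), (0.3, True), (0.4, True), (0.5, True)]
print(round(estimate_va(trials), 2))
```

A grid search keeps the sketch dependency-free; a practical implementation would use a numerical optimizer and, as in the VAML-N/VAML-C variants, fold reaction-time features into the likelihood.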

Applications & Benefits
Realistic experiments show that, with contributions from both the reaction time and the NN model, the proposed iVAE system outperforms the conventional line-by-line testing method, being approximately ten times faster in the number of testing trials while maintaining a logMAR error of less than 0.2. The average number of trials required for a complete VA measurement is also decreased significantly.

Abstract:
Visual acuity (VA) measurement is utilized to test a subject’s acuteness of vision. Conventional VA measurement requires a physician’s assistance to ask a subject to speak out or wave a hand in response to the direction of an optotype. To avoid this repetitive testing procedure, different types of automatic VA tests have been developed in recent years by adopting contact-based responses, such as pushing buttons or keyboards on a device. However, contact-based testing is not as intuitive as speaking or waving hands, and it may distract the subjects from concentrating on the VA test. Moreover, problems related to hygiene may arise if all the subjects operate the same testing device. To overcome these problems, we propose an intelligent VA estimation (iVAE) system for automatic VA measurements that assists the subject in responding in an intuitive, noncontact manner. VA estimation algorithms using maximum likelihood (VAML) are developed to automatically estimate the subject’s vision by compromising between a prespecified logistic function and a machine-learning technique. The neural-network model adapts to human learning behavior by considering the accuracy of recognizing the optotype as well as the reaction time of the subject. Furthermore, a velocity-based hand motion recognition algorithm is adopted to classify hand motion data, collected by a sensing device, into one of the four optotype directions. Realistic experiments show that the proposed iVAE system outperforms the conventional line-by-line testing method as it is approximately ten times faster in testing trials while achieving a logarithm of the minimum angle of resolution (logMAR) error of less than 0.2. We believe that our proposed system provides a method for accurate and fast noncontact automatic VA testing.
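The hand-motion classification step described in the abstract can be sketched as follows: reduce a tracked hand trajectory to its dominant displacement axis and map it to one of the four optotype directions. This is a minimal, hypothetical stand-in for the paper's velocity-based V-HMR algorithm; the threshold, units, and function name are assumptions for illustration.

```python
def classify_hand_motion(positions, min_motion=0.1):
    """Classify a hand trajectory into one of four optotype directions
    ('up', 'down', 'left', 'right') from its net displacement.
    `positions` is a list of (x, y) samples at a fixed sampling rate;
    a real velocity-based scheme would also threshold per-frame speed."""
    if len(positions) < 2:
        return None
    dx = positions[-1][0] - positions[0][0]
    dy = positions[-1][1] - positions[0][1]
    if max(abs(dx), abs(dy)) < min_motion:
        return None  # motion too small to count as a response
    if abs(dx) >= abs(dy):
        return 'right' if dx > 0 else 'left'
    return 'up' if dy > 0 else 'down'

# A rightward wave sampled from a motion sensor (hypothetical units).
print(classify_hand_motion([(0.0, 0.0), (0.2, 0.02), (0.5, 0.05)]))  # prints: right
```

Returning `None` for sub-threshold motion lets the test loop ignore incidental hand movement and wait for a deliberate response.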


Intelligent visual acuity estimation system with hand motion recognition
Author: Chiu C.-J., Tien Y.-C., Feng K.-T., Tseng P.-H.
Year: 2020
Source publication: IEEE Transactions on Cybernetics, Volume: 51, Issue: 12, Dec. 2021
Subfield (highest percentile): 99%, Control and Systems Engineering, #1/260

https://ieeexplore.ieee.org/document/9005409
