The United States Patent and Trademark Office today published a granted Apple patent covering user interfaces for interacting with a future mixed reality headset (HMD) using only eye gaze.
The granted patent describes technologies for interacting with an HMD using eye gaze. In some embodiments, the user selects a text entry field displayed on the HMD screen with their eyes.
The technologies provide a more natural and efficient interface by, in some embodiments, allowing the user to determine where to enter text primarily with eye gaze.
The technologies are useful for virtual reality, augmented reality, and mixed reality devices and applications.
The technologies can also be applied to traditional user interfaces on devices such as desktop computers, laptops, tablets, and smartphones.
Apple’s patent FIG. 2 below shows a top view of a user (#200) whose gaze is focused on an object (#210). The user’s gaze is determined by the visual axes of the user’s eyes (shown as rays #201A and #201B). The orientation of the visual axes determines the direction of the user’s gaze, and the distance at which the axes converge determines the depth of gaze.
Gaze direction can also be referred to as the gaze vector or line of sight. In FIG. 2, the gaze direction is toward the object, and the gaze depth is the distance D relative to the user. Gaze direction and/or gaze depth are the characteristics used to determine where the user is looking.
In some embodiments, the center of the user’s cornea, the center of the pupil, and/or the center of rotation of the user’s eyeball are determined in order to locate the optical axis of the user’s eye. Accordingly, these features can be used to determine the direction of the user’s gaze and/or the depth of gaze.
In some embodiments, gaze depth is determined based on the convergence point of the optical axes of the user’s eyes (or the location of minimum distance between the optical axes), or on some other measurement of the user’s eye focus. Optionally, gaze depth is used to estimate the distance at which the user’s eyes are focused.
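The geometry described above can be sketched in code. The patent does not publish an algorithm, so the function names and the closest-point-between-rays approach below are illustrative assumptions: the optical axis is taken as the ray from the eyeball’s center of rotation through the pupil center, and gaze depth is measured at the point where the two eyes’ rays pass closest to each other.

```python
import numpy as np

def optical_axis(rotation_center, pupil_center):
    """Optical axis as a ray: origin at the eyeball's center of rotation,
    unit direction through the pupil center (illustrative assumption)."""
    o = np.asarray(rotation_center, float)
    d = np.asarray(pupil_center, float) - o
    return o, d / np.linalg.norm(d)

def gaze_convergence(p1, d1, p2, d2):
    """Estimate the 3D gaze point as the midpoint of the shortest segment
    between the two eye rays, and the gaze depth D from the mid-eye point."""
    r = p1 - p2
    a, b, c = d1 @ d1, d1 @ d2, d2 @ d2
    e, f = d1 @ r, d2 @ r
    denom = a * c - b * b
    if denom < 1e-12:                      # near-parallel axes: focus at infinity
        return None, np.inf
    t = (b * f - c * e) / denom            # parameter along the first ray
    s = (a * f - b * e) / denom            # parameter along the second ray
    point = ((p1 + t * d1) + (p2 + s * d2)) / 2
    depth = np.linalg.norm(point - (p1 + p2) / 2)
    return point, depth
```

For two eyes about 6 cm apart fixating a point one meter straight ahead, `gaze_convergence` returns that point and a depth of 1 m, matching the distance D in FIG. 2.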
Apple’s patent FIG. 4 below shows a head-mounted display device (HMD) with a built-in gaze sensor. The user can look at a form (for example) in a virtual reality or real-world environment and direct text input to specific fields of that form, with the gaze sensor detecting the user’s point of focus. The technology is precise enough that a simple glance from the “first name” entry field to the adjacent “last name” field can be accurately detected, letting the user fill in each field without using a mouse.
Apple notes that the gaze sensor (#410) faces the user and, during operation, captures characteristics of the user’s eye gaze, such as image data of the user’s eyes.
In some embodiments, the gaze sensor includes an event camera that detects event data from the user (for example, from the user’s eyes) based on changes in detected light intensity over time, and uses the event data to determine gaze direction and/or gaze depth.
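The event-camera behavior described above can be sketched as a simple frame-differencing model. This is an illustrative assumption, not the patent’s implementation: an event is emitted at any pixel whose log intensity has changed by more than a contrast threshold since the last reference frame.

```python
import numpy as np

def detect_events(prev_frame, frame, threshold=0.15):
    """Return (row, col, polarity) tuples for pixels whose log intensity
    changed by more than `threshold` since the last reference frame
    (a simplified event-camera model; the threshold is an assumption)."""
    diff = np.log1p(np.asarray(frame, float)) - np.log1p(np.asarray(prev_frame, float))
    rows, cols = np.nonzero(np.abs(diff) > threshold)
    return [(int(r), int(c), 1 if diff[r, c] > 0 else -1)
            for r, c in zip(rows, cols)]
```

Because only changed pixels produce events, this kind of sensor responds to fast eye movements with far less data than a full image stream.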
Optionally, the HMD uses both image data and event data to determine gaze direction and/or gaze depth. Optionally, the HMD uses ray casting and/or cone casting to determine gaze direction and/or gaze depth. In some embodiments, multiple gaze sensors are used.
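Gaze-driven field selection via ray casting can be sketched as follows. The field names, coordinates, and plane-intersection approach are illustrative assumptions rather than details from the patent: the gaze ray is intersected with axis-aligned rectangles standing in for text fields placed in the scene.

```python
import numpy as np

def pick_field(origin, direction, fields):
    """Return the name of the first field the gaze ray hits, or None.
    `fields` maps a name to (z, xmin, xmax, ymin, ymax): a rectangle
    in a plane at depth z in front of the user (hypothetical layout)."""
    origin = np.asarray(origin, float)
    direction = np.asarray(direction, float)
    for name, (z, xmin, xmax, ymin, ymax) in fields.items():
        if abs(direction[2]) < 1e-9:
            continue                      # gaze ray parallel to the UI plane
        t = (z - origin[2]) / direction[2]
        if t <= 0:
            continue                      # plane is behind the user
        x, y, _ = origin + t * direction
        if xmin <= x <= xmax and ymin <= y <= ymax:
            return name
    return None

# Two hypothetical text fields one meter in front of the user.
fields = {
    "first name": (1.0, -0.4, -0.05, 0.0, 0.1),
    "last name":  (1.0,  0.05,  0.4, 0.0, 0.1),
}
```

With this layout, a gaze ray angled slightly to the right, `pick_field([0, 0, 0], [0.2, 0.05, 1.0], fields)`, selects the “last name” field, while mirroring the x-component of the direction selects “first name”.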
For more details, see Apple’s granted patent 11,314,396.