Research

An Overview of Eye-Tracking in Games

A common non-intrusive eye-tracking technique is pupil centre corneal reflection (PCCR), which is used by Tobii eye trackers. This technique involves illuminating the eye and identifying the reflections on the cornea and in the pupil, before calculating gaze direction [1].

Humans move their eyes in rapid movements that reposition the fovea, known as saccades, with fixations occurring between saccades, where the eyes are relatively still and fixed on a specific object within the visual field. The eyes are also able to follow a moving object through smooth pursuit [2][3]: whilst saccades are sudden, rapid movements, ‘pursuits are smooth and match the relative velocity of the target’ [4].

Eye-tracking controlled games should support the use of these naturally occurring behaviours or challenge the user to learn new behaviours [2], with users generally preferring gaze interactions that utilise natural eye movements [5].

The use of eye-tracking control in fast-paced games, such as first-person shooters, puts users at a disadvantage compared to users who are not using eye tracking. This disparity in game experience lessens with turn-based and puzzle games: gaze targets can be made larger, and so easier to select via gaze, and these games require slower reactions and less accuracy than many fast-paced games. Turn-based and puzzle games are therefore more accessible for eye-tracking interaction than other genres of games [2].

Eye-tracking games can also be used to help develop eye-tracking abilities and rehabilitate eye-motion disabilities; for example, Lin et al. produced a game which led to an improvement in eye motion after two weeks of regular use [6].

The Midas Touch Problem

The Midas Touch Problem is a fundamental issue in developing eye-tracking controlled games and arises from the difficulty of determining whether a user is making a selection with their eyes or simply looking at the game. Eye-tracking games must allow users to use their eyes for observation at all times, whilst ensuring an effortless switch from using gaze to observe the game to using gaze to perform actions within it [2].

There are several different strategies that can be used to overcome the Midas Touch Problem:

Dwell Time

Dwell time-based object selection involves the user fixating on an object beyond a certain time threshold. This reduces the chance of accidental selections, as the user must consciously fixate on the object for longer than the natural fixation length [4]. The choice of dwell-time threshold is, however, a compromise: shorter dwell times lead to faster gameplay but an increased chance of accidental selections.

Optimal dwell-time settings depend on the individual user’s preferences and the context of use of the gaze action [2]; for example, destructive actions such as exiting the level should have longer dwell times than ordinary actions such as moving a sprite. In addition, long dwell times can be fatiguing [7] and, if no user feedback is given, may lead the user to believe that the system has crashed [8].
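
A minimal Unity sketch of this mechanism is shown below, assuming the Tobii Unity SDK's GazeAware component (which reports whether the object a script is attached to currently has gaze focus); the threshold value and the OnSelected() hook are illustrative rather than taken from any of the cited studies.

using Tobii.Gaming;
using UnityEngine;

// Dwell-time selection sketch: the object is selected only after a continuous
// fixation longer than dwellThreshold, reducing accidental "Midas Touch" selections.
public class DwellSelectable : MonoBehaviour
{
    // Exposed so slow/medium/fast presets can be offered, as discussed above.
    public float dwellThreshold = 1.0f;   // seconds of continuous fixation required

    private GazeAware gazeAware;          // requires a Collider on the same object
    private float dwellTimer;

    void Start()
    {
        gazeAware = GetComponent<GazeAware>();
    }

    void Update()
    {
        if (gazeAware.HasGazeFocus)
        {
            dwellTimer += Time.deltaTime;
            if (dwellTimer >= dwellThreshold)
            {
                dwellTimer = 0f;
                OnSelected();
            }
        }
        else
        {
            // Gaze has left the object: cancel the pending selection.
            dwellTimer = 0f;
        }
    }

    private void OnSelected()
    {
        Debug.Log(name + " selected by dwell");   // placeholder for game logic
    }
}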

Gaze and Click

Gaze and Click involves using gaze to determine which object to select, followed by a further user interaction, for example pressing a button, which confirms the selection [9].

Gaze and Click minimises the chance of accidental selections and, unlike dwell time-based selection, it does not slow down gameplay. It also avoids user uncertainty about whether a selection has been made. However, Gaze and Click requires the ability to press a physical button, which excludes individuals whose disabilities prevent them from doing so.
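
The sketch below illustrates the basic mechanism in Unity, again assuming the Tobii Unity SDK's GazeAware component; the choice of confirmation key is illustrative.

using Tobii.Gaming;
using UnityEngine;

// Gaze and Click sketch: gaze chooses the target, a physical key press confirms,
// so there is no dwell delay and no ambiguity about when a selection happens.
public class GazeAndClickSelectable : MonoBehaviour
{
    private GazeAware gazeAware;

    void Start()
    {
        gazeAware = GetComponent<GazeAware>();
    }

    void Update()
    {
        if (gazeAware.HasGazeFocus && Input.GetKeyDown(KeyCode.Space))
        {
            Debug.Log(name + " selected by gaze and click");
        }
    }
}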

While the physical button in a Gaze and Click mechanism can be replaced by other eye behaviours, such as winking, some users may not be able to wink voluntarily, or may be limited in which eye they can wink, so this may also exclude certain individuals [4].

Selection can also be confirmed by users voluntarily changing their pupil dilation, for example through positive emotions; however, this is a difficult behaviour to control and is susceptible to outside influences [4].

Conclusion

In conclusion, we decided to use dwell time-based selection in our game, as it provides an easy-to-use and accessible way to overcome the Midas Touch Problem, and a short dwell time is seen as more convenient than Gaze and Click activation [8]. We also enable users to select between a slow, medium and fast dwell time, to cater for different eye-tracking abilities and to avoid unnecessary slowdowns of gameplay.

Inaccuracies in gaze-data

Gaze data can provide inaccurate information about what a user is looking at due to the ‘jittery movement of the eyes’, calibration issues and, depending on the eye tracker used, a possible lag between a gaze movement and what is displayed on the screen. These inaccuracies must be taken into account when developing eye-tracking games and can be mitigated by increasing the size of gaze targets such as buttons [4].
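
Alongside larger targets, jitter is often reduced in software by smoothing the raw gaze signal before hit-testing. The sketch below shows one simple approach, an exponential moving average, assuming the Tobii Unity SDK's TobiiAPI.GetGazePoint() call; the smoothing factor is an illustrative choice.

using Tobii.Gaming;
using UnityEngine;

// Gaze smoothing sketch: an exponential moving average trades a little lag
// for a much steadier gaze position.
public class GazeSmoother : MonoBehaviour
{
    [Range(0f, 1f)]
    public float smoothing = 0.8f;    // higher = steadier but laggier

    private Vector2 smoothedGaze;
    private bool hasSample;

    void Update()
    {
        GazePoint gaze = TobiiAPI.GetGazePoint();
        if (!gaze.IsValid) return;    // skip frames with no valid gaze data

        if (!hasSample)
        {
            smoothedGaze = gaze.Screen;   // initialise from the first valid sample
            hasSample = true;
        }
        else
        {
            // Blend the new sample into the running estimate.
            smoothedGaze = smoothing * smoothedGaze + (1f - smoothing) * gaze.Screen;
        }
    }
}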

Related Projects

8 Tile Puzzle Game

Bednarik et al. created an 8-tile puzzle game that is controlled via eye tracking with dwell time-based selection. The game uses changing colours to provide visual feedback of the dwell time remaining while a user fixates on an object, until the object is selected [9].

If the user's gaze leaves an object before it is selected, the object's colour gradually returns to white; if the gaze returns before the object turns completely white, the object continues darkening from its current hue [9].
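
A sketch of this feedback scheme in Unity is given below, assuming the Tobii Unity SDK's GazeAware component and a SpriteRenderer on the object; the colours and rates are illustrative.

using Tobii.Gaming;
using UnityEngine;

// Dwell feedback sketch in the style of Bednarik et al.: the tint advances
// towards a highlight colour while fixated and retreats towards white when
// gaze leaves, so a returning gaze resumes from the current hue.
public class DwellColourFeedback : MonoBehaviour
{
    public Color idleColour = Color.white;
    public Color dwellColour = new Color(0.2f, 0.2f, 0.8f);
    public float dwellThreshold = 1.0f;    // seconds to complete a selection

    private GazeAware gazeAware;
    private SpriteRenderer sprite;
    private float progress;                // 0 = idle (white), 1 = selected

    void Start()
    {
        gazeAware = GetComponent<GazeAware>();
        sprite = GetComponent<SpriteRenderer>();
    }

    void Update()
    {
        // Advance while fixated, retreat otherwise; progress is never reset
        // abruptly, so returning gaze continues darkening from the current hue.
        float direction = gazeAware.HasGazeFocus ? 1f : -1f;
        progress = Mathf.Clamp01(progress + direction * Time.deltaTime / dwellThreshold);

        sprite.color = Color.Lerp(idleColour, dwellColour, progress);

        if (progress >= 1f)
        {
            Debug.Log(name + " selected");   // play a confirmation sound here
            progress = 0f;
        }
    }
}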

Bednarik et al. found that an additional click sound was useful in confirming that a selection had successfully occurred. The study also found that the Gaze and Click control produced the greatest user immersion and led to increased problem-solving abilities, and that dwell-time selection was a significant interaction obstacle. However, they acknowledged that the use of user-adjustable dwell times could change this result [9].

Conclusion

The study’s emphasis on the importance of visual and auditory selection cues to reduce user uncertainty has led us to introduce both within our game: when a user dwells on a game object it slowly changes colour, and when the selection occurs a visual and audio confirmation is given. These cues will also enable individuals to develop their eye-tracking abilities, as they will be able to see how long they should fixate on an object before it is selected.

We will continue to use dwell time-based activation in the game, to ensure that no users are excluded from playing due to physical constraints. We will mitigate the risk of dwell-based control becoming an interaction obstacle due to overly long dwell times by allowing users to adjust dwell times to suit their personal preferences.

Eyes First Maze

Microsoft Research have produced a series of 'Eyes First' reinventions of classic games, controlled via dwell time-based selection, that aim to introduce users to basic eye-tracking skills.

Eyes First Maze involves a user gazing at dots, arranged in a grid pattern, to move a sprite to the end of a maze. It provides several different maze sizes and the user can move a sprite to an adjacent dot by dwelling on that dot. The game also provides immediate visual feedback of whether a selection has been made by highlighting selected grid points with a box [10].
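
The movement rule is easy to express in code. The sketch below is a hypothetical Unity reconstruction of this grid mechanic, not Microsoft's implementation: dwelling on a dot moves the sprite only if the dot is adjacent to the sprite's current cell.

using UnityEngine;

// Maze-dot sketch: a dwell-completed selection of this dot moves the sprite
// here, but only from a horizontally or vertically adjacent cell.
public class MazeDot : MonoBehaviour
{
    public Vector2Int cell;                  // this dot's grid coordinates
    public Transform sprite;                 // the player sprite
    public static Vector2Int spriteCell;     // the sprite's current cell

    // Call this when a dwell selection completes (e.g. from the
    // DwellSelectable sketch shown earlier).
    public void OnDwellSelected()
    {
        // A Manhattan distance of 1 means the dot is adjacent to the sprite.
        int distance = Mathf.Abs(cell.x - spriteCell.x) + Mathf.Abs(cell.y - spriteCell.y);
        if (distance == 1)
        {
            spriteCell = cell;
            sprite.position = transform.position;   // snap the sprite to this dot
        }
    }
}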

Conclusion

We will use a similar grid system to make it easier for users to select a location to move pipes to. We will also provide immediate visual feedback when a grid block is selected: the block will change colour, and a selection box will appear to highlight the grid block once the selection is complete.

Eye Writer Drawing Software

The Eye Writer drawing software is designed to allow individuals with ALS to 'draw, manipulate and style a tag' and was developed by Free Art and Technology, Open Frameworks, the Graffiti Research Lab, The Ebeling Group and TEMPTONE [11].

The software utilises dwell time-based gaze selection, with the colour of a button changing to indicate that it has been selected. It also uses large buttons, to allow for easy selection with gaze, and long dwell times, to avoid accidental selections. When the application is paused, the user cannot draw on the screen but can adjust settings; in drawing mode, a grid appears on the screen and the user can draw lines between grid points [11].
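
Snapping gaze to the nearest grid point is the essence of this drawing mode. The helper below is a minimal sketch of that idea, assuming world-space coordinates; the origin and cellSize parameters are illustrative.

using UnityEngine;

// Grid-snapping sketch: rounds a (preferably smoothed) gaze position to the
// nearest grid point, so noisy gaze data still produces clean lines.
public static class GazeGrid
{
    public static Vector2 Snap(Vector2 gazeWorldPos, Vector2 origin, float cellSize)
    {
        float x = Mathf.Round((gazeWorldPos.x - origin.x) / cellSize) * cellSize + origin.x;
        float y = Mathf.Round((gazeWorldPos.y - origin.y) / cellSize) * cellSize + origin.y;
        return new Vector2(x, y);
    }
}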

Conclusion

We will utilise a similar grid system to help users focus their fixations on particular points of the screen, enabling them to move pipes to different locations. We will use large buttons and game objects to ensure that users can easily select them via gaze.

Invisible Eni

Eckman et al. created Invisible Eni, a game controlled through the user's gaze points, blinking and pupil size. The user moves the character by fixating on different parts of the game area, blinking is used to make the character disappear and escape their enemies, and pupil size is used as a novelty control mechanism, where dilating the pupils causes magic flowers to open [12].

The study found that 'it is possible for non-expert users to achieve control of their pupil size in a controlled environment', with positive emotions being more effective at triggering dilation than negative ones such as self-inflicted pain. However, the study also found that inexperienced participants struggled to work out intuitively how to control the game via gaze or pupil dilation without guidance. The authors argued that these issues could be resolved by providing visual feedback for gaze actions and an in-game tutorial teaching the game mechanics [12].
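
For illustration, the sketch below shows how such a mechanic might be detected. Note the loud assumption: consumer gaming SDKs such as the Tobii Unity SDK do not expose pupil size, so ReadPupilDiameterMm() is a hypothetical stand-in for a research-grade data source. Comparing against a slowly adapting baseline, rather than a fixed value, is one way to reduce the influence of lighting and other external factors the study warns about.

using UnityEngine;

// Pupil-dilation trigger sketch. ReadPupilDiameterMm() is a hypothetical
// placeholder: a pupil-capable eye tracker and SDK would be needed here.
public class PupilDilationTrigger : MonoBehaviour
{
    public float relativeThreshold = 1.15f;  // trigger at 15% above baseline (illustrative)
    public float baselineRate = 0.05f;       // how quickly the baseline adapts

    private float baseline = -1f;

    void Update()
    {
        float diameter = ReadPupilDiameterMm();
        if (diameter <= 0f) return;               // no valid sample this frame

        if (baseline < 0f) baseline = diameter;   // initialise on the first sample

        if (diameter > baseline * relativeThreshold)
        {
            Debug.Log("Dilation detected: open the magic flower");
        }

        // Slowly track the baseline so gradual lighting changes are absorbed.
        baseline = Mathf.Lerp(baseline, diameter, baselineRate * Time.deltaTime);
    }

    private float ReadPupilDiameterMm()
    {
        return -1f;   // hypothetical: wire up to a pupil-capable SDK here
    }
}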

Conclusion

Whilst using pupil dilation as a control input is a novel idea and this study provided evidence that pupil dilation can be partially controlled with training, it may be an intimidating and confusing game mechanism for the majority of users and is vulnerable to external factors.

We will, however, ensure that appropriate visual feedback is given to users with each gaze action and we will provide an in-game tutorial to teach users the game's mechanics.

EyeChess

EyeChess is a gaze-controlled chess tutorial aimed at physically impaired users, with three interaction options available: gaze, blinking and eye gestures. The game allows users to select a chess piece and move it to a destination square, with hints shown if the user requires assistance [13].

The study found that dwell time ‘was the preferred selection technique’, with users finding it fun and easy to play with, in comparison to the blinking and eye-gesture techniques, which were seen as ‘fatiguing’ [13].

Related Technologies

There were several implementation strategies that could have been used to create the game, each with differing advantages and disadvantages. The three main options were:

  • Creating the game in Unity and using the Unity Tobii Eye Tracking SDK.
  • Creating the game using UWP and the Windows 10 APIs.
  • Creating the game using Unreal Engine and the Unreal Engine Tobii Eye Tracking SDK.

The three options compare as follows:

Eye Tracking
  • Unity: The Unity Tobii Eye Tracking SDK provides full support and integration for Tobii eye-tracking devices. It also provides built-in support for mapping eye gaze to game objects through Gaze Focus, without additional filtering being required, and a large variety of sample scripts showcasing how the technology can be used [14].
  • UWP: UWP applications provide full support for eye tracking via the Windows Gaze Input APIs and the Windows Gaze Interaction Library, which supports dwell time-based activation of buttons and other controls [15].
  • Unreal Engine: The Unreal Engine Tobii Eye Tracking SDK provides beta support for Tobii eye-tracking devices, including Gaze to Object Mapping, but it has limited functionality compared to the Unity Tobii SDK and fewer sample scripts [16].

Graphics and Game Development
  • Unity: Unity provides strong support for graphics and game development, with game-development-specific features such as prefabs and colliders, in addition to supporting the customisation of a fluid engine [17]. It also has strong support for 2D games.
  • UWP: UWP has some support for game development through libraries such as MonoGame [18], and some C# fluid engines, such as LiquidFun, are available [19].
  • Unreal Engine: Unreal Engine provides strong support for 3D graphics and game development; however, it has only partial support for 2D games, through Paper 2D [20].

Compatibility
  • Unity: When using the Unity Tobii Eye Tracking SDK, games can only be built to Windows Standalone [14].
  • UWP: UWP apps can run on any UWP device [21].
  • Unreal Engine: Unreal Engine provides multiple build options, but there is no official support for UWP.
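
As an illustration of the Gaze Focus mapping mentioned above, the short script below logs whichever GazeAware object currently has gaze focus. It assumes the Tobii Unity SDK's TobiiAPI.GetFocusedObject() call as we understand it; treat the exact API surface as an assumption.

using Tobii.Gaming;
using UnityEngine;

// Gaze Focus sketch: the Tobii Unity SDK maps gaze to GazeAware objects
// internally, so no manual gaze-point filtering is needed here.
public class GazeFocusLogger : MonoBehaviour
{
    void Update()
    {
        // Returns the GazeAware GameObject the user is looking at, or null.
        GameObject focused = TobiiAPI.GetFocusedObject();
        if (focused != null)
        {
            Debug.Log("Gaze focus: " + focused.name);
        }
    }
}
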
Conclusion

We decided to use Unity and the Unity Tobii Eye Tracking SDK to build our game, due to its strong, specialised support for 2D game development and excellent eye-tracking features, which will enable us to develop a cohesive, enjoyable and aesthetically pleasing game. Though it can only build to Windows Standalone, this will still enable us to reach a wide audience, for example through submission to the Windows Store and other platforms.

In terms of the libraries and frameworks we could have used, the Unity Tobii Eye Tracking SDK has comparable features to the Windows Gaze Interaction Library; however, whilst complex games can be produced with MonoGame, Unity provides greater support for developers, enabling more rapid development. As the Tobii Eye Tracking SDK is designed to work well with Unity, we decided to use it instead of the Windows Gaze Interaction Library.

Though UWP also offers strong support for eye tracking, it does not offer the specialised game development features that Unity offers, and Unreal Engine does not fully support eye tracking or 2D game development; Unity was therefore the ideal choice for our project.

Comparison of Eye Tracking Devices

There are several eye trackers on the market, each with differing advantages and disadvantages; two examples are the Smart Eye Pro and the Tobii Eye Tracker 4C.

Smart Eye Pro
  • Aimed at research use.
  • Made up of multiple cameras, providing highly accurate readings and allowing 360-degree head tracking [22].
  • Not accessible to the average user, with a high cost.

Tobii Eye Tracker 4C
  • Provides SDKs for multiple game development platforms, such as Unity and Unreal Engine.
  • Less accurate than the Smart Eye Pro, but still offers head-tracking capabilities.
  • A consumer device for both general eye-tracking use and gaming, and so highly accessible to users; it is low cost, portable and can be used with little set-up.

Conclusion

Tobii dominates the consumer eye-tracking landscape, with several different hardware options and multiple SDKs that work well with game development platforms such as Unity. Though the 4C is less accurate than a Smart Eye Pro, it is cheaper, more portable and more accessible to the average consumer.

Overall Conclusion and Impact on Project

Our research has influenced our project in several ways. We have used dwell time-based selection of objects to solve the Midas Touch Problem and, to overcome the drawbacks of dwell-based selection, we allow users to customise the dwell time to their preferences, avoiding unnecessarily slow gameplay. Our game is also a puzzle game, which allows for larger gaze targets, both to account for inaccuracies in gaze data and to make selecting targets via gaze easier.

We are using changing colours to indicate when an object is being dwelled upon, with a final flash when the object has been selected, and confirmation sounds when buttons and other game objects are selected, to remove user uncertainty about whether a selection has successfully occurred.

We will also follow the common convention of using a grid to assist users in moving objects to certain locations and provide an interactive tutorial to assist users with learning the game's mechanics.

We will be using the Unity game development platform for the project, as it offers the best balance between specialised support for game development and strong support for eye tracking for our project. In addition, we will be using the Tobii eye trackers as they are the market leaders for consumer eye trackers and offer the best balance between cost, portability and accuracy.

References

[1] Tobii, "How do Tobii Eye Trackers work?," [Online]. Available: https://www.tobiipro.com/learn-and-support/learn/eye-tracking-essentials/how-do-tobii-eye-trackers-work/. [Accessed 28 December 2019].

[2] P. Isokoski, M. Joos, O. Spakov and B. Martin, "Gaze controlled games," Universal Access in the Information Society, 2009.

[3] S. Almeida, A. Veloso, L. Roque and O. Mealha, "The Eyes and Games: A Survey of Visual Attention and Eye Tracking Input in Video Games," in SBC - Proceedings of SBGames, Salvador, 2011.

[4] E. Velloso and M. Carter, "The Emergence of EyePlay: A Survey of Eye Interaction in Games," in Annual Symposium on Computer-Human Interaction in Play (CHI PLAY ’16), Austin, Texas, USA, 2016.

[5] L. E. Sibert and R. J. Jacob, "Evaluation of Eye Gaze Interaction," in SIGCHI Conference on Human Factors in Computing Systems (CHI ’00), The Hague, The Netherlands, 2000.

[6] C.-S. Lin, C.-C. Huan, C.-N. Chan, M.-S. Yeh and C.-C. Chiu, "Design of a computer game using an eye-tracking device for eye’s activity rehabilitation," Optics and Lasers in Engineering, pp. 91–108, 2004.

[7] V. Sundstedt, Gazing at Games: An Introduction to Eye Tracking Control, Morgan & Claypool Publishers, 2012.

[8] E. Jönsson, "If Looks Could Kill – An Evaluation of Eye Tracking in Computer Games," 2005. [Online]. Available: http://citeseerx.ist.psu.edu/viewdoc/download;jsessionid=AB9CE4056543AAE3942E1463D2C0510A?doi=10.1.1.219.1981&rep=rep1&type=pdf. [Accessed 9 March 2020].

[9] R. Bednarik, T. Gowases and M. Tukiainen, "Gaze interaction enhances problem solving: Effects of dwell-time based, gaze-augmented, and mouse interaction on problem-solving strategies and user experience," Journal of Eye Movement Research, vol. 3, 2009.

[10] Microsoft, "Eyes First - Maze," [Online]. Available: https://www.microsoft.com/en-gb/p/eyes-first-maze/9p1d3gnmzx5t?activetab=pivot:overviewtab. [Accessed 28 December 2019].

[11] Free Art and Technology, Open Frameworks, Graffiti Research Lab, The Ebeling Group and TEMPTONE, "Eye Writer," [Online]. Available: http://eyewriter.org. [Accessed 8 March 2020].

[12] P. A. Eckman and M. Mäkäräinen, "Invisible Eni – Using Gaze and Pupil Size to Control a Game," in CHI, Florence, Italy, 2008.

[13] O. Špakov, "EyeChess: the tutoring game with visual attentive interface," in Alternative Access: Feelings and Games 2005, University of Tampere, Finland, 2005.

[14] Tobii, "Unity SDK API Overview," [Online]. Available: https://developer.tobii.com/pc-gaming/unity-sdk/api-overview/. [Accessed 1 December 2019].

[15] Microsoft, "Gaze Interaction Library," [Online]. Available: https://docs.microsoft.com/en-us/windows/communitytoolkit/gaze/gazeinteractionlibrary. [Accessed 28 December 2019].

[16] Tobii, "Unreal Engine 4 SDK (BETA)," [Online]. Available: https://developer.tobii.com/pc-gaming/unreal-engine-sdk/. [Accessed 1 December 2019].

[17] Unity, "Water 2D," Apptouch, [Online]. Available: https://assetstore.unity.com/packages/tools/particles-effects/water-2d-136297. [Accessed 28 December 2019].

[18] Microsoft, "Create a UWP game in MonoGame 2D," [Online]. Available: https://docs.microsoft.com/en-us/windows/uwp/get-started/get-started-tutorial-game-mg2d. [Accessed 28 December 2019].

[19] Google, "Powered by Liquid Fun," [Online]. Available: https://google.github.io/liquidfun/. [Accessed 28 December 2019].

[20] Unreal Engine, "Paper2D," [Online]. Available: https://docs.unrealengine.com/en-US/Engine/Paper2D/index.html. [Accessed 9 March 2020].

[21] Microsoft, "What's a Universal Windows Platform (UWP) app?," 5 July 2018. [Online]. Available: https://docs.microsoft.com/en-us/windows/uwp/get-started/universal-application-platform-guide. [Accessed 28 December 2019].

[22] SmartEye, "SMART EYE PRO," [Online]. Available: https://smarteye.se/wp-content/uploads/2016/10/Smart-Eye-Pro.pdf. [Accessed 9 March 2020].