EyeMU Interactions: Gaze + IMU Gestures on Mobile Devices

As smartphone screens have grown in size, single-handed use has become more cumbersome. It is not uncommon for interactive targets to be impossible to reach unless the user adjusts their grip. No doubt many millions of smartphones have been dropped during such grip changes, especially when the user is walking or performing another action. For this reason, many users add aftermarket rings and grips to the backs of their phones, adding weight and bulk to devices we wish to keep as thin as possible.

We present EyeMU interactions, a set of intuitive and rapid gestural actions for mobile phones, powered by a combination of state-of-the-art gaze estimation and IMU-tracked motion gestures. Importantly, our interactions require no grip change or touch input. To avoid false-positive and accidental activations, as well as to eliminate unnecessary computation, our system activates only when a series of conditions is met. First, a user must be present in the camera's view, then attend to the screen, then fixate on a widget, and finally, while maintaining that fixation, perform a motion gesture. We note our technique is highly complementary to conventional touch input, and can serve to alleviate reach issues as well as expose advanced functionality typically buried in long presses and menus.
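To make this activation cascade concrete, below is a minimal sketch of such a gating pipeline in Kotlin. It is an illustration only, not the paper's implementation: the class, stage names, thresholds (fixation radius and dwell time), and the face/gaze and IMU recognizer interfaces are all assumptions made for the example.

import kotlin.math.hypot

// Stages of the (assumed) activation cascade: presence -> attention -> fixation.
enum class Stage { IDLE, PRESENT, ATTENDING, FIXATING }

class EyeMUGate(
    private val fixationRadiusPx: Float = 120f, // assumed gaze-stability tolerance
    private val fixationHoldMs: Long = 300L     // assumed dwell time to arm a widget
) {
    private var stage = Stage.IDLE
    private var fixStartMs = 0L
    private var fixX = 0f
    private var fixY = 0f
    private var armedWidget: String? = null

    // Per-camera-frame update. Each condition gates the next, so later
    // (more expensive) stages run only when the earlier checks succeed.
    fun onFrame(
        facePresent: Boolean,                 // gate 1: user in the camera's view
        gazeOnScreen: Boolean,                // gate 2: user attending to the screen
        gazeX: Float, gazeY: Float,           // estimated on-screen gaze point (px)
        widgetAt: (Float, Float) -> String?,  // hit-test: widget under gaze, if any
        nowMs: Long
    ) {
        if (!facePresent) { reset(); return }
        if (!gazeOnScreen) { stage = Stage.PRESENT; armedWidget = null; return }

        val widget = widgetAt(gazeX, gazeY)
        if (widget == null) { stage = Stage.ATTENDING; armedWidget = null; return }

        // Gate 3: restart the fixation timer if gaze drifts too far or the
        // widget under the gaze point changes.
        val moved = hypot(gazeX - fixX, gazeY - fixY) > fixationRadiusPx
        if (widget != armedWidget || moved) {
            armedWidget = widget
            fixX = gazeX; fixY = gazeY
            fixStartMs = nowMs
            stage = Stage.ATTENDING
        }
        if (nowMs - fixStartMs >= fixationHoldMs) stage = Stage.FIXATING
    }

    // Gate 4: called by an IMU motion-gesture recognizer (e.g., a flick).
    // Returns the (widget, gesture) pair only while a fixation is held;
    // otherwise the gesture is ignored, preventing accidental activations.
    fun onMotionGesture(gesture: String): Pair<String, String>? =
        if (stage == Stage.FIXATING) armedWidget?.let { it to gesture } else null

    private fun reset() { stage = Stage.IDLE; armedWidget = null }
}

Structuring the gates this way also reflects the abstract's point about eliminating unnecessary computation: gaze hit-testing and gesture handling only occur once the cheaper presence and attention checks have passed.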

Reference

Kong, A., Ahuja, K., Goel, M. and Harrison, C. 2021. EyeMU Interactions: Gaze + IMU Gestures on Mobile Devices. In Proceedings of the 23rd ACM International Conference on Multimodal Interaction (ICMI '21), October 18–22, 2021. ACM, New York, NY, 577–585.
