Recognizing Words in Scenes with a Head-Mounted Eye-Tracker

Takuya Kobayashi, Takumi Toyama, Faisal Shafait, Andreas Dengel, Masakazu Iwamura, Koichi Kise
IAPR International Workshop on Document Analysis Systems, Gold Coast, Queensland, Australia, IEEE, 3/2012

Abstract:

Recognition of scene text using a hand-held camera is emerging as a hot topic of research. In this paper, we investigate the use of a head-mounted eye-tracker for scene text recognition. An eye-tracker detects the position of the user's gaze. Using this gaze information, we can provide the user with information about his region or object of interest in a ubiquitous manner. By combining a word recognition system with eye-tracking technology, we can therefore realize a service in which the user gazes at a word and promptly obtains information related to it. Such a service is convenient because the user need do nothing but gaze at words of interest. With a view to realizing this service, we experimentally evaluate the effectiveness of using the eye-tracker for word recognition. Initial results show a recognition accuracy of around 70% in our word recognition experiment, with an average computation time of less than one second per query image.
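The pipeline described in the abstract can be sketched roughly as follows: the eye-tracker supplies a gaze point, a region of interest around that point is cropped from the scene camera frame, and the crop is passed to a word recognizer. This is a minimal illustrative sketch, not the paper's implementation; the function name, the ROI size, and the frame representation are all assumptions.

```python
# Hypothetical sketch of gaze-driven word recognition (not from the paper).
# A frame is represented as a list of pixel rows; the ROI size is arbitrary.

def crop_gaze_roi(frame, gaze_x, gaze_y, half_w=100, half_h=40):
    """Return the sub-image of `frame` centered on the gaze point,
    clamped to the frame boundaries."""
    height = len(frame)
    width = len(frame[0]) if height else 0
    x0 = max(0, gaze_x - half_w)
    x1 = min(width, gaze_x + half_w)
    y0 = max(0, gaze_y - half_h)
    y1 = min(height, gaze_y + half_h)
    return [row[x0:x1] for row in frame[y0:y1]]

# Example: a 480x640 dummy frame with a gaze point near the top-left corner.
# The crop is clamped at the image border, so it is smaller than 200x80.
frame = [[0] * 640 for _ in range(480)]
roi = crop_gaze_roi(frame, gaze_x=30, gaze_y=20)
```

In a full system, the cropped region would then be handed to a scene-text word recognizer, whose result is looked up to retrieve the related information for the user.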

Files:

  Kobayashi-Eye-Tracker-DAS12.pdf

BibTeX:

@inproceedings{ KOBA2012,
	Title = {Recognizing Words in Scenes with a Head-Mounted Eye-Tracker},
	Author = {Takuya Kobayashi and Takumi Toyama and Faisal Shafait and Andreas Dengel and Masakazu Iwamura and Koichi Kise},
	BookTitle = {IAPR International Workshop on Document Analysis Systems},
	Address = {Gold Coast, Queensland, Australia},
	Month = {3},
	Year = {2012},
	Publisher = {IEEE}
}

Last modified: 30.08.2016