The Use of Eye Tracking for PC Energy Management

Vasily G. Moshnyaga
Department of Electronics Engineering and Computer Science
Fukuoka University, Japan
vasily@fukuoka-u.ac.jp

Abstract

This paper discusses a new application of eye tracking, namely power management, and outlines its implementation in a personal computer system. Unlike existing power management technology, which "senses" a PC user through the keyboard and/or mouse, our technology "watches" the user through a single camera. The technology tracks the user's eyes, keeping the display active only if the user looks at the screen. Otherwise it dims the display down or even switches it off to save energy. We implemented the technology in hardware and present the results of its experimental evaluation.

CR Categories: K.6 [Management of Computing and Information Systems]; K.6.4 [System Management]

Keywords: Eye tracking, applications, energy reduction

1 Introduction

With the wide popularity of user-centric applications, the role of smart and intelligent devices capable of monitoring human eyes is increasing. To date, eye tracking has been applied in areas such as HCI, security systems, health care, assistive technologies, ubiquitous computing, etc. [Morimoto 2004]. In this paper, we discuss a new application of eye tracking, namely power management, and outline its hardware implementation in a personal computer system.

Modern PCs burn half of their total energy in the display [Mahesri 2004]. To reduce energy consumption, the OS-based Advanced Configuration and Power Interface (ACPI) sets the display to low-power modes after specified periods of inactivity on the mouse and/or keyboard [ACPI 2004]. The efficiency of ACPI strongly depends on the inactivity intervals set by the user. On one hand, if the inactivity intervals are improperly short, e.g. 1 or 2 minutes, ACPI can be quite troublesome, shutting the display off when it must be on. On the other hand, if the inactivity intervals are set to be long, ACPI's efficiency decreases. Because modifying the intervals requires changing system settings, half of the world's PC users never adjust the power management of their PCs for fear that it will impede performance [Fujitsu 2007]. Those who do adjust it usually assign long intervals. As HP [Global Citizenship Report 2006] reveals, just enabling the low-power mode after 20 minutes of inactivity can save up to 381 kWh per PC per year. Clearly, PC power management must employ more efficient user monitoring.

Several approaches have been proposed to improve PC user monitoring, among them extending the touch-pad function for user presence detection [Park 1999] and placing thermal sensors around the display [Dai 2003]. Despite their differences, all these approaches have one drawback in common: they ignore the viewer's attention. Paradoxically, while the display is needed only for our eyes, none of the existing approaches, to our knowledge, takes them into account. Neither ACPI nor temperature sensors nor advanced touchpad screeners can distinguish whether the user looks at the screen or not. As a result, they may either switch the display off inappropriately (i.e. when the user looks at the screen without pressing a key) or keep the display active when it is not needed.

We propose to apply eye tracking to PC energy management. Unlike existing technologies, which "sense" a PC user through the keyboard, touchpad and/or mouse, our technology "watches" the user through a single camera. More precisely, it tracks the user's eyes to detect whether he or she looks at the screen, and based on that it changes the display brightness and power consumption.

2 The Proposed Technology

2.1 An Overview

The proposed technology is based on the following assumptions:

A. The PC is equipped with a color video camera. The camera is located at the top of the display. When the user looks at the display, he or she faces the camera frontally.

B. The display has a number of backlight intensity levels, with the highest level corresponding to the largest power consumption and the lowest level to the smallest, respectively. The highest intensity level is enabled either initially or whenever the user looks at the screen.

The idea is simple: based on the camera readings, determine the display power mode. If no human face is detected in the current video frame, the display is switched off. Otherwise, we track the user's eye gaze. If the gaze has been off the screen for more than N consecutive frames, the current backlight luminance is dimmed down to the next level. Any on-screen gaze reactivates the initial backlight luminance by moving the display into the powered-up mode. However, if no on-screen gaze has been detected for more than N frames and the backlight luminance has already reached the lowest level, the display enters standby mode. Returning from either standby or off mode is done by pushing the ON button. Below we describe the technology in detail.

*The work was supported by The Ministry of Education, Culture, Sports, Science and Technology of Japan under the Knowledge Cluster Initiative (The Second Stage) and Grant-in-Aid for Scientific Research (C) No. 21500063.

Copyright © 2010 by the Association for Computing Machinery, Inc. ETRA 2010, Austin, TX, March 22-24, 2010. 978-1-60558-994-7/10/0003 $10.00

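The dim-down/reactivate policy described in Section 2.1 can be sketched as a small state machine. This is a minimal illustration under our own assumptions: the class and method names, the number of backlight levels, and the default N are hypothetical, not taken from the paper's hardware.

```python
OFF, STANDBY = "off", "standby"

class DisplayPowerManager:
    """Sketch of the gaze-driven backlight policy (hypothetical parameters)."""

    def __init__(self, num_levels=4, n_frames=10):
        self.max_level = num_levels - 1   # brightest backlight level
        self.n = n_frames                 # off-screen frames per dim step
        self.level = self.max_level       # current backlight level
        self.state = "active"
        self.off_screen = 0               # consecutive frames without on-screen gaze

    def on_frame(self, face_present, gaze_on_screen):
        if self.state in (OFF, STANDBY):  # only the ON button leaves these modes
            return self.state
        if not face_present:              # no face in the frame: switch display off
            self.state = OFF
            return self.state
        if gaze_on_screen:                # any on-screen gaze restores full brightness
            self.level, self.off_screen, self.state = self.max_level, 0, "active"
            return self.state
        self.off_screen += 1
        if self.off_screen > self.n:
            self.off_screen = 0
            if self.level > 0:            # dim the backlight one level
                self.level -= 1
            else:                         # lowest level exhausted: enter standby
                self.state = STANDBY
        return self.state

    def press_on(self):                   # the ON button restores normal operation
        self.level, self.off_screen, self.state = self.max_level, 0, "active"
```

Feeding the manager per-frame detector outputs steps the backlight down one level after every N off-screen frames; once the lowest level is reached it drops to standby, which (like the off mode) only the ON button can leave.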
2.2 User Presence Detection

The goal of this task is to determine from the camera readings whether or not the user is currently present in front of the display. To detect the user's presence, we first localize the face search by applying background subtraction and skin-color segmentation to the RGB representation of the input image. Skin is defined by the following criteria [Douxchamps 2008]: 0.55 < R < 0.85, 1.15 < R/G < 1.19, 1.15 < R/B < 1.5 and 0.6 < (R+G+B) < 1.8. To accelerate the face-area extraction, two additional filters are used. The first one limits the size of the head to a reasonable range. The second one verifies that the face contains a minimum of 25% skin-colored pixels. Thus, if the total number of pixels in the derived face area exceeds a given threshold, the user is assumed present.

Figure 1: Illustration of the SSR filter (a rectangle with top-left corner (x0,y0) divided into a 2x3 grid of segments numbered 1-6, with internal corners (x1,y1), (x2,y1), (x1,y2), (x2,y2))

2.3 Eye-Gaze Detection

The eye-gaze detector implements the algorithm proposed by [Kawato 2005], which scans the Six-Segment Rectangular (SSR) filter over the integral representation of the input image to find the Between-The-Eyes (BTE) pattern of a human face (Fig.1) and then searches regions 1 and 3, to the left and right of the BTE pattern, to locate the eyes. The algorithm does not depend on illumination, face occlusion or eye closure. It is more stable, robust and less complex than other eye-tracking formulations. However, it is still very computationally demanding. In a quest to locate all faces in an image (without restriction on face size, motion or rotation), the algorithm scans the whole image six times, performing over 28M operations per (640x480) frame. Though such a full search might be necessary in some applications, it is redundant when tracking the eyes of a PC user.

In our eye-tracking application we can assume that:
1. The target object is a single PC user. The user sits in front of the PC at a relatively close distance of 50-70 cm.
2. The user's motion is slow relative to the frame rate.
3. The background is stable and constant.

Based on these assumptions, we apply the following algorithmic optimizations to reduce eye-tracking complexity [Yamamoto and Moshnyaga 2009]:

• Fixed SSR filter size: when the user is 50-70 cm from the camera, a BTE interval of 55 pixels and a filter size ratio of 2:3 ensure minimal computational complexity at an almost 100% detection rate.
• Single SSR filter scan: this follows from the single-user assumption and the fixed SSR filter size.
• Pixel displacement of the SSR filter during the scan: experiments showed that the computational complexity decreases by a factor of 3 for a displacement of 2, and by a factor of 4.5 for a displacement of 3, without affecting the detection rate of the original (full-scan) algorithm.
• Low frame processing rate (5-10 fps): because the user's motion is very slow, high processing rates are redundant.

Figure 2: The modified eye-tracking algorithm (flowchart: for each frame, the search area S is set around the BTE if its location is known, otherwise to the whole image; the integral image of S is computed, the SSR filter is run to find BTE candidates, each candidate is confirmed by SVM, the eyes are located, and the BTE and eye locations are saved)

Fig.2 shows the modified algorithm. For the first frame, or any frame in which the search for a BTE candidate was unsuccessful, we search the image area reduced by background and skin-color extraction; otherwise the search is restricted to a small area (S) of ±8 pixels around the previously located BTE pattern. For the chosen area, the algorithm first transforms the green component of the corresponding image into an integral image representation and then scans it with the SSR filter to select the BTE candidate. If a BTE candidate is found, the system takes it as a starting point to locate the eyes. If the eyes have been detected, the user is assumed to be looking at the screen; otherwise not. If no BTE candidate has been found, the user is considered to be not looking at the screen.

To detect a BTE pattern in an image, we scan the SSR filter over the search area S in a row-first fashion and at each location compare the integral sums of the rectangular segments corresponding to the eyes, cheeks and nose (i.e. 1 and 2, 1 and 4, 3 and 2, and 3 and 6) as follows:

    Sum(1) < Sum(2) & Sum(1) < Sum(4)
    Sum(3) < Sum(2) & Sum(3) < Sum(6)        (2)

If the above criteria are satisfied, the SSR is considered a candidate for the BTE pattern (i.e. a face), and two local minimum (i.e. dark) points are extracted from regions 1 and 3 of the SSR for the left and right eyes, respectively.

The eye localization procedure is organized as a scan over the green-plane representation of regions 1 and 3 for a continuous segment of dark pixels (i.e. those whose value is lower than the threshold k). During the search for eyes, we ignore 2 pixels at the border of the regions to avoid effects of eyebrows, hair and beard. Also, because the eyebrows have almost the same grey level as the eyes, the search starts from the lowest positions of regions 1 and 3. Similarly to [Kawato 2000], we assume that eyes are located if the distance between the located eyes (D) and the angle (A) at the center point of the BTE area 2 (see Fig.3, left) satisfy the following: 30 < D < 42 and 115° < A < 180°. If both eyes are detected, the user's gaze is considered to be on screen. The eye positions found in the current frame are then used to reduce the complexity of processing the successive frames. The search in the next frame is limited to a small region, which spans 8 pixels in the vertical and horizontal directions around the eye points of the current frame.
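The skin criteria of Section 2.2 can be checked per pixel as below. This is a sketch under our own assumption that the RGB channels are normalized to [0, 1] (e.g. 8-bit values divided by 255), which makes the paper's thresholds consistent; the `user_present` helper and its `min_pixels` threshold are hypothetical illustrations of the 25%-skin filter.

```python
def is_skin(r, g, b):
    """Skin test from [Douxchamps 2008] on RGB channels scaled to [0, 1]."""
    eps = 1e-9                      # guard against division by zero
    return (0.55 < r < 0.85
            and 1.15 < r / (g + eps) < 1.19
            and 1.15 < r / (b + eps) < 1.5
            and 0.6 < (r + g + b) < 1.8)

def user_present(face_area, min_skin_ratio=0.25, min_pixels=400):
    """Rough presence check: the candidate face area must be large enough
    and contain at least 25% skin-colored pixels (min_pixels is a
    hypothetical threshold, not from the paper)."""
    skin = sum(1 for p in face_area if is_skin(*p))
    return (len(face_area) >= min_pixels
            and skin / max(len(face_area), 1) >= min_skin_ratio)
```

Note how narrow the R/G band is (1.15-1.19): the test keys on the characteristic red/green ratio of skin rather than on absolute brightness.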



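The integral-image arithmetic behind the SSR test of Eq. (2) can be sketched as follows. The segment layout (a 2x3 grid of equal cells) and the function names are simplifying assumptions for illustration; a real scan would slide the fixed-size window over the search area S in a row-first fashion.

```python
def integral_image(img):
    """Summed-area table: ii[y][x] = sum of img[0..y][0..x]."""
    h, w = len(img), len(img[0])
    ii = [[0] * w for _ in range(h)]
    for y in range(h):
        row = 0
        for x in range(w):
            row += img[y][x]
            ii[y][x] = row + (ii[y - 1][x] if y else 0)
    return ii

def rect_sum(ii, x, y, w, h):
    """Sum of the w*h rectangle with top-left corner (x, y), in O(1)."""
    a = ii[y + h - 1][x + w - 1]
    b = ii[y - 1][x + w - 1] if y else 0
    c = ii[y + h - 1][x - 1] if x else 0
    d = ii[y - 1][x - 1] if x and y else 0
    return a - b - c + d

def is_bte_candidate(ii, x, y, sw, sh):
    """Eq. (2): the eye segments 1 and 3 must be darker than the
    between-the-eyes segment 2 and the cheek segments 4 and 6.
    Segments are modeled as a 2x3 grid of sw*sh cells at (x, y)."""
    s = {n: rect_sum(ii, x + ((n - 1) % 3) * sw, y + ((n - 1) // 3) * sh, sw, sh)
         for n in range(1, 7)}
    return (s[1] < s[2] and s[1] < s[4] and
            s[3] < s[2] and s[3] < s[6])
```

Because every segment sum costs four lookups regardless of segment size, the per-location cost of the SSR test is constant once the integral image of S is built.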
Figure 3: An illustration of the eye detection heuristics (left) and the search area reduction (right)

Fig. 3 (right) demonstrates the search area reduction by our algorithm: the dashed line shows the area defined by background extraction; the dotted line depicts the area obtained by skin-color segmentation; the plain (dark) line shows the area around the BTE pattern found in the previous image frame; white crosses show the computed locations of the eyes.

Figure 4: System overview (the user tracking unit, comprising the user presence detector and the eye-gaze detector, receives the (R,G,B) camera stream and drives, through signals u0 and u1, the voltage converter that feeds the high-voltage inverter of the display's backlight lamp)

Table 1: Results of evaluation on test sequences

    Test     Frames   True Pos.  False Pos.  True Neg.  False Neg.  Accuracy (%)
    1        151      127        0           6          18          88
    2        240      149        1           65         25          89
    3        100      74         0           16         10          90
    4        180      142        4           18         24          84
    Average  167      123        1           26         19          88

3 Implementation

We implemented the proposed PC display power management system in hardware. Fig.4 outlines the block diagram of the system. The user tracking unit receives an RGB color image and outputs two logic signals, u1 and u0. If the user is detected in the image, the signal u0 is set to 1; otherwise it is 0. A zero value of u0 forces the voltage converter to shrink the backlight supply voltage to 0 Volts, dimming the display off. If the eye-gaze detector determines that the user looks at the screen, it sets u1=1. When both u0 and u1 are 1, the display operates as usual. If the user's gaze has been off the screen for more than N consecutive frames, u1 becomes 0. If u0=1 and u1=0, the voltage converter lowers the input voltage (Vb) of the high-voltage inverter by ∆V. This voltage drop lowers the backlight luminance and so shrinks the power consumption of the display. Any on-screen gaze in this low-power mode reactivates the initial backlight luminance and moves the display into normal mode. However, if u0=0 and the backlight luminance has already reached the lowest level, the display is turned off.

The user-tracking unit was realized on a single Xilinx FPGA board connected to a VGA camera through a parallel I/O interface; see [Moshnyaga et al. 2009] for details. The unit operates at a 48 MHz frequency and a 3.3 V supply voltage and provides eye tracking at a 20 fps rate. Due to capacity limitations of the on-chip SRAM memory, input images were 160x120 pixels in size. The SSR filter was 30x20 pixels in size. The total power consumption of the design was 150 mW, which is 35 times less than a software implementation on a desktop PC [Moshnyaga et al. 2009].

Figure 5: Examples of correct eye-detection

4 Experimental Evaluation

4.1 Eye-Detection Accuracy

To evaluate the accuracy of the gaze detector, we ran four different tests, each conducted by a different user. The users were free to look at the camera/display, read from materials on the table, type text, wear eyeglasses, move, gesticulate, or even leave the PC whenever they wanted. Fig.5 illustrates the detection results on 4 images. The + marks depict the positions where the system assumes the eyes to be. As we see, even though the lighting conditions of the faces vary, the results are correct. Ordinary pairs of glasses (see Fig.5, top row) have no bad effect on the performance for frontal faces. In some face orientations, however, the frame of a pair of glasses can hide part of the eyeball, causing the system to lose the eye. Sometimes it also takes an eyebrow or hair as an eye and tracks it in the following frames.

Table 1 summarizes the results. Here, the second column depicts the total number of frames considered in the test; the 'True Pos.'/'False Pos.' and 'True Neg.'/'False Neg.' columns give the numbers of true and false detections for the positive and negative cases, respectively. The false positives correspond to cases in which one of the eyes is tracked on the eyebrow or on the hair near the eye. The false negatives reflect cases in which the user gazed off the screen (both eyes are tracked on the eyebrows). The Accuracy column shows the ratio of true decisions to the total number of decisions made. As the tests showed, the eye-tracking accuracy of the proposed system is quite high (88% on average).

4.2 Energy Reduction Efficiency

Next, we estimated the energy efficiency of the proposed camera-based power management system by measuring the total power consumption taken from the wall by the system itself and by the 17" IO-DATA TFT LCD display controlled by the system. Fig.6 profiles the results measured per frame on a 100-second (2000-frame) test. In the test, the user was present in front of the display (frames 1-299, 819-1491, 1823-2001); moved a little away from the display but remained in the camera view (frames 1300 to 1491); and stepped away from the PC, disappearing from the camera (frames 300-818, 1492-1822).



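The savings figure reported in this section can be checked by integrating a per-frame power trace against a constant-power baseline. This is a sketch under our own assumptions: the trace below is hypothetical, the flat 35 W baseline stands in for a display that stays fully on, and the 0.05 s frame time matches the 100-second, 2000-frame test.

```python
def energy_wh(power_w, frame_time_s):
    """Energy of a per-frame power trace, in watt-hours."""
    return sum(power_w) * frame_time_s / 3600.0

def savings_percent(trace_w, baseline_w, frame_time_s=0.05):
    """Percent energy saved relative to a constant-power baseline
    (e.g. a display kept fully on for the whole test)."""
    e = energy_wh(trace_w, frame_time_s)
    e0 = energy_wh([baseline_w] * len(trace_w), frame_time_s)
    return 100.0 * (e0 - e) / e0

# Hypothetical trace: 1000 frames at full brightness, 1000 frames dimmed.
trace = [35.0] * 1000 + [15.6] * 1000
```

The frame time cancels in the ratio, so the savings depend only on how long the display spends at each power level, which is exactly what the gaze-driven dimming controls.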
40
                                                                                              The total power overhead of the system is 960mW. Even
                                    ACPI                                                    though the system takes a little more power than ACPI (see
              35
              30
                       Gaze                            Gaze                                 horizontal line in Fig.6) in active mode, it saves 36% of the total
                         on                            on
                       screen                          screen                               energy consumed by the display on this short test. In environ-
              25
                                                                                            ments when users frequently detract their attention from the
  Power (W)



                                                                    Gaze
                                                                  off screen
              20                                                                            screen or leave computers unattended (e.g. school, university,
              15                                                                            office) the energy savings could be significant.
                                No user                            No user
              10                                  User is

              5
                                                  present
                                                                                            5     Conclusion
              0
                   1              501      1001                 1501           2001
                                                                                            In this paper we presented a novel eye-tracking application,
                                                                                            namely display power management, and outlined an implemen-
                                           Frame
                                                                                            tation technology which made the application viable. Experi-
                   Figure 6: Display power consumption per frame                            ments showed that the camera-based display power management
                                                                                            is more efficient than the currently used ACPI method due to its
                                                                                            ability to adjust the display power adaptively to the viewer be-
                                                                                            havior. The application-specific algorithm optimizations and
                                                                                            eye-tracking implementation in hardware allowed us reduce the
                                                                                            power overhead below 1W yet satisfying real-time and high
                                                                                            accuracy requirements of the application. However, this power
                                                                                            can be reduced even further should custom design be performed.
                                                                                               In the current work we restricted ourselves to a simple case of
                                                                                            a singular user monitoring. However, when talking about moni-
                                                                                            toring in general, some critical issues arise. For instance, how
                                                                                            should the technology behave when handling more than one
                                                                                            person looking at screen? The user might not look at screen
                                                                                            while the others do. Concerning this point, we believe that a
                                                                                            feasible solution is to keep the display active while there is
                                                                                            someone looking at the screen. We are currently investigating
                                                                                            the issue as well as the influence of camera positioning, user
                                                                                            gender/race etc.

system was set to step down from the current power level if the eye-gaze off the screen was continuously detected for more than 15 frames (i.e., almost 1 s). The ACPI line shows the power consumption level of the ACPI. We see that our technology is very effective. It changes the display power according to the user's behavior, dimming the display when the user's gaze is off the screen and powering the display up when the user looks at it. Changing the brightness from one power level to another in our system takes only 20 ms, which is unobservable to the user. Fig. 7 shows screenshots of the display and the corresponding power consumption level (see the numbers displayed in the lower-right corner of the screenshots; the second row from the bottom shows the power).

Figure 7: Screenshots of the display and corresponding power consumption: when the user looks at the screen, the screen is bright and power is 35 W (top picture); otherwise the screen is dimmed and power is 15.6 W (bottom picture).

References

ACPI. 2004. Advanced Configuration and Power Interface Specification, Rev. 3.0, Sept. http://www.acpi.info/spec.htm

DAI, X., AND RAYCHANDRAN, K. 2003. Computer screen power management through detection of user presence. US Patent 6650322.

DOUXCHAMPS, D., AND CAMPBELL, N. 2008. Robust real time face tracking for the analysis of human behavior. In Machine Learning for Multimodal Interaction, LNCS 4892, 1-10.

Fujitsu-Siemens Corp. 2007. Energy savings with personal computers. http://www.fujitsu-simens.nl/aboutus/sor/energy_saving/prof_desk_prod.html

Hewlett-Packard Co. 2006. Global Citizenship Report. www.hp.com/hpinfo/globalcitizenship/gcreport/pdf/hp2006gcreport_lowres.pdf

KAWATO, S., AND OHYA, J. 2000. Two-step approach for real-time eye-tracking with a new filtering technique. Proc. IEEE SMC, 1366-1371.

KAWATO, S., TETSUTANI, N., AND OSAKA, K. 2005. Scale-adaptive face detection and tracking in real time with SSR filters and support vector machine. IEICE Trans. Information & Systems, E88-D(12), 2857-2863.

MAHESRI, A., AND VARDHAN, V. 2005. Power consumption breakdown on a modern laptop. Proc. Power-Aware Computing Systems, LNCS 3471, 165-180.

MORIMOTO, C., AND MIMICA, M. R. M. 2004. Eye gaze tracking techniques for interactive applications. Computer Vision and Image Understanding, 98(1), 4-24.

MOSHNYAGA, V. G., HASIMOTO, K., SUETSUGU, T., AND HIGASHI, S. 2009. A hardware implementation of the user-centric display energy management. Proc. PATMOS 2009, LNCS 5953, 56-65.

PARK, W. I. 1999. Power saving in a portable computer. EU Patent EP0949557.

YAMAMOTO, S., AND MOSHNYAGA, V. G. 2009. Algorithm optimizations for low-complexity eye tracking. Proc. IEEE SMC, 18-22.




Moshnyaga The Use Of Eye Tracking For Pc Energy Management

The Use of Eye Tracking for PC Energy Management

Vasily G. Moshnyaga
Department of Electronics Engineering and Computer Science
Fukuoka University, Japan
vasily@fukuoka-u.ac.jp

Abstract

This paper discusses a new application of eye tracking, namely power management, and outlines its implementation in a personal computer system. Unlike existing power management technology, which "senses" a PC user through the keyboard and/or mouse, our technology "watches" the user through a single camera. The technology tracks the user's eyes, keeping the display active only if the user looks at the screen. Otherwise it dims the display down or even switches it off to save energy. We implemented the technology in hardware and present the results of its experimental evaluation.

CR Categories: K.6 [Management of Computing and Information Systems]; K.6.4 [System Management]

Keywords: Eye tracking, applications, energy reduction

1 Introduction

With the wide popularity of user-centric applications, the role of smart and intelligent devices capable of monitoring human eyes is increasing. To date, eye tracking has been applied in areas such as HCI, security systems, health care, assistive technologies, and ubiquitous computing [Morimoto and Mimica 2004]. In this paper, we discuss a new application of eye tracking, namely power management, and outline its hardware implementation in a personal computer system.

Modern PCs burn half of their total energy in the display [Mahesri and Vardhan 2005]. To reduce energy consumption, the OS-based Advanced Configuration and Power Interface (ACPI) sets the display to low-power modes after specified periods of inactivity on the mouse and/or keyboard [ACPI 2004]. The efficiency of ACPI strongly depends on the inactivity intervals set by the user. On the one hand, if the intervals are too short, e.g. 1 or 2 minutes, ACPI can be quite troublesome, shutting the display off when it must be on. On the other hand, if the intervals are set to be long, ACPI's efficiency decreases. Because modifying the intervals requires changing system settings, half of the world's PC users never adjust the power management of their PCs for fear that it will impede performance [Fujitsu-Siemens 2007]. Those who do adjust it usually assign long intervals. As HP reveals [Global Citizenship Report 2006], just enabling the low-power mode after 20 minutes of inactivity can save up to 381 kWh per PC per year. Clearly, PC power management must employ more efficient user monitoring.

Several approaches have been proposed to improve PC user monitoring. Extending the touch-pad function for user presence detection [Park 1999] and placing thermal sensors around the display [Dai and Raychandran 2003] are some of them. Despite their differences, all these approaches have one drawback in common: they ignore the viewer's attention. Paradoxically, while the display is needed only for our eyes, none of the existing approaches, to our knowledge, takes them into account. Neither ACPI nor temperature sensors nor advanced touch-pad screeners can distinguish whether the user looks at the screen or not. As a result, they may either switch the display off inappropriately (i.e., when the user looks at the screen without pressing a key) or keep the display active when it is not needed.

We propose to apply eye tracking to PC energy management. Unlike existing technologies, which "sense" a PC user through the keyboard, touch pad and/or mouse, our technology "watches" the user through a single camera. More precisely, it tracks the user's eyes to detect whether he or she looks at the screen and, based on that, changes the display brightness and power consumption.

* The work was supported by The Ministry of Education, Culture, Sports, Science and Technology of Japan under the Knowledge Cluster Initiative (The Second Stage) and Grant-in-Aid for Scientific Research (C) No. 21500063.

Copyright © 2010 by the Association for Computing Machinery, Inc. ETRA 2010, Austin, TX, March 22-24, 2010. 978-1-60558-994-7/10/0003.

2 The Proposed Technology

2.1 An Overview

The proposed technology is based on the following assumptions:

A. The PC is equipped with a color video camera. The camera is located at the top of the display, so that when the user looks at the display, he or she faces the camera frontally.

B. The display has a number of backlight intensity levels, with the highest level corresponding to the largest power consumption and the lowest level to the smallest. The highest intensity level is enabled either initially or whenever the user looks at the screen.

The idea is simple: based on the camera readings, determine the display power mode. If no human face is detected in the current video frame, the display is switched off. Otherwise, we track the user's eye gaze. If the gaze has been off the screen for more than N consecutive frames, the current backlight luminance is dimmed down to the next level. Any on-screen gaze reactivates the initial backlight luminance by moving the display into the power-up mode. However, if no on-screen gaze has been detected for more than N frames and the backlight luminance has already reached the lowest level, the display enters the standby mode. Returning from either the standby or the off mode is done by pushing the ON button. Below we describe the technology in detail.
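The overview above maps naturally onto a small per-frame state machine. The sketch below is illustrative Python, not the paper's hardware implementation; the class name, the number of backlight levels and the default value of N are our own assumptions for the example.

```python
# Illustrative sketch of the per-frame power-management policy described
# above. The class name, the number of backlight levels and the default N
# are assumptions for illustration, not values from the paper.

OFF, STANDBY = -2, -1  # sentinel modes below the dimmest backlight level


class DisplayPolicy:
    def __init__(self, levels=4, n_frames=15):
        self.levels = levels       # backlight levels: levels-1 (brightest) .. 0
        self.n = n_frames          # N consecutive off-screen frames per dim step
        self.level = levels - 1    # assumption B: start at full brightness
        self.off_count = 0

    def update(self, face_detected, gaze_on_screen):
        """Process one camera frame and return the resulting display mode."""
        if self.level in (OFF, STANDBY):
            return self.level      # leaving standby/off requires the ON button
        if not face_detected:
            self.level = OFF       # no face in the frame: switch the display off
            return self.level
        if gaze_on_screen:
            self.level = self.levels - 1   # on-screen gaze restores brightness
            self.off_count = 0
            return self.level
        self.off_count += 1        # gaze is off the screen
        if self.off_count >= self.n:
            self.off_count = 0
            self.level -= 1        # dim one backlight step
            if self.level < 0:
                self.level = STANDBY   # dimmest level exhausted: standby
        return self.level
```

With three levels and N = 2, six consecutive off-screen frames walk the display down through both dimming steps into standby, while a single on-screen glance at any earlier point restores full brightness.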
2.2 User Presence Detection

The goal of this task is to determine from the camera readings whether or not the user is currently present in front of the display. To detect the user's presence, we first localize the face search by applying background subtraction and skin-color segmentation to the RGB representation of the input image. Skin is defined by the following criteria [Douxchamps and Campbell 2008]: 0.55<R<0.85, 1.15<R/G<1.19, 1.15<R/B<1.5 and 0.6<(R+G+B)<1.8. To accelerate the face-area extraction, two additional filters are used. The first limits the size of the head to a reasonable range. The second verifies that the face contains a minimum of 25% skin-colored pixels. If the total number of pixels in the derived face area exceeds a given threshold, the user is assumed present.

Figure 1: Illustration of the SSR filter (six segments, numbered 1-3 in the top row and 4-6 in the bottom row)

2.3 Eye-Gaze Detection

The eye-gaze detector implements the algorithm proposed by [Kawato et al. 2005], which scans the Six-Segment Rectangular (SSR) filter over the integral representation of the input image to define the Between-The-Eyes (BTE) pattern of the human face (Fig. 1) and then searches regions 1 and 3, on the left and right sides of the BTE pattern, to locate the eyes. The algorithm does not depend on illumination, face occlusion or eye closure. It is more stable, robust and less complex than other eye-tracking formulations. However, it is still very computationally demanding. In its quest to locate all faces in an image (without restrictions on face size, motion and rotation), the algorithm scans the whole image six times, performing over 28M operations per 640x480 frame. Though such a full search might be necessary in some applications, it is redundant when tracking the eyes of a PC user.

In our eye-tracking application we can assume that:

1. The target object is a single PC user. The user sits in front of the PC at a relatively close distance of 50-70 cm.
2. The user's motion is slow relative to the frame rate.
3. The background is stable and constant.

Based on these assumptions, we apply the following algorithmic optimizations to reduce the eye-tracking complexity [Yamamoto and Moshnyaga 2009]:

• Fixed SSR filter size: when the user is 50-70 cm from the camera, a BTE interval of 55 pixels and a filter size ratio of 2:3 ensure minimal computational complexity at an almost 100% detection rate.
• Single SSR filter scan: this follows from the single-user assumption and the fixed SSR filter size.
• Pixel displacement of the SSR filter during the scan: experiments showed that the computational complexity decreases by a factor of 3 for a displacement of 2 pixels, and by a factor of 4.5 for a displacement of 3, without affecting the detection rate of the original (full-scan) algorithm.
• Low frame processing rate (5-10 fps): because the user's motion is very slow, high processing rates are redundant.

Figure 2: The modified eye-tracking algorithm (if the BTE location is unknown, the search area is the whole image reduced by background and skin-color extraction; otherwise it is a small area around the previous BTE location; BTE candidates found by the SSR filter are confirmed by SVM before the eyes are located and saved)

Fig. 2 shows the modified algorithm. For the first frame, or for any frame in which the search for a BTE candidate was unsuccessful, we search the image area reduced by background and skin-color extraction; otherwise the search is restricted to a small area (S) of ±8 pixels around the previously located BTE pattern. For the chosen area, the algorithm first transforms the green component of the image into its integral-image representation and then scans it with the SSR filter to select a BTE candidate. If a BTE candidate is found, the system takes it as a starting point to locate the eyes. If the eyes are detected, the user is assumed to be looking at the screen; otherwise not. If no BTE candidate is found, the user is considered to be not looking at the screen.

To detect a BTE pattern in an image, we scan the SSR filter over the search area S in row-first fashion and at each location compare the integral sums of the rectangular segments corresponding to the eyes, cheeks and nose (i.e., 1 and 2, 1 and 4, 3 and 2, and 3 and 6) as follows:

Sum(1) < Sum(2) & Sum(1) < Sum(4)
Sum(3) < Sum(2) & Sum(3) < Sum(6)

If these criteria are satisfied, the SSR position is considered a candidate for the BTE pattern (i.e., a face), and two local-minimum (i.e., dark) points are extracted from regions 1 and 3 of the SSR for the left and right eyes, respectively.

The eye localization procedure is organized as a scan over the green-plane representation of regions 1 and 3 for a continuous segment of dark pixels (i.e., pixels whose value is lower than a threshold k). During the search for eyes, we ignore 2 pixels at the border of the regions to avoid the effects of eyebrows, hair and beard. Also, because the eyebrows have almost the same grey level as the eyes, the search starts from the lowest positions of regions 1 and 3. Similarly to [Kawato and Ohya 2000], we assume that the eyes are located if the distance between the located eyes (D) and the angle (A) at the center point of BTE area 2 (see Fig. 3, left) satisfy: 30 < D < 42 and 115° < A < 180°. If both eyes are detected, the user's gaze is considered to be on the screen. The eye positions found in the current frame are then used to reduce the complexity of processing the successive frames: the search in the next frame is limited to a small region that spans 8 pixels in the vertical and horizontal directions around the eye points of the current frame.
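For concreteness, the SSR candidate test can be sketched in Python. The 3x2 segment grid and the strict inequalities follow the criteria above, and the 30x20 default filter size matches the hardware implementation described later; the helper names and the synthetic test image are our own illustration, not the paper's code or data.

```python
# Sketch of the SSR (Six-Segment Rectangular) filter test. Segment layout:
# 1 2 3 (top row), 4 5 6 (bottom row). Helper names are illustrative.

def integral_image(img):
    """ii[y][x] = sum of img over rows < y and cols < x (padded with zeros)."""
    h, w = len(img), len(img[0])
    ii = [[0] * (w + 1) for _ in range(h + 1)]
    for y in range(h):
        row_sum = 0
        for x in range(w):
            row_sum += img[y][x]
            ii[y + 1][x + 1] = ii[y][x + 1] + row_sum
    return ii


def rect_sum(ii, x, y, w, h):
    """Sum of pixels in the w*h rectangle at (x, y), via 4 table lookups."""
    return ii[y + h][x + w] - ii[y][x + w] - ii[y + h][x] + ii[y][x]


def is_bte_candidate(ii, x, y, fw=30, fh=20):
    """Apply the SSR criteria at filter position (x, y):
    Sum(1) < Sum(2) & Sum(1) < Sum(4), and Sum(3) < Sum(2) & Sum(3) < Sum(6).
    Segments 1 and 3 (eyes) must be darker than the nose bridge and cheeks."""
    sw, sh = fw // 3, fh // 2  # segment width and height
    s = {k: rect_sum(ii, x + cx * sw, y + cy * sh, sw, sh)
         for k, (cx, cy) in enumerate(
             [(0, 0), (1, 0), (2, 0), (0, 1), (1, 1), (2, 1)], start=1)}
    return s[1] < s[2] and s[1] < s[4] and s[3] < s[2] and s[3] < s[6]
```

A synthetic 30x20 image with dark segments 1 and 3 (the eye regions) against a bright face passes the test, while a uniform image fails it, since the inequalities are strict.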
3 Implementation

We implemented the proposed PC display power management system in hardware. Fig. 4 outlines the block diagram of the system. The user-tracking unit receives an RGB color image and outputs two logic signals, u1 and u0. If the user is detected in the image, the signal u0 is set to 1; otherwise it is 0. A zero value of u0 forces the voltage converter to shrink the backlight supply voltage to 0 V, dimming the display off. If the eye-gaze detector determines that the user looks at the screen, it sets u1 = 1. When both u0 and u1 are 1, the display operates as usual. If the user's gaze has been off the screen for more than N consecutive frames, u1 becomes 0. If u0 = 1 and u1 = 0, the voltage converter lowers the input voltage (Vb) of the high-voltage inverter by ∆V. This voltage drop lowers the backlight luminance and so shrinks the power consumption of the display. Any on-screen gaze in this low-power mode reactivates the initial backlight luminance and moves the display into the normal mode. However, if u0 = 0 and the backlight luminance has already reached the lowest level, the display is turned off.

Figure 3: An illustration of the eye detection heuristics (left) and the search area reduction (right)

Figure 4: System overview (camera, user-tracking unit with user presence detector and eye detector, voltage converter, high-voltage inverter, backlight lamp, display)

The user-tracking unit was realized on a single Xilinx FPGA board connected to a VGA camera through a parallel I/O interface; see [Moshnyaga et al. 2009] for details. The unit operates at a 48 MHz frequency and a 3.3 V voltage and provides eye tracking at a 20 fps rate. Due to capacity limitations of the on-chip SRAM memory, input images were 160x120 pixels in size; the SSR filter was 30x20 pixels. The total power consumption of the design was 150 mW, which is 35 times less than a software implementation on a desktop PC [Moshnyaga et al. 2009].

4 Experimental Evaluation

4.1 Eye-Detection Accuracy

To evaluate the accuracy of the gaze detector, we ran four different tests, each conducted by a different user. The users were free to look at the camera/display, read from materials on the table, type text, wear eyeglasses, move, gesticulate or even leave the PC whenever they wanted. Fig. 5 illustrates the detection results on 4 images. The + marks depict the positions where the system assumes the eyes to be. As we see, even though the lighting conditions of the faces vary, the results are correct. Ordinary pairs of glasses (see Fig. 5, top row) have no adverse effect on the performance for frontal faces. In some face orientations, however, the frame of a pair of glasses can hide part of the eyeball, causing the system to lose the eye. Sometimes it also takes an eyebrow or hair as an eye and tracks it in the following frames.

Figure 5: Examples of correct eye detection

Fig. 3 (right) demonstrates the search-area reduction by our algorithm: the dashed line shows the area defined by background extraction; the dotted line depicts the area obtained by skin-color segmentation; the plain (dark) line shows the area around the BTE pattern found in the previous image frame; white crosses show the computed locations of the eyes.

Table 1: Results of evaluation on test sequences

Test    | Frames | True pos. | False pos. | True neg. | False neg. | Accuracy (%)
1       | 151    | 127       | 0          | 6         | 18         | 88
2       | 240    | 149       | 1          | 65        | 25         | 89
3       | 100    | 74        | 0          | 16        | 10         | 90
4       | 180    | 142       | 4          | 18        | 24         | 84
Average | 167    | 123       | 1          | 26        | 19         | 88

Table 1 summarizes the results. The second column gives the total number of frames considered in each test; the columns marked 'True' and 'False' give the number of true and false detections for the positive and negative cases, respectively. The false positives correspond to cases in which one of the eyes is tracked on the eyebrow or on the hair near the eye. The false negatives reflect cases in which the user gazed off the screen (both eyes are tracked on the eyebrows). The accuracy column shows the ratio of true decisions to the total number of decisions made. As the tests showed, the eye-tracking accuracy of the proposed system is quite high (88% on average).

4.2 Energy Reduction Efficiency

Next, we estimated the energy efficiency of the proposed camera-based power management system by measuring the total power consumption, taken from the wall, of the system itself and the 17" IO-DATA TFT LCD display it controlled. Fig. 6 profiles the results measured per frame on a 100 s (2000-frame) test. In the test, the user was present in front of the display (frames 1-299, 819-1491, 1823-2001); moved away a little but was still present in the camera view (frames 1300-1491); and stepped away from the PC, disappearing from the camera (frames 300-818, 1492-1822).
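For scoring purposes, the presence timeline of this test can be written down as per-frame ground-truth labels. The helper below is a hypothetical encoding; the names and structure are ours, not the paper's, and only the frame ranges come from the text.

```python
# Hypothetical encoding of the Section 4.2 test timeline as per-frame
# ground-truth presence labels (frame numbers are 1-based, as in the text).

PRESENT_RANGES = [(1, 299), (819, 1491), (1823, 2001)]  # user in front of the PC
ABSENT_RANGES = [(300, 818), (1492, 1822)]              # user away from the PC


def presence_labels(n_frames=2001):
    """Return labels where labels[f] is True iff the user is present in frame f."""
    labels = [False] * (n_frames + 1)  # index 0 unused (frames are 1-based)
    for lo, hi in PRESENT_RANGES:
        for f in range(lo, hi + 1):
            labels[f] = True
    return labels
```

Under this encoding the user is absent for 850 of the 2001 frames, i.e. roughly 42% of the test, which is the headroom a presence-aware policy can exploit on this trace.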
Figure 6: Display power consumption per frame (power in watts versus frame number; the horizontal line shows ACPI, while the proposed system steps the power down when the gaze is off the screen and cuts it when no user is present)

The system was set to step down from the current power level if an off-screen eye gaze was detected continuously for more than 15 frames (i.e., almost 1 s). The ACPI line in Fig. 6 shows the power consumption level of ACPI.

The total power overhead of the system is 960 mW. Even though the system takes a little more power than ACPI (see the horizontal line in Fig. 6) in the active mode, it saves 36% of the total energy consumed by the display on this short test. In environments where users frequently divert their attention from the screen or leave computers unattended (e.g., school, university, office), the energy savings could be significant.

We see that our technology is very effective. It changes the display power according to the user's behavior, dimming the display when the user's gaze is off the screen and powering the display up when the user looks at it. Changing the brightness from one power level to another in our system takes only 20 ms, which is unobservable for the user. Fig. 7 shows the brightness of the screenshots and the corresponding power consumption levels (see the numbers displayed in the lower-right corner of the screenshots; the second row from the bottom shows the power).

Figure 7: Screenshots of the display and the corresponding power consumption: when the user looks at the screen, the screen is bright and the power is 35 W (top picture); otherwise the screen is dimmed and the power is 15.6 W (bottom picture)

5 Conclusion

In this paper we presented a novel eye-tracking application, namely display power management, and outlined an implementation technology that makes the application viable. Experiments showed that camera-based display power management is more efficient than the currently used ACPI method due to its ability to adjust the display power adaptively to the viewer's behavior. The application-specific algorithm optimizations and the hardware implementation of eye tracking allowed us to reduce the power overhead below 1 W while satisfying the real-time and high-accuracy requirements of the application. This power could be reduced even further should a custom design be performed.

In the current work we restricted ourselves to the simple case of monitoring a single user. However, when talking about monitoring in general, some critical issues arise. For instance, how should the technology behave when more than one person is looking at the screen? The user might not look at the screen while the others do. Concerning this point, we believe that a feasible solution is to keep the display active while there is someone looking at the screen. We are currently investigating this issue, as well as the influence of camera positioning, user gender/race, etc.

References

ACPI 2004. Advanced Configuration and Power Interface Specification, Rev. 3.0, Sept. 2004. http://www.acpi.info/spec.htm

DAI, X., AND RAYCHANDRAN, K. 2003. Computer screen power management through detection of user presence. US Patent 6650322.

DOUXCHAMPS, D., AND CAMPBELL, N. 2008. Robust real time face tracking for the analysis of human behavior. In Machine Learning for Multimodal Interaction, LNCS 4892, 1-10.

FUJITSU-SIEMENS 2007. Energy savings with personal computers. Fujitsu-Siemens Corp. http://www.fujitsu-simens.nl/aboutus/sor/energy_saving/prof_desk_prod.html

GLOBAL CITIZENSHIP REPORT 2006. Hewlett-Packard Co. www.hp.com/hpinfo/globalcitizenship/gcreport/pdf/hp2006gcreport_lowres.pdf

KAWATO, S., AND OHYA, J. 2000. Two-step approach for real-time eye tracking with a new filtering technique. Proc. IEEE SMC, 1366-1371.

KAWATO, S., TETSUTANI, N., AND OSAKA, K. 2005. Scale-adaptive face detection and tracking in real time with SSR filters and support vector machine. IEICE Trans. Information & Systems E88-D, 12, 2857-2863.

MAHESRI, A., AND VARDHAN, V. 2005. Power consumption breakdown on a modern laptop. Proc. Power-Aware Computing Systems, LNCS 3471, 165-180.

MORIMOTO, C., AND MIMICA, M. R. M. 2004. Eye gaze tracking techniques for interactive applications. Computer Vision and Image Understanding 98, 1, 4-24.

MOSHNYAGA, V. G., HASIMOTO, K., SUETSUGU, T., AND HIGASHI, S. 2009. A hardware implementation of the user-centric display energy management. Proc. PATMOS 2009, LNCS 5953, 56-65.

PARK, W. I. 1999. Power saving in a portable computer. EU Patent EP0949557.

YAMAMOTO, S., AND MOSHNYAGA, V. G. 2009. Algorithm optimizations for low-complexity eye tracking. Proc. IEEE SMC, 18-22.