

Design of Head Movement Controller System (HEMOCS) for Control Mobile Application through Head Pose Movement Detection

http://dx.doi.org/10.3991/ijim.v10i3.5552

Herman Tolle, Brawijaya University, Malang, Indonesia
Kohei Arai, Saga University, Saga, Japan

Abstract—Head movement has been found to be a natural way of interaction. It can be used as an alternative control method and provides accessibility for users in human-computer interface solutions. The combination of head-mounted displays (HMDs) with mobile devices offers a new, low-cost form of human-computer interaction, and such devices are hands-free systems. In this paper, we introduce a new method for recognizing head movement as the controller of a mobile application and propose a new control system that uses head movement only. The proposed method can determine a specific head pose movement and respond to it as a controller of an application. The implementation of a music player application on an iOS device shows that the proposed method is suitable for a new experience of real-time human-computer interaction with head movement control only.

Index Terms—head mounted display, accelerometer, head motion estimation, human computer interaction.

I. INTRODUCTION

Head movement detection has received significant attention in recent research. One specific purpose of head movement detection and tracking is to allow the user to interact with a computer or with new devices such as mobile phones. The growing popularity of the wide range of applications of which head movement detection is a part, such as assistive technology, virtual reality, and augmented reality, has increased the amount of research aiming to provide robust and effective techniques for real-time head movement detection and tracking [1].

There are many different approaches to head movement estimation. All investigated methods are computationally expensive and still difficult to implement on low-power hardware. Three approaches to head movement estimation and tracking are currently popular: camera-based image processing, sensor-based methods using accelerometers and gyroscopes, and combinations of different techniques.

Most head pose estimation methods are based on a computer vision approach, e.g. [2][3][4]. Liu et al. [2] introduced a video-based technique for estimating the head pose and used it in an image processing application for a real-world problem, attention recognition for drivers. Murphy-Chutorian and Trivedi [3] presented a static head pose estimation algorithm and a visual 3-D tracking algorithm based on image processing and pattern recognition. Kupetz et al. [4] implemented a head movement tracking system using an IR camera and IR LEDs.

Another approach to head movement detection is to use sensors such as gyroscopes and accelerometers. King et al. [5] implemented a hands-free head movement classification system that uses pattern recognition techniques with mathematical enhancements; a dual-axis accelerometer mounted inside a hat was used to collect head movement data. A similar method was presented by Nguyen et al. [6], which detects the movement of a user's head by analyzing data collected from a dual-axis accelerometer with pattern recognition techniques, but no application based on the proposed method was suggested. Other sensor-based approaches, such as [7][8], still need more theoretical proof, more experiments, and more accuracy analysis.

A combination of different techniques can also be used in head tracking systems. Satoh et al. [9] proposed a head tracking method that uses a gyroscope mounted on a head-mounted device (HMD) and a fixed bird's-eye view camera that observes the HMD from a third-person viewpoint. The need for a fixed camera, a customized marker, a gyroscope sensor, and a calibration process makes this proposal impractical for head tracking tasks, and the time complexity of the algorithm has not been investigated, which keeps it far from real-world use.

Head-mounted displays (HMDs) embedded in eyeglasses are the next innovation along the path of communication techniques. Such devices are hands-free systems. Although this is not a new idea, currently released and commercially available products (such as Project Glass by Google) show the immense potential of this technology. They function as stand-alone computers: a light glass frame is equipped with a variety of sensors, and a projector displays images and information onto the eye. In our previous research work, we proposed head movement detection and tracking as a controller for a 3D object scene view [10] and the combination of the user's head and body movement as a controller for a virtual reality labyrinth game [11].

In this paper, we introduce a new type of head movement controlling system that uses the 3 degrees of freedom of head rotation. The method is based on recognizing the internal accelerometer and gyroscope sensor data of a mobile phone placed on the user's head in a head-mounted display (HMD) such as Google Cardboard (called a dummy HMD). A real-time mobile application is built to prove that the method can be implemented in real time. The user can easily control the hands-free application using particular head pose movements only.
II. DESIGN OF HEAD MOVEMENT CONTROL SYSTEM (HEMOCS)

The head movement controller system (HEMOCS) works with a dummy HMD combined with a smartphone that has internal inertial sensors such as an accelerometer, a gyroscope, and a magnetometer. The user wears an HMD holding a smartphone, as shown in Figure 1 and Figure 2, while the system displays a mobile application running on the smartphone. The HMD keeps the smartphone aligned with the user's head and eyes: through the smartphone's display, the user watches the camera view while controlling something in the application. The user controls the application by moving the head in particular ways. The head movement is detected through real-time gathering and analysis of data from the mobile phone's sensors, and the method for detecting the user's head pose movement is based on the pattern of the data gathered by those internal sensors.

Figure 1. Sample of Google Cardboard as a dummy HMD with a smartphone [12]

The pose of the human head is limited to three degrees of freedom (DOF) of rotation, which can be characterized by the pitch, roll, and yaw angles as pictured in Figure 2. In this research, we propose a control system using three types of head pose movement, each with two opposite directions, as shown in Figure 3. The movement types are axial rotation left (H1), axial rotation right (H2), flexion (H3), extension (H4), lateral bending left (H5), and lateral bending right (H6). H1 to H6 are simply codes for naming the head pose movements. Axial rotation left means that the user turns the head to the left (around 30-45 degrees) from the initial position at a certain speed and then returns the head to the initial position; the movement feels like swiping something in the application with the head. The same process applies to the other five movement types in their respective directions. We therefore have six types of head pose movement as gestures for controlling something in the mobile application.

Figure 2. User wearing a dummy HMD, with the 3 degrees of freedom of head movement
Figure 3. Six types of head pose movement, named H1 to H6

A. Head Pose Gesture Control Function

In the proposed head movement control system (HEMOCS), a head movement gesture acts like a swipe in the mobile phone user experience. The control system also works as a substitute for the conventional button or tap function. The proposed mapping between each head movement type and its control purpose is shown in Table 1. Moving the head to the left or right is used as the selection control (previous (H1) or next (H2)), looking down is the accept (tap or choose) control (H3), looking up is the back function (H4), tilting the head toward the left shoulder is back-to-home (H5), and tilting the head toward the right shoulder (H6) is reserved for future functionality.

TABLE I. DESIGN OF HEAD POSE CONTROL FUNCTION

Code  Head Pose Type                                            Control Purpose
H1    Move to the left / Axial rotation left                    Select Previous
H2    Move to the right / Axial rotation right                  Select Next
H3    Head looking down / Flexion                               Choose this
H4    Head looking up / Extension                               Back to the List
H5    Tilt head toward left shoulder / Lateral bending left     Back to Home
H6    Tilt head toward right shoulder / Lateral bending right   Reserved

The first thing to investigate is how to detect and recognize the head pose movement from its signal pattern. The head movement controlling method is based on four repeatable process steps, shown in Figure 4: 1) read the sensor data, 2) recognize the data/signal pattern, 3) determine the head movement, and 4) respond with the control action for the recognized head movement. First, the system reads the sensor data using a push method; second, it recognizes the pattern of the sensor data; third, it determines which head pose movement type it is; finally, it triggers the response that controls the part of the mobile application correlated with the detected head movement type. A sketch of this loop is given after Figure 4.

Figure 4. Proposed Head Movement Control System Process Steps
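As a concrete illustration of this four-step loop, the following Swift sketch wires a push-based CoreMotion callback to a recognizer and an application response. It is a minimal sketch under our own naming (HeadGesture, GestureLoop, and the recognize/respond closures are not from the paper); only the CoreMotion calls and the 60 Hz rate follow the text.

    import CoreMotion

    // The six head pose gestures defined in Table I.
    enum HeadGesture { case h1, h2, h3, h4, h5, h6 }

    // Illustrative skeleton of the four-step HEMOCS loop:
    // 1) read sensor data, 2) recognize the signal pattern,
    // 3) determine the gesture, 4) respond in the application.
    final class GestureLoop {
        private let motion = CMMotionManager()

        // Steps 2-3 are injected as a closure so the loop stays independent of the
        // concrete detection algorithm (a sketch of one follows in Section II.C).
        var recognize: (_ yaw: Double, _ pitch: Double, _ roll: Double) -> HeadGesture? = { _, _, _ in nil }
        // Step 4: the application decides how to respond to a recognized gesture.
        var respond: (HeadGesture) -> Void = { _ in }

        func start() {
            guard motion.isDeviceMotionAvailable else { return }
            motion.deviceMotionUpdateInterval = 1.0 / 60.0          // 60 samples per second
            motion.startDeviceMotionUpdates(to: .main) { [weak self] data, _ in
                guard let self = self, let a = data?.attitude else { return }
                if let g = self.recognize(a.yaw, a.pitch, a.roll) {  // steps 2-3
                    self.respond(g)                                   // step 4
                }
            }
        }

        func stop() { motion.stopDeviceMotionUpdates() }
    }

Keeping steps 2-3 behind a closure makes it easy to swap in the attitude-based detector sketched in Section II.C without touching the sensor-reading code.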
B. Preliminary Investigation of the Head Movement Signal Pattern

A preliminary investigation of the signal patterns of the six head movement types should be done before developing the attitude control system. In our previous work [10][11], iOS device sensors achieved better results than Android-based device sensors [13]. On an iOS device, reading the sensor data is facilitated through CoreMotion, which provides four different types of movement data, acceleration, gravity, attitude, and rotation, each with three axes. We read the data for the different CoreMotion types and analyzed the patterns. The data pattern of the first type of head movement is shown in Figure 5. A comparison of the patterns of the six different head movements shows that the CoreMotion attitude data is feasible for use in our proposed control system. The basic data rate is set to 60 cycles per second. From this preliminary investigation, we conclude that the CoreMotion attitude data is appropriate for recognizing particular head movement types.

Figure 5. Different signal patterns of iOS CoreMotion for the Axial Rotation (H1) pose
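For this kind of pattern comparison, the four CoreMotion data types can be logged sample by sample and inspected offline. The snippet below is a minimal logging sketch under our assumptions (the CSV layout and the deg helper are ours); it relies only on the standard CMDeviceMotion fields and the 60 Hz rate stated above.

    import CoreMotion
    import Foundation

    // Logs the four CoreMotion data types compared in the preliminary investigation:
    // user acceleration, gravity, attitude, and rotation rate, each with three axes.
    let motion = CMMotionManager()
    motion.deviceMotionUpdateInterval = 1.0 / 60.0   // 60 cycles per second

    func deg(_ radians: Double) -> Double { radians * 180.0 / .pi }

    motion.startDeviceMotionUpdates(to: .main) { data, _ in
        guard let m = data else { return }
        let acc = m.userAcceleration, g = m.gravity, att = m.attitude, rot = m.rotationRate
        // One CSV row per sample; attitude is converted from radians to degrees.
        print("\(acc.x),\(acc.y),\(acc.z)," +
              "\(g.x),\(g.y),\(g.z)," +
              "\(deg(att.yaw)),\(deg(att.pitch)),\(deg(att.roll))," +
              "\(rot.x),\(rot.y),\(rot.z)")
    }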
C. Method for Detecting Head Pose Movement

The method for detecting and recognizing the head pose is based on analysis of the attitude data: yaw, roll, and pitch. The general algorithm is shown in Figure 4, and sample pseudocode of the head pose detection function is shown in Figure 7. The attitude pattern for H1 and H2 is affected only by the yaw data, as shown in Figure 6. The system starts counting when the yaw angle exceeds a specific threshold. The threshold value is 10 degrees, taken as the base threshold line (point 1), since the amplitude of pitch and roll stays below this value when the user performs H1 or H2 movements. The number of samples in which the head is rotated by more than 10 degrees is used to determine that the user is performing H1 or H2 (point 2). If the counter falls in the specific range between yMin and yMax, the system recognizes the motion as an H1 or H2 head pose. Whether it is H1 or H2 is determined from the peak of the absolute yaw value: a positive signed value is H1 and a negative value is H2. For H3-H4 and H5-H6 we use the roll and pitch data, respectively.

Figure 6. Signal pattern of CoreMotion attitude while the user moves the head to the left (H1)
Figure 7. Pseudocode of the head pose detection function
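A minimal Swift sketch of the yaw branch of this detector, written from the description above and reusing the HeadGesture type from the earlier sketch, is shown below. The 10-degree threshold comes from the text; the concrete yMin/yMax values and all identifier names are our assumptions (at 60 samples per second, counts of 24 and 78 correspond to movement durations of about 0.4 s and 1.3 s, the range that performs best in the evaluation of Section III). H3/H4 and H5/H6 would be handled analogously using the roll and pitch angles.

    // Sketch of the H1/H2 detector based on the yaw-counting description above.
    // Identifier names and the yMin/yMax values are illustrative assumptions.
    final class YawGestureDetector {
        private let threshold = 10.0     // degrees (point 1 in the text)
        private let yMin = 24            // ~0.4 s at 60 samples/s (assumed)
        private let yMax = 78            // ~1.3 s at 60 samples/s (assumed)

        private var count = 0            // samples with |yaw| above the threshold
        private var peakYaw = 0.0        // signed yaw with the largest magnitude so far

        // Feed one yaw sample (in degrees); returns .h1 or .h2 when a gesture completes.
        func process(yawDegrees: Double) -> HeadGesture? {
            if abs(yawDegrees) > threshold {
                count += 1
                if abs(yawDegrees) > abs(peakYaw) { peakYaw = yawDegrees }
                return nil                                   // still inside the movement
            }
            // The head is back near the front position: decide and reset.
            defer { count = 0; peakYaw = 0.0 }
            guard count > yMin, count < yMax else { return nil }
            return peakYaw > 0 ? HeadGesture.h1 : HeadGesture.h2   // sign of the peak picks the side
        }
    }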
There are two challenges in using the CoreMotion attitude data in this system: first, how to obtain high accuracy when all of the head pose movement detectors are combined, and second, how to handle the user's initial head orientation. The first is an algorithmic problem of combining all of the processes with high accuracy. The second arises because the attitude data is in degrees (after conversion from radians), with the first front-facing position taken as zero degrees, so the user's orientation must be considered: if the user changes his or her base front orientation, the algorithm no longer works.

The user's front orientation is challenging because, in practice, the head can move in any direction and orientation, yet only the six specific head poses should be detected, recognized, and used as controls. We improved the algorithm to ensure that the proposed system is robust and able to ignore head movements other than the six poses defined as H1 to H6. The threshold is a static number, while the user's base front orientation is adaptive: if the user holds a new head orientation for more than a specific number of cycles, the base front orientation is updated to the current orientation.
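One way to realize this adaptive base orientation, under our assumptions about names and the re-baselining rule, is sketched below: the detector is fed yaw relative to a stored front orientation, and that front orientation is replaced when the head stays away from it for longer than any valid gesture.

    // Sketch of an adaptive base front orientation for the yaw axis.
    // The re-baselining rule and the cycle count are illustrative assumptions.
    final class FrontOrientationTracker {
        private let threshold = 10.0      // degrees, same base threshold as the detector
        private let rebaseAfter = 120     // ~2 s at 60 samples/s; must exceed the longest
                                          // valid gesture so normal gestures do not re-baseline

        private var baselineYaw = 0.0     // current "front" orientation in degrees
        private var awayCount = 0         // consecutive samples away from the baseline

        // Returns the yaw relative to the adaptive front orientation.
        func relativeYaw(absoluteYaw: Double) -> Double {
            if abs(absoluteYaw - baselineYaw) > threshold {
                awayCount += 1
                if awayCount > rebaseAfter {          // user has settled on a new front
                    baselineYaw = absoluteYaw
                    awayCount = 0
                }
            } else {
                awayCount = 0
            }
            return absoluteYaw - baselineYaw
        }
    }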
III. IMPLEMENTATION & EVALUATION

A. Mobile Application Using HEMOCS

Many mobile applications can adopt HEMOCS as a new control method. The method is applicable to any kind of mobile application in situations where users should not use their hands to control something on the screen, for example drivers or welders. In healthcare, HEMOCS can be implemented in mobile applications for disabled people who cannot talk or use their hands to control an application. Figure 8 illustrates an implementation of HEMOCS in an HMD-based mobile application. One of the future targets of HEMOCS is its implementation as an assistive communication device for disabled people.

Figure 8. Implementation evaluation of the HEMOC system

To prove the concept, we developed a simple music player application with overlay text showing the songs one by one on the screen. The user can control the song list and play a chosen song by moving the head only. The H1 and H2 controls are used to step through the songs in the list one by one, with H1 for the next song and H2 for the previous song. H3 is used to choose a song to play, after which the system plays the song, and H4 stops the song and returns to the menu. Figure 9 shows sample screens: the first menu overlaid (9a), the fifth menu (9b), and the view when the user chooses the first song (9c). This simple application looks like an augmented reality application, in which the user can see through the HMD while controlling the overlaid text of the song list. A sketch of this gesture-to-action mapping is given after Figure 9.

Figure 9. Screenshots of the music player augmented application using HEMOCS for control
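The gesture-to-action mapping of the music player can be expressed as a switch over the recognized gesture, reusing the HeadGesture type from the earlier sketches. The MusicPlayer protocol and its method names are our assumptions rather than the paper's code; the H1/H2 direction follows the description in this section.

    // Illustrative mapping from recognized head gestures to music player actions.
    // The MusicPlayer type and its methods are assumptions for the sketch.
    protocol MusicPlayer {
        func selectNext()
        func selectPrevious()
        func playSelected()
        func stopAndShowList()
        func goHome()
    }

    func handle(_ gesture: HeadGesture, with player: MusicPlayer) {
        switch gesture {
        case .h1: player.selectNext()        // axial rotation left: next song (per Section III.A)
        case .h2: player.selectPrevious()    // axial rotation right: previous song
        case .h3: player.playSelected()      // flexion (look down): choose / play
        case .h4: player.stopAndShowList()   // extension (look up): stop and return to the list
        case .h5: player.goHome()            // lateral bending left: back to home
        case .h6: break                      // lateral bending right: reserved
        }
    }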
B. Accuracy Evaluation

The accuracy parameter is used to evaluate the performance of the proposed method in detecting the user's head movement. Accuracy is the overall success rate: the user performs a particular head movement and the system responds with the corresponding control in the application. The experimental results in Table 2 show some errors for particular head movements, with an average accuracy of 80%. These errors occur because users move their heads at various speeds. Using the same evaluation process but categorizing the results by head movement duration, Table 3 shows that 100% accuracy is achieved when the user moves the head with a duration between 400 milliseconds and 1.3 seconds, with an average maximum head angle of around 46.84°.

TABLE II. ACCURACY EVALUATION OF EACH HEAD POSE

Head Pose   Total   Correct (T)   Incorrect (F)   Accuracy
H1          20      18            2               90%
H2          20      18            2               90%
H3          20      14            6               70%
H4          20      15            5               75%
H5          20      19            1               95%
H6          20      12            8               60%
Average                                           80%

TABLE III. ACCURACY EVALUATION BASED ON MOVEMENT DURATION

No   Movement Duration (s)   Max Degree (°)   Accuracy
1.   ≤ 0.3                   29.11            0%
2.   0.4 – 0.6               41.63            100%
3.   0.7 – 0.9               48.40            100%
4.   1 – 1.2                 50.50            100%
5.   ≥ 1.3                   50.33            0%

C. Usability Evaluation

A usability evaluation was conducted to measure user satisfaction with the newly proposed controller system. Table 4 shows the results for 20 users trying the iOS-based music player application with head pose movement control. We evaluated five usability factors: functionality, ease of use, effectiveness, satisfaction, and understandability. The highest average score, 94%, is achieved for "effectiveness", which means that users consider this control type effective for controlling the application. The lowest average score, 71%, is for "easy to use", which means that some users still find the new control system difficult to use even though they consider it effective. The average over all usability factors reaches 81%, which means that users are satisfied with controlling the music player application using head movement only. Some users suggested that the head movement speed should be calibrated for each user before use, so that the system adapts to each user's head movement speed.

TABLE IV. USABILITY EVALUATION RESULTS

No   Usability Factor   Average Score
1.   Functionality      78%
2.   Easy to Use        71%
3.   Effectiveness      94%
4.   Satisfaction       79%
5.   Understandable     82%
     Average            81%

IV. CONCLUSION & FUTURE WORKS

Detecting the user's head movement is possible through the internal sensors of a mobile phone placed on the user's head with a dummy HMD such as Google Cardboard. The proposed method succeeds in recognizing the user's particular head movements as controls for specific gestures in a mobile application. The proposed approach is novel in using a single sensor position on the user's head to recognize the user's head movement type. The implementation of the head movement controller system in a music player application shows that the proposed control system is practical to implement and applicable to other kinds of mobile applications, especially applications for disabled people who cannot use their hands to control an application.

In the near future, we will improve the sensitivity of the head movement controller system by adapting to each user's comfortable head movement speed. We could also apply the method to other HMD-based mobile applications using the proposed head movement controller system. Many mobile application areas are suitable for this new kind of control by head pose movement.

ACKNOWLEDGMENT

This research work is funded by the Dikti SAME 2015 project of the Indonesian Ministry of Research, Technology & Higher Education, as collaborative research between Brawijaya University, Indonesia and Saga University, Japan. Submitted 22 February 2016; published as resubmitted by the authors on 12 May 2016.

REFERENCES

[1] A. Al-Rahayfeh and M. Faezipour, "Eye Tracking and Head Movement Detection: A State-of-Art Survey," IEEE Journal of Translational Engineering in Health and Medicine, vol. 1, pp. 11-22, 2013. http://dx.doi.org/10.1109/JTEHM.2013.2289879
[2] K. Liu, Y. P. Luo, G. Tei, and S. Y. Yang, "Attention recognition of drivers based on head pose estimation," in Proc. IEEE VPPC, Sep. 2008, pp. 1-5.
[3] E. Murphy-Chutorian and M. M. Trivedi, "Head pose estimation and augmented reality tracking: An integrated system and evaluation for monitoring driver awareness," IEEE Trans. Intell. Transp. Syst., vol. 11, no. 2, pp. 300-311, Jun. 2010. http://dx.doi.org/10.1109/TITS.2010.2044241
[4] D. J. Kupetz, S. A. Wentzell, and B. F. BuSha, "Head motion controlled power wheelchair," in Proc. IEEE 36th Annu. Northeast Bioeng. Conf., Mar. 2010, pp. 1-2.
[5] L. M. King, H. T. Nguyen, and P. B. Taylor, "Hands-free head-movement gesture recognition using artificial neural networks and the magnified gradient function," in Proc. 27th Annu. Conf. Eng. Med. Biol., 2005, pp. 2063-2066. http://dx.doi.org/10.1109/iembs.2005.1616864
[6] S. T. Nguyen, H. T. Nguyen, P. B. Taylor, and J. Middleton, "Improved head direction command classification using an optimised Bayesian neural network," in Proc. 28th Annu. Int. Conf. EMBS, 2006, pp. 5679-5682.
[7] S. Manogna, S. Vaishnavi, and B. Geethanjali, "Head movement based assist system for physically challenged," in Proc. 4th ICBBE, 2010, pp. 1-4.
[8] S. Kim, M. Park, S. Anumas, and J. Yoo, "Head mouse system based on gyro- and opto-sensors," in Proc. 3rd Int. Conf. BMEI, vol. 4, 2010, pp. 1503-1506.
[9] K. Satoh, S. Uchiyama, and H. Yamamoto, "A head tracking method using bird's-eye view camera and gyroscope," in Proc. 3rd IEEE/ACM ISMAR, Nov. 2004, pp. 202-211.
[10] K. Arai, H. Tolle, and A. Serita, "Mobile Devices Based 3D Image Display Depending on User's Actions and Movements," International Journal of Advanced Research in Artificial Intelligence (IJARAI), vol. 2, no. 6, 2013.
[11] H. Tolle, A. Pinandito, E. M. Adams J., and K. Arai, "Virtual reality game controlled with user's head and body movement detection using smartphone sensors," ARPN Journal of Engineering and Applied Sciences, vol. 10, no. 20, pp. 9776-9782, Nov. 2015.
[12] Google Cardboard SDK, http://developers.google.com/cardboard. Accessed January 23, 2015.
[13] M. Shoaib, S. Bosch, O. D. Incel, H. Scholten, and P. J. Havinga, "Fusion of smartphone motion sensors for physical activity recognition," Sensors, vol. 14, pp. 10146-10176, 2014. http://dx.doi.org/10.3390/s140610146

AUTHORS

H. Tolle is with the Research Group of Multimedia, Game & Mobile Technology, Informatics Department, Faculty of Computer Science, Brawijaya University, Malang 65145, Indonesia (e-mail: emang@ub.ac.id, herman.saga@gmail.com).

K. Arai is a professor in the Department of Information Science, Saga University, Japan. He has also been an Adjunct Professor of the University of Arizona, USA, since 1998. He has written 33 books and published 500 journal papers (e-mail: arai@is.saga-u.ac.jp).





Publisher: Unpaywall
ISSN: 1865-7923
DOI: 10.3991/ijim.v10i3.5552

Journal: International Journal of Interactive Mobile Technologies (iJIM)

Published: Jul 26, 2016
