18th HCI 2016: Toronto, ON, Canada
- Masaaki Kurosu:
Human-Computer Interaction. Interaction Platforms and Techniques - 18th International Conference, HCI International 2016, Toronto, ON, Canada, July 17-22, 2016. Proceedings, Part II. Lecture Notes in Computer Science 9732, Springer 2016, ISBN 978-3-319-39515-9
Gesture, Motion-Based and Eye-gaze Based Interaction
- Sebastian Balthasar, Manuel Martin, Florian van de Camp, Jutta Hild, Jürgen Beyerer:
Combining Low-Cost Eye Trackers for Dual Monitor Eye Tracking. 3-12 - Michelle A. Brown, Wolfgang Stuerzlinger:
Exploring the Throughput Potential of In-Air Pointing. 13-24 - Lorenzo Cavalieri, Maura Mengoni, Silvia Ceccacci, Michele Germani:
A Methodology to Introduce Gesture-Based Interaction into Existing Consumer Product. 25-36 - Niels Christian Nilsson, Stefania Serafin, Rolf Nordahl:
Walking in Place Through Virtual Worlds. 37-48 - Paula Gardner, Hart Sturgeon, Lee Jones, Stephen Surlin:
Body Editing: Dance Biofeedback Experiments in Apperception. 49-60 - Zhizhi Guo, Qian-Xiang Zhou, Zhong-Qi Liu, Xin Zhang, Zhaofang Xu, Yan Lv:
Real-Time Gaze Estimation Using Monocular Vision. 61-70 - Kazuyoshi Murata, Yu Shibuya:
Acceptable Dwell Time Range for Densely Arranged Object Selection Using Video Mirror Interfaces. 71-81 - Danilo Ribeiro, João Luiz Bernardes, Norton Trevisan Roman, Marcelo M. Antunes, Enrique M. Ortega, Antonio W. Sousa, Luciano A. Digiampietri, Luís M. del Val Cura, Valdinei F. Silva, Clodoaldo Ap. M. Lima:
Analysis of Choreographed Human Movements Using Depth Cameras: A Systematic Review. 82-92 - Sudarat Tangnimitchok, Nonnarit O.-Larnnithipong, Armando B. Barreto, Francisco R. Ortega, Naphtali David Rishe:
Finding an Efficient Threshold for Fixation Detection in Eye Gaze Tracking. 93-103 - Masaya Tsuruta, Shuhei Aoyama, Arika Yoshida, Buntarou Shizuki, Jiro Tanaka:
Hover Detection Using Active Acoustic Sensing. 104-114 - Etsuko Ueda, Kenichi Iida, Kentaro Takemura, Takayuki Nakamura, Masanao Koeda:
Identification of Gracefulness Feature Parameters for Hand-Over Motion. 115-124
Multimodal, Multisensory and Natural Interaction
- Evren Bozgeyikli, Lal Bozgeyikli, Andrew Raij, Srinivas Katkoori, Redwan Alqasemi, Rajiv V. Dubey:
Virtual Reality Interaction Techniques for Individuals with Autism Spectrum Disorder: Design Considerations and Preliminary Results. 127-137 - Allan Christensen, Simon André Pedersen, Per Bjerre, Andreas Køllund Pedersen, Wolfgang Stuerzlinger:
Transition Times for Manipulation Tasks in Hybrid Interfaces. 138-150 - Gencay Deniz, Pinar Onay Durdu:
BCI-Related Research Focus at HCI International Conference. 151-161 - Hae Youn Joung, Se Young Kim, Seung Hyun Im, Bo Kyung Huh, Heesun Kim, Gyu Hyun Kwon, Ji-Hyung Park:
Optimal User Interface Parameters for Dual-Sided Transparent Screens in Layered Window Conditions. 162-169 - Alexey Karpov, Alexander L. Ronzhin, Irina S. Kipyatkova, Andrey Ronzhin, Vasilisa Verkhodanova, Anton I. Saveliev, Milos Zelezný:
Bimodal Speech Recognition Fusing Audio-Visual Modalities. 170-179 - Akemi Kobayashi, Ryosuke Aoki, Norimichi Kitagawa, Toshitaka Kimura, Youichi Takashima, Tomohiro Yamada:
Towards Enhancing Force-Input Interaction by Visual-Auditory Feedback as an Introduction of First Use. 180-191 - Yuto Kotajima, Jiro Tanaka:
Book-Like Reader: Mirroring Book Design and Navigation in an E-Book Reader. 192-200 - Florian Nothdurft, Frank Honold, Wolfgang Minker:
Temporal and Spatial Design of Explanations in a Multimodal System. 201-210 - Kelvin S. Prado, Norton Trevisan Roman, Valdinei F. Silva, João Luiz Bernardes Jr., Luciano A. Digiampietri, Enrique M. Ortega, Clodoaldo Ap. M. Lima, Luís M. del Val Cura, Marcelo M. Antunes:
Automatic Facial Recognition: A Systematic Review on the Problem of Light Variation. 211-221 - Daniel R. Mestre, Céphise Louison, Fabien Ferlay:
The Contribution of a Virtual Self and Vibrotactile Feedback to Walking Through Virtual Apertures. 222-232 - Felix Schüssel, Frank Honold, Nikola Bubalo, Anke Huckauf, Harald C. Traue, Dilana Hazer-Rau:
In-Depth Analysis of Multimodal Interaction: An Explorative Paradigm. 233-240 - Marisol Wong-Villacres, Vanessa Echeverría Barzola, Roger Granda, Katherine Chiluiza García:
Portable Tabletops: A Low-Cost Pen-and-Touch Approach. 241-252
Mobile and Wearable Interaction
- Ahmed Sabbir Arif, Ali Mazalek:
A Survey of Text Entry Techniques for Smartwatches. 255-267 - Georges Badr, Antoine Ghorra, Kabalan Chaccour:
MobiCentraList: Software Keyboard with Predictive List for Mobile Device. 268-277 - Upasna Bhandari, Wen Yong Chua, Tillmann Neben, Klarissa Ting-Ting Chang:
Cognitive Load and Attention for Mobile Applications: A Design Perspective. 278-284 - Andrei Garcia, Cristina Camacho, Marina Bellenzier, Marina Pasquali, Tiago Weber, Milene Selbach Silveira:
Data Visualization in Mobile Applications: Investigating a Smart City App. 285-293 - Minal Jain, Sarita Seshagiri, Aditya Ponnada:
Should My Device Learn My Identity and Personality? 294-301 - Ger Joyce, Mariana Lilley, Trevor Barker, Amanda Jefferies:
Mobile Application Tutorials: Perception of Usefulness from an HCI Expert Perspective. 302-308 - Yuya Kawabata, Daisuke Komoriya, Yuki Kubo, Buntarou Shizuki, Jiro Tanaka:
Effects of Holding Ring Attached to Mobile Devices on Pointing Accuracy. 309-319 - Tian Lei, Xu Liu, Lei Wu, Ziliang Jin, Yuhui Wang, Shuaili Wei:
The Influence of Matching Degree of the User's Inherent Mental Model and the Product's Embedded Mental Model on the Mobile User Experience. 320-329 - Takeshi Nagami, Yoshikazu Seki, Hidenori Sakai, Hiroaki Ikeda:
Usability Evaluation of 4-Direction Keys for Ladder Menu Operation. 330-340 - Franck Poirier, Mohammed Belatar:
UniWatch: A Soft Keyboard for Text Entry on SmartWatches Using 3 Keys - Watch User-Interface and User Evaluation. 341-349
Multi-platform, Migratory and Distributed Interfaces
- Ryosuke Aoki, Akihiro Miyata, Shunichi Seko, Ryo Hashimoto, Tatsuro Ishida, Masahiro Watanabe, Masayuki Ihara:
An Information Display System with Information Scrapping User Interface Based on Digital Signage Terminals and Mobile Devices for Disaster Situations. 353-363 - Lukas Baron, Annerose Braune:
Challenges for the Application of Migratory User Interfaces in Industrial Process Visualizations. 364-378 - Lawrence J. Henschen, Julia C. Lee:
Human-Computer Interfaces for Sensor/Actuator Networks. 379-387 - Alexander M. Morison, Taylor Murphy, David D. Woods:
Seeing Through Multiple Sensors into Distant Scenes: The Essential Power of Viewpoint Control. 388-399 - Arthur Nishimoto, Daria Tsoupikova, Scott Rettberg, Roderick Coover:
From CAVE2™ to Mobile: Adaptation of Hearts and Minds Virtual Reality Project Interaction. 400-411 - Hye Sun Park, Ho Won Kim, Chang-Joon Park:
Dynamic-Interaction UI/UX Design for the AREIS. 412-418 - Kazuki Tada, Jiro Tanaka:
Development of Multiple Device Collaboration System Using Built-in Camera Image. 419-427