Each year in December, senior Human-Computer Interaction researchers meet to discuss the articles submitted to the Conference on Human Factors in Computing Systems (CHI). CHI is the most important venue for research on Human-Computer Interaction and covers a broad range of topics, from understanding people and novel interaction techniques to visualization. This year, over 300 researchers came to Montreal to discuss the articles submitted to CHI 2018. Harald Reiterer and I participated in the meeting as associate chairs from Konstanz and Stuttgart. CHI accepts only about 25% of submissions after a rigorous peer-review process. With 16 accepted publications, the groups participating in SFB-TRR 161 from Konstanz, Tübingen, and Stuttgart have been very successful and are happy about how well their submissions have been received. All acceptances are conditional, which means that the authors have to incorporate the reviewers' feedback before the articles are finally accepted. The papers will be presented in April at the conference, which will also take place in Montreal.

The accepted articles include work that develops a better understanding of how people use technology. The work from Tübingen addressed how humans perform in highly realistic, visually immersive environments, namely simulators for automated cars [1] and trucks [7]. With researchers from Oldenburg, Heinrich Bülthoff and Lewis Chuang show that real motion influences our readiness to take over control of highly automated vehicles [1]. With Scania, Christiane Glatz et al. show that verbal commands and auditory icons, designed to support task management for truck drivers, favor different information processes in the brain [7]. Huy Viet Le et al. will present work in which they investigate the fingers' range and the comfortable area for one-handed smartphone interaction [2]. Rufat Rzayev et al. studied reading on head-mounted displays and investigated the effects of text position, presentation type, and walking [3]. Together with researchers from Stuttgart and Saarbrücken, Mohamed Khamis from LMU Munich conducted a week-long study of face and eye visibility in front-facing cameras of smartphones to identify challenges, potentials, and applications for mobile interfaces that use face and eye tracking [4]. Miriam Greis et al. show that uncertainty visualizations improve humans' choice of internal models for information aggregation [5]. Thomas Kosch, who recently moved to LMU Munich, together with Paweł Woźniak from Stuttgart, Erin Brady from Indiana University-Purdue University Indianapolis, and Albrecht Schmidt, now also in Munich, investigates the design requirements of smart kitchens for people with cognitive impairments [16].

Testing the influence of real motion on the readiness to take over control of automated vehicles [1] (Photo: Max Planck Institute for Biological Cybernetics).
A participant reading with RSVP while walking during a study. In this study, we investigated the effect of different text positions and presentation types on binocular see-through smart glasses [3].

ART (Augmented Reality above the Tabletop) is a system designed to facilitate the collaborative analysis of multidimensional data. A 3D parallel-coordinates visualization in augmented reality is anchored to a touch-sensitive tabletop, enabling familiar operation [6].
Researchers will also present work that can improve the interaction with computing systems. Simon Butscher et al. investigated how immersive technologies can facilitate collaborative analysis of multidimensional data [6]. Sven Mayer et al. looked at the error users make when pointing at distant targets and how this error changes in virtual reality. They show that pointing precision can be significantly improved by compensating the systematic component of users' pointing error [8]. Huy Viet Le et al. looked at improving the interaction with mobile devices [9]. They show how a palm touching the touchscreen can reliably be detected and used as an additional input modality. In a collaboration between the University of Konstanz, Aarhus University, and Microsoft Research in Redmond, Roman Rädle et al. developed optical outside-in device tracking that exploits display polarization [10]. Together with researchers from Aarhus, Melbourne, Helsinki, and Stuttgart, Sven Mayer looks at movement behavior when playing collaborative and competitive games on large displays [11]. Pascal Knierim, who recently moved from Stuttgart to LMU Munich, looks at text entry in virtual reality together with researchers from Stuttgart and Helsinki [12]. They analyzed the effects of avatar hands on typing performance and show that especially inexperienced typists can benefit from appropriate avatar visualizations. Together with Paweł Woźniak from Stuttgart and colleagues from LMU Munich, Thomas Kosch et al. investigate how smooth pursuit eye movements can be leveraged to assess cognitive workload [15].
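The core idea behind the offset correction in [8] can be sketched as: estimate the systematic (mean) component of users' pointing offsets from calibration trials and subtract it from later pointing estimates, leaving only the random component. A minimal sketch with synthetic data; the function names and the simple mean-offset model are illustrative assumptions, not the paper's exact method:

```python
import numpy as np

def estimate_systematic_offset(targets, hits):
    """Estimate the mean (systematic) pointing offset from calibration trials."""
    return np.mean(np.asarray(hits) - np.asarray(targets), axis=0)

def compensate(pointed, offset):
    """Subtract the systematic component from raw pointing estimates."""
    return np.asarray(pointed) - offset

# Synthetic calibration data: every hit lands 2 cm right and 1 cm low,
# plus a small random component.
rng = np.random.default_rng(0)
targets = rng.uniform(0, 100, size=(50, 2))
hits = targets + np.array([2.0, -1.0]) + rng.normal(0, 0.3, size=(50, 2))

offset = estimate_systematic_offset(targets, hits)
corrected = compensate(hits, offset)
# After compensation, only the random component of the error remains.
```

In practice, the systematic offset depends on factors such as target position and viewing angle, so a per-region or model-based correction would replace the single global mean used here.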

PolarTrack [10] is an optical tracking system for mobile devices that combines an off-the-shelf RGB camera with a rotating linear polarization filter mounted in front of the lens. PolarTrack exploits the use of polarized light in current displays to segment device screens in the camera feed from the background by detecting periodical changes of display brightness while the linear polarizer rotates. The PolarTrack processing steps: the color input image, the weighted average of differential images, binarization of the weighted average, and the output of the multi-device tracking.
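The segmentation step described above can be illustrated with a small numpy sketch: display pixels flicker across frames as the polarizer rotates, while the unpolarized background stays roughly constant, so differential images highlight the screens. This is a toy reconstruction on synthetic grayscale frames, not the actual PolarTrack implementation, which operates on RGB camera frames and additionally tracks multiple devices:

```python
import numpy as np

def segment_screens(frames, threshold=0.5):
    """Segment flickering display regions from a stack of grayscale frames.

    frames: (n, h, w) array captured while a linear polarizer rotates in
    front of the lens. Polarized display light changes brightness between
    frames; unpolarized background light stays roughly constant.
    """
    frames = np.asarray(frames, dtype=float)
    diffs = np.abs(np.diff(frames, axis=0))     # differential images
    weights = diffs.mean(axis=(1, 2))           # weight frames by overall change
    weights = weights / (weights.sum() + 1e-9)
    avg = np.tensordot(weights, diffs, axes=1)  # weighted average of diffs
    avg = avg / (avg.max() + 1e-9)              # normalize to [0, 1]
    return avg > threshold                      # binarized screen mask

# Synthetic example: a 4x4 "screen" flickers in the center of a static scene.
frames = np.zeros((8, 12, 12))
for i in range(8):
    frames[i, 4:8, 4:8] = 0.5 + 0.5 * np.cos(i)  # periodic brightness change
mask = segment_screens(frames)  # True only inside the flickering region
```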

Using the palm as an additional input modality on smartphones [9]. The image shows a palm touch during one-handed and two-handed interaction.
Another focus of the work that will be presented at the conference is the development of new study methods for building better interactive systems. Sven Mayer et al. will present a method for evaluating the disruptiveness of interactive systems using a mixed-method approach [13]. An emerging challenge when designing user interfaces is that different devices, such as smartphones, smartwatches, and smart glasses, are used for the same tasks. Together with researchers from Stuttgart, Tilman Dingler, who is now in Tokyo, proposes a gesture elicitation method to design consistent gestures across devices and provides a transferability score that describes how well interaction knowledge can be transferred from one device to another [14].
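One simple way to think about such a transferability score is as the fraction of gestures elicited for one device that also work on another. The definition below is a hypothetical illustration of the concept, not the metric actually proposed in [14]:

```python
def transferability(gestures_a, gestures_b):
    """Hypothetical transferability score: the fraction of gestures elicited
    for device A that were also elicited for device B. The actual score in
    the paper may be defined differently."""
    a, b = set(gestures_a), set(gestures_b)
    if not a:
        return 0.0
    return len(a & b) / len(a)

# Illustrative gesture sets for two device types.
phone = {"swipe_left", "swipe_right", "tap", "long_press"}
watch = {"swipe_left", "swipe_right", "rotate_crown"}
score = transferability(phone, watch)  # -> 0.5
```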


Here is a list of papers accepted for CHI 2018:

[1] Shadan Sadeghian Borojeni, Susanne Boll, Wilko Heuten, Heinrich Bülthoff, Lewis Chuang: Feel the Movement: Real Motion Influences Responses to Take-over Requests in Highly Automated Vehicles, In: Proceedings of the Conference on Human Factors in Computing Systems, 2018.

[2] Huy Viet Le, Sven Mayer, Patrick Bader, Niels Henze: Fingers’ Range and Comfortable Area for One-Handed Smartphone Interaction Beyond the Touchscreen, In: Proceedings of the Conference on Human Factors in Computing Systems, 2018.

[3] Rufat Rzayev, Paweł Woźniak, Tilman Dingler, Niels Henze: Reading on HMDs: The Effect of Text Position, Presentation Type and Walking, In: Proceedings of the Conference on Human Factors in Computing Systems, 2018.

[4] Mohamed Khamis, Anita Baier, Niels Henze, Florian Alt, Andreas Bulling: Understanding Face and Eye Visibility in Front-Facing Cameras of Smartphones used in the Wild, In: Proceedings of the Conference on Human Factors in Computing Systems, 2018.

[5] Miriam Greis, Aditi Joshi, Ken Singer, Tonja Machulla, Albrecht Schmidt: Uncertainty Visualization Improves Humans’ Choice of Internal Models for Information Aggregation, In: Proceedings of the Conference on Human Factors in Computing Systems, 2018.

[6] Simon Butscher, Sebastian Hubenschmid, Jens Müller, Johannes Fuchs, Harald Reiterer: Clusters, Trends, and Outliers: How Immersive Technologies can Facilitate Collaborative Analysis of Multidimensional, Health-Related Data, In: Proceedings of the Conference on Human Factors in Computing Systems, 2018.

[7] Christiane Glatz, Stas Krupenia, Heinrich Bülthoff, Lewis Chuang: Use the Right Sound for the Right Job: Verbal Commands and Auditory Icons for a Task-Management System Favor Different Information Processes in the Brain, In: Proceedings of the Conference on Human Factors in Computing Systems, 2018.

[8] Sven Mayer, Valentin Schwind, Robin Schweigert, Niels Henze: The Effect of Offset Correction and Cursor on Mid-Air Pointing in Real and Virtual Environments, In: Proceedings of the Conference on Human Factors in Computing Systems, 2018.

[9] Huy Viet Le, Thomas Kosch, Patrick Bader, Sven Mayer, Niels Henze: PalmTouch: Using the Palm as an Additional Input Modality on Commodity Smartphones, In: Proceedings of the Conference on Human Factors in Computing Systems, 2018.

[10] Roman Rädle, Hans-Christian Jetter, Jonathan Fischer, Inti Gabriel, Clemens N. Klokmose, Harald Reiterer, Christian Holz: PolarTrack: Optical Outside-In Device Tracking that Exploits Display Polarization, In: Proceedings of the Conference on Human Factors in Computing Systems, 2018.

[11] Sven Mayer, Lars Lischke, Jens Emil Grønbæk, Zhanna Sarsenbayeva, Jonas Vogelsang, Paweł Woźniak, Niels Henze, Giulio Jacucci: Pac-Many: Movement Behaviour when Playing Collaborative and Competitive Games on Large Displays, In: Proceedings of the Conference on Human Factors in Computing Systems, 2018.

[12] Pascal Knierim, Valentin Schwind, Anna Feit, Florian Nieuwenhuizen, Niels Henze: Physical Keyboards in Virtual Reality: Analysis of Typing Performance and Effects of Avatar Hands, In: Proceedings of the Conference on Human Factors in Computing Systems, 2018.

[13] Sven Mayer, Lars Lischke, Paweł Woźniak, Niels Henze: Evaluating the Disruptiveness of Mobile Interactions: A Mixed-Method Approach, In: Proceedings of the Conference on Human Factors in Computing Systems, 2018.

[14] Tilman Dingler, Rufat Rzayev, Ali Sahami, Niels Henze: Designing Consistent Gestures Across Device Types: Eliciting RSVP Controls for Phone, Watch, and Glasses, In: Proceedings of the Conference on Human Factors in Computing Systems, 2018.

[15] Thomas Kosch, Mariam Hassib, Paweł Woźniak, Daniel Buschek, Florian Alt: Your Eyes Tell: Leveraging Smooth Pursuit for Assessing Cognitive Workload, In: Proceedings of the Conference on Human Factors in Computing Systems, 2018.

[16] Thomas Kosch, Paweł Woźniak, Erin Brady, Albrecht Schmidt: Can I Assist You?: Investigating Design Requirements of Smart Kitchens for People with Cognitive Impairments, In: Proceedings of the Conference on Human Factors in Computing Systems, 2018.


Research Results from Tübingen, Konstanz and Stuttgart accepted at CHI 2018

Niels Henze is a junior professor for Socio-Cognitive Systems at the Institute for Visualization and Interactive Systems, University of Stuttgart. His research focuses on mobile human-computer interaction and pervasive computing.
