When designing technical solutions, developers are aware that the attentional capacity of their users is limited. How our limited attention can be cued and redirected by warning systems, however, remains an open question. In a recent study at the Max Planck Institute for Biological Cybernetics in Tübingen, Lewis Chuang and Christiane Glatz tested different warning sounds. The scientists found that certain sounds redirect our attention away from an ongoing task better than others.
Each year in December, senior Human-Computer Interaction researchers meet to discuss the articles submitted to the Conference on Human Factors in Computing Systems (CHI for short). CHI is the most important venue for research on Human-Computer Interaction and covers a broad range of topics, from understanding people, through novel interaction techniques, to visualization. This year, over 300 researchers came to Montreal to discuss the articles submitted to CHI 2018. With Harald Reiterer and me, two associate chairs from Konstanz and Stuttgart participated in the meeting. After a rigorous peer review process, CHI accepts only about 25% of submissions. With 16 accepted publications, the groups participating in SFB-TRR 161 from Konstanz, Tübingen, and Stuttgart have been very successful and are happy about how well their submissions have been received.
My name is Alexandra Sipatchin and I am currently a neuroscience student intern in the Cognition and Control in Human-Machine Systems group at the Max Planck Institute for Biological Cybernetics in Tübingen. I attended AutomotiveUI 2017 because I had never been to one before, and since I am new to the field, I decided to join to get a broader overview of its current hot topics. The conference offered me new insight into the extended and vast universe of human-vehicle interfaces.
My name is Sarah Faltaous. I am an Egyptian student in the Cognitive Systems master's program at the University of Ulm, currently writing my master's thesis at the Max Planck Institute for Biological Cybernetics in Tübingen. I had the great chance of joining the AutoUI 2017 conference in Oldenburg as a student volunteer and as a presenter of a work-in-progress poster. This gave me the opportunity to meet many people from all over the world who share my interest in the automotive domain.
Within the research group Cognition & Control in Human-Machine Systems at the Max Planck Institute for Biological Cybernetics in Tübingen, we want to study fundamental principles of human perception and translate them to a variety of applied fields, including the design of virtual environments. One of our research interests, and the topic of today's blog post, is the perception of self-motion.
A motley crew of psychologists, neuroscientists, clinicians, engineers, computer scientists, and other specialists congregated in an unlikely place – the headquarters of AXA Life Insurance in Paris, France. It was the 1st International Neuroergonomics Conference. What is neuroergonomics? More importantly, do we really need more conferences to attend?
Vehicle handling is a task that places high demands on our visual system. When driving a car, we have to constantly attend to visual factors such as our distance to the car in front of us, our lane position, road signs, and more. Perceptual distraction during driving can therefore be expected to impair our ability to handle a vehicle. Nonetheless, a certain level of distractibility can be beneficial – it can alert us to unanticipated events that might be relevant. In our new article, published in the journal Frontiers in Human Neuroscience, we investigate how neural activity changes in order to maintain the balance between driving performance and the perception and processing of events outside the focus of our visual attention.