Bachelor / Master (Design, Evaluation)
FOCUS: Finding Ocular Patterns in Collaborative Sensemaking

Scenario
Group work scenarios can be separated into at least three different activities: a) task-based activities needed to solve a certain problem (e.g., sorting or clustering artefacts), b) content-based activities (e.g., reading a document), and c) communication activities, such as discussing aspects of a given topic with a group partner. Each of these three aspects of group work comes with different focus areas and ocular movement patterns. Knowledge about them could be used to assist and simplify the interaction with interactive devices or to provide additional information for a specific activity.

Project Goal
The outcome of this project is a sound state-of-the-art analysis of possible activities during group work, of the usage of eye-tracking devices for such tasks, and a setting that allows for co-located collaboration in a specific scenario. The implementation allows for further evaluation.

Task
Development of test settings, implementation of test framework (project presentation & paper)

Contact
Ulrike Pfeil, Johannes Zagermann
Room: PZ 908, PZ 905
ulrike.pfeil@uni.kn, johannes.zagermann@uni.kn
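Distinguishing the ocular movement patterns mentioned above typically starts by classifying raw gaze samples into fixations and saccades. A minimal sketch of one common approach, dispersion-based fixation detection (I-DT), is shown below; the function names and thresholds are illustrative assumptions, not part of the project description:

```python
def dispersion(window):
    """Sum of the horizontal and vertical spread of a gaze window."""
    xs = [p[0] for p in window]
    ys = [p[1] for p in window]
    return (max(xs) - min(xs)) + (max(ys) - min(ys))

def detect_fixations(samples, max_dispersion=25.0, min_length=5):
    """samples: list of (x, y) gaze points at a fixed sampling rate.
    Returns a list of (start_index, end_index) fixation windows."""
    fixations = []
    i, n = 0, len(samples)
    while i < n:
        j = i + min_length
        if j > n:
            break
        if dispersion(samples[i:j]) <= max_dispersion:
            # grow the window while dispersion stays under the threshold
            while j < n and dispersion(samples[i:j + 1]) <= max_dispersion:
                j += 1
            fixations.append((i, j - 1))
            i = j
        else:
            i += 1
    return fixations
```

On two stable gaze clusters separated by a saccade, e.g. eight samples at (100, 100) followed by eight at (500, 400), this yields the two fixation windows (0, 7) and (8, 15).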
Bachelor / Master (Design, Evaluation)
Sensor-Motoric Feedback Mechanisms for Ergonomic Motion Sequences

Scenario
Learning and applying kinaesthetic motion sequences is an important skill for people working in health care, as it helps them to adopt an ergonomic way of working and diminishes the risk of health problems such as back pain. Up to now, proper ergonomic movements are taught either theoretically in a lecture or in a one-to-one setting where an expert critically assesses and improves the learner's movements. The goal of this study is to provide learners with sensors that they can attach to their body and that give situated feedback on the quality of a motion sequence as well as suggestions on how to improve it.

Project Goal
The outcome of this project is a sound state-of-the-art analysis of the potential and limitations of available sensors that can be applied for feedback mechanisms in the above-mentioned scenario. The application of these sensors will be evaluated in a small study, and recommendations will be provided for the design of sensor-motoric feedback mechanisms.

Task
Development of test settings and tasks (project presentation & paper)

Contact
Ulrike Pfeil, Johannes Zagermann
Room: PZ 908, PZ 905
ulrike.pfeil@uni.kn, johannes.zagermann@uni.kn
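As a concrete example of the kind of feedback mechanism such sensors could drive, a body-worn accelerometer can be reduced to a trunk-inclination angle and mapped to a simple traffic-light response. This is only a sketch under assumed axis conventions (sensor y-axis aligned with the spine) and illustrative thresholds:

```python
import math

def inclination_deg(ax, ay, az):
    """Angle between the sensor's y-axis (assumed aligned with the spine)
    and gravity, in degrees: 0 = upright, 90 = horizontal."""
    g = math.sqrt(ax * ax + ay * ay + az * az)
    return math.degrees(math.acos(ay / g))

def feedback(angle_deg, warn_at=20.0, alert_at=45.0):
    """Map an inclination angle to a coarse feedback level
    (thresholds are illustrative, not clinically validated)."""
    if angle_deg < warn_at:
        return "ok"
    if angle_deg < alert_at:
        return "warn"
    return "alert"
```

An upright posture (gravity fully on the y-axis) gives roughly 0° and the feedback "ok"; a horizontal trunk gives roughly 90° and "alert".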
Bachelor / Master (Design, Evaluation)
WYSIWYG: From Public to Personal Devices (and back)

Scenario
We are surrounded by large public displays with digital content, printed media (e.g., with QR codes), and combinations of both. Interesting information is available but not always accessible. It can be desirable to pull information from a public installation to one of our personal devices. Vice versa, it can sometimes be of great advantage to push information from personal devices to a public display. This cross-device interaction could be supported by eye-tracking technology.

Project Goal
The outcome of this project is a sound state-of-the-art analysis of the usage of eye tracking for cross-device interaction. The design and implementation of various prototypical proof-of-concept solutions allow for further evaluation.

Task
Development of test settings and tasks, implementation of test framework (project presentation & paper)

Contact
Johannes Zagermann
Room: PZ 905
johannes.zagermann@uni.kn

Literature
Jayson Turner: Cross-Device Eye-Based Interaction
Human-Computer Interaction Group
Prof. Dr. Harald Reiterer

Bachelor / Master (Design)
Augmented Sailing

Scenario
Because of the exposure to the whims of nature, sailing can be not only an exciting but also a dangerous pastime. According to the accident statistics of Lake Constance (fig. 1), most accidents are caused by bad weather conditions and a lack of diligence with respect to the sailing equipment. Providing necessary real-time information (e.g., weather conditions) can help reduce such accidents. In uncritical situations, additional contextual information (native flora and fauna, cultural information, activities offered in the coastal areas, …) can help make the trip even more enjoyable. Currently, sailors can access such information via smartphones. Smartphone interaction, however, is impractical in many sailing situations (the phone needs to be pulled out of the pocket first, must not get wet, …). This is where the canvases come into play.

Project Goal
Develop an augmented canvas concept that displays contextual information to (1) minimize accident hazards and (2) make the sailing trip more enjoyable.

Task
Literature research & state-of-the-art analysis (seminar presentation & paper)
Concept design & implementation of concept(s) (project presentation & paper)

Contact
Jens Müller & Simon Butscher
Room: PZ 906
jens.mueller@uni-konstanz.de, simon.butscher@uni-konstanz.de
Bachelor / Master (Design)
Off-Screen Visualization Techniques for Mixed Reality Environments

Scenario
Mobile devices provide only a limited display size to visualize information (such as a map). Thus, they can only show a small part of a given information space. Traditional off-screen visualization techniques, such as Wedges (fig. 1), address this shortcoming by providing the user with a hint to potentially interesting off-screen targets (e.g., parking lots, restaurants, …). Mixed reality displays (fig. 2) blend digital objects with the physical environment. Such displays provide a natural way to navigate 3D information spaces. Yet, they still come with the typical limitation of mobile devices: they can only display a fragment of the information space.

Project Goal
Develop off-screen visualization techniques for mixed reality displays that make the user aware of potentially interesting entities (digital objects, physical objects, other persons) within the mixed reality environment.

Task
Literature research & state-of-the-art analysis (seminar presentation & paper)
Concept design & implementation of concept(s) (project presentation & paper)
Prerequisites: experience in sketching techniques and the Unity 3D game development platform.

Contact
Jens Müller
Room: PZ 906
jens.mueller@uni-konstanz.de

Fig. 1: Wedges visualization technique (Gustafson et al., 2008) displaying off-screen information.
Fig. 2: Mixed reality displays without any off-screen visualization.
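A core geometric step shared by border-anchored off-screen techniques such as Wedges is finding the point on the screen edge where the indicator for an off-screen target should be placed: the ray from the screen centre towards the target, clamped to the screen rectangle. A minimal 2D sketch, with assumed pixel coordinates (origin top-left) and illustrative names:

```python
def border_anchor(target, width, height):
    """Clamp the ray from the screen centre towards an off-screen
    target to the screen rectangle; returns the border point where
    an off-screen indicator could be drawn."""
    cx, cy = width / 2.0, height / 2.0
    dx, dy = target[0] - cx, target[1] - cy
    if dx == 0 and dy == 0:
        return (cx, cy)
    # scale factor so the faster-growing axis just reaches the border
    sx = (cx / abs(dx)) if dx else float("inf")
    sy = (cy / abs(dy)) if dy else float("inf")
    s = min(sx, sy)
    if s >= 1:  # target is already on screen
        return target
    return (cx + dx * s, cy + dy * s)
```

For a 1920x1080 screen, a target at (2000, 540) to the right of the display anchors its indicator at (1920.0, 540.0) on the right edge, while an on-screen target is returned unchanged. For a mixed reality display the same idea would apply per frame to the projection of each entity into view coordinates.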
Bachelor / Master (Design)
Lightweight Visual Data Analysis on Mobile Devices Providing Self-Monitoring Feedback

Scenario
Wellness applications help users record information about their health behaviors, such as physical activity and diet. Tracking ranges from counting steps to recording the stages of sleep. Calorie counters that make use of large calorie databases are also part of self-monitoring. In order to increase the effectiveness of self-monitoring for changing users' behavior, we do not only have to track the data but also have to give feedback. Most applications only provide simple forms of feedback, such as showing the number of steps one still has to take to meet one's goal. However, more detailed insights into the behavior can be provided by unveiling relations within the gathered data (e.g., that meals at work are less healthy than at home).

Project Goal
Based on data about the nutrients of meals, tracked activities, and additional data such as eating motives or the time and location where meals took place, visualizations and mechanisms to analyze the data should be developed. The concepts should address the needs of end users, not those of expert data analysts. Therefore, the concepts do not need to allow a detailed analysis, but have to provide simple access to the gathered data.

Task
Literature research and state-of-the-art analysis (seminar thesis)
Design and discussion of several interaction and visualization concepts (project work)
Implementation of a prototype for Android (project work)
Evaluation of the visualization and data analysis concept (thesis)

Contact
Simon Butscher
Room: PZ 906
simon.butscher@uni-konstanz.de

Announcement: October 21st, 2015
Bachelor / Master (Design, Evaluation)
Chair I/O 2.0

Scenario
Chairs are items of our daily lives. Different kinds of chairs exist to support heterogeneous activities and scenarios, e.g., office chairs, leisure chairs, and medical chairs. The resurrection of virtual reality adds a new task for chairs: supporting navigation and gaming in virtual worlds. This topic was first examined in 2005 by a project called ChairIO, in which a chair-based computer interface was developed. During the last year, several Kickstarter campaigns picked up the topic and presented new hardware solutions: RotoVR, Turris Chair, and VRGO Chair (see picture). These upcoming products rely on the development of new hardware.

Project Goal
The goal of the project is not to develop a new hardware solution but to use existing smartphone sensors to augment a physical chair. The outcome of this project is a smartphone app that, by attaching the phone to a chair, allows navigation in virtual spaces. Furthermore, an evaluation of this input device would be desirable.

Task
Development of software (project presentation & paper)

Contact
Daniel Klinkhammer
Room: PZ 905
daniel.klinkhammer@uni-konstanz.de
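One way such an app could translate the phone's orientation sensors into navigation input is to map chair tilt to a 2D movement vector, with a dead zone so that sensor noise and small postural shifts do not cause drift. A minimal sketch; the axis conventions (pitch = lean forward/back, roll = lean left/right) and all thresholds are assumptions for illustration:

```python
def chair_to_motion(pitch_deg, roll_deg, dead_zone=3.0, max_tilt=20.0):
    """Map chair tilt (degrees, from the phone's orientation sensors)
    to a (forward, strafe) vector with each component in [-1, 1].
    Tilts inside the dead zone are ignored; tilts beyond max_tilt
    saturate at full speed."""
    def axis(a):
        if abs(a) < dead_zone:
            return 0.0
        a = max(-max_tilt, min(max_tilt, a))  # clamp to the usable range
        sign = 1.0 if a > 0 else -1.0
        return sign * (abs(a) - dead_zone) / (max_tilt - dead_zone)
    return (axis(pitch_deg), axis(roll_deg))
```

Sitting upright yields (0.0, 0.0); a full forward lean of 20° or more yields maximum forward speed, and a half-range sideways lean yields half strafe speed. The resulting vector would then be fed into the virtual camera or avatar controller each frame.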
Bachelor / Master (Evaluation)
Navigation in 3D Spaces: Comparison of Input Devices

Scenario
Interaction in virtual 3D spaces is an important topic due to the new era of virtual reality and augmented reality devices. Input modalities and devices range from classical game controllers over new 3D controllers to bare hands (e.g., Leap Motion). A whole body of research has looked at the performance of different input modalities for 2D spaces (e.g., Fitts' law studies). For 3D spaces, however, only a few studies exist. So it is still an unanswered question which interaction device performs best in specific tasks and environments.

Project Goal
The outcome of this project is a comparative study of different 3D input devices. The study should at least answer the following questions:
- Which device has the highest navigation performance (Fitts' law)?
- How high is the physical demand of the different devices?

Task
Development of software (project presentation & paper)

Contact
Daniel Klinkhammer
Room: PZ 905
daniel.klinkhammer@uni.kn
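The Fitts'-law performance measure mentioned above is usually reported as throughput: the index of difficulty of each pointing trial (Shannon formulation) divided by its movement time, averaged over trials. A minimal sketch; the trial data format is an illustrative assumption:

```python
import math

def index_of_difficulty(distance, width):
    """Index of difficulty in bits for a pointing task, using the
    Shannon formulation: ID = log2(D / W + 1)."""
    return math.log2(distance / width + 1.0)

def throughput(trials):
    """trials: list of (distance, width, movement_time_seconds) tuples,
    all in consistent units. Returns the mean ID/MT in bits per second."""
    rates = [index_of_difficulty(d, w) / t for d, w, t in trials]
    return sum(rates) / len(rates)
```

For example, a trial with distance 7 and width 1 has an ID of 3 bits; completed in 1.5 s it contributes 2 bits/s. Comparing mean throughput per device over the same task set is the standard way to rank the devices' navigation performance.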