1. Modeling Human Behavior via Inverse Reinforcement Learning (Day 1, morning)
Organizer: Nikola Banovic
Room: Westminster
This tutorial will review current computational approaches to
describe, simulate, and predict human behavior from empirical behavior
traces stored in large behavior logs. Attendees will learn how to
train computational models of human behavior using Inverse
Reinforcement Learning and leverage them to create user interfaces
that can automatically reason about and act in response to people’s
behaviors. The tutorial will focus on high-level behaviors, such as
routines and habits, represented by sequences of situations people
find themselves in and actions they perform in those situations. After
completing this tutorial, attendees will be able to formulate behavior
modeling problems as computational modeling problems and to leverage
such models to create user interfaces that help people increase their
productivity, safety, and health.
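As a hypothetical, minimal sketch of the kind of representation such models build on (this is not material from the tutorial): one behavior trace encoded as situation-action pairs, together with the empirical feature expectations that many Inverse Reinforcement Learning methods try to match. All situation, action, and feature names below are invented for illustration.

    import numpy as np

    # One day of a hypothetical routine: (situation, action) pairs.
    trace = [
        ("home_morning", "check_email"),
        ("commute", "listen_podcast"),
        ("office", "work"),
        ("home_evening", "exercise"),
    ]

    # Hypothetical binary features describing each situation-action pair.
    def features(situation, action):
        return np.array([
            1.0 if "home" in situation else 0.0,   # person is at home
            1.0 if action == "exercise" else 0.0,  # health-related action
            1.0 if action == "work" else 0.0,      # productivity-related action
        ])

    # Empirical feature expectations over demonstrated traces; IRL seeks
    # reward weights under which an optimal policy reproduces them.
    def feature_expectations(traces):
        phis = [features(s, a) for t in traces for (s, a) in t]
        return np.mean(phis, axis=0)

    print(feature_expectations([trace]))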
2. Data Visualization for UbiComp and ISWC Research
(Day 1, afternoon)
Organizers: Eun Kyoung Choe (University of Maryland, USA), Petra Isenberg (Inria, France), Bongshin Lee (Microsoft Research Redmond, USA)
Room: Westminster
This half-day tutorial will cover, at an introductory level, the
visualization fundamentals and techniques needed to design and
evaluate visualizations in the context of ubiquitous computing
research. It will focus on data visualization for mobile devices such
as smartwatches, smartphones, and tablets, as well as on time-oriented
datasets. This introductory tutorial is targeted at UbiComp/ISWC
researchers who want to learn how to effectively present data
leveraging visualization. No specific prior knowledge or skills in
computer science or visualization are required.
3. Smartphone App Usage: Understanding, Modelling, and Prediction
(Day 2, morning)
Organizers: Yong Li (Tsinghua University, China), Vassilis Kostakos (University of Melbourne, Australia),
Sha Zhao (Zhejiang University, Hangzhou, China),
Sasu Tarkoma (University of Helsinki, Finland)
Room: Abbey
The wide adoption of mobile devices and smartphone applications (apps)
has enabled highly convenient and ubiquitous access to Internet
services. For both app developers and service providers, it becomes
increasingly important to predict how users use mobile apps under
various contexts. Mining and learning from smartphone app usage is
therefore highly relevant to ubiquitous computing. Smartphone apps are
ubiquitous in our daily lives. Abundant apps provide useful services in
almost every aspect of modern life. Easy to download and often free,
apps can be fun and convenient for playing games, getting turn-by-turn
directions, and accessing news, books, weather, and more. Apps on
smartphones can be considered as the entry point to access everyday
life services such as communication, shopping, navigation, and
entertainment. Since a smartphone is linked to an individual user,
apps in smartphones can sense users’ behavior and activities.
Researchers use the data recorded by smartphone apps to analyze apps
and understand users. However, this emerging field still has
relatively few researchers, partly because of the limited datasets
available for research and partly because the area is relatively new.
As researchers who have worked in this area for more than five years
and published more than 30 related high-quality papers at top
conferences, we would like to offer a tutorial that provides a compact
platform to help researchers focus on this research area; we will also
release several datasets as part of the tutorial.
This tutorial will help researchers learn the basic ideas and
techniques in this area, and it will also disseminate the latest
progress on this hot topic to promote the research area. First,
participants can learn about the data, methods, tools, and experiences
from the analysis of smartphone apps. Second, the findings and
discussions presented in the tutorial can motivate
researchers. Third, by bringing together participants with a variety
of backgrounds and goals, this tutorial provides a platform for
interdisciplinary cooperation and networking.
4. Eyewear Computing in the Wild (Day 2, afternoon)
Organizers:
George Chernyshov (KMD, Japan),
Shoya Ishimaru (DFKI Kaiserslautern, Germany),
Benjamin Tag (University of Melbourne, Australia),
Philipp M. Scholl (University of Freiburg, Germany),
Yuji Uema (JINS, Japan),
Kai Kunze (KMD, Japan),
Jamie A. Ward (Goldsmiths, UK)
Room: Victoria
Most of our senses involve the head, making it one of the most
interesting body locations for simultaneous sensing and
interaction. Smart glasses, head-worn eye trackers, and similar “smart
eyewear” have recently emerged as an interesting research platform for
ubiquitous computing. However, there are still very few open tools and
open datasets to support this research. In the spirit of UbiComp, this
tutorial will help fill that gap: we plan a large-scale data recording
during the UbiComp main conference using JINS MEME, smart EOG-enabled
glasses.
In this tutorial, we present the Smart Eyewear toolchain, consisting
of smart glasses prototypes and a software platform for cognitive and
social interaction assessments in the wild, with several application
cases and a demonstration of real-time activity recognition.
The tutorial participants should be ready to record during the main
conference. They will get a set of glasses and a mobile phone.
Participants recording data during the conference will have privileged
access to the gathered data (after roughly half a year to one year,
the dataset will be made publicly available).
Our platform is designed to work with JINS MEME, smart EOG-enabled
glasses. The user software is capable of data logging, posture
tracking, and recognition of several activities, such as talking, reading, and
blinking. Organizers will walk through several applications and studies
that the platform has been used for.
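As a rough illustration of the kind of signal processing involved (this is not the JINS MEME software itself), the sketch below flags blink-like events in a vertical EOG channel by thresholding deviations from a moving baseline; the sampling rate, threshold, and minimum gap between events are assumptions.

    import numpy as np

    def detect_blinks(veog, fs=100, threshold=3.0, min_gap_s=0.2):
        """Return sample indices of candidate blinks in a 1-D vertical EOG array."""
        baseline = np.convolve(veog, np.ones(fs) / fs, mode="same")  # ~1 s moving mean
        z = (veog - baseline) / (np.std(veog - baseline) + 1e-9)

        blinks, last = [], -int(min_gap_s * fs)
        for i, value in enumerate(z):
            # Accept a sample only if it clears the threshold and is not too
            # close to the previously detected blink.
            if value > threshold and i - last >= int(min_gap_s * fs):
                blinks.append(i)
                last = i
        return blinks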