TRANSCRIPT
Mark W. Newman University of Michigan School of Information
@ Mobile Monday Detroit
November 11, 2013
A Long View of Context-Awareness:
How Context-Awareness Works,
How Context-Awareness Doesn’t Work (yet),
And a Look to the Future
Practitioner viewpoint: what can we do now?
What technology exists?
How can we employ it?
What applications should we be thinking about at this point?
Researcher viewpoint: what can we do in the future?
What capabilities ought to be achievable?
What applications will those enable?
What can we do now to bring the future closer?
Mark Weiser
▪ Director of the Xerox PARC Computer Science Lab
▪ The “father of ubiquitous computing” (Scientific American, 1991)
▪ Forecast an era of multiple devices per person
▪ Predicted mobile, embedded, and context-aware computing
The inverse of Moore’s Law
The cost of computing (and storage, and networking, and displays, etc.) goes to zero
As computing becomes “free,” how will computing change?
Devices will “fade into the woodwork”
Carried, worn, embedded into the environment
Interaction will become less conscious, explicit
Analogy: the electric motor
The Active Badge (Olivetti Research): the first viable person-tracking system
Used infrared communication to track people within buildings
Applications:
▪ Person finder
▪ Rerouting phone calls
▪ Active Maps
Applications
• People finder & call/video routing
• Dynamic remote controls
• Adaptive lighting, heating, cooling
• Collaboration (file sharing and group pointing)
• Life-logging
Significance to Mobile
• The first mobile computing devices
• Context-awareness has been part of mobile from Day 1
Significance
• Among the first outdoor mobile systems
• Among the first used by non-researchers
• Explored push vs. pull interaction
• Influenced numerous other projects
• Chose NOT to use GPS for positioning
  • Required custom hardware
  • Concerned about performance in urban settings
GPS timeline:
1973: Conceived
1983: Ronald Reagan mandated civilian access
1989-1994: Launched
2000: Bill Clinton removed “Selective Availability”
2000-2013: Improvements in accuracy
2004: aGPS for mobile demonstrated by Qualcomm
2008: iPhone 3G
Albrecht Schmidt. 1999. There is More to Context than Location.
In Linguistics
Context is the “hidden” information you need to interpret a statement
▪ “He was becoming increasingly angry with her.”
▪ “Put it there. No, behind that one.”
▪ “It should be the same as last time.”
In Computing
(Roughly) Information outside the application that could affect the behavior inside the application
Schilit, Adams, and Want (‘94): a context-aware system can “examine the computing environment and react to changes to the environment”
Important aspects of context:
▪ Where you are
▪ Who you are with
▪ What resources are nearby
Not just location: light, noise, connectivity, social situation
Dey and Abowd (2001): context is "any information that can be used to characterize the situation of an entity.”
Key dimensions
▪ location
▪ identity
▪ activity
▪ time
Location, identity, and time: we can do these now (mostly)
Activity: we can do a simple version of this
Social situation, human intent, and internal state (attention, mood, emotion): we can only scratch the surface of these
We’ll come back to “what makes context hard”
Schilit, Adams, and Want (‘94) organize context-aware applications along two dimensions: Manual vs. Automatic, and Information vs. Command.
▪ Manual + Information: show information relevant to context. Examples: most existing apps (Yelp, GMaps, etc.)
▪ Automatic + Information: reconfigure the app/device based on context. Examples: adding geotags to tweets & photos (push notifications?)
▪ Manual + Command: change the behavior of a command based on context. Example: print to the “nearest printer”
▪ Automatic + Command: take action based on changes in context. Examples: motion-sensitive lights; contextual reminders
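A contextual reminder is the classic "take action based on changes in context" case: fire an alert when the user's location enters a region of interest. A minimal sketch, assuming a simple geofence model; the class name, coordinates, and message are all illustrative, not any real API.

```python
import math

EARTH_RADIUS_M = 6_371_000

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance in meters between two lat/lon points."""
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = math.radians(lat2 - lat1)
    dl = math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * EARTH_RADIUS_M * math.asin(math.sqrt(a))

class GeofenceReminder:
    """Fire a reminder once, when the user *enters* the fenced region."""

    def __init__(self, lat, lon, radius_m, message):
        self.lat, self.lon = lat, lon
        self.radius_m = radius_m
        self.message = message
        self.inside = False  # remember state so we only fire on entry

    def on_location(self, lat, lon):
        """Feed one GPS fix; returns the reminder text only on entry."""
        now_inside = haversine_m(lat, lon, self.lat, self.lon) <= self.radius_m
        fired = now_inside and not self.inside
        self.inside = now_inside
        return self.message if fired else None
```

Note the entry/exit state: without it, a user lingering inside the fence would be re-alerted on every fix.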
Context-awareness’ “next frontier”: activity, the common-sense notion of “what someone is doing”
▪ Sleeping, cooking, running, doing yoga, watching TV, playing Angry Birds
Closely related: transportation mode
▪ Driving, walking, cycling, taking a bus
Commercial fitness trackers:
▪ MapMyRun
▪ Nike+ (incl. the Nike FuelBand)
▪ Fitbit
▪ Phone-based pedometers (e.g., Pacer)
… but these only detect movement (also steps, and “fuel” as a generalized activity metric)
Research recognizers go further:
▪ Mobile Sensing Platform (3-axis accelerometer, barometer): walking, running, cycling, elliptical trainer, stair machine
▪ GSM-based mobility detection: walking, cycling, riding a train or bus, driving (+ carpooling)
Sensors available on a modern phone:
▪ GPS
▪ Accelerometer
▪ Gyroscope
▪ Magnetometer
▪ Light sensor
▪ Proximity sensor
▪ NFC
▪ Bluetooth
▪ WiFi
▪ Camera
▪ Microphone
▪ System state (on, off, current app, on phone)
(Ignoring: touch screen, buttons)
The context-awareness pipeline: Sensing -> Inference -> Actuation -> Logging
Note: for Location, GPS does this for you!
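The four pipeline stages can be sketched end-to-end. This is a toy, assuming a canned stream of ambient-light readings in place of real hardware; the threshold, state names, and actions are all illustrative.

```python
def sense(stream):
    """Sensing: yield raw readings (here, ambient light in lux)."""
    yield from stream

def infer(lux, threshold=50):
    """Inference: map a raw reading to a contextual state."""
    return "dark" if lux < threshold else "bright"

def actuate(state):
    """Actuation: choose an action for the inferred state."""
    return "lower_brightness" if state == "dark" else "raise_brightness"

def run_pipeline(stream):
    """Logging: record (reading, state, action) for later analysis."""
    log = []
    for lux in sense(stream):
        state = infer(lux)
        log.append((lux, state, actuate(state)))
    return log
```

Even in this toy, the separation matters: the log of (reading, state, action) triples is what makes later debugging and retraining possible.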
Two approaches to inference:
Rules
▪ if user1 is near user2, issue alert
▪ if sensor.value() > threshold then state = X
Machine learning
▪ The system learns when a pattern of sensor values indicates a particular state
▪ E.g., detecting “walking” based on accelerometer data
[Figure: accelerometer traces for “Walk” vs. “Run”; image credit: https://wiki.engr.illinois.edu/display/ae498mpa/Run-Walk+differentiator+---+2-axis+Accelerometer]
Need lots of examples:
• Sensor could change position
• Different walking and running speeds
• Different individuals
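A toy sketch of the machine-learning approach: classify walk vs. run from the variance of the accelerometer magnitude over a window, using a nearest-centroid "learner". Everything here is synthetic and illustrative; a deployed recognizer would need many labeled examples across users, speeds, and sensor positions, as noted above. Using the magnitude of (x, y, z) also sidesteps the "sensor could change position" problem, since magnitude is orientation-independent.

```python
import math
import statistics

def magnitude(sample):
    """Orientation-independent magnitude of one (x, y, z) reading."""
    x, y, z = sample
    return math.sqrt(x * x + y * y + z * z)

def feature(window):
    """One feature per window: variance of the magnitude signal."""
    return statistics.pvariance([magnitude(s) for s in window])

def train(labeled_windows):
    """Nearest-centroid 'learning': average feature value per label."""
    sums, counts = {}, {}
    for label, window in labeled_windows:
        sums[label] = sums.get(label, 0.0) + feature(window)
        counts[label] = counts.get(label, 0) + 1
    return {label: sums[label] / counts[label] for label in sums}

def classify(window, centroids):
    """Assign the label whose centroid is closest to the window's feature."""
    f = feature(window)
    return min(centroids, key=lambda label: abs(centroids[label] - f))
```

One variance feature is far too crude for real deployments (running and cycling can look alike); practical systems use many features and stronger classifiers, but the train/classify shape is the same.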
EASY
▪ Coarse location (~10m)
▪ Device orientation
▪ Motion
▪ Time
▪ Date
▪ Proximity (~10cm)
▪ Proximity (~10m)
▪ (Light)
▪ (Noise)
HARD
▪ Precise location
▪ Indoor location
▪ User orientation
▪ Activity
▪ Social setting
▪ User emotion/mood
▪ User intent
▪ User attention
Indirect mapping
▪ GPS -> location: direct mapping (more or less)
▪ Accelerometer -> activity: indirect
Limitations of sensing
▪ The light sensor is useless when the phone is in a pocket or bag
▪ The magnetometer and GPS are unreliable indoors
▪ Accelerometer readings are not unique across different motions, and may differ for the same motion
Un-sensable aspects of context
▪ Social nuance
▪ Intent
▪ Attention
Forms of user involvement:
▪ Active sensing (e.g., take a picture)
▪ Training recognizers (e.g., labeling input data or contextual states)
▪ Correcting errors (e.g., providing negative feedback)
▪ Compensating for shortcomings (e.g., manual entry of ‘I’m busy’)
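The "correcting errors" idea can be sketched as a feedback store: when the user rejects an inference, record it as a negative example (and, if they supply a correction, a new positive example) for later retraining. The class and method names are hypothetical, not from any real system.

```python
class FeedbackStore:
    """Collect user confirmations/rejections of inferences for retraining."""

    def __init__(self):
        self.examples = []  # (features, predicted_label, true_label_or_None)

    def confirm(self, features, predicted):
        """User agreed: the prediction doubles as a true label."""
        self.examples.append((features, predicted, predicted))

    def reject(self, features, predicted, corrected_label=None):
        """User disagreed: negative feedback, optionally with a correction."""
        self.examples.append((features, predicted, corrected_label))

    def training_pairs(self):
        """Pairs usable for retraining: only examples with a known true label."""
        return [(f, true) for f, _, true in self.examples if true is not None]
```

The burden problem from the next slide shows up immediately: every call to `confirm` or `reject` is an interruption, so real systems must ration how often they ask.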
▪ Soliciting user input: providing training data and/or corrections can be burdensome
▪ Maintaining user trust: conveying the uncertainty of inferences; explaining system actions
▪ Privacy and disclosure: what data is being tracked? Who owns it? Who can access it?
As part of a personalized learning process, we need users to “label” events and states
▪ Intille et al., CHI ’03: A Context-Aware Experience Sampling Tool
▪ Lim et al., MobileHCI ’11: design of an intelligible mobile context-aware application
▪ Tsai et al., CHI ’09: “Who’s viewed you?”, a location-sharing app that lets you review its access log
Goal: provide an effective user experience
Best practice: iterate (rapidly)
[Figure: the iterative design cycle: Start -> Identify Needs -> (Re)Design -> Prototype/Build -> Evaluate -> (repeat) -> End]
Best practice for all interactive systems
You’ll never get it right the first time
Especially for context
▪ Does the system notion of “context” match the user’s notion?
▪ User interaction/personalization
▪ Confidence, accuracy, trust
▪ Privacy and disclosure
Evaluation is expensive and time-consuming
It is very difficult to replicate contextual conditions in the lab
Two basic approaches to support context-aware prototyping
Make deployment easier
Support simulation
[Image credits: Damon Hart-Davis; the-ark.org]
Talking Points field study (Yang et al. ASSETS 2011)
Field testing lo-fi prototypes (WOz)
Topiary (Li, Hong, and Landay 2004)
Activity Designer (Li and Landay 2008)
Momento (Carter and Mankoff 2007)
[Image credit: http://www.akworld.net/webblog/tag/centro/page/3/]
Simulation
UBIWISE (Barton 2001)
TATUS (O’Neill et al. 2005)
[Figure: Capture Probes -> RePlay System -> Application under development]
RePlay components:
▪ Episodes & Clips
▪ World State
▪ Player
▪ Transforms
▪ Preview
First study:
▪ Corpus: 70 clips, 12 episodes, collected over 3 weeks from team members & friends
▪ Participants: 10 Java developers; 7 sessions, 2 hours each
▪ Procedure: demo of RePlay + H&N, then 2 tasks
▪ Qualitative analysis: system logs, think-aloud
Estimated Time of Arrival (ETA)
Arrival Detection (AD)
Only one participant “solved” both tasks
Others improved their understanding, e.g.:
▪ What happens if the signal is lost?
▪ The AD distance threshold needs to be larger than the average GPS error
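The AD finding can be sketched directly: an arrival detector whose threshold is padded to dominate typical GPS noise, so fixes near the destination do not toggle the arrival state. The padding factor and all numbers are assumptions for illustration, not values from the study.

```python
def effective_threshold(threshold_m, avg_gps_error_m, pad=2.0):
    """Pad the nominal arrival threshold so it exceeds typical GPS error."""
    return max(threshold_m, pad * avg_gps_error_m)

def detect_arrival(distances_m, threshold_m, avg_gps_error_m):
    """Declare arrival on the first fix within the padded threshold.

    distances_m: distances (meters) from each GPS fix to the destination.
    Returns the index of the arrival fix, or None if never within range.
    """
    t = effective_threshold(threshold_m, avg_gps_error_m)
    for i, d in enumerate(distances_m):
        if d <= t:
            return i
    return None
```

With a nominal 10 m threshold but 15 m average GPS error, the effective threshold becomes 30 m, which is exactly the kind of adjustment participants discovered through replayed data.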
Why?
▪ Selecting examples
▪ Manipulating the data
▪ Controlling playback during iterative testing
New RePlay tools:
▪ Clip Browser
  ▪ Selection brushes
  ▪ Dynamic queries
  ▪ Attribute-based filtering and markup
▪ Clip Editor
  ▪ Raw data editing
  ▪ Transforms (attribute-based editing)
▪ Annotation-Based Playback
  ▪ Automatic annotations
  ▪ Manual annotations
  ▪ Annotation-based control
  ▪ Synchronized pop-ups
Second study:
▪ Corpus: 200 clips, no episodes, collected over several months from team members & friends
▪ Participants: 10 Java developers; 10 sessions, 2 hours each
▪ Procedure: demo of RePlay + H&N, then 2 tasks
▪ Qualitative analysis: system logs, think-aloud
▪ Seven developers succeeded in both tasks
▪ One succeeded in one task
▪ Two failed in part because they didn’t understand the study setup (the relationship between the system-under-test and the RePlay tool)
▪ Extensive use of the Clip Browser, including annotations
▪ Little use of the Clip Editor and Annotations
▪ Several usability problems
Context-aware mobile computing has a long history (kinda)
Current mobile systems mainly focus on location
Location is not easy, but other forms of context are even harder
HCI issues will play a big role in user adoption even after technical issues get worked out
It is still too hard to prototype and iterate on context-aware systems
Contributors (http://inteco.groups.si.umich.edu)
• Stanley (Yung-Ju) Chang
• Perry (Pei-Yao) Hung
• Rayoung Yang
• Manchul Han
• I-Chun Hsiao
• Gaurav Paruthi
• Jeff (Chuan-che) Huang
• Jungwoo Kim
• Ben Congleton
• Mark Ackerman
• Atul Prakash
Helpful comments
• Jason Hong
• Jeffrey Heer
• Eytan Adar
• Paul Resnick
• Members of MISC
• Friendly reviewers
Funding
• National Science Foundation (0705672 and 0905460)
• Intel Research