
Page 1: Cognitive Insights drive self-driving Accessibility

CREATING THE WORLD’S MOST ACCESSIBLE, SELF-DRIVING VEHICLE: OVERVIEW, 11 May 2017

Page 2

Important Factors

• Over one billion people, or 15% of the world's population, experience some form of disability (World Health Organization).

• According to the United Nations, the number of people aged 60 years or older is projected to grow by 56 percent worldwide by 2030.

• 1.6 billion people, or 23% of the world’s population, are over age 50.

• The world is experiencing an explosion in connected devices and cognitive computing capabilities, with over 6.4 billion connected devices and interfaces worldwide.

• It is estimated that 10 million self-driving cars will be on the road by 2020.

People with Disabilities

New Technologies

Aging Population

Self-Driving Vehicles

Page 3

About the Project

IBM Research and Watson IoT have announced a new collaboration with Local Motors and the Consumer Technology Association to create the world's most accessible, self-driving vehicle.

• This partnership will leverage the power of IBM Watson, the Internet of Things, and new innovations in accessibility technology.

• We will be launching a series of workshops with talented people of all ages, backgrounds, and abilities, including P-TECH schools, high school STEM programs, AARP, and universities including MIT and Princeton.

• Creating the most accessible vehicle will:

• Allow older adults to age in place and remain independent and self-sufficient in their communities

• Help those with cognitive, vision, memory or physical challenges to leverage transportation, improving their independence and quality of life

• Provide enhanced access to work opportunities and community services to people of all ages and abilities

• Ensure a successful transition to driverless cars for all

Page 4

Workshops and Hackathons

Page 5

P-TECH at Carver Workshop

Location: Local Motors, National Harbor, MD
Date: February 9, 2017
Audience: P-TECH Carver students

Brainstorm Challenge - High Impact Area Recommendations

AARP Hatchery Workshop

Location: AARP Hatchery, Washington, DC
Date: February 10, 2017
Audience: Older adults (60+)

Recommendations

• How would you know what to do in an emergency or dangerous situation (can't hear or see)?
• How to safely find and get to an Olli stop
• How do I get on and off Olli?
• How do I know where to sit?
• How do I know where I am and when to get off; missing stops
• How do I interact with other passengers (can't hear, see, speak)?

• Make Olli on demand - Olli should come to the door and pick up at the exact time and location via an easy-to-use call application

• Make Olli safe - should be able to avoid danger/accidents, ensure safety at stops and onboard, give emergency instructions, and call the nearest law enforcement and first responders based upon location

• Make Olli personal - should recognize/scan riders before they enter the vehicle and know their preferences and selected profile information

• Make Olli comfortable - easy to get on/off, ability to accommodate or stow equipment (cane, wheelchair, umbrella, etc.), seating that adjusts for bone/joint issues

• Make Olli easy to understand - information delivery should be clear and delivered in multiple formats, visual and auditory

• Make Olli affordable to use
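The "Olli on demand" recommendation amounts to a request-and-dispatch loop: riders call for a pickup, and the vehicle serves requests in time order. A minimal Python sketch of that idea - the class names, fields, and scheduling policy are all hypothetical, not part of the project:

```python
from dataclasses import dataclass
import heapq

@dataclass(frozen=True)
class RideRequest:
    rider_id: str
    pickup_location: tuple  # (lat, lon) - illustrative coordinates
    pickup_time: int        # minutes from now

class OlliDispatcher:
    """Toy dispatcher: serves requests in order of requested pickup time."""
    def __init__(self):
        self._queue = []
        self._counter = 0  # tie-breaker so heapq never compares RideRequest objects

    def request_ride(self, req: RideRequest) -> None:
        heapq.heappush(self._queue, (req.pickup_time, self._counter, req))
        self._counter += 1

    def next_pickup(self) -> RideRequest:
        return heapq.heappop(self._queue)[2]

dispatcher = OlliDispatcher()
dispatcher.request_ride(RideRequest("rider-a", (38.78, -77.02), pickup_time=15))
dispatcher.request_ride(RideRequest("rider-b", (38.79, -77.01), pickup_time=5))
print(dispatcher.next_pickup().rider_id)  # rider-b is picked up first
```

A real dispatcher would also weigh vehicle position, capacity, and accessibility needs; this toy version orders purely by requested pickup time.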

Page 6

Front Porch Workshops

Location: Retirement and assisted living centers in San Diego, Carlsbad, and La Jolla, CA
Date: February 27-28, 2017
Audience: Older adults (60+)

Recommendations

Visual Impairment Workshop

Location: IBM Research, Cambridge, MA
Date: March 7, 2017
Audience: Advocates for the visually impaired and transportation (Mass Association for the Blind, Perkins School, UMass, Carroll School, MBTA)

Recommendations

• Alerts/notifications - need multiple interfaces: audio, text, haptic

• Develop a Bluetooth connection for a headset coupled with a smartphone/pad app - can provide personalized instructions and experience

• Provide an Alexa-like service or app to call for or schedule an Olli

• Need a secure way to access and utilize the vehicle - fob key, biometric, or visual recognition

• Improve access on/off for wheelchair users and others with assistive devices

• Low riser for entrance/exit

• Provide storage for walkers, canes, and packages

• Develop a seat that can raise or lower automatically to help with moving between sitting and standing positions; provide options for individual seats

• Implement grab bars with certain seats

• Accessibility features should be discreet - ramps should raise/lower once a user who needs them is detected

• Have Olli recognize the rider and implement a personalized profile option

• Have personalized narration delivered over a smart device/headphones; deliver the ability to repeat info

• Voice-enabled navigation for entrance/exit and seat location

• Provide the ability to secure wheelchairs and other assistive devices

• Some vehicles have built-in clamps/restraints, others secure via straps; no standards are in place from wheelchair manufacturers - it depends on the device and vehicle

• Provide a pre-made wheelchair for Olli

• Provide moveable/retractable seats to accommodate wheelchairs

• Provide redundancy for wheelchair-accessible entrance and exit

• Accommodate service animals

Page 7

HLAA and Henry Claypool Discussions (Hearing and Mobility Impairment)

Date: March 29, 2017
Recommendations:

Hackathons

MIT ATHACK, Cambridge, MA
Date: March 4, 2017

Challenge: Create a system to guide a visually impaired rider to an open seat on a public vehicle

Solution:

• Overhead camera analyzes interior images to identify open and occupied seats

• Rider pairs smartphone to vehicle
• Camera communicates with smartphone app
• App opens in a window the size of the screen, simplifying engagement
• App communicates audio navigation guidance to the rider via Bluetooth
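The ATHACK pipeline (camera analysis → seat-occupancy map → app guidance) can be sketched in a few lines. The seat layout, occupancy data, and message wording below are invented for illustration, not taken from the hackathon solution:

```python
# Sketch of the ATHACK idea: an overhead camera yields a seat-occupancy map;
# the app turns the nearest open seat into spoken-style guidance.
SEATS = {  # seat id -> distance in rows from the door (illustrative layout)
    "1L": 0, "1R": 0, "2L": 1, "2R": 1, "3L": 2, "3R": 2,
}

def nearest_open_seat(occupied: set) -> str:
    open_seats = [s for s in SEATS if s not in occupied]
    if not open_seats:
        raise ValueError("no open seats")
    return min(open_seats, key=lambda s: SEATS[s])

def guidance(seat: str) -> str:
    rows = SEATS[seat]
    side = "left" if seat.endswith("L") else "right"
    return f"Walk forward {rows} row(s); your seat is on the {side}."

occupied = {"1L", "1R", "2R"}       # as reported by the camera's image analysis
seat = nearest_open_seat(occupied)  # "2L"
print(guidance(seat))
```

In the real system the guidance string would be spoken over the paired Bluetooth headset rather than printed.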

• As a baseline, all information should be available via text - app, exterior of vehicle, and interior

• Sign language translation highly desirable - sign to Olli, and receive sign language back

• Voice-activated systems need to integrate with T-coil loops/assistive devices

• Integration with Bluetooth devices desirable

• Integrate haptic (touch) technology for alerts

• Provide for ambient noise reduction - phone/listening cone at each seat; have info directed to telecoil/hearing aid

• All information - route, destination, updates, safety instructions, alerts - should be provided via multiple interfaces: visual, text, auditory with T-coil loop assist, sign, haptic

• Need to provide automated lock-down for wheelchairs

• Need to provide space to turn a wheelchair around in the vehicle

Page 8

Hackathons

HackPrinceton
Date: March 31, 2017

Future Workshops & Hackathons

Workshops
• City of Columbus/OSU - Mobility, TBD
• City of Las Vegas, TBD
• City of Santa Clara/Palo Alto VA

Hackathons
• TOM-NYC (tentative), Brooklyn, NY

Events
• AAPD, May 17, Local Motors National Harbor
• M-Enabling Summit, June 13-14, Washington, DC
• Watson Developer Conferences, 9 locations worldwide, Q2-Q4

Challenge: Win the Best Use of Watson IoT Challenge by creating an innovative Accessibility solution using #AccessibleOlli.

Results:
• 12 teams built with Watson; 67 teams overall submitted projects
• 500 developers engaged, 50 enabled with the Watson IoT platform

Page 9

User Scenarios and Use Cases

Page 10

Potential Solution Ideas

• Uber-like app for Olli, with voice and text interfaces

• Personalization is a key component - Olli should be able to recognize your face or your fingerprint, and know what route you usually take, what your schedule is, your interaction preference, your language preference, and who to contact in an emergency

• All interactions on Olli should be multi-modal – voice, text, touch

• Have a personal "Olli in your ear" that guides you on your trip - have Olli describe what you are passing, give you reminders and suggestions, and tell you when you have arrived at your stop

• Use voice, text, haptic, or other technologies to guide people onto the vehicle and to open seats; seats should vibrate to signal a stop

• Olli should have a seat that automatically folds away to accommodate a wheelchair user and a place to store walker or other assistive mobility devices. Wheelchairs and other assistive devices should automatically be secured.

• Have Olli detect emergency situations either in or outside the vehicle, and call the nearest responders (EMT, police, fire) for help. All emergency instructions should be given via text and speech.
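Several of these ideas hinge on a rider profile that the vehicle can apply on recognition. A minimal sketch of what such a record might hold - every field name and the greeting logic are assumptions for illustration, not the project's schema:

```python
from dataclasses import dataclass, field

@dataclass
class RiderProfile:
    """Illustrative personalization record; field names are hypothetical."""
    rider_id: str
    language: str = "en"
    preferred_interface: str = "voice"   # "voice", "text", or "haptic"
    usual_route: str = ""
    emergency_contact: str = ""
    needs: list = field(default_factory=list)  # e.g. ["wheelchair", "low-vision"]

def greet(profile: RiderProfile) -> str:
    # On boarding, Olli recognizes the rider and applies language preference
    greetings = {"en": "Welcome aboard", "es": "Bienvenido a bordo"}
    return f"{greetings.get(profile.language, 'Welcome aboard')}, {profile.rider_id}."

p = RiderProfile("maria", language="es", needs=["low-vision"])
print(greet(p))  # Bienvenido a bordo, maria.
```

The same record would drive interface choice (voice vs. text vs. haptic), route reminders, and emergency contact lookup.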

Focus on riders with these challenges:

• Visual impairments, from low vision to complete blindness
• Cognitive issues, ranging from dyslexia to memory loss
• Hearing impairments, including deafness
• Mobility issues requiring assistive devices such as wheelchairs, canes, and walkers

For each category, use cases will cover:

• Arranging for a ride on Olli
• Navigating to the vehicle
• Navigating onto the vehicle
• Securing a seat/location on the vehicle
• Experiencing the ride and being alerted to the destination
• Arriving at the destination and exiting the vehicle

Page 11

Arranging for a ride on Olli
Navigating to Olli
Navigating onto Olli & securing seat

Experiencing ride and destination alert

Arriving at destination and exiting vehicle

Use Case Template

Cognitive “Middleware” Across All Use Cases

Multi-Modal Interface
• Voice
• Text
• Haptic

Personalization
• Profile
• Language
• Interface
• Alerts

Security
• App or biometric access
• Emergency contacts and process

Visual

Hearing

Mobility

Cognitive

Directions & Guidance
• Location support
• Curb cut-outs
• Ramps
• Elevators

Integration with App or Device
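The multi-modal interface layer of this middleware amounts to fanning a single alert out to every channel the rider's profile selects. A toy sketch - the channel names and handler behavior are illustrative, not an IBM API:

```python
def dispatch_alert(message: str, channels: list) -> list:
    """Fan one alert out to every interface the rider's profile selects.
    Handlers here just record what each channel would do."""
    handlers = {
        "voice": lambda m: f"[speak] {m}",
        "text":  lambda m: f"[display] {m}",
        "haptic": lambda m: "[vibrate seat]",
    }
    return [handlers[c](message) for c in channels if c in handlers]

print(dispatch_alert("Approaching your stop", ["voice", "text", "haptic"]))
```

Keeping the channel set data-driven is what lets the same safety or destination alert reach riders with visual, hearing, mobility, or cognitive challenges without per-case code.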

Page 12

Arranging for a ride on Olli
Navigating to Olli
Navigating onto Olli & securing seat
Experiencing ride and destination alert

Arriving at destination and exiting vehicle

Visual Use Case

Cognitive middleware dimensions: Multi-Modal Interface, Personalization, Security, Directions & Guidance

Arranging for a ride on Olli - “Uber”- or “Alexa”-like app, voice activated:
• Establish a secure profile with preferences
• Call/arrange for Olli pickup
• Get updates on arrival/timetable
• Get details on location

Navigating to Olli - using a voice-activated smartphone, get:
• Updates on location
• Arrival time
• Any route issues/delays
• Navigation guidance to the pickup site

Navigating onto Olli & securing a seat - the smartphone app should sync with Olli before boarding and communicate/confirm via Bluetooth with a headset:
• Onboard info
• Identification of rider
• Destination and preferred route
• Preferences (language, interface)
• Emergency contact
• Navigation
• Using VR and the interior camera, identify open seats
• Communicate the open seat location to the rider via the app (voice)
• Communicate the location of the storage area (canes, packages, etc.)

Experiencing ride and destination alert - smartphone app provides “Olli in your ear”: voice interaction via Bluetooth integration:
• Pre-notification alert
• Arrival alert at destination
• If desired, route narration
• Emergency situations - all instructions given via voice; emergency personnel/first responders alerted that riders with disabilities are on board

Arriving at destination and exiting the vehicle - delivered via smartphone via voice:
• Destination announcement
• Reminder and retrieval of stored items
• Rider advised of the way out and other accessibility features at the destination

Cognitive middleware integration with a voice-activated smartphone app

Page 13

Looking Towards the Future

Page 14

Existing Watson Integration with Olli / Potential New Technology Areas

• Visual Recognition
  • If trained on wheelchair/walker, automatically lower ramp, fold seat
  • Detecting someone is at a stop and/or has ordered Olli
  • Rider recognition
  • Open/occupied seats
  • Emergency situations

• Personalization
  • Rider profile - created by caregiver or individual
  • Interface and language choice
  • Destination options
  • Reminders
  • Notification if rider gets off/on at non-profile locations

• Haptic
  • For blind/low-vision riders, aid with navigation on/off and with finding a seat
  • Haptic feedback to notify the rider they are at the destination

• Bluetooth integration

• Braille reader integration
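The "if trained on wheelchair/walker, automatically lower ramp, fold seat" idea is, structurally, a mapping from detected object classes to vehicle actions. A hedged sketch - the class names, confidence threshold, and action names are all assumptions, and a real system would feed this from a trained visual recognition model:

```python
# Map detected object classes (from a visual recognition model) to vehicle actions.
ACTIONS = {
    "wheelchair": ["lower_ramp", "fold_seat"],
    "walker":     ["lower_ramp"],
}

def actions_for_detections(detections: list) -> list:
    """detections: [(class_name, confidence), ...]; act only on confident hits."""
    triggered = []
    for cls, conf in detections:
        if conf >= 0.8:  # illustrative confidence cutoff
            for action in ACTIONS.get(cls, []):
                if action not in triggered:
                    triggered.append(action)
    return triggered

print(actions_for_detections([("wheelchair", 0.93), ("umbrella", 0.88)]))
# ['lower_ramp', 'fold_seat']
```

Keeping the class-to-action table as data means new assistive devices can be supported by retraining the model and adding a row, not rewriting vehicle logic.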

Page 15

Watson Analytics and Machine Learning

• There are a number of Watson services in use with Olli today - the next step is to incorporate Watson Analytics and domain-specific machine learning algorithms

• Watson Analytics is a cloud-based, smart data discovery service
  • Guides data exploration
  • Automates predictive analytics

• Machine learning is a form of artificial intelligence (AI)
  • Provides computers with the ability to learn without being explicitly programmed
  • Computer programs that can change when exposed to new data
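The definition above - learning a rule's parameters from data instead of hard-coding the rule - fits in a few lines. Here is an ordinary least-squares fit of a line in plain Python (not a Watson API; the ride-duration data is made up for illustration):

```python
# "Learn without being explicitly programmed": instead of hard-coding y = 2x + 3,
# recover the slope and intercept from example (x, y) pairs.
def fit_line(xs, ys):
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    a = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / sum((x - mx) ** 2 for x in xs)
    return a, my - a * mx  # slope, intercept

# Example: ride duration (minutes) vs. distance (km) - fabricated data
xs = [1, 2, 3, 4]
ys = [5, 7, 9, 11]               # hidden rule: y = 2x + 3
a, b = fit_line(xs, ys)
print(round(a, 2), round(b, 2))  # 2.0 3.0
```

Exposing the same function to new data produces new parameters - the "program that changes when exposed to new data" described above, in its simplest form.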

Page 16

Data-Driven Insights

• Time spent in transit can be leveraged to yield many data-driven insights

• Big data and analytics, coupled with cognitive computing approaches, can be used

• Analysis can be performed to assess the general health and well-being of passengers over time, using non-intrusive sensors to monitor:
  • Weight - via seat-mounted sensors
  • Heart rate - via infrared cameras
  • Facial expressions (for emotional well-being) - via machine learning and visual recognition
  • Gait - via machine learning and visual recognition
  • Cognitive decline - via onboard digital “games” to assess mental well-being

• This data can be fed to a larger, holistic cognitive solution…
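Monitoring well-being over time reduces, at its simplest, to comparing recent sensor readings against an earlier baseline. An illustrative sketch - the window size, threshold, and weight values are arbitrary choices for the example, not clinical guidance:

```python
# Compare recent vs. baseline averages of a passively sensed signal
# (e.g. seat-sensor weight readings, one per ride) and flag a sustained change.
def flag_change(readings, window=3, threshold=0.05):
    if len(readings) < 2 * window:
        return False  # not enough history to compare
    baseline = sum(readings[:window]) / window
    recent = sum(readings[-window:]) / window
    return abs(recent - baseline) / baseline > threshold

weights = [70.1, 70.3, 70.0, 69.8, 66.2, 65.9, 65.7]
print(flag_change(weights))  # True: recent average dropped more than 5%
```

A flag like this would not diagnose anything; it would only prompt the larger cognitive solution (or a caregiver) to take a closer look.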

Page 17

Considering a Holistic Cognitive Solution

• A cognitive knowledge aggregation solution is embodied by a contextual data fusion engine that centralizes IoT and System of Record/Engagement data across multiple channels

• “Snapshots” of passenger data captured while riding Olli can be applied to this larger, holistic cognitive solution to monitor passenger health and well-being

• Over time, Olli-based data, along with data captured from other channels (home, wearables, social, etc.), can be used to predict inflection points in well-being
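A contextual data fusion engine, at its simplest, merges per-channel snapshots for one person into a single record, keeping the newest value for each metric. A minimal sketch with invented channel and metric names:

```python
# Merge timestamped snapshots from multiple channels (Olli, home, wearables)
# into one fused record; the newest reading wins for each metric.
def fuse(snapshots):
    """snapshots: [(timestamp, channel, {metric: value}), ...]"""
    fused = {}
    for ts, channel, metrics in sorted(snapshots, key=lambda s: s[0]):
        for metric, value in metrics.items():
            fused[metric] = {"value": value, "source": channel, "at": ts}
    return fused

snapshots = [
    (100, "olli",     {"heart_rate": 72, "weight": 70.1}),
    (150, "wearable", {"heart_rate": 75}),
    (120, "home",     {"sleep_hours": 6.5}),
]
record = fuse(snapshots)
print(record["heart_rate"]["source"])  # wearable (newest reading wins)
```

Retaining the source and timestamp per metric is what lets downstream analysis weigh channels differently when predicting inflection points in well-being.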

Page 18

Questions & Answers

Page 19