Sixth Sense PPT


ABOUT

'SixthSense' is a wearable gestural interface that augments the physical world around us with digital information and lets us use natural hand gestures to interact with that information.

We've evolved over millions of years to sense the world around us. When we encounter something, someone or some place, we use our five natural senses to perceive information about it; that information helps us make decisions and choose the right actions to take. But arguably the most useful information that can help us make the right decision is not naturally perceivable with our five senses, namely the data, information and knowledge that mankind has accumulated about everything, which is increasingly available online. Although the miniaturization of computing devices allows us to carry computers in our pockets, keeping us continually connected to the digital world, there is no link between our digital devices and our interactions with the physical world. Information is traditionally confined to paper or to a screen. SixthSense bridges this gap, bringing intangible digital information out into the tangible world and allowing us to interact with it via natural hand gestures. ‘SixthSense’ frees information from its confines by seamlessly integrating it with reality, thus making the entire world your computer.

The SixthSense prototype comprises a pocket projector, a mirror and a camera, coupled in a pendant-like mobile wearable device. Both the projector and the camera are connected to a mobile computing device in the user’s pocket. The projector projects visual information, enabling surfaces, walls and physical objects around us to be used as interfaces, while the camera recognizes and tracks the user's hand gestures and physical objects using computer-vision techniques. The software processes the video stream captured by the camera and tracks the locations of the colored markers (visual tracking fiducials) at the tips of the user’s fingers using simple computer-vision techniques. The movements and arrangements of these fiducials are interpreted as gestures that act as interaction instructions for the projected application interfaces. The maximum number of tracked fingers is constrained only by the number of unique fiducials, so SixthSense also supports multi-touch and multi-user interaction.
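The transcript later notes that the prototype software was built with C#, WPF and OpenCV; as a rough illustration of the colored-marker tracking described above, here is a minimal Python/OpenCV sketch. The HSV ranges, marker names and camera index are assumptions for the example, not values from the actual system.

# Minimal sketch of colored-marker (fiducial) tracking, in the spirit of the
# pipeline described above. Illustrative only; HSV ranges must be tuned to the
# actual finger caps and lighting.
import cv2
import numpy as np

# Hypothetical HSV ranges for four colored finger caps.
MARKER_RANGES = {
    "red":    ((0, 120, 80),   (10, 255, 255)),
    "green":  ((45, 80, 80),   (75, 255, 255)),
    "blue":   ((100, 120, 80), (130, 255, 255)),
    "yellow": ((20, 120, 80),  (35, 255, 255)),
}

def track_fiducials(frame_bgr):
    """Return {marker_name: (x, y)} centroids of each colored cap found in the frame."""
    hsv = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2HSV)
    positions = {}
    for name, (lo, hi) in MARKER_RANGES.items():
        mask = cv2.inRange(hsv, np.array(lo), np.array(hi))
        mask = cv2.morphologyEx(mask, cv2.MORPH_OPEN, np.ones((5, 5), np.uint8))
        contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
        if contours:
            biggest = max(contours, key=cv2.contourArea)
            m = cv2.moments(biggest)
            if m["m00"] > 0:
                positions[name] = (m["m10"] / m["m00"], m["m01"] / m["m00"])
    return positions

cap = cv2.VideoCapture(0)
while True:
    ok, frame = cap.read()
    if not ok:
        break
    print(track_fiducials(frame))      # downstream code turns these points into gestures
    if cv2.waitKey(1) & 0xFF == 27:    # Esc to quit
        break
cap.release()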

The SixthSense prototype implements several applications that demonstrate the usefulness, viability and flexibility of the system. The map application lets the user navigate a map displayed on a nearby surface using hand gestures, similar to those supported by multi-touch based systems, letting the user zoom in, zoom out or pan using intuitive hand movements. The drawing application lets the user draw on any surface by tracking the fingertip movements of the user’s index finger. SixthSense also recognizes the user’s freehand gestures (postures). For example, the SixthSense system implements a gestural camera that takes photos of the scene the user is looking at by detecting the ‘framing’ gesture. The user can stop by any surface or wall and flick through the photos he/she has taken. SixthSense also lets the user draw icons or symbols in the air using the movement of the index finger and recognizes those symbols as interaction instructions. For example, drawing a magnifying-glass symbol takes the user to the map application, and drawing an ‘@’ symbol lets the user check his/her mail. The SixthSense system also augments the physical objects the user is interacting with by projecting additional information about those objects onto them. For example, a newspaper can show live video news, or dynamic information can be provided on a regular piece of paper. The gesture of drawing a circle on the user’s wrist projects an analog watch.
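The ‘framing’ gesture mentioned above lends itself to a simple geometric check once the four fingertip markers (both thumbs and both index fingers) are tracked. The heuristic and thresholds below are purely illustrative assumptions, not the detector used in the actual prototype.

# Rough heuristic for the 'framing' gesture: the four fingertip markers roughly
# form the corners of a rectangle (a "viewfinder").
import math

def looks_like_frame(points, fill_tolerance=0.25):
    """points: list of four (x, y) fiducial centroids. Returns True for a frame-like layout."""
    if len(points) != 4:
        return False
    xs = [p[0] for p in points]
    ys = [p[1] for p in points]
    w, h = max(xs) - min(xs), max(ys) - min(ys)
    if w < 1 or h < 1:
        return False
    # Each corner of the bounding box should have one marker near it.
    corners = [(min(xs), min(ys)), (max(xs), min(ys)), (min(xs), max(ys)), (max(xs), max(ys))]
    def near_corner(c):
        return any(math.hypot(p[0] - c[0], p[1] - c[1]) < fill_tolerance * max(w, h) for p in points)
    return all(near_corner(c) for c in corners)

# Example: four markers roughly at the corners of a 300x200 'viewfinder'.
demo = [(10, 12), (305, 8), (6, 205), (298, 199)]
if looks_like_frame(demo):
    print("framing gesture detected -> capture photo")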

The current prototype system costs approximately $350 to build. Instructions on how to make your own prototype device are forthcoming.

Inktuitive

Despite the advances and advantages of computer-aided design tools, the traditional pencil and paper continue to exist as the most important tools in the early stages of design. The goal of the project ‘Inktuitive’ is to combine the intuitive process of creation that is inherent in paper and pencil with the power of computing that digital design tools provide. Inktuitive also extends the natural work practice of using physical paper by giving the pen the ability to control the design in physical three-dimensional space, freeing it from its tie to the paper. The intuition of pen and paper is still present, but lines are captured and translated into shapes in the digital world. The physical paper is augmented with overlaid digital strokes. Furthermore, the platform provides a novel interaction mechanism for drawing and designing using above-the-surface pen movements.

‘Inktuitive’ is an intuitive physical design workspace that aims to bridge the gap and bring together conventional design tools such as paper and pencil with the power and convenience of digital tools for design.

Abstract

Despite the advances and advantages of computer-aided design (CAD) tools such as Autodesk Revit, ArchiCad and SketchUp, the traditional pencil and paper continue to exist as some of the most important tools in the process of design. We present ‘Inktuitive’, an intuitive physical design workspace that aims to bridge the gap and bring together conventional design tools such as paper and pencil with the power and convenience of digital tools for design.

Keywords: design tools, paper and pencil, 3D drawing, workspace, connecting the physical and digital world.

1 Introduction

The development of computer-aided design (CAD) tools has leveraged the power of the digital world in the process of design by allowing designers to express their creations in new ways. As opposed to the conventional method of drawing primitives on multiple 2-dimensional (2D) representations (views), it is becoming increasingly popular to directly model 3-dimensional (3D) representations of the actual component objects being used and to specify the relationships between them parametrically to design an architectural product. Digital tools such as Autodesk Revit, ArchiCad and SketchUp are examples of tools that support this new paradigm, also known as Building Information Modelling (BIM) [BIM]. However, many architects and designers still prefer to use physical tools such as paper and pen to articulate their ideas, especially at early stages of the design, where it is critical that the designer's hand motions have a direct and immediate effect on the object being designed. As the new design paradigm proliferates, challenges in adopting such affordances of physical design tools into the digital realm will become increasingly important. In addition, as designers start thinking in terms of 3D objects rather than in 2D views of objects, a new set of interactions that augments the conventional use of pen on paper needs to be sought. Instead of replacing the traditional tools, paper and pencil, it will be more fruitful to merge the advantages and functionalities of digital design tools with the traditional work practice of using paper and pen, hence connecting the physical and digital experiences.

2 Inktuitive - A physical design workspace

Inktuitive is an intuitive physical design workspace where designers can create and manipulate digital models and representations of ideas in a more direct and natural way. The goal of Inktuitive is to combine the intuitive process of creation that is inherent in paper and pencil with the power of computing that modern digital design tools provide. Inktuitive also extends the natural work practice of using physical paper by giving the pen the ability to control the design in physical 3D space, freeing it from its tie to the paper. The following use scenario outlines the major features of the ‘Inktuitive’ platform.

Figure 1 presents the working prototype of the ‘Inktuitive’ system [Mistry and Sekiya 2008]. The ‘Inktuitive’ system provides a physical workspace environment where digital content is projected from below onto the paper placed over the frosted-glass surface of the desk. Pen movement on the surface of the paper is tracked by ultra-sonic digital-pen hardware: two stationary sensors receive ultra-sonic waves emitted by a transmitter placed at the tip of the pen, and the device computes the location of the pen tip on the paper from the arrival times of the waves at the two stationary receivers.

Figure 1. ‘Inktuitive’ prototype.
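As a worked illustration of this time-of-flight calculation, the sketch below recovers the pen position as the intersection of the two range circles around the receivers. The receiver layout, units and speed-of-sound constant are assumptions for the example, not details taken from the Inktuitive hardware.

# Illustrative 2D localization of the pen tip from two ultrasonic receivers.
import math

SPEED_OF_SOUND = 343.0e3  # mm/s at room temperature (assumed)

def pen_position(t1, t2, receiver_spacing):
    """t1, t2: wave travel times (s) to receivers at (0, 0) and (receiver_spacing, 0), in mm.
    Returns (x, y) of the pen tip on the paper plane, taking y >= 0 (the writing area)."""
    r1 = SPEED_OF_SOUND * t1
    r2 = SPEED_OF_SOUND * t2
    d = receiver_spacing
    x = (r1**2 - r2**2 + d**2) / (2.0 * d)   # intersection of the two range circles
    y_sq = r1**2 - x**2
    if y_sq < 0:
        raise ValueError("inconsistent ranges: circles do not intersect")
    return x, math.sqrt(y_sq)

# Example: receivers 300 mm apart, pen actually at (120, 200) mm.
x_true, y_true = 120.0, 200.0
t1 = math.hypot(x_true, y_true) / SPEED_OF_SOUND
t2 = math.hypot(x_true - 300.0, y_true) / SPEED_OF_SOUND
print(pen_position(t1, t2, 300.0))  # ~(120.0, 200.0)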

The pen tip is also augmented with an infra-red (IR) LED array for tracking the 3D coordinates {X, Y, Z} of the pen above the paper surface using a stereo-vision mechanism. Two USB cameras with IR-pass filters mounted above the workspace detect the IR light emitted by the IR LEDs, which is used to calculate the {X, Y, Z} coordinates of the pen tip via triangulation. The projector, ultra-sonic pen hardware, cameras and the navigational knob that lets the user navigate through 3D space are all connected to a computer. A software program processes the data collected by the digital-pen hardware and the two cameras. Image processing of the two camera inputs provides the system with the absolute location of the pen tip (the IR LEDs) in 3D space above the paper surface, while the ultra-sonic components help capture precise user strokes on the paper surface. The software program uses these two streams of user hand motion to augment the paper on the desk with projected digital information. A vertical display screen provides the user with a 3D view of the created objects.
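The triangulation step can be illustrated with the standard rectified-stereo depth relation, Z = f * B / disparity. The camera parameters below are invented for the example; the actual prototype's calibration is not given in the paper.

# Sketch of stereo triangulation: two calibrated, rectified IR cameras see the
# pen-tip LED; the disparity between the two image positions gives depth.

FOCAL_PX    = 700.0     # focal length in pixels (assumed)
BASELINE_MM = 120.0     # distance between the two cameras (assumed)
CX, CY = 320.0, 240.0   # principal point of the left camera (assumed, 640x480 image)

def triangulate(left_px, right_px):
    """left_px, right_px: (x, y) pixel positions of the IR blob in the rectified
    left/right images. Returns (X, Y, Z) in mm in the left-camera frame."""
    xl, yl = left_px
    xr, _ = right_px
    disparity = xl - xr
    if disparity <= 0:
        raise ValueError("non-positive disparity: the blob must appear further left in the right image")
    z = FOCAL_PX * BASELINE_MM / disparity
    x = (xl - CX) * z / FOCAL_PX
    y = (yl - CY) * z / FOCAL_PX
    return x, y, z

# Example: the LED appears at x=400 in the left image and x=330 in the right image.
print(triangulate((400.0, 260.0), (330.0, 260.0)))  # Z = 700 * 120 / 70 = 1200 mm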

3 Conclusion

In this paper, we present ‘Inktuitive’, an intuitive physical design workspace which brings the power of computing and digital design tools to the traditional work practice of designing with paper and pencil. Furthermore, the platform provides a novel interaction mechanism for drawing and designing using above-the-surface pen movements in 3D.


References

[BIM] Building Information Modeling. http://en.wikipedia.org/wiki/Building_Information_Modeling. Accessed 15 Dec 2008.

Mistry, P. and Sekiya, K. 2008. Inktuitive: An Intuitive Physical Design Workspace. In Proceedings of the 4th International Conference on Intelligent Environments (IE08), Seattle, USA.

An Invisible Computer Mouse – Yet Another Invention by Pranav Mistry

Pranav Mistry, who earlier made headlines for his invention SixthSense and received a Popular Science 2009 Invention Award for it, has now invented another device in a similar spirit, and this time it is invisible: a mouse that, amusingly, costs just $20 to build as a prototype.

Computer technology has seen many evolutions: from room-sized CPUs to microprogrammed slim netbooks, from heavy, bulky monitors to thin LCDs, and from hard disks of a few megabytes to drives holding terabytes. Yet through all of this, one thing has remained nearly unchanged and unevolved: the mouse we move around to interact with the computer.

Mouseless is an invisible computer mouse developed by Pranav Mistry in the MIT Fluid Interfaces Group. It provides the familiar, intuitive interaction of a physical mouse without actually needing any real mouse hardware, removing the requirement of a physical mouse altogether while preserving the interaction everyone is used to.


Mouseless consists of an infrared (IR) laser beam module and an infrared camera, both embedded in the computer itself. The laser module is modified with a line cap and placed so that it creates a plane of IR light just above the surface the computer sits on. The user cups their hand as if a physical mouse were present underneath, and the laser plane lights up the parts of the hand that are in contact with the surface. The IR camera detects these bright IR blobs using computer vision. Changes in the position and arrangement of the blobs are interpreted as cursor movement and mouse clicks: as the user moves their hand, the on-screen cursor moves accordingly, and when the user taps their index finger, the size of the blob changes and the camera recognizes the intended mouse click.
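A minimal sketch of this pipeline, assuming a generic IR camera exposed as a normal capture device: threshold the frame, track the brightest blob, and translate its motion and sudden area changes into cursor deltas and clicks. The brightness threshold and tap heuristic are assumptions, not the original prototype's parameters.

# Mouseless-style IR blob tracking sketch (illustrative only).
import cv2

THRESH = 200            # brightness cutoff for "lit by the IR laser" (assumed)
TAP_AREA_JUMP = 1.6     # relative blob-area increase treated as a finger tap (assumed)

def largest_blob(gray):
    _, mask = cv2.threshold(gray, THRESH, 255, cv2.THRESH_BINARY)
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    if not contours:
        return None, 0.0
    c = max(contours, key=cv2.contourArea)
    m = cv2.moments(c)
    if m["m00"] == 0:
        return None, 0.0
    return (m["m10"] / m["m00"], m["m01"] / m["m00"]), cv2.contourArea(c)

cap = cv2.VideoCapture(0)               # the IR camera
prev_pos, prev_area = None, None
while True:
    ok, frame = cap.read()
    if not ok:
        break
    pos, area = largest_blob(cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY))
    if pos and prev_pos:
        dx, dy = pos[0] - prev_pos[0], pos[1] - prev_pos[1]
        print(f"move cursor by ({dx:+.1f}, {dy:+.1f})")       # feed an OS cursor API here
        if prev_area and area > TAP_AREA_JUMP * prev_area:
            print("click")                                    # blob grew sharply: index-finger tap
    prev_pos, prev_area = pos, area
    if cv2.waitKey(1) & 0xFF == 27:
        break
cap.release()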

INTRODUCTION

“Sixth Sense is a wearable gestural interface device that augments the physical world with digital information and lets people use natural hand gestures to interact with that information.” It was developed by Pranav Mistry.

COMPONENTS

A pocket projector,
A mirror,
A camera,
A mobile computing device,
Colored markers.


The Projector: projects visual information, enabling surfaces, walls and physical objects around the wearer to be used as interfaces.

The Camera: recognizes and tracks the user's hand gestures and physical objects using computer-vision based techniques.

The Software Program: processes the video stream data captured by the camera and tracks the locations of the colored markers (visual tracking fiducials) at the tips of the user’s fingers using simple computer-vision techniques. The maximum number of tracked fingers is constrained only by the number of unique fiducials, so Sixth Sense also supports multi-touch and multi-user interaction.

Technology behind Sixth Sense


Hardware Setup.

Software Setup.

Events & Gestures.

System Architecture

Hardware setup

MOBILE: A Nokia N95 smartphone is used (running the Symbian OS S60 edition). Its multitasking capability lets it run both the gesture-tracking engine and the gesture-enabled application, using input from the built-in camera.

PROJECTOR

The PK101 pocket projector from Optoma is used.

It augments nearby surfaces.

It is an LED-based projector.

Suitable for mobile use.

Software setup

Applications are implemented using Java 2 Micro Edition. The computer-vision library is written in Symbian C++ (used in gesture tracking). The software for the Sixth Sense prototype is developed on a Microsoft Windows platform using C#, WPF and OpenCV.

HOW S/W WORKS

The software works on the basis of computer vision: a small camera acts as an eye, connecting us to the world of digital information. Processing happens on the mobile phone, using computer-vision algorithms.

THE SOFTWARE RECOGNIZES THREE KINDS OF GESTURES:

MULTITOUCH: like the gestures we see on the iPhone, where we touch the screen and make the map move by pinching and dragging.

FREE HAND: like when you take a picture, or a namaste gesture to start the projection on the wall.

ICONIC: drawing an icon in the air. For example, whenever we draw a star, show us the weather details; when we draw a magnifying glass, show us the map (a small dispatch sketch follows this list).
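Once an upstream recognizer (not shown here) has labeled the air-drawn stroke, routing icons to applications is a simple lookup. The sketch below is only illustrative; the symbol names and actions are assumptions based on the examples above.

# Tiny dispatch sketch for the iconic gestures listed above.
ICON_ACTIONS = {
    "star":             lambda: print("opening weather details"),
    "magnifying_glass": lambda: print("opening map application"),
    "at_sign":          lambda: print("opening mail"),
    "circle_on_wrist":  lambda: print("projecting analog watch"),
}

def handle_icon(symbol: str) -> None:
    action = ICON_ACTIONS.get(symbol)
    if action:
        action()
    else:
        print(f"unrecognized symbol: {symbol}")

handle_icon("magnifying_glass")   # -> opening map application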

Events & Gestures


Events and gestures are detected using colored markers attached to the fingers. The user can zoom in or out by moving their hands or fingers farther apart or nearer to each other, respectively. The user can draw on any surface using the movement of the index finger as a pen. The system also supports freehand gestures (postures): one example is to touch both index fingers to the opposing thumbs, forming a rectangle or ‘framing’ gesture. This gesture activates the photo-taking application. Another example is the ‘Namaste’ posture, which lets the user navigate to the home screen. The system also lets the user draw icons or symbols in the air using the movement of the index finger.
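A minimal sketch of the zoom gesture, assuming two fingertip markers are already being tracked: the zoom factor is the ratio of the current marker distance to the distance at the start of the pinch. The class and values below are illustrative, not the prototype's actual code.

# Pinch-zoom from two tracked fingertip markers.
import math

def distance(a, b):
    return math.hypot(a[0] - b[0], a[1] - b[1])

class PinchZoom:
    def __init__(self):
        self.start_dist = None

    def update(self, thumb, index):
        """thumb, index: (x, y) marker centroids. Returns the current zoom factor."""
        d = distance(thumb, index)
        if self.start_dist is None:
            self.start_dist = d
        return d / self.start_dist     # >1 zoom in (fingers apart), <1 zoom out

pinch = PinchZoom()
print(pinch.update((100, 100), (200, 100)))  # 1.0 at the start of the gesture
print(pinch.update((80, 100), (240, 100)))   # 1.6 -> zoom in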


TECHNIQUE BEHIND

The hardware that makes Sixth Sense work is a pendant-like mobile wearable interface. It has a camera, a mirror and a projector, and is connected wirelessly to a Bluetooth smartphone that can slip comfortably into one’s pocket. The camera recognizes individuals, images, pictures and the gestures one makes with one's hands. Information is sent to the smartphone for processing. The downward-facing projector projects the output image onto the mirror, and the mirror reflects the image onto the desired surface. Thus, digital information is freed from its confines and placed in the physical world.

COST

The current prototype system costs approximately $350 to build, mainly due to the micro-projector. The software may be made available for free, on the model of open and editable freeware.

ADVANTAGES

Portable

Inexpensive

Multi-sensory

Connectedness between the world and information

It is open source

Data access directly from the machine in real time

LIMITATIONS

The software does support the use of real-time video streams to produce augmented reality; the limitations lie in the hardware of the devices we currently carry around with us. For example, many phones will not allow the external camera feed to be manipulated in real time, although post-processing can occur.

FUTURE OF 6TH SENSE

Interactive advertisements.

3D visualizations.

Solar batteries via a small solar panel.

The camera can act as a third eye for a blind person.

“According to researchers, after 10 years we will be here with the ultimate sixth-sense brain implant.”

CONCLUSION

Sixth Sense recognizes the objects around us, displaying information automatically and letting us access it in any way we need. The Sixth Sense prototype implements several applications that demonstrate the usefulness, viability and flexibility of the system, allowing us to interact with this information via natural hand gestures. It has the potential to become the ultimate "transparent" user interface for accessing information about everything around us.
