Interactive Screen

Uploaded by hajermohammed on 01-Jul-2015

DESCRIPTION

Interactive Screen is a 2010 graduation project at Ain Shams University and an extension of the 2009 Interactive Wall project. Interactive Screen won first place in the MIE (Made In Egypt) competition organized by the IEEE. For any details please contact: [email protected]@[email protected]@gmail.com

TRANSCRIPT

Page 1: Interactive Screen

Section 1

Faculty of Computer and Information Science

Ain Shams University

Page 2: Interactive Screen

Supervisors:
Professor Dr. Mohammed Roushdy, Faculty of Computer and Information Sciences
Dr. Haythem El-Messiry, Faculty of Computer and Information Sciences
T.A. Ahmad Salah, Faculty of Computer and Information Sciences

Sponsors:

Page 3: Interactive Screen

Teamwork:

Abu-Bakr Taha Abdel Khalek

Hadeel Mahmoud Mohammed

Hager Abdel Motaal Mohammed

Mahmoud Fayez El-Khateeb

Yasmeen Abdel Naby Aly

Page 4: Interactive Screen

Agenda:
1. Introduction.
2. Interactive Screen vs. Other Systems.
3. Market Research & Customer Needs.
4. Physical Environment.
5. System Framework.
6. System Modules.
7. Applications.
8. Limitations.
9. Future Work.
10. Final Demo.
11. References.

Page 5: Interactive Screen

Overview:
A projector-and-two-cameras system.
An HCI (Human-Computer Interaction) system.
The user interacts with hand gestures (shapes).
An extension of Interactive Wall '09.

Introduction: Problem Definition:
Humans interact naturally with one another through motion. It can be annoying and impractical to use hardware equipment to interact with someone or something.

Page 6: Interactive Screen

Introduction: Motivation:

1. "Interactive Wall 2009".
2. Multi-touch technology.
3. A large touch screen at an appropriate cost.
4. Flexibility.

Page 7: Interactive Screen

Our vision is a future without annoying input devices, and to be proud of being part of accomplishing such a dream.

Our goal is to develop the Interactive Screen system to handle more features and gestures, and to overcome its limitations, so that users are satisfied with its usability and flexibility of use.

Page 8: Interactive Screen

Time Plan, Milestones:

• Segmentation modules: May 2010
• Multi-hand tracking: April 2010
• Automatic hand detection: April 2010
• Z-Depth module: April 2010
• Dynamic gesture module: May 2010

Page 9: Interactive Screen

Physical Environment:

• Simple components construct a new interactive-screen environment that overcomes the limitations of other systems.
• Limitations of the traditional environment.

Page 10: Interactive Screen

Proposed Physical Environment:

Page 11: Interactive Screen

Physical Environment:

Alternative solutions vs. the proposed solution.

Page 12: Interactive Screen

Interactive Screen vs. Other Systems

Compared against: Microsoft Surface, DiamondTouch, touch screens.

• Cost.
• No need to touch the screen.
• Gesture recognition.
• Dynamic gesture recognition.
• Bare hands.
• No sensors: pure image processing.

Page 13: Interactive Screen

Market Research

• In 1991, the first smart whiteboard appeared.
• Over 1.6 million smart whiteboards have been installed throughout the world.
• Surveys indicate that interactive whiteboards benefit student engagement, learner motivation, and knowledge retention.

Page 14: Interactive Screen

Framework (System Controlling):

Input → Calibration → Segmentation → Hand Detection → Multi-hand Tracking → Touch Detection → Gesture Recognition → Event → Interface

Page 15: Interactive Screen

Experimental Results: Calibration

The Calibration module applies a geometric calibration using the four calibration points acquired by the configuration module.
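A four-point geometric calibration of this kind can be sketched as fitting a planar homography that maps camera pixels onto screen pixels. This is an illustrative reconstruction, not the project's actual code; the function names are hypothetical and only NumPy is assumed.

```python
import numpy as np

def calibration_homography(src_pts, dst_pts):
    """Solve for the 3x3 homography H that maps the four calibration
    points src_pts (camera pixels) onto dst_pts (screen pixels).
    Four point pairs give exactly the 8 equations needed for the
    8 unknowns (H is fixed up to scale, so h33 = 1)."""
    A, b = [], []
    for (x, y), (u, v) in zip(src_pts, dst_pts):
        A.append([x, y, 1, 0, 0, 0, -u * x, -u * y]); b.append(u)
        A.append([0, 0, 0, x, y, 1, -v * x, -v * y]); b.append(v)
    h = np.linalg.solve(np.array(A, float), np.array(b, float))
    return np.append(h, 1.0).reshape(3, 3)

def to_screen(H, pt):
    """Map one camera point into screen coordinates (homogeneous divide)."""
    x, y, w = H @ np.array([pt[0], pt[1], 1.0])
    return (x / w, y / w)
```

With the four screen corners as calibration points, every later hand position can be warped into screen space with `to_screen` before hit-testing interface elements.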

Page 16: Interactive Screen

Experimental Results: Segmentation

Its main task is to generate, from the captured image, a binary image that represents the foreground. Tested on both simple and complex backgrounds.
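The simplest form of this foreground/background split is frame differencing against a reference background, as surveyed in reference [5]. A minimal sketch, assuming grayscale NumPy frames; the threshold value and function name are illustrative, not the project's:

```python
import numpy as np

def segment_foreground(frame, background, threshold=30):
    """Binary segmentation by background subtraction: pixels that
    differ from the reference background by more than `threshold`
    grey levels become foreground (255), the rest background (0)."""
    # Widen to int16 first so the subtraction cannot wrap around uint8.
    diff = np.abs(frame.astype(np.int16) - background.astype(np.int16))
    return np.where(diff > threshold, 255, 0).astype(np.uint8)
```

A complex background mostly stresses the choice of threshold and reference frame; skin-color thresholding (reference [6]) is the usual complement when plain differencing is too noisy.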

Page 17: Interactive Screen

Experimental Results: Hand Detection

Responsible for detecting the hand position automatically at any location, given a certain entry gesture (open hand).
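Detecting one fixed entry gesture pairs naturally with template matching; reference [2] in this deck covers fast normalized cross-correlation for exactly this. Below is a brute-force (not the "fast") NCC sketch, assuming 2-D grayscale NumPy arrays; the function name is hypothetical:

```python
import numpy as np

def ncc_detect(image, template):
    """Slide the template over a grayscale image and return the
    (x, y) top-left corner of the window with the highest normalized
    cross-correlation score, plus the score itself (in [-1, 1])."""
    ih, iw = image.shape
    th, tw = template.shape
    t = template.astype(float)
    t -= t.mean()
    t_norm = np.sqrt((t * t).sum())
    best_score, best_pos = -2.0, (0, 0)
    for y in range(ih - th + 1):
        for x in range(iw - tw + 1):
            w = image[y:y + th, x:x + tw].astype(float)
            w -= w.mean()
            denom = np.sqrt((w * w).sum()) * t_norm
            if denom == 0:  # flat window: correlation undefined, skip
                continue
            score = (w * t).sum() / denom
            if score > best_score:
                best_score, best_pos = score, (x, y)
    return best_pos, best_score
```

In practice a score cutoff (e.g. accept only matches above 0.8) decides whether the open-hand gesture is present at all; the fast variant in [2] precomputes sums with integral images instead of recomputing each window.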

Page 18: Interactive Screen

Experimental Results: Multi-hand Tracking

Responsible for keeping track of the user's hands, knowing the actual position of each hand at all times.
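A minimal way to keep per-hand identities across frames is greedy nearest-neighbour data association: match each existing track to the closest new detection, and open a new track for anything left over. The slides do not describe the actual tracker, so this is a hedged sketch with hypothetical names:

```python
import math

def update_tracks(tracks, detections, max_dist=50.0):
    """One tracking step. `tracks` maps track id -> (x, y) from the
    previous frame; `detections` is this frame's hand positions.
    Each track takes the nearest detection within `max_dist` pixels;
    unmatched detections start new tracks; tracks with no detection
    are dropped (the hand left the screen)."""
    unmatched = list(detections)
    updated = {}
    for tid, (tx, ty) in tracks.items():
        if not unmatched:
            break
        d, best = min(
            (math.hypot(dx - tx, dy - ty), (dx, dy)) for dx, dy in unmatched
        )
        if d <= max_dist:
            updated[tid] = best
            unmatched.remove(best)
    next_id = max(tracks, default=-1) + 1
    for det in unmatched:
        updated[next_id] = det
        next_id += 1
    return updated
```

Greedy matching is fine while hands stay well separated; crossing hands would call for a proper assignment step or a motion model.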

Page 19: Interactive Screen

Experimental Results: Touch Detection

Its main task is to decide whether the user has touched the screen or not.
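The slides do not spell out the touch test. Given the two-camera setup from the overview and the Z-Depth milestone in the time plan, one plausible sketch uses stereo disparity: the horizontal offset of the fingertip between the two views shrinks as the hand approaches the screen plane, so a disparity close to the calibrated at-screen value reads as a touch. Every name and threshold here is an assumption:

```python
def is_touching(x_left, x_right, disparity_at_screen, tol=3.0):
    """Hypothetical two-camera touch test: compare the fingertip's
    measured horizontal disparity (x_left - x_right, in pixels)
    against the disparity calibrated for a point lying on the
    screen plane; within `tol` pixels counts as a touch."""
    return abs((x_left - x_right) - disparity_at_screen) <= tol
```

The at-screen disparity would be sampled once per calibration point during setup and interpolated across the screen; single-camera depth alternatives appear in references [8] and [9].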

Page 20: Interactive Screen

Experimental Results: Gesture Recognition

Responsible for recognizing the shape of the hand.

• Static gestures.
• Dynamic gestures.

Recognized shapes: main finger, main fingers, open hand, closed hand, fingers.
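The slides name the static gesture classes but not the classifier. A minimal nearest-template sketch, assuming one reference binary mask per gesture, all the same size as the segmented hand mask; this stands in for whatever features the project actually used:

```python
import numpy as np

def classify_gesture(mask, templates):
    """Nearest-template static gesture classifier: return the label
    of the reference mask with the fewest disagreeing pixels against
    the segmented hand mask. `templates` maps label -> binary mask
    of the same shape as `mask`."""
    best_label, best_err = None, float("inf")
    for label, ref in templates.items():
        err = int(np.count_nonzero(mask != ref))
        if err < best_err:
            best_label, best_err = label, err
    return best_label
```

Dynamic gestures would add a temporal layer on top, e.g. matching the sequence of per-frame labels or hand trajectories against stored motion patterns.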

Page 21: Interactive Screen

Application: Smart Whiteboard

• In classrooms.
• In meeting rooms.

Page 22: Interactive Screen

Limitations

• The user's top clothing must be a non-skin color.
• The user must wear long sleeves.
• The user must enter the system with a certain gesture.

Page 23: Interactive Screen

Future Work

1. Multi-user system.
2. Body tracking.
3. Detecting any shape of hand.

Page 24: Interactive Screen

Demo

Page 25: Interactive Screen

References

[1] Mennat-Allah Mostafa Mohammad, Nada Sherif Abd El Galeel, Rana Mohammad Ali Roshdy, Sarah Ismail Ibrahim, "Multi Touch Interactive Surface", Faculty of Computer and Information Sciences, Ain Shams University, Cairo, Egypt, 2009.

[2] Kai Briechle, Uwe D. Hanebeck, "Template Matching Using Fast Normalized Cross Correlation", Institute of Automatic Control Engineering, Technische Universität München, 80290 München, Germany, 2001.

[3] Rafael C. Gonzalez, Richard E. Woods, "Digital Image Processing", Third Edition, Pearson, 2008.

[4] Gary Bradski, Adrian Kaehler, "Learning OpenCV", O'Reilly Media, 2008.

[5] Alan M. McIvor, "Background Subtraction Techniques", in Proc. of Image and Vision Computing, Auckland, New Zealand, 2000.

[6] Francesca Gasparini, Raimondo Schettini, "Skin Segmentation Using Multiple Thresholding", Milano, Italy, 2007.

Page 26: Interactive Screen

[7] Hideki Koike, Masataka Toyoura, Kenji Oka, Yoichi Sato, "3-D Interaction with Wall-Sized Display", IEEE Computer Society, 2008.

[8] Mahdi Mirzabaki, "A New Method for Depth Detection Using Interpolation Functions Using a Single Camera", International Archives of Photogrammetry, Remote Sensing and Spatial Information Sciences, Vol. 35, Part 3, pp. 724-726, 2004.

[9] Patrick Horain, Mayank Bomb, "3D Model Based Gesture Acquisition Using a Single Camera", Proceedings of the Sixth IEEE Workshop on Applications of Computer Vision, 2002.

[10] Z. Černeková, N. Nikolaidis, I. Pitas, "Single Camera Pointing Gesture Recognition Using Spatial Features and Support Vector Machines", EUSIPCO, Poznań, 2007.

Page 27: Interactive Screen