2013 IEEE Virtual Reality (VR), Lake Buena Vista, FL, 18-20 March 2013



A Middleware for Motion Capture Devices Applied to Virtual Rehab

Eduardo Filgueiras Damasceno*, Alexandre Cardosoα, Edgard Afonso Lamounier Jr.∆
Computer Graphics Lab

Federal University of Uberlandia, Minas Gerais, Brazil

ABSTRACT This demonstration allows visitors to try assorted motion capture technologies for use in virtual rehabilitation. We present the MixCap middleware, developed to blend different types of motion capture technology and merge their output into a single data format for biomechanical analysis. The middleware and its associated prototypes were developed for health professionals who own low-cost movement tracking equipment.

Keywords: Middleware, Motion Capture Devices, Virtual Reality.

1 INTRODUCTION Virtual Reality (VR) is increasingly used as a clinical resource to simulate situations in which patients present some difficulty, in both cognitive and motor performance. This is justified by an intrinsic VR characteristic: a simulated and controlled environment, which is an important feature for the analysis of human behavior. Recent motion capture technologies transpose the patient's movements to a virtual environment in order to perform tests (walking, jumping, balancing) that are difficult to assess with the naked eye.

The hardware needed to capture motion and perform a biomechanical analysis of the movement has a cost that healthcare professionals (physical therapists and physical trainers) and small clinics often still find difficult to afford. It is therefore very common to see nonconventional interaction devices, such as webcams and motion sensors (accelerometers and inertial sensors) commonly used in digital games, applied to a range of VR systems as tracking technology.

The main obstacle to be overcome in the development of motion capture applications for rehabilitation is the heterogeneity of motion capture devices, characterized by limitations in processing power, communication bandwidth, and proprietary libraries.

Because of these differing characteristics, a specific version of each application must be developed for each environment, according to the characteristics of each device. However, creating so many versions may not be feasible, given the number of platforms and devices available.

Therefore, a way of combining various types of tracking technology and devices to generate information for the biomechanical analysis of movement can be valuable for these stakeholders.

2 THE MIDDLEWARE A middleware architecture defines the functions that enable communication in a distributed system and the tools that improve the overall usability of an architecture made up of products from many different vendors on multiple platforms [1].

Motion capture can be accomplished by inertial, magnetic, optical, or acoustic devices; depending on the investment made in equipment, a multitude of technologies and devices is available. This work, however, focused on low-cost motion capture devices based on nonconventional interaction. Augmented Reality (AR) was also chosen because it is a cheap and efficient way to capture movement. In this demo, the user is able to test different ways of capturing movement: fiducial markers [2], the MS-Kinect™ sensing device, and the accelerometers of the Wii™ Remote motion capture system.

The framework covers these three main ways of tracking points in the real environment and converts them into a model of human skeleton points represented in the virtual environment. With this mesh of points, and given at least three points (or markers), one can calculate the range of motion of a joint or muscle. An abstraction of the framework can be viewed in Figure 1, where it can be seen that one or more devices can be attached to the framework.

Figure 1: MixCap Middleware Architecture.
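The three-marker range-of-motion computation mentioned above reduces to a joint-angle calculation between two limb segments. The function below is an illustrative reconstruction of that geometry, not MixCap's actual code:

```python
import math

def joint_angle(a, b, c):
    """Angle at joint b (in degrees) formed by markers a-b-c, each an (x, y, z) tuple."""
    ba = [a[i] - b[i] for i in range(3)]          # segment from joint to first marker
    bc = [c[i] - b[i] for i in range(3)]          # segment from joint to second marker
    dot = sum(ba[i] * bc[i] for i in range(3))
    norm = math.dist(a, b) * math.dist(c, b)
    # Clamp to [-1, 1] to guard against floating-point drift before acos.
    return math.degrees(math.acos(max(-1.0, min(1.0, dot / norm))))

# Example: shoulder, elbow, and wrist markers with the elbow at a right angle.
print(joint_angle((0, 1, 0), (0, 0, 0), (1, 0, 0)))
```

Tracking this angle frame by frame over an exercise gives the joint's range of motion.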

For each type of device attached to the framework, distance information is required. The data recording format is an XML file, which provides real-time interoperability between devices.
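The paper does not publish the MixCap XML schema, so the element and attribute names below (frame, joint, x/y/z) are purely illustrative assumptions. The sketch only shows how a device-independent frame could be written and read back with Python's standard library:

```python
import xml.etree.ElementTree as ET

# Hypothetical frame layout: one <frame> per capture sample, one <joint> per point.
frame = ET.Element("frame", device="ms-kinect", time="0.033")
for name, (x, y, z) in {"shoulder": (0.1, 1.4, 2.0), "elbow": (0.15, 1.1, 2.0)}.items():
    ET.SubElement(frame, "joint", name=name, x=str(x), y=str(y), z=str(z))
xml_text = ET.tostring(frame, encoding="unicode")
print(xml_text)

# Any consumer can parse it back without knowing which device produced it.
parsed = ET.fromstring(xml_text)
joints = {j.get("name"): (float(j.get("x")), float(j.get("y")), float(j.get("z")))
          for j in parsed.iter("joint")}
```

Because each device driver emits the same frame layout, the biomechanical analysis stage never needs device-specific code.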

Figure 2 shows the use of the framework to capture motion in augmented reality environments. The illustration on the right shows the use of the MS-Kinect sensor, and the illustration on the left shows motion capture with fiducial markers.

Application development can be accomplished with the standard libraries for both technologies (ARToolKit [5] and the MS-Kinect SDK [6]). The framework itself, however, is accessed by exchanging messages with the middleware, which returns the range of motion and the position of the body's center of balance.
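How the middleware derives the center of balance is not specified in the paper; one common approximation is a weighted mean of the skeleton's joint positions. The joint names and weights below are assumptions for illustration only:

```python
def center_of_balance(joints, weights=None):
    """Approximate body center as an (optionally mass-weighted) mean of joint positions.

    joints: dict mapping joint name -> (x, y, z) tuple.
    weights: optional dict of per-joint segment weights; uniform if omitted.
    """
    names = list(joints)
    if weights is None:
        weights = {n: 1.0 for n in names}
    total = sum(weights[n] for n in names)
    return tuple(sum(weights[n] * joints[n][i] for n in names) / total
                 for i in range(3))

# Toy skeleton: three points along a standing body.
skeleton = {"head": (0.0, 1.7, 2.0), "hip": (0.0, 1.0, 2.0), "foot": (0.0, 0.0, 2.1)}
print(center_of_balance(skeleton))
```

Comparing this point against the support base (the feet) over time is a simple balance indicator for the rehabilitation reports.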

* [email protected], [email protected], [email protected]


IEEE Virtual Reality 2013, 16-20 March, Orlando, FL, USA. 978-1-4673-4796-9/13/$31.00 ©2013 IEEE


a) MS-Kinect Sensor b) AR – Tracking

Figure 2: Two different movements captured by the framework.

In order to produce clinical information, graphs like that shown in Figure 3 are provided.

Figure 3: Exercise Rehab Report.

3 DISCUSSION As our examples show, the MixCap framework can be a useful addition to many different types of experiments. The primary requirement is to define the type of rehabilitation exercise and the muscular group that will be the focus of the biomechanical measurement. Whatever the technology used for motion capture, acquisition front-ends can be developed from the examples contained in the development packages provided by the manufacturers and technology developers; therefore, with a small programming effort, one can create applications that talk to each other through XML files or messages over sockets (TCP/IP).
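The socket exchange just described can be sketched as follows: a capture application sends one XML frame to the middleware over TCP and waits for a reply. The port choice, message layout, and the <ack/> response are illustrative assumptions, not the MixCap protocol:

```python
import socket
import threading

def middleware_server(sock):
    """Accept one connection, read one XML frame, and acknowledge it."""
    conn, _ = sock.accept()
    with conn:
        data = conn.recv(4096)
        if data.startswith(b"<frame"):
            conn.sendall(b"<ack/>")

# Middleware side: listen on an ephemeral loopback port for the demo.
server = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
server.bind(("127.0.0.1", 0))
server.listen(1)
port = server.getsockname()[1]
threading.Thread(target=middleware_server, args=(server,), daemon=True).start()

# Capture-application side: send a frame and read the acknowledgement.
client = socket.create_connection(("127.0.0.1", port))
client.sendall(b'<frame device="wii-remote"><joint name="wrist" x="0" y="0" z="0"/></frame>')
reply = client.recv(1024)
client.close()
print(reply)  # prints b'<ack/>'
```

In a real deployment the reply would carry the computed range of motion and center of balance rather than a bare acknowledgement.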

Experiments combining larger movements with speed tend to be more successful with the technologies based on inertial sensors (Wii™) and the Kinect™ motion sensor. Even so, the quality of the hardware (the webcam) clearly determines whether fiducial-marker tracking in augmented reality is the most appropriate technology for this trial.

On the other hand, when precision is required, devices based on fiducial markers are more adequate and thus better meet the requirements of clinical information.

4 OUR LAB

5 ACKNOWLEDGMENTS We would like to acknowledge the contributions of professors Augusto Veloso da Silveira and Adriano Andrade of the Federal University of Uberlândia, and of Veronica Teichrieb and the Voxar Labs team at the Federal University of Pernambuco, in the course of this research.

REFERENCES

[1] G. Coulson. What is reflective middleware? IEEE Distributed Systems Online, 2001.

[2] T. Bartindale and C. Harrison. Stacks on the surface: resolving physical order using fiducial markers with structured transparency. In Proceedings of the ACM International Conference on Interactive Tabletops and Surfaces (ITS '09), 2009.

[3] S. W. Bailey and B. Bodenheimer. A comparison of motion capture data recorded from a Vicon system and a Microsoft Kinect sensor. In Proceedings of the ACM Symposium on Applied Perception (SAP '12), 2012.

[4] M. Singhee, J. F. Holmes, J. K. Allen, and F. Mistree. Layout design of a Wii™ Remote motion capture system: formulation and solution. ASME Conf. Proc. 2010, 323 (2010). DOI:10.1115/DETC2010-28795.

[5] A. C. Sementille, L. E. Lourenço, J.R.F.B, and I R. A motion capture system using passive markers. In Proceedings of the 2004 ACM SIGGRAPH International Conference on Virtual Reality Continuum and its Applications in Industry (VRCAI '04), 2004.

[6] Microsoft Research. "Programming Guide: Getting Started with the Kinect for Windows SDK Beta". Available at: http://research.microsoft.com/en-us/um/redmond/projects/kinectsdk/docs/ProgrammingGuide_KinectSDK.docx.
