
School of Computer Science and Software Engineering

Large Object Segmentation and Region Priority Rendering

Monash University

Yang-Wai Chow, Dr Ronald Pose, Dr Matthew Regan

Overview

• Background: Address Recalculation Pipeline, Priority Rendering

• Description of the challenges/problems: Computation of object validity periods, Large object segmentation, Tearing

• Solutions to the problems: Region Priority Rendering, Region Warping

• Experimental Screenshots

• Future Work

• Background: Address Recalculation Pipeline, Priority Rendering

• Latency is a major factor that plagues the design of immersive Head Mounted Display (HMD) virtual reality systems

• End-to-end latency is defined as the time between a user’s actions and when those actions are reflected by the display

The Address Recalculation Pipeline (ARP) was designed to reduce the end-to-end latency due to user head rotations for immersive Head Mounted Display (HMD) virtual reality systems

[Diagram: end-to-end latency is the delay between user actions and the moment those actions are reflected by the display]

The latency problem

Lengthy delays in immersive Head Mounted Display (HMD) virtual reality systems can have adverse effects on the user

• Latency can completely destroy the illusion of reality that the virtual reality system attempts to present to the user

Head Mounted Display (HMD)

Conventional virtual reality display systems attempt to shorten the end-to-end latency by reducing scene complexity and/or by using faster rendering engines

• Even with the fast graphics accelerators available today that can render over 100 frames per second (fps), the end-to-end latency remains a factor to be contended with

• The update cycle is still bound by the need to obtain up-to-date head orientation information (where the user is looking) before any form of rendering can commence

[Diagram: normal sequence of events: Head Tracking → Image Creation → Buffer Swap → Image Valid. Conventional systems attempt to shorten the Image Creation and Buffer Swap portion of this sequence.]

On conventional graphics systems, the rendering process is bound by the need to obtain up-to-date head orientation information prior to rendering

[Diagram: pipeline stages: database traversal, geometric transform, face classification, lighting, clipping, viewport mapping, scan conversion, pixel addressing, image composition, display buffer, display image. Head orientation must be obtained before rendering begins.]

Conventional virtual reality system

The ARP is fundamentally different from conventional systems in that it implements delayed viewport mapping, a concept whereby viewport mapping is performed after rendering

[Diagram: pipeline stages: database traversal, geometric transform, face classification, lighting, clipping, viewport mapping, scan conversion, image composition, display buffer, then anti-aliasing, pixel addressing, wide angle correction and pixel location on the display side. Head orientation is applied post-rendering, on the display side.]

The Address Recalculation Pipeline (ARP) virtual reality system

The ARP effectively decouples viewport orientation mapping from the rendering process, and in this manner removes the usually lengthy rendering time and buffer swapping delays from the latency

• In separating the viewport orientation mapping from the rendering process, latency is now bound to the HMD unit’s update rate and the time required to fetch pixels from display memory

• The system is much less dependent on the rendering frame rate and is therefore fairly independent of scene complexity
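To make the idea of delayed viewport mapping concrete, here is a minimal software sketch (in Python, not the ARP hardware): for each display pixel, the pixel's view ray is rotated by the most recent head orientation and used to fetch a texel from the pre-rendered surface, so orientation is applied at display time rather than at render time. The function names, the face-naming convention and the array layout are assumptions for illustration only.

```python
import numpy as np

def face_and_texel(d):
    """Map a unit view direction to (cube face, u, v) with u, v in [0, 1].

    The face is picked by the dominant axis; the two remaining components,
    divided by the dominant magnitude, give the in-face coordinates.
    (A simplified convention for illustration, not the ARP's own.)
    """
    x, y, z = d
    ax, ay, az = abs(x), abs(y), abs(z)
    if ax >= ay and ax >= az:
        face, s, t = ("+X" if x > 0 else "-X"), y / ax, z / ax
    elif ay >= az:
        face, s, t = ("+Y" if y > 0 else "-Y"), x / ay, z / ay
    else:
        face, s, t = ("+Z" if z > 0 else "-Z"), x / az, y / az
    return face, 0.5 * (s + 1.0), 0.5 * (t + 1.0)

def display_pixel(head_rotation, pixel_ray, cube_faces):
    """Delayed viewport mapping for a single display pixel.

    head_rotation : 3x3 rotation matrix from the latest tracker sample
    pixel_ray     : unit ray of this pixel in the HMD's own frame (numpy array)
    cube_faces    : dict mapping face name ("+X", "-X", ...) to an HxWx3 image
    """
    world_ray = head_rotation @ np.asarray(pixel_ray, dtype=float)
    face, u, v = face_and_texel(world_ray / np.linalg.norm(world_ray))
    img = cube_faces[face]
    h, w = img.shape[:2]
    return img[min(int(v * h), h - 1), min(int(u * w), w - 1)]
```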

[Diagram: average latency to head rotations. Without the pipeline: Head Tracking → Image Creation → Buffer Swap → Image Valid. With the pipeline: Head Tracking → Image Valid.]
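As a back-of-the-envelope illustration of the difference (the stage timings below are purely hypothetical placeholders, not measurements from the ARP system): the conventional path pays for image creation and the buffer swap between head tracking and a valid image, while the ARP path pays essentially for the display update alone.

```python
# Hypothetical stage timings in milliseconds (illustrative placeholders only).
head_tracking = 1.0     # reading the head tracker
image_creation = 25.0   # rendering a complex scene
buffer_swap = 8.0       # waiting for the next refresh before the swap
display_update = 16.7   # fetching and scanning out pixels on a 60 Hz HMD

conventional = head_tracking + image_creation + buffer_swap + display_update
with_arp = head_tracking + display_update   # rendering is off the critical path

print(f"conventional: ~{conventional:.1f} ms   ARP: ~{with_arp:.1f} ms")
```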

In order to implement delayed viewport mapping, the ARP requires the scene that encapsulates the user’s head to be pre-rendered onto display memory

• The surface of a cube was chosen to be the rendering surface surrounding the user’s head, mainly because of its rendering simplicity

• The rendering surface of a cube contains six standard viewport mappings, each orthogonal to the others

• There are standard algorithms for cube surface rendering

• The use of such rendering can be found in a computer graphics technique known as cube environment mapping

[Diagram: the six faces of the cube rendering surface: Top, Bottom, Front, Back, Left, Right]
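The six mappings can be set up as six look-at views with a 90° square frustum, one per face. The sketch below uses its own axis and up-vector conventions; only the idea of six orthogonal viewports comes from the slides.

```python
import numpy as np

# Six orthogonal viewing directions and compatible up vectors for the cube faces.
CUBE_FACES = {
    "Front":  ((0, 0, -1), (0, 1, 0)),
    "Back":   ((0, 0,  1), (0, 1, 0)),
    "Left":   ((-1, 0, 0), (0, 1, 0)),
    "Right":  ((1, 0,  0), (0, 1, 0)),
    "Top":    ((0, 1,  0), (0, 0, 1)),
    "Bottom": ((0, -1, 0), (0, 0, -1)),
}

def face_view_matrix(eye, face):
    """Rotation-plus-translation view matrix for one cube face (90 degree FOV)."""
    forward, up = (np.array(v, dtype=float) for v in CUBE_FACES[face])
    right = np.cross(forward, up)
    rot = np.stack([right, np.cross(right, forward), -forward])   # camera basis rows
    view = np.eye(4)
    view[:3, :3] = rot
    view[:3, 3] = -rot @ np.asarray(eye, dtype=float)
    return view

# Each face is rendered once with a 90 degree square frustum, so the six images
# tile the cube surface around the user's head with no gaps or overlap.
views = {name: face_view_matrix((0.0, 1.7, 0.0), name) for name in CUBE_FACES}
```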

A rendering method known as Priority Rendering was developed to be used in conjunction with the ARP system, for the purpose of reducing the overall rendering load

• Priority Rendering is based on the concept of Image Composition

• Different sections of the scene can be rendered onto separate display memories before being combined to form an image of the whole scene

Image Composition
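One common way to realise image composition, sketched here in software rather than in the ARP's composition hardware: each display memory keeps a colour and a depth per pixel, and the composed image takes, at every pixel, the colour from whichever memory holds the nearest fragment. The array layout is an assumption for illustration.

```python
import numpy as np

def compose(colors, depths):
    """Per-pixel image composition across several display memories.

    colors : list of HxWx3 arrays, one per display memory
    depths : list of HxW depth arrays aligned with `colors`
    Returns the composed HxWx3 image, taking each pixel from the memory
    whose fragment is nearest to the viewer.
    """
    colors = np.stack(colors)              # M x H x W x 3
    depths = np.stack(depths)              # M x H x W
    nearest = np.argmin(depths, axis=0)    # which memory wins at each pixel
    return np.take_along_axis(colors, nearest[None, ..., None], axis=0)[0]
```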

Priority Rendering allows different sections of the scene that surrounds the user’s head to be rendered onto separate display memories, which can therefore be updated at different update rates

• In the ARP system, most objects in the scene will remain valid upon user head rotations

• Perspective ‘foreshortening’: objects closer to the display will appear larger than distant objects

• Also, upon user translations, objects closer to the display will appear to move by larger amounts than distant objects

Priority Rendering

• Description of the challenges/problems: Computation of object validity periods, Large object segmentation, Tearing

Initial implementation of Priority Rendering required the computation of validity periods for each individual object

• Object validity periods were estimates as to how long an object would remain valid in display memory with respect to the user’s translational speed

• These estimates consisted of three components: Translational validity period (trans)

Size validity period (size)

Animation validity period (anim)

Object validity periods

Overall validity period = Min (trans, size, anim)

Objects were sorted according to the computed overall validity periods and then individually assigned to be rendered onto display memories with particular update rates
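A minimal sketch of that assignment step, assuming the four update rates quoted later in the slides; the rule "pick the slowest display memory that still refreshes the object within its validity period" and the sample objects are illustrative assumptions, not the system's exact policy.

```python
# Display memory update rates used later in the experiments (fps).
UPDATE_RATES = [60.0, 39.25, 28.21, 16.15]

def overall_validity(trans, size, anim):
    """Overall validity period is the most restrictive of the three estimates."""
    return min(trans, size, anim)

def assign_display_memory(validity_seconds):
    """Pick the slowest display memory that still refreshes the object in time.

    An object that stays valid for `validity_seconds` only needs an update
    every `validity_seconds`, i.e. an update rate of 1 / validity_seconds.
    """
    required_fps = 1.0 / validity_seconds
    for memory, fps in sorted(enumerate(UPDATE_RATES), key=lambda m: m[1]):
        if fps >= required_fps:
            return memory
    return 0   # fall back to the fastest display memory

# Hypothetical objects: name -> (trans, size, anim) validity periods in seconds.
objects = {"statue": (0.9, 0.30, 10.0), "far_wall": (5.0, 4.0, 60.0)}
for name, periods in sorted(objects.items(),
                            key=lambda kv: overall_validity(*kv[1])):
    print(name, "-> display memory", assign_display_memory(overall_validity(*periods)))
```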

Size and translational validity periods

[Diagram: the user’s eye, the distance to the object’s bounding sphere, the object’s radius, the threshold angle θt, and the maximum translational distance]

Translational validity period:
trans = distance × sqrt(2 × (1 − cos θt)) / relative speed

Size validity period:
size = (distance − radius / sin(θt + asin(radius / distance))) / relative speed
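Transcribed directly into code (the formulas are the two above; the sample numbers are illustrative only):

```python
from math import sqrt, cos, sin, asin

def translational_validity(distance, relative_speed, theta_t):
    """Time before user translation moves the object by the threshold angle theta_t."""
    return distance * sqrt(2.0 * (1.0 - cos(theta_t))) / relative_speed

def size_validity(distance, radius, relative_speed, theta_t):
    """Time before the object's apparent size changes by the threshold angle theta_t."""
    return (distance - radius / sin(theta_t + asin(radius / distance))) / relative_speed

# Illustrative numbers only: a 1 m radius object 20 m away, user moving at 2 m/s,
# with a threshold angle of 1 degree.
theta_t = 0.0174533                                   # 1 degree in radians
print(translational_validity(20.0, 2.0, theta_t))     # ~0.17 s
print(size_validity(20.0, 1.0, 2.0, theta_t))         # ~2.6 s
```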

It is conceivable that the use of large object segmentation in conjunction with Priority Rendering could potentially further reduce the overall rendering load

• Fractal terrain example – A fractal terrain typically consists of thousands of polygons. If the terrain were to be segmented for priority rendering, different sections of the fractal terrain could be updated at different update rates

Large object segmentation

The implementation of object segmentation with Priority Rendering gives rise to a potential scene tearing problem

• Tearing can potentially occur when different sections of the same object are rendered at different update rates, whilst the user is translating through the scene

The tearing problem

Scene tearing artefacts will completely destroy the illusion of reality, and therefore have to be addressed before object segmentation can be used effectively

• Fractal terrain tearing example

• Solutions to the problems: Region Priority Rendering, Region Warping

Region Priority Rendering was devised to implicitly sort objects and also to provide a criterion for object segmentation

Region Priority Rendering

• From observations of previous priority rendering results, it was concluded that object validity periods followed the spatial and temporal locality principles

• This methodology involved dividing the virtual world into equal-sized clusters, or regions

By dividing the virtual world into square regions, objects could be assigned to the different display memories with the different update rates without having to calculate individual object validity periods

• Objects in the regions were assigned to the display memories with the different update rates based on spatial locality

• Computation of object validity periods was no longer necessary

• This also provided a criterion for large object segmentation in that large objects could be segmented along region boundaries

• In this way tearing would be predictable and the size of the tearing could also be computed

Region Priority Rendering
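A minimal sketch of region assignment under assumed parameters: the slides specify equal-sized square regions and assignment by spatial locality, but the region size, the Chebyshev-distance rule and the sample points below are illustrative choices, not the system's exact parameters.

```python
import math

REGION_SIZE = 10.0                            # metres per square region (assumed)
UPDATE_RATES = [60.0, 39.25, 28.21, 16.15]    # display memory update rates (fps)

def region_of(x, z):
    """World position (x, z) -> integer region coordinates on a square grid."""
    return (math.floor(x / REGION_SIZE), math.floor(z / REGION_SIZE))

def display_memory_for(region, user_region):
    """Spatial locality: regions nearer the user go to faster display memories."""
    d = max(abs(region[0] - user_region[0]), abs(region[1] - user_region[1]))
    return min(d, len(UPDATE_RATES) - 1)

# Large objects are segmented along region boundaries, so each piece simply
# inherits the display memory of the region that contains it.
user = region_of(3.0, 4.0)
for point in [(8.0, 2.0), (12.0, 2.0), (31.0, 2.0)]:
    r = region_of(*point)
    m = display_memory_for(r, user)
    print(point, "-> region", r, "-> display memory", m, f"({UPDATE_RATES[m]} fps)")
```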

Region Warping was designed to hide the scene tearing artefacts resulting from object segmentation with Priority Rendering

Region Warping

• Region Warping essentially involves the perturbation of object vertices in order to hide the tearing artefacts
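The slides do not spell out the warping function, so the sketch below is only one plausible reading of "perturbation of object vertices": a vertex on a boundary shared with a more slowly updated region is shifted by the full mismatch observed at that boundary, and the shift fades out across the region, turning an abrupt tear into a gradual distortion. All names and parameters are illustrative assumptions.

```python
import numpy as np

def warp_vertex(vertex, boundary_x, region_size, boundary_mismatch):
    """Perturb one vertex to hide a mismatch along a region boundary.

    vertex            : (x, y, z) as freshly rendered for this region
    boundary_x        : x coordinate of the boundary shared with a slower region
    region_size       : width of a square region
    boundary_mismatch : (dx, dy, dz) by which the stale neighbour's geometry
                        currently disagrees with this region at the boundary

    A vertex on the boundary is shifted by the full mismatch so the two sides
    meet; the shift fades to zero across the region, spreading the error as a
    mild distortion instead of a visible tear.
    """
    t = np.clip(abs(vertex[0] - boundary_x) / region_size, 0.0, 1.0)
    return np.asarray(vertex, dtype=float) + (1.0 - t) * np.asarray(boundary_mismatch, dtype=float)
```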

• Screenshots from the experiments

Scene used for the experiments

Gallery scene

In this screenshot, individual objects can clearly be seen, as each object was rendered in a single distinct color

Virtual world objects

The virtual world was divided into regions

Virtual world regions

The scene rendered in wireframe, showing the segmentation of the objects along the region boundaries

Object segmentation

Sections of the scene rendered onto separate display memories at different update rates

Different display memories

Display Memory 0 (60fps)

Display Memory 1 (39.25fps)

Display Memory 2 (28.21fps)

Display Memory 3 (16.15fps)


Overall rendering load estimates

The implementation of object segmentation reduces the overall rendering load

Display Memory                          0       1       2       3
FPS (frames per second)                 60      39.25   28.21   16.15
Object segments, with segmentation      87      44      20      62
Object segments, without segmentation   160     8       3       42

Average rendering load per second = ∑ (number of object segments × fps)

With object segmentation:
= 87 × 60 + 44 × 39.25 + 20 × 28.21 + 62 × 16.15 = 8512.5

Without object segmentation:
= 160 × 60 + 8 × 39.25 + 3 × 28.21 + 42 × 16.15 = 10676.93
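The same estimate, reproduced as a quick arithmetic check:

```python
fps         = [60, 39.25, 28.21, 16.15]      # display memory update rates
with_seg    = [87, 44, 20, 62]               # object segments, with segmentation
without_seg = [160, 8, 3, 42]                # object segments, without segmentation

load = lambda segments: sum(n * f for n, f in zip(segments, fps))
print("with segmentation:   ", load(with_seg))       # ~8512.5
print("without segmentation:", load(without_seg))    # ~10676.93
```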

An example of a single frame showing the scene tearing effect

Scene tearing

The exact same frame, this time with Region Warping

Region Warping results

An error image showing the difference between the frame with the tearing and the frame with region warping

Difference image

Where to from here…

• Human visual perception experiments

• Relationships between level of distortion, region sizes, speed of user translations, etc.

• Computational and rendering load

• Dynamic shadow generation and shaders

Future work

Questions or Suggestions?