by: michael smith

Post on 30-Dec-2015



TRANSCRIPT

Sandstorm: A Dynamic Multi-contextual GPU-based Particle System that uses Vector Fields for Particle Propagation

By: Michael Smith

Acknowledgments

Overview

Introduction

Background

Idea

Software Engineering

Prototype

Results

Conclusions and Future Work

Introduction

The use of Virtual Reality (VR) to visualize scientific phenomena is quite common.

VR allows scientists to immerse themselves in the phenomena they are studying.

Introduction

Phenomena such as dust clouds or smoke need a particle system to visualize such fuzzy systems.

Vector fields can be used to 'guide' particles according to real scientific data.

This is not a new idea; see Vector Fields by Hilton and Egbert, c. 1994.

Introduction

VR applications and simulations require a multi-context environment.

A main context controls and updates multiple rendering contexts.

This multi-contextual environment can cause problems for particle systems.

Introduction

GPU offloading techniques have been proven to allow applications and simulations to offload work to the graphics hardware.

This can allow for acceleration of non-traditional graphics calculations.

GPU offloading can be used to accelerate particle calculations.

Introduction

Sandstorm

Dynamic

Multi-contextual

GPU-based

Particle System

Using Vector Fields for Particle Propagation

Background

Helicopter and Dust Simulation (Heli-Dust) is a scientific simulation in which the effect of a helicopter's downdraft on the surrounding desert terrain is studied.

It was written using the Dust Framework, a framework which allows the developer to set up a scene using an XML file.

Background

Early prototypes for Heli-Dust implemented a very simple particle system.

This particle system did not have a way to guide particles according to observed scientific data.

Background

Virtual Reality is a technology which allows a user to interact with a computer-simulated environment, be it a real or imagined one.

It immerses the user in an environment.

Background

A depth cue is an indicator from which a human can perceive information regarding depth.

Depth cues come in many shapes and sizes:

Monoscopic

Stereoscopic

Motion

Background

Monoscopic depth cue:

Information from a single eye or image.

Information can include:

Position

Size

Brightness

Background

Stereoscopic depth cue:

Information from two eyes.

This information is derived from the parallax between the different images received by each eye.

Parallax is the apparent displacement of objects viewed from different locations.

Background

Motion depth cue

Motion parallax

The changing relative position between the head and the object being observed.

Objects in the distance move less than objects closer to the viewer.

Background

Stereoscopic displays 'trick' the user's eyes into thinking there is depth where no depth exists.

They come in all shapes and sizes.

Background

Multiple Contexts

A main context controls multiple rendering contexts.

Because of these multiple contexts, a Virtual Reality application developer needs to make sure that all context-sensitive information and algorithms are multiple-context safe.

Background

There are many Virtual Reality toolkits and libraries.

Such toolkits and libraries handle things such as:

Generating Stereoscopic Images

Setting up the VR environment

And some handle distribution methods.

Background

Virtual Reality User Interface, or VRUI, is a virtual reality development toolkit.

It was developed by Oliver Kreylos at UC Davis.

VRUI's main mission is to shield the developer from the particular configuration of a VR system.

Background

VRUI tries to accomplish this mission through the abstraction of three main areas:

Display abstraction

Distribution abstraction

Input abstraction

Another feature of VRUI is its built-in menu system.

Background

FreeVR is developed and maintained by William Sherman.

It is an open-source virtual reality interface/integration library.

FreeVR was designed to work on a diverse range of input and output hardware.

FreeVR is currently designed to work on shared-memory systems.

Background

In 1983, William T. Reeves wrote 'Particle Systems – A Technique for Modeling a Class of Fuzzy Objects.'

This paper introduced the particle system, a modeling method that models an object as a cloud of primitive particles that define its volume.

Background

Reeves categorizes particle systems as "fuzzy" objects, in that they do not have smooth, well-defined, shiny surfaces.

Instead their surfaces are irregular, complex, and ill-defined.

This particle system was used to create the Genesis Effect for the movie Star Trek II: The Wrath of Khan.

Background

In his paper, Reeves described a particle system with five steps:

Particle Generation

Particle Attributes Assignment

Particle Dynamics

Particle Extinction

Particle Rendering
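The five phases above can be sketched as one per-frame loop. This is a minimal CPU illustration, not Sandstorm's GPU implementation; the emission rate, lifetime, and attribute values are made up for the example.

```python
import random

def step_particle_system(particles, dt, rng=random.Random(42)):
    """One frame of a Reeves-style particle system (CPU sketch).

    Each particle is a dict; all names and constants here are
    illustrative, not Sandstorm's actual data layout."""
    # 1. Particle Generation: decide how many particles this interval.
    new_count = int(50 * dt)
    for _ in range(new_count):
        # 2. Particle Attributes Assignment: position, velocity, lifetime.
        particles.append({
            "pos": [rng.uniform(-1.0, 1.0), 0.0, rng.uniform(-1.0, 1.0)],
            "vel": [0.0, 1.0, 0.0],
            "age": 0.0,
            "lifetime": 2.0,
        })
    survivors = []
    for p in particles:
        # 3. Particle Dynamics: integrate position from velocity.
        p["pos"] = [x + v * dt for x, v in zip(p["pos"], p["vel"])]
        p["age"] += dt
        # 4. Particle Extinction: drop particles past their lifetime.
        if p["age"] <= p["lifetime"]:
            survivors.append(p)
    # 5. Particle Rendering would draw `survivors` here.
    return survivors
```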

Background

Particle Generation

First the number of particles to be generated per time interval is calculated.

Then the particles are generated.

Background

Particle Attributes Assignment: whenever a particle is created, the particle system must determine values for the following attributes:

Initial position and velocity

Initial size, color and transparency

And initial shape and lifetime.

The initial position of the particles is determined by a generation shape.
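As an illustration, a spherical generation shape could assign initial positions by rejection sampling. The function name and the choice of a sphere are assumptions for the example, not Reeves' or Sandstorm's actual code.

```python
import random

def point_in_sphere(center, radius, rng=random.Random(0)):
    # Rejection-sample a uniform point inside a spherical
    # generation shape centered at `center`.
    while True:
        x, y, z = (rng.uniform(-1.0, 1.0) for _ in range(3))
        if x * x + y * y + z * z <= 1.0:
            return (center[0] + radius * x,
                    center[1] + radius * y,
                    center[2] + radius * z)
```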

Background

Particle Dynamics: once all the particles have been created and assigned initial attributes, the positions and/or velocities are updated.

Particle Extinction: once a particle has lived past its predetermined lifetime, measured in frames, the particle dies.

Background

Particle Rendering: once the position and appearance of the particles were determined, the particles are rendered.

Two assumptions were made:

Particles do not intersect with other surface-based objects.

Particles were considered point light sources.

Background

In recent years, graphics vendors have replaced areas of fixed functionality with areas of programmability.

Two such areas are the Vertex and Fragment Processors.

Background

The Vertex Processor is a programmable unit that operates on incoming vertex values.

Some duties of the vertex processor are:

Vertex transformation

Normal transformation and normalization

Texture coordinate generation and transformation.

Background

The Fragment Processor is a programmable unit that operates on incoming fragment values.

Some duties of the fragment processor are:

Operations on interpolated values.

Texture access.

Texture application.

Fog

Background

While a program, or shader, is running on one of these processors, the fixed functionality is disabled.

Several programming languages were created to aid in the development of shaders; one such language is the OpenGL Shading Language (GLSL).

Background

Vertex and Fragment shaders can't create vertices; they only work on data passed to them.

Geometry shaders can create any number of vertices.

This can allow shaders to create geometry without having to be told to by the CPU.

Background

Transform Feedback allows a shader to specify the output buffer.

The target output buffer can be the input buffer of another shader.

This allows developers to create multi-pass shaders that do not relay information back to the CPU for the other passes.

Background

ParticleGS is a Geometry Shader based particle system that does the following:

Stores particle information in Vertex Buffer Objects.

Uses a Geometry shader to create particles and store them as vertex information in VBOs.

Uses Transform Feedback to send particle data between shaders.

Uses a Geometry shader to create billboards and point sprites to render particles.

Background

In the days before shaders, the GPU was used just for rendering.

But with the advent of shaders, GPUs can now be used to aid scientific computation.

One can 'trick' the GPU into thinking that it is working on rendering information.

Background

Uber Flow is a system for real-time animation and rendering of large particle sets using GPU computation.

Million Particle System is a GPU-based particle system that can render a large set of particles.

Both particle systems do the following:

Store particle information in textures.

Use a series of vertex and fragment shaders to update the particle information.

Use the CPU to create and send rendering information.

And use a series of vertex and fragment shaders to render the information from the CPU.

Background

Idea

Sandstorm

Dynamic

Multi-contextual

GPU-based

Particle System

That uses Vector Fields for Particle Propagation.

Idea

Dynamic: Sandstorm should have the ability to change certain attributes on the fly:

Rate of emission

Size of particles

Lifetime of particles

Idea

Multi-contextual: as previously stated, 3D VR environments use multiple contexts.

Thus Sandstorm must be designed to handle these multiple contexts:

Random number generation

Between-screen consistency

Idea

GPU-based: Sandstorm will be designed to leverage today's most powerful and advanced GPUs.

Use Geometry shaders to create, update, and render particles.

Use Transform Feedback to direct data between shaders.

Idea

Vector Fields: in order to 'guide' particles according to observed scientific data, vector fields will be used in Sandstorm.

But Sandstorm should not be a vector field simulator; it only takes vector fields as input.

Software Engineering

Prototype

GPU-Based Particle System: like most particle systems, Sandstorm has three main phases:

Creation and Destruction

Update

Rendering

Prototype

Creating and Destroying Particles: traditional particle systems would create a particle and store it in a dynamically growing data structure.

But GPU-based particle systems store the particles in a texture.

Prototype

Textures need to describe a rectangular area that encompasses the entire area of the data.

For example, if we had 19 members, the texture would need to cover an area of 20.

Not so with VBOs: VBOs can fit the exact amount of data that is to be used.

Prototype

Like Million Particle System and Uber Flow, Sandstorm stores its particle information in a double-buffered approach.

Each frame, one of the buffers is used as the read buffer and the other as the write buffer.

Each buffer holds two VBOs: one for the particle positions, the other for the velocities.
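The double-buffered layout can be sketched with plain Python lists standing in for the VBOs. Names like `DoubleBuffer` and `swap` are illustrative, not Sandstorm's API.

```python
class DoubleBuffer:
    """Ping-pong particle storage: two buffers, each holding a position
    list and a velocity list (stand-ins for the two VBOs per buffer)."""

    def __init__(self, capacity):
        self.buffers = [
            {"pos": [None] * capacity, "vel": [None] * capacity}
            for _ in range(2)
        ]
        self._read = 0

    @property
    def read(self):
        # The buffer shaders read particle data from this frame.
        return self.buffers[self._read]

    @property
    def write(self):
        # The buffer updated results are written into this frame.
        return self.buffers[1 - self._read]

    def swap(self):
        # After a frame, last frame's write buffer becomes the read buffer.
        self._read = 1 - self._read
```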

Prototype

Geometry shaders can emit one or more vertices.

At the beginning of the Creation/Destruction phase, the read buffer is passed to the shader.

The shader then determines if it is dealing with an emitter.

Prototype

If the shader is dealing with an emitter, the following happens:

How many particles are to be generated is determined.

The initial information for the particles is determined.

Prototype

Determining the amount of particles to be emitted:

Let a be how many particles have already been emitted this cycle.

Subtract a from the amount of particles that are to be emitted per second, p.

Divide the remainder by the amount of time left in that cycle; each cycle is a second.
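The arithmetic above can be written out as a small function. This is one plausible reading of the slide with illustrative names; the frame time step `dt`, used to turn the per-second rate into a per-frame count, is an assumption not stated on the slide.

```python
def particles_to_emit(already_emitted, per_second, time_left_in_cycle, dt):
    """How many particles to emit this frame.

    already_emitted = a, particles emitted so far this one-second cycle;
    per_second = p, particles to emit per second. The remainder p - a is
    spread evenly over the time left in the cycle."""
    remaining = per_second - already_emitted
    if remaining <= 0 or time_left_in_cycle <= 0.0:
        return 0
    rate = remaining / time_left_in_cycle
    return int(rate * dt)
```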

Prototype

Determining the initial information of particles:

A random-information texture is used.

The texture is translated, scaled, and rotated by a random amount, then sent to the shader.

Emitters have random numbers in their velocity information that are used to do texture lookups.

The texture is used to make sure the particles are consistent between contexts.

Prototype

If the Creation/Destruction shader is passed a particle, something different happens:

First, the particle is determined to be alive or dead.

If alive, the particle is re-emitted into the buffer.

If dead, a blank particle is emitted into the buffer.

Prototype

Updating Particles: once new particles are created and old ones destroyed, the particles are updated:

The delta time between frames is passed to the update shader.

A lookup in the vector field (a 3D texture) is done according to the particle's position.

The vector field velocity is added to the particle's velocity, and then the particle's position is updated.
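The update step amounts to a forward-Euler integration. A minimal sketch, assuming the field velocity is scaled by the frame's delta time before being added (the slide does not say how the sum is scaled):

```python
def update_particle(pos, vel, field_vel, dt):
    # Add the vector-field velocity to the particle's velocity,
    # then advance the position by the new velocity (explicit Euler).
    vel = [v + f * dt for v, f in zip(vel, field_vel)]
    pos = [p + v * dt for p, v in zip(pos, vel)]
    return pos, vel
```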

Prototype

Once the particles have been updated, they are rendered.

The particle positions are passed to the render shader using Transform Feedback.

Particles are rendered as either:

Textured, deferred-shaded billboards

Or points.

Prototype

The particle position represents the center of the particle.

So four points have to be determined to create a billboard.

Information that is already known: the center of the particle and the vector pointing to the eye.

Prototype

Once the vectors are found, they can be added to the particle's position to get the four points.
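Finding the four billboard points can be sketched with two cross products: a right vector perpendicular to the eye vector and a world up vector, then an up vector perpendicular to both. The `world_up` choice and `half_size` parameter are assumptions for the illustration:

```python
def cross(a, b):
    return (a[1] * b[2] - a[2] * b[1],
            a[2] * b[0] - a[0] * b[2],
            a[0] * b[1] - a[1] * b[0])

def normalize(v):
    n = (v[0] ** 2 + v[1] ** 2 + v[2] ** 2) ** 0.5
    return (v[0] / n, v[1] / n, v[2] / n)

def billboard_corners(center, to_eye, half_size, world_up=(0.0, 1.0, 0.0)):
    # Build right/up vectors perpendicular to the eye vector, then
    # offset the particle center by +-half_size along each to get
    # the four corners of a camera-facing quad.
    right = normalize(cross(world_up, to_eye))
    up = normalize(cross(to_eye, right))
    c = center
    return [
        tuple(c[i] - half_size * right[i] - half_size * up[i] for i in range(3)),
        tuple(c[i] + half_size * right[i] - half_size * up[i] for i in range(3)),
        tuple(c[i] + half_size * right[i] + half_size * up[i] for i in range(3)),
        tuple(c[i] - half_size * right[i] + half_size * up[i] for i in range(3)),
    ]
```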

Prototype

Once the points are found, they can be emitted to create the billboard.

Once the billboard has been emitted, a texture is applied.

Once the result of the billboard shader is determined, a deferred shading method can be applied.

Prototype

Currently Sandstorm uses a deferred shading method to blend the particles together.

The first step is to accumulate the particles per pixel, so that the more particles that are behind a particular pixel, the denser it looks.

Once that is done, the result can be rendered to a full-screen quad with the result textured onto it.
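The per-pixel accumulation can be sketched on the CPU. The counting pass and the saturating count-to-opacity mapping are illustrative assumptions; the slide only says that deeper stacks of particles should look denser.

```python
def accumulate(particle_pixels, width, height):
    # Count how many particles land on each pixel;
    # deeper stacks read as denser dust.
    counts = [[0] * width for _ in range(height)]
    for x, y in particle_pixels:
        if 0 <= x < width and 0 <= y < height:
            counts[y][x] += 1
    return counts

def density_to_alpha(count, per_particle_alpha=0.1):
    # Saturating blend: opacity approaches 1.0 as particles pile up.
    return 1.0 - (1.0 - per_particle_alpha) ** count
```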

Prototype

There is a catch, though.

Using this method requires the user of Sandstorm to:

Render the scene into an off-screen buffer, with a depth buffer attached.

Give the deferred shader the depth buffer, so that it can blend with the scene, obscuring any solid object and also being obscured by solid objects.

Prototype

Vector Fields in Sandstorm are represented as 3D textures.

A texture was used instead of a VBO because of existing internal methods for dealing with 3D textures, such as wrapping, indexing, and interpolating.

Prototype

When a lookup is done on the vector field to get information for a particle, the following happens:

The position of the particle is interpolated, dividing its components by the width, height, and depth of the vector field.
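A minimal sketch of the lookup, using nearest-cell sampling for brevity (a real GL 3D texture lookup would interpolate trilinearly between the eight neighboring cells); all names here are illustrative:

```python
def field_lookup(field, extent, pos):
    """Nearest-cell lookup in a 3D vector field.

    field[i][j][k] -> velocity tuple; extent = world-space size of
    the field along x, y, z."""
    w, h, d = len(field), len(field[0]), len(field[0][0])
    # Normalize the particle position to [0, 1) texture coordinates.
    u = pos[0] / extent[0]
    v = pos[1] / extent[1]
    s = pos[2] / extent[2]
    # Clamp to the last cell so positions on the far edge stay in range.
    i = min(int(u * w), w - 1)
    j = min(int(v * h), h - 1)
    k = min(int(s * d), d - 1)
    return field[i][j][k]
```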

Prototype

As previously stated, when dealing with a multi-contextual environment, one must be careful to make context-sensitive data and algorithms conform to the multi-contextual environment.

This is handled in Sandstorm by having a controlling class, stored on the main context of the VR environment, control multiple update/render classes.

Prototype

Dynamic

Sandstorm has the ability to change some of its attributes, both at run-time and at compile-time.

Results

A sample application was created to test out Sandstorm.

The Heli-Dust application was used as a test bed.

A basic vector field was used in the sample application.

Results

Considering that Sandstorm is not a vector field simulator, a simple helicopter interaction model was made. As the helicopter throttle increased:

The rate of emission was increased.

The lifetime of the particles was increased.

And the maximum number of particles was increased.

Results

The sample application was run on the following system, which powered a four-sided CAVE environment:

A multi-core shared-memory machine with four quad-core chips

48 GB of RAM

An Nvidia Quadroplex

Running Ubuntu 7.10 Linux

Results

Rendered 300,000 deferred-shaded particles at:

15-20 FPS while standing in the particle system

~65 FPS while standing a good distance back from the particle system

Results

Show movies.

Conclusion and Future Work

Vector fields can be used to 'guide' particles.

Sandstorm can run in a multi-contextual environment.

Sandstorm utilizes the latest GPU off-loading techniques.

Sandstorm can render more than 100,000 particles at above 15 FPS.

Conclusion and Future Work

Optimizations:

Currently both emitters and particles reside in the same buffers; separating them could limit branching in shaders.

Currently buffer sizes are static; allowing them to grow and shrink could increase the speed of updating and rendering.

Conclusion and Future Work

Other improvements:

Collisions between the particles and objects in the scene.

Soft Particles, Motion Blur, and Light Scattering could be used to give the particles more realism.

A shader-based physics model could be implemented to allow users to change the behavior of the particles.

Conclusion and Future Work

Other work:

A vector field simulator could be created to feed Sandstorm dynamically changing vector fields, so that particle motion acts more naturally.

A vector field creator/editor could be created to help scientists visualize vector fields before they are used in Sandstorm.

Questions/Comments?
