
Evaluation of Spherical Harmonic Lighting and Ambient Occlusion as Shadowing Techniques for Image Synthesis

Department of Mathematics, Natural Sciences and Computer Science
Degree Course in Mathematics

Thesis

submitted by

Markus Kranzler
born in Rheinbach

developed and written at

Trixter Film GmbH

Examiner: Prof. Dr. Cornelius Malerczyk
Co-examiner: Dipl.-Math. (FH) Sabine Langkamm
External supervisor: Curtis Edwards

Technische Hochschule Mittelhessen University of Applied Sciences
Friedberg, 2011


To the people in my heart


Acknowledgements

I would like to express my gratitude to all the people without whom this thesis would not have been possible. My special thanks go to my family: to my parents, who have always supported me and believed in me throughout my entire life, and to my brother and sister-in-law, for challenging me and keeping me motivated. I want to thank my examiners, Dr. Cornelius Malerczyk and Dipl.-Math. Sabine Langkamm, for their guidance, their support and their great patience with me. Furthermore, I want to thank my external supervisor and friend, Curtis Edwards, for asking the right questions and showing me how to see the big picture.

Naturally, a very big "Thank You" goes to all my colleagues and friends at Trixter Film GmbH for their support and willingness to share their knowledge on any related or unrelated topics. In particular, I want to thank Stefan Braun for his technical genius (which helped to save this work from failure) and for editing this paper; Michael Hipp, as my go-to guy for any kind of difficulty; and Georg Wieland, who very kindly looked after all my needs and made sure I lacked for nothing. I’m also thankful to Ole Gulbrandsen and Daniel Pielok, who helped me out with programming.

I also want to thank David Wallace for his friendship and for his tenacity in proofreading my manuscript and turning it into proper English. I would also like to mention the technical assistance I have received from the members of Pixar’s RenderMan Team, including Peter Moxom, who supported me with educational RenderMan licenses and redirected my questions to the right people, and Chris Ford, for letting me test RenderMan within the Windows Azure cloud computing environment and allowing me to include the results. In this context, I also want to thank Dave Fellows, who helped me to get the right render outputs from the Cloud.

Last but not least, I want to express my deepest gratitude to my girlfriend, Lisa Theile, for giving me more support and care than I could have wished for and for not giving up hope.



Statement of authorship

Ich erkläre, dass ich die eingereichte Diplomarbeit selbstständig und ohne fremde Hilfe verfasst, andere als die von mir angegebenen Quellen und Hilfsmittel nicht benutzt und die den benutzten Werken wörtlich oder inhaltlich entnommenen Stellen als solche kenntlich gemacht habe.

I hereby certify that this diploma thesis has been composed by myself, and describes my own work, unless otherwise acknowledged in the text. All references and verbatim extracts have been quoted, and all sources of information have been specifically acknowledged.

Friedberg, April 2011

Markus Kranzler



Contents

Acknowledgements

Statement of authorship

Contents

List of Figures

List of Tables

Abstract

1 Introduction
  1.1 Motivation
  1.2 Objective
  1.3 Outline

2 Background and Related Work
  2.1 Lighting
    2.1.1 The Rendering Equation
    2.1.2 Direct Shadows
      2.1.2.1 Depth Map Shadows
      2.1.2.2 Raytraced Shadows
    2.1.3 Indirect Shadows
      2.1.3.1 Ambient Occlusion
      2.1.3.2 Spherical Harmonic Lighting
  2.2 RenderMan
    2.2.1 The RenderMan Interface
    2.2.2 The RenderMan Shading Language
  2.3 Data Compression
  2.4 Prior Work

3 Spherical Harmonics
  3.1 Definition
  3.2 Properties

4 Implementation
  4.1 The C++ Tool
    4.1.1 RenderMan Point Clouds
    4.1.2 Raytracing Introduction
      4.1.2.1 Sampling
    4.1.3 Spherical Harmonics
  4.2 RenderMan Shader
    4.2.1 RenderMan Brick Maps
    4.2.2 Introduction To Shaders
    4.2.3 The Code
      4.2.3.1 The Bake Shader
      4.2.3.2 The Relight Shader
      4.2.3.3 The Light Shader

5 Pipeline Integration

6 Results & Analysis
  6.1 Efficiency
  6.2 Quality

7 Conclusions & Future Work

A Spherical Harmonic Representation

B Real Spherical Harmonics

C RenderMan Shaders
  C.1 shBake.sl
  C.2 shRelight.sl
  C.3 shLight.sl
  C.4 shadeop_sh.cpp

D Baked Spherical Harmonic Files
  D.1 Coefficients Point Cloud
  D.2 Relighting Point Cloud & Brick Map

E Pipeline Scripts
  E.1 my_nodetemplate.rman

F Compared Renderings

Glossary

Bibliography


List of Figures

1.1 Examples of the importance of lighting and shadowing
1.2 The influence of shadows on visual perception
1.3 Concept of indirect shadow calculation
1.4 Concept of directional occlusion
1.5 Concept of bent normal
1.6 Bent Normal Error

2.1 Lambert’s cosine law
2.2 Lambert BRDF
2.3 Direct Illumination
2.4 Indirect Illumination
2.5 The Rendering Equation
2.6 Shadows from off-screen objects
2.7 No shadows vs. direct shadows
2.8 Shadows caused by dimensionless lights
2.9 Shadowing caused by area lights
2.10 The shadow mapping depth comparison
2.11 Depth Map Shadow
2.12 Raytracing shadows
2.13 Soft shadows: Raytraced vs. Depth Map
2.14 Accessibility shading
2.15 Ambient occlusion approximation with area lights
2.16 How ambient occlusion works
2.17 Visibility function
2.18 Interreflected diffuse transfer
2.19 Interreflected diffuse transfer calculation
2.20 The Reyes render pipeline
2.21 Secondary Channel in RenderMan
2.22 The regular RIB file structure
2.23 Rendering of sphere.rib

3.1 The first six associated Legendre polynomials

4.1 A point cloud
4.2 Baking a point cloud from a different camera
4.3 Sampling with uniform grid and pure random distribution
4.4 Jittered Sampling
4.5 Jittered Sampling
4.6 A brick map

6.1 Test scenes
6.2 Backplate
6.3 Equirectangular HDR panorama
6.4 Images for quality survey
6.5 Score distribution for the images

A.1 Visibility function using 16 SH coefficients
A.2 Visibility function using 36 SH coefficients
A.3 Visibility function using 100 SH coefficients
A.4 Visibility function using 10,000 SH coefficients

D.1 16 SH coefficients on Hektor
D.2 Baked illumination point cloud & brick map levels of Hektor

F.1 Real photo
F.2 Ambient occlusion
F.3 Area light shadows
F.4 Spherical harmonic lighting


List of Tables

4.1 Comparison of speed of different sampling distributions

6.1 Measured rendering data for baking SH point clouds
6.2 Measured rendering data for baking SH point clouds on 1 or 5 nodes
6.3 Measured rendering data for reusing SH point clouds
6.4 Comparing measured data for shadow generation
6.5 Comparing measured data for reusing baked illumination
6.6 Results from quality survey
6.7 Descriptive statistics of survey results
6.8 Wilcoxon Test: Two-sided Probabilities using Normal Approximation



“Wo viel Licht ist, ist starker Schatten”1

- Johann Wolfgang von Goethe, Götz von Berlichingen

1 “A strong light casts a deep shadow”


Abstract

In this thesis, we evaluate the differences between the well-known ambient occlusion shadows and a recent algorithm called spherical harmonic lighting for feature-film-quality computer graphics. To ensure a fair comparison using PhotoRealistic RenderMan as a test environment, we describe the steps for extending its capabilities with algorithms to precompute spherical harmonic encoded directional occlusion. To accomplish this, we develop three shaders written in the RenderMan Shading Language to split the required algorithms into smaller tasks.

Through custom passes that we will implement for RenderMan Studio, this approach offers the freedom to automate, separate and repeat each task and therefore saves a lot of adjustment and render time.

To evaluate the shadowing techniques, we initially distinguish between two clearly defined main categories. First, we investigate the efficiency of the algorithms, i.e., how the technical resources are occupied. This involves criteria such as the RAM used for calculating, the disk space required for the created output files and, of course, the render time. Second, and more important, is the quality of the generated shadows; we conducted a survey to collect this subjective data.

Technically speaking, ambient occlusion starts with a slight speed advantage, because no time-consuming pre-computations (such as the calculation of the spherical harmonic coefficients) are necessary. On the other hand, when it comes to reusing already baked data, spherical harmonic lighting is the faster technique. But the time lost in calculating the coefficients requires us to go through numerous lighting iterations with spherical harmonic lighting to catch up with ambient occlusion.

As far as quality is concerned, a single ambient occlusion shadow in a photorealistic rendering appears unnatural to most survey participants. By contrast, spherical harmonic lighting was much better accepted and, based on the participants’ ratings, there is no verifiable, statistically significant difference between spherical harmonic lighting and the real photograph.



Chapter 1

Introduction

1.1 Motivation

Driven strongly by the film and gaming industries, the creation of photorealistic or believable imagery has become a high priority in computer graphics [CG].

A wide range of factors may influence the feasibility of this goal, but most seem to fall into one of the following three categories: hardware, software and artistic influences. The first category (hardware) is beyond the scope of this thesis and will not be covered here, but as a general rule, hardware plays an important role in determining the speed at which images can be generated and in helping the artist to improve workflows and visualizations.

The role of software is to provide artists with a set of tools that will give them the creative space they need to develop synthetic images. Because every branch of the CG industry has its own requirements and preferences, there are many different software packages, each trying to provide either a comprehensive or an in-depth solution. One thing they all have in common is their attempt to help artists achieve their own or their director’s artistic vision.

Artistic choices define how the final images will look. Whenever the decision is made to imitate real life, the goal is probably a photorealistic look. An obvious example is the visual effects [VFX] industry, where the artistic choice has been to merge CG environments into live background plates. To a large extent, this also applies to the visualization and computer games industries. By contrast, in other situations the aim might be the so-called non-photorealistic [NPR] style often seen in full CG feature films or video games. But what most projects have in common is that they try to create believable, inherently consistent images.




Since visual perception is based on light and every real object casts a shadow, lighting and shadowing play a key role in most aspects of the visualization industries, including film, architecture and game productions. Many departments can benefit from better lighting and shadowing. These departments include, but are not necessarily limited to, modeling, animation and final rendering. As a rule, believability is very important - not only for full CG, where the artistic direction may involve creating fantasy worlds, but also when aiming at photorealistic imagery. As a result, lighting and shadowing will always play an important role.


Figure 1.1: Examples of how lighting and shadowing can: (a) make it easier to evaluate models (Dragon model courtesy of Stanford University1); (b) help detect mistakes in animation (floating); (c) change the personality of a character or scene

Therefore, it is essential to provide lighting conditions that appear physically plausible to the viewer. For movies, this is done using a software component called a “renderer”. Based on given scene descriptions, it calculates all the effects associated with illumination in the final image. The natural counterpart of light, shadow, is - due to its subtlety - often considered to be only a secondary illumination effect. But in reality, shadows are at least as important as the light itself.

1 Stanford University Computer Graphics Laboratory: The Stanford 3D scanning repository, http://graphics.stanford.edu/data/3Dscanrep/



Nowadays, there are several shadowing techniques to choose from and these can be merged in any combination to serve a variety of purposes. Generally, these methods can be divided into two categories: shadows from direct and from indirect lighting. From a light-source point of view, direct shadows are simply generated by lighting everything that is directly visible to the light source and darkening everything that is hidden. On the other hand, indirect shadows - also referred to in the literature as “accessibility shading” or “ambient occlusion” - are computed by determining how much geometry directly visible from a surface point could block incoming light; the more geometry surrounds the point, the darker it is rendered. The latter technique will be examined in greater detail, because currently available methods provide only a physically inaccurate effect, as will be described in the next section.

(a) Ambient Light (b) Spot light

(c) Shadow Maps (d) Ambient occlusion

Figure 1.2: Spatial perception increases with the quality of the lighting. From very flat scenery lit by an ambient light (a); to a spotlight that adds some volume to the objects (b); to a correct impression of depth with direct shadows (c); and finally to an even more volumetric and deeper scenery effect with ambient occlusion (d).


1.2 Objective

The regular procedure for simulating indirect shadows has two huge constraints that can lead to three errors.

• First, it is assumed that the only source of light is a hemisphere surrounding the scene that uniformly emits light toward the inside.

• Consequently, the shadows obtained are dependent only on the geometry in the scene, and not on any kind of light source. This implies that the shadowing will not change much if the light’s position changes.

• Based on this, an enormous amount of data is collected to determine - given the current shading point - the locations of any objects that could obstruct the aforementioned light. From these data sets, only the arithmetic mean is then calculated and used. [Akenine-Möller et al. 2008]

To summarize, erroneous assumptions are used to collect a tremendous amount of data, only a poor approximation of which is then actually taken into account in order to create a fictional lighting situation. Nevertheless, in many cases this provides a visually pleasing and therefore satisfying result. (Figure 1.3 illustrates the concept.)

Figure 1.3: The current shading point examines how much of the surrounding hemisphere is hidden by other surfaces to calculate an occlusion percentage
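The hemisphere test described above can be sketched as a small Monte Carlo estimator. The following C++ fragment is illustrative only and is not taken from the thesis tool: rayHitsGeometry is a hypothetical stand-in for a real ray-geometry intersection query, here simulating a wall that blocks the -x half of the hemisphere.

```cpp
#include <cassert>
#include <cmath>
#include <cstdlib>

const double PI = 3.14159265358979323846;

// Hypothetical stand-in for a ray-geometry intersection query. Here a wall
// blocks the -x half of the hemisphere, so the expected occlusion is 0.5.
static bool rayHitsGeometry(double x, double y, double z) {
    (void)y; (void)z;
    return x < 0.0;
}

// Estimate ambient occlusion at one shading point: sample uniform random
// directions over the upper hemisphere (surface normal = +z) and return the
// fraction that is blocked: 0 = fully open sky, 1 = fully enclosed.
double ambientOcclusion(int numSamples) {
    int blocked = 0;
    for (int i = 0; i < numSamples; ++i) {
        // A uniform z gives a uniform area distribution on the sphere
        // (Archimedes' hat-box theorem), restricted here to z >= 0.
        double z = std::rand() / (double)RAND_MAX;
        double phi = 2.0 * PI * (std::rand() / (double)RAND_MAX);
        double r = std::sqrt(1.0 - z * z);
        if (rayHitsGeometry(r * std::cos(phi), r * std::sin(phi), z))
            ++blocked;
    }
    return (double)blocked / numSamples;
}
```

Note that reducing all samples to this single scalar is exactly the "arithmetic mean" reduction criticized above: any directional information about where the blockers lie is discarded.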

One approach for overcoming the first two errors would be to limit the lookups to a specific direction, but to re-collect the data every time something changes. Two techniques are commonly used for this purpose. The first is often referred to as “Directional Occlusion”. Here, one has the option of specifying a certain region of the hemisphere to be considered when performing intersection calculations, instead of using the full range. To do so, it is necessary to ignore the surface normal and use a custom vector,



which is usually the direction from the current shading point to a specified light. (Figure 1.4)

Figure 1.4: With directional occlusion, the lookup area of the hemisphere is limited to a smaller cone angle
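The cone restriction amounts to a simple angular test. The sketch below is illustrative (the vector type and function names are assumptions, not code from the thesis): a sample direction is counted in the occlusion lookup only if it lies within a given half-angle of the custom axis.

```cpp
#include <cassert>
#include <cmath>

struct Vec3 { double x, y, z; };

static Vec3 normalize(const Vec3& v) {
    double len = std::sqrt(v.x * v.x + v.y * v.y + v.z * v.z);
    return Vec3{v.x / len, v.y / len, v.z / len};
}

static double dot(const Vec3& a, const Vec3& b) {
    return a.x * b.x + a.y * b.y + a.z * b.z;
}

// A sample direction contributes to directional occlusion only if it lies
// within a cone of the given half-angle (radians) around the custom axis,
// which is typically the vector from the shading point toward a light.
bool insideCone(const Vec3& sampleDir, const Vec3& axis, double halfAngle) {
    return dot(normalize(sampleDir), normalize(axis)) > std::cos(halfAngle);
}
```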

The second method is called “Bent Normal”. It uses a vector that is the average direction of all the unoccluded sample rays radiating from the current point. This “Bent Normal” functions as a lookup into the environment map to receive incident radiance for parts of the surface that lie within recesses or indentations, in order to approximate image-based lighting [IBL]. The true surface normal of an area that lies within an indentation is probably obscured by the model itself, and the surface would not be receiving light from the portion of the environment map that the regular normal would sample. The advantage now is that the sampled illumination value from the environment map is closer to the color of light that should be reaching the depression. (Figure 1.5)


Figure 1.5: With the bent normal, the lookup for the environment color is shifted to prevent an obscured return value
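The bent normal itself is just the normalized average of the unoccluded sample directions. The sketch below is again illustrative rather than the thesis's implementation; occludedDir is a hypothetical stand-in for a ray trace, with a wall once more blocking the -x half of the hemisphere, so the bent normal tilts from the true normal (0, 0, 1) toward the open +x side.

```cpp
#include <cassert>
#include <cmath>
#include <cstdlib>

const double PI = 3.14159265358979323846;

struct Vec3 { double x, y, z; };

// Hypothetical occlusion query: a wall blocks every direction with x < 0,
// so all unoccluded directions point into the +x half of the hemisphere.
static bool occludedDir(const Vec3& d) { return d.x < 0.0; }

// Bent normal: the normalized average of all unoccluded sample directions
// over the upper hemisphere (surface normal = +z).
Vec3 bentNormal(int numSamples) {
    Vec3 sum{0.0, 0.0, 0.0};
    for (int i = 0; i < numSamples; ++i) {
        double z = std::rand() / (double)RAND_MAX; // uniform hemisphere sample
        double phi = 2.0 * PI * (std::rand() / (double)RAND_MAX);
        double r = std::sqrt(1.0 - z * z);
        Vec3 d{r * std::cos(phi), r * std::sin(phi), z};
        if (!occludedDir(d)) {
            sum.x += d.x; sum.y += d.y; sum.z += d.z;
        }
    }
    double len = std::sqrt(sum.x * sum.x + sum.y * sum.y + sum.z * sum.z);
    return Vec3{sum.x / len, sum.y / len, sum.z / len};
}
```

With this particular wall, the bent normal comes out tilted roughly 45 degrees toward the open side; Figure 1.6 shows a configuration where exactly this averaging picks a direction that is itself occluded.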


The disadvantage here is that the illumination comes in at an angle, while the sampling model assumes the light comes vertically from the direction of the true surface normal. As a result, with bent normals, sampled illumination values appear much brighter than they actually should be, often leading to the mistaken belief that light is concentrated in recesses, just as radiosity illumination concentrates in corners. (See also Chapter 2.1) Furthermore, the approximation with the bent normal as the dominant direction of illumination is in some cases incorrect. (See Figure 1.6)


Figure 1.6: A sample case where the “Bent Normal” representation supplies an incorrect result, because the average of the unoccluded directions points in the direction of an occluding surface.

Another way to solve the problems described would be to store all the collected data. However, this would require a huge amount of disk space, more than even most professional companies have available. Moreover, because of the limited cache in current processors, these data could not be reapplied efficiently while rendering. Therefore, one of the most important areas of research involves ways of compressing such data.

A not-quite-new, but rediscovered, technique that tries to solve this particular problem uses “spherical harmonics”. For many years, these mathematical equations were mainly used in mathematical physics [Byerly 1893; Ferrers 1877; Heine 1861] and geophysics [Pratt 1865; Pec et al. 1982; Lemoine et al. 1998]. This technique is already known in the games industry, where it is used for generating shadows2, and it is gaining more and more popularity in the film industry. Spherical harmonic lighting [SHL] is a rendering technique that - just like ambient occlusion - uses a pre-processing step to project the lights, the model and the transfer function onto the spherical harmonic basis to realistically render scenes using any type of light source. Currently, almost none of the major renderers is capable of spherical harmonic lighting by default.

2 For example, the Unreal Engine: http://www.devmaster.net/engines/engine_details.php?id=25 (accessed March 2011)



The objective of this thesis is to evaluate different criteria for comparing ambient occlusion and spherical harmonic lighting. To make the results comparable, it is necessary to create an environment where all conditions (e.g. lights, models, camera, etc.) are the same, except for the bare shadowing algorithms. Pixar’s renderer, “PhotoRealistic RenderMan” [PRMan], is a well-established renderer, capable of computing ambient occlusion and everything needed for calculating images, but it currently lacks support for spherical harmonic lighting. Instead of creating an entirely new enclosed test renderer, this paper will demonstrate an approach for implementing the necessary algorithms and corresponding shaders to extend PRMan to make it capable of computing and using spherical harmonic lighting. PRMan’s customizable open architecture and shading language facilitate easy and seamless applications. First, an independent C++ tool will be implemented as a test bench that uses PRMan’s already existing and developer-friendly point cloud [ptc] file format for in- and output. Afterwards, a set of RenderMan shaders will be described which enables the calculation and usage of spherical harmonics directly in PRMan, to obtain a homogeneous environment for comparing the results of ambient occlusion and spherical harmonic lighting.

1.3 Outline

The structure of this thesis follows the steps taken to implement the tools and missing features, to provide a basis for comparison and to explore opportunities for further development.

To understand most of the underlying concepts and thoughts, it is vital to have a technical understanding of the procedures used. Therefore, Chapter 2, Background and Related Work, discusses the basic principles of the general rendering process and how PRMan is built around this concept to achieve the desired results. Before diving into spherical harmonics, the general idea behind signal compression will be explained and a brief overview of the history and development of spherical harmonics will be given.

The heavy mathematical equations of spherical harmonics constitute a fundamental part of this thesis, thus a clear understanding is essential. Chapter 3, Spherical Harmonics, provides some historical background on these functions, along with the most important definitions and properties.

To use the spherical harmonic equations for lighting, the functions need to be translated into machine-readable commands. First, an independent developing environment is created through a standalone tool written in C++, using the OpenGL extension for graphical output. From there, the functions are simply transferred to RenderMan


shaders. Chapter 4, Implementation, is primarily concerned with the conversion of the algorithms into the relevant programming languages.

To ensure smooth, artist3-friendly usage of the aforementioned shaders, Chapter 5, Pipeline Integration, shows how to customize some scripts for RenderMan Studio [RMS] version 3 to get a workflow similar to ambient occlusion in Autodesk® Maya®.

In Chapter 6, Results & Analysis, the two techniques will undergo some tests, which will then be evaluated based on different criteria.

Finally, Chapter 7, Conclusions & Future Work, will present some conclusions, along with several suggestions for further improvements.

3 In the context of this thesis, artist is interchangeable with user


Chapter 2

Background and Related Work

This chapter covers some of the key technologies necessary to properly understand the challenges and concepts introduced in the following chapters. First, we will explore the fundamental process of lights and shadows in digital content creation software, as well as how the rendering equation is used to compute the interaction of the scene objects and lights. RenderMan is a special kind of rendering architecture, so it is vital to understand the underlying concepts, because it differs from some of the other known approaches. Data compression is one of the advantages of spherical harmonics that needs to be understood before it is applied. Finally, we discuss some prior work related to the field of spherical harmonics in general and how this relates to computer graphics in particular.

2.1 Lighting

For photorealistic image synthesis, lighting is an indispensable aspect that contributes enormously to the believability and quality of the generated images. From a physical optics viewpoint, light propagation and interaction with different materials is a very complex process. The lighting models invented for computer graphics are simplified representations that still provide realistic results.

In computer graphics, the field of lighting distinguishes between two types of illumination. The first and simplest case is the so-called “direct illumination”. It is a very simplistic model that is useful only for a small number of lights, for previewing, or for realtime applications. Famous examples are the Phong [Phong 1975], Blinn [Blinn 1977] and Cook-Torrance models [Cook and Torrance 1982]. Direct illumination does not treat the scenery as a whole and calculates only directly incident light based on the properties of


the lights and the receiving surfaces, which is why it is also called “local illumination”. Therefore, it cannot compute global effects like shadowing, refraction or reflection, because those would require taking the environment into account. This is accomplished with the second method, the “global illumination” [GI] model.

Direct Illumination

The direct or local illumination model is based on Lambert’s cosine law [Lambert 1760], which states that the intensity of the incoming light is proportional to the cosine of the incidence angle of the light ray with respect to the surface normal. More specifically, the illuminated area increases as the angle increases, while the emitted intensity stays the same (see figure 2.1). That also explains why it is brighter at midday than in the evening.

Figure 2.1: Lambert’s cosine law illustrated

The ratio of A to A′ follows from the definition of the cosine of an angle θ in a right triangle1:

$$A = A'\cos\theta \quad\Rightarrow\quad A' = \frac{A}{\cos\theta} \tag{2.1}$$

The intensity of the incoming light can thereby be derived from the cosine of the angle between the incoming light vector and the surface normal. The cosine of the angle between two vectors is also equal to the dot product of the two vectors, each normalized.

$$L_{in} = L_E\cos\theta = L_E\,(l \cdot n) \tag{2.2}$$

1http://mathworld.wolfram.com/Cosine.html (accessed February 2011)


where
L_in  the light intensity received by the surface
L_E  the light intensity emitted by the light source
θ  the angle between the incoming light vector and the surface normal
l, n  the normalized vectors of the incoming light and the surface normal, respectively
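Equation 2.2 can be sketched in a few lines of Python (for illustration only; the thesis shaders themselves are written in RSL, and the helper name here is hypothetical):

```python
import math

def lambert_intensity(light_energy, light_dir, normal):
    """L_in = L_E (l . n), eq. 2.2: light_dir points from the surface toward
    the light; both vectors are assumed normalized. The dot product is
    clamped at zero so surfaces facing away from the light receive nothing."""
    dot = sum(l * n for l, n in zip(light_dir, normal))
    return light_energy * max(dot, 0.0)

# Light directly above an upward-facing surface: full intensity.
assert lambert_intensity(1.0, (0.0, 1.0, 0.0), (0.0, 1.0, 0.0)) == 1.0

# At a 60-degree incidence angle only cos(60°) = 0.5 of the energy arrives.
theta = math.radians(60.0)
l = (math.sin(theta), math.cos(theta), 0.0)
assert abs(lambert_intensity(1.0, l, (0.0, 1.0, 0.0)) - 0.5) < 1e-12
```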

Figure 2.2: Lambert BRDF

It is useful to know that the Lambertian surface also reflects light to the observer based on the cosine of the angle φ between the viewing vector I and the surface normal N. This is called the Bidirectional Reflectance Distribution Function [BRDF] ρ(I). To see why, the two equations 2.1 and 2.2 need to be combined.

$$L_{out} = \frac{L_{in}}{\cos\varphi}\,\rho(I) = \frac{L_E\cos\theta}{\cos\varphi}\,\cos\varphi = L_E\cos\theta \tag{2.3}$$

The area visible to the observer is inversely proportional to the cosine of the angle between the viewing vector and the normal vector (as in equation 2.1). The perceived brightness of the surface is therefore independent of the viewing angle, making it an ideal diffuse surface, similar to chalk, clay or matte paper.

Global Illumination

The second, much more difficult global illumination model, unlike local illumination, considers the whole environment and simulates light radiating from every surface [Cohen et al. 1993]. This model makes it possible to calculate interdependent effects such as reflections off glossy or diffuse surfaces, allowing the scene to be lit indirectly. For that reason, it is also known as the “indirect illumination” model. Furthermore, as already mentioned, it is a prerequisite for shadowing. GI can be accomplished using different techniques, which can be classified under two basic approaches: (1) algorithms that calculate the illumination depending on the camera position (e.g. path tracing [Kajiya 1986]). The advantage is that these procedures limit the calculations


to objects inside the viewing frustum. (2) The other class of GI methods begins calculations from a light source (e.g. photon mapping [Jensen 1996]); these can generate additional GI effects, such as caustics, and can solve the entire rendering equation (see section 2.1.1) without further extension.

Figure 2.3: An image rendered with direct illumination. The objects themselves have a smooth diffuse shading, due to the angle of incoming light, but the whole image appears like a collage of independent components.2

Figure 2.4: An image rendered with indirect illumination. The shadows and bouncing light make the image more realistic without changing any light settings. The transmission of color from the wall onto the lighter surfaces is clearly visible.2

2The images show a so-called Cornell Box: http://www.graphics.cornell.edu/online/box/ (accessed March 2011)


2.1.1 The Rendering Equation

In 1986, James Kajiya formulated a unified equation to solve the problem of computing “global illumination”, the effect of interreflections between various types of surfaces. He called it the “Rendering Equation” [Kajiya 1986]; it is a mathematical description of the behavior of light transport. It originally simulated a phenomenon that Kajiya derived from the field of radiative heat transfer [Siegel and Howell 2002]. The rendering equation is

$$L(x,x') = g(x,x')\left[\varepsilon(x,x') + \int_S \rho(x,x',x'')\,L(x',x'')\,dx''\right] \tag{2.4}$$

where
L(x, x′)  the energy of light at point x coming from point x′
g(x, x′)  the geometric relation between x′ and x, defined as 0 if they are not mutually visible, or 1/dist(x, x′)² otherwise
ε(x, x′)  the energy of light emitted from x′ to x
S  the union of all surfaces
ρ(x, x′, x′′)  the intensity of light scattered from x′′ to x by a patch of surface at x′, derived from the BRDF [Nicodemus 1965] at x′

Figure 2.5: The rendering equation states that the incoming light at x coming from x′ is, weighted by the relation between x and x′, the light emitted by the surface at x′ plus the sum of all incoming light that is modulated and reflected through the BRDF at x′ in the direction of x


This form of the equation is for the complex global illumination model. It considers diffuse interreflections under the law of energy conservation. When samples are reflected off a surface, they have to gather the color of the material and transfer it to the next intersected surface (e.g. figure 2.4). This effect is called “color bleeding”. GI adds a lot to the realism of a digitally created image, but solving an integral equation is a very complex task, as is computing the enormous number of ray bounces.

Customary lighting algorithms are based on the simpler direct illumination, meaning light rays travel only from the light source to the first intersecting surface, without being reflected afterwards. Without much effort, equation 2.4 can be simplified to compute only direct light.

$$L(x,x') = g(x,x')\left[\varepsilon(x,x') + \int_S \rho(x,x',x'')\,\varepsilon(x',x'')\,dx''\right] \tag{2.5}$$

The regular rendering equation is a recursive function, because the final intensity is calculated from the radiance values of former points (L(x′, x′′)) until the originating light source is reached. By replacing the radiance term L(x′, x′′) with ε(x′, x′′), only energy directly emitted by surfaces is considered, which is the case only for light sources.
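The recursion and its truncation can be illustrated with a toy, discrete analogue of the rendering equation (all scene values here are hypothetical; the integral over S becomes a sum over three points):

```python
# Discrete toy version of eq. 2.4 on a 3-point scene:
#   L(x, x') = g(x, x') * [ e(x, x') + sum_x'' rho * L(x', x'') ]
# Replacing the recursive term L(x', x'') with the emission e(x', x'')
# yields the direct-illumination-only form of eq. 2.5.

N = 3                      # points 0..2; point 2 acts as the light source
g = [[0.0 if i == j else 1.0 for j in range(N)] for i in range(N)]  # all mutually visible
e = [[0.0] * N for _ in range(N)]
e[0][2] = e[1][2] = 1.0    # the light (point 2) emits toward points 0 and 1
rho = 0.5                  # constant scattering term, for simplicity

def L(x, xp, depth):
    """Recursive radiance with a bounce limit; depth 0 keeps only emission."""
    incoming = 0.0
    if depth > 0:
        incoming = sum(rho * L(xp, xpp, depth - 1) for xpp in range(N))
    return g[x][xp] * (e[x][xp] + incoming)

def L_direct(x, xp):
    """Eq. 2.5: only directly emitted light is scattered, exactly once."""
    return g[x][xp] * (e[x][xp] + sum(rho * e[xp][xpp] for xpp in range(N)))

# For this scene the one-bounce truncation matches the direct form...
assert L(0, 1, 1) == L_direct(0, 1) == 0.5
# ...while allowing more bounces adds indirect light on top of it.
assert L(0, 1, 2) > L(0, 1, 1)
```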

2.1.2 Direct Shadows

Shadows are absolutely vital for realistic computer-generated images, because in the real world, every illuminated object casts a shadow. Furthermore, in addition to revealing the direction and number of incoming lights, shadows are essential for correctly perceiving the position, shape and depth of an object. If shadows are missing from an image, the human brain has a hard time decoding the information and the viewer senses that something is wrong (see figure 2.7 or 1.2 for examples). Another nice effect worth mentioning is the expansion of the scene: simply letting shadows from off-screen space be cast into the image gives the impression of a complete environment in which more objects seem to exist than are visible at first glance (figure 2.6).

Figure 2.6: Shadows cast by off-screen objects extend the scene. From [Birn 2005]

Generally speaking, shadows are surface or volume areas that are not visible to a light. More precisely, however, that is only half the truth, because there are two sorts of


Figure 2.7: Comparison between a model rendered without any shadows (a) and rendered with direct shadows (b). The Sponza Atrium was modeled by Marko Dabrovic

shadows: the so-called “hard shadows” and their logical counterpart, “soft shadows”. Hard shadows are characterized by the crisp outlines of the shadowed area. This effect is caused by the size of the lights. By default, distant lights, point lights and spot lights cast hard shadows. Point lights and spotlights emit their light from an infinitesimal point, meaning all rays diverge, so they cannot wrap around objects. Distant lights, on the other hand, emit parallel rays from infinitely far away, meaning that if an area is occluded, there is no chance that another ray from the same light source can light that area. In these cases, the statement at the beginning of this paragraph is true. (Figure 2.8)

In the case of soft shadows, there is a smooth transition range called the penumbra3, located between the fully occluded area (the umbra4) and the fully lit area. This is because soft shadows are cast by area lights, i.e. lights that have an actual size and shape. Therefore, it is possible that a shaded point on a surface sees only part of the light, like a partial solar eclipse. Thus, the amount of incoming light at those points is reduced, but not completely null. Depending on the size and position of the light, the shadows can also be crisp and divergent, smooth and parallel, or even very soft and

3Latin: paene = almost + umbra = shadow
4Latin: umbra = shadow


Figure 2.8: The shadows cast by infinitesimal light sources (left) and distant lights (right) by default consist only of a sharp umbra

convergent. In the last scenario, an additional shadowing area occurs, the antumbra5, which requires the light to be larger than the occluding object; it only emerges after about 2/3 of the distance between the light source and the occluding body, behind the object concerned. (Figure 2.9)

Figure 2.9: The soft shadowing area consists of the umbra, the penumbra and, if the light source is larger than the occluding body, an antumbra.

There are several different methods for calculating shadowed areas. Only the two most popular will be described in more detail.

5Latin: anti = opposite + umbra = shadow


2.1.2.1 Depth Map Shadows

The most straightforward technique is probably the one referred to as “Depth Map Shadows” or simply “Shadow Maps” [Williams 1978]. It is also the most efficient technique and yields the most popular kind of shadows [Birn 2005]. Generally, the calculation consists of two steps.

1. First, a depth map or shadow map, which is an image rendered from the perspective of the light source, is computed; for each pixel, the distance to the nearest object in that direction is stored. Technically, though, it is a matrix, with each element displayed as a pixel.

2. Then, to render the image from the position of the camera, the shadow map is projected from the light using this matrix [Fernando and Kilgard 2003]:

$$\begin{pmatrix} s \\ t \\ r \\ q \end{pmatrix} = \begin{pmatrix} \frac{1}{2} & 0 & 0 & \frac{1}{2} \\ 0 & \frac{1}{2} & 0 & \frac{1}{2} \\ 0 & 0 & \frac{1}{2} & \frac{1}{2} \\ 0 & 0 & 0 & 1 \end{pmatrix} \cdot \begin{pmatrix} \text{Light} \\ \text{Frustum} \\ \text{(projection)} \\ \text{Matrix} \end{pmatrix} \cdot \begin{pmatrix} \text{Light} \\ \text{View} \\ \text{(look at)} \\ \text{Matrix} \end{pmatrix} \cdot \begin{pmatrix} \text{Modeling} \\ \text{(world)} \\ \text{Matrix} \end{pmatrix} \cdot \begin{pmatrix} x_0 \\ y_0 \\ z_0 \\ w_0 \end{pmatrix} \tag{2.6}$$

Reading from right to left, the coordinates of a point are transformed initially from object space into world space. Then they are further transformed into light space, where they are adjusted to fit into the light's viewing frustum. Now the transformed points have coordinates in a range from -1 to 1. Simply multiplying by the transformation matrix scales the values down and translates them to fit into the range [0, 1] of regular texture space. Now, from the camera view, for each pixel, the actual distance from the underlying shading point to the light is compared with the projected depth value at that point. If they are the same, the point is lit; otherwise, it is in shadow. (See figure 2.10 for illustration.)
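The depth comparison at the end of this second step can be sketched as follows (the function name and the tiny map are hypothetical; a real implementation would first apply the matrix of equation 2.6 to obtain (s, t) and the light-space depth, and would tune the bias carefully):

```python
def shadow_map_test(depth_map, s, t, z_p, bias=1e-3):
    """Depth-map shadow lookup: a shading point with light-space depth z_p,
    projected to shadow-map texel (s, t), counts as lit iff its depth does
    not exceed the stored nearest depth z_s. A small bias compensates for
    limited depth precision, which would otherwise cause self-shadowing."""
    z_s = depth_map[t][s]
    return z_p <= z_s + bias

# A 2x2 map with hypothetical values: an occluder at depth 0.3 covers texel (0, 0).
depth_map = [[0.3, 1.0],
             [1.0, 1.0]]
assert not shadow_map_test(depth_map, 0, 0, 0.7)  # behind the occluder: shadowed
assert shadow_map_test(depth_map, 0, 0, 0.3)      # the occluder itself: lit
assert shadow_map_test(depth_map, 1, 1, 0.9)      # nothing closer stored: lit
```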

The quality of depth map shadows is heavily dependent on the resolution and settings of the shadow map. Using shadow maps for very large areas is not advisable; because of the finite resolution, the shadows become pixelated and the file size becomes an issue. The edges of the shadows can be evenly blurred by filtering the map, creating the impression of a soft shadow. Unfortunately, regular depth map shadows support neither transparent nor colored shadows, because they only store a single depth value, which rules out looking through transparent objects or picking up their color6. These and other features are facilitated by default with “raytraced shadows”.

6An exception is offered by “deep shadow maps” [Lokovic and Veach 2000], a technique that produces shadows for objects like glass, hair, fur, and smoke. Deep shadow maps store a representation of


Figure 2.10: In (a) the current shading point is shadowed, because its depth (Z_P) is greater than the value stored in the shadow map (Z_S); the point in (b) has the same depth as the stored value, so it is illuminated. From [Fernando and Kilgard 2003]

Figure 2.11: A simple rendering with a single spotlight casting a depth map shadow (a), where the fog in the light cone was added to emphasize the cast shadow; and the corresponding depth map from the perspective of the light (b), where the closest point is white and the farthest is black

2.1.2.2 Raytraced Shadows

The second shadowing technique is called raytraced shadows. Since rays are used to determine whether a surface is illuminated, the most obvious approach would be to start at the light source, with the closest intersection of the ray with a surface then being illuminated. But this method requires a lot of rays that might come to nothing or

the fractional visibility through a pixel at all possible depths. But only a few renderers, such as PRMan, support them.


might illuminate surfaces that are not visible to the camera. Therefore, the traditional way of tracing shadow rays works backwards. This means that a ray is cast into the scene for each pixel, starting from the camera. If that ray intersects an object, a new ray is spawned, aimed at the light source. This ray then checks whether it intersects any surface on its way to the light; an intersection results in shadow, while the point is illuminated if the ray gets through unimpeded. Figure 2.12 illustrates the concept.

Figure 2.12: Raytracing starts with primary rays fired from the camera (white). But for raytraced shadows, rays need to be fired from each rendered point towards the light, to see whether the path is clear (yellow) or blocked and requiring a shadow (red). Image courtesy of [Birn 2005]

Since the number of rays for raytraced shadows depends on the final rendering, the shadows are always as crisp as the rendered image. Often they do not need as much adjustment as shadow maps, and they naturally become lighter and take on color when shining through transparent objects. Moreover, the soft shadows cast by area lights are more realistic, because they become softer and softer as the distance to the casting object increases, whereas depth map shadows blur all edges equally (see figure 2.13). But the performance disparity between raytraced and depth map shadows can be huge, depending on how complex the scene is. Therefore, the latter are still mainly used in production, where time is an important factor. [Birn 2005]
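The core of a shadow ray test is a segment-versus-geometry intersection query. A minimal sketch for a single spherical occluder (function name and scene values hypothetical; production raytracers use acceleration structures over arbitrary geometry):

```python
import math

def sphere_blocks(origin, target, center, radius):
    """True if the segment from a shading point `origin` to the light
    position `target` intersects the sphere (center, radius), i.e. the
    shadow ray is blocked. Uses the standard ray-sphere quadratic."""
    seg = [t - o for o, t in zip(origin, target)]
    seg_len = math.sqrt(sum(c * c for c in seg))
    d = [c / seg_len for c in seg]                  # unit direction toward light
    oc = [o - c for o, c in zip(origin, center)]
    b = 2.0 * sum(di * oi for di, oi in zip(d, oc))
    c = sum(o * o for o in oc) - radius * radius
    disc = b * b - 4.0 * c
    if disc < 0.0:
        return False                                # ray misses the sphere entirely
    t_hit = (-b - math.sqrt(disc)) / 2.0
    return 0.0 < t_hit < seg_len                    # hit lies between point and light

# A unit sphere at the origin blocks the ray from (0,-3,0) to a light at (0,3,0)...
assert sphere_blocks((0.0, -3.0, 0.0), (0.0, 3.0, 0.0), (0.0, 0.0, 0.0), 1.0)
# ...but not a parallel ray that passes beside it.
assert not sphere_blocks((2.0, -3.0, 0.0), (2.0, 3.0, 0.0), (0.0, 0.0, 0.0), 1.0)
```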

2.1.3 Indirect Shadows

2.1.3.1 Ambient Occlusion

Ambient occlusion is a useful technique for adding shadowing to diffuse objects lit with ambient lighting. One of the main predecessors of ambient occlusion was the so-called “accessibility shading” [Miller 1994]. This technique was designed to model effects like cleaning, dirtying, aging, tarnishing and the like, based on surface variations.


Figure 2.13: A soft shadow generated by raytracing (top) and by a blurred depth map (bottom)

For instance, the inaccessibility of a surface area determines how much tarnish it will retain when it is cleaned. This requires calculating the size of a spherical probe that is tangent to the current point without penetrating any other surface (see figure 2.14a). Ambient occlusion can be interpreted as the accessibility of light.

Figure 2.14: Concept of accessibility shading for two intersecting planes (a); tangent-sphere accessibility for a sphere intersecting a plane (b). Images from [Miller 1994]

Even if the results sometimes appear similar, there is still a logical distinction between being able to access a point with a rag as opposed to being able to reach it with light rays. Derived from this method, [Zhukov et al. 1998] implemented a technique


they called the “obscurances illumination model”. It calculates shadows from a diffuse ambient light that comes in equally from all directions. The obscurance of a point P is defined as:

$$w(P) = \frac{1}{\pi} \int_{\omega \in \Omega} \rho(L(P,\omega)) \cos\theta \, d\omega \tag{2.7}$$

where

$$L(P,\omega) = \begin{cases} \operatorname{dist}(P,C) & \text{where } C \text{ is the first intersection point of ray } P\omega \text{ with the scene} \\ +\infty & \text{otherwise (ray } P\omega \text{ does not intersect the scene)} \end{cases}$$

P  the point on the surface
ω  the direction of the sample ray
Ω  the unit hemisphere centered in P, aligned with the surface normal in P
ρ(L(P, ω))  a function that remaps L(P, ω) to values between 0 and 1, giving the magnitude of incoming ambient light from direction ω
θ  the angle between direction ω and the normal at P
1/π  the normalization factor, such that if ρ() = 1 over the whole hemisphere Ω, then w(P) = 1

The total amount of incoming light intensity at P is calculated as:

$$I(P) = I_A \cdot w(P) \tag{2.8}$$

with I_A being an ambient light power that is constant for the whole scene.

The idea of ambient occlusion is a simplification of the “obscurances illumination

model”. It was first introduced in a course about RenderMan at the SIGGRAPH conference in 2002, where [Landis 2002] and [Bredow 2002] presented their two different approaches, both of which they called ambient occlusion. [Bredow 2002] described how, for the movie “Stuart Little 2”, Sony Pictures Imageworks used two very large area lights, one representing the sky and the other the ground. (See figure 2.15)

In the same course, [Landis 2002] describes the approach taken at Industrial Light & Magic for the movie “Pearl Harbor”. This same technique is now used in numerous movie productions and computer games and is implemented in every major rendering software package. Hayden Landis and his two colleagues, Ken McGaugh and Hilmar Koch, were awarded the Academy Technical Achievement Award at the Oscars in 2010 for this innovation. The article [Landis 2002] published a shader that randomly shoots


Figure 2.15: Test rendering from Sony Pictures Imageworks’ “Stuart Little 2” with their ambient occlusion approach. Image from [Bredow 2002]

out rays (samples) into the scene and checks whether they intersect an object. The number of samples that do not hit an object inside a user-defined radius is simply divided by the total number of samples fired out (see figure 2.16). This percentage of obscurance can be formulated as:

$$AO(P) = \frac{1}{n} \sum_{i=1}^{n} V(P,\omega_i) \tag{2.9}$$

where V(P, ω_i) is a binary visibility function defined as

$$V(P,\omega_i) = \begin{cases} 1 & \text{if the sample in direction } \omega_i \text{ is not blocked} \\ 0 & \text{if the sample in direction } \omega_i \text{ is blocked} \end{cases}$$

This sum is a Monte Carlo estimation of an integral [Lepage 1978; Weinzierl 2000], following the law of large numbers. Rewritten as an integral:

$$AO(P) = \frac{1}{\pi} \int_{\omega \in \Omega} V(P,\omega) \cos\theta \, d\omega \tag{2.10}$$

where (1/π) cos θ is the “probability density function” [PDF] for V, normalizing the result over the full hemisphere with uniform directions weighted by the cosine [Méndez-Feliu and Sbert 2009]. This is the regular ambient occlusion equation often found in the literature7. Furthermore, it is apparent that there is a similarity between equation 2.10

7Mostly in literature and software, the visibility function V(P, ω) is defined inversely to the one above, returning 1 for blocked and 0 for unblocked samples. The ambient occlusion is then stated as 1 − AO(P).


and equation 2.7. The only difference is that the obscurances illumination model incorporates the whole scene and uses the varying distance values, while ambient occlusion looks within a defined range and only distinguishes between values of 0 and 1 for intersections and misses, respectively. But this disparity makes it possible to achieve a considerable boost in computation speed.
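The Monte Carlo estimator of equations 2.9/2.10 can be sketched as follows (function names and the toy occluder are hypothetical; the cos θ factor of the integrand is folded into the sample distribution, a standard variance-reduction choice rather than necessarily the one used by [Landis 2002]):

```python
import math
import random

def ambient_occlusion(point, blocked, n_samples=1024, rng=None):
    """Monte Carlo estimate of eq. 2.9: the fraction of hemisphere samples
    (cosine-weighted about the surface normal, assumed +z here) that are
    NOT blocked. `blocked(point, direction)` is a user-supplied visibility
    query returning True if the sample ray hits geometry."""
    rng = rng or random.Random(0)
    unblocked = 0
    for _ in range(n_samples):
        # Cosine-weighted hemisphere direction (PDF = cos(theta) / pi),
        # so the cos(theta) of eq. 2.10 needs no explicit weighting.
        u1, u2 = rng.random(), rng.random()
        r, phi = math.sqrt(u1), 2.0 * math.pi * u2
        w = (r * math.cos(phi), r * math.sin(phi), math.sqrt(1.0 - u1))
        if not blocked(point, w):
            unblocked += 1
    return unblocked / n_samples

# Hypothetical occluder: a wall blocking every direction with x > 0,
# i.e. exactly half of the hemisphere, so the estimate should be near 0.5.
ao = ambient_occlusion((0.0, 0.0, 0.0), lambda p, w: w[0] > 0.0)
assert abs(ao - 0.5) < 0.05
```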

Figure 2.16: Ambient occlusion is calculated by shooting out samples and dividing the number of samples that do not hit any surface by the total number of samples fired out.

2.1.3.2 Spherical Harmonic Lighting

The term “spherical harmonic lighting” [SHL] describes a technique for rendering diffuse objects in low-frequency lighting environments that was first introduced by [Sloan et al. 2002]. It is based on the fact that incoming light can be interpreted as a multi-dimensional signal, similar to those in physics, coming from a surrounding hemisphere. Therefore, we can treat it as a function and apply transformations to it for easier manipulation as well as compression. (See section 2.3)


Low-frequency lighting refers to the rate of variation in the illumination coming from the environment. A low frequency means that there is only slow change over the hemisphere, resulting from large area lights with preferably constant illumination. High frequencies, on the other hand, are produced either by small light sources, like point lights, or by lights projecting textures with fine details. These are hard to represent, because a fast-changing signal is difficult to compress and reconstruct. An ideal lighting condition would be an outdoor scene on an overcast day, where the primary light is scattered across the sky dome and comes from every direction [Green 2003].

Spherical harmonics are a mathematical set of basis functions that, as the name indicates, are represented in spherical coordinates. Since a lookup into an environment is based on hemispherical sampling and integration, this is a handy property. A further explanation of the spherical harmonics equations and their properties is provided in Chapter 3. This section focuses on the principles of using spherical harmonics for lighting.

Spherical harmonic lighting can be used to represent three progressively complex responses of shaded objects to their environment. These are transfer functions, mapping incoming to outgoing radiance. [Sloan et al. 2002]

The unshadowed diffuse transfer function is essentially just a representation of direct illumination. It is given by

$$T_{DU}(P) = \frac{\rho_P}{\pi} \int_S L_i(P,\omega_i)\, H_{N_P}(P)\, d\omega_i \tag{2.11}$$

where
P  the point on the surface
ω_i  the incoming direction
S  the unit sphere centered in P, aligned with the surface normal in P
T_DU(P)  the amount of light leaving point P
ρ_P  the surface albedo at point P
L_i(P, ω_i)  the incoming light at point P along vector ω_i
H_{N_P}(P) ⇔ max(N_P · ω_i, 0)  the cosine-weighted hemispherical kernel about N_P
N_P  the surface normal at P

Due to the orthogonality of the spherical harmonics basis functions, the projected integral of the illumination L_i multiplied by the transfer function, denoted M_DU = H_{N_P}(P), is the same as the dot product of their SH-projected coefficient vectors


[Sloan et al. 2002; Green 2003].

Extending equation 2.11 to include a visibility function leads to the shadowed diffuse transfer, incorporating self-shadowing.

$$T_{DS}(P) = \frac{\rho_P}{\pi} \int_S L_i(P,\omega_i)\, H_{N_P}(P)\, V(\omega_i)\, d\omega_i \tag{2.12}$$

where V(P, ω_i) is the visibility term: 1 if sample ω_i fails to intersect the object itself, 0 otherwise (see eq. 2.9).

With this modification, the lighting gains a lot of realism compared to the unshadowed diffuse transfer, since it is now a global illumination model. Here, the transfer function reads M_DS = H_{N_P}(P) V(P, ω_i). The visibility term is the same as in ambient occlusion. Using the visibility test from figure 1.3 and treating it as a function in polar coordinates, the 2D equivalent of spherical coordinates, would result in the graphs depicted in figure 2.17. Appendix A shows the corresponding SH representation with different numbers of coefficients. A relatively easy method for projecting V(P, ω_i) into spherical harmonics involves separately projecting each of its visibility samples ω_i that do not intersect other objects8 and adding up their coefficient vectors. Because this computed vector will be oversized, it needs to be scaled back down to fit the unit sphere by dividing it by the probability of equal distribution on the sphere, p(ω_i) = 1/(4π), and by the total number of samples.

$$\vec{c}_v = \frac{4\pi}{N} \sum_{i=1}^{N} \vec{c}_i\, V(P,\omega_i) \tag{2.13}$$

where
c⃗_v  the coefficients of V projected into SH
4π  = 1/p(ω_i), the inverse of the equal-probability function on a sphere
N  the total number of samples
c⃗_i  the coefficients of ω_i projected into SH

which is the vector of coefficients of the projected visibility function. Multiplying c⃗_i by V helps meet the requirement to gather only the non-intersecting samples, because in the other case V(P, ω_i) = 0 eliminates them.

8meaning the directions ω_i for which V(P, ω_i) = 1
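Equation 2.13 can be sketched in Python. The function names are hypothetical; for self-containment only the first two SH bands (4 coefficients, with the standard real-SH normalization constants) are evaluated, whereas practical SHL typically uses 3 to 5 bands:

```python
import math
import random

def sh_basis(w):
    """Real spherical harmonics basis for bands l = 0..1 at direction w;
    0.282095 = sqrt(1/(4*pi)) and 0.488603 = sqrt(3/(4*pi))."""
    x, y, z = w
    return [0.282095, 0.488603 * y, 0.488603 * z, 0.488603 * x]

def project_visibility(visible, n_samples=20000, rng=None):
    """Eq. 2.13: sum the SH basis vectors of the unblocked, uniformly
    distributed sphere samples and rescale by 4*pi / N."""
    rng = rng or random.Random(0)
    coeffs = [0.0] * 4
    for _ in range(n_samples):
        # Uniform direction on the unit sphere.
        z = 2.0 * rng.random() - 1.0
        phi = 2.0 * math.pi * rng.random()
        s = math.sqrt(1.0 - z * z)
        w = (s * math.cos(phi), s * math.sin(phi), z)
        if visible(w):  # V(P, w_i) = 1: accumulate this sample's coefficients
            for j, y_j in enumerate(sh_basis(w)):
                coeffs[j] += y_j
    return [4.0 * math.pi / n_samples * c for c in coeffs]

# Sanity check: a fully unoccluded point projects the constant function
# V = 1, so only the band-0 coefficient survives (4*pi * 0.282095 ~ 3.545)
# while the band-1 coefficients vanish up to Monte Carlo noise.
c = project_visibility(lambda w: True)
assert abs(c[0] - 3.5449) < 1e-3
assert all(abs(cj) < 0.15 for cj in c[1:])
```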


Figure 2.17: The visibility function in polar coordinates (a); and “unwrapped” in Cartesian coordinates (b)

The last possible method is the interreflected diffuse transfer, used to add color bleeding.

$$T_{DI}(P) = T_{DS}(P) + \frac{\rho_P}{\pi} \int_S L(P',\omega_i)\, H_{N_P}(P)\, \bigl(1 - V(P,\omega_i)\bigr)\, d\omega_i \tag{2.14}$$

where
L(P′, ω_i)  the light reflected from a point P′ towards point P
(1 − V(P, ω_i))  the inverse visibility: 1 for intersecting samples and 0 otherwise

The shadowed diffuse transfer described before only considers the direct light coming from the surrounding hemisphere. To also consider the indirect light coming from other points, the term L acts as a sort of placeholder that takes into account all the transfer functions of the surrounding points visible to the current shading point. Therefore, it is not possible to explicitly denote the transfer function for interreflection, M_DI, because the number of visible points and their transfer functions always differ.

It can be difficult to understand what is meant by “taking all the transfer functions of the surrounding points into account”. The following diagrams simplify the concept by illustrating the math behind it. After calculating the shadowed diffuse transfer, the occluded areas are checked for points that can reflect light to the current point, and a cosine-weighted copy of the transfer function there is added into the current one.

At first it might seem surprising that illumination from a direction not directly visible is added. But as already mentioned, the main property of SHL is low-frequency light sources with only small variation. Therefore, it can be assumed that the light reflected by P′ has a very similar appearance to the light above P.


Figure 2.18: Point P fires a ray that intersects point P′, to calculate the diffuse interreflection

Figure 2.19: The shadowed diffuse transfer of point P is augmented by the cosine-weighted copy of the transfer function at P′

Even though the calculation for SHL commonly assumes only diffuse surfaces, it is possible to a certain degree to pre-compute glossy BRDFs by using a coefficient matrix instead of a vector [Sloan et al. 2002]. However, highly specular appearances cannot be modeled, due to their dependence on the viewing angle and the high frequencies in specular reflections.

2.2 RenderMan

The history of RenderMan reaches back to the early 1980s, when George Lucas founded the Graphics Group as part of Lucasfilm’s computer division. This group of


CG researchers, who are well known today, began to develop a new renderer with an innovative new architecture.

At the annual SIGGRAPH conference in 1987, Cook, Carpenter and Catmull presented a paper on the core components of the rendering system they had implemented [Cook et al. 1987]. They called the algorithm they developed for rendering “Reyes”9.

In 1990, again at the SIGGRAPH conference, Pat Hanrahan and Jim Lawson published the first paper on the shading subsystem for RenderMan, the RenderMan Shading Language [RSL] [Hanrahan and Lawson 1990].

The term “RenderMan” actually stands for the standard interface that connects modeling and animation tools with a renderer. Rendering software that supports the interface specifications is said to be “RenderMan-compliant”. The RenderMan Interface Specification was defined by Pixar [Pixar 2005] and published in 1988. Until 1994, the only RenderMan-compliant renderer available was Pixar’s own implementation of the interface, PhotoRealistic RenderMan [PRMan]. Therefore, the term “RenderMan” is often used as a generic term to refer to PRMan itself. Since then, several other RenderMan-compliant renderers have appeared, but this thesis uses PRMan10. Other famous implementations include 3Delight, AIR, Angel, Aqsis, BMRT, Entropy and Pixie.

The RenderMan Interface Specification is divided into two parts: the RenderMan Interface (section 2.2.1) and the RenderMan Shading Language (section 2.2.2).

To maximize efficiency, Reyes renderers follow a specific divide-and-conquer pipeline before actually processing the data (which is provided to the renderer via a “RenderMan Interface Bytestream” [RIB] file). The steps down the pipeline are:

Bounding and Splitting: Each piece of input geometry is bounded by an appropriate bounding box. Using this bounding box, it is easy to determine whether the contained geometry is completely or partially visible. Only the pieces that are at least partially visible to the camera stay in memory and will be split and bounded again. This process of bounding, culling and splitting continues until all remaining primitives reach a set “bucket” size.

Dicing Dicing subdivides the resulting bucket-sized patches into smaller rectangular “micropolygons”. Each of these micropolygons will cover the same amount of screen space, as defined by the artist, regardless of its distance from the camera. For production, that is approximately one pixel each.

9It is an acronym for “Render Everything You Ever Saw”, a convenient phrase that summarizes their ambitious undertaking.

10To be precise, the versions RenderMan Pro Server 15.2 and RenderMan Studio 3.0


2.2. RenderMan 29

Shading Now the shading engine is run on the vertices of each micropolygon. The shaders are executed in a SIMD (Single Instruction Multiple Data) manner for one bucket in parallel.

Busting and Hiding All micropolygons will now be detached into independent primitives to perform a last visibility test. After shading and displacing, RenderMan determines which micropolygons are hidden by other surfaces or moved outside the frustum, to further increase efficiency through culling.

Sampling To calculate the final image, a user-defined number of samples is fired to collect color values. In order to prevent aliasing, this is normally done on a sub-pixel basis, meaning that more than one sample is used per pixel.

Compositing and Filtering Depending on the depth and opacity of a returned sample value, the renderer determines the effect it has on the pixel. This blending continues until the pixel is considered opaque. The user-defined reconstruction filter then determines the final color from the transition between neighboring pixels.
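The bound-split-dice control flow at the head of this pipeline can be sketched as a simple recursion. The following C++ sketch is purely illustrative: the types `Primitive` and `Bounds`, the splitting rule and the one-micropolygon-per-pixel dicing estimate are invented stand-ins, not PRMan internals.

```cpp
#include <cmath>
#include <vector>

// Hypothetical stand-ins for renderer data structures.
struct Bounds { double w, h; };                 // screen-space extent in pixels
struct Primitive {
    Bounds bound() const { return {w, h}; }
    // Splitting halves the primitive along its larger axis.
    std::vector<Primitive> split() const {
        if (w >= h) return {{w / 2, h}, {w / 2, h}};
        return {{w, h / 2}, {w, h / 2}};
    }
    bool visible() const { return true; }       // camera-frustum test omitted
    double w, h;
};

// Recursively bound, cull and split until a piece fits a bucket,
// then dice it into micropolygons of roughly one pixel each.
// Returns the total micropolygon count produced.
int boundSplitDice(const Primitive& p, double bucketSize) {
    if (!p.visible()) return 0;                 // cull invisible pieces
    Bounds b = p.bound();
    if (b.w > bucketSize || b.h > bucketSize) {
        int micropolygons = 0;
        for (const Primitive& child : p.split())
            micropolygons += boundSplitDice(child, bucketSize);
        return micropolygons;
    }
    // Dicing: approximately one micropolygon per pixel of screen area.
    return static_cast<int>(std::ceil(b.w) * std::ceil(b.h));
}
```

A 32x32-pixel primitive with a 16-pixel bucket size is split twice and diced into 1024 micropolygons in this toy model.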

2.2.1 The RenderMan Interface

The RenderMan Interface Specification [Pixar 2005] [RISpec] was published one year before PhotoRealistic RenderMan was released. At that time, every rendering or modeling tool had its own scene description language. The intention was to create a unified “Application Programming Interface” [API] that was supposed to become a standard for 3D scenes, similar to PostScript [Adobe 1986] for 2D.

It is important to note that the RenderMan Interface [RI] specifies only what to render, not how to render it. It does not matter whether the renderer uses regular scanline rendering [Wylie et al. 1967], ray tracing [Whitted 1980], Reyes [Cook et al. 1987], or any other technique. The renderer must accept the complete “RenderMan Interface Bytestream” [RIB] to generate an image, which can be directly displayed or saved as an image file.

Although a renderer may provide a wide range of capabilities, some fundamental features are required to make it RenderMan-compliant. This set of features is comprehensive in order to ensure users basic compatibility between, and high performance from, all implementations of the RenderMan Interface. An important restriction is that rendering programs cannot provide nonstandard alternative mechanisms, such as an alternative shading language.

To implement the RenderMan Interface, a rendering program must11:

11As listed in [Pixar 2005]


30 Background and Related Work

Figure 2.20: A data flow diagram of the steps used by Reyes renderers. Image from [Cortes and Raghavachary 2008]

• provide the complete hierarchical graphics state, including the attribute and transformation stacks and the active light list.

• perform orthographic and perspective viewing transformations.

• perform depth-based hidden-surface elimination.

• perform pixel filtering (the way of averaging samples for a pixel) and antialiasing (avoiding jaggies).


• perform gamma correction (controlling the brightness of an image to correctly display it) and dithering (intentional noise, to prevent banding) before quantization.

• produce picture files containing any combination of RGB, A, and Z. The resolutions of these files must be as specified by the user.

• provide all of the geometric primitives described in the specification, and provide all of the standard primitive variables applicable to each primitive.

• provide the ability to perform shading calculations using user-supplied RenderMan Shading Language programs.

• provide the ability to index texture maps, environment maps, and shadow depth maps.

• provide the fifteen standard light source, surface, volume, displacement, and imager shaders required by the specification. Any additional shaders, and any deviations from the standard shaders presented in this specification, must be documented by providing the equivalent shader expressed in the RenderMan Shading Language.

In addition to these capabilities, some advanced features such as motion blur, area lights or ray tracing might be implemented; descriptions of these can also be found in [Pixar 2005].

The aforementioned RIB file is an ASCII- or binary-encoded collection of all models, cameras, attributes, options, lights and shaders that need to be rendered. It can either be used as an archive file format or to easily distribute the information necessary for remote rendering. The RIB file is structured as a list of commands inside of nested blocks. Figure 2.22 displays the general sequence of such blocks. Each command statement contained in such a block is only valid within that particular environment. A special command must be stated to start and to end these areas, namely BlocknameBegin and BlocknameEnd, respectively; “Blockname” can be either Frame, World, Transform or Attribute.

• A RIB file usually starts with some global options. These are attributes that affect only the image, not the objects it contains. The global options usually do not change over a sequence of frames, e.g. the image resolution or the display channels. The latter are important for outputting “Arbitrary Output Variables” [AOVs], which are numeric values from RenderMan shaders that get calculated anyway but are simply piped into separate display drivers that display them. Usually, these are either additional image files for later compositing, RenderMan’s specific 3D file format “point clouds” or user-defined drivers. (Figure 2.21)

Figure 2.21: A RenderMan beauty pass is a displayed12, or saved, image. In both cases, the data is “piped” through the primary display. The DisplayChannel enables AOVs that store any RSL numeric data to be associated with their own “pipe”, or so-called secondary display channel13.

• The frame block options may be the same image attributes (as the global options) but will probably change over a sequence of frames inside the RIB file, e.g. the field of view or camera position. Frame blocks are usually not used, however, because in most cases each frame is stored in a separate file with the frame options merged into the global options. This improves the handling of frames over several rendering machines.

• The “World Blocks” contain descriptions of the objects in the scene. In these blocks, additional Transform and Attribute blocks are available for further nesting of the corresponding operations.

A very basic example of a RIB file can be found in listing 2.1.

12“it” stands for Pixar’s “imaging tool”, for displaying images
13Source from http://www.fundza.com/rman_shaders/secondary_images/index.html (accessed February 2011)


Figure 2.22: The regular RIB file structure. Image from [Raghavachary 2004]

Listing 2.1: RIB file example

# sphere.rib
Display "sphere.tif" "file" "rgba"              # render to RGBA tiff file
Format 512 512 1                                # image format
Projection "perspective" "fov" 30               # perspective camera
WorldBegin                                      # start scene description
    LightSource "basicLight1" 1 "intensity" 0.8 # light source
    TransformBegin                              # temporary transformations
        Translate 0 0 5                         # move 5 units along Z
        AttributeBegin                          # nested attributes
            Surface "basicSurface"              # surface material
            Sphere 1 -1 1 360                   # draw unit sphere
        AttributeEnd                            # discard inner attributes
    TransformEnd                                # undo inner transforms
WorldEnd                                        # end scene


Figure 2.23: sphere.tif

This RIB example can be rendered by typing prman sphere.rib in the terminal, or in the command line on Windows, which will then save an image of a sphere with RGBA color channels and a resolution of 512×512 pixels as “sphere.tif”. The shaders for the light and the sphere surface can be found in the next section under listings 2.2 and 2.3, respectively.

2.2.2 The RenderMan Shading Language

Now, having shown how to tell the renderer what to render, it is time to tell it how to do it.

The RenderMan Shading Language [RSL] is a powerful C-like extension to the RenderMan Interface for defining the appearance of surfaces, lights, phenomena or even the image itself. Shaders are small subroutines that enable the renderer to perform an unlimited number of operations to create images that are limited only by the imagination of the artists themselves. The shading language per se has some restrictions designed to make it a lot simpler, but it is still possible to bypass these constraints. RSL also provides an interface to extend its functionality by programming external C functions as “Dynamic Shared Objects” [DSO shadeops].

The RenderMan Interface distinguishes between five different shader types. Each of these fulfills a specific role in the rendering process and therefore cannot be interchanged with the others, although all of them work together and can also communicate. The distinctive shaders for each task in the rendering process are:

Displacement shaders Displacement shaders are executed first in the shade tree, because they modify the position and/or the normal of surface points. This could reveal points that were obscured before, or vice versa. Moreover, the surface normal needs to be processed before the surface shader.

Surface shaders In the second shading stage, the surface shaders attached to all geometric primitives are processed. They are responsible for determining the color and opacity of each micropolygon based on the corresponding surface normals. To do so, they use procedural patterns or texture lookups that are manipulated by looping over all incoming lights.


Light source shaders Light shaders are run inside a surface shader, invoked by an illuminance loop. They can also use procedural patterns or textures to generate a light color for the invoking point.

Volume shaders Volume shaders simulate changes to the color of light rays traveling through a medium of dense atomic particles, either inside or outside a geometric primitive. Volume shaders consist of three different types that modulate different types of rays: the atmosphere affects rays traveling from the camera to objects; the interior is enclosed in geometry; the exterior is the medium between objects and light sources.

Imager shader Imager shaders perform operations on final pixels before they are output.

There used to be a sixth shader type, the Deformation shader, which some renderers might still support, but it has been officially removed from the RenderMan Interface since version 3.2.

To continue the example from section 2.2.1, the two shaders used there still need to be described.

Listing 2.2: Light shader example

/* -- basicLight1.sl -- */
light basicLight1( float intensity = 1.0 )
{
    solar() {
        Cl = intensity * color(1, 1, 1);   /* Cl: final light color */
    }
}

The solar() call makes the light a distant light source.

Listing 2.3: Surface shader example

/* -- basicSurface.sl -- */
surface basicSurface()
{
    normal Nn = normalize(N);      /* N:  surface normal */
    Oi = Os;                       /* Os: surface opacity, Oi: final surface opacity */
    Ci = Oi * Cs * diffuse(Nn);    /* Cs: surface color,   Ci: final surface color */
}

Before shaders can be used in a scene, they need to be compiled, i.e. translated into byte-code. This can be done by simply executing the commands shader basicLight1.sl and shader basicSurface.sl in a terminal window on Linux. This will create a new file for each shader, with the name of the shader (not the source file name) followed by the suffix .slo14. These will then be used by the RIB file.

For further reading on the RenderMan Interface or the shading language, [Upstill 1989], [Apodaca and Gritz 1999] or [Stephenson 2002] provide deeper insights.

2.3 Data Compression

Data compression is a digital procedure for reorganizing the structure of data to either save disk space or reduce transfer time. To this end, the original data needs to be encoded by the sender first and later decoded by the receiver for reuse [Strutz 2009].

In this section the terms “symbol”, “character” and “value” are synonyms for an element of a set called the “alphabet”. This set is the entirety of all the different symbols that occur in the current context.

Generally, there are three distinct types of procedures for compressing data that will be described here. In turn, each of these can be broken down into two different compression strategies.

Data Reduction

Data reduction techniques are all methods that remove the least important information from the source and thereby change the original information. Because the information loss is irreversible, this strategy is called “lossy compression”. Which portions of the information are unnecessary is often determined by a prior decorrelation.

There are two ways of reducing the amount of data:

Sub-sampling reduces the resolution of the information by using fewer values/samples, e.g., by only keeping every third value, or the extreme values, of a signal.

Quantization reduces the resolution of the samples by mapping the original data onto a finite number of representatives. The number of samples stays the same. For example: rounding floating-point values to the nearest integer.
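As a toy illustration of the two reduction schemes, the following C++ sketch implements both on a vector of samples (the helper names are our own, not from any compression library):

```cpp
#include <cmath>
#include <cstddef>
#include <vector>

// Sub-sampling: keep only every n-th sample. The resolution of the
// signal drops; the resolution of each individual value stays the same.
std::vector<double> subsample(const std::vector<double>& signal, int n) {
    std::vector<double> out;
    for (std::size_t i = 0; i < signal.size(); i += n)
        out.push_back(signal[i]);
    return out;
}

// Quantization: map every sample to the nearest multiple of `step`.
// The number of samples stays the same; their resolution drops.
std::vector<double> quantize(const std::vector<double>& signal, double step) {
    std::vector<double> out;
    for (double v : signal)
        out.push_back(std::round(v / step) * step);
    return out;
}
```

Both transforms discard information irreversibly, which is exactly why they belong to the lossy category.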

Coding

Coding exploits the redundancy of most real-world data by generating a more efficient representation of redundant values. Since the representation is unique, the procedure is reversible, a “lossless compression”.

Here, again, there are two different schemes implementing this technique:

14The suffix depends on the renderer. PRMan uses .slo, but other implementations all come with their own compilers and suffixes.


Entropy encoding, developed mainly by Shannon [Shannon 1948], quantifies the expected information value15 (entropy). It treats each occurring symbol independently and assigns shorter bit strings to frequently occurring symbols and longer bit strings to rarely occurring symbols. The most famous example is probably the Morse code.

Precoding makes use of statistical relationships between elements. For example, the order of letters in a sentence depends on the preceding letters (it is very probable that a ‘q’ will be followed by a ‘u’, but a ‘z’, for instance, is very unlikely). Occurring symbol sequences will be replaced by symbols from a different alphabet. Examples include “run length coding” and “dictionary-based coding”.
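Run length coding, the first precoding example mentioned above, can be sketched in a few lines of C++; each run of identical symbols is replaced by a (symbol, count) pair from a different alphabet, and the original signal can be reconstructed exactly (function names are illustrative):

```cpp
#include <string>
#include <utility>
#include <vector>

// Run-length coding: replace each run of identical symbols by a
// (symbol, count) pair. Lossless: decoding restores the input exactly.
std::vector<std::pair<char, int>> runLengthEncode(const std::string& in) {
    std::vector<std::pair<char, int>> out;
    for (char c : in) {
        if (!out.empty() && out.back().first == c)
            ++out.back().second;     // extend the current run
        else
            out.push_back({c, 1});   // start a new run
    }
    return out;
}

std::string runLengthDecode(const std::vector<std::pair<char, int>>& runs) {
    std::string out;
    for (const auto& r : runs)
        out.append(r.second, r.first);
    return out;
}
```

For example, "aaabcc" encodes to the three pairs (a,3), (b,1), (c,2).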

Decorrelation

Decorrelation concentrates the signal information of the source signal into fewer samples, and it can separate relevant signal parts from irrelevant parts. Most algorithms are designed for specific purposes, because decorrelation needs to know the signal characteristics in order to efficiently remove the correlation between values. For instance, in image compression, it is assumed that neighboring pixels are similar in value. Decorrelation is not a compression procedure per se; it simply expresses the information in a manner that facilitates the decisions in subsequent compression steps.

It is also subdivided into two main categories:

Prediction tries to forecast future signal values by recursively analyzing previous values. Here, only the so-called “prediction error” is stored, i.e. the difference between the actual value and the predicted value. With a well-functioning prediction, the errors are very small, and therefore similar, which simplifies subsequent compression. Additionally, instead of just using local correlation, this can be extended to detect temporal redundancy, for instance using an image sequence to predict motion by using reference images (I-frames).

Transformations and Filterbanks are basically equivalent [Strutz 2009]. Both decompose the signal into coefficients that weight the basis functions or frequency domains, respectively. Since they only map one description to another, these algorithms do not reduce data, but they can concentrate important data into fewer samples to make subsequent compression more effective, while having less impact on the reconstructed signal. While transformations, filters and their inverses may be valid and lossless operations from a mathematical standpoint, their use on computers may result in errors (due to limited precision) or may require more time than can be justified. Established examples include the discrete cosine transform, the wavelet transform and the famous Fourier transform.

15In information theory, the information content differs from the common understanding. It is also called the uncertainty value, which better describes the fact that the information value measures the predictability or uncertainty of a symbol, and therefore the lack of information if a specific symbol contained in a signal is missing.

2.4 Prior Work

Spherical harmonics [Courant and Hilbert 1953] are a set of eigenfunctions that represent a particular case of the harmonic functions [Axler et al. 2000], the solutions to Laplace’s equation. Since their discovery in the late 18th century, the usage of spherical harmonics has a long history that reaches over a wide range of fields. For instance, they are used in geophysics to model ocean tides [Dickman 1989] or the shape of the geoid [Pec et al. 1982]. In physics, they serve as a solution for potential functions like the heat equation [Byerly 1893; MacRobert 1948]. They are now used in medical research to model the asymmetric cortical surface. Other fields that are more relevant to rendering include the modeling of wave propagation [Ishimaru 1978] and radiation [Chandrasekhar 1960].

One of the first precise uses in computer graphics was for ray tracing dense volumes, like clouds or fog [Kajiya and Von Herzen 1984]. Since then, several others have implemented algorithms for computer graphics based on spherical harmonics. Among these are methods to model micro-scaled BRDFs [Cabral et al. 1987; Westin et al. 1992], a general light transfer simulation [Sillion et al. 1991], interactive illumination through image-based rendering [Wong et al. 1997a; Wong et al. 2002; Wong et al. 2003] and modeling of light source emission [Dobashi et al. 1995].

With respect to the main focus of this thesis, the first paper employing spherical harmonics in conjunction with lighting discussed the use of an approximated irradiance map to efficiently represent distant lighting for diffuse surfaces [Ramamoorthi and Hanrahan 2001]. One year later, the authors extended the implementation with what they called a “spherical harmonic reflection map” and isotropic BRDFs [Ramamoorthi and Hanrahan 2002]. Focusing primarily on the real-time/game market, spherical harmonics were introduced into interactive lighting in a process called “Precomputed Radiance Transfer” [PRT] that was presented by [Sloan et al. 2002]. This includes complex global illumination effects such as soft shadows and interreflections for a limited number of BRDFs. To handle some more general, particularly glossy BRDFs, it was extended by [Kautz et al. 2002; Lehtinen and Kautz 2003; Sloan et al. 2003; Kautz et al. 2005]. More complex material effects like subsurface scattering [Sloan et al. 2003] and refraction [Génevaux et al. 2006] have been added as well. A major drawback of most of these approaches is that they work only for static scenes. To address this issue, [Zhou et al. 2005; Ren et al. 2006] started developing techniques for dynamic objects.

There are also some resources that share details about translating the equations into source code. For instance, [Press et al. 1988] lists a C function that computes the associated Legendre polynomials. The often cited “Gritty Details” paper [Green 2003] provides even more insight into the math and the practical algorithms for adding spherical harmonics to a game engine. In addition, some further real-life examples of implementations coming from the games industry were published in [Oat 2004; Chen and Liu 2008; Kaplanyan 2009].

However, since most of the literature mentioned is focused on real-time applications and their requirements, some methods might not be applicable for off-line rendering, where appearance is more important than speed. One of the early adopters was Pixar’s PRMan. In the manual for version 13, an application note16 states that the program uses spherical harmonics to represent distant point cloud clusters for ambient occlusion and color bleeding. A few years later, Christensen explained some of the equations and concepts in [Gross and Pfister 2007]. The next year, Pixar released a technical memo explaining an extended method [Christensen 2008]. Another application was implemented in a PhD thesis about multiple scattering in rendered hair [Moon 2010]. A very recent and probably familiar example was used in the movie “Avatar” [Pantaleoni et al. 2010]: NVIDIA and Weta Digital collaborated to implement PantaRay, a system for precomputing sparse directional occlusion caches on the GPU.

16http://hradec.com/ebooks/CGI/RPS_13.5/prman_technical_rendering/AppNotes/pointbased.html (accessed February 2011)


Chapter 3

Spherical Harmonics

Having explained in the previous chapter how Spherical Harmonics are applied in lighting and how transformations can aid data compression, in this chapter we will provide a brief mathematical insight into Spherical Harmonics, the solutions to the Laplace equation. At first sight, Spherical Harmonics might appear daunting, but a closer examination reveals that their real benefit actually lies in their simplicity.

3.1 Definition

Just as the Fourier basis is an important tool for evaluating convolutions over the unit circle for one-dimensional functions, spherical harmonics can do the same over the unit sphere for two-dimensional functions [Sloan 2008]. Therefore, they are parameterized in spherical coordinates.

\[
\begin{pmatrix} x \\ y \\ z \end{pmatrix}
=
\begin{pmatrix} r \sin\theta \cos\varphi \\ r \sin\theta \sin\varphi \\ r \cos\theta \end{pmatrix}
, \qquad
\begin{pmatrix} r \\ \theta \\ \varphi \end{pmatrix}
=
\begin{pmatrix} \sqrt{x^2 + y^2 + z^2} \\ \arccos(z/r) \\ \arctan(y/x) \end{pmatrix}
\tag{3.1}
\]

where (x, y, z) are the Cartesian coordinates, r is the radius of the sphere (on the unit sphere r = 1), θ is the polar angle (zenith angle) and ϕ is the azimuth angle (along the equator). (The accompanying figure shows a point P on the sphere together with r, θ and ϕ.)

As the goal is to represent a function in Spherical Harmonics, the function needs to be “transformed”, or “projected”, into spherical harmonic space. The transformation from Cartesian into spherical coordinates is one example. In order to understand the concept of the projection into spherical harmonics, one must first understand the meaning of “basis functions”.
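The coordinate conversion itself is a one-liner per component; a small C++ sketch (the struct and function names are our own) shows the Cartesian-to-spherical direction, using atan2 rather than a plain arctan(y/x) so that all quadrants are handled:

```cpp
#include <cmath>

struct Spherical { double r, theta, phi; };

// Convert a Cartesian point to spherical coordinates:
// theta is the polar (zenith) angle, phi the azimuth angle.
Spherical toSpherical(double x, double y, double z) {
    Spherical s;
    s.r     = std::sqrt(x * x + y * y + z * z);
    s.theta = std::acos(z / s.r);   // polar angle in [0, pi]
    s.phi   = std::atan2(y, x);     // azimuth in (-pi, pi], all quadrants
    return s;
}
```

For example, the north pole (0, 0, 1) maps to r = 1, θ = 0, and the point (1, 0, 0) on the equator maps to θ = π/2, ϕ = 0.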

Similar to basis vectors that span a vector space, basis functions combine with other basis functions to form a “function space”. Every function f(x) can be represented in that space by a linear combination of the basis functions B_i(x) of the destination space. This combination is given by

\[
f(x) = \lim_{n \to \infty} \sum_{i=0}^{n} c_i B_i(x) \tag{3.2}
\]

where c_i are the scaling factors for the corresponding basis functions. These “coefficients” are calculated by integrating the basis functions B_i(x) with the function f(x) over f(x)'s domain of definition D [Dempski and Viale 2004].

\[
c_i = \int_D f(x) \, B_i(x) \, dx \tag{3.3}
\]
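Numerically, the projection integral of equation 3.3 can be approximated by a simple Riemann sum. The following C++ sketch (our own helper, fixed to the domain [-1, 1] for illustration) computes one coefficient for arbitrary f and B_i passed as callables:

```cpp
#include <cmath>
#include <functional>

// Midpoint-rule approximation of the projection integral
// c_i = integral over [-1, 1] of f(x) * B_i(x) dx.
double projectCoefficient(const std::function<double(double)>& f,
                          const std::function<double(double)>& basis,
                          int steps = 100000) {
    double dx = 2.0 / steps, sum = 0.0;
    for (int i = 0; i < steps; ++i) {
        double x = -1.0 + (i + 0.5) * dx;   // midpoint of the i-th interval
        sum += f(x) * basis(x) * dx;
    }
    return sum;
}
```

With the orthonormal linear basis B_0(x) = 1/sqrt(2), B_1(x) = sqrt(3/2) x over [-1, 1], projecting f(x) = x yields c_0 = 0 and c_1 = sqrt(2/3), and c_1 B_1(x) reconstructs f exactly, matching the piecewise-linear example above in spirit.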

An illustrated example of the procedure used to obtain these coefficients might look like this: each basis function B_i(x) is multiplied with f(x) and integrated over the domain, yielding the coefficient c_i (the original figure illustrates this for c_1, c_2 and c_3). By continuing this process over all basis functions, we obtain c, the vector of scaling coefficients. Applying this vector, i.e. scaling each basis function by its coefficient, and summing the results, Σ c_i B_i, yields the approximated projection [Green 2003]. This example only uses linear basis functions, returning a piecewise linear approximation.

The basis functions are special insofar as they fulfill a further property: they are orthogonal, i.e. linearly independent, which has the advantage that the influence each one has on the projected function is independent and unique. In the example above, it is easy to see that the approximation, for instance of c_1 B_1(x), cannot be accomplished by using a combination of any other c_i B_i(x), since they are all 0 at that point. Mathematically, the orthogonality is expressed as:

\[
\int_a^b \varphi_i(x) \, \varphi_j(x) \, dx =
\begin{cases}
k & \text{if } i = j \\
0 & \text{if } i \neq j
\end{cases}
\tag{3.4}
\]

where k is just a scalar value. We will discuss the specific basis functions of spherical harmonics in more detail later.

The general definition of spherical harmonics uses complex numbers, as shown in equation 3.5,

\[
Y_l^m(\theta, \varphi) = \sqrt{\frac{(2l+1)}{4\pi} \frac{(l-|m|)!}{(l+|m|)!}} \; P_l^{|m|}(\cos\theta) \, e^{im\varphi},
\qquad l, |m| \in \mathbb{N}_0, \; -l \le m \le l \tag{3.5}
\]

where l and m are both indices (not exponents), whereby l is called the “band”, which is equivalent to the degree (i.e. 0 is constant, 1 is linear, etc.) [Sloan 2008], and m represents the order inside of each band. The P_l^m are the so-called “associated Legendre polynomials”, defined over the range [-1, 1] as:

\[
P_l^m(x) = \frac{(-1)^m}{2^l \, l!} \, (1 - x^2)^{m/2} \, \frac{d^{l+m}}{dx^{l+m}} (x^2 - 1)^l \tag{3.6}
\]

Because this equation consists of derivatives and cancellations between successive terms, which alternate in sign, from a programming standpoint it is a very tedious and inaccurate algorithm. Fortunately, the associated Legendre polynomials also satisfy recurrence relations [Press et al. 1988].

\[
(l - m) \, P_l^m(x) = x \, (2l - 1) \, P_{l-1}^m(x) - (l + m - 1) \, P_{l-2}^m(x) \tag{3.7}
\]

This is the general recursion term, calculating a new band based on the previous two bands.

\[
P_m^m(x) = (-1)^m \, (2m - 1)!! \, (1 - x^2)^{m/2} \tag{3.8}
\]


This forms the best starting point for the calculations, since it is independent of previous values. The notation n!! is the double factorial, multiplying only every second number less than or equal to n (which in the case of n = (2m − 1) is always an odd number).

\[
P_{m+1}^m(x) = x \, (2m + 1) \, P_m^m(x) \tag{3.9}
\]

This equation is the second prerequisite for calculating equation 3.7. It is also a simplification of that equation, obtained by setting l = m + 1 and P_{l-2}^m(x) = 0, making it possible to lift P_m^m to the next higher band.

Between bands, the associated Legendre functions are orthogonal to each other with respect to a constant term1:

\[
\int_{-1}^{1} P_l^m(x) \, P_{l'}^m(x) \, dx = \frac{2}{(2l + 1)} \frac{(l + m)!}{(l - m)!} \, \delta_{ll'} \tag{3.10}
\]

where δ_{ll'} is the Kronecker delta2, returning 1 if l = l' and 0 for l ≠ l'. Figure 3.1 shows a plot of the first six associated Legendre polynomials.
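The recurrences (3.8), (3.9) and (3.7) translate almost directly into code. The C++ routine below is modeled on the well-known function from [Press et al. 1988] and [Green 2003] (the function name P is ours): start at P_m^m, lift to P_{m+1}^m, then iterate the general recursion up to band l.

```cpp
#include <cmath>

// Evaluate the associated Legendre polynomial P_l^m(x), 0 <= m <= l,
// via the recurrences (3.8), (3.9) and (3.7).
double P(int l, int m, double x) {
    // (3.8): P_m^m(x) = (-1)^m (2m-1)!! (1-x^2)^(m/2), the starting point.
    double pmm = 1.0;
    if (m > 0) {
        double somx2 = std::sqrt((1.0 - x) * (1.0 + x));  // sqrt(1 - x^2)
        double fact = 1.0;                                // 1, 3, 5, ... builds (2m-1)!!
        for (int i = 1; i <= m; ++i) {
            pmm *= -fact * somx2;
            fact += 2.0;
        }
    }
    if (l == m) return pmm;
    // (3.9): lift to the next band, P_{m+1}^m(x) = x (2m+1) P_m^m(x).
    double pmmp1 = x * (2.0 * m + 1.0) * pmm;
    if (l == m + 1) return pmmp1;
    // (3.7): general recursion up to band l.
    double pll = 0.0;
    for (int ll = m + 2; ll <= l; ++ll) {
        pll = (x * (2.0 * ll - 1.0) * pmmp1 - (ll + m - 1.0) * pmm) / (ll - m);
        pmm = pmmp1;
        pmmp1 = pll;
    }
    return pll;
}
```

Spot checks against the closed forms: P_0^0(x) = 1, P_1^0(x) = x, P_1^1(x) = -sqrt(1-x^2) and P_2^0(x) = (3x^2 - 1)/2.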


Figure 3.1: The first six associated Legendre polynomials

The spherical harmonics themselves are not just orthogonal, but even orthonormal. This means that, compared with equation 3.4, the product of two basis functions always returns k = 1 if the basis functions are the same, and 0 otherwise. They therefore need to be normalized by applying a factor, which can be obtained by solving the following equation:

\[
\int_S Y_l^m(\omega) \, \overline{Y_{l'}^{m'}}(\omega) \sin\theta \, d\omega = \delta_{ll'} \delta_{mm'} =
\begin{cases}
1 & \text{if } l = l' \wedge m = m' \\
0 & \text{otherwise}
\end{cases}
\tag{3.11}
\]

1http://mathworld.wolfram.com/AssociatedLegendrePolynomial.html (accessed March 2011)
2http://mathworld.wolfram.com/KroneckerDelta.html (accessed March 2011)


where \(\overline{Y_{l'}^{m'}}\) denotes the complex conjugated value. The sin θ weights the values of the function based on their distance to the equator [Schönefeld 2005].

\[
\begin{aligned}
& \int_0^{2\pi} \int_0^{\pi} Y_l^m(\theta, \varphi) \, \overline{Y_{l'}^{m'}}(\theta, \varphi) \sin\theta \, d\theta \, d\varphi \\
&= \int_0^{2\pi} \int_{-1}^{1} Y_l^m(\theta, \varphi) \, \overline{Y_{l'}^{m'}}(\theta, \varphi) \, d(\cos\theta) \, d\varphi \\
&= \int_0^{2\pi} \int_{-1}^{1} K_l^{|m|} P_l^{|m|}(\cos\theta) \, e^{im\varphi} \, K_{l'}^{|m'|} P_{l'}^{|m'|}(\cos\theta) \, e^{-im'\varphi} \, d(\cos\theta) \, d\varphi \\
&= K_l^{|m|} K_{l'}^{|m'|} \int_{-1}^{1} P_l^{|m|}(\cos\theta) \, P_{l'}^{|m'|}(\cos\theta) \, d(\cos\theta) \int_0^{2\pi} e^{im\varphi} e^{-im'\varphi} \, d\varphi
\end{aligned}
\tag{3.12}
\]

The θ-dependent integral can be substituted by equation 3.10, while the ϕ-dependent integral can be solved by applying the Euler formula3 and the integral identities4 of sin and cos:

\[
\begin{aligned}
& \int_0^{2\pi} e^{im\varphi} e^{-im'\varphi} \, d\varphi \\
&= \int_0^{2\pi} \bigl(\cos(m\varphi) + i \sin(m\varphi)\bigr) \bigl(\cos(m'\varphi) - i \sin(m'\varphi)\bigr) \, d\varphi \\
&= \int_0^{2\pi} \cos(m\varphi) \cos(m'\varphi) \, d\varphi + \int_0^{2\pi} \sin(m\varphi) \sin(m'\varphi) \, d\varphi \\
&\quad + i \left[ \int_0^{2\pi} \sin(m\varphi) \cos(m'\varphi) \, d\varphi - \int_0^{2\pi} \cos(m\varphi) \sin(m'\varphi) \, d\varphi \right] \\
&= \pi \delta_{mm'} + \pi \delta_{mm'} + i \, [\, 0 - 0 \,] = 2\pi \delta_{mm'}
\end{aligned}
\tag{3.13}
\]

Then, plugging the terms from equations 3.10 and 3.13 into equation 3.12 yields the following:

\[
K_l^{|m|} K_{l'}^{|m'|} \, \frac{2}{(2l + 1)} \frac{(l + |m|)!}{(l - |m|)!} \, \delta_{ll'} \cdot 2\pi \, \delta_{mm'} = \delta_{ll'} \delta_{mm'}
\quad \Leftrightarrow \quad
K_l^{|m|} = \sqrt{\frac{(2l + 1)}{4\pi} \frac{(l - |m|)!}{(l + |m|)!}}
\tag{3.14}
\]

which obviously is the same factor used in equation 3.5 and is responsible for the normalization.

3http://mathworld.wolfram.com/EulerFormula.html (accessed March 2011)
4http://mathworld.wolfram.com/FourierSeries.html Equations (1)-(5) (accessed March 2011)


Since the functions that are going to be approximated will not be complex-valued functions, it is sufficient to use only real spherical harmonics [Green 2003], which are given by the following equation:

\[
y_l^m(\theta, \varphi) =
\begin{cases}
\sqrt{2} \, \mathrm{Re}(Y_l^m) = \sqrt{2} \, K_l^{|m|} \, P_l^m(\cos\theta) \cos(m\varphi) & \text{if } m > 0 \\[2pt]
Y_l^0 = K_l^0 \, P_l^0(\cos\theta) & \text{if } m = 0 \\[2pt]
\sqrt{2} \, \mathrm{Im}(Y_l^m) = \sqrt{2} \, K_l^{|m|} \, P_l^{|m|}(\cos\theta) \sin(|m|\varphi) & \text{if } m < 0
\end{cases}
\tag{3.15}
\]

This is the final notation most often used in CG. It also has the advantage of subdividing the equation into smaller problems that are easier to translate into source code, as will be shown in chapter 4.
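As a sketch of that subdivision, equation 3.15 can be evaluated by combining a routine for the associated Legendre polynomials (recurrences 3.7-3.9), the normalization constant K of equation 3.14, and the trigonometric factor. The C++ below follows the structure popularized by [Green 2003]; the function names are our own.

```cpp
#include <cmath>

// Associated Legendre polynomial P_l^m(x) via recurrences (3.8), (3.9), (3.7).
static double assocLegendre(int l, int m, double x) {
    double pmm = 1.0, fact = 1.0;
    double somx2 = std::sqrt((1.0 - x) * (1.0 + x));      // sqrt(1 - x^2)
    for (int i = 1; i <= m; ++i) { pmm *= -fact * somx2; fact += 2.0; }
    if (l == m) return pmm;
    double pmmp1 = x * (2.0 * m + 1.0) * pmm;             // lift to band m+1
    for (int ll = m + 2; ll <= l; ++ll) {                 // general recursion
        double pll = (x * (2.0 * ll - 1.0) * pmmp1 - (ll + m - 1.0) * pmm) / (ll - m);
        pmm = pmmp1;
        pmmp1 = pll;
    }
    return pmmp1;
}

// Normalization constant K_l^m of equation (3.14), for m >= 0.
static double K(int l, int m) {
    double num = 2.0 * l + 1.0, den = 4.0 * std::acos(-1.0);   // (2l+1) / 4pi
    for (int i = l - m + 1; i <= l + m; ++i) num /= i;         // times (l-m)!/(l+m)!
    return std::sqrt(num / den);
}

// Real spherical harmonic y_l^m(theta, phi), equation (3.15).
double SH(int l, int m, double theta, double phi) {
    const double sqrt2 = std::sqrt(2.0);
    if (m == 0) return K(l, 0) * assocLegendre(l, 0, std::cos(theta));
    if (m > 0)  return sqrt2 * K(l, m)
                       * assocLegendre(l, m, std::cos(theta)) * std::cos(m * phi);
    return sqrt2 * K(l, -m)
           * assocLegendre(l, -m, std::cos(theta)) * std::sin(-m * phi);
}
```

A quick sanity check: y_0^0 is the constant 1/(2 sqrt(pi)) ≈ 0.2821 everywhere, and y_1^0(θ, ϕ) = sqrt(3/(4π)) cos θ.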

A table showing the first 16 spherical harmonics, and a plot showing a unit sphere that is distorted by scaling each point radially using the absolute value of the function, can be found in Appendix B.

3.2 Properties

Orthonormality

Thanks to their properties, spherical harmonics can serve as a useful toolkit when applied as basis functions. The most important property, one that has a lot of advantages, is that they are orthonormal, as shown in section 3.1:

\[
\int_S Y_l^m(\omega) \, Y_{l'}^{m'}(\omega) \, d\omega = \delta_{ll'} \delta_{mm'} =
\begin{cases}
1 & \text{if } l = l' \wedge m = m' \\
0 & \text{otherwise}
\end{cases}
\tag{3.16}
\]

where \(\int_S\) denotes an integral over the whole unit sphere and ω is a spherical coordinate on the unit sphere.

The projection of a function f(ω) into spherical harmonics is analogous to the example in equation 3.3 and can be achieved by simply replacing the arbitrary basis functions with the real spherical harmonic basis functions.

\[
c_l^m = \int_S f(\omega) \, y_l^m(\omega) \, d\omega \tag{3.17}
\]

The reconstruction of the approximated function is just as easily derived by:

    f̃(ω) = Σ_{l=0}^{n−1} Σ_{m=−l}^{l} c_l^m y_l^m(ω) = Σ_{i=0}^{n²−1} c_i y_i(ω),   i = l(l+1) + m    (3.18)


Fast Integration

One of the most important properties is derived directly from the orthonormality. To light a scene, a function describing the incoming light will be multiplied by a surface reflectance function (or transfer function). This needs to be done over the entire sphere of incoming illumination. Therefore this process integrates:

    ∫_S L(ω) t(ω) dω                                                            (3.19)

where L is the incoming light and t is the transfer function. Instead of using the original functions and performing a time-consuming integral calculation, an elegant workaround is to use the SH-approximated functions:

    ∫_S L̃(ω) t̃(ω) dω = ∫_S [ Σ_{l=0}^{n−1} Σ_{m=−l}^{l} c_l^m y_l^m(ω) ] [ Σ_{l'=0}^{n−1} Σ_{m'=−l'}^{l'} d_{l'}^{m'} y_{l'}^{m'}(ω) ] dω

                      = Σ_{l,m} Σ_{l',m'} c_l^m d_{l'}^{m'} ∫_S y_l^m(ω) y_{l'}^{m'}(ω) dω

                      = Σ_{l=0}^{n−1} Σ_{m=−l}^{l} c_l^m d_l^m                  (3.20)

in which a complex integration of two functions over the unit sphere is broken down into a simple sequence of multiply-adds. [Green 2003]
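In code, this reduction means that once both functions are projected into coefficient vectors, the lighting integral collapses to a plain dot product. A minimal sketch (the function name is ours; the coefficient vectors are assumed to come from equation 3.17):

```cpp
#include <cstddef>
#include <vector>

// Approximates the integral of L(w)*t(w) over the sphere (equation 3.20)
// from the SH coefficient vectors c (light) and d (transfer): by
// orthonormality, the integral is just a multiply-add loop.
double shIntegrate(const std::vector<double>& c, const std::vector<double>& d) {
    double sum = 0.0;
    std::size_t n = c.size() < d.size() ? c.size() : d.size();
    for (std::size_t i = 0; i < n; ++i)
        sum += c[i] * d[i];
    return sum;
}
```

This is the reason SH lighting is cheap at render time: all the expensive spherical integration happens once, during projection.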

Rotational invariance

If a function g is a copy of f arbitrarily rotated by a rotation R over the unit sphere,

    g(ω) = f(R(ω))                                                              (3.21)

then after SH projection it is still true that

    g̃(ω) = f̃(R(ω))                                                             (3.22)

This means that there is no difference between projecting the rotated function g and rotating the projected function f. This has the advantage that in animated scenes, no aliasing artifacts or light amplitude fluctuations will occur.


Chapter 4

Implementation

The preceding chapters described all the theoretical aspects involved in the calculation of spherical harmonics and the general principles of CG. This chapter examines the practical aspects of how PRMan was extended to include the capability to compute spherical harmonic lighting. To do so, this chapter will first evaluate different approaches for the required subroutines in a test application written in C++. Based on the results, the relevant code is then translated into RenderMan shaders.

4.1 The C++ Tool

This part deals with the implementation of a "test bench" in C++ and OpenGL in order to compare different approaches for the smaller subroutines and to decide which implementation is most suitable.

4.1.1 RenderMan Point Clouds

This is the format chosen for the in- and output, because it comes with some features that are useful for fast integration. As the name indicates, a point cloud is simply a cloud of points in 3D space, capable of storing an arbitrary number of channels for each point (e.g. color, area, shadows, etc.). The point clouds generated by RenderMan renderers take advantage of the Reyes architecture. For each micropolygon, a point in the point cloud is created that contains the position in space, the surface normal at that point, a radius for a gapless representation and any user-defined data. Figure 4.1 shows an example of a point cloud with a few data channels. [Pixar 2010]

Since the created point cloud file is based on the Reyes algorithm - which discards everything outside the viewing frustum - it is often advisable to use a separate camera


Figure 4.1: A point cloud file containing some user-defined channels. Image courtesy of Pixar Animation Studios

for determining the points that are going to be stored when baking, rather than using the final render view. This way, the number of points written into the point cloud can be independently defined. Most often, this is done to define a larger field of view, for example to include objects that are not in the final image but still have an influence on the current view, or to store objects that become visible after just a few frames, when the render camera moves. (An illustration is shown in figure 4.2.) But sometimes this separation is also used to shrink the baked data set, as will be shown in the later light shader.

Bake Camera

Render Camera

Figure 4.2: Baking a point cloud from a different camera to cover a larger set of objects. The Bunny, Dragon and Buddha models courtesy of Stanford University1


To manipulate point cloud files, PRMan provides the Point Cloud File API, which provides the functionality to read and write point clouds [Pixar 2010]. All API function prototypes are defined in the header file pointcloud.h2, which simply needs to be included in the project source. To compile the source code, the library file libprman.so (or on Windows, libprman.dll) needs to be linked.

The whole reading and writing process is very simple, as will be briefly shown here, because all functions are already defined. The manipulations performed between these two steps are the tricky part.

Inside pointcloud.h, a point cloud is simply defined as a void pointer:

typedef void * PtcPointCloud;

As with any other file handling in program code, a point cloud [ptc] file first needs to be opened:

PtcPointCloud m_cloud = PtcSafeOpenPointCloudFile(filename);  // filename: char *

Now we can access the data channels from the ptc file. But before reading the point data, we need to retrieve some basic information about the point cloud file:

float world2eye[16], world2ndc[16], format[3];
char **m_vartypes;
char **m_varnames;
int datasize, m_nvars, npoints;

PtcGetPointCloudInfo(m_cloud, "world2eye", world2eye);
PtcGetPointCloudInfo(m_cloud, "world2ndc", world2ndc);
PtcGetPointCloudInfo(m_cloud, "format",    format);
PtcGetPointCloudInfo(m_cloud, "nvars",     &m_nvars);
PtcGetPointCloudInfo(m_cloud, "vartypes",  &m_vartypes);
PtcGetPointCloudInfo(m_cloud, "varnames",  &m_varnames);
PtcGetPointCloudInfo(m_cloud, "datasize",  &datasize);
PtcGetPointCloudInfo(m_cloud, "npoints",   &npoints);

where

    world2eye   world-to-eye transformation matrix
    world2ndc   world-to-NDC [normalized device coordinates] transformation matrix
    format      image x resolution, y resolution, and pixel aspect ratio
    nvars       number of variables in points
    vartypes    types of the variables in points (an array of strings)
    varnames    names of the variables in points (an array of strings)
    datasize    number of floats in each data point
    npoints     number of points in the point cloud file

1 Stanford University Computer Graphics Laboratory: The Stanford 3D scanning repository http://graphics.stanford.edu/data/3Dscanrep/
2 ships with RenderMan Pro Server

These are just a few of the types of data that can be queried, but they are also the basic data that must be provided later when creating a new point cloud.

Next, the points and the stored channels can be read. This is usually done in a loop that repeats the operations point by point.

float m_point[3], m_normal[3], radius;
float *m_data = (float *)malloc(datasize * sizeof(float));

for (int i = 0; i < npoints; ++i) {
    PtcReadDataPoint(m_cloud, m_point, m_normal, &radius, m_data);
    // copy the data into buffers for use in other calculations
}

where m_point receives the location, m_normal the orientation, radius the radius and m_data all the other data channels specified in the point cloud. Because m_data is a float array, it is important to know nvars and vartypes beforehand in order to access the right data with the correct index. For instance, if only 16 different floats (SH coefficients) are stored, they can be accessed fairly easily. However, a color lying between these values would occupy three consecutive slots. From this, it follows that datasize ≥ nvars.
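A small helper can precompute where each channel starts inside the flat m_data array. The following sketch is ours, not part of the Point Cloud File API, and assumes the usual point-cloud channel type names ("float", "color", "point", "vector", "normal", "matrix"):

```cpp
#include <string>
#include <vector>

// Hypothetical helper: number of floats occupied by one channel of the
// given point-cloud type name (assumed names; unknown types fall back to 1).
int typeSize(const std::string &type) {
    if (type == "float") return 1;
    if (type == "color" || type == "point" ||
        type == "vector" || type == "normal") return 3;
    if (type == "matrix") return 16;
    return 1;
}

// Computes, for each of the nvars channels, the float offset at which it
// starts in m_data, so channel i can be read at m_data[offsets[i]].
std::vector<int> channelOffsets(const std::vector<std::string> &vartypes) {
    std::vector<int> offsets;
    int offset = 0;
    for (const std::string &t : vartypes) {
        offsets.push_back(offset);
        offset += typeSize(t);
    }
    return offsets;
}
```

For a point with channels {"float", "color", "float"} this yields the offsets {0, 1, 4}, illustrating why datasize can exceed nvars.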

Finally, the input ptc file can be closed by:

PtcClosePointCloudFile(m_cloud);

After the desired calculations have been completed, the writing process follows the same logic, except in reverse. First, the new file needs to be created with the aforementioned information:

PtcPointCloud outptc = PtcCreatePointCloudFile((char *)filename,
    m_nvars, m_vartypes, m_varnames, world2eye, world2ndc, format);

The next step is to populate the file with data:

for (int p = 0; p < npoints; ++p) {
    if (!PtcWriteDataPoint(outptc, point, normal, radius, new_data))
        fprintf(stderr, "Unable to write into output file '%s'\n", filename);
}

To make sure all data is written into the file before it is closed or the program exits, the API provides the following function:

PtcFinishPointCloudFile(outptc);

4.1.2 Raytracing Introduction

The complete field of raytracing [Whitted 1980] encompasses many complex procedures which, taken together, make it possible to render photorealistic images. Covering every aspect literally fills entire books [Pharr and Humphreys 2004; Suffern 2007; Akenine-Möller et al. 2008]. To implement this tool, only selected elements were required; these are described in the following sections.

As the name implies, the general process of raytracing generates an image by tracing the paths of light rays. This process defines an image plane with a grid structure that represents the pixels in the image. From each grid cell, a ray is cast into the scene. If the ray intersects the surface of an object, then based on that surface's shader, calculations are triggered which return a color value for the pixel. The computation of the point of intersection is usually the most time-consuming process in raytracing [Whitted 1980].

For spherical harmonic lighting, the starting point is not the image plane, but the surface points in the scene. From each point, new rays are cast and traced in a hemispherical distribution, looking for intersections to construct the visibility function (see chapter 2.1.3.2).

4.1.2.1 Sampling

In signal processing terms, the process of casting rays into the scene to obtain a discrete representation of the continuous environment is also referred to as "sampling". To save computation time, there should be as few of these rays, or "samples", as possible, but as many as required to obtain a well-approximated visibility function. This places a crucial requirement on the sample distribution. To avoid overrating or missing features, the samples should neither cluster nor contain excessively wide gaps. But too uniformly distributed samples can cause artifacts as well and may require more samples to compensate. In this step of the procedure, three distributions that were considered suitable were compared.


After a moment's consideration, it is obvious that the two simplest distributions - a pure random and a uniform grid distribution - do not meet the aforementioned requirements and must therefore be excluded. Figure 4.3 gives a planar illustration of the problems that occur with these two distributions.

(a) uniform grid (b) random

Figure 4.3: Artifacts or errors may occur with a too uniform distribution (a) or with a pure random distribution (b) of samples. Both 4×4 samples

The first tested distribution overcomes the issues encountered by each of the previously mentioned methods by simply combining the two. Instead of centering the samples in each grid cell, the rays are randomly shifted, but only inside each cell. This is called "stratified" or "jittered" sampling. Figure 4.4 (a) illustrates the planar distribution.

Nevertheless, it is important to keep in mind that, since the distribution should be mapped onto a sphere, simply jittering θ and φ in the same manner as before would cause the upper and lower rows to fill less space with the same number of samples and form clusters at the poles (see figure 4.4 (b)). Therefore, while the transformation is more involved than simply applying equation 3.1, in reality it is not much more difficult3:

    θ = arccos(2x − 1)
    φ = 2πy                                                                     (4.1)

(a) planar jittering pattern (b) planar jittering on sphere (c) optimized jittering on sphere

Figure 4.4: Jittered sampling with (a): 4×4 samples; (b),(c): 32×32 samples

3 http://mathworld.wolfram.com/SpherePointPicking.html (accessed March 2011)
4 http://mathworld.wolfram.com/QuasirandomSequence.html (accessed March 2011)

The other two algorithms are both so-called "quasi-random sequences"4. This means that, instead of generating samples with all directions having the same probability (which can result in samples with the same direction), the quasi-random samples correlate with the directions of all previous samples with a low discrepancy. The first algorithm we will discuss is called the "Hammersley point set"5, which is the basis of the second algorithm, referred to as the "Halton point set".

Both of these approaches are based on the theorem that every natural number k can be expanded using a prime base p:

    k = Σ_{i=0}^{r} a_i p^i                                                     (4.2)

where each a_i is a natural number in [0, p−1]. Now define a function Φ_p(k) by:

    Φ_p(k) = Σ_{i=0}^{r} a_i / p^{i+1}                                          (4.3)

If d is the dimension of the sampled space, any sequence p_1, p_2, ..., p_{d−1} of prime numbers defines a sequence Φ_{p1}, Φ_{p2}, ..., Φ_{p(d−1)} of functions whose corresponding k-th d-dimensional Hammersley point is:

    (k/n, Φ_{p1}(k), Φ_{p2}(k), ..., Φ_{p(d−1)}(k))   for k = 0, 1, 2, ..., n−1   (4.4)

where p_1 < p_2 < ... < p_{d−1} and n is the total number of Hammersley points. For the surface of a sphere, d = 2 and therefore equation 4.4 simplifies to:

    (k/n, Φ_{p1}(k))   for k = 0, 1, 2, ..., n−1                                 (4.5)

5 http://mathworld.wolfram.com/HammersleyPointSet.html (accessed March 2011)


This yields the points for the Hammersley distribution. Mapping this planar 2D distribution onto the sphere requires two steps:

    (k/n, Φ_{p1}(k)) ↦ (φ, z) ↦ (√(1−z²) cos φ, √(1−z²) sin φ, z)               (4.6)

where the first mapping is a linear scaling to the required cylindrical domain, (φ, z) ∈ [0, 2π) × [−1, 1]. The second mapping is a z-preserving radial projection from the unit cylinder to the unit sphere.
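A compact sketch of the radical inverse Φ_p and the resulting spherical Hammersley points follows (function names are ours; compare the C code in [Wong et al. 1997b]):

```cpp
#include <cmath>
#include <vector>

#ifndef M_PI
#define M_PI 3.14159265358979323846
#endif

struct Vec3 { double x, y, z; };

// Radical inverse Phi_p(k): the base-p digits of k mirrored about the
// decimal point (equation 4.3).
double radicalInverse(int k, int p) {
    double phi = 0.0, invBase = 1.0 / p, f = invBase;
    for (; k > 0; k /= p, f *= invBase)
        phi += (k % p) * f;
    return phi;
}

// Hammersley points on the sphere: (k/n, Phi_2(k)) is scaled to the
// cylinder (phi, z) in [0, 2*pi) x [-1, 1] and then projected radially
// onto the unit sphere (equation 4.6).
std::vector<Vec3> hammersleySphere(int n) {
    std::vector<Vec3> pts;
    pts.reserve(n);
    for (int k = 0; k < n; ++k) {
        double phi = 2.0 * M_PI * (double(k) / n);
        double z   = 2.0 * radicalInverse(k, 2) - 1.0;
        double r   = std::sqrt(1.0 - z * z);
        pts.push_back({r * std::cos(phi), r * std::sin(phi), z});
    }
    return pts;
}
```

Replacing the k/n coordinate with a second radical inverse, radicalInverse(k, p2) with p2 ≠ 2, turns this into the Halton point set of equation 4.7.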

Obviously, the coordinate k/n will always change for different n, resulting in different distributions. The Halton distribution, on the other hand, differs slightly in the way its points are calculated:

    (Φ_{p1}(k), Φ_{p2}(k))   for k = 0, 1, 2, ..., n−1                           (4.7)

where p_1 ≠ p_2. This generates a distribution independent of n, where for all n_0 ≤ n the first n_0 samples are exactly the same, and all following samples, n_0+1, ..., n, will be placed in the remaining gaps. This is called a "hierarchical" algorithm. The transformation into spherical coordinates follows the same steps as before:

    (Φ_{p1}(k), Φ_{p2}(k)) ↦ (φ, z) ↦ (√(1−z²) cos φ, √(1−z²) sin φ, z)         (4.8)

These equations and the corresponding C code can be found in [Wong et al. 1997b]. Figure 4.5 shows the sampling patterns of the two distributions.

(a) Hammersley (b) Halton

Figure 4.5: Sampling patterns of the (a) Hammersley and (b) Halton distributions

Table 4.1 shows the computation time for generating different numbers of samples with the described distributions.


                       1 Point                            5000 Points
                   32×32     64×64     128×128        32×32         64×64          128×128
Jittered sampling  0.568ms   2.132ms   8.462ms        2772.025ms    10842.973ms    42921.147ms
Hammersley         0.387ms   1.529ms   6.389ms        1923.840ms    7840.377ms     32136.357ms
Halton, p2 = 7     0.511ms   2.101ms   8.824ms        2585.132ms    10631.291ms    44203.951ms

Table 4.1: Comparison of the speed of different sampling distributions, measured in milliseconds, for different numbers of samples

In terms of the measurements and the properties, the Hammersley distribution brings the most advantages. Even though it is not a hierarchical algorithm, the speed advantage - which is "small" here - makes a big difference in real production scenes, without any loss in quality. For instance, in a scene of 500,000 points and a shot length of 180 frames, with 32×32 samples, we save about 3 hours6. And 500,000 points is rather small for production scenes. Furthermore, it is unlikely that the number of samples for such a precomputation will differ from frame to frame. Moreover, Hammersley still has the advantage that the quasi-random distribution stays constant for the same number of samples - as opposed to the random() call in the stratified sampling - and therefore Hammersley minimizes flickering in animations.

4.1.3 Spherical Harmonics

Since the mathematical equations of spherical harmonics have already been explained in chapter 3, the corresponding code is a pushover.

As in equation 3.15, we handle each of the three cases for m in y_l^m independently. Then, as mentioned above, each of the variables can be outsourced into separate functions.

Listing 4.1: C++ Spherical Harmonics

double SphericalHarmonic(int l, int m, double theta, double phi)
{
    double y;
    const double sqrt2 = sqrt(2.0);

    if (m > 0)
        y = sqrt2 * K(l,m) * cos(m*phi) * Legendre(l,m,cos(theta));
    else if (m == 0)
        y = K(l,0) * Legendre(l,0,cos(theta));
    else
        y = sqrt2 * K(l,-m) * sin(-m*phi) * Legendre(l,-m,cos(theta));

    return y;
}

6 difference between Hammersley and Halton ≈ 0.000124 sec · 500,000 · 180 = 11,160 sec ≈ 3.1 h


A comparison with equation 3.15 reveals that nothing fancy is used here. The only trick is that we took advantage of the fact that all cases for m < 0 are already filtered by the if-statement; therefore, we can use −m instead of the more tedious abs() function.

Next, the normalization factor K_l^m from equation 3.14 is a simple one-line calculation.

Listing 4.2: C++ SH normalization factor (K)

double K(int l, int m)
{
    double k = sqrt(((2.0*l + 1.0) * factorial(l-m)) /
                    (4.0*PI * factorial(l+m)));
    return k;
}

It is known that the factorial is a recursive and tedious function; therefore, we use a lookup table for the most commonly expected values before falling back to the iterative computation.

Listing 4.3: C++ factorial with look up table

double factorial(int x)
{
    const int nPrecomputed = 13;
    static double precomputedFactorials[nPrecomputed] = {
        /*  0 */ 1.0,
        /*  1 */ 1.0,
        /*  2 */ 2.0,
        /*  3 */ 6.0,
        /*  4 */ 24.0,
        /*  5 */ 120.0,
        /*  6 */ 720.0,
        /*  7 */ 5040.0,
        /*  8 */ 40320.0,
        /*  9 */ 362880.0,
        /* 10 */ 3628800.0,
        /* 11 */ 39916800.0,
        /* 12 */ 479001600.0
    };

    if (x < nPrecomputed)
        return precomputedFactorials[x];

    double result = precomputedFactorials[nPrecomputed-1];
    for (int i = nPrecomputed; i <= x; ++i)
        result *= i;
    return result;
}

This only required a mid-range calculator, but can save a lot of time at runtime.


Last but not least are the associated Legendre polynomials. These are a bit trickier, but consulting equations 3.7 - 3.9 is already half the battle.

Listing 4.4: C++ Associated Legendre Polynomials

double Legendre(int l, int m, double x)
{
    double pmm = 1.0;

    if (m > 0)
    {
        double somx2 = sqrt(1.0 - x*x);
        double fact = 1.0;
        for (int i = 1; i <= m; ++i)
        {
            pmm *= (-fact) * somx2;
            fact += 2.0;
        }
    }
    if (l == m)
        return pmm;
    ...

As mentioned in section 3.1, the best starting point for the recursion is P_m^m. A closer look at equation 3.8 reveals that each of the multiplication factors consists of m multiplications. Therefore, the equation can easily be implemented altogether in a single for-loop.

Listing 4.5: C++ Associated Legendre Polynomials

    ...
    double pmp1 = x * (2.0*m + 1.0) * pmm;
    if (l == m+1)
        return pmp1;

    double plm = 0.0;
    for (int ll = m+2; ll <= l; ++ll)
    {
        plm = ((2.0*ll - 1.0) * x * pmp1 - (ll + m - 1.0) * pmm) / (ll - m);
        pmm = pmp1;
        pmp1 = plm;
    }
    return plm;
}

Converting P_{m+1}^m is apparently very straightforward. The first computation of P_l^m is for l = m+2 and can easily be accomplished by using the previously calculated P_m^m and P_{m+1}^m. Then we can overwrite pmm and pmp1 to continue with the next step of the recursion.

Most of these code pieces were extracted from [Green 2003].
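As a sanity check, the listings above can be combined into a self-contained program and tested against the orthonormality relation of equation 3.16, using the same uniform sphere mapping (equation 4.1) and 4π/N weighting as the later bake shader. The harness below is ours; factorial is shown without the lookup table for brevity:

```cpp
#include <cmath>

#ifndef M_PI
#define M_PI 3.14159265358979323846
#endif

double factorial(int x) {
    double r = 1.0;
    for (int i = 2; i <= x; ++i) r *= i;
    return r;
}

// Normalization factor of equation 3.14 (Listing 4.2).
double K(int l, int m) {
    return sqrt(((2.0*l + 1.0) * factorial(l - m)) /
                (4.0 * M_PI * factorial(l + m)));
}

// Associated Legendre polynomial P_l^m(x) (Listings 4.4 and 4.5).
double Legendre(int l, int m, double x) {
    double pmm = 1.0;
    if (m > 0) {
        double somx2 = sqrt(1.0 - x*x), fact = 1.0;
        for (int i = 1; i <= m; ++i) { pmm *= (-fact) * somx2; fact += 2.0; }
    }
    if (l == m) return pmm;
    double pmp1 = x * (2.0*m + 1.0) * pmm;
    if (l == m + 1) return pmp1;
    double plm = 0.0;
    for (int ll = m + 2; ll <= l; ++ll) {
        plm = ((2.0*ll - 1.0) * x * pmp1 - (ll + m - 1.0) * pmm) / (ll - m);
        pmm = pmp1; pmp1 = plm;
    }
    return plm;
}

// Real spherical harmonic y_l^m of equation 3.15 (Listing 4.1).
double SphericalHarmonic(int l, int m, double theta, double phi) {
    const double sqrt2 = sqrt(2.0);
    if (m > 0)  return sqrt2 * K(l, m) * cos(m*phi) * Legendre(l, m, cos(theta));
    if (m == 0) return K(l, 0) * Legendre(l, 0, cos(theta));
    return sqrt2 * K(l, -m) * sin(-m*phi) * Legendre(l, -m, cos(theta));
}

// Numerically estimates the integral of y_{l1}^{m1} * y_{l2}^{m2} over the
// sphere on an n-by-n uniform grid (equation 4.1 mapping, 4*pi/N weights),
// which by equation 3.16 should be 1 for equal indices and 0 otherwise.
double integrateYY(int l1, int m1, int l2, int m2, int n) {
    double sum = 0.0;
    for (int a = 0; a < n; ++a)
        for (int b = 0; b < n; ++b) {
            double theta = acos(2.0 * (a + 0.5) / n - 1.0);
            double phi   = 2.0 * M_PI * (b + 0.5) / n;
            sum += SphericalHarmonic(l1, m1, theta, phi)
                 * SphericalHarmonic(l2, m2, theta, phi);
        }
    return sum * 4.0 * M_PI / (n * n);
}
```

With n = 64, integrateYY(1, 0, 1, 0, 64) comes out close to 1 and integrateYY(1, 0, 2, 0, 64) close to 0, mirroring equation 3.16.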

4.2 RenderMan Shader

This section covers the implementation of the RenderMan shaders that will undertake the dirty work of computing the spherical harmonics coefficients and the lighting for a rendered image in RenderMan. As in section 4.1, we first discuss a file format that is simply an addition to the point cloud described above.

4.2.1 RenderMan Brick Maps

Brick maps7 are useful 3D textures with multiple resolutions. They are generated from one or more point clouds using the application "brickmake", and they store the same data as the input files.

brickmake [options] pointclouds.ptc outputbrickmap.bkm

A brick map is an adaptive, sparse octree with a brick at each octree node. Each of these bricks is a three-dimensional version of a small texture tile, with a resolution of 8×8×8 voxels, capable of storing an arbitrary number of channels for each voxel. The brick map format can be seen as a 3D generalization of Pixar's 2D MIP map8 texture format [Cortes and Raghavachary 2008] for regular 2D textures, containing increasingly refined levels of detail. Figure 4.6 shows an example of the multiple detail levels for surface data (here subsurface scattering color). [Pixar 2010]

The brick map format has several advantages [Pixar 2010]:

• The brick map is independent of the surface representation.

• Specifying a 2D parameterization for surfaces such as subdivision surfaces, implicit surfaces, and dense polygon meshes is not necessary.

• Brick maps automatically adapt to the data density and variation. If, for example, a fairly smooth 3D texture has only one small region with a lot of detail, there will only be many bricks in that one small region. (This is in contrast to traditional 2D textures, where the entire texture has to have high resolution if just a small part of it has a lot of detail.)

7 PRMan's brick map format is a tiled version of the adaptive octree formats used by [DeBry et al. 2002; Benson and Davis 2002]
8 MIP: acronym for Latin "multum in parvo" = much in a small space
9 Stanford University Computer Graphics Laboratory: The Stanford 3D scanning repository http://graphics.stanford.edu/data/3Dscanrep/

Figure 4.6: A brick map file containing a user-defined data channel (i.e. subsurface scattering color). Bunny model courtesy of Stanford University9

• The MIP map representation is suitable for efficient filtering, and the tiling makes it ideal for caching. This means that PRMan can deal efficiently with large brick maps - even collections of brick maps much larger than the available memory.

• The user can specify the required accuracy when the brick map is created. This makes it simple to trade off data precision vs. file size.

Drawbacks of reading point clouds

• Even if only a few points are used, the entire point cloud file is read in.

• Until a frame has finished, the entire point cloud stays in memory.

• The point cloud format does not provide a level-of-detail representation, making filtering and blurring difficult.

4.2.2 Introduction To Shaders

4.2.3 The Code

To accomplish the spherical harmonics calculation in PRMan, a three-stage process is created. First, we generate a point cloud with the SH coefficients for each surface point (section 4.2.3.1). Then, from the light source's point of view, another point cloud is generated with the actual illumination from this light (section 4.2.3.2), which in the third step will be used by the light as a 3D shadow map (section 4.2.3.3).


4.2.3.1 The Bake Shader

Listing 4.6: shBake.sl

plugin "/CINE/_GlobalScripts/RenderMan/GlobalRendermanShaders/shadeop_sh";
...

In the first line, a custom DSO shadeop is loaded. This takes over the actual calculation of the spherical harmonics and is faster than doing the computation completely inside the shader. The code for the shadeop can be found in Appendix C.4.

...

surface shBake(string Filename = "")
{
...

This is a shader that will be attached to surfaces; as input it requires only a name for a point cloud in which to store the final results.

...

normal Nn = normalize(N);
normal nworld = normalize(ntransform("current", "world", N));
point pworld = transform("current", "world", P);
vector v = normalize(-I);
...

Usually, one of the first steps in a shader is to transform and/or normalize surface points, normals and vectors for use in calculations. Here, they are transformed from the default space, "current"10, into world space.

...

uniform float Samples = 64;
vector dir_sample = 0;

...

This specifies the number of samples in one dimension11 and creates a vector that will temporarily store the direction of the samples.

...

float Theta[Samples*Samples];
float Phi[Samples*Samples];
...

These arrays will store the spherical coordinates of the samples that did not intersect another surface.

10 In PRMan, the "current" space is identical to camera space
11 The total number of samples is then Samples*Samples


...
uniform float Lbands = 3;
uniform float n_coeffs = (Lbands+1)*(Lbands+1);
float coeffs[n_coeffs];

...

This specifies the number of coefficients by setting the number of bands and creates an array that will hold all the values.

...

float fSample[Samples*Samples];
float i = 0, j = 0, hits = 0, l = 0, m = 0;

...

These are some supporting variables. fSample will be used as a Boolean array that records whether or not a sample has hit a surface. i, j are used as indices, hits counts the samples that intersect geometry, and l, m are the band and degree for the spherical harmonics.

...

gather("illuminance", P, Nn, PI, Samples*Samples,
       "distribution", "uniform",
       "bias", 0.1,
       "ray:direction", dir_sample)
{
    // hit something
    fSample[i] = 0;
    hits += 1;
    i += 1;
}
else
{
    // hit nothing
    dir_sample = normalize(dir_sample);
    Theta[i] = acos(dir_sample[2]);
    Phi[i] = atan(dir_sample[1], dir_sample[0]);
    fSample[i] = 1;
    i += 1;
}
...

With the gather() call, information about the surroundings is collected via ray tracing. It is set up to run automatically over a specified number of samples and to execute different statements depending on whether an intersection with another surface occurs. The "illuminance" parameter states that it will collect direct illumination. The samples are distributed from the starting point P, with a maximum angle of PI rad12 to the direction of the surface normal Nn. The total number of samples is Samples*Samples. As the "distribution" pattern for the samples, "uniform" is used here. Furthermore, the small "bias" of 0.1 prevents self-intersection. Since only the samples that miss are of interest, the corresponding element of the array fSample[] will be 1; otherwise it is 0. Moreover, only these samples need to be transformed into spherical coordinates.

12 a maximum of π rad to the surface normal means 360° around the point P

...

uniform float n_index = 0;
float factor = 4*PI/(Samples*Samples);

for (l = 0; l <= Lbands; l += 1)
{
    for (m = -l; m <= l; m += 1)
    {
        n_index = l*(l+1) + m;
        coeffs[n_index] = 0;
        for (i = 0; i < Samples*Samples; i += 1)
        {
            if (fSample[i])
                coeffs[n_index] += shadeop_SH(l, m, Theta[i], Phi[i]);
        }
        coeffs[n_index] *= factor;
    }
}

...

This part is responsible for the actual calculation of the SH coefficients. Therefore it adds up all the SH-projected representations of each sample. The projection is done by the same SH function as in Listing 4.1.

...

if (Filename != "")
{
    bake3d(Filename,
           "coeff_00,coeff_01,coeff_02,coeff_03,coeff_04,coeff_05,coeff_06,coeff_07,coeff_08,coeff_09,coeff_10,coeff_11,coeff_12,coeff_13,coeff_14,coeff_15",
           P, Nn,
           "coeff_00", coeffs[0],
           "coeff_01", coeffs[1],
           "coeff_02", coeffs[2],
           "coeff_03", coeffs[3],
           "coeff_04", coeffs[4],
           "coeff_05", coeffs[5],
           "coeff_06", coeffs[6],
           "coeff_07", coeffs[7],
           "coeff_08", coeffs[8],
           "coeff_09", coeffs[9],
           "coeff_10", coeffs[10],
           "coeff_11", coeffs[11],
           "coeff_12", coeffs[12],
           "coeff_13", coeffs[13],
           "coeff_14", coeffs[14],
           "coeff_15", coeffs[15],
           "interpolate", 1);
}

Ci = 1 - hits/(Samples*Samples);

In the final step, if an output file name is provided, a new point cloud is created by the built-in function bake3d(), with the 16 corresponding coefficients baked into each point. Only the channels listed in the comma-separated string of custom channel names, right after the file name, will be written out. Each channel type must be specified in the RIB file with the DisplayChannel statement13. Each channel is a pair consisting of ("representation name", value). When "interpolate" is turned on, the micropolygon midpoints are baked out by interpolating from the four corners, which are usually baked. The advantage of this is that it eliminates the double shading points along the edges of the shading grids. For the output color Ci, ambient occlusion is arbitrarily chosen just to make this pass more attractive while computing. An example of a point cloud containing the baked coefficients can be seen in Appendix D.1.

4.2.3.2 The Relight Shader

Listing 4.7: shRelight.sl

plugin "/CINE/_GlobalScripts/RenderMan/GlobalRendermanShaders/shadeop_sh";

float getSHOcclusion(string Filename; vector Ln)
{
    float coeff[16] = {0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0};
    extern normal N;
    extern point P;
    normal Nn = normalize(N);

    texture3d(Filename, P, Nn,
              "coeff_00", coeff[0],
              "coeff_01", coeff[1],
              "coeff_02", coeff[2],
              "coeff_03", coeff[3],
              "coeff_04", coeff[4],
              "coeff_05", coeff[5],
              "coeff_06", coeff[6],
              "coeff_07", coeff[7],
              "coeff_08", coeff[8],
              "coeff_09", coeff[9],
              "coeff_10", coeff[10],
              "coeff_11", coeff[11],
              "coeff_12", coeff[12],
              "coeff_13", coeff[13],
              "coeff_14", coeff[14],
              "coeff_15", coeff[15]);

    return shOcclusion(coeff, Ln);
}
...

13 This process will be shown in chapter 5

Once again, we start by loading the DSO. This is followed by a function that handles the reading of the pre-baked coefficients. Ln will be the vector from the current point to the light source. Outside the actual surface() shader, the global parameters P and N need to be imported by declaring extern representations. The texture3d() function is symmetrical to the previously used bake3d() and needs the name of the point cloud file, the location of the point and its normal for a 3D texture lookup. The ("channel", variable) pairs state which variable shall absorb which channel. The function shOcclusion(coeff, Ln) projects the light vector Ln into spherical harmonics and calculates the dot product of this vector and the loaded coefficients (as described in equation 3.20).

...

surface shRelight(string InPointCloud = ""; string OutPointCloud = "")
{
    normal Nn = normalize(N);
...

This shader takes in a point cloud and returns a new one.

...

vector mylight = 0;

illuminance(P, Nn, PI/2)
{
    mylight = normalize(L);
}

...

Page 87: Evaluation of Spherical Harmonic Lighting and Ambient Occlusion as Shadowing Techniques for

4.2. RenderMan Shader 67

A new vector is created to store the light vector, which is only available inside the illuminance() statement. The lookup in a cone angle of PI/2 rad means a hemisphere centered around Nn.

...
vector __Ln; // normalized light vector
matrix bakeCam = getWorld2Eye(InPointCloud);
__Ln = vtransform("world", mylight);
__Ln = vtransform(bakeCam, __Ln);
__Ln = normalize(__Ln);

...

Transforming the light into the same space as the already baked coefficients (via the matrix returned by getWorld2Eye) is important to ensure constant light intensity, independent of the incidence angle.

...
float unoccluded = getSHOcclusion(InPointCloud, __Ln);
Ci = unoccluded;
bake3d(OutPointCloud, "unoccluded", P, Nn,
       "unoccluded", unoccluded, "interpolate", 1);

getSHOcclusion() returns the final gray-scale values for the 3D shadow map, which get baked into the new point cloud and are also used for the rendering of this pass. The point cloud created here will also be converted into a brick map via the brickmake command before it is reapplied as a shadow map in the following light shader. An example of a point cloud and brick map containing the baked variable unoccluded can be seen in Appendix D.2.

4.2.3.3 The Light Shader

This light shader is a minimal example deduced from the “uberlight” internally used at Trixter, containing only the SH-related commands.


Listing 4.8: shLight.sl

plugin "/CINE/_GlobalScripts/RenderMan/GlobalRendermanShaders/shadeop_sh";

float getSHOcclusion(string Filename; float shFilterRadius)
{
    float unoccluded = 0;
    extern normal N;
    extern point Ps;
    normal Nn = normalize(N);
    texture3d(Filename, Ps, Nn,
              "filterradius", shFilterRadius,
              "lerp", 1,
              "unoccluded", unoccluded);
    return unoccluded;
}

...

This function reads the shadow information from a brick map. The filterradius parameter is specific to brick maps and specifies the radius of the disk/sphere that the lookup covers. If set to 0, the voxel data are not interpolated; instead, only the data values of the finest voxel that the point P is in are used. lerp is another brick-map-only parameter which, if set to 1, invokes two lookups in the brick map (at two different depths) and linearly interpolates the results. This makes the result a quadrilinear interpolation: a trilinear interpolation in 3D space and a linear interpolation in the resolution. [Pixar 2010]

...

light shLight(
    /* Basic intensity and color of the light */
    float intensity = 1;
    color lightcolor = 1;
    color shadowcolor = 0;
    float width = .5, height = .5, wedge = .1, hedge = .1;
    /* Spherical Harmonics Occlusion */
    uniform float doSphericalHarmonics = 1;
    uniform float sphericalHarmonicsMult = 1;
    uniform string shPointCloud = "[RMSExpression::passinfo rmanMakeSHPass filename]";
    uniform float shFilterRadius = 0.5;
    /* Miscellaneous controls */
    float nonspecular = 0;
    float nondiffuse = 0;
    output varying float __nonspecular = 0;
    output varying float __nondiffuse = 0;
)

...

The light has some of the basic light parameters for color and intensity. width and height control the spread of the illuminated cone, while wedge and hedge control the penumbra. The parameter doSphericalHarmonics is a switch to activate or deactivate spherical harmonics as a shadowing technique14. sphericalHarmonicsMult is used to scale the shadow intensity. As the input, shPointCloud can use either a point cloud or a brick map. Important here is the expression

"[RMSExpression::passinfo rmanMakeSHPass filename]"

which will automatically be replaced by the full path of the file (in this case a brick map) created by the indicated “rmanXXXPass”. The creation of such a custom pass will be described in Chapter 5. The shFilterRadius will be passed to the texture3d() call mentioned above. Both the nonspecular and nondiffuse parameters control how much the light will contribute to the surface's specularity or diffuseness, respectively.

...

__nonspecular = nonspecular;
__nondiffuse = nondiffuse;

/* For PRMan, we've gotta do it the hard way */
point from = point "shader" (0,0,0);
point _from = transform("shader", from);
vector axis = normalize(vector "shader" (0,0,1));

/* Spot Light */
uniform float maxradius = 1.4142136 * max(height + hedge, width + wedge);
uniform float angle = atan(maxradius);
...

These variables define the cone in which the light is cast. Analogous to a simple spotlight in Maya, the position where the light is created (point from) is at the origin, and the direction (vector axis) is along the positive z-axis. The value 1.4142136 ≈ √2 is the typical scaling factor for the spot-like uberlight [Cortes and Raghavachary 2008] and equals the diameter of the smallest circle enclosing the unit square.

14 This is mainly necessary for the complete uberlight, where several shadowing algorithms are available.
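The cone computation is easy to check numerically; a small Python sketch (function name illustrative only):

```python
import math

def spot_cone_angle(width, height, wedge, hedge):
    """Half-angle of the uberlight-style spot cone: the footprint
    max(height + hedge, width + wedge) scales the sqrt(2) enclosing
    circle, and with the apex one unit behind the footprint plane
    the half-angle is atan(maxradius)."""
    maxradius = 1.4142136 * max(height + hedge, width + wedge)
    return math.atan(maxradius)

# with the shader defaults width = height = .5, wedge = hedge = .1:
angle = spot_cone_angle(0.5, 0.5, 0.1, 0.1)  # about 0.704 rad (~40.3 degrees)
```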

...

illuminate(from, axis, angle)
{
    color lcol = lightcolor;
    float unoccluded = 1;
    if (doSphericalHarmonics != 0)
        unoccluded *= getSHOcclusion(shPointCloud, shFilterRadius)
                      * sphericalHarmonicsMult;

...

The illuminate statement specifies the light cast by the light source. In the if branch, the value read from the brick map, multiplied by the scaling factor, is assigned to the variable unoccluded.

...
lcol = mix(shadowcolor, lcol, unoccluded);
__nonspecular = 1 - unoccluded * (1 - __nonspecular);
__nondiffuse = 1 - unoccluded * (1 - __nondiffuse);
Cl = intensity * lcol;

Eventually, the appropriate light and shadow colors for the light are assigned, using unoccluded as the blending factor. The final output values are scaled according to the user-defined intensity.
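Reduced to scalars, this final blend can be sketched in Python (a toy model of the shader's last lines, with mix() re-implemented by hand; function names are illustrative):

```python
def mix(a, b, t):
    """RSL-style mix(): returns a when t = 0 and b when t = 1."""
    return a * (1.0 - t) + b * t

def finish_light(intensity, lightcolor, shadowcolor, unoccluded,
                 nonspecular, nondiffuse):
    """Final blend of shLight.sl: shadowcolor is mixed in where the
    point is occluded, and the per-light __nonspecular/__nondiffuse
    outputs are pushed towards 1 as occlusion increases."""
    lcol = mix(shadowcolor, lightcolor, unoccluded)
    out_nonspec = 1.0 - unoccluded * (1.0 - nonspecular)
    out_nondiff = 1.0 - unoccluded * (1.0 - nondiffuse)
    return intensity * lcol, out_nonspec, out_nondiff
```

With unoccluded = 1 the user settings pass through unchanged; with unoccluded = 0 the light contributes only its shadow color and is removed from the surface's specular and diffuse response.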


Chapter 5

Pipeline Integration

This chapter is dedicated to the process of customizing RenderMan Studio 3.0 so as to integrate the previously implemented shaders and make this feature as easy to use as ambient occlusion. The steps should also work for “RenderMan for Maya”, where the only change needed is to modify the directory paths.

Because RenderMan Studio and Maya are platform-independent, the only difference in the steps required is in the paths to the installed files; hence, we assume the environment variable $RMSTREE points to the correct installation folder, which by default is:

/opt/pixar/RenderManStudio-3.0.0-maya2010

or C:\Program Files\Pixar\RenderManStudio-3.0.0-maya2010

for Linux or Windows, respectively. But first things first: for such a task, it is important to lay out a concept for achieving the best workflow. To recapitulate, the following three shaders need to be embedded.

shBake.sl Calculates the Spherical Harmonic coefficients and bakes them into a corresponding point cloud.

shRelight.sl Takes the baked point cloud and combines it with the current lighting, then bakes a new point cloud with the resulting shadowing calculations.

shLight.sl Reuses the shadowing point cloud to project it onto the surfaces.

With all these properties in mind, and considering the logic behind the calculation of the Spherical Harmonic coefficients, it is apparent that shBake.sl needs to know about the whole scene in order to provide accurate results. Therefore, it is necessary to equip all the objects in the scene with the shader at the same time; the advantage of this is that it only has to be done once, thanks to the reusability of point clouds.


The shader shRelight.sl reuses the prior point cloud and bakes a new file every time the lighting is changed, because in some scenes, dozens of lights may be used and it would not make sense to recalculate the shadowing for everything every time a single light is tweaked. Thus, for the sake of performance, the most reasonable approach is to have every light calculate its very own relighting brick map. Another advantage of this concept is that each of these point clouds can bake only the minimal set of points illuminated by the particular light. shLight.sl can be optimized by automating the process of creating a light with only one mouse click.

RenderMan Studio already uses a similar approach for global illumination and other features. When the so-called “Passes” are created, each of them can automate several tasks and generate several outputs while the frame is being rendered. There are a couple of scripts that RenderMan Studio uses to interpret the passes.

To create a pass without affecting existing system files, a new script file can be set up. The default location for the pass files is

$RMSTREE/lib/rfm/

There, a new file named my_nodetemplate.rman is created and can be edited with any simple text editor1. The *.rman files are basically Tcl scripts.

Most parts of the script are similar to the standard nodes_globalillum.rman in the same location. To facilitate reading, only the changes made will be listed here. The full script file can be found in Appendix E.1, my_nodetemplate.rman.

Listing 5.1: my_nodetemplate.rman

NodeType pass:render:RenderSH {
    reference Collection RequiredRenderSettings
    torattr phase {
        subtype selector
        range {
            "Once Per Job"  /Job/Preflight/Maps/Shadow
            "Every Frame"   /Job/Frames/Maps/Shadow
        }
        default /Job/Frames/Maps/Shadow
    }

...

The pass is called “RenderSH”. The phase attribute defines the stage in the rendering process at which the pass will be calculated2.

1 Note: under Linux, root privileges are necessary to create or edit a file in the /opt/ folder.
2 The exact order of all rendering stages can be found in RenderMan_for_Maya.ini (mentioned below) under a preference called JobPhaseTree.


Listing 5.2: my_nodetemplate.rman

...

torattr outputSurfaceShaders {
    default 0
}
...
torattr outputDisplacementShaders {
    default 1
    subtype switch
}

...

The feature for executing the surface shaders for coloring before baking is unnecessary. On the other hand, it is reasonable to evaluate the displacement shaders, so that changes in shape are reflected in the spherical harmonic coefficients.

Listing 5.3: my_nodetemplate.rman

...
torattr defaultRiAttributesScript {
    default {RiSurface \"/CINE/_GlobalScripts/RenderMan/GlobalRendermanShaders/shBake\"
             \"string Filename\" `rman tcl eval \"RMSExpression::passinfo this filename\"`}
}

...

For the baking process, the shader shBake.sl needs to be applied to all objects. For this we use the general defaultRiAttributesScript. The RiSurface call uses a Linux path to point to the shader. The cryptic command rman tcl eval "RMSExpression::passinfo this filename" has the advantage that it automatically replaces itself with a filename consisting of the proper project path, frame number, name of the current pass and the correct file extension. However, as a result, it is necessary to define a “mask” for this filename, as will be described below in greater detail.

Listing 5.4: my_nodetemplate.rman

...
torattr defaultRiOptionsScript {
    default {RiDisplayChannel \"float coeff_00\";
             RiDisplayChannel \"float coeff_01\";
             RiDisplayChannel \"float coeff_02\";
             RiDisplayChannel \"float coeff_03\";
             RiDisplayChannel \"float coeff_04\";
             RiDisplayChannel \"float coeff_05\";
             RiDisplayChannel \"float coeff_06\";
             RiDisplayChannel \"float coeff_07\";
             RiDisplayChannel \"float coeff_08\";
             RiDisplayChannel \"float coeff_09\";
             RiDisplayChannel \"float coeff_10\";
             RiDisplayChannel \"float coeff_11\";
             RiDisplayChannel \"float coeff_12\";
             RiDisplayChannel \"float coeff_13\";
             RiDisplayChannel \"float coeff_14\";
             RiDisplayChannel \"float coeff_15\";}
    uistate hidden
}

...

...

When data need to be baked into a point cloud, the names of the variables from the shader must be stated in the RIB file as DisplayChannel, along with the proper variable type, to be piped into a point cloud. And because the DisplayChannel needs to be defined before the WorldBegin block, it must be stated as an “RiOption”.

Now the first pass is set up, but RMS does not know about it yet, because it does not know about the script file. In

$RMSTREE/etc/

the file called RenderMan_for_Maya.ini must be edited as well.

Listing 5.5: RenderMan_for_Maya.ini

...

...
SetPref PassRefValidityTable {
    ...
    RenderRadiosity {RenderRadiosity RenderApproxGlobalDiffuse \
        Final BakeRender ReferenceRender Reflection SSRender \
        FilterApproxGlobalDiffuse}
    ...
    RenderSH {RenderRadiosity RenderApproxGlobalDiffuse \
        Final BakeRender ReferenceRender Reflection SSRender \
        FilterApproxGlobalDiffuse}
    ...

...

Page 95: Evaluation of Spherical Harmonic Lighting and Ambient Occlusion as Shadowing Techniques for

Pipeline Integration 75

“The PassRefValidityTable maps each pass class to a list of classes in which such a reference would be valid. For example, Shadow passes may be referenced by shaders in Final passes, but it wouldn't make sense for Shadow passes to be referenced by shaders in Shadow passes. When shader parameters containing pass references are output at render time, this table is used to determine which pass references to expand.”3

Obviously, the “RenderSH” class list is just a copy of the RenderRadiosity list.

Listing 5.6: RenderMan_for_Maya.ini

...

...
# now define renderman globals interface
LoadExtension rman [file join $cfg gui.rman]

# load custom node definitions
LoadExtension rman [file join $RMSTREE lib rfm my_nodetemplate.rman]

...

RenderMan Studio now knows about the custom script file, and the pass should be available in Maya. But the above-mentioned command rman tcl eval "RMSExpression::passinfo this filename" still does not know how to substitute itself with a proper file path. Therefore, the final step is to make one small addition to the file RMSWorkspace.ini, also located in:

$RMSTREE/etc/

Listing 5.7: RMSWorkspace.ini

...

# Many passes have a single output. Here the passclass can be# used to control the output directory as well as the file

extension.# Some passes have several outputs of different types. This

case# is trickier, relying on .e.g. per-dspy information.

foreach filetype ext outclass specialpattern

...VolumeScatter ptc data

RenderSH ptc data $BASE_$PASSID.$FRAME.$EXT

...

3 Source: see RenderMan_for_Maya.ini.


With this line added, RMS will generate a filename for our pass, where

$BASE   = Name of the Maya scene
$PASSID = Name of the pass from the render settings
$FRAME  = Either frame number or “job”
$EXT    = File extension (ptc)
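The substitution RMS performs with this mask can be mimicked in a few lines of Python (a toy re-implementation for illustration; RMS's actual expansion code is not public):

```python
def expand_pass_filename(mask, base, passid, frame, ext):
    """Expand an RMSWorkspace.ini-style output mask such as
    "$BASE_$PASSID.$FRAME.$EXT". Longer variable names are
    replaced first so that $PASSID is not clobbered by a
    shorter prefix."""
    values = {"BASE": base, "PASSID": passid, "FRAME": frame, "EXT": ext}
    for name in sorted(values, key=len, reverse=True):
        mask = mask.replace("$" + name, values[name])
    return mask

name = expand_pass_filename("$BASE_$PASSID.$FRAME.$EXT",
                            "myScene", "rmanRenderSHPass", "0001", "ptc")
# -> "myScene_rmanRenderSHPass.0001.ptc"
```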

Now, everything for baking a point cloud with the Spherical Harmonic coefficients has been set up correctly, and in the next render, a point cloud will be created in the directory

$MAYA_PROJECT/renderman/$SCENE_NAME/data/$FRAME_NUM/

After accomplishing these tasks, the following steps for creating a pass to bake the illumination are fairly easy.

It might also be helpful to use a so-called “brick map” when reusing the baked illumination, because it provides the option of pre- or post-blurring the shadowing.

The baking pass is just a modification of the RenderSH pass, and only those settings which differ will be listed. (For the complete listing, see Appendix E.1.)

Listing 5.8: my_nodetemplate.rman

NodeType pass:render:BakeSHLighting {
    reference Collection RequiredRenderSettings
    torattr phase {
        subtype selector
        range {
            "Once Per Job"  /Job/Preflight/Maps/Photon
            "Every Frame"   /Job/Frames/Maps/Photon
        }
        default /Job/Frames/Maps/Photon
    }

...

The new pass “BakeSHLighting” is calculated in the Photon stage. This ensures that it will always be calculated after the RenderSH pass.

Listing 5.9: my_nodetemplate.rman

...

torattr defaultRiAttributesScript {
    default {RiSurface \"/CINE/_GlobalScripts/RenderMan/GlobalRendermanShaders/shRelight\"
             \"string InPointCloud\" `rman tcl eval \"RMSExpression::passinfo rmanRenderSHPass filename\"`
             \"string OutPointCloud\" `rman tcl eval \"RMSExpression::passinfo this filename\"`}
}

...

Naturally, the shader used is changed to shRelight, but in addition, the input point cloud can point directly to rmanRenderSHPass. RMS follows a naming convention (rman$PASSNAMEPass) when a pass is added to the scene, meaning that this expression always points to the RenderSH pass. This applies to all BakeSHLighting passes and thereby satisfies the objective of creating a universal SH point cloud.

Listing 5.10: my_nodetemplate.rman

...
torattr defaultRiOptionsScript {
    default {RiDisplayChannel \"float unoccluded\";}
    uistate hidden
}

...

Following the logic in the shader, we can deduce that only unoccluded is written to the point cloud and needs to be set as a display channel.

The third pass can be derived from the “MakeFilterApproxGlobalDiffuse” pass in the file nodes_globalillum.rman. (For the complete listing, see Appendix E.1.)

Listing 5.11: my_nodetemplate.rman

NodeType pass:command:MakeSH {
    reference Collection RequiredPassSettings
    torattr phase {
        subtype selector
        range {
            "Once Per Job"  /Job/Preflight/Maps/Photon
            "Every Frame"   /Job/Frames/Maps/Photon
        }
        default /Job/Frames/Maps/Photon
    }
    reference NodeType pass:render:BakeSHLighting

...

“MakeSH” is also calculated in the Photon stage, but with the attribute reference NodeType, MakeSH becomes a parent pass for BakeSHLighting. First, every time either one of the two passes is added to the scene, RMS automatically creates both; second, they are computed in order from child to parent.

Listing 5.12: my_nodetemplate.rman

...
param brickmake:omitgeometry {
    default 1
}

...

This merely results in smaller brick map files, because it omits the data indicating which voxels actually contain geometry.

Listing 5.13: my_nodetemplate.rman

...

torattr passCommand {
    default {\"\\$RMANTREE/bin/brickmake\" $CMDARGS \"[passinfo this/0 filename]\" \"[passinfo this filename]\"}
}

In contrast to the static connection between the BakeSHLighting passes and the RenderSH pass, passinfo this/0 filename can dynamically access the child's output, thanks to the parent-child relationship. The next step is to map the pass classes to other passes, as before.

Listing 5.14: RenderMan_for_Maya.ini

...

...
SetPref PassRefValidityTable {
    ...
    RenderSH {RenderRadiosity RenderApproxGlobalDiffuse \
        Final BakeRender ReferenceRender Reflection SSRender \
        FilterApproxGlobalDiffuse}
    BakeSHLighting {RenderApproxGlobalDiffuse \
        Final BakeRender ReferenceRender Reflection SSRender \
        FilterApproxGlobalDiffuse}
    MakeSH {MakeApproxGlobalDiffuse RenderApproxGlobalDiffuse \
        Final BakeRender ReferenceRender Reflection SSRender \
        FilterApproxGlobalDiffuse}
    ...

...


Because the two new passes are added to our already loaded my_nodetemplate.rman template, RMS instantly knows about them, and the only thing missing is the naming convention from RMSWorkspace.ini.

Listing 5.15: RMSWorkspace.ini

...
RenderSH       ptc data $BASE_$PASSID.$FRAME.$EXT
BakeSHLighting ptc data $BASE_$PASSID.$FRAME.$EXT
MakeSH         bkm data $BASE_$PASSID.$FRAME.$EXT

This completes the integration into RMS, and the light creation can be automated using the following Python script. However, please note that a detailed description of how Python works with Maya and RenderMan is beyond the scope of this thesis; therefore, only the logic behind the commands will be explained.

import maya.cmds as cmds
import maya.mel as mel

# create a spot light
cmds.spotLight(n="SHLight1")
Light = cmds.ls(sl=True)

# create a camera
cmds.camera(n="SHLightBakeCam1")
Cam = cmds.ls(sl=True)

# parent constrain the camera to the light
cmds.select(Light, Cam)
cmds.parentConstraint()

# attach the RenderMan light shader attribute to the light
mel.eval('rmanAddAttr ' + cmds.listRelatives(Light[0], children=True)[0]
         + ' rman__torattr___customLightShader ""')

# create a RenderMan light shader node
lightShader = cmds.shadingNode("RenderManLight", asLight=True)

# connect the light shader to the light
cmds.connectAttr(lightShader + ".message",
                 cmds.listRelatives(Light[0], children=True)[0]
                 + ".rman__torattr___customLightShader", f=True)

# first load the default light to restore the default settings
# in case the node already existed in the past
mel.eval('rman loadShader ' + lightShader
         + ' "/opt/pixar/RenderManStudio-2.0.2-maya2010/rmantree/lib/shaders/defaultlight.slo" -sync 1')
mel.eval('rman loadShader ' + lightShader
         + ' "/CINE/_GlobalScripts/RenderMan/GlobalRendermanShaders/SHLight.slo" -sync 1')
cmds.setAttr(lightShader + ".shadername",
             "/CINE/_GlobalScripts/RenderMan/GlobalRendermanShaders/SHLight.slo",
             type="string")

cmds.select(Light[0])

First, the necessary modules for using Maya commands in Python are imported. Then, a spotlight called “SHLight1” is created, followed by a new camera, “SHLightBakeCam1”. Next, we create a parent constraint from the light (parent) to the camera (child), to let the camera follow the light's positioning. Then, the “customLightShader” attribute is attached to the light, so we can override the default light with our custom shader. To do this, a new RenderMan shader node is created (as in the Hypershade). This node is attached to the previously added “customLightShader” attribute. Finally, our compiled light shader shLight.slo is loaded into the shader node4. The last command simply selects the light again, in order to mimic the regular object creation process in Maya. Execution from the Script Editor or a Shelf in Maya creates a new, properly configured light.

Now, inside each BakeSHLighting pass, choosing one of the SHLightBakeCams as camera and the corresponding SHLight as light set has the desired effect of creating, for every light, a separate point cloud containing only those points that are absolutely necessary for illumination.

4 Under Windows, the paths to the shaders need to be adjusted.


Chapter 6

Results & Analysis

In the previous chapters, we extended PRMan by adding a feature to compute spherical harmonic coefficients. In this chapter, we gather and analyze empirical data to compare existing methods with the technique we have implemented. For an algorithm involving shadowing, we need to consider two different aspects of the technique: first, the objective criteria that provide information concerning efficiency and how well our method performs in production, and second, the subjective, perceived quality, which tells us how well people accept the results.

6.1 Efficiency

To measure and compare the utilization of computational resources, we used four scene files of varying complexity.

Scene       Number of Triangles
Hektor1     35,890
Armadillo2  345,946
Dragon2     871,416
Buddha2     1,087,718

The settings for the scenes were all identical: rendering at a resolution of 640×480 pixels with a shading rate of 1 for the beauty pass and a shading rate of 10 for the RenderSH pass. For the sampling inside the clouds, we used 256 rays; the bucket and grid sizes are set to the default values of 16×16 and 256, respectively. The system configuration for the single-node solution consists of a 2.4 GHz Intel Core 2 Quad Q6600 processor with 8 GB of RAM. Each rendering involves several different steps. The first data set collected is the most important part for the SHL: calculating and baking the coefficients. The data measured are shown in Table 6.1.

1 Hektor model, courtesy of Trixter Film GmbH
2 Stanford University Computer Graphics Laboratory: The Stanford 3D Scanning Repository, http://graphics.stanford.edu/data/3Dscanrep/

Figure 6.1: The four test scenes used for comparison (close-up shots for tessellation): Hektor3 (a); Armadillo4 (b); Dragon4 (c); Buddha4 (d)

                              Hektor    Armadillo  Dragon     Buddha
Number of Triangles           35,890    345,946    871,416    1,087,718
Number of Points              37,940    361,018    882,007    1,098,090
File Size                     3.03 MB   28.9 MB    70.6 MB    87.9 MB
Computation Time              0:00:47   0:06:03    0:17:06    0:19:57
Maximum Memory Consumption    0.09 GB   0.44 GB    0.95 GB    1.31 GB

Table 6.1: Values measured while baking SH point clouds for our test scenes. Each is for a rendering of 640×480 pixels on a 2.4 GHz Intel Core 2 Quad Q6600 processor.

It should be noted that the number of points in a point cloud depends on the distance to the camera, the resolution of the rendered image and, of course, the number of triangles inside the viewing frustum. A change in the number of points has the biggest influence on the remaining values. Logically, this will affect the render time as well. In all tests made thus far, it appears that the computation time increases almost linearly with the number of points, which is a useful attribute when it comes to planning times for final renderings at high resolution based on data collected from low-resolution preview renderings. The proportionally increasing file size is caused by appending a new data set of position, normal and coefficients for every new point5. Memory consumption depends on various factors. Naturally, the number of triangles and points plays into that, because when calculating coefficients, the points need to remain in memory, and all the variables used in the shader also occupy memory for every point that is shaded simultaneously. The aforementioned bucket and grid sizes control the number of concurrently computed shading points and, therefore, the memory allocation. As a result, the values measured for occupied memory are not fixed, but act as a benchmark to prove that it is even possible to render the frames on a low-end workstation without a huge amount of memory.

3 Hektor model, courtesy of Trixter Film GmbH
4 Stanford University Computer Graphics Laboratory: The Stanford 3D Scanning Repository, http://graphics.stanford.edu/data/3Dscanrep/
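The near-linear scaling suggests a simple planning rule of thumb, sketched in Python (an extrapolation aid under the linearity assumption observed above, not an exact cost model):

```python
def predict_bake_time(points_preview, seconds_preview, points_final):
    """Extrapolate a measured preview baking time to a final render,
    assuming the observed near-linear scaling in the number of
    baked points."""
    return seconds_preview * points_final / points_preview

# e.g. Hektor's 37,940 points took 47 s; a cloud of about one million
# points should then bake in a bit over 20 minutes, which is close
# to the measured Buddha timing.
estimate = predict_bake_time(37940, 47.0, 1098090)
```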

As already mentioned, it is only necessary to carry out this step (calculating the spherical harmonic coefficients) once; it can therefore be pre-computed in an earlier step, long before the actual shading, lighting or rendering takes place.

To demonstrate the possibility of accelerating the computations by calculating the coefficients over several machines, it is possible to use RenderMan running within the Windows Azure cloud environment. This allows users to scale the required computing resources as needed, without maintaining their own in-house render farm. Table 6.2 shows a comparison of the single node (introduced above) and a five-node cloud. For these renderings, a much higher number of samples was used (1024), which is why the single node needs much longer than for the “Buddha” scene before.

                              Single Node   5 Cloud Nodes
Number of Triangles           1,222,665
Number of Points              646,648
File Size                     57.9 MB
Computation Time              2:20:21       0:53:00
Maximum Memory Consumption    1.17 GB       1.05 GB

Table 6.2: Values measured while baking SH point clouds for a test scene. Both columns are for rendering a 512×512 pixel image with 1024 samples for the SH lookup. The middle column uses a single 2.4 GHz Intel Core 2 Quad Q6600 processor, the right column five 2.1 GHz Quad-Core AMD Opteron processors in the Windows Azure cloud.

For the cloud solution, five 2.1 GHz Quad-Core AMD Opteron computers with 7 GB of RAM were used. To distribute the work over the nodes, the image was split into 8×6 smaller image tiles, which were computed separately on the nodes. The spread of the geometry inside the scene file determines the possible acceleration; here, the 34th tile took 40 minutes while the next most complex tile took only 13 minutes, resulting in 47 finished tiles after 34 minutes, while during the last 20 minutes only one node was rendering. This means that, with uniform tile complexity, the render time in the cloud would have been very close to five times as fast as the single node, as one would expect.

5 Although the variable npoints for the number of points in the cloud is incremented, it still occupies the same disk space.
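The speedup arithmetic of this experiment can be made explicit in Python (timings taken from Table 6.2 and the tile discussion above):

```python
def speedup(single_seconds, cluster_seconds):
    """Observed speedup of the cluster run over the single node."""
    return single_seconds / cluster_seconds

single = 2 * 3600 + 20 * 60 + 21   # 2:20:21 on the single node
cloud = 53 * 60                    # 0:53:00 on five cloud nodes
observed = speedup(single, cloud)  # about 2.65x instead of the ideal 5x

# The wall time can never drop below the most expensive tile:
# the 40-minute tile alone bounds the five-node run from below,
# however well the remaining tiles are balanced.
tile_bound_minutes = 40
```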

The next step in the hierarchy is to reuse the generated point cloud for lighting purposes. Therefore, we use the same scene, switch the rmanRenderSHPass to “Reuse” and create rmanMakeSHPasses and an “SHLight”6. Table 6.3 shows the amount of resources utilized when baking the lighting.

                              Hektor    Armadillo  Dragon     Buddha
Number of Points in new ptc   87,898    396,002    921,295    1,137,438
File Size new ptc             2.01 MB   9.06 MB    21.0 MB    26.0 MB
Number of Bricks              445,285   1,236,857  4,982,822  5,401,214
Brick Levels                  9         17         20         20
File Size bkm                 1.72 MB   5.63 MB    31.2 MB    34.9 MB
Computation Time              0:00:04   0:00:15    0:00:38    0:00:48
Maximum Memory Consumption    0.06 GB   0.26 GB    0.45 GB    0.50 GB

Table 6.3: Values measured while SH-lighting our test scenes with pre-baked SH point clouds. Each is for a rendering of 640×480 pixels on a 2.4 GHz Intel Core 2 Quad Q6600 processor.

A comparison of Table 6.1 and Table 6.3 highlights some differences. The disparity in the point clouds is due to the different position of the light baking camera. The tremendous drop in file sizes (despite the fact that more points are created) is a result of the reduction in data channels; instead of 16 coefficients, we store only a single float variable. When creating and sorting the bricks, the brickmake application uses special proprietary algorithms that are not available for inspection; consequently, it is hard to predict the number of bricks or the levels of detail in the generated file. However, it is useful to note that, even though the brick map files hold many more voxels than the point clouds hold points, the file sizes are still reasonable. The most remarkable change is in the render time. Even though this is still a baking process for the lighting, all that is computed is the coefficient vector for the light multiplied by the SH vector of each point read from the point cloud. This simple calculation also requires much less memory, because most of the required data are already held in the point cloud and just need to be copied into memory.

6 Only one “SHLight” is used here, because those values can be more easily multiplied and added, depending on how many lights are baking or reusing data.

The data gathered so far concern only SHL, and only for the baking processes. Every value is acceptable, and throughout all the tests performed during the development stage, no noticeable deviations occurred. But to evaluate the potential appropriately, it is essential to also consider already established shadowing techniques and their advantages, and to compare those to our approach.

In Table 6.4, we compare some of the values collected for shadows calculated with SHL to shadows generated with ambient occlusion and area lights. Since SHL can be considered a three-step approach7 and we assume that the step for calculating the coefficients has already been performed in a department prior to rendering, the following comparisons examine only the last two steps, which concern the actual illumination. But for a comprehensive analysis, those values need to be kept in mind.

                              Hektor    Armadillo   Dragon    Buddha
Spherical Harmonics baking    0:00:04   0:00:15     0:00:38   0:00:48
File Size new ptc             2.01 MB   9.06 MB     21.0 MB   26.0 MB
Maximum Memory Consumption    0.06 GB   0.26 GB     0.45 GB   0.50 GB

Ambient Occlusion baking      0:00:08   0:00:16     0:00:32   0:00:42
File Size new ptc             4.02 MB   18.1 MB     42.1 MB   52.0 MB
Maximum Memory Consumption    0.07 GB   0.24 GB     0.44 GB   0.54 GB

Area Lights                   0:00:20   0:00:33     0:01:10   0:01:14
Maximum Memory Consumption    0.08 GB   0.44 GB     1.03 GB   1.25 GB

Table 6.4: Gathered data for generating shadows with different techniques. Each is for a rendering of 640×480 pixels on a 2.4 GHz Intel® Core™2 Quad Q6600.

Spherical harmonics and ambient occlusion both generate point clouds for their calculations. The same camera was used as the baking camera for both methods, and therefore the point clouds contain the same number of points. It is notable that the computation times for ambient occlusion and spherical harmonics are very similar. Considering a scene containing one million triangles as being at a production level, the six-second disparity compared to the optimized algorithms for ambient occlusion in PRMan is still a remarkable result8. Only the area light shadows drop noticeably behind. It should not be forgotten that several lights are generally used in production,

7 1. Baking Coefficients; 2. Baking Illumination; 3. Reusing Illumination
8 This time difference could be reduced even further by only using point clouds for baking instead of additionally converting to a brick map, but in that case we would lose functionality.


which multiplies the render time and their time lags9. The amount of occupied disk space and memory for SHL is to some extent even better than the usage by ambient occlusion or area lights. But as far as contemporary workstations are concerned, those values carry almost no weight, because all are well below the common capacity of workstations or render nodes. Ambient occlusion always stores two colors and a float channel, which is why its point cloud is twice as big.

The next scenario that we consider is the reuse of the data calculated in the previous step. Although it is possible to bake the illumination of area lights into textures or point clouds, this option is often left unused and therefore will be skipped. Table 6.5 shows the system resources consumed for reusing the illumination point clouds.

                                Hektor    Armadillo   Dragon    Buddha
Spherical Harmonics reusing     0:00:02   0:00:06     0:00:12   0:00:15
Maximum Memory Consumption      0.06 GB   0.19 GB     0.39 GB   0.49 GB

Ambient Occlusion reusing       0:00:06   0:00:09     0:00:18   0:00:21
Maximum Memory Consumption      0.07 GB   0.24 GB     0.43 GB   0.57 GB

Area Lights                     0:00:20   0:00:33     0:01:10   0:01:14
Maximum Memory Consumption      0.08 GB   0.44 GB     1.03 GB   1.25 GB

Depth Map Shadows (1024×1024)   0:00:04   0:00:10     0:00:20   0:00:25

Table 6.5: Collected data on reusing baked lighting with different techniques. Note that the area lights did not change, since they are not baked. Each is for a rendering of 640×480 pixels on a 2.4 GHz Intel® Core™2 Quad Q6600.

The table above shows that reusing the illumination from SHL is the fastest method for the tested scenes. This advantage is mainly caused by two factors: it is faster to read only one float from the point cloud than it is to read seven, and, more fundamentally, we bake the actual illumination into the point cloud, while ambient occlusion only stores the area and radiosity of a point. That means that ambient occlusion only saves time by not baking the points again, but it recalculates the shadows over and over. Since we do not bake the area lights, they fall far behind in the rankings. The depth map shadows also do not bake their data; they are only listed to show that we are even faster than the simplest of all shadowing methods.

At least compared to the raytraced area lights, SHL has very strong potential to make up the time required for baking the coefficients in advance. In a complex scene

9 For ambient occlusion it is difficult to apply this rule, because in most cases only one environment light is employed at a time.


that normally requires many iterations to reach the correct light setup, there is a good chance that SHL will catch up with ambient occlusion.

After focusing on the technical properties, the next section will deal with the overall impression created by the different techniques.

6.2 Quality

To companies engaged in film production and visualization, the quality of an image is more important than the rendering speed10, because such companies have much more computing power and more time to render a single frame than, for instance, a computer game has.

The quality of an image is an attribute that cannot be measured using technical devices, because each viewer decides subconsciously about the believability of and his/her preferences for an image. It is either a question of how well a known appearance is virtually reconstructed to blend homogeneously with the environment, or of how to convince the viewer that a never-before-seen effect has actually been observed. This subjective perception or assessment must be made by individuals, who can then be interviewed. Therefore, in a survey, we showed three pictures of the same scenery - in which only the shadows differ - to participants and asked them for their personal opinions. In addition, as a cross-check, a real photograph was also slipped into the test pictures. This meant that it was necessary to use a simple scene setup in order to ensure that the virtual scene matched the real scene as closely as possible.

The test setup consisted of a gazing ball resting on a table in a physics laboratory, with direct light coming from six windows in the northeast wall of the room. For the background, a backplate without the ball was shot at a resolution of 2000×3008 pixels (see Figure 6.2). To also include the correct reflections from the environment, a 360° spherical HDR panorama was generated as well (see Figure 6.3). For the rendered images, the exact same scene files were used, with only the lights changing. In the first image, area lights were used; these were placed to match the window positions. For the second frame, an environment light was used to generate ambient occlusion shadows. The last CG double is lit by our implemented spherical harmonic shadows. Figure 6.4 displays the images as they were shown to the participants.

Because, for the most part, regular viewers only rate a picture based on a general impression, it was important to us to capture viewers’ opinions and, of course, the reasons behind them. But since our main focus here is on shadows, we naturally covered

10 But the render time is still limited, because rendering is also an expensive process.


Figure 6.2: The backplate for the renderings

Figure 6.3: The equirectangular HDR panorama for the reflections

this specific topic as well. We interviewed a total of 18 regular students with majors outside of this field. Some of the findings are listed in Table 6.6 and the texts below.



Figure 6.4: These are the images shown to the participants of the survey: a real photograph (a); a CG sphere with ambient occlusion shadows (b); with shadows from area lights (c); and with spherical harmonic shadows (d). Larger versions of the images can be seen in Appendix F.


                                          Real    Area     Ambient     Spherical
                                          Photo   Lights   Occlusion   Harmonics
Votes for favorite image                  11      2        1           4
Overall Score (on a scale of 18 to 180)   123     101      89          112
Average Score (on a scale of 1 to 10)     6.833   5.611    4.944       6.222
Votes for favorite shadow                 7       2        5           4
Votes for most realistic shadow           10      3        0           5

Table 6.6: Quantitative results from our quality survey. A total of 18 people without a CG background were interviewed.

Figure 6.5 shows the corresponding distribution of points for the images.


Figure 6.5: The distribution of points each image received from the 18 participants (1 is poor, 10 is excellent).

We also asked the participants to explain the reasons behind their decisions and whether they noticed anything out of the ordinary when viewing the images. Out of the 18 people interviewed, 16 noted the missing reflection of the camera in three of the images, while even after extended observation, only 8 noticed additional differences in the reflections. Interestingly, 8 of the 11 voters who initially liked the real photo most indicated that the reason for their decision was solely the differing reflection. Only three participants perceived a difference in shadows before we pointed it out. This again proves the subtlety of shadows, but the fact that more than half of the people changed their preference after considering the shadows shows their importance.

Furthermore, to avoid leaping to conclusions, we ran some statistical tests on the results to see whether there is a significant difference. The tests were performed using MYSTAT 12 software, the educational version of SYSTAT11.

11 SYSTAT 12, SYSTAT Software, Inc., Chicago, IL, 2007, www.systat.com


First, we start with descriptive statistics to gather the first important values, such as the arithmetic mean, the standard deviation and the variance (see Table 6.7). The “Ranking” variable is the aggregate of all four cases.

                          Real      Area      Ambient     Spherical
                          Photo     Lights    Occlusion   Harmonics   Ranking
N of Cases                18        18        18          18          72
Minimum                   1.000     2.000     1.000       2.000       1.000
Maximum                   10.000    10.000    9.000       10.000      10.000
Range                     9.000     8.000     8.000       8.000       9.000
Sum                       123.000   101.000   89.000      112.000     425.000
Median                    7.500     5.500     5.000       6.000       6.000
Arithmetic Mean           6.833     5.611     4.944       6.222       5.903
Standard Deviation        2.792     2.200     2.209       2.390       2.462
Variance                  7.794     4.840     4.879       5.712       6.061
Shapiro-Wilk Statistic    0.874     0.955     0.966       0.942       0.960
Shapiro-Wilk p-value      0.021     0.508     0.719       0.319       0.022

Table 6.7: The descriptive statistics for the results of our survey. A total of 18 people without a CG background were interviewed on each image; the “Ranking” column is the union of all 4 methods.

In the Shapiro-Wilk test, the p-value indicates whether the hypothesis of a Gaussian distribution should be rejected. Smaller p-values imply stronger evidence against normality, but most of our values are relatively large (on a scale of 0 to 1)12, so we cannot reject the assumption of a normal distribution; this can additionally be cross-checked with a maximum-likelihood estimation.

Since we are rating ranks, the following tests are all non-parametric.

Next, we performed a Kruskal-Wallis one-way analysis of variance to see whether there is a significant difference between all four images. The test gives a result of p = 0.081, which is close to, but still above, the 5% threshold required to conclude that there is a noticeable difference.

By running a Wilcoxon signed-rank test, we can get more precise results concerning the resemblance or difference between pairs of two images at a time.

Table 6.8 shows that a statistically significant difference can only be verified between ambient occlusion and the real photograph (3.2% similarity). But it can be said that the resemblance between our spherical harmonic lighting and the photo is closer than between any other pair of images.

12 1 = 100%


                      Real    Area     Ambient     Spherical
                      Photo   Lights   Occlusion   Harmonics
Real Photo            1
Area Lights           0.196   1
Ambient Occlusion     0.032   0.216    1
Spherical Harmonics   0.392   0.071    0.068       1

Table 6.8: Wilcoxon Test for two-sided probabilities using normal approximation

From all the numerical data we collected, the only statement we can make is that we cannot make a statement. But this itself is not without meaning. It tells us that, given current viewing habits and using state-of-the-art technology, we can achieve levels of realism that a regular audience cannot distinguish from reality. Applied to our specific case, this also means that our implementation integrates seamlessly into this level of quality and does not produce any conspicuous flaws.


Chapter 7

Conclusions & Future Work

Conclusions

We have presented a method to extend PRMan to precompute directional occlusion encoded by spherical harmonics. This not only offers CG shadows that have been very well accepted by viewers, but also makes it possible to have "final movie quality" cinematic lighting from the beginning of the lighting stage, without adding noticeable overhead in rendering.

We started in Chapter 1 Introduction by outlining the objective of this thesis: to evaluate the differences between the established shadowing method of ambient occlusion and a new approach called spherical harmonic lighting.

Therefore, we had to implement the latter in PRMan in order to ensure a fair comparison between the two. This required knowledge of a wide range of subjects, ranging from different lighting and rendering techniques through the special case of RenderMan’s approach to a general understanding of data compression, all of which was covered in Chapter 2 Background and Related Work.

Chapter 3 Spherical Harmonics focused on the transition to spherical harmonics by exploring the mathematical equations behind our source code and explaining the properties that make spherical harmonics such a useful basis for our purposes.

For the implementation of spherical harmonics covered in Chapter 4 Implementation, we touched on the code pieces of our test bench - which was programmed in C++ - that made it into the RenderMan shaders. We found this to be a much better starting point, since all of the components involved were at our disposal here, enabling us to tweak and experiment with every element. The description of the aforementioned shaders also made up a big part of that chapter, encompassing three shaders:


• The first one for calculating and baking the spherical harmonic coefficients, which are independent of the lighting;

• The second one for calculating the actual illumination based on the position of a custom light; and

• The last shader was that particular custom light shader, which also projects the computed illumination and occlusions onto the scenery.

We closed the programming/scripting part in Chapter 5 Pipeline Integration by addressing one of our goals: to make the usage of our implemented shaders artist-friendly. There, we discussed how to add custom render passes to RMS. These passes brought us several advantages:

• They take over the task of successively assigning our two surface shaders to all objects in the scene, allowing us to obtain a final rendering at once. Otherwise, we would have had to re-render the image with the different shaders, one after another.

• They provide the option to reuse already calculated output without having to overwrite approved settings.

• We can create separate outputs for every light in just one render process. Otherwise, we would need to render the scene several times, always with just one single light turned on.

• We can easily reduce the data and computation time by using separate cameras to consider only those points affected by the currently baked light.

Previously, in Chapter 6 Results & Analysis, we examined the differences in resource consumption and visual quality between ambient occlusion and spherical harmonic lighting. We showed that with our three-pass approach it is possible to bake the coefficients within a department prior to lighting and rendering. This way, the lighting artists can work with production-ready shadows while avoiding a serious slowdown. Using spherical harmonic lighting turned out to be even faster than ambient occlusion when the illumination is reused, which is more often the case. The final rendering process in particular benefits from this, because all the required data will have already been calculated. We could even overtake the simplest available shadowing technique - the depth map shadows - by reusing our baked pass. All this shows that, from a technical standpoint, spherical harmonic lighting brings quite a few advantages, but depending on the reuse frequency of the baked data, ambient occlusion is slightly ahead. However,


as we noted above, technical properties do not count as much as visual attributes in film productions and visualizations.

We surveyed 18 participants, all of whom were educated in fields unrelated to CG. We showed the interviewees three lighting methods and a real photograph, then asked them for their opinions. Statistically, all of the images were very similar, which shows how realistic today’s renderings can look. Only the real photograph and ambient occlusion had a verifiable difference. But this, together with the fact that there was a 39.2% similarity between the rating of the real photograph and spherical harmonic lighting, versus only 6.8% similarity between ambient occlusion and spherical harmonics, shows a trend: the latter received higher acceptance from the viewers. Eleven of the participants noted that the ambient occlusion image lacks directionality; most of them found that unnatural and suggested it was shot with a different light position. A few people even recognized that the light sources (the windows) were visible in the gazing ball, making the image completely unbelievable to them.

In summary, it can be said that spherical harmonic lighting eked out a narrow victory. In terms of resources, it is heavily dependent on the pre-baking, baking and reusing of the passes, but as far as the more important quality aspects are concerned, SHL clearly has the advantage. Ambient occlusion is rarely used as the sole type of shadow; instead, it is mainly used in combination with some sort of direct shadows. And, rather than treating the two as rivals, perhaps the advantages of both can be combined to create an even higher level of believability.

Future Work

It is not necessary to point out that technological advances in computers will have a huge impact on the computation of spherical harmonic coefficients. But there are further research areas we could not cover in this paper which could enhance the speed or quality of lighting.

One interesting field that needs more investigation is the possible optimization in speed for calculating spherical harmonic coefficients. One possible solution could be to outsource the calculations to a plug-in running on the Graphics Processing Unit [GPU]. [Pantaleoni et al. 2010] has already demonstrated the potential of this approach. Today’s programmable graphics hardware represents a separate area of research and development by itself. Starting from a point cloud file - which holds all the required data - the necessary calculations for the coefficients would be ideal for parallelization. This could yield a significant reduction in the time required to compute the coefficients. Another possibility for optimizing computing time would be to investigate an algorithm

Page 116: Evaluation of Spherical Harmonic Lighting and Ambient Occlusion as Shadowing Techniques for

96 Conclusions & Future Work

that uses a scanline approach for the visibility function instead of raytracing.

As we have already discussed, due to the softness of SHL, it works best for outdoor

scenes. Therefore, another avenue for future work would be to assemble additional algorithms for compressing the visibility function that also cover high frequencies without requiring hundreds of spherical harmonic coefficients. Possible approaches might be to use wavelet compression [Ng et al. 2003] or Spherical Radial Basis Functions [Tsai and Shih 2006].

To further improve the algorithms already implemented, we would suggest that the next logical step would be to integrate the effects of color bleeding. Due to the low frequency of diffuse reflections, this is a suitable topic that might be accelerated or even beautified by spherical harmonics.

Another low-frequency effect that we would like to investigate further is subsurface scattering. Usually, this computation is a difficult and tedious task, which also fits the pattern of spherical harmonics quite well.

Furthermore, it is possible to easily customize our C++ application to also support other renderers with different data formats besides RenderMan.


Appendix A

Spherical Harmonic Representation

Figure A.1: Visibility function using 16 SH coefficients


Figure A.2: Visibility function using 36 SH coefficients


Figure A.3: Visibility function using 100 SH coefficients


Figure A.4: Visibility function using 10,000 SH coefficients


Appendix B

Real Spherical Harmonics

Y_l^m(θ, ϕ) for the first 16 Spherical Harmonic Coefficients

l = 0:
  Y_0^0  = (1/2)·√(1/π)

l = 1:
  Y_1^-1 = √(3/(4π)) · y/r
  Y_1^0  = √(3/(4π)) · z/r
  Y_1^1  = √(3/(4π)) · x/r

l = 2:
  Y_2^-2 = (1/2)·√(15/π) · xy/r²
  Y_2^-1 = (1/2)·√(15/π) · yz/r²
  Y_2^0  = (1/4)·√(5/π) · (−x² − y² + 2z²)/r²
  Y_2^1  = (1/2)·√(15/π) · xz/r²
  Y_2^2  = (1/4)·√(15/π) · (x² − y²)/r²

l = 3:
  Y_3^-3 = (1/4)·√(35/(2π)) · (3x² − y²)y/r³
  Y_3^-2 = (1/2)·√(105/π) · xyz/r³
  Y_3^-1 = (1/4)·√(21/(2π)) · y(4z² − x² − y²)/r³
  Y_3^0  = (1/4)·√(7/π) · z(2z² − 3x² − 3y²)/r³
  Y_3^1  = (1/4)·√(21/(2π)) · x(4z² − x² − y²)/r³
  Y_3^2  = (1/4)·√(105/π) · (x² − y²)z/r³
  Y_3^3  = (1/4)·√(35/(2π)) · (x² − 3y²)x/r³

(x, y, z) are Cartesian coordinates and r = √(x² + y² + z²)


(Visual plots of the real spherical harmonics Y_l^m, arranged by band l = 0 to 3 and order m = -3 to 3.)


Appendix C

RenderMan Shaders

C.1 shBake.sl

Listing C.1: shBake.sl

/*
  sample visibility, then compress it using spherical coordinates
*/
plugin "/CINE/_GlobalScripts/RenderMan/GlobalRendermanShaders/shadeop_sh";

surface shBake(string Filename = "")
{
    normal Nn = normalize(N);
    normal nworld = normalize(ntransform("current", "world", N));
    point pworld = transform("current", "world", P);

    vector v = normalize(-I);
    vector dir_sample = 0;

    uniform float Samples = 16;

    float Theta[Samples * Samples];
    float Phi[Samples * Samples];

    // number of bands
    uniform float Lbands = 3;

    uniform float n_coeffs = (Lbands + 1) * (Lbands + 1);
    float coeffs[n_coeffs];

    float fSample[Samples * Samples];
    float i = 0, j = 0, hits = 0, l = 0, m = 0;

    // access these with [i]
    float l_array[n_coeffs];
    float m_array[n_coeffs];

    // flatten out our array
    for (l = 0; l <= Lbands; l += 1) {
        for (m = -l; m <= l; m += 1) {
            float index = l * (l + 1) + m;
            l_array[index] = l;
            m_array[index] = m;
        }
    }

    i = 0;
    hits = 0;
    gather("illuminance", P, Nn, PI, Samples * Samples,
           "distribution", "uniform", "bias", 0.1,
           "ray:direction", dir_sample) {
        // hit something
        fSample[i] = 0;
        hits += 1;
        i += 1;
    } else {
        // hit nothing
        dir_sample = normalize(dir_sample);
        Theta[i] = acos(dir_sample[2]);
        Phi[i] = atan(dir_sample[1], dir_sample[0]);
        fSample[i] = 1;
        i += 1;
    }

    Ci = 1 - hits / (Samples * Samples);

    uniform float n_index = 0;

    float factor = 4 * PI / (Samples * Samples);

    for (n_index = 0; n_index < n_coeffs; n_index += 1) {
        coeffs[n_index] = 0;
        for (i = 0; i < Samples * Samples; i += 1) {
            if (fSample[i] > 0)
                coeffs[n_index] += shadeop_SH(l_array[n_index], m_array[n_index],
                                              Theta[i], Phi[i]);
        }
        coeffs[n_index] *= factor;
    }

    if (Filename != "") {
        bake3d(Filename,
               "coeff_00,coeff_01,coeff_02,coeff_03,coeff_04,coeff_05,coeff_06,coeff_07,coeff_08,coeff_09,coeff_10,coeff_11,coeff_12,coeff_13,coeff_14,coeff_15",
               P, Nn,
               "coeff_00", coeffs[0],
               "coeff_01", coeffs[1],
               "coeff_02", coeffs[2],
               "coeff_03", coeffs[3],
               "coeff_04", coeffs[4],
               "coeff_05", coeffs[5],
               "coeff_06", coeffs[6],
               "coeff_07", coeffs[7],
               "coeff_08", coeffs[8],
               "coeff_09", coeffs[9],
               "coeff_10", coeffs[10],
               "coeff_11", coeffs[11],
               "coeff_12", coeffs[12],
               "coeff_13", coeffs[13],
               "coeff_14", coeffs[14],
               "coeff_15", coeffs[15],
               "interpolate", 1);
    }

    Ci = 1 - hits / (Samples * Samples);
}


C.2 shRelight.sl

Listing C.2: shRelight.sl

/*
  reads sh coefficients from a point cloud, then relights the scene
*/
plugin "/CINE/_GlobalScripts/RenderMan/GlobalRendermanShaders/shadeop_sh";

float getSHOcclusion(string Filename; vector Ln)
{
    float coeff[16] = {0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0};
    extern normal N;
    extern point P;

    normal Nn = normalize(N);
    texture3d(Filename, P, Nn,
              "coeff_00", coeff[0],
              "coeff_01", coeff[1],
              "coeff_02", coeff[2],
              "coeff_03", coeff[3],
              "coeff_04", coeff[4],
              "coeff_05", coeff[5],
              "coeff_06", coeff[6],
              "coeff_07", coeff[7],
              "coeff_08", coeff[8],
              "coeff_09", coeff[9],
              "coeff_10", coeff[10],
              "coeff_11", coeff[11],
              "coeff_12", coeff[12],
              "coeff_13", coeff[13],
              "coeff_14", coeff[14],
              "coeff_15", coeff[15]);
    return shOcclusion(coeff, Ln);
}

surface shRelight(string InPointCloud = "";
                  string OutPointCloud = "")
{
    normal Nn = normalize(N);
    vector mylight = 0;

    illuminance(P, Nn, PI/2) {
        mylight = normalize(L);
    }

    vector __Ln; // normalized light-vector

    matrix bakeCam = getWorld2Eye(InPointCloud);
    __Ln = vtransform("world", mylight);
    __Ln = vtransform(bakeCam, __Ln);
    __Ln = normalize(__Ln);

    float unoccluded = getSHOcclusion(InPointCloud, __Ln);

    Ci = unoccluded;

    bake3d(OutPointCloud, "unoccluded",
           P, Nn,
           "unoccluded", unoccluded,
           "interpolate", 1);
}


C.3 shLight.sl

Listing C.3: shLight.sl

plugin "/CINE/_GlobalScripts/RenderMan/GlobalRendermanShaders/shadeop_sh";

float getSHOcclusion(string Filename; float shFilterRadius)
{
    float unoccluded = 0;
    extern normal N;

    normal Nn = normalize(N);

    texture3d(Filename, Ps, Nn,
              "filterradius", shFilterRadius,
              "lerp", 1,
              "unoccluded", unoccluded);
    return unoccluded;
}

light shLight(
    /* Basic intensity and color of the light */
    float intensity = 1;
    color lightcolor = 1;
    color shadowcolor = 0;
    float width = .5, height = .5, wedge = .1, hedge = .1;

    /* Spherical Harmonics Occlusion */
    uniform float doSphericalHarmonics = 1;
    uniform float sphericalHarmonicsMult = 1;
    uniform string shPointCloud = "[RMSExpression::passinfo rmanMakeSHPass filename]";
    uniform float shFilterRadius = 0.5;

    /* Miscellaneous controls */
    float nonspecular = 0;
    float nondiffuse = 0;
    output varying float __nonspecular = 0;
    output varying float __nondiffuse = 0;
    output float __foglight = 1;
)
{
    __nonspecular = nonspecular;
    __nondiffuse = nondiffuse;
    color __shade_;
    point PL = transform("shader", Ps);

    /* For PRMan, we've gotta do it the hard way */
    point from = point "shader" (0,0,0);
    point _from = transform("shader", from);
    vector axis = normalize(vector "shader" (0,0,1));

    /* Spot Light */
    uniform float angle;
    uniform float maxradius = 1.4142136 * max(height + hedge, width + wedge);
    angle = atan(maxradius);

    illuminate(from, axis, angle) {
        color lcol = lightcolor;

        float unoccluded = 1;

        if (doSphericalHarmonics != 0) {
            unoccluded *= (getSHOcclusion(shPointCloud, shFilterRadius)
                           * sphericalHarmonicsMult);
        }

        lcol = mix(shadowcolor, lcol, unoccluded);

        __nonspecular = 1 - unoccluded * (1 - __nonspecular);
        __nondiffuse = 1 - unoccluded * (1 - __nondiffuse);

        Cl = intensity * lcol;
    }
}


C.4 shadeop_sh.cpp

Listing C.4: shadeop_sh.cpp

/*
 * relight sh-shadeop
 */
#include <stdio.h>
#include <stdlib.h>
#include <string>
#include <math.h>
#include "RslPlugin.h"
#include "rx.h"
#include "pointcloud.h"

#include <vector>

using namespace std;

float poly(int l, int m, float x)
{
    // evaluate an Associated Legendre Polynomial P(l,m,x) at x
    float pmm = 1.0;

    if (m > 0) {
        float somx2 = sqrt((1-x) * (1+x));
        float fact = 1.0;
        for (int i = 1; i <= m; ++i) {
            pmm *= (-fact) * somx2;
            fact += 2.0;
        }
    }

    if (l == m)
        return pmm;

    float pmmp1 = x * (2.0*m + 1.0) * pmm;

    if (l == m+1)
        return pmmp1;

    float pll = 0.0;

    for (float ll = m+2; ll <= l; ++ll) {
        pll = ((2*ll - 1) * x * pmmp1 - (ll + m - 1.0) * pmm) / (ll - m);
        pmm = pmmp1;
        pmmp1 = pll;
    }
    return pll;
}

float factorial(int num)
{
    if (num == 0)  return 1;
    if (num == 1)  return 1;
    if (num == 2)  return 2;
    if (num == 3)  return 6;
    if (num == 4)  return 24;
    if (num == 5)  return 120;
    if (num == 6)  return 720;
    if (num == 7)  return 5040;
    if (num == 8)  return 40320;
    if (num == 9)  return 362880;
    if (num == 10) return 3628800;
    if (num == 11) return 39916800;
    if (num == 12) return 479001600;
    if (num == 13) return 6227020800;
    if (num == 14) return 87178291200;
    if (num == 15) return 1307674368000;
    if (num == 16) return 20922789888000;
    // fall back to an iterative product for arguments beyond the table
    float result = 20922789888000;
    for (int i = 17; i <= num; ++i)
        result *= i;
    return result;
}

float K(int l, int m)
{
    // renormalisation constant for SH function
    float temp = ((2.0*l + 1.0) * factorial(l-m)) / (4.0 * M_PI * factorial(l+m));
    return sqrt(temp);
}

float sh(int l, int m, float theta, float phi)
{
    // return a float sample of a Spherical Harmonic basis function
    // l is the band, range [0..N]
    // m in the range [-l..l]
    // theta in the range [0..Pi]
    // phi in the range [0..2*Pi]

    float sqrt2 = sqrt(2.0);

    if (m == 0)
        return K(l,0) * poly(l,m,cos(theta));
    else if (m > 0)
        return sqrt2 * K(l,m) * cos(m*phi) * poly(l,m,cos(theta));
    else
        return sqrt2 * K(l,-m) * sin(-m*phi) * poly(l,-m,cos(theta));
}

extern "C" {

RSLEXPORT int shadeop_SH(RslContext* rslContext, int argc, const RslArg* argv[])
{
    RslFloatIter result(argv[0]);
    RslFloatIter l(argv[1]);
    RslFloatIter m(argv[2]);
    RslFloatIter theta(argv[3]);
    RslFloatIter phi(argv[4]);

    int gridsize = argv[0]->NumValues();

    for (int g = 0; g < gridsize; ++g) {
        (*result) = sh(*l, *m, *theta, *phi);
        ++l;
        ++m;
        ++phi;
        ++theta;
        ++result;
    }
    return 0;
}

RSLEXPORT int getSampleVectors(RslContext* rslContext, int argc, const RslArg* argv[])
{
    RslFloatIter result(argv[0]);
    RslFloatIter samples(argv[1]);

    RslFloatArrayIter theta(argv[2]);
    RslFloatArrayIter phi(argv[3]);

    int gridsize = argv[0]->NumValues();

    for (int g = 0; g < gridsize; ++g) {
        ++phi;
        ++theta;
        ++result;
    }
    return 0;
}

/*
 * shOcclusion(float[] coefficients, vector[] lights);
 */
RSLEXPORT int shOcclusion(RslContext* rslContext, int argc, const RslArg* argv[])
{
    RslFloatIter result(argv[0]);
    RslFloatArrayIter coeffs(argv[1]);
    RslVectorIter light(argv[2]);

    int gridsize = argv[0]->NumValues();
    float theta = 0, phi = 0;
    int L = 3, i = 0;

    for (int g = 0; g < gridsize; ++g) {
        *result = 0;
        i = 0;
        theta = acos((*light)[2]);
        phi = atan2((*light)[1], (*light)[0]);

        for (int l = 0; l <= L; ++l) {
            for (int m = -l; m <= l; ++m) {
                *result += (*coeffs)[i] * sh(l,m,theta,phi);
                ++i;
            }
        }
        ++light;
        ++result;
        ++coeffs;
    }
    return 0;
}

RSLEXPORT int getWorld2Eye(RslContext* rslContext, int argc, const RslArg* argv[])
{
    RtMatrix cameraInverse;

    RslMatrixIter result(argv[0]);
    RslStringIter fname(argv[1]);
    PtcPointCloud myfile = 0;
    myfile = PtcSafeOpenPointCloudFile(*fname);
    if (myfile != 0)
        PtcGetPointCloudInfo(myfile, "world2eye", &cameraInverse);
    int gridsize = argv[0]->NumValues();
    for (int g = 0; g < gridsize; g++) {
        for (int i = 0; i < 4; i++) {
            for (int j = 0; j < 4; j++) {
                (*result)[i][j] = cameraInverse[i][j];
            }
        }
        ++result;
    }
    PtcClosePointCloudFile(myfile);

    return 0;
}

static RslFunction myFunctions[] = {
    { "float shOcclusion(float[],vector)", shOcclusion, NULL, NULL },
    { "float shadeop_SH(float,float,float,float)", shadeop_SH, NULL, NULL },
    { "uniform matrix getWorld2Eye(uniform string)", getWorld2Eye, NULL, NULL, NULL, NULL },
    { NULL }
};

RSLEXPORT RslFunctionTable RslPublicFunctions(myFunctions);

}  // extern "C"


Appendix D

Baked Spherical Harmonic Files

D.1 Coefficients Point Cloud

(Image: a 4x4 grid of renderings, one panel per channel coeff_00 through coeff_15.)

Figure D.1: Displaying the first 16 spherical harmonic coefficients using our Hektor


D.2 Relighting Point Cloud & Brick Map

(Image: the baked point cloud, followed by brick map levels 0 through 9.)

Figure D.2: Baked illumination point cloud & brick map levels. We zoomed out slightly to show that only the required points/bricks are stored (note the missing pieces on the ground).


Appendix E

Pipeline Scripts

E.1 my_nodetemplate.rman

Listing E.1: my_nodetemplate.rman

rman "-version 1"

NodeType pass:render:RenderSH {
    # this pass produces a .ptc with baked SH coefficients
    # for pt-based SHL.
    reference Collection RequiredRenderSettings
    torattr phase {
        subtype selector
        range {
            # It is going to be used as a shadowing technique
            "Once Per Job" /Job/Preflight/Maps/Shadow
            "Every Frame" /Job/Frames/Maps/Shadow
        }
        default /Job/Frames/Maps/Shadow
    }

    # Should the surface shaders be evaluated for baking into the point cloud?
    torattr outputSurfaceShaders {
        # No, we just need geometry information; it would slow us down otherwise
        default 0
        description {}
        subtype switch
    }

    # Should the displacement shaders be evaluated for baking into the point cloud?
    torattr outputDisplacementShaders {
        # Yes, because a change of shape could affect shadowing
        default 1
        subtype switch
    }

    # When this pass is rendered, all shaders will be substituted with the given one
    # "defaultSurfaceShader" didn't work, so we use RiAttribute
    torattr defaultRiAttributesScript {
        default {RiSurface \"/CINE/_GlobalScripts/RenderMan/GlobalRendermanShaders/tx_shBake\" \"string Filename\" `rman tcl eval \"RMSExpression::passinfo this filename\"`}
    }

    # Bake into a point cloud
    torattr passExtFormat {
        default .ptc
        uistate hidden
    }

    # No need to specify an extra display
    reference NodeType {
        settings:display:PreviewNull
    }

    # also bake faces of which we don't see the front
    riattr cull:backfacing {default 0}

    # also bake faces that are hidden behind other faces
    riattr cull:hidden {default 0}

    # dicing of the grids view-dependent
    riattr dice:rasterorient {default 0}

    riopt user:shading_normalmode {default 1}

    # For baking into a point cloud we need to specify the DisplayChannels
    # before the world block (therefore RiOption)
    torattr defaultRiOptionsScript {
        default {
            RiDisplayChannel \"float coeff_00\";
            RiDisplayChannel \"float coeff_01\";
            RiDisplayChannel \"float coeff_02\";
            RiDisplayChannel \"float coeff_03\";
            RiDisplayChannel \"float coeff_04\";
            RiDisplayChannel \"float coeff_05\";
            RiDisplayChannel \"float coeff_06\";
            RiDisplayChannel \"float coeff_07\";
            RiDisplayChannel \"float coeff_08\";
            RiDisplayChannel \"float coeff_09\";
            RiDisplayChannel \"float coeff_10\";
            RiDisplayChannel \"float coeff_11\";
            RiDisplayChannel \"float coeff_12\";
            RiDisplayChannel \"float coeff_13\";
            RiDisplayChannel \"float coeff_14\";
            RiDisplayChannel \"float coeff_15\";
        }
        uistate hidden
    }

    # Shading rate for baking can be much higher than for the final image
    riattr ShadingRate {
        default 10
        description "This value should be as large as quality requirements allow"
    }

    # Self-explanatory
    riopt PixelSamples {
        default {1 1}
        subtype slider
        range {0 16 1}
    }
}

# create a brick map
# mostly copied from nodes_globalillum.rman -> MakeApproxGlobalDiffuse
NodeType pass:command:MakeSH {
    reference Collection RequiredPassSettings
    torattr phase {
        subtype selector
        range {
            "Once Per Job" /Job/Preflight/Maps/Photon
            "Every Frame" /Job/Frames/Maps/Photon
        }
        default /Job/Frames/Maps/Photon
    }

    param brickmake:maxerror {
        default .004
    }

    param brickmake:progress {
        default 2
        uistate hidden
    }

    # if the brick map is only intended for use as a
    # 3D texture -- and not intended to be used as a
    # geometric primitive for rendering -- the information
    # about which voxels have geometry in them can be
    # omitted. This can reduce the brick map file size
    param brickmake:omitgeometry {
        default 1
    }

    # This is important, because it automatically creates
    # a BakeSHLighting pass as a child and can use its output
    reference NodeType {
        pass:render:BakeSHLighting
    }

    torattr passCommand {
        default {\"\\$RMANTREE/bin/brickmake\" $CMDARGS \"[passinfo this/0 filename]\" \"[passinfo this filename]\"}
    }
}

NodeType pass:render:BakeSHLighting {
    reference Collection RequiredRenderSettings
    torattr phase {
        subtype selector
        range {
            "Once Per Job" /Job/Preflight/Maps/Photon
            "Every Frame" /Job/Frames/Maps/Photon
        }
        default /Job/Frames/Maps/Photon
    }

    # No need to specify an extra display
    reference NodeType {
        settings:display:PreviewNull
    }

    # Should the surface shaders be evaluated for baking into the point cloud?
    torattr outputSurfaceShaders {
        # No, we just need geometry information; otherwise it would
        # slow down with unnecessary calculations
        default 0
        description {}
        subtype switch
    }

    # Should the displacement shaders be evaluated for baking into the point cloud?
    torattr outputDisplacementShaders {
        # Yes, because a change of shape could affect the shadowing
        default 1
        subtype switch
    }

    # When this pass is rendered, all shaders will be substituted with the given one
    # "defaultSurfaceShader" didn't work, so we use RiAttribute
    torattr defaultRiAttributesScript {
        default {RiSurface \"/CINE/_GlobalScripts/RenderMan/GlobalRendermanShaders/mk_shRelight\" \"string InPointCloud\" `rman tcl eval \"RMSExpression::passinfo rmanRenderSHPass filename\"` \"string OutPointCloud\" `rman tcl eval \"RMSExpression::passinfo this filename\"`}
    }

    # Bake into a point cloud
    torattr passExtFormat {
        default .ptc
        uistate hidden
    }

    # also bake faces of which we don't see the front
    riattr cull:backfacing {default 0}

    # also bake faces that are hidden behind other faces
    riattr cull:hidden {default 0}

    # dicing of the grids view-dependent
    riattr dice:rasterorient {default 0}

    riopt user:shading_normalmode {default 1}

    # For baking into a point cloud we need to specify the DisplayChannels
    # before the world block (therefore RiOption)
    torattr defaultRiOptionsScript {
        default {RiDisplayChannel \"float unoccluded\";}
        uistate hidden
    }

    # Shading rate for baking can be much higher than for the final image
    riattr ShadingRate {
        default 10
        description "This value should be as large as quality requirements allow"
    }

    # Self-explanatory
    riopt PixelSamples {
        default {1 1}
        subtype slider
        range {0 16 1}
    }
}


Appendix F

Compared Renderings

On the following four pages are the renderings we showed to the participants in our survey.


Figure F.1: Real photo


Figure F.2: Ambient occlusion


Figure F.3: Area light shadows


Figure F.4: Spherical harmonic lighting


Glossary

2D: Two-dimensional

3D: Three-dimensional

Antialiasing: Preventing jagged edges in images by smoothly blending nearby pixels

AO: Ambient Occlusion

API: Application Programming Interface

BRDF: Bidirectional Reflectance Distribution Function

CG: Computer Graphics

CPU: Central Processing Unit

Dithering: An intentionally added kind of noise, to fake smoother transitions in limited color palettes

Gamma correction: Controlling the overall brightness of an image to counteract the nonlinear relation between the recorded and the humanly perceived light intensity

GI: Global Illumination

GPU: Graphics Processing Unit

IBL: Image Based Lighting

Maya: Autodesk Maya; software package extensively used in the CG industry

NPR: Non-Photorealistic Rendering

OpenGL: Open Graphics Library; 2D and 3D graphics API

Pixel filtering: The way of averaging and weighting the samples for one pixel

PRMan: Photorealistic RenderMan; RenderMan-compliant render engine developed by Pixar

ptc: RenderMan point cloud file format (*.ptc)

Reyes: Render Everything You Ever Saw; rendering method developed by Pixar [Cook et al. 1987]

RGBA: Red, Green, Blue, and Alpha color components

RI: RenderMan Interface; RenderMan's standard interface between modeling and rendering programs [Pixar 2005]

RIB: RenderMan Interface Bytestream; file format for storing scene data, readable by RenderMan-compliant renderers [Pixar 2005]

RMS: RenderMan Studio; plug-in for using Pixar's RenderMan in Maya

RSL: RenderMan Shading Language

Sample: A value or set of values at a point in time and/or space

SDK: Software Development Kit

SH: Spherical Harmonics

SHL: Spherical Harmonic Lighting

SIMD: Single Instruction, Multiple Data

Spherical Harmonics: In this thesis: Pierre Simon de Laplace's orthogonal system of spherical harmonics

VFX: Visual effects

Visual F/X: Visual effects


Bibliography

ADOBE. 1986. PostScript Language Tutorial and Cookbook, 1st ed. Addison-Wesley Longman Publishing Co., Inc., Boston, MA, USA.

AKENINE-MÖLLER, T., HAINES, E., AND HOFFMAN, N. 2008. Real-Time Rendering, 3rd ed. A. K. Peters, Ltd., Natick, MA, USA.

APODACA, A. A., AND GRITZ, L. 1999. Advanced RenderMan: Creating CGI for Motion Pictures, 1st ed. Morgan Kaufmann Publishers Inc., San Francisco, CA, USA.

AXLER, S., BOURDON, P., AND RAMEY, W. 2000. Harmonic Function Theory. Springer, New York.

BENSON, D., AND DAVIS, J. 2002. Octree Textures. ACM Trans. Graph. 21 (July), 785–790.

BIRN, J. 2005. Digital Lighting and Rendering, 2nd ed. New Riders Publishing, Thousand Oaks, CA, USA.

BLINN, J. F. 1977. Models of light reflection for computer synthesized pictures. SIGGRAPH Comput. Graph. 11 (July), 192–198.

BREDOW, R. 2002. RenderMan on Film. In SIGGRAPH Course Notes, vol. 16: RenderMan in Production.

BYERLY, W. E. 1893. An elementary treatise on Fourier's series and spherical, cylindrical, and ellipsoidal harmonics with applications to problems in mathematical physics. Ginn & Company, Boston.

CABRAL, B., MAX, N., AND SPRINGMEYER, R. 1987. Bidirectional Reflection Functions from Surface Bump Maps. SIGGRAPH Comput. Graph. 21 (August), 273–281.

CHANDRASEKHAR, S. 1960. Radiative Transfer. Dover Publications.

CHEN, H., AND LIU, X. 2008. Lighting and material of Halo 3. In ACM SIGGRAPH 2008 Classes, ACM, New York, NY, USA, SIGGRAPH '08, 1–22.

CHRISTENSEN, P. H. 2008. Point-Based Approximate Color Bleeding. Tech. rep., Pixar Animation Studios.

COHEN, M. F., WALLACE, J., AND HANRAHAN, P. 1993. Radiosity and realistic image synthesis. Academic Press Professional, Inc., San Diego, CA, USA.

COOK, R. L., AND TORRANCE, K. E. 1982. A Reflectance Model for Computer Graphics. ACM Trans. Graph. 1 (January), 7–24.

COOK, R. L., CARPENTER, L., AND CATMULL, E. 1987. The Reyes Image Rendering Architecture. SIGGRAPH Comput. Graph. 21 (August), 95–102.

CORTES, R., AND RAGHAVACHARY, S. 2008. The RenderMan Shading Language Guide. Thomson Course Technology.

COURANT, R., AND HILBERT, D. 1953. Methods of Mathematical Physics, vol. 1. Interscience Publishers, Inc., New York, NY.

DEBRY, D. G., GIBBS, J., PETTY, D. D., AND ROBINS, N. 2002. Painting and Rendering Textures on Unparameterized Models. ACM Trans. Graph. 21 (July), 763–768.

DEMPSKI, K., AND VIALE, E. 2004. Advanced Lighting and Materials With Shaders. Wordware Publishing Inc., Plano, TX, USA.

DICKMAN, S. R. 1989. A complete spherical harmonic approach to luni-solar tides. Geophysical Journal International 99, 3, 457–468.

DOBASHI, Y., KANEDA, K., NAKATANI, H., YAMASHITA, H., AND NISHITA, T. 1995. A Quick Rendering Method Using Basis Functions for Interactive Lighting Design. In Computer Graphics Forum, 229–240.

FERNANDO, R., AND KILGARD, M. J. 2003. The Cg Tutorial: The Definitive Guide to Programmable Real-Time Graphics. Addison-Wesley Longman Publishing Co., Inc., Boston, MA, USA.

FERRERS, N. M. 1877. An elementary treatise on spherical harmonics and subjects connected with them. Macmillan & Co., London.

GÉNEVAUX, O., LARUE, F., AND DISCHLER, J.-M. 2006. Interactive refraction on complex static geometry using spherical harmonics. In Proceedings of the 2006 Symposium on Interactive 3D Graphics and Games, ACM, New York, NY, USA, I3D '06, 145–152.

GREEN, R. 2003. Spherical Harmonic Lighting: The Gritty Details. Archives of the Game Developers Conference (March).

GROSS, M., AND PFISTER, H. 2007. Point-Based Graphics (The Morgan Kaufmann Series in Computer Graphics). Morgan Kaufmann Publishers Inc., San Francisco, CA, USA.

HANRAHAN, P., AND LAWSON, J. 1990. A language for shading and lighting calculations. SIGGRAPH Comput. Graph. 24 (September), 289–298.

HEINE, E. 1861. Handbuch der Kugelfunctionen. Cornell University Library historical math monographs. Georg Reimer, Berlin.

ISHIMARU, A. 1978. Wave propagation and scattering in random media. Academic Press, New York.

JENSEN, H. W. 1996. Global illumination using photon maps. In Proceedings of the Eurographics Workshop on Rendering Techniques '96, Springer-Verlag, London, UK, 21–30.

KAJIYA, J. T., AND VON HERZEN, B. P. 1984. Ray tracing volume densities. SIGGRAPH Comput. Graph. 18, 3, 165–174.

KAJIYA, J. T. 1986. The rendering equation. SIGGRAPH Comput. Graph. 20 (August), 143–150.

KAPLANYAN, A. 2009. Light Propagation Volumes in CryEngine 3.

KAUTZ, J., SLOAN, P.-P., AND SNYDER, J. 2002. Fast, arbitrary BRDF shading for low-frequency lighting using spherical harmonics. In Proceedings of the 13th Eurographics Workshop on Rendering, Eurographics Association, Aire-la-Ville, Switzerland, EGRW '02, 291–296.

KAUTZ, J., SLOAN, P.-P., AND LEHTINEN, J. 2005. Precomputed radiance transfer: theory and practice. In ACM SIGGRAPH 2005 Courses, ACM, New York, NY, USA, SIGGRAPH '05.

LAMBERT, J. H. 1760. Photometria sive de mensura et gradibus luminis, colorum et umbrae. Eberhard Klett.

LANDIS, H. 2002. Production-Ready Global Illumination. In SIGGRAPH Course Notes, vol. 16: RenderMan in Production.

LEHTINEN, J., AND KAUTZ, J. 2003. Matrix radiance transfer. In Proceedings of the 2003 Symposium on Interactive 3D Graphics, ACM, New York, NY, USA, I3D '03, 59–64.

LEMOINE, F. G., KENYON, S. C., FACTOR, J. K., TRIMMER, R., PAVLIS, N. K., CHINN, D. S., COX, C. M., KLOSKO, S. M., LUTHCKE, S. B., TORRENCE, M. H., WANG, Y. M., WILLIAMSON, R. G., PAVLIS, E. C., RAPP, R. H., AND OLSON, T. R. 1998. The NASA GSFC and NIMA Joint Geopotential Model.

LEPAGE, G. P. 1978. A new algorithm for adaptive multidimensional integration. Journal of Computational Physics 27, 2 (May), 192–203.

LOKOVIC, T., AND VEACH, E. 2000. Deep shadow maps. In Proceedings of the 27th Annual Conference on Computer Graphics and Interactive Techniques, ACM Press/Addison-Wesley Publishing Co., New York, NY, USA, SIGGRAPH '00, 385–392.

MACROBERT, T. M. 1948. Spherical Harmonics: An Elementary Treatise on Harmonic Functions with Applications, 2nd ed. Methuen & Co. Ltd., London.

MÉNDEZ-FELIU, À., AND SBERT, M. 2009. From obscurances to ambient occlusion: A survey. Vis. Comput. 25 (January), 181–196.

MILLER, G. 1994. Efficient algorithms for local and global accessibility shading. In Proceedings of the 21st Annual Conference on Computer Graphics and Interactive Techniques, ACM, New York, NY, USA, SIGGRAPH '94, 319–326.

MOON, J. T. 2010. Rendering Multiple Scattering in Hair and Other Discrete Random Media. PhD thesis, Cornell University.

MUCKLOW, G. H. 1980. Automatic satellite tracking system for the NASA Satellite Photometric Observatory.

NG, R., RAMAMOORTHI, R., AND HANRAHAN, P. 2003. All-frequency shadows using non-linear wavelet lighting approximation. ACM Trans. Graph. 22 (July), 376–381.

NICODEMUS, F. E. 1965. Directional Reflectance and Emissivity of an Opaque Surface. Applied Optics 4, 7, 767–773.

OAT, C. 2004. Irradiance Volumes for Games.

PANTALEONI, J., FASCIONE, L., HILL, M., AND AILA, T. 2010. PantaRay: fast ray-traced occlusion caching of massive scenes. ACM Trans. Graph. 29 (July), 37:1–37:10.

PEC, K., MARTINEC, Z., AND BURŠA, M. 1982. Expansion of geoid heights into a spherical harmonic series. Studia Geophysica et Geodaetica 26, 115–119.

PHARR, M., AND HUMPHREYS, G. 2004. Physically Based Rendering: From Theory to Implementation. Morgan Kaufmann Publishers Inc., San Francisco, CA, USA.

PHONG, B. T. 1975. Illumination for computer generated pictures. Commun. ACM 18 (June), 311–317.

PIXAR. 2005. The RenderMan Interface, 3.2.1 ed., November. RenderMan Specification.

PIXAR. 2010. RenderMan Pro Server 15.0 Documentation.

PRATT, J. H. 1865. A treatise on attractions, Laplace's functions and the figure of the earth. Macmillan & Co., London.

PRESS, W. H., FLANNERY, B. P., TEUKOLSKY, S. A., AND VETTERLING, W. T. 1988. Numerical Recipes in C: The Art of Scientific Computing. Cambridge University Press, New York, NY, USA.

RAGHAVACHARY, S. 2004. Rendering for Beginners: Image Synthesis Using RenderMan. Focal Press.

RAMAMOORTHI, R., AND HANRAHAN, P. 2001. An Efficient Representation for Irradiance Environment Maps. In Proceedings of the 28th Annual Conference on Computer Graphics and Interactive Techniques, ACM Press, New York, NY, USA, SIGGRAPH '01, 497–500.

RAMAMOORTHI, R., AND HANRAHAN, P. 2002. Frequency space environment map rendering. ACM Trans. Graph. 21 (July), 517–526.

REN, Z., WANG, R., SNYDER, J., ZHOU, K., LIU, X., SUN, B., SLOAN, P.-P., BAO, H., PENG, Q., AND GUO, B. 2006. Real-time soft shadows in dynamic scenes using spherical harmonic exponentiation. ACM Trans. Graph. 25 (July), 977–986.

SCHÖNEFELD, V. 2005. Spherical Harmonics.

SHANNON, C. E. 1948. A mathematical theory of communication. Bell System Technical Journal 27.

SIEGEL, R., AND HOWELL, J. R. 2002. Thermal Radiation Heat Transfer. Taylor & Francis.

SILLION, F. X., ARVO, J. R., WESTIN, S. H., AND GREENBERG, D. P. 1991. A global illumination solution for general reflectance distributions. SIGGRAPH Comput. Graph. 25 (July), 187–196.

SLOAN, P.-P., KAUTZ, J., AND SNYDER, J. 2002. Precomputed Radiance Transfer for Real-Time Rendering in Dynamic, Low-Frequency Lighting Environments. ACM Trans. Graph. 21 (July), 527–536.

SLOAN, P.-P., HALL, J., HART, J., AND SNYDER, J. 2003. Clustered principal components for precomputed radiance transfer. ACM Trans. Graph. 22 (July), 382–391.

SLOAN, P.-P. 2008. Stupid Spherical Harmonics (SH) Tricks.

STEPHENSON, I. J. 2002. Essential RenderMan Fast. Springer-Verlag New York, Inc., Secaucus, NJ, USA.

STRUTZ, T. 2009. Bilddatenkompression: Grundlagen, Codierung, Wavelets, JPEG, MPEG, H.264, 4th revised and extended ed. Vieweg, Wiesbaden.

SUFFERN, K. 2007. Ray Tracing from the Ground Up. A. K. Peters, Ltd., Natick, MA, USA.

TODHUNTER, I. 1875. An elementary treatise on Laplace's functions, Lamé's functions, and Bessel's functions. Macmillan & Co., London.

TSAI, Y.-T., AND SHIH, Z.-C. 2006. All-frequency precomputed radiance transfer using spherical radial basis functions and clustered tensor approximation. ACM Trans. Graph. 25 (July), 967–976.

UPSTILL, S. 1989. The RenderMan Companion: A Programmer's Guide to Realistic Computer Graphics. Addison-Wesley Longman Publishing Co., Inc., Boston, MA, USA.

WEINZIERL, S. 2000. Introduction to Monte Carlo Methods, June.

WESTIN, S. H., ARVO, J. R., AND TORRANCE, K. E. 1992. Predicting reflectance functions from complex surfaces. SIGGRAPH Comput. Graph. 26 (July), 255–264.

WHITTED, T. 1980. An improved illumination model for shaded display. Commun. ACM 23 (June), 343–349.

WILLIAMS, L. 1978. Casting curved shadows on curved surfaces. SIGGRAPH Comput. Graph. 12 (August), 270–274.

WONG, T.-T., HENG, P.-A., OR, S.-H., AND NG, W.-Y. 1997. Image-based Rendering with Controllable Illumination. In Proceedings of the Eurographics Workshop on Rendering Techniques '97, Springer-Verlag, London, UK, 13–22.

WONG, T.-T., LUK, W.-S., AND HENG, P.-A. 1997. Sampling with Hammersley and Halton Points. J. Graph. Tools 2 (November), 9–24.

WONG, T.-T., FU, C.-W., HENG, P.-A., AND LEUNG, C.-S. 2002. The Plenoptic Illumination Function. IEEE Transactions on Multimedia 4, 361–371.

WONG, T.-T., OR, S.-H., AND FU, C.-W. 2003. Real-time relighting of compressed panoramas. Charles River Media, Inc., Rockland, MA, USA, 375–388.

WYLIE, C., ROMNEY, G., EVANS, D., AND ERDAHL, A. 1967. Half-tone perspective drawings by computer. In Proceedings of the November 14-16, 1967, Fall Joint Computer Conference, ACM, New York, NY, USA, AFIPS '67 (Fall), 49–58.

ZHOU, K., HU, Y., LIN, S., GUO, B., AND SHUM, H.-Y. 2005. Precomputed shadow fields for dynamic scenes. ACM Trans. Graph. 24 (July), 1196–1201.

ZHUKOV, S., IONES, A., AND KRONIN, G. 1998. An Ambient Light Illumination Model. In Proceedings of Eurographics Rendering Workshop '98, Springer-Verlag Wien New York, G. Drettakis and N. Max, Eds., Eurographics, 45–56.
