
CW208-3/4-2010/11, Chp 2: Lec 3

Performance & Optimisation techniques

Programming for Games Devices


2010/11 Year Plan


Course Structure

Android Part I
- Introduction
- Building basic Android applications
- An open source framework for 2D games
- Game Project walkthrough

Android Part II
- Optimisation techniques
- 3D Graphics with OpenGL ES


Agenda

- Introduction
- Measuring performance
  - Traceview
  - Caliper
- Optimising
  - High-level
  - Low-level


Measuring Performance

Traceview is a graphical tool that depicts the execution activity of an Android application.

It is useful for debugging and profiling. Profiling helps identify where the bottlenecks are in an application.

Traceview is part of the Android SDK and is located in the folder: android-sdk-windows\tools


Traceview generates a log file containing the trace information to be analysed.

The android.os.Debug class must be imported into the application.

The Debug.startMethodTracing() and Debug.stopMethodTracing() methods are used to start and stop the logging of trace information to disk.


Usually startMethodTracing() is called inside an Activity's onCreate() method and stopMethodTracing() is called in that Activity's onDestroy() method.
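For a plain (non-AndEngine) Activity, a minimal sketch might look like the following; the class name ProfiledActivity is just an illustrative placeholder, and "calc" is the trace-file name used throughout these slides:

import android.app.Activity;
import android.os.Bundle;
import android.os.Debug;

public class ProfiledActivity extends Activity {

    @Override
    protected void onCreate(Bundle savedInstanceState) {
        super.onCreate(savedInstanceState);
        // Start writing trace data to the file "calc.trace"
        Debug.startMethodTracing("calc");
    }

    @Override
    protected void onDestroy() {
        // Flush and close the trace file
        Debug.stopMethodTracing();
        super.onDestroy();
    }
}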

For AndEngine apps, use the onLoadComplete() method to start tracing and onUnloadResources() to stop tracing:


@Override
public void onLoadComplete() {
    Debug.startMethodTracing("calc");
}

@Override
public void onUnloadResources() {
    Debug.stopMethodTracing();
}

The string "calc" is the name of the trace file that is created (i.e. calc.trace). When using the Android emulator, the trace file is written to an SD card image – this must be set up beforehand (more on this shortly).


However, a more convenient option is to simply derive your andengine app from the BaseExample class (see the SpriteExample in the andengine examples)

This allows us to toggle tracing on and off through the emulator ‘Menu’ button

For example:


(Screenshot) Pressing the 'Menu' button reveals a command button to start Method Tracing. Pressing 'Menu' again reveals the 'Stop Method Tracing' button.


BEFORE initiating a trace in your application you MUST do the following:

1. Create an SD card image
For example, to create an image named "imgcd", issue the following command from your android-sdk-windows\tools folder:

> mksdcard 1024M ./imgcd

This creates a 1GB file called imgcd in the android-sdk-windows\tools folder. This file will be mounted on the emulator file system and mapped to the sdcard folder (on the emulator).


2. Mount the SD card image
To mount or bind the SD card image to the emulator, we can do so via Eclipse before launching the Activity:
2.1 Choose Run -> Run Configurations...
2.2 Choose the 'Target' tab
2.3 In the 'Additional Emulator Command Line Options' box (bottom of the window), enter:

-sdcard \the-path-to-sd-card-file\my-sdcard-file

(see screenshot on next slide)


(Screenshot) In this example, the path to my SD card image file is C:\Users\imgcd


Finally, open AndroidManifest.xml and add the following line:

<uses-permission android:name="android.permission.WRITE_EXTERNAL_STORAGE" />

(see AndroidManifest.xml in the andengine examples)

Otherwise the activity will not be able to write to the SD card!


Troubleshooting
If the Activity will not launch, first remove the -sdcard parameter we added to the Run Configuration in the previous slides.

We will manually mount the SD card from the command line instead.

Add the android-sdk-windows\tools folder to your PATH environment variable (for convenience).


Manually mounting the SD card...
Open a command prompt and issue the following command from the location of your SD card image:

emulator -avd AVDName -sdcard ./imgcd

where AVDName is the name you have assigned to your emulator (in Eclipse, open Window -> Android SDK and AVD Manager to view the name).


Now you can launch the application as usual.

Afterwards you can view the trace file on the emulator (do not close the emulator).

From the android-sdk-windows\platform-tools folder give the following command:

adb shell


This opens a shell prompt. Give the Linux command to see a directory listing:

# ls

Note the sdcard folder. The command

# ls sdcard

will show its contents – note carefully the name of the trace file. Now type CTRL+D to exit the shell.


We need to retrieve the trace file from the emulator and copy it to the local file system.

In the android-sdk-windows\platform-tools folder, make a subfolder called tracefile (i.e. android-sdk-windows\platform-tools\tracefile).

Depending on your Android SDK version, the platform-tools folder might not exist. In this case, use the android-sdk-windows\tools folder instead.


From the android-sdk-windows\platform-tools folder give the following command:

adb pull /sdcard/calc.trace .\tracefile

This copies the file calc.trace from the emulator and places it in the android-sdk-windows\platform-tools\tracefile folder


The final step is to view trace files using the Traceview tool

Traceview is a traditional Java app and is launched using the JDK

First make sure the JDK is in your system path, e.g. Open a command prompt and type java -version


If Windows complains that java is an unrecognized command you must add it to the PATH environment variable

For example, the JDK folder might be located at:

C:\Program Files (x86)\Java\jdk1.6.0_23\bin;

This directory must be added to the PATH


To run Traceview and view the trace files, go to the android-sdk-windows\tools folder and enter:

traceview <full-path-to-trace-file>

E.g.

traceview ../platform-tools/tracefile/calc


Traceview has two panels – a Timeline panel and a Profile panel.

We will focus on the Profile panel.

To learn more about Traceview, visit:
http://developer.android.com/guide/developing/tools/traceview.html#timelinepanel


The Profile panel shows a summary of the time spent in each method.

The table shows inclusive and exclusive times (as well as the percentage of the total time).

Exclusive time is the time spent in the method itself.

Inclusive time is the time spent in the method plus the time spent in any called functions.

For example, if a method runs for 10 ms in total and 7 ms of that is spent inside methods it calls, its inclusive time is 10 ms and its exclusive time is 3 ms.


Next we consider a Traceview capture from the SpaceInvaders project.

Note that Traceview shows the most time-consuming methods first.

Only the first two lines are shown next:


The top-level routine consumes 100% of the time including all called methods, but only uses 0.2% of the total time itself (excluding called methods).

onTickUpdate() consumes 63.6% of the available time including all called methods.


We can expand each method to see what other methods it calls and what percentage of time they are consuming.

In the case of AndEngine we will see engine methods listed initially – we must drill down into these to make sense of what is going on.

Let's "follow the money trail" and see what methods onTickUpdate() calls:


Expanding onUpdate() gives us...

Expanding updateUpdateHandlers() gives us...


Finally, expanding UpdateHandlerList.onUpdate() tells us that CollisionHandler.onUpdate() is consuming a lot of cycles!


Returning to Eclipse, a bit of detective work is now required.

Open the CollisionManager class in the AndroidInvadersActivity class.

Note the method:

@Override
public boolean onCollision(IShape pCheckShape, IShape pTargetShape) {


Right click on this method and choose ‘Open Call Hierarchy’. This tells us...

...that CollisionHandler.onUpdate() is the engine method that invokes our callback method onCollision().


Traceview Conclusion
If we needed to optimise Space Invaders, we would start here.

One possible solution is to not use this callback at all and instead use an alternative, cheaper method for collision checking between the enemies and the player ship (a rough sketch follows below).
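As a hedged illustration (this is not AndEngine's CollisionHandler API; the names below are purely illustrative), such a cheaper check could be a single axis-aligned bounding-box test per enemy, run once per frame from the game's own update code:

// Illustrative only: a cheap axis-aligned bounding-box (AABB) overlap test
// that a game loop could call once per frame for each enemy, instead of
// registering a CollisionHandler callback per sprite.
public final class SimpleCollisionCheck {

    // True when two axis-aligned rectangles (x, y, width, height) overlap.
    static boolean overlaps(float ax, float ay, float aw, float ah,
                            float bx, float by, float bw, float bh) {
        return ax < bx + bw && bx < ax + aw
            && ay < by + bh && by < ay + ah;
    }

    public static void main(String[] args) {
        // Player ship at (100, 400), size 48x32; enemy at (120, 410), size 40x24.
        boolean hit = overlaps(100, 400, 48, 32, 120, 410, 40, 24);
        System.out.println("Collision this frame: " + hit); // prints true
    }
}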

Some interesting real-world stories about Traceview can be found here:

(http://android-developers.blogspot.com/2010/10/traceview-war-story.html?utm_source=feedburner&utm_medium=feed&utm_campaign=Feed:+blogspot/hsDu+%28Android+Developers+Blog%29)


Traceview tech note
Traceview currently disables the JIT (the just-in-time compiler), which may cause it to misattribute time to code that the JIT would be able to win back.

After profiling and optimising, make sure the resulting code actually runs faster when run without Traceview.


Microbenchmarking

Microbenchmarking looks at small snippets of code and returns timing results.

Microbenchmarking is not the same thing as profiling: profiling looks at the entire application.

Microbenchmarking returns timing results that may not be reliable. There are many reasons... here are three (a hand-rolled example of the kind of measurement in question follows after the list):


1. The results you see are valid only for the particular hardware, OS and JRE it was run on; one small change to any of these, and things can be drastically different

2. Your code might not suffer nearly as many cache misses when it's running inside a microbenchmark as it does in real life

3. Differing circumstances can affect the many layers of abstraction that lie below even the machine code, in unpredictable and uncontrollable ways
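For concreteness, here is a minimal hand-rolled microbenchmark (a sketch, not from the lecture code) of the kind these caveats apply to; frameworks such as Caliper exist precisely to handle warm-up and repetition more carefully:

// A naive microbenchmark using System.nanoTime(). Its numbers are only a
// rough indicator, for the reasons listed above (JIT warm-up, caches,
// hardware/OS differences).
public final class NaiveMicrobenchmark {

    static long sumOfSquares(int n) {
        long total = 0;
        for (int i = 0; i < n; i++) {
            total += (long) i * i;
        }
        return total;
    }

    public static void main(String[] args) {
        // Warm-up pass so the JIT has a chance to compile the method first.
        sumOfSquares(1000000);

        long start = System.nanoTime();
        long result = sumOfSquares(1000000);
        long elapsedMicros = (System.nanoTime() - start) / 1000;

        System.out.println("result=" + result + ", elapsed=" + elapsedMicros + " microseconds");
    }
}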


Why microbenchmark?
It is still a good idea to seek to minimize the overall "amount of work" that your code needs to perform.

E.g. break out of a loop early, or do something in O(log n) instead of O(n) (see the sketch below).

Do not obsess over whether to loop backwards or forwards, caching a field value in a local variable, and other such tricks.
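As an illustration (not from the lecture code) of reducing the amount of work, compare checking whether a sorted array contains a value with a linear scan versus a binary search:

import java.util.Arrays;

// Reducing the overall "amount of work": membership test on a sorted array.
public final class LookupDemo {

    // O(n): inspects elements one by one, but can break out early.
    static boolean linearContains(int[] sorted, int key) {
        for (int value : sorted) {
            if (value == key) {
                return true;   // found it, stop scanning
            }
            if (value > key) {
                return false;  // sorted input lets us stop early on a miss too
            }
        }
        return false;
    }

    // O(log n): halves the remaining search space on every step.
    static boolean binaryContains(int[] sorted, int key) {
        return Arrays.binarySearch(sorted, key) >= 0;
    }

    public static void main(String[] args) {
        int[] scores = {3, 8, 15, 23, 42, 57, 91};
        System.out.println(linearContains(scores, 42)); // true
        System.out.println(binaryContains(scores, 57)); // true
    }
}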


Caliper
Caliper is an open-source framework for writing, running and viewing the results of Java microbenchmarks.

A microbenchmark attempts to measure the performance of a "small" bit of code (typically in the sub-ms range).

Usually the code performs no I/O.


Unfortunately it is not straightforward to integrate Caliper with AndEngine projects.

You can read further about Caliper here:

(http://developer.android.com/guide/practices/design/performance.html)

There is a benchmarking class as part of the AndEngine library which reports the FPS of an AndEngine project.

Now we consider how to use it...


The BaseBenchmark class
This class enables the recording of FPS benchmarks in a derived class.

Usage...

Step 1) Subclass your AndEngine main activity class from BaseBenchmark, e.g.

public class AndroidInvadersActivity extends BaseBenchmark {


Step 2) Implement the following methods:

@Override
protected int getBenchmarkID() {
    return 0;
}

@Override
protected float getBenchmarkStartOffset() {
    return 2;
}

@Override
protected float getBenchmarkDuration() {
    return 10;
}

getBenchmarkID() simply returns an ID for your benchmark (it is used if the results are submitted online).
getBenchmarkStartOffset() returns a delay in seconds before the benchmarking should begin.
getBenchmarkDuration() returns the duration in seconds that the benchmarking should run for.

A sketch combining Steps 1 and 2 follows below.
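Putting Steps 1 and 2 together, a minimal sketch of the resulting class (the other engine life-cycle overrides that a real activity needs, such as the onLoadComplete() method shown earlier, are omitted for brevity):

public class AndroidInvadersActivity extends BaseBenchmark {

    @Override
    protected int getBenchmarkID() {
        return 0;        // ID used if the results are submitted online
    }

    @Override
    protected float getBenchmarkStartOffset() {
        return 2;        // wait 2 seconds before recording FPS
    }

    @Override
    protected float getBenchmarkDuration() {
        return 10;       // record FPS for 10 seconds
    }

    // ... remaining life-cycle methods as in the original activity ...
}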


Note that an Android handset will typically report much higher FPS than the emulator.

This approach of measuring FPS should be taken as a general indicator of trends, letting us answer questions such as:

"Is my new algorithm design an improvement on its previous incarnation in terms of performance?"

rather than as a source of hard numbers, which may not be 100% reliable.