Benchmarking Unity3d scripts
As part of my ongoing involvement with Big Robot I've recently been doing a bit of work on figuring out "how best to do things" in Unity3d. Some of the output of that might turn up in a later post, but, since this is my first post in a looong time, I thought I'd break myself in gently.
Part of what I've been looking at is what I'd call "general performance issues;" that is, "how should I write a typical bit of code in the most performant way?" Rather than relying on gut feeling, I wanted some reusable code that would give me genuine results, and so I created a class that lets me profile code running in a scene at a prefab level.
The way it's intended to be used is this:
- create a scene, and add a single gameObject
- add the benchmarking component to the gameObject
- drag a prefab from your project into "Prefab Under Test" in the inspector
- run the scene (or run a build, if you want more real-world results)
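To make the setup above concrete, here's a rough sketch of what such a component's inspector surface might look like. This is purely illustrative; the field names and defaults are my own guesses, and the real code is in the Gist linked below.

```csharp
using UnityEngine;

// Hypothetical sketch of the benchmark component's inspector fields --
// names and defaults are illustrative, not necessarily those in the real script.
public class PrefabBenchmark : MonoBehaviour
{
    public GameObject prefabUnderTest;  // drag your prefab here in the inspector
    public int instanceCount = 1000;    // how many copies of the prefab to spawn
    public int frameCount = 100;        // how many frames to time once spawned
    public float startDelay = 2f;       // settle time before the test begins
}
```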
What you'll see is a long pause, and then console output with the test results; something like this:
PERFTEST: initialise 0.04875484s; run 1.685191s (59.34047 FPS). UnityEngine.Debug:Log(Object)
The test creates a large number of instances of your prefab, and measures two things: the "first-frame" time (which includes the time taken to call Start and the first Update for every instance), and the time taken to render a configurable number of further frames (which measures just your per-frame Update cost).
There's a short (configurable) delay before starting, which ensures that no other startup code is getting in the way of the benchmark, and the instance count and frame count are configurable so that you don't choke your editor on too many instances.
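The measurement itself can be sketched as a coroutine: wait out the delay, spawn the instances, time the first frame, then time a fixed number of further frames. This is a minimal sketch under assumed field names, not the actual implementation from the Gist.

```csharp
using System.Collections;
using UnityEngine;

// Minimal sketch of the measurement flow; all names here are hypothetical.
public class PrefabBenchmarkRunner : MonoBehaviour
{
    public GameObject prefabUnderTest;
    public int instanceCount = 1000;
    public int frameCount = 100;
    public float startDelay = 2f;

    IEnumerator Start()
    {
        // Let any other startup code settle before measuring.
        yield return new WaitForSeconds(startDelay);

        float t0 = Time.realtimeSinceStartup;
        for (int i = 0; i < instanceCount; i++)
            Instantiate(prefabUnderTest);
        // Waiting one frame captures every instance's Start and first Update.
        yield return null;
        float initialise = Time.realtimeSinceStartup - t0;

        float t1 = Time.realtimeSinceStartup;
        for (int i = 0; i < frameCount; i++)
            yield return null; // time a fixed number of further frames
        float run = Time.realtimeSinceStartup - t1;

        // FPS is frames divided by elapsed time, e.g. 100 / 1.685191s comes
        // out at roughly 59.34 FPS, matching the sample output above.
        Debug.Log("PERFTEST: initialise " + initialise + "s; run " + run
                  + "s (" + (frameCount / run) + " FPS)");
    }
}
```

Using Time.realtimeSinceStartup rather than Time.time keeps the measurement independent of time scaling.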
I've put this on a public Gist in case it's useful. Most likely I'll add support for testing multiple prefabs in succession, for comparative testing, but I'm not sure how much of a factor execution order might be in such a test - so for now each prefab has to be profiled as a separate run.
The code in full: