
Strategies for Efficient Authoring of Content in
Middle-earth: Shadow of Mordor by Doug Heimer
Author: Anton Klinger
February 20, 2017
Tags: reducing development overhead, profiling, asset management, asset pipeline optimization, automated build deployment, file formats
Track: GDC 2015 - Programming
Url: http://www.gdcvault.com/play/1022415/Strategies-for-Efficient-Authoring-of
Speaker: Doug Heimer, Monolith Productions
Abstract
Games are getting bigger each year, with increasing asset file sizes and counts. This increases
development overhead and decreases the time actually spent developing and iterating on game
content.
By profiling and optimizing the asset pipeline, development overhead can be reduced, allowing
developers to spend their time on developing the game instead.
1 Summary of Talk
Doug Heimer explains how Monolith Productions improved their process of authoring content for
their game "Middle-earth: Shadow of Mordor". He refers to development overhead as "damage
over time" to a project, as quickly iterating on content is important for efficient development.
They reduced developers' waiting times by reducing load times, processing assets in parallel,
and structuring their assets in a scalable format. Monolith develops all their tools and their engine
in-house, so they can adapt them to their needs.
Still, Doug suggests that everyone should analyze and identify their bottlenecks and focus on
quick wins that can be implemented quickly and have the highest return on investment.
1.1 Reducing Sync Time
1.1.1 Nightly Builds
Updated assets need to be pre-processed before they can be used in the build. Monolith uses a
nightly build job that automatically processes all assets and provides an installable build of the
game.
They also considered automatically deploying the newest version to everyone's computer, but
decided against it because of too many potential issues, for example when dependencies are
missing or the binary is still running.
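A nightly asset-processing job of the kind described above could be sketched as follows. This is a minimal stand-in, not Monolith's actual build system: the `.asset` extension, the directory layout, and the copy step (in place of a real per-asset converter) are all assumptions.

```python
from pathlib import Path

def nightly_build(asset_root: str, output_dir: str) -> list:
    """Process every asset under asset_root into output_dir.

    The byte-for-byte copy stands in for the real conversion step a
    pipeline would run per asset type.
    """
    out = Path(output_dir)
    out.mkdir(parents=True, exist_ok=True)
    processed = []
    for asset in sorted(Path(asset_root).rglob("*.asset")):
        # A real job would invoke the engine's converter here instead.
        (out / asset.name).write_bytes(asset.read_bytes())
        processed.append(asset.name)
    return processed
```

Run nightly (e.g. by a scheduler), this yields a fully processed, installable build each morning without blocking anyone's workstation during the day.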
1.1.2 Deterministic File Format
A deterministic file format reduces the amount of data that needs to be synced, because assets
are only registered as modified when their content has actually changed. This is achieved by
avoiding storing timestamps, unseeded GUIDs, or absolute paths in assets.
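The idea can be illustrated with a small sketch: GUIDs are derived from the asset's project-relative path rather than generated randomly, keys are sorted, and no timestamps are written, so serializing the same content always produces identical bytes. The function names and the JSON carrier format are illustrative assumptions, not the talk's actual format.

```python
import json
import uuid

def stable_guid(asset_path: str) -> str:
    """Derive the GUID from the project-relative path (a seeded,
    name-based UUID) instead of a random one, so re-exporting the
    same asset yields the same identifier."""
    return str(uuid.uuid5(uuid.NAMESPACE_URL, asset_path))

def serialize_asset(name: str, properties: dict) -> bytes:
    """Serialize with sorted keys, no timestamps, and only relative
    paths, so the bytes change only when the content does."""
    record = {
        "id": stable_guid(name),  # deterministic, not uuid.uuid4()
        "name": name,             # project-relative, never absolute
        "properties": properties,
    }
    return json.dumps(record, sort_keys=True).encode("utf-8")
```

Because two exports of unchanged content are byte-identical, the version-control and sync layer can skip them entirely.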
1.2 Reducing Load Times
1.2.1 Fast File Formats
The file format of assets deserves special consideration. It can make sense to sacrifice
human-readability to reduce file size and load times.
Monolith tried to replace their XML-like format with a compressed ASCII format to reduce file
size, but it actually took longer to load, because the files had to be decompressed in addition to
being parsed. A binary serialization that can be loaded directly into memory proved to be the
fastest and smallest asset format for them.
They also provided a utility for developers to convert it into a human-readable format.
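The trade-off can be demonstrated with a toy vertex buffer: a text encoding is easy to read and diff, while a binary encoding (a count followed by raw floats) is smaller and needs no parsing. This is a generic illustration of the principle, not Monolith's serialization format; the JSON variant stands in for their XML-like text format.

```python
import json
import struct

def save_text(verts):
    """Human-readable variant: easy to inspect, slow to parse."""
    return json.dumps({"vertices": verts}).encode("utf-8")

def save_binary(verts):
    """Binary variant: a little-endian count followed by raw 32-bit
    floats, loadable almost directly into memory."""
    return struct.pack("<I", len(verts)) + struct.pack(f"<{len(verts)}f", *verts)

def load_binary(blob):
    """Read the count, then reinterpret the rest as floats."""
    (count,) = struct.unpack_from("<I", blob, 0)
    return list(struct.unpack_from(f"<{count}f", blob, 4))
```

A separate converter tool, as Monolith provided, restores human-readability on demand without paying its cost on every load.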
1.2.2 Load on Demand
Loading every part of an asset when a developer only wants to edit specific parts of it is a waste
of the developer's time. By splitting assets into multiple smaller files, each part can be loaded
individually and only when needed. This significantly reduces the time between starting the
application and the developer being able to work.
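Load-on-demand can be sketched as a wrapper that defers each section's load until first access. The class and the section loaders are hypothetical stand-ins for real file reads:

```python
class LazyAsset:
    """Load each section of an asset only on first access, so the
    editor starts without parsing parts the developer never opens."""

    def __init__(self, loaders):
        self._loaders = loaders  # section name -> zero-argument load function
        self._cache = {}

    def section(self, name):
        if name not in self._cache:
            self._cache[name] = self._loaders[name]()  # load on demand
        return self._cache[name]
```

Startup then costs only the sections actually opened, and repeated access is served from the cache.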
1.3 Profiling Performance
It is not possible to know ahead of time where bottlenecks will occur, so profiling is important
for identifying the pain points. Monolith runs the Windows Performance Toolkit [2] continuously
to gather data and create performance reports. They also use Windows Management
Instrumentation [3] to collect high-level hardware specifications.
They also include a "Report Problem" button in their tools. When users use it to open a
ticket, all the performance reports are automatically attached.
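As a minimal sketch of the hardware-report half of this, the snippet below collects high-level machine specs using the cross-platform standard library as a stand-in for the WMI queries the talk describes; the field names are assumptions.

```python
import os
import platform

def hardware_report() -> dict:
    """High-level machine specs to attach to a 'Report Problem'
    ticket. Stdlib stand-in for WMI; a Windows tool would query WMI
    classes such as Win32_Processor instead."""
    return {
        "machine": platform.machine(),
        "os": f"{platform.system()} {platform.release()}",
        "cpu_count": os.cpu_count(),
    }
```

Attaching such a report to every ticket lets the tools team correlate slow sessions with under-specced machines without a back-and-forth with the reporter.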
1.4 Parallelization
When applicable, parallelizing tasks can yield up to linear speedups, but a batch only finishes
as fast as its longest-running job. Parallelization also cannot compensate for a poor choice of
algorithms.
Some caveats have to be kept in mind when using parallelization techniques such as thread
pools or process pools.
The machine needs considerably more memory when running jobs in parallel than when running
them in sequence. If not enough memory is available, this leads to paging, which can slow
performance down significantly. Parallelization also adds complexity to the software, such as
shared-memory management in the case of thread pools or inter-process communication in the
case of process pools.
Process pools have the advantage over thread pools that their jobs are self-contained: when one
process crashes, the others keep running.
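A minimal pool-based batch processor might look like the sketch below. It uses a thread pool for simplicity; `process_asset` is a hypothetical stand-in for a real conversion job, and swapping in `ProcessPoolExecutor` would give the crash isolation described above at the cost of inter-process communication.

```python
from concurrent.futures import ThreadPoolExecutor

def process_asset(name: str) -> str:
    """Hypothetical stand-in for one asset-conversion job."""
    return name.upper()

def process_all(assets, workers=4):
    """Run asset jobs in parallel. A thread pool shares memory between
    jobs; a ProcessPoolExecutor would isolate crashes per process."""
    with ThreadPoolExecutor(max_workers=workers) as pool:
        # map preserves input order; the batch finishes no sooner than
        # its slowest job.
        return list(pool.map(process_asset, assets))
```

Note that with `workers=4` the pool also multiplies peak memory use by up to four, which is exactly the paging caveat mentioned above.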
1.5 Content Dependencies
When structuring your asset hierarchy, try to be as efficient as possible: reusing assets can save
a lot of work.
Monolith uses asset encapsulation to be able to change all instances of a specific asset by changing
only its template. They also use data inheritance, which lets an asset inherit all properties of a
template and define only the properties that are unique to it.
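Data inheritance of this kind can be sketched as a recursive merge over a template chain; the asset and template names below are invented examples, not content from the game.

```python
def resolve(asset, templates):
    """Merge an asset over its template chain: inherited values come
    from the template, and the asset stores only what differs."""
    props = {}
    parent = asset.get("template")
    if parent is not None:
        # Resolve the template first so the asset's own values win.
        props.update(resolve(templates[parent], templates))
    props.update(asset.get("properties", {}))
    return props
```

For example, an "orc captain" that only overrides `health` still picks up `weapon` from its template, and editing the template updates every instance at once.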
When asset hierarchies become complex, it is important to invest in visibility. Monolith made it
easy to view all dependencies of an asset as well as the other assets that depend on it.
Monolith also validates their assets automatically, as often and as thoroughly as possible, because
it is important for them to detect problems as soon as possible.
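Answering "what depends on this asset?" efficiently usually means inverting the forward dependency map once and then looking up the index. A minimal sketch, with invented asset names:

```python
from collections import defaultdict

def reverse_dependencies(deps):
    """Invert an asset -> dependencies map so the editor can answer
    'what depends on this asset?' without scanning everything."""
    rdeps = defaultdict(set)
    for asset, needed in deps.items():
        for dependency in needed:
            rdeps[dependency].add(asset)
    return dict(rdeps)
```

The same index also supports validation: before deleting or changing an asset, the tools can immediately list everything the change would affect.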
1.6 Hardware Optimization
Before improving your software, keeping the hardware up to date is usually a quick win.
Time spent syncing can be reduced by ensuring the networking equipment is up to date; some
old switches and cables cannot run at Gigabit speeds. The capabilities of the connected network
links can be determined with the Windows Performance Toolkit. Exchanging HDDs for SSDs is a
quick and easy way to reduce load times, although SSDs usually have the downside of smaller
storage capacity.
2 Overview and Relevance
As games get larger, more thought has to be given to how the sheer volume of assets can be
managed. Unless you develop your own engine and tools, as Monolith does, you will not be able
to change the fundamentals of your asset pipeline. However, profiling and analyzing inefficiencies
in development can be done even when using third-party engines; profiling and collecting user
data is possible with the Windows Performance Toolkit [2] and Windows Management
Instrumentation [3].
Automation is one of the key tools for reducing development overhead: besides improving content
authoring, automated tests can also improve development efficiency [4].
3 References and Further Sources
[1] Doug Heimer, Strategies for Efficient Authoring of Content in Middle-earth: Shadow of Mordor,
http://www.gdcvault.com/play/1022415/Strategies-for-Efficient-Authoring-of,
2015.
[2] Bruce Dawson, ETW Training Videos, https://randomascii.wordpress.com/2014/08/19/
etw-training-videos-available-now/, 2014.
[3] MSDN, Secrets of Windows Management Instrumentation: Troubleshooting and Tips (in
German), https://msdn.microsoft.com/de-de/library/dn151197.aspx, 2004.
[4] Mark Cooke, Automated Build and Test Systems for Games, http://www.gamasutra.com/
view/feature/1784/automated_build_and_test_systems_.php?print=1, 2006.