Wednesday, March 20, 2013

A focus on engines...





Cog in the wheel

Game engines are tools used to engineer and develop video games in today's world. They're built much like a framework and are invaluable in the production process of video games nowadays. This week's blog is going to peruse one of the most popular game engines out there today, the Unreal Engine. I will cover topics such as the development of Gears of War using Unreal Engine 3, what this engine brought to the table and how it does what it does so well. The information from this blog is taken from Michael Capps' presentation on Unreal Engine 3 (which can be found here). 


An Overview

Game engines have always been fundamental to the creation and development of high-quality video game titles. They provide a software framework which developers use to create games for specific consoles (Xbox 360, PS3, Wii) and PCs (personal computers). Most game engines consist of a renderer for 2D and 3D graphics, a physics engine (where collision detection and response take place), sound, scripting, animation, A.I., networking, streaming, memory management, threading, localization support, and scene graphs. These components compose the core functionality of any game engine, which economizes the game development process by allowing developers to reuse or adapt their existing framework for a plethora of different titles. The idea behind a game engine is also to make it easier for developers to port their games to multiple platforms.
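To make the idea of reusable subsystems concrete, here is a minimal sketch of how an engine's core modules might plug into one game loop. All names here are illustrative, not actual Unreal Engine API.

```python
# A toy engine loop: each subsystem (renderer, physics, audio, ...)
# exposes a common update step that the main loop ticks every frame.

class Subsystem:
    def __init__(self, name):
        self.name = name

    def update(self, dt):
        # A real subsystem would do frame work here; we just report it.
        return f"{self.name} updated ({dt:.4f}s)"

def run_frame(subsystems, dt):
    # Each frame, the engine ticks every subsystem in a fixed order.
    return [s.update(dt) for s in subsystems]

subsystems = [Subsystem(n) for n in
              ("Renderer", "Physics", "Audio", "Scripting", "AI")]
log = run_frame(subsystems, 1 / 60)  # fixed 60 Hz timestep
print(log[0])  # "Renderer updated (0.0167s)"
```

Because game code talks to these subsystems through a common interface, the same framework can be reused across many titles and ported between platforms by swapping out the platform-specific implementations.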






Some engines provide visual development tools in addition to reusable software components. These are present in what is known as an Integrated Development Environment (IDE) to help with the rapid production of games in a data-driven manner. When building game engines, developers keep in mind ways to simplify their lives by developing robust systems which will handle any and all elements of a video game. The idea behind several game engines is to sell the framework at a premium price in order to generate revenue from fans and game makers interested in developing titles similar to the ones released by various popular development teams. 

Software Development Kits are released for this exact purpose and are licensed out to game development teams for a marginal profit. Of course, not all the tips and tricks of the trade are released in this form, and most companies make use of their engines to produce quality games which entice developers to purchase their Dev Kits. This is a lucrative business proposition, as "middleware" does a good job of providing a flexible and reusable software platform with all the core functionality needed to develop game applications right out of the box. This, in turn, reduces costs, complexities and time-to-market, which are crucial factors in the highly competitive gaming industry. 

So let's dig into the meat of the matter.




Unreal

Epic Games was founded by CEO Tim Sweeney and was made famous by multiple hits in the Unreal and Unreal Tournament series. The Unreal Engine was in development for 10 years and has been licensed to external developers since 1997, with more than 50 games making use of Epic's Unreal Engine, from Deus Ex and Rune to Harry Potter and BioShock. The Unreal Engine has mainly been built for the development of first-person shooters. With the release of Unreal Engine 3 came a completely shader-driven rendering pipeline with per-pixel lighting and shadowing everywhere. There were no legacy rendering paths, and the engine supported all game types, including MMOs and fighting games. 

It is important to keep in mind that at this stage of the game, consumer expectations are steadily rising with advances in technology and the graphical capabilities of most game engines. As such, there is a growing need to adhere to industry standards with the release of every next-gen console. Some consumers would even go so far as to base a game's merit strictly on its beauty, as opposed to its ability to provide an enriching gameplay experience or a captivating storyline.





Rendering Pipeline

As games move to next-gen consoles, there is considerable expense associated with the advancements. As Michael Capps quite aptly points out, you can't simply go from 2,500-poly to 2-million-poly character models, or from 100,000-poly to 100-million-poly scenes, for free. 


Let's talk Rendering Pipeline. All rendering in Unreal Engine 3 is High Dynamic Range (HDR) rendering, all lighting and shadowing options are orthogonal, and there is frequent use of deferred rendering techniques. These features come in handy when creating large outdoor or city-like environments such as those in Gears of War or Lost Odyssey. Additionally, high-detail environments are also created with the same techniques. The rendering of any scene consists of three primary stages:

  • A depth set-up pass (Z pre-pass)
  • A pass for all pre-computed lighting
  • A pass per dynamic light

The depth set-up pass uses a fast hardware path with no shading, inherently generating a Z-buffer for all opaque objects. Additionally, per-object hardware occlusion queries are used to cull the later shader rendering, which is expensive. The pass for all pre-computed lighting combines directional light maps and emissive materials for each object. The 3-component directional light map texture is applied to the material's normal map. As well, the material may produce light independent of any light sources. The pass per dynamic light renders stencil shadows to the stencil buffer and soft shadow-buffer shadows to the alpha channel. A screen-space quad is rendered over the screen extent affected by the shadow. This does not require the re-rendering of objects affected by shadowing. Deferred rendering makes the cost of shadowing dependent on the number of pixels potentially shadowed and not on the number of objects in the scene. 
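The three stages above can be sketched as a per-frame driver. This is an illustrative skeleton only; the function and light names are hypothetical, not UE3 code.

```python
# Sketch of UE3's three-stage scene rendering as described above.

def render_scene(opaque_objects, dynamic_lights):
    passes = []
    # 1. Depth set-up pass: lay down the Z-buffer with no shading, and
    #    issue per-object occlusion queries to cull later shader work.
    passes.append(("z_prepass", len(opaque_objects)))
    # 2. Pre-computed lighting pass: combine directional light maps and
    #    emissive materials for every visible object in one pass.
    passes.append(("precomputed_lighting", len(opaque_objects)))
    # 3. One pass per dynamic light: render the shadows, then a
    #    screen-space quad over the shadowed extent. Because this is
    #    deferred, cost scales with shadowed pixels, not object count.
    for light in dynamic_lights:
        passes.append(("dynamic_light", light))
    return passes

passes = render_scene(["crate", "wall"], ["muzzle_flash", "lantern"])
print([name for name, _ in passes])
```

Note how the number of passes grows with the number of dynamic lights, which is exactly why the engine makes static, pre-computed lighting the cheap default.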






Shaders & Lighting

The shaders for the Unreal Engine are based on an artist-driven pipeline with complete real-time visual tools. Artists would write shaders by linking Material Expressions. The system was based on a visual node-editing paradigm and would enable artists to visually connect colour, alpha, and coordinate outputs. Programmers could add functionality by coding new Material Expressions in C++ and HLSL. Artists could create extremely complex materials using these programmer-defined components, as the code would be generated on the fly. On the other hand, in-game shader code would be compiled statically ahead of time, with no dynamic shader compilation in-game and no combinatorial explosion in the shaders. The Material Instance Framework provided reusable templates with completely scriptable parameters. 
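The node-graph idea can be sketched as expression objects that generate shader code when linked together. The node types below are made up for illustration and are not the actual Material Expression classes.

```python
# Toy sketch of an artist-facing node graph that compiles to shader
# source ahead of time, in the spirit of Material Expressions.

class Expr:
    def emit(self):
        raise NotImplementedError

class TexSample(Expr):
    """A node that samples a texture (artist picks the sampler)."""
    def __init__(self, sampler):
        self.sampler = sampler

    def emit(self):
        return f"tex2D({self.sampler}, uv)"

class Multiply(Expr):
    """A node that multiplies two upstream expressions."""
    def __init__(self, a, b):
        self.a, self.b = a, b

    def emit(self):
        # Code is generated by walking the graph, so the final shader
        # can be compiled statically with no runtime compilation.
        return f"({self.a.emit()} * {self.b.emit()})"

graph = Multiply(TexSample("DiffuseMap"), TexSample("LightMap"))
print(graph.emit())  # "(tex2D(DiffuseMap, uv) * tex2D(LightMap, uv))"
```

The artist only connects nodes; the generated string stands in for the HLSL that the real system would emit and compile offline.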

Unreal Engine's lighting and shadowing system was fully orthogonal, allowing artists to choose specific lighting/shading techniques per light. This allowed them to customize light/shadow interactions with shadow culling options and lighting channels. All lighting and shadowing was pre-computed and stored in three 3-component DXT1 textures. This technique is used for its speed and efficiency in supporting any number of lights in one pass. It further preserves normal-mapping detail but deals with diffuse light only. For real-time lighting, one rendering pass was run per light/object interaction. This supported dynamic specular highlights and dynamic shadow light functions. 

As far as Unreal's shadowing techniques go, static shadows were pre-computed as soft shadow occlusion into textures. The light could be dynamic but was not allowed to move. To take care of dynamic shadowing, stencil-buffer shadows were used. These supported arbitrary moving lights but were hard-edged. Soft shadowing would happen via 16X oversampling and was used per-object for moving shadows. With that said, the extent of the shadow was limited to the light-object frustum, hence avoiding the scalability issues inherent in full-scene shadow buffers. 
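The soft-shadowing idea above can be illustrated with a toy calculation: each pixel averages many jittered depth-comparison samples, so shadow edges fade smoothly instead of stepping. The numbers here are made up for illustration, not taken from the engine.

```python
# Toy illustration of soft shadows via oversampling a shadow buffer:
# the fraction of samples that land in shadow gives a soft occlusion
# value between 0.0 (fully lit) and 1.0 (fully shadowed).

def soft_shadow(samples_in_shadow, total_samples=16):
    return samples_in_shadow / total_samples

print(soft_shadow(0))   # 0.0: fully lit pixel
print(soft_shadow(8))   # 0.5: halfway across the penumbra
print(soft_shadow(16))  # 1.0: fully shadowed pixel
```

With a single sample per pixel the result could only ever be 0 or 1, which is exactly the hard edge the stencil-buffer shadows exhibit.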





Epic Lessons Learned

Through the forty man-years spent in development of Unreal Engine 3, Epic admitted to several lessons learned during the process, which is the only way for them to improve their engine and make it better in time for next-generation consoles. One of the primary lessons learnt was that one unified shadowing solution scales poorly to large-scale game development. What you want is many lighting and shadowing options. Of course, you must be prepared to make tradeoffs, such as those between static versus dynamic lighting and shadowing, as well as soft versus hard shadow edges (stencil versus shadow buffer). Further tradeoffs include scene complexity versus dynamic lighting/shadowing complexity, and disabling shadows when nonessential to visual quality. The development team also realized that it is important to expose lighting/shadowing options orthogonally to allow different tradeoffs to be chosen in a single scene. 

Empowering artists to make these tradeoffs requires great artist tools for measuring and understanding performance, as well as a greater emphasis on the "technical artist" role on every project. At the end of the day, Epic surmised that they had to really trust their artists. They realized that they had to make the default options (static lighting, pre-computed shadows, etc.) fast in order to force designers to explicitly choose to improve visual quality at the expense of performance. It was important for Epic to make all their rendering features scriptable through in-engine design tools. This was crucial in avoiding spending 30 days building a level in Maya before bringing it in-engine to see how it performed, only to be disappointed with errors and rendering inabilities. 

The most valuable lesson Epic learnt was that next-gen engine development is a hard job. They spent a total of 40 so-called "man-years" building their full-featured next-gen engine, in order to make it easier for game developers everywhere, who may now use Unreal Engine 3 instead of building their own engines. 






To see how other game companies do it, you could always take a look at Jason Mitchell's presentation on Valve's Source Engine at SIGGRAPH 2006 (it can be found here!).

