Friday, September 27, 2013

Doom 3 Post Mortem





The Aftermath

Perusing the source code of id Tech 4, the engine behind the legendary Doom 3, I found it interesting how the game was structured and which routes and conventions the developers chose when putting the engine together. Delving further into the topic, I stumbled upon a rather interesting postmortem talk on Doom, given nearly two decades after the game's release. The talk was both informative and funny (as game developers often are). The following post summarizes some of the key points of the talk that I found particularly fun to share. And of course, if you haven't played Doom yet, GO! Seriously, why are you still reading this? Go play it. Now. Please?

The information in this post comes from John Romero and Tom Hall's Doom postmortem talk at GDC 2011, which can be found here






In the Beginning


In 1991, id Software was composed of four individuals: artist Adrian Carmack, programmers John Carmack and John Romero, and game designer Tom Hall. Towards the end of 1991, Steve Jobs' NeXT computer piqued the interest of the team at id. After an $11,000 investment in a NeXT machine, id went on to develop Wolfenstein 3D for DOS, using the NeXT Cube to produce the game's hint manual. They were asked to port the game to the Super Nintendo, so they brought in an outside developer to work on it while the team continued pursuing other titles.

After a series of FPS games (Hovertank One, Catacomb 3-D, Wolfenstein 3D, Spear of Destiny) with which the team had plenty of time to experiment with textures and the tools of the trade, id decided to try something different with Doom - but we all know that's not really true at all. After brainstorming ideas for Doom, and after a lot of Dungeons and Dragons, the guys at id decided to make a game with demons, and by the end of November Tom Hall had already created the Doom Bible. The Bible was meant to be a design document for the game, outlining where all the levels were, the nomenclature, the tools, the progression the player goes through, and so on. Hall wanted a level-by-level progression through the game to give the player checkpoints where they could feel a sense of accomplishment upon completing a particularly difficult level. At the same time, John Carmack was beginning to think about the tech involved in producing such a game. He began by laying down some of the data structures and, upon stepping back and looking at the system as a whole, wanted the game to appear as one giant, seamless world. At this point the team projected their memory usage for a project this big and soon realized that the scope was far too large. After weeks of intricate design, Hall was told there was no way to fit all his designs within the final game - there just wasn't enough memory (only about a megabyte available). It was apparent that Carmack, as a programmer, was trying to push the latest technology with Doom, while Hall was trying to give the game the best and most innovative design, and the two clashed.






Coming together


During the production of Doom, there was never a time when the guys at id would draw something, think about where to go with it, and then decide whether or not to put it in the game. Once an asset was drawn, once an idea was put on paper, it was going in the game. By January 1993, the technology and the art were in the works, and feeling quite confident in their abilities, the team put out a press release stating their hopes and aspirations for the game as well as the main features they intended to ship (three or four of which didn't even make it into the game). Adrian Carmack, the studio's artist, modeled figures in clay and then put them in front of a camera that would capture the model and represent it pixel by pixel. Unfortunately, the lights trained on the clay figures would cause them to melt, making the method impractical.

Id had the engine itself up and running relatively quickly and took screenshots and alphas of the earliest stages of development. While Hall attempted to mimic military building designs for levels, Carmack stressed optimization, trying to make the game run as smoothly and as quickly as possible. Due to Carmack's concerns about every environment's polygon count, Hall had to go back to the drawing board and design orthogonal, occlusive shapes. Hall ended up making a bunch of levels composed of blocks, which weren't much more than corridors with rooms to step into. Around this time they were also experimenting with the UI, shooting designs back and forth between Hall and Adrian Carmack (the designer and the artist). The team actually went out and bought toy guns and held them in front of the camera to model their final weapon designs after.






Doom - Evil Unleashed was the first pre-alpha release of the game, using the data structures Carmack had come up with and all the current designs, implemented for testing purposes. With their initial designs, the team realized that the player wasn't getting much of a view due to too much UI "crap," as they called it. Luckily the game was at a stage where, if something didn't work well, they could afford to just take it out.

In March of the same year, 20th Century Fox contacted the team at id and offered them the Aliens license to produce a game based on the movie. Although the entire team were big fans of the movie, they turned the offer down in order to keep running with their original idea for Doom.

Work continued on schedule, and Romero now attempted to make the levels feel a bit more realistic and recognizable by experimenting with contrasts in room height and lighting.






Disaster strikes and Doom continues


Remember earlier when I told you that id had been asked to port Wolfenstein 3D to the Super Nintendo? Remember when I said they had a guy working on it? Well, that guy didn't manage to finish on time, and Imagineer (the company that requested the port) was not pleased at all. Nine months had passed since they had last been in contact with Imagineer, and when asked about the status of the port, id realized they couldn't get hold of the developer and would have to halt production of Doom to fix the problem themselves. In the end, they managed to push the finished port out in an amazing three weeks.

Once Doom resumed production, something truly great took place. After a recursion problem with receding stairs, John Carmack noticed that the game was beginning to look really good, and instead of asking Hall to change the design, he took it upon himself to solve the problem so the aesthetics could live on. Solving this problem eventually led him to the BSP tree (building on work by Bruce Naylor), which helped cull non-visible sectors inside a level.
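To make the BSP idea a little more concrete, here's a tiny illustrative sketch (Python with made-up map data, nothing like id's actual C code): each internal node splits the map along a line, and visiting the viewer's side of each split first yields a strict front-to-back ordering of the map's regions, which is exactly what makes it cheap to skip geometry that's already hidden.

```python
# Minimal BSP sketch: internal nodes split the map along a vertical
# line x = split_x; leaves hold wall segments in a convex region.
class Node:
    def __init__(self, split_x, front, back):
        self.split_x = split_x
        self.front = front   # subtree with x < split_x
        self.back = back     # subtree with x >= split_x

class Leaf:
    def __init__(self, walls):
        self.walls = walls

def traverse(node, viewer_x, out):
    """Visit leaves nearest the viewer first (front-to-back order)."""
    if isinstance(node, Leaf):
        out.extend(node.walls)
        return
    if viewer_x < node.split_x:
        near, far = node.front, node.back
    else:
        near, far = node.back, node.front
    traverse(near, viewer_x, out)
    traverse(far, viewer_x, out)

tree = Node(10, Leaf(["west wall"]),
            Node(20, Leaf(["mid wall"]), Leaf(["east wall"])))
order = []
traverse(tree, 25, order)       # viewer standing at x = 25
print(order)                    # ['east wall', 'mid wall', 'west wall']
```

The payoff is that the tree is built once, offline, and at runtime the same cheap traversal produces a correct draw order from any viewpoint.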







On April 2nd, 1993, id felt they had reached what felt like an alpha milestone in Doom's production and decided to release the game in its current state to beta testers. In May, the team released another pre-beta build which featured a lot more textures, a better UI, a menu system, skill levels and monsters of varying difficulties. In addition, the game ran a lot smoother and started looking more and more like the end product.

In June, Computer Gaming World (a very popular magazine) previewed Doom right in the middle of development. By July, Doom had a lot of variety in its monsters thanks to Hall fighting to have constant surprises for the player in terms of enemy types. Despite this win, Hall felt he was being restricted creatively and that his designs were constantly being shot down in pursuit of a faster-paced shooter. In August of 1993, Tom Hall left id for Apogee, where he worked on the planned Wolfenstein sequel for a bit before creating Rise of the Triad and Prey. In the end this worked out best for everyone: Hall was free to exercise his creativity, while Romero and Carmack could get back to doing awesome stuff with Doom on the programming and optimization side of things. Neither side was weighing down the other anymore.






The Final Stretch

After Hall departed, id hired a couple of developers (Dave Taylor and Sandy Petersen) to help finish the game on time. Into September, the team made good progress on the automap in addition to pumping out tons of levels and adding the DMX sound drivers. In October, two months from release, a pre-release build showed a lot more textures, a better-looking UI, far better lighting and overall playable environments that worked well with Doom's theme. Players could pick up items and the game would keep track of them. Romero considered the game pretty new-school and decided to avoid the arcade tactic of setting the player back to the start once all their lives were depleted. Instead, a dead player restarted at the beginning of the level, and a save feature let players save at any point during the game, greatly raising the odds of players actually reaching the end of the game and beating it. The next step seemed to be the final game.

November came around, and the first IPX multiplayer code was created, as were deathmatch, co-op mode, the game's final systems, intermissions, menus, the installer, text files and serial support for modems. All the game's maps were also modified for deathmatch and co-op play. The development team spent more and more time at the office polishing the game while hype steadily grew among fans eagerly anticipating a game that had now been in production for a year.

In the final stretch, the team pulled a 30-hour, no-sleep work session where everyone ran the game on every machine, looking for last-minute bugs. A bug in the game's timers arose, but luckily the team had so much experience with the engine and tech at this point that fixing it didn't take long. Finally, the game was uploaded to the University of Wisconsin's FTP server, essentially crashing it and preventing anyone from logging on to download the game while the file was still being uploaded.

And the rest is history ladies and gentlemen...






In Conclusion


As game developers, we stand to learn a lot from id's experience creating Doom. There were conflicts, setbacks and technical difficulties, but what remained constant was the team's strong belief in the game and where it could go. They also believed in each other's abilities, and everyone pushed to put the best of themselves into the game. I think we can all agree that the game industry is grateful to the team at id for producing a beautiful, well-functioning title that is still held in high regard in the hearts of game developers everywhere.








Thursday, September 19, 2013

The Source Engine...





The only Source of knowledge is experience

In 2004, the Valve Corporation came out with a game engine that changed everything. The Source Engine was famous not only for its notable technology but also for the several popular titles built on it. With Counter-Strike: Source, Half-Life 2, Team Fortress 2 and Portal among many other award-winning releases under its belt, Valve knew that the original Source Engine was just the beginning of a powerful legacy that would set standards for game engine design everywhere.

This post focuses on Valve's Source Engine: some of its more popular design elements, how it began, where it is now and what's in Valve's future. Information in this post can be traced back to the Source Engine's licensing information sheet, found here






More than the sum of its parts

The Source Engine debuted with Counter-Strike: Source, a remake of the popular first-person shooter that pits a team of counter-terrorists against a team of terrorists in a series of rounds, with one team crowned victorious when an objective is met or all members of the opposing team are down. The Source Engine has never had a meaningful version numbering scheme; it is instead developed through constant incremental updates. Although the engine was initially created to power first-person shooters, it has since been used to create everything from RPGs to RTS games.


Technology implemented within the Source Engine includes Direct3D for rendering on PC and the Xbox family and OpenGL for rendering on Mac, Linux and PlayStation 3. High dynamic range rendering is used for post-processing, while a lag-compensated client-server model handles networking. The physics engine is derived from Havok and is network-enabled and bandwidth-efficient. Source has scalable multiprocessor support with pre-computed radiosity lighting and dynamic shadow maps. Deferred lighting is supported on consoles, and the facial animation system features auto-generated, localizable lip-syncing. Water flow effects, a blended skeletal animation system, inverse kinematics, dynamic 3D wounds and cloth simulation are all supported, among tons of other technology. Much of the Source Engine's popularity may be due to the significant source code access it grants mod teams, as well as its distributed map compiler.
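The "lag-compensated" networking model is worth a small illustration. The gist is that the server keeps a short position history per entity and, when a shot arrives, rewinds the target to where the shooter actually saw it, one round-trip ago. This is a toy sketch under simplifying assumptions (grid positions, exact-match "hit" test, invented names like `EntityHistory`), not Valve's code:

```python
import bisect

class EntityHistory:
    """Server-side record of where an entity was at each tick."""
    def __init__(self):
        self.times = []
        self.positions = []

    def record(self, t, pos):
        self.times.append(t)
        self.positions.append(pos)

    def position_at(self, t):
        # most recent recorded position at or before time t
        i = bisect.bisect_right(self.times, t) - 1
        return self.positions[max(i, 0)]

def server_hit_check(target, shot_pos, server_time, shooter_latency):
    # rewind the target to the moment the shooter perceived
    rewound = target.position_at(server_time - shooter_latency)
    return rewound == shot_pos   # toy hit test on grid positions

h = EntityHistory()
h.record(0.0, (0, 0))
h.record(0.1, (1, 0))
h.record(0.2, (2, 0))
# At server time 0.2 the target is at (2,0), but a client with 100 ms
# of latency saw it at (1,0); rewinding makes that shot count.
print(server_hit_check(h, (1, 0), 0.2, 0.1))  # True
```

The design trade-off is that laggy shooters get fair hits at the cost of targets occasionally being hit "around corners" from their own point of view.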

Source distantly originates from the GoldSrc engine, itself a heavily modified version of John Carmack's Quake engine. Valve's development of Source has always been a mixture of licensed middleware and in-house code. In fact, Carmack himself commented that

"There are still bits of early Quake code in Half-Life 2."






Man of few words, aren't you?


For the scope of the Source Engine's design, the developers at Valve brought some very powerful technology to life. Areas of major enhancement include character animation, advanced A.I., real-world physics and shader-based rendering. With all this technical prowess, it's no wonder that Source offers expressive characters that can convey a message without saying a word and be both extremely capable allies and more than worthy foes. These characters populate beautifully rendered, physically simulated worlds featuring water refraction, HDR lighting and projected shadows, which greatly enhance Source's visual fidelity.

With robust networking code supporting 32-player LAN and Internet games, the Source Engine is built for multiplayer, and it includes a complete toolset for level design, character animation and demo creation, making it a modder's paradise. The engine features state-of-the-art prediction and interpolation for collision and hit detection. Characters provide intelligent, believable player interaction, with simulated musculature for outstanding emotion, speech and body language. A skeletal bone system drives animation, and a layered animation system synthesizes complex animations out of several pieces. The worlds themselves are more responsive, with realistic interactions, sounds and graphics following from physics. NPCs can interact with physically simulated objects and come with ragdoll physics. There are kinematic animated bone followers and custom procedural physics controllers. Vehicles built on the Source Engine have wheels that slip and skid, realistic suspensions, and tunable handling, horsepower, gearing, max speed, shift speed, tire material, tire friction, spring tension and damping.
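The interpolation half of that networking story can be sketched too. Remote players look smooth because the client renders them slightly in the past and blends between the two server snapshots that bracket the render time. A minimal sketch, assuming snapshots are (time, position) pairs along one axis:

```python
def lerp(a, b, t):
    return a + (b - a) * t

def interpolate(snapshots, render_time):
    """snapshots: time-sorted list of (time, x) pairs from the server."""
    for (t0, x0), (t1, x1) in zip(snapshots, snapshots[1:]):
        if t0 <= render_time <= t1:
            f = (render_time - t0) / (t1 - t0)
            return lerp(x0, x1, f)
    return snapshots[-1][1]   # fallback: hold the last known value

snaps = [(0.00, 0.0), (0.05, 5.0), (0.10, 10.0)]
# Render a little behind real time, halfway between the last two
# snapshots: position comes out midway between 5.0 and 10.0.
print(round(interpolate(snaps, 0.075), 3))  # 7.5
```

Rendering in the past costs a fixed sliver of latency but removes the jitter of raw network updates, which is why the lag compensation described above has to exist at all.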






Now let's discuss the A.I., U.I. and sound within the Source Engine. Graphical entity placement lets level designers quickly control the interactive gaming environment, and sophisticated navigation allows for characters that run, fly, jump, crouch, climb stairs and ladders, and burrow underground. Characters can sense using sight, sound and smell, and can determine relationships such as the friend/foe status of other entities. Battle capabilities allow squads of A.I. characters to operate together, knowing when to advance, retreat, lay cover fire and so on. The A.I. even provides for intelligent character interaction outside of combat.

The Source Engine supports four-speaker and 5.1 surround sound with high-quality 3D spatialization. There is custom software DSP as well as automatic DSP based on environmental geometry, support for audio streaming on any wave, and real-time wave-file stitching. There are even pre-authored Doppler effects and distance-variant encoded waves.
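For intuition on the Doppler and distance effects mentioned here, the underlying math is standard physics rather than anything Valve-specific. A sketch (formulas are the textbook versions, not Valve's actual DSP):

```python
def doppler_factor(speed_of_sound, listener_v, source_v):
    """Pitch multiplier; positive velocities point source -> listener."""
    return (speed_of_sound + listener_v) / (speed_of_sound - source_v)

def attenuated_gain(distance, reference=1.0):
    """Inverse-distance rolloff, clamped at the reference distance."""
    return reference / max(distance, reference)

# A source closing on a still listener at 34.3 m/s (10% of the speed
# of sound) is pitched up by roughly 11%.
print(round(doppler_factor(343.0, 0.0, 34.3), 3))  # 1.111
# A sound 4 units away plays at a quarter of its reference volume.
print(attenuated_gain(4.0))  # 0.25
```

"Pre-authored" Doppler, as described above, means these shifts are baked into encoded waves ahead of time rather than computed per-frame like this.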

The server browser displays all active game servers and allows players to select a server to join. Players may even filter and sort server lists in order to speed up the display and selection of a server. Players may use the instant messenger to chat with each other both in and out of games in addition to joining friends in existing games. Finally, Valve's GUI (VGUI) allows for both in-game and out-of-game user interface uniformity and is both platform independent and Unicode compliant. 
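The filtering and sorting the server browser does is straightforward in principle. A toy version (field names here are made up for illustration, not Valve's protocol):

```python
servers = [
    {"name": "Dust Duel",    "map": "de_dust",   "ping": 40,  "players": 12},
    {"name": "Office Party", "map": "cs_office", "ping": 120, "players": 3},
    {"name": "Dust Classic", "map": "de_dust",   "ping": 80,  "players": 20},
]

def browse(servers, map_filter=None, max_ping=None):
    """Keep servers matching the filters, best ping first."""
    rows = [s for s in servers
            if (map_filter is None or s["map"] == map_filter)
            and (max_ping is None or s["ping"] <= max_ping)]
    return sorted(rows, key=lambda s: s["ping"])

for s in browse(servers, map_filter="de_dust", max_ping=100):
    print(s["name"], s["ping"])
```

Filtering before sorting keeps the displayed list short, which is what makes selection fast even with thousands of active servers.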

All code for Valve's Source Engine is written in C/C++ using Visual Studio 7.1, making it possible to quickly and easily derive new entities from existing base classes. Included are an internal context-sensitive performance monitoring system and graphics performance measurement tools. Modular code design lets DLLs swap out core components for easy upgrading or code replacement. All DirectX 9 shaders are written in HLSL.
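That "derive new entities from base classes" workflow looks roughly like this, sketched in Python rather than the engine's C++ (class and method names are hypothetical):

```python
class BaseEntity:
    """Engine-provided base: every game object gets these defaults."""
    def __init__(self, name):
        self.name = name
        self.health = 100

    def think(self):
        return f"{self.name} idles"

class ExplosiveBarrel(BaseEntity):
    """A new gameplay object built by overriding a couple of hooks."""
    def __init__(self):
        super().__init__("barrel")
        self.health = 10          # fragile on purpose

    def think(self):
        return f"{self.name} waits to explode"

e = ExplosiveBarrel()
print(e.think())   # barrel waits to explode
print(e.health)    # 10
```

The appeal for modders is that most engine behavior comes for free from the base class, and only the differences need writing.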







The future of Valve

One of Valve's largest projects to date is the development of new content authoring tools for Source, which will replace the currently outdated tools and speed up content production. The Valve fan site ValveTime revealed that Valve might be developing a "Source 2" engine, based on strings in the Source Filmmaker's code that referenced the upcoming version. Gabe Newell, the head of the studio, has confirmed the engine's development but remarked that Valve is waiting for a game to roll it out with.

Image-based rendering technology had been in development for Half-Life 2 but was cut from the engine before the game's release. Newell has mentioned it as a piece of technology he would like to add to Source in order to support much larger scenes than are possible with strictly polygonal objects.

Gabe Newell has also said that he believes Linux is the future of gaming, often hinting at a gaming box built on the open-source operating system. Since the launch of the company's online platform, developers have created 198 games on it. He hopes this points to a future where games are nodes in a connected economy in which the vast majority of goods and services are created by individuals rather than companies.






In Conclusion

Valve's Source Engine is a remarkable feat for a large company revered for its modding community. The Source Engine's popularity stems from tools that give power back to the player, allowing them to bring their own creativity to life in a sandbox maintained by the Valve Corporation. Not only does Valve sport a strong game engine as the foundation for great things to come, but with AAA titles in continuous release, it is no wonder that the Source Engine receives praise for its technical merit and ingenious design.









Thursday, September 12, 2013

Guess who's back...





Engines 

Ladies and gentlemen, I've got to say it is good to be back. With thoughts to spill on a page and a course worth talking about, let's jump right in. 

Game engines are the topic of discussion, so this post is going to give you an overview of some of the breakthrough engines created over the past decades - the ones that set the bar, and those that created entire communities of modders: individuals who take the idea of making a game your own to a whole other level. So let's begin by asking the most fundamental question: what is a game engine?






Built not bought

A game engine is a framework of visual development tools and reusable software components, provided in an integrated development environment, that enables simplified, rapid, data-driven development of video games. Each game engine is unique, and no single engine can be used to design every game genre; there is context and precision in each engine, and no two are ever alike. Back in the day, companies built their engines purely to run their own games and nobody else's. Eventually, gaming companies realized that there were gamers who wanted to play around with the engines they built, and along came Quake II and the Unreal Engine, which provided tools for gamers to let their imaginations run wild and modify what had been built.
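"Data-driven" is the key phrase in that definition: the engine code stays fixed while the game's content lives in data files that designers can edit without recompiling anything. A minimal sketch, with a made-up JSON format standing in for an engine's asset files:

```python
import json

# In a real engine this would be loaded from a file shipped with the game.
enemy_defs = json.loads("""
{
  "grunt": {"health": 30,  "speed": 2.0},
  "brute": {"health": 120, "speed": 0.8}
}
""")

class Enemy:
    """Engine-side class; behaviour parameters come from data, not code."""
    def __init__(self, kind, defs):
        stats = defs[kind]
        self.kind = kind
        self.health = stats["health"]
        self.speed = stats["speed"]

g = Enemy("grunt", enemy_defs)
print(g.health, g.speed)  # 30 2.0
```

Rebalancing the brute then means editing a number in the data file, not touching the `Enemy` class, which is precisely what made engines reusable across games and friendly to modders.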

As engines grew in complexity and structure, they were put on the market, and SDKs (software development kits) became available for purchase and commercial use. You may be thinking, "But you just said no game engine can be used to design all game genres." While that's true, there is still much similarity between the mechanics and game logic of many genres, and with a solid engine and a lot of tweaking, a little can go a long way. Building a game used to mean starting from scratch and re-inventing the wheel, writing all-new code. So when did this change? What changed? Who helped?

Let's find out. 

Information in this post has been taken from Paul Lilly's (a writer for MaximumPC) article on a visual history of 3D game engines, found here.







I wouldn't leave if I were you, DOS is much worse

That's one of the many infamous quit messages from a game that sends you on a serious guilt trip when you try to leave. Doom was certainly one of the most memorable and important PC games of all time, because it pushed reusable game engines as a viable programming model. The best part was that id Software's Doom engine wasn't really a 3D engine at all but a very well-conceived 2D sector-based engine, with flat sprites representing objects, characters and anything not tied down to the map. This meant good rendering speed on the hardware of the time, which only had to be capable of handling texture-mapped environments.

Later, NovaLogic came out with their Voxel Space engine, introducing the concept of the voxel (a blend of "volume" and "pixel"), which was used by several games to render specific in-game items or even vehicles. Voxels represent volumetric objects as 3D bitmaps instead of as vectors. Terrain was rendered in layers, making graphics more smoothly contoured and detailed, which in turn also provided smoother gameplay.
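The heart of heightmap ("voxel") terrain rendering in the spirit of Voxel Space fits in a few lines: march away from the camera along one screen column, project each terrain height to a screen row, and only draw where the terrain rises above everything already drawn. This is an illustrative condensation with invented parameters, not NovaLogic's code:

```python
def render_column(heights, cam_height, horizon, screen_h, scale=4):
    """heights: terrain height at each step away from the camera."""
    drawn = []             # (distance, top_row) segments actually visible
    y_buffer = screen_h    # lowest screen row not yet covered
    for dist, h in enumerate(heights, start=1):
        # perspective projection: smaller row = higher on screen
        row = int(horizon + (cam_height - h) * scale / dist)
        if row < y_buffer:          # pokes above everything nearer
            drawn.append((dist, row))
            y_buffer = row
    return drawn

# A near hill (h=20), low ground behind it, and a far ridge (h=25):
# the ground immediately behind the hill is occluded, the rest shows.
print(render_column([20, 5, 5, 25], cam_height=30, horizon=50, screen_h=100))
# [(1, 90), (3, 83), (4, 55)]
```

The single `y_buffer` per column is the whole hidden-surface algorithm, which is why this technique was so fast on early-90s CPUs.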

Build followed in the footsteps of the Doom engine with Duke Nukem 3D, as far as rendering worlds on a 2D plane with sprites populating the map went. The Build engine broke the world into individual sectors arranged in a grid, with the ceiling and floor of each sector set at a different height. This meant that, unlike in Doom, players could now look up and down.







Enter 3D

Let's ease into the world of 3D and begin with the Jedi engine. The Jedi engine powered Star Wars: Dark Forces and proved highly successful in creating 3D-like environments by allowing areas (sectors) to be stacked. Yet not everything was 3D: objects were rendered into bitmaps from different angles in 45-degree intervals. Up to 32 angles were supported for each object, which would be continually rescaled as the player got closer.
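Picking which of those pre-rendered angles to draw each frame is a simple rounding problem: map the viewing angle to the nearest frame index. A generic sketch of the idea (not LucasArts' code):

```python
def sprite_frame(view_angle_deg, num_angles):
    """Map a viewing angle to the nearest pre-rendered frame index."""
    step = 360.0 / num_angles
    return int((view_angle_deg % 360) / step + 0.5) % num_angles

print(sprite_frame(0, 8))     # 0: seen head-on
print(sprite_frame(93, 8))    # 2: rounds to the 90-degree frame
print(sprite_frame(359, 32))  # 0: wraps back around to frame 0
```

With 8 frames each bucket spans 45 degrees, matching the intervals described above; 32 frames narrows each bucket to 11.25 degrees for smoother turning.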

The first truly 3D game engine was the Quake engine, which had to be meticulously crafted to run smoothly without a ton of processing power. One of the neat techniques used in Quake was map purging, where an area of a map wasn't processed if the player couldn't see it. This reduced the number of polygons rendered by at least half, if not much more. Additionally, the Quake engine made use of Z-buffering, included 3D light sources and supported 3D hardware acceleration.
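That "map purging" idea can be sketched as a precomputed visibility table: for each region, store the set of regions that could possibly be seen from it, and skip everything else before any polygon work. (Quake's actual mechanism is the potentially visible set computed over BSP leaves; this toy uses named rooms for clarity.)

```python
# Precomputed offline: from each room, which rooms might be visible.
visible_from = {
    "hall":   {"hall", "atrium"},
    "atrium": {"atrium", "hall", "vault"},
    "vault":  {"vault", "atrium"},
}

def rooms_to_render(player_room):
    """Everything outside this set is purged before any polygon work."""
    return visible_from[player_room]

print(sorted(rooms_to_render("hall")))  # ['atrium', 'hall'] - vault culled
```

Because the table is computed once at map-compile time, the runtime cost of culling is a single lookup, which is how Quake kept polygon counts low on mid-90s hardware.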

RenderWare is one of the more popular game engines, claiming over 200 titles across several platforms, making it a true multiplatform engine. RenderWare allowed developers to manipulate art and game processes in real time: a developer could change the paint job on a car without altering the underlying code or re-rendering the scene from scratch. Rudimentary physics within the engine worked the same way.

The original Quake offered hardware graphics acceleration. Then came Quake II, with native OpenGL support, colored lighting effects and a new game model that allowed both software and OpenGL rendering instead of forcing a choice between the two. As mentioned earlier, Quake II became known for its moddability when id released its source code to the modding community in 2001 while keeping the rest of the engine proprietary. With such a robust game engine, savvy developers were able to use it to power full-fledged role-playing games with additional features.





The Unreal engine was the main competitor to Quake II and also became a popular engine in the modding community. UnrealScript, the engine's own scripting language, was bundled with the game along with UnrealEd, a map editor and modification program. Software and hardware rendering were both present in the Unreal engine, along with collision detection, colored lighting and rudimentary texture filtering.







In Conclusion

With gaming technology on the rise, there is room for a lot of growth in game engine territory. There is so much we have yet to see and so much we have to build on. Continuing innovation gives us hope for games to grow ever closer to reality. We look back at all the engines that preceded our time and admire the ingenuity of their developers. But we also use their design patterns, their ideas and their motivation to build better, more efficient engines that will power the games of tomorrow.

It's good to be back.