The year was 1992, and the world had just gotten its first mainstream taste of 3D computer graphics with “Wolfenstein 3D.” Computer systems of this era were limited in resources and quite primitive. An Intel i9 from a few years ago boasts 3,750 times more cache memory — a quick-access memory internal to the processor — than the original Intel Pentium, the top-of-the-line desktop processor of that era.
This resource constraint rendered 3D graphics almost impractical until two developers came along and released a 3D game about shooting Nazis that consumed less memory than a standard iPhone photo. Everyone eagerly watched John Romero and John Carmack, awaiting their next venture.
In 1993, a floppy disk started making the rounds in university dorm rooms and garage sales. Eventually, the contents of the disk made their way across Usenet forums, where users discovered an inconspicuous WAD file named “Doom.” This file forever altered the landscape of the gaming industry and beyond.
“Doom,” released in 1993, was a first-person shooter whose protagonist guns down hordes of demons. It was the first game to harness genuine 3D graphics while employing groundbreaking optimization strategies at every step of its design.
A smaller footprint meant a more uniform and enjoyable experience for more players, regardless of their hardware limitations. This was achieved by pioneering new algorithms that have since become foundational to computer graphics programming.
Earlier games used clever mathematics to hide the fact that the player was actually playing a 2D game dressed up to look 3D. “Wolfenstein 3D,” in essence, is a 2D maze game akin to “Pac-Man” that casts rays to measure the distance from the player to each wall. The engine splits the screen into columns and draws each column at a height reflecting that distance, thereby creating a pseudo-3D environment.
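As a rough illustration of that column-by-column trick, here is a minimal sketch in C. The names `cast_ray` and `draw_column` are hypothetical stand-ins for a real engine's map lookup and framebuffer, not anything from the actual “Wolfenstein 3D” source:

```c
#include <math.h>

#define SCREEN_W 320
#define SCREEN_H 200
#define FOV      1.0472   /* roughly 60 degrees, in radians */

/* Hypothetical stand-ins for the engine's map query and framebuffer. */
extern double cast_ray(double px, double py, double angle); /* distance to nearest wall */
extern void   draw_column(int x, int top, int bottom);      /* fill one screen column */

/* One frame: fire a ray per screen column, then scale that column's
 * wall slice by the distance the ray traveled before hitting a wall. */
void render_frame(double px, double py, double view_angle)
{
    for (int x = 0; x < SCREEN_W; x++) {
        /* Spread the rays evenly across the field of view. */
        double ray_angle = view_angle - FOV / 2.0 + FOV * x / SCREEN_W;
        double dist = cast_ray(px, py, ray_angle);

        /* Correct the "fishbowl" distortion of raw Euclidean distance. */
        dist *= cos(ray_angle - view_angle);
        if (dist < 0.01) dist = 0.01;   /* avoid division by zero */

        /* Nearer walls project taller: column height is inversely
         * proportional to distance. */
        int height = (int)(SCREEN_H / dist);
        if (height > SCREEN_H) height = SCREEN_H;
        draw_column(x, (SCREEN_H - height) / 2, (SCREEN_H + height) / 2);
    }
}
```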
The upshot is that there isn’t really a third dimension in the world: every wall and entity is the same height, and the concept of height itself falls apart.
The Doom engine, on the other hand, created a world with three perpendicular coordinate axes and then used Euler transformations to bring it into the perspective of the player. The world in itself is 3D and allows for different heights, shapes and angles. Creating that third dimension opened a new can of worms that quickly swallowed the development process of “Doom,” namely occlusion.
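Before turning to occlusion, the perspective step itself can be sketched in a few lines of C. This is a hypothetical simplification, not Doom's actual source: translate the world so the player sits at the origin, then rotate by the player's facing angle. Because vanilla “Doom” never lets the camera roll or pitch, a single rotation around the vertical axis suffices and heights pass through unchanged:

```c
#include <math.h>

/* Bring a world-space point (wx, wy) into the player's view space:
 * translate so the player at (px, py) sits at the origin, then rotate
 * by the player's facing angle so "forward" lines up with an axis. */
void world_to_view(double wx, double wy,
                   double px, double py, double angle,
                   double *vx, double *vy)
{
    double tx = wx - px, ty = wy - py;      /* translate */
    double s = sin(-angle), c = cos(-angle);
    *vx = tx * c - ty * s;                  /* rotate into view space */
    *vy = tx * s + ty * c;
}
```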
Graphics developers continually grapple with the problem of efficient occlusion: the engine must decide which polygons in the player’s field of view to draw so that objects hidden by other objects stay behind the visible faces.
The prominent solution at the time was the painter’s algorithm, which sorts every polygon by its distance from the player and draws them from back to front, every single frame, and is wildly inefficient. “Doom” does not do this. It instead shifts the expensive work to the loading stage of the game, which takes longer than in games that use the painter’s algorithm, but yields a much more elegant solution that is lightning-fast during playtime.
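For contrast, here is roughly what the approach “Doom” rejected looks like in C. The `Polygon` type and `draw_polygon` rasterizer are hypothetical; the telling detail is the sort, repeated for every polygon on every frame:

```c
#include <stdlib.h>

/* Hypothetical polygon record: just enough for the sort. */
typedef struct {
    double depth;   /* distance from the player, recomputed every frame */
    /* ... vertex and texture data ... */
} Polygon;

extern void draw_polygon(const Polygon *p);  /* stand-in rasterizer */

/* Order polygons farthest-first. */
static int farther_first(const void *a, const void *b)
{
    double da = ((const Polygon *)a)->depth;
    double db = ((const Polygon *)b)->depth;
    return (da < db) - (da > db);
}

/* The painter's algorithm: sort ALL polygons back to front every frame
 * and draw them in order, letting near surfaces paint over far ones.
 * The per-frame sort and all the wasted overdraw are exactly the costs
 * "Doom" moved out of the render loop and into map loading. */
void render_painters(Polygon *polys, size_t n)
{
    qsort(polys, n, sizeof(Polygon), farther_first);
    for (size_t i = 0; i < n; i++)
        draw_polygon(&polys[i]);
}
```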
Every time a map is loaded in “Doom,” the engine builds a tree structure of all the possible occlusions from a vector-space version of the map. Once that tree exists, the game essentially knows every possible drawing order for any player position and never has to sort polygons or calculate distances between the player and the world at render time. This technique is binary space partitioning, an algorithm heavily researched by the U.S. Air Force.
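In miniature, the idea looks something like the sketch below, with hypothetical C structures far simpler than the engine’s real node format. Every node splits the map with a line, and a recursive walk visits whichever half contains the player first, producing a correct near-to-far drawing order with no sorting at all:

```c
#include <stddef.h>

/* Hypothetical, heavily simplified BSP node. */
typedef struct BSPNode {
    double x, y, dx, dy;          /* the splitting line: a point and a direction */
    struct BSPNode *front, *back; /* children; both NULL at a leaf */
    int subsector_id;             /* leaf payload: a convex region to draw */
} BSPNode;

extern void draw_subsector(int id);  /* stand-in for the actual renderer */

/* Which side of this node's splitting line is the point on?
 * The sign of the 2D cross product decides. */
static int on_front_side(const BSPNode *n, double px, double py)
{
    return (px - n->x) * n->dy - (py - n->y) * n->dx <= 0.0;
}

/* Walk the tree front to back relative to the player: the half-space
 * containing the player is always visited first, so nearer geometry
 * is handled before anything it might hide. */
void render_node(const BSPNode *n, double px, double py)
{
    if (n == NULL)
        return;
    if (n->front == NULL && n->back == NULL) {
        draw_subsector(n->subsector_id);
        return;
    }
    if (on_front_side(n, px, py)) {
        render_node(n->front, px, py);
        render_node(n->back, px, py);
    } else {
        render_node(n->back, px, py);
        render_node(n->front, px, py);
    }
}
```

The expensive geometric analysis happens exactly once per map; the walk above is all that runs per frame.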
Binary space partitioning allowed for another optimization opportunity that Carmack identified and pioneered: hidden surface removal. The tree generated by the engine, combined with the player’s current position, instantly reveals which faces are completely occluded by others, and the renderer skips drawing those faces entirely.
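One simplified way to picture it in C, using a stand-in for the engine’s real clipping structures rather than its actual code: since the tree walk delivers surfaces nearest-first, any screen column that has already been filled is guaranteed to hide whatever arrives later, so those later surfaces are skipped without ever being rasterized:

```c
#include <stdbool.h>
#include <string.h>

#define SCREEN_W 320

/* One flag per screen column: has something solid been drawn here yet? */
static bool column_filled[SCREEN_W];

void begin_frame(void)
{
    memset(column_filled, 0, sizeof(column_filled));
}

/* Draw a wall spanning screen columns [x_start, x_end]. Because walls
 * arrive front to back, a column that is already filled can only
 * occlude this wall; we skip it instead of drawing and overwriting. */
void draw_wall_span(int x_start, int x_end)
{
    for (int x = x_start; x <= x_end; x++) {
        if (column_filled[x])
            continue;             /* hidden surface: never rendered */
        /* ... rasterize this column of the wall ... */
        column_filled[x] = true;
    }
}
```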
Through these and other strategies, the developers crafted a game with resource requirements smaller than most of today’s webpages. To put it in perspective, the game’s data could fit 134 times on a standard vinyl record, and it’s even been run on a pregnancy test.
Reflecting on the development of “Doom,” the creators described it as a “year of madness” due to the number of new things they had to discover and implement to achieve their goal of creating a game that anyone, anywhere, could play. That same goal fostered a culture of hacking and playing with electronics to get them to run “Doom.” From obsolete ’80s radios to today’s supercomputers, nothing is safe from “Doom” and its monsters — and that in itself is a testament to the foresight and care put into its design.
While algorithms and optimizations within computer graphics are at a wonderful high point with Unreal Engine 5 and the insane things it brings to the table with Nanite, games themselves don’t seem to hit the mark. Gamers are left more disappointed with every release cycle full of unfinished products stuffed with bugs and performance issues.
For instance, “Cyberpunk 2077” disappointed fans with exceedingly subpar performance on older consoles. Sony removed the game from the PlayStation Store, and refunds were offered to anyone who had bought it. This horrendous player experience on devices only one generation old also brought multiple class-action lawsuits to CD Projekt Red’s door.
It has become the norm to put out unplayable, unequal experiences for PC players who don’t have the latest and greatest graphics cards and CPUs, hardware that costs thousands of dollars.
It seems the industry has forgotten its roots in making games with good code and design, and has resorted to throwing computing resources at a problem until it goes away. Players who can’t afford the steep hardware requirements are punished with a poor experience.
This issue also feeds the problem of resource scalping, where people buy up hundreds of graphics cards and CPUs to resell at a markup. That, along with manufacturer practices like planned obsolescence, directly contributes to appalling human rights abuses in the countries where the raw materials for this hardware are produced.
It also heavily contributes to the mounting electronic waste problem: users have little choice but to keep up, since ever-steeper hardware requirements have become an outright necessity for much of the software being written today.
In 1985, eight years before “Doom,” the creators of the Nintendo Entertainment System utilized tremendous foresight and ingenious design to make a device that would remain at the forefront of gaming technology for at least the next decade. They did this by leaving space in the device for the innovators after them who would solve problems they couldn’t imagine with solutions they anticipated.
Maybe it’s time for us to start moving forward by looking back into the past and taking pages out of the playbook written by the geniuses of that era.
Prarthik is a junior in Engineering.