
Retro Video Game & Old Computer Graphics Explained: How Old Video Games Were Made

Sunday, November 08, 2015


We are currently in the Eighth Generation of video game consoles, and there is no doubt that video game graphics have come a long way since 1972, when the First Generation of video game consoles started out with the advent of the Magnavox Odyssey. Home computers entered the market five years later in 1977, and just one year earlier, in 1976, the Second Generation of video game consoles got underway; its standout machine, the Atari 2600 (8-bit), arrived in 1977 and was arguably the most popular game console of the Second Generation.

In 1983 and 1989, the Nintendo Entertainment System and the TurboGrafx-16 Entertainment SuperSystem gave birth to the Third and Fourth Generations of game consoles, carrying 8-bit and 16-bit graphics respectively.


How Old Video Games Were Made
To understand how old video games were made, we must understand the challenges video game designers faced when it came to adding color to video games. Two factors they had to keep in mind were color depth and video RAM. Color depth (a.k.a. bit depth) is the number of bits used to represent the color of a single pixel, while video RAM (a.k.a. screen memory) is the memory used to store the image data shown on the screen.


To give you an idea of how much video RAM color required, if video game designers just wanted the colors black and white (1-bit color, or monochrome), that alone would require roughly 8 kilobytes of RAM just to store the information displayed on the screen.

If game designers were to add 4-bit color (16 colors) or even 8-bit color (256 colors), this would require roughly 32 kilobytes and 64 kilobytes of RAM respectively. You can imagine now how much of an issue video RAM was for video game designers back then. It may not sound like much today, but at the time 8-bit color (256 colors) would essentially use up 64 kilobytes of video RAM on its own, which is as much memory as many well-equipped home computers of the day had in total.
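
To put those figures in perspective, here is a rough back-of-the-envelope sketch in Python; the 320x200 screen size is just an illustrative assumption rather than any particular machine's display mode:

```python
# Back-of-the-envelope video RAM needed for a bitmapped display at various
# color depths. The 320x200 screen is an illustrative assumption; results are
# shown to the nearest thousand bytes to match the round "KB" figures above.

WIDTH, HEIGHT = 320, 200

def vram_bytes(bits_per_pixel):
    """Bytes needed to store one full screen at the given color depth."""
    return WIDTH * HEIGHT * bits_per_pixel // 8

for bpp, label in [(1, "1-bit (monochrome)"),
                   (4, "4-bit (16 colors)"),
                   (8, "8-bit (256 colors)")]:
    print(f"{label:20s} -> {vram_bytes(bpp):6d} bytes (~{vram_bytes(bpp) // 1000} KB)")
```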

Three Ways To Add More Color Without Increasing RAM
As video RAM alone would eat up all the memory space, leaving no room for the game code, video game designers had to figure out ways to add more color to the screen display without using up too much video RAM. They came up with three ways to overcome the video RAM usage issue: using color cells (as the Commodore 64 and the Nintendo Entertainment System did), using NTSC artifact coloring (as the Apple II and the Tandy Color Computer did), or using CPU-driven graphics, of which the Atari 2600 is a great example.

Color Cells 
The most popular method among early-1980s video game designers was to add color by using an additional 1K of RAM and defining color cells on the screen. These cells divided the screen up into areas of 8x8 pixels. Within each cell, video game designers were able to define a foreground and a background color.

Defining color cells was a great way to add color while keeping video RAM usage down, but it was limited: game designers could only have two colors per cell. Even so, using color cells designers were able to bring a palette of 16 colors to old video games. That said, they could not always put the colors exactly where they wanted them to go, which made coloring with cells very laborious. Realizing this now makes me appreciate how hard video game designers had to work back then to bring video games to life!
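
To make the memory arithmetic concrete, here is a small Python sketch of the color-cell idea; the 320x200 screen, the 8x8 cell size and the nibble-packed attribute byte are illustrative assumptions, not any one machine's exact memory map:

```python
# Illustrative color-cell scheme: a 1-bit bitmap plus one attribute byte per
# 8x8 cell holding a foreground and a background color (two 4-bit values).
# The layout is a simplified sketch, not a map of any specific machine.

WIDTH, HEIGHT, CELL = 320, 200, 8
CELLS_X, CELLS_Y = WIDTH // CELL, HEIGHT // CELL      # 40 x 25 cells

bitmap = bytearray(WIDTH * HEIGHT // 8)               # 8000 bytes, 1 bit per pixel
attributes = bytearray(CELLS_X * CELLS_Y)             # 1000 bytes: the "extra 1K"

def pixel_color(x, y):
    """Return the 4-bit color index shown at pixel (x, y)."""
    bit_index = y * WIDTH + x
    pixel_on = (bitmap[bit_index // 8] >> (7 - bit_index % 8)) & 1
    attr = attributes[(y // CELL) * CELLS_X + (x // CELL)]
    foreground, background = attr >> 4, attr & 0x0F   # only two colors per cell
    return foreground if pixel_on else background

print(f"bitmap: {len(bitmap)} bytes, attributes: {len(attributes)} bytes")
```

Notice that the attribute table comes to exactly 1,000 bytes, which is where the "additional 1K of RAM" mentioned above goes.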

Multi-color Mode and Hardware Sprite 
Video game designers at the time knew the limitations of using color cells, so they had to employ more flexible options to add color, such as a multi-color mode and hardware sprites. Multi-color mode (used by the Commodore 64 home computer) made the pixels twice as wide, which cut the horizontal resolution in half, yet consumed only around 9K of RAM. It allowed video game designers to use 4 colors per cell instead of 2.
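
As a sketch of how that trade-off works out, the Python snippet below follows a Commodore 64-like multi-color layout; the byte breakdown and the table of color sources are my assumptions about that layout, offered only as an illustration:

```python
# Multi-color mode sketch: pixels are two bits wide, so horizontal resolution
# halves (320 -> 160) while each pixel can pick one of 4 colors per cell.
# The memory figures follow a Commodore 64-like layout (approximate).

bitmap_bytes = 160 * 200 * 2 // 8      # 8000 bytes: 2 bits per double-wide pixel
screen_bytes = 40 * 25                 # 1000 bytes: per-cell color choices
print(bitmap_bytes + screen_bytes, "bytes, i.e. roughly 9K")

# Each 2-bit pixel value selects one of four color sources for its cell:
COLOR_SOURCES = {
    0b00: "shared background color",
    0b01: "upper nibble of the cell's screen RAM byte",
    0b10: "lower nibble of the cell's screen RAM byte",
    0b11: "cell's color RAM value",
}
```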

The other popular coloring option was the hardware sprite, which worked by allowing graphical objects to move independently of the game playfield. For instance, the Atari VCS's sprites, called players and missiles, were constructed from a single row of pixels displayed on a scan line; to produce a two-dimensional shape, the sprite's single-row bitmap was altered by software from one scanline to the next. The Commodore 64 had 8 hardware sprites, while the Nintendo Entertainment System had 64. The Mario character alone was made of 4 sprites.
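
Below is a toy Python sketch of that single-row trick; the five-row shape and the register name are made up purely for illustration, but the idea of rewriting one row of sprite data on every scanline is the one described above:

```python
# Sketch of how a 2600-style "player" sprite builds a 2-D shape: the hardware
# only holds one 8-pixel row at a time, so the game rewrites that row on every
# scanline the sprite covers. The shape below is a made-up example.

shape = [
    0b00111100,
    0b01111110,
    0b11011011,
    0b11111111,
    0b01100110,
]

player_register = 0                            # stand-in for the single-row sprite register

for scanline in range(len(shape)):
    player_register = shape[scanline]          # software reloads the row for this line
    row = format(player_register, "08b").replace("0", ".").replace("1", "#")
    print(row)                                 # the stacked rows form the 2-D sprite
```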


NTSC artifact coloring
NTSC artifact coloring was most notably used by the Tandy Color Computer (a.k.a. TRS-80 Color Computer), whose 256x192 pixel mode (on the Color Computer 2) could use artifact coloring to produce four colors (black, white, red and blue), at the cost of effective horizontal color resolution.

The Apple II also supported a 280x192 pixel High Resolution Graphics (HGR) mode. The actual pixels were white and did not carry color/hue or saturation information. Each bit, except the highest, of every memory byte on the HGR page could display a single dot if set (logical 1), so a single memory location could light up to seven consecutive pixels. A lit pixel in an even horizontal position would appear purple, while one in an odd position would appear green. If a single pixel was set, it would show in color on a color monitor. If two adjacent pixels were set, they would appear as a double-wide white pixel.

Similarly, if two adjacent pixels were off, they would appear as black. The trick to getting solid colors was to place pixels in an alternating on-off-on pattern; that is why you would frequently see "serrated" or "stripey" graphics on a monochrome display instead of a solid color. The effective color resolution was something close to 140x192.
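
Here is a rough Python sketch of that pairing rule; it assumes the simple case where each byte's high bit is clear (so the hues are purple and green), and it is a simplification of the real hardware behaviour:

```python
# Rough sketch of Apple II-style artifact coloring on a 280-pixel HGR line:
# a lone lit pixel takes a hue from its column parity, two adjacent lit pixels
# merge into white, and two dark pixels read as black. Hue names assume the
# purple/green case (high bit of each byte clear).

def artifact_colors(bits):
    """bits: list of 0/1 pixel values across one scanline."""
    colors = []
    for x in range(0, len(bits) - 1, 2):      # look at pixel pairs
        pair = (bits[x], bits[x + 1])
        if pair == (0, 0):
            colors.append("black")
        elif pair == (1, 1):
            colors.append("white")
        elif pair == (1, 0):
            colors.append("purple")           # lit pixel in an even column
        else:
            colors.append("green")            # lit pixel in an odd column
    return colors                             # ~140 color pixels per 280-pixel line

print(artifact_colors([1, 0, 1, 0, 1, 1, 0, 0]))  # ['purple', 'purple', 'white', 'black']
```

The on-off-on-off input in the example collapses into two solid purple color pixels, which is exactly the "alternating pattern gives a solid color" trick described above, and pairing pixels this way is why the effective color resolution drops to roughly 140 across.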

CPU Driven Graphics
CPU-driven graphics was another technique used by video game designers, and the visuals it produced could look quite different depending on whether the machine was attached to a monochrome or a color screen. The Apple II and Atari 2600 pioneered this technique, which had its drawbacks. For instance, an Apple II attached to a color screen would show white text on a black background with rainbow-like color fringes because of the pixel placement. The colors blue and green were especially tricky to align next to each other on the screen, which made some games' images bleed into each other.

As for some Atari 2600 games displaying black bars on the left side of the display, this was not due to a CPU bottleneck but to the Atari 2600's dedicated graphics chip (called the TIA). Because the TIA drew the screen one scanline at a time, rather than from a full-screen bitmap, it created the black bars. The CPU on the Atari 2600 simply loaded the relevant data into the TIA registers for it to display, just as the CPUs of the Nintendo Entertainment System and Commodore 64 did.
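
To illustrate the scanline-by-scanline idea, here is a toy Python loop; the register names and the per-line data are invented for the sketch and are not real Atari 2600 code:

```python
# Toy sketch of a scanline-driven display: with no frame buffer, the CPU must
# reload the graphics chip's registers with the right data before each scanline
# is drawn. Names and data are illustrative only.

SCANLINES = 192

tia_registers = {"playfield": 0, "player0": 0}      # stand-ins for TIA registers

def game_graphics_for(line):
    """Whatever the game wants on this scanline (made-up pattern)."""
    return {"playfield": line & 0xFF, "player0": (line * 3) & 0xFF}

frame = []
for line in range(SCANLINES):
    tia_registers.update(game_graphics_for(line))   # CPU work, repeated every line
    frame.append(dict(tia_registers))                # the chip then draws the line

print(f"drew {len(frame)} scanlines, one register load per line")
```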
