
A Critical History of Computer Graphics and Animation

Section 15: Hardware advancements



For a little more than a decade at the beginning of computer graphics history, images were created and displayed as vectors - straight lines connecting points on the screen of a CRT. These displays were "refreshed" from a display list, a portion of memory that the display controller accessed to determine what to draw. A change in the image was accomplished by a change in the contents of the display list, which could be done fairly rapidly.

The "democratization of computer graphics", as one author put it, was made possible by the introduction of raster graphics. This technology used a standard television CRT that utilized a video controller to scan the image from top to bottom, turning on or off the individual addressable points on the screen, and providing information as to the color of the points. The video controller obtained its information from a memory array whose contents represented in a one-to-one match the points on the screen. This memory was called a frame buffer, and was significantly larger than the display list memory of vector devices. Each point on a line in a raster system had to be stored, not just the endpoints of the line required by the vector displays. A standard TV had approximately 250,000 addressable points (called pixels) on the screen, so the memory was rather large, particularly when color was included (which increased the size by a factor of three.)

As was mentioned earlier, core memory was the dominant memory technology through the 1960s. There were some early implementations of a frame buffer built from core arrays, but they were bulky and expensive. The first breakthrough in affordable memory technology came with the introduction of integrated circuits. At this point, some experimental "shift-register" frame buffers were introduced. Each location on the screen was represented in the shift register as a bit. A refresh cycle for a scan line on the raster device was accomplished by shifting all the bits in the register and reading the one at the end, placing it back at the beginning, and shifting the bits again. The intensity of a screen element could only be changed when its bit came up at the end of the register, which resulted in a delay for screen updates.
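That delayed-update behavior can be illustrated with a small sketch in C (the scan-line length and the helper names are invented for illustration; they do not model any particular shift-register hardware):

```c
#include <stdio.h>
#include <string.h>

#define LINE_PIXELS 512           /* hypothetical scan-line length */

/* One scan line held as a circular "shift register" of 1-bit pixels. */
static unsigned char shift_reg[LINE_PIXELS];

/* Refresh one scan line: read the bit at the end of the register, send it
 * to the CRT beam, then recirculate it to the beginning and shift again. */
static void refresh_scanline(void (*beam_write)(int on))
{
    for (int i = 0; i < LINE_PIXELS; i++) {
        unsigned char bit = shift_reg[0];          /* bit at the "end" */
        beam_write(bit);                           /* drive the beam on or off */
        memmove(shift_reg, shift_reg + 1, LINE_PIXELS - 1);
        shift_reg[LINE_PIXELS - 1] = bit;          /* place it back at the start */
    }
}

/* A pixel can only be changed when its bit reaches the end of the register,
 * which is why screen updates on such hardware were delayed. */
static void dummy_beam(int on) { putchar(on ? '#' : '.'); }

int main(void)
{
    shift_reg[10] = 1;                /* light a single pixel */
    refresh_scanline(dummy_beam);     /* one full refresh of the line */
    putchar('\n');
    return 0;
}
```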

The real contribution came with the introduction of the Random Access Memory (RAM) chip a few years later. Instead of the sequential access required by the shift register, RAM allowed the computer or display processor to access any bit at any time - randomly. A 1K (1024) bit RAM chip was available in 1970, making it affordable to construct a frame buffer that could hold all of the screen data for a TV image, and the image could be updated rapidly because each bit could be accessed directly. As memory became cheaper, larger frame buffers allowed displays as large as 1024x1024, increasing the complexity and realism of the image on the screen. The depth (the number of bits used to represent the intensity and color of each pixel) increased from 1 to 8 to accommodate color, and then to 24 (8 bits for each of the R, G and B values) and upwards.
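A back-of-the-envelope calculation makes it clear why depth drove the cost so quickly. The sketch below (in C) uses a 512x486 raster as a stand-in for the roughly 250,000-pixel TV image mentioned above; the exact dimensions are an assumption for illustration only:

```c
#include <stdio.h>

/* Rough frame buffer sizes for the configurations mentioned in the text. */
static long fb_bytes(long width, long height, int bits_per_pixel)
{
    return width * height * bits_per_pixel / 8;
}

int main(void)
{
    /* ~250,000-pixel TV raster at 1 bit per pixel */
    printf("TV raster, 1-bit:  %ld bytes\n", fb_bytes(512, 486, 1));
    /* 1024x1024 at 8 bits per pixel (indexed color) */
    printf("1024x1024, 8-bit:  %ld bytes\n", fb_bytes(1024, 1024, 8));
    /* 1024x1024 at 24 bits per pixel (8 bits each for R, G and B) */
    printf("1024x1024, 24-bit: %ld bytes\n", fb_bytes(1024, 1024, 24));
    return 0;
}
```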

Because of the cost of the frame buffer at this time, a user who needed the image complexity of the 24-bit version had to look for other approaches. (24-bit color is sometimes referred to as "true" color; with 256 levels each of red, green and blue, a total palette of 16.7 million colors is possible.) One approach was to use an 8-bit frame buffer and a "look-up table".

In an 8-bit buffer, 8 bits of data represented all the information for a pixel. A typical configuration allowed 3 bits each for the red and green channels and 2 for blue; thus red and green could each have 8 levels (2x2x2), and blue had 4 levels (2x2). Hence one could represent a total palette of 256 colors, which is only minimally adequate for realistic color. Better realism is achieved through the use of an "adaptive" palette of colors. In this case the image is analyzed and the best-fitting 256 colors are loaded into a look-up table (LUT), where each color is associated with an 8-bit address in the table. Rather than representing the intensity/color directly, the 8-bit element of the frame buffer records the address at which the intensity/color is stored in the table. To refresh the screen, the address is read from the frame buffer, the processor pulls the intensity value from the table at that address, and the image is refreshed accordingly.
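The indirection through the table can be sketched in a few lines of C (the 640x480 resolution, the table entry, and the function names are invented for illustration):

```c
#include <stdio.h>

/* A hypothetical 256-entry color look-up table: each frame buffer element
 * is an 8-bit index, and the table holds the actual 24-bit RGB value. */
struct rgb { unsigned char r, g, b; };

static struct rgb lut[256];                       /* the look-up table */
static unsigned char frame_buffer[640 * 480];     /* 8 bits per pixel */

/* During refresh the 8-bit value is used as an address into the table
 * rather than as a color itself. */
static struct rgb pixel_color(int x, int y)
{
    unsigned char index = frame_buffer[y * 640 + x];
    return lut[index];
}

int main(void)
{
    lut[17] = (struct rgb){ 200, 120, 40 };   /* entry 17 holds an orange */
    frame_buffer[0] = 17;                     /* pixel (0,0) points at it */
    struct rgb c = pixel_color(0, 0);
    printf("pixel (0,0) -> R=%d G=%d B=%d\n", c.r, c.g, c.b);
    return 0;
}
```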


Still, frame buffer designs were constrained by the high bandwidth required to refresh the entire screen. Software and hardware modifications made this less of an issue, with implementations like the BITBLT (block transfer of bits) operator, which allowed larger portions of the image to be updated as a unit, and word organizations in which the buffer was arranged into words containing multiple pixels. Later, frame buffers were enhanced with additional hardware. The z-buffer was introduced by Catmull in 1974; hardware implementations included RAM to represent not only the RGB color of each pixel but also its z depth from the 3D geometry, to be used for update comparisons. Hardware, such as the enhanced frame buffer described by Whitted, was added to the memory to perform operations such as the z-comparison separately from the CPU.
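A software version of that z-comparison is easy to sketch (in C; the buffer size and colors are invented, and real hardware performed this per pixel in the display memory rather than in a CPU loop like this one):

```c
#include <float.h>
#include <stdio.h>

#define W 320
#define H 240

/* A hypothetical software z-buffer: alongside the color of each pixel,
 * the depth of the geometry that produced it is stored, so a new
 * fragment is written only if it is closer to the viewer. */
static unsigned int color_buf[W * H];
static float        z_buf[W * H];

static void clear_buffers(void)
{
    for (int i = 0; i < W * H; i++) {
        color_buf[i] = 0;            /* background color */
        z_buf[i] = FLT_MAX;          /* "infinitely far away" */
    }
}

/* The comparison that hardware like the enhanced frame buffer performed
 * without involving the CPU. */
static void write_fragment(int x, int y, float z, unsigned int rgb)
{
    int i = y * W + x;
    if (z < z_buf[i]) {              /* closer than what is already stored? */
        z_buf[i] = z;
        color_buf[i] = rgb;
    }
}

int main(void)
{
    clear_buffers();
    write_fragment(10, 10, 5.0f, 0xFF0000);   /* red surface at depth 5 */
    write_fragment(10, 10, 9.0f, 0x0000FF);   /* blue surface behind it: rejected */
    printf("pixel (10,10) = %06X\n", color_buf[10 * W + 10]);
    return 0;
}
```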

The frame buffer was developed and expanded in both commercial and proprietary environments from the late 60s through the early 80s. Bell Labs developed a 3-bit system in 1969; Dick Shoup developed an 8-bit frame buffer at Xerox PARC for the SuperPaint system; NYIT developed the first 24-bit RGB buffers; a flexible system was designed at North Carolina State University and later modified at Ohio State; companies like Evans and Sutherland, Genisco and Raster Tech developed commercial versions; and the NCSU buffer evolved into a commercial programmable display developed by Nick England, the Ikonas system, which later became the Adage system. In 1984, Loren Carpenter introduced the A-buffer, an antialiased extension of the frame buffer that allowed images to be composited efficiently with antialiasing.

Most installations used the same configuration during this time: a mainframe or minicomputer was accessed by multiple users, each with a terminal but sharing a frame buffer that was connected to the computer over a bus interface. The concept of the single-user workstation, configured with its own internal frame buffer, was still on the horizon.

Catmull, E. A Subdivision Algorithm for Computer Display of Curved Surfaces. Ph.D. Thesis, Report UTEC-CSc-74-133, Computer Science Department, University of Utah, Salt Lake City, UT, 1974.

Whitted, Turner. "Hardware Enhanced 3-D Raster Display System." Proceedings of the 7th Man-Computer Communications Conference, 1981.

For a discussion of paint systems and the development of hardware support, see Alvy Ray Smith's article "Digital Paint Systems: An Anecdotal and Historical Overview" in the April 2001 issue of the IEEE Annals of the History of Computing.

Fuchs, Henry. "Distributing a Visible Surface Algorithm Over Multiple Processors." Proceedings of the ACM National Conference, 1977.


SuperPaint demo - 1977
(no sound)


Alvy Ray Smith's Vidbits

One of the most important contributions in the area of display hardware is attributed to Jim Clark of Stanford in 1982. His idea, called the Geometry Engine, was to create a collection of components in a VLSI processor that would accomplish the main operations that were required in the image synthesis pipeline: matrix transforms, clipping, and the scaling operations that provided the transformation to view space. Clark attempted to shop his design around to computer companies, and finding no takers, he and colleagues at Stanford started their own company, Silicon Graphics Inc.
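The three stages the Geometry Engine put into silicon can be approximated in software with a short sketch (in C; the projection matrix, the trivial clip test, and the screen size are simplified illustrations, not Clark's actual design):

```c
#include <stdio.h>

typedef struct { float x, y, z, w; } vec4;

/* Stage 1: transform a homogeneous point by a 4x4 matrix. */
static vec4 transform(const float m[4][4], vec4 v)
{
    vec4 r;
    r.x = m[0][0]*v.x + m[0][1]*v.y + m[0][2]*v.z + m[0][3]*v.w;
    r.y = m[1][0]*v.x + m[1][1]*v.y + m[1][2]*v.z + m[1][3]*v.w;
    r.z = m[2][0]*v.x + m[2][1]*v.y + m[2][2]*v.z + m[2][3]*v.w;
    r.w = m[3][0]*v.x + m[3][1]*v.y + m[3][2]*v.z + m[3][3]*v.w;
    return r;
}

/* Stage 2: a trivial clip test against the canonical view volume. */
static int inside(vec4 v)
{
    return -v.w <= v.x && v.x <= v.w &&
           -v.w <= v.y && v.y <= v.w &&
           -v.w <= v.z && v.z <= v.w;
}

/* Stage 3: perspective divide and the scale to screen (view) space. */
static void to_screen(vec4 v, int width, int height, float *sx, float *sy)
{
    *sx = (v.x / v.w * 0.5f + 0.5f) * width;
    *sy = (v.y / v.w * 0.5f + 0.5f) * height;
}

int main(void)
{
    /* A simple perspective matrix, invented for illustration. */
    float proj[4][4] = {
        { 1, 0,  0,  0 },
        { 0, 1,  0,  0 },
        { 0, 0, -1, -2 },
        { 0, 0, -1,  0 },
    };
    vec4 p = transform(proj, (vec4){ 0.5f, 0.25f, -4.0f, 1.0f });
    if (inside(p)) {
        float sx, sy;
        to_screen(p, 1024, 1024, &sx, &sy);
        printf("screen position: (%.1f, %.1f)\n", sx, sy);
    }
    return 0;
}
```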

SGI's first product was the IRIS (Integrated Raster Imaging System). It used an 8 MHz M68000 processor with up to 2 MB of memory, a custom 1024x1024 frame buffer, and the Geometry Engine to give the workstation its impressive image-generation power.

At about the same time, Sun Microsystems was founded. Sun also introduced workstations with an embedded frame buffer: the CG1, CG2 and CG3 boards were used in the Sun-1, Sun-2 and Sun-3 workstations. (The Apollo workstations also provided single-user, dedicated frame buffer technology.) Sun later used an add-on accelerator board made by Trancept Systems for the Sun-3 and Sun-4 workstations. According to Nick England, one of the designers of the TAAC board:

In the Spring of 1987 we introduced the TAAC-1 product for Sun Microsystems workstations. The TAAC-1 consisted of two large PC boards, one full of video RAM, the other full of a micro-programmed wide-instruction-word (200 bits) processor optimized for graphics and imaging operations. The TAAC-1 was plugged into and memory mapped onto the Sun's VME bus.

Clark, James H. "The Geometry Engine: A VLSI Geometry System for Graphics." Computer Graphics (SIGGRAPH 82 Proceedings) 16(3), July 1982.


Graphics accelerator boards developed at this time included hardware acceleration for many of the image synthesis operations: hardware pan and zoom, antialiasing, alpha channels for compositing, scan conversion, and so on. The concept of adding a coprocessor to take the graphics operations away from the CPU was instrumental in bringing complex graphics images to the masses. The early frame buffers and the later accelerator boards associated with them are now embodied in the graphics cards in today's computers, such as those manufactured by nVIDIA, 3Dlabs and ATI.

HowStuffWorks.com has a reasonable explanation of the emerging role of these graphics coprocessors:

How Graphics Boards Help

Since the early days of personal computers, most graphics boards have been translators, taking the fully developed image created by the computer's CPU and translating it into the electrical impulses required to drive the computer's monitor. This approach works, but all of the processing for the image is done by the CPU -- along with all the processing for the sound, player input (for games) and the interrupts for the system. Because of everything the computer must do to make modern 3-D games and multi-media presentations happen, it’s easy for even the fastest modern processors to become overworked and unable to serve the various requirements of the software in real time. It’s here that the graphics co-processor helps: it splits the work with the CPU so that the total multi-media experience can move at an acceptable speed.

As we’ve seen, the first step in building a 3-D digital image is creating a wireframe world of triangles and polygons. The wireframe world is then transformed from the three-dimensional mathematical world into a set of patterns that will display on a 2-D screen. The transformed image is then covered with surfaces, or rendered, lit from some number of sources, and finally translated into the patterns that display on a monitor’s screen. The most common graphics co-processors in the current generation of graphics display boards, however, take the task of rendering away from the CPU after the wireframe has been created and transformed into a 2-D set of polygons. The graphics co-processor found in boards like the VooDoo3 and TNT2 Ultra takes over from the CPU at this stage. This is an important step, but graphics processors on the cutting edge of technology are designed to relieve the CPU at even earlier points in the process.

One approach to taking more responsibility from the CPU is done by the GeForce 256 from nVIDIA (the first graphics processing unit, or GPU). In addition to the rendering done by earlier-generation boards, the GeForce 256 adds transforming the wireframe models from 3-D mathematics space to 2-D display space as well as the work needed to show lighting. Since both transforms and ray-tracing involve serious floating point mathematics (mathematics that involve fractions, called “floating point” because the decimal point can move as needed to provide high precision), these tasks take a serious processing burden from the CPU. And because the graphics processor doesn’t have to cope with many of the tasks expected of the CPU, it can be designed to do those mathematical tasks very quickly.

The new Voodoo 5 from 3dfx takes over another set of tasks from the CPU. 3dfx calls the technology the T-buffer. This technology focuses on improving the rendering process rather than adding additional tasks to the processor. The T-buffer is designed to improve anti-aliasing by rendering up to four copies of the same image, each slightly offset from the others, then combining them to slightly blur the edges of objects and defeat the “jaggies” that can plague computer-generated images. The same technique is used to generate motion-blur, blurred shadows and depth-of-field focus blurring. All of these produce smoother-looking, more realistic images that graphics designers want. The object of the Voodoo 5 design is to do full-screen anti-aliasing while still maintaining fast frame rates.
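3dfx's T-buffer itself was proprietary hardware, but the general idea it describes - rendering several slightly offset copies of the scene and averaging them - can be sketched in a few lines of C (the tiny 8x8 "image", the jitter offsets, and the hard-edged test scene are invented for illustration):

```c
#include <stdio.h>

#define W 8
#define H 8
#define PASSES 4   /* the description above mentions up to four copies */

/* render_pass() stands in for a full rendering of the scene with the
 * sample position jittered by (dx, dy); here it just draws a diagonal
 * edge so the effect of averaging is visible. */
static void render_pass(float img[H][W], float dx, float dy)
{
    for (int y = 0; y < H; y++)
        for (int x = 0; x < W; x++)
            img[y][x] = ((x + dx) > (y + dy)) ? 1.0f : 0.0f;
}

int main(void)
{
    static const float jitter[PASSES][2] = {
        { 0.25f, 0.25f }, { 0.75f, 0.25f }, { 0.25f, 0.75f }, { 0.75f, 0.75f }
    };
    static float pass[H][W], accum[H][W];

    /* Render the scene several times, each slightly offset, and average
     * the results, softening the "jaggies" along the edge. */
    for (int p = 0; p < PASSES; p++) {
        render_pass(pass, jitter[p][0], jitter[p][1]);
        for (int y = 0; y < H; y++)
            for (int x = 0; x < W; x++)
                accum[y][x] += pass[y][x] / PASSES;
    }

    for (int y = 0; y < H; y++) {
        for (int x = 0; x < W; x++)
            printf("%4.2f ", accum[y][x]);   /* fractional values mark the softened edge */
        printf("\n");
    }
    return 0;
}
```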

The technical definition of a GPU is "a single chip processor with integrated transform, lighting, triangle setup/clipping, and rendering engines that is capable of processing a minimum of 10 million polygons per second." (nVIDIA)

The proliferation of highly complex graphics processors added a significant amount of work to the image-making process: programming to take advantage of the hardware required knowledge of the specific commands of each card. Standardization therefore became paramount as these technologies were introduced. One of the most important contributions in this area was the graphics API.

The Application Programming Interface (API) is an older computer science technology that facilitates exchanging messages or data between two or more different software applications. The API is the virtual interface between two interworking software functions, such as a word processor and a spreadsheet. This technology has been expanded from simple subroutine calls to include features that provide for interoperability and system modifiability in support of the requirement for data sharing between multiple applications.

An API is a set of rules for writing function or subroutine calls that access functions in a library. Programs that use these rules in their API calls can communicate with any other program that uses the same API, regardless of the other program's specifics. Graphics APIs essentially provide access to the rendering hardware embedded in the graphics card. Early graphics APIs included X, PHIGS, PHIGS+ and GL. In 1992, SGI introduced OpenGL, which has become the most widely used API in the industry. Other approaches include Direct3D and vendor-specific interfaces, like Quartz for the Macintosh and the Windows graphics APIs.
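As a concrete illustration of what "access to the rendering hardware" looks like from the programmer's side, here is a minimal legacy OpenGL sketch in C using GLUT for windowing (the window title and colors are arbitrary, the compile command is an assumption about a typical Unix setup, and modern OpenGL replaces this fixed-function style with buffers and shaders):

```c
/* Compile with something like: cc triangle.c -lGL -lglut */
#include <GL/glut.h>

static void display(void)
{
    glClear(GL_COLOR_BUFFER_BIT);
    glBegin(GL_TRIANGLES);                /* the driver/card rasterizes the triangle */
    glColor3f(1.0f, 0.0f, 0.0f); glVertex2f(-0.5f, -0.5f);
    glColor3f(0.0f, 1.0f, 0.0f); glVertex2f( 0.5f, -0.5f);
    glColor3f(0.0f, 0.0f, 1.0f); glVertex2f( 0.0f,  0.5f);
    glEnd();
    glutSwapBuffers();                    /* present the finished frame */
}

int main(int argc, char **argv)
{
    glutInit(&argc, argv);
    glutInitDisplayMode(GLUT_DOUBLE | GLUT_RGB);
    glutCreateWindow("OpenGL triangle");
    glutDisplayFunc(display);
    glutMainLoop();
    return 0;
}
```

The same program runs unchanged on any card with an OpenGL driver, which is precisely the point of a standardized graphics API.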


Certainly some of the big push toward better and faster graphics image generation came from the video and arcade game industry. Graphics moved from the primitive images of Higinbotham's 1958 oscilloscope-based Tennis for Two and Steve Russell's Spacewar on the PDP-1 to today's realistic environments found in Myst and Riven, EA sports games, and games for systems like the Xbox.

The first commercial home video game console was the Magnavox Odyssey, designed in 1968 and released in 1972. It was created by Ralph Baer, who worked for the defense-electronics company Sanders Associates; the design was licensed to Magnavox and for a time was Sanders' most profitable line, even though many in the company looked down on game development. (The Odyssey's table-tennis game was the direct inspiration for Atari's Pong.) The Odyssey was built from discrete logic, with games selected by removable circuit cards that inserted into a slot similar to a cartridge slot. The system also came with plastic overlays that gamers could put on their TV screen to simulate different "games," though only two TV sizes were supported. It also came with plastic game tokens and score sheets to help keep score, much like more traditional board games.

The next technology to hit the market was the dedicated (nonprogrammable) video game console, often inaccurately called "analog" but actually built from discrete logic. An example of this kind of console was the home Pong game sold by Atari (Nolan Bushnell's company - Bushnell was a University of Utah graduate). The Atari unit also connected to a separate television set.

Following the success of Pong, Atari introduced the 2600 VCS in 1977, which is considered part of the first generation of "8-bit" programmable systems; home versions of popular arcade games such as Space Invaders and Pac-Man later drove its sales. Other similar systems were the Odyssey2 (1978) (known in Europe as the Philips G7000), the Channel F (1976) and the Astrocade (1977). The 2600 used the MOS Technology 6507 CPU with 128 bytes of RAM and a TIA video processor. Memory was so expensive that there was simply no way to include a frame buffer with sufficient resolution. Instead, Atari allocated enough memory for only one line of the display at a time. When the TV completed drawing a line, the next line was stuffed into the TIA while the TV was resetting for the next line. (This became known to 2600 programmers as "racing the beam".)
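A heavily simplified picture of that one-line-at-a-time scheme, as a C sketch (the "registers", the 160x192 playfield dimensions, and the moving-dot drawing logic are illustrative only and do not model the real TIA):

```c
#include <stdio.h>

#define LINE_WIDTH 160   /* roughly the 2600's playfield resolution */
#define NUM_LINES  192

/* The console has no frame buffer, only registers describing a single
 * scan line, so the program must rewrite them during each horizontal
 * blank while the beam resets for the next line. */
static unsigned char line_registers[LINE_WIDTH];

static void program_fills_line(int line)
{
    /* The game code recomputes this line's contents on the fly. */
    for (int x = 0; x < LINE_WIDTH; x++)
        line_registers[x] = (x == line) ? 1 : 0;   /* a dot drifting right */
}

static void beam_draws_line(void)
{
    for (int x = 0; x < LINE_WIDTH; x++)
        putchar(line_registers[x] ? '#' : '.');
    putchar('\n');
}

int main(void)
{
    for (int line = 0; line < NUM_LINES; line++) {
        program_fills_line(line);   /* done during horizontal blank */
        beam_draws_line();          /* the TV scans the line out */
    }
    return 0;
}
```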

The 2600 proved to be one of the most difficult machines in the world to program. Nevertheless, it was that same complexity that made the system incredibly flexible, and once authors discovered the "tricks", games soon began to gain power far beyond what the original designers had ever imagined.

Work proceeded on more sophisticated systems, which can be categorized as the second "8-bit" generation. Consoles in this category included the Intellivision (1980), ColecoVision (1982), Philips G7400 (1983) (this was to be released in the USA as the Odyssey3, but changing market conditions prevented its release), Arcadia 2001 (1982), Vectrex (1982) and the Atari 5200 (1982). The Intellivision used a General Instrument CP1610 16-bit CPU running at 894.886 kHz with 371 bytes of RAM; 16 colors were available at a resolution of 160 pixels wide by 196 pixels high (5x4 TV pixels made one Intellivision pixel). The ColecoVision's graphics were supplied by a Texas Instruments TMS9928A with 256x192 resolution, 32 sprites, and 16 colors; it used a Z80A CPU at 3.58 MHz.

This generation was followed by the collapse of the North American video game market in 1983-84.


Higinbotham's Tennis for Two (left) and Russell's Spacewar


Screenshot from Cyan's Riven


Frames from Pong


Magnavox Odyssey


Overlays for the Odyssey that attached to the TV screen.

The market was dismal for several years and didn't begin to recover until the introduction of the third generation of "8-bit" game systems. These included the Nintendo Entertainment System (released in Japan in 1983 as the Family Computer, or Famicom), the Sega Master System (1986) and the Atari 7800 (1986). At first, the Famicom was supposed to be a 16-bit machine with a disk drive, but component prices made it too expensive, so Nintendo built an 8-bit system instead. The Family Computer was intentionally made to look like a toy, and its design was used only in the Japanese version of the console. It used the Nintendo 2A03 8-bit processor, based on the MOS Technology 6502 core and running at 1.79 MHz, with 2 KB of RAM. Video was handled by a PPU with 2 KB of tile and attribute RAM, 256 bytes of sprite position RAM ("OAM"), and 28 bytes of palette RAM (allowing for selection of a background color), with a resolution of 256x240 pixels. Donkey Kong was one of the first games on the system. The Sega Master System used an 8-bit Zilog Z80 CPU at 3.6 MHz and a VDP (Video Display Processor) derived from the Texas Instruments TMS9918.

The fourth-generation machines really started the rebirth of the industry. The most significant early entry, in 1989, was the 16-bit Sega Genesis. The Genesis initially competed against the 8-bit Nintendo Entertainment System; although it had superior graphics and sound, it had a hard time overcoming Nintendo's ubiquitous presence in the consumer's home. The Genesis used a 16-bit Motorola M68000 processor running at 7.61 MHz and a VDP (Video Display Processor) dedicated to playfield and sprite control, giving a 512-color palette, 80 sprites and 320x224 resolution. Other fourth-generation machines were the Super Nintendo Entertainment System and the Philips CD-i.

Fifth-generation systems included the Apple Pippin (1996) (a commercial failure), the Atari Jaguar (1993) (marketed as "64-bit"), the Playdia (1994), the 3DO (1993), the Sega Saturn (1994-1998), and the cream of the crop, the Sony PlayStation (1994) and the Nintendo 64 (1996-2002) (also marketed as "64-bit").

The N64 specifications included a 93.75 MHz MIPS 64-bit RISC CPU (a customized R4000-series part) along with the RCP (Reality Coprocessor), which mapped its hardware registers to memory addresses and contained the RDP (the pixel-drawing processor) with Z-buffering, anti-aliasing, and realistic texture mapping (tri-linear filtered MIP-map interpolation, perspective correction, and environment mapping). In comparison, the PlayStation CPU was the R3000A 32-bit RISC chip running at 33.8688 MHz, manufactured by LSI Logic Corp. with technology licensed from SGI. The same chip also contained the Geometry Transfer Engine and the Data Decompression Engine. It could render 1.5 million flat-shaded polygons per second, or 500,000 texture-mapped and light-sourced polygons per second, with 16.7 million colors, Gouraud shading, and texture mapping.

The PlayStation console angered Nintendo, which subsequently filed a lawsuit claiming breach of contract and attempted in federal court to obtain an injunction against the release of the PlayStation; the federal judge presiding over the case denied the injunction. (See note below.)

The current models are the sixth-generation systems. They include the Sega Dreamcast, Sony PlayStation 2, Nuon, Microsoft Xbox and Nintendo GameCube. Their specs are included below for comparison with the preceding generation of machines.


Nintendo Family Computer (Famicom)


Sega Genesis


Nintendo 64


Nintendo GameCube

For more information on these video games, with listings of the titles associated with each, go to

http://en2.wikipedia.org/wiki/History_of_the_video_game

For a detailed history of video and computer games, go to

http://www.emuunlim.com/doteaters/

PlayStation 2

• CPU: 128-bit "Emotion Engine" clocked at 300 MHz, with 32 MB of Direct Rambus memory and an FPU co-processor (Floating Point Multiply-Accumulator x 1, Floating Point Divider x 1); floating-point performance is 6.2 GFLOPS
• 3D CG Geometric Transformation: 66 million polygons per second
• Graphics: "Graphics Synthesizer" clocked at 150 MHz
• Pixel Configuration: RGB:Alpha:Z-Buffer (24:8:32)
• Maximum Polygon Rate: 75 million polygons per second

Xbox

• CPU: 733 MHz Pentium III with 64 MB RAM
• Graphics Processor: 250 MHz XGPU custom chip developed by Microsoft and nVIDIA
• Polygon Performance: 125 M/sec
• Micropolygons/particles per second: 125 M/sec
• Simultaneous Textures: 4
• Pixel Fill Rate - No Texture: 4.0 G/sec (anti-aliased)
• Pixel Fill Rate - 1 Texture: 4.0 G/sec (anti-aliased)
• Full-Scene Anti-Aliasing

GameCube

• CPU: PowerPC 750CXe at 485 MHz with 32-bit integer and 64-bit floating-point units
• Embedded 2 MB Frame Buffer
• Pixel Depth: 24-bit RGB / RGBA and 24-bit Z-buffer
• Image Processing Functions: Fog, subpixel anti-aliasing, 8 hardware lights, alpha blending, virtual texture design, multi-texturing, bump mapping, environment mapping, MIP mapping, bilinear filtering, trilinear filtering, and anisotropic filtering

Sega Dreamcast

• CPU: SH-4 RISC CPU with a 128-bit graphics computational engine built in (operating frequency: 206 MHz; 360 MIPS / 1.4 GFLOPS)
• Graphics Engine: PowerVR2 DC (capable of drawing more than 3 million polygons per second)

Note: Nintendo was known in the 1980s for its draconian licensing conditions and aggressive prosecution of "unlicensed" game producers. One of the sore points among developers of the period was the fee they had to pay Nintendo to have their games licensed, which meant that Nintendo tested them and produced them at its own facilities (either as part of the fee or for an additional cost). Another sore point was that Nintendo decided how many cartridges of each game would be manufactured. The company's virtual monopoly in the console market at the time allowed it to impose essentially any rules it chose, and the rules were always meant to increase profit, quality, or both. Several companies began producing games while refusing to pay the licensing fee (or after being refused by Nintendo), and all were eventually forced out of business or out of production by the legal fees and court costs of extended lawsuits brought against them. The one exception was Color Dreams, whose religious-themed games, released under the subsidiary name Wisdom Tree, went unchallenged because Nintendo feared a public backlash. Companies that made unlicensed games include (Nintendo did not sue every one of these companies):

• Active Enterprises – only two games
• American Game Cartridges – several games
• American Video Entertainment – several small companies rolled into one
• Camerica
• Color Dreams
• S.E.I. – one game: Impossible Mission 2
• Tengen – the most popular of the unlicensed companies; many games
• Wisdom Tree – was not sued due to religious themes in games



Next: The GUI and the personal computer
