
The history of video card development

Perhaps no other computer component is the object of such devotion as the PC graphics adapter. The adherents of this cult are the countless computer gamers who will, without hesitation, spend a considerable sum on the next upgrade for the sake of a smooth, beautiful picture. That money continuously fuels the development of ever newer and more powerful adapters, and these devices evolve much faster than central processors. Yet only 30 years ago no one could have imagined such a thing. Let's go back to those times and see how it all began.

No graphics

The first graphics card for the PC is considered to be the MDA (Monochrome Display Adapter), which shipped with the famous IBM PC (1981), the founder of the PC-compatible family. Unlike its predecessors, which were integrated into the computer's main board, the IBM MDA was built on its own board and installed in a slot of the universal XT bus. In essence, it was a simple video controller that displayed the contents of video memory on the screen. Even the RAMDAC, obligatory in later adapters, was absent, for the simple reason that the signal the MDA sent to the monitor was digital. Besides the video controller chip itself, the MDA board carried 4 KB of video memory, a ROM chip holding the font, and a clock generator.

Interestingly, this first video adapter for the IBM PC was purely text-based, i.e. it had no graphics mode at all. Yet most personal computers of those years could work with graphics. Why such disregard for graphics on IBM's part? It was a matter of positioning. In those years, a computer's ability to "draw" on the screen was strongly associated with games and other frivolous pursuits, and from the company's point of view a business computer had no need for any of that.

What could the MDA do? Quite a lot for its time. It displayed 25 lines of 80 characters each, with every character occupying a 9x14 pixel cell. In other words, it offered an effective resolution of 720x350 pixels, so the text it displayed had a sharpness its competitors could not match. In addition, each character could carry one of five attributes: regular, underlined, bright, blinking, or inverse. Naturally, the MDA was used exclusively with monochrome (single-color) monitors. Another feature of the adapter was an on-board printer port, which saved users roughly the $100 that a separate controller would have cost.
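
That 720x350 figure follows directly from the character grid and the cell size; here is a tiny illustrative check (modern Python, nothing more than the arithmetic spelled out):

    # MDA text mode: character grid multiplied by character cell size (illustration only)
    cols, rows = 80, 25        # characters per line, lines on screen
    cell_w, cell_h = 9, 14     # pixel dimensions of one character cell
    print(f"{cols * cell_w}x{rows * cell_h}")   # prints 720x350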

Frivolous brother

Still, the IBM PC would never have gained such popularity without graphics capabilities. For less serious uses of its PC, IBM prepared another adapter, the CGA (Color Graphics Adapter), released in the same year, 1981. Although it did not offer as sharp a picture as the MDA, the CGA could work in many different modes, both text and graphics, which required equipping it with 16 KB of video memory.

Like the MDA, the CGA could display 25 lines of 80 or 40 characters, but each character cell was only 8x8 pixels. On the other hand, characters could be drawn in any of 16 colors.

In graphics mode, the CGA could output in one of three configurations: 640x200 pixels with 1-bit color (monochrome), 320x200 pixels with 2-bit color (4 colors), or 160x100 pixels with 4-bit color (16 colors). The last was technically an emulation of graphics on top of text mode: each "pixel" was simulated by a character whose 8x8 cell was half filled with color.
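
It is easy to check that the two true bitmap modes fit the CGA's 16 KB of video memory exactly. The sketch below is a purely illustrative back-of-the-envelope calculation in modern Python; the helper name framebuffer_bytes is ours, not anything from the period:

    # Framebuffer sizes for the CGA bitmap modes mentioned above (illustration only)
    def framebuffer_bytes(width, height, bits_per_pixel):
        # bytes needed for a packed-pixel framebuffer
        return width * height * bits_per_pixel // 8

    print(framebuffer_bytes(640, 200, 1))   # 16000 bytes
    print(framebuffer_bytes(320, 200, 2))   # 16000 bytes
    print(16 * 1024)                        # 16384 bytes of CGA video memory

Both bitmap modes need 16,000 bytes, which is why 16 KB on board was enough for either of them.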

The CGA adapter used the same 9-pin connector as the MDA and likewise transmitted its video signal in digital form; in addition, it had a composite output for connection to a color TV. It could also work with an ordinary monochrome MDA-style display. This compatibility continued up to the more advanced EGA, released in 1984.

More color, more clarity

So the evolution of graphics adapters proceeded along the path of increasing resolution and color depth. The EGA (Enhanced Graphics Adapter, 1984) could display graphics at a resolution of 640x350 pixels with 4-bit color (16 colors). Its video memory grew first to 64 KB and later to 256 KB, which allowed the EGA to work with several pages of video memory. This provided a degree of graphics acceleration: the processor could prepare several frames at once.

It sounds strange today, but such primitive video adapters reigned over the market for years. Until 1987 the EGA ruled unchallenged on PC-compatible machines, and users had no idea anything better was possible. What happened in 1987? The VGA appeared.

The new video adapter was originally intended for the new generation of IBM PS/2 computers. That family, in whose design the developers departed from the principles of open architecture, effectively failed in the market, but many of the solutions introduced in it lived on. The MCGA (Multi-Color Graphics Array), the newest video adapter built into the motherboard of PS/2 machines, was soon released as a card for the ISA bus under the name VGA (Video Graphics Array).

The newcomer provided graphics output at a resolution of 640x480 pixels with 16 colors or 320x200 pixels with 8-bit color (256 colors). That already began to look somewhat photorealistic. Since the adapter had originally been developed for the incompatible PS/2, its designers did not hesitate to create a new analog video interface for it, the 15-pin D-Sub, which became the standard for many years and is still used in budget systems. Importantly, the VGA was software-compatible with the EGA, CGA, and MDA: applications written for the legacy adapters could run on the new one.

The 256 KB of video memory made it possible to store several frames plus a custom font, and when the entire volume was devoted to a single frame, the card could display an image at the then-unprecedented resolution of 800x600 pixels, although this capability was undocumented and used extremely rarely.
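
A quick calculation, in the same illustrative spirit as the CGA example above (again, modern Python and a helper name of our own), shows why 256 KB was enough both for the standard modes and for the undocumented 800x600 one:

    # Framebuffer sizes for the VGA modes discussed above (illustration only)
    def framebuffer_bytes(width, height, bits_per_pixel):
        return width * height * bits_per_pixel // 8

    vram = 256 * 1024                            # 262,144 bytes of VGA memory
    print(framebuffer_bytes(640, 480, 4), vram)  # 153,600 bytes: fits, with room to spare
    print(framebuffer_bytes(320, 200, 8), vram)  # 64,000 bytes: several pages fit
    print(framebuffer_bytes(800, 600, 4), vram)  # 240,000 bytes: barely fits, one frame only

An 800x600 image at 16 colors consumes almost the entire 256 KB, which is exactly why using it meant giving up multiple pages and the extra font storage.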

A little bit faster

Repeating the story of the MDA and CGA, IBM developed two video adapters for the PS/2 at once: the built-in MCGA (VGA) and the more advanced 8514/A. Supplied as an optional PS/2 upgrade, the 8514/A could display a resolution of 1024x768 pixels with 8-bit color. But the technological innovations did not end there. For the first time, the developers considered shifting part of the work of preparing the frame onto the video adapter itself, and gave the 8514/A a set of graphics acceleration functions.

The adapter could independently draw lines in its own memory, fill parts of the frame with color, and apply bit masks. For graphics applications of those years this was invaluable help: even when plotting charts the speedup was clearly noticeable, and engineering graphics applications gained several times over in performance. Of course, all of this required software support, and the new adapter soon received it.

It should be said that in those days professional graphics workstations were already being equipped with additional graphics coprocessors on separate boards. Such devices were very expensive and, at the same time, extremely capable. The 8514/A could do far less, but it was also relatively cheap, which was, and still is, the most valued quality in the PC sector.

In 1990 the 8514/A was succeeded by the XGA (Extended Graphics Array) adapter, which offered slightly expanded functionality. The new adapter added a mode of 800x600 pixels with 16-bit color (so-called High Color, 65,536 colors); otherwise it was similar to its predecessor. From the XGA onward, a variety of SuperVGA adapters came to dominate the market, and memory sizes and available resolutions grew year after year. As a result, it became harder and harder to impress users with the sharpness and color of the picture. How, then, to sell expensive new products? New, previously unseen and as yet unasked-for functions were needed. And they appeared.

First step in 3D

The well-known company S3 became the pioneer of 3D graphics acceleration for the PC. Its S3 ViRGE adapter was the successor to the highly successful Trio64V+ and supported up to 4 MB of DRAM or VRAM. Its graphics core and video memory ran at 80 MHz, which sounds anything but impressive these days.

The most interesting innovation in the ViRGE was its support for 3D acceleration functions. These could not seriously speed up the games of the time; in fact, software rendering (on the CPU) often ran faster than hardware rendering on the ViRGE. But with these features, game developers could dress up their products with newfangled techniques such as dynamic lighting and bilinear texture filtering.

Inspired by its role as a pioneer, S3 set out to capture the consumer 3D accelerator niche. Contracts were signed with the publishers of well-known titles: Tomb Raider, MechWarrior 2, and Descent II received support for the S3D standard. The company's marketers reasoned that by spreading their own standard for 3D acceleration functions they would gain a significant advantage over competitors. In theory the S3 ViRGE also supported some features of the professional OpenGL standard, but performance through its libraries was completely unsatisfactory. Direct3D support was announced as well, but such games did not exist and were not even planned: at that time, almost all games were released for MS-DOS.

S3's ambitious plans were not destined to come true: as early as 1996, 3Dfx released the Voodoo Graphics 3D accelerator, which secured its dominance in the industry. The ViRGE had no chance; over the following years the adapter received several updates, but it never escaped the role of an inexpensive 2D video card.

Age of Monsters

From what depths did the hitherto unknown 3Dfx crawl out? In 1994, three specialists working at Silicon Graphics, a company of great authority in professional graphics, decided to change jobs. Ross Smith, Gary Tarolli, and Scott Sellers had noticed the growth of the game console market, where good 3D graphics were already the norm. From the simple idea that only 3D performance was missing for a PC gaming boom, 3Dfx was born.

After securing several loans, the company's founding fathers got down to business. 3Dfx earned its initial experience and capital by producing graphics chips for game consoles, but a year later its first PC product appeared: Voodoo Graphics. The novelty, presented at Computex, caused a sensation: the smoothness and beauty of its 3D rendering staggered the imagination. In graphics quality, the accelerator far surpassed the Sony PlayStation and Nintendo 64 consoles, which at the time had not even been released.

Just like the ViRGE, Voodoo Graphics supported OpenGL and DirectX, but its speed there was "lame". When working through its own Glide programming interface, however, Voodoo Graphics did just fine. Many game developers immediately began optimizing their products for the new accelerator, next to which the competition looked pale, to put it mildly. The maximum graphics mode of Voodoo Graphics does not sound very impressive, only 640x480 pixels at 16-bit color, but at the time that seemed more than enough for 3D graphics.

Structurally, Voodoo Graphics was an adapter installed in a PCI slot, but it lacked the functions of a 2D video card. It was connected to the monitor in series with a conventional video adapter and took over when the system switched to 3D mode. At first this approach looked promising: producing a high-quality 2D image was no trivial task back then, and the ability to pair the 3D accelerator with a known-good 2D card was highly valued by many users. As a counterexample one can cite the Rendition Verite V1000 accelerator, released in the same year, 1996, which did include 2D video card functions but noticeably "washed out" the picture at high resolutions. For much the same reason the Voodoo Rush, which appeared in 1997 as a full-fledged video card with the 3D core of Voodoo Graphics, was not a success.

On board the Voodoo Graphics was 4 MB of EDO DRAM running, like the processor, at 50 MHz. The drop in prices for this type of memory in late 1996 allowed 3Dfx to sell its chipsets relatively cheaply, which boosted their popularity further. Note that the company did not manufacture the adapters itself, but only supplied chipsets to its partners. The Diamond Monster 3D became the most widespread of these cards, which is why products based on 3Dfx chips came to be called "monsters" in everyday speech.

Second echelon

Besides the young upstart 3Dfx, older companies were also trying to grab their share of the market. ATI, founded in 1985, already had a name and experience by the time Voodoo Graphics arrived, having started out producing clones of the IBM 8514/A. By 1995 its portfolio included the ATI 3D Rage, an adapter with basic 3D acceleration capabilities, excellent 2D quality, and advanced hardware functions for processing compressed MPEG-1 video. By mid-1996 the company had released the 3D Rage II, which delivered twice the 3D performance of its predecessor and could process MPEG-2 (DVD) video. The adapter supported Direct3D and, partially, OpenGL, was equipped with 8 MB of SDRAM, and ran at 60 MHz (core) and 83 MHz (memory). In 3D rendering performance the new product was noticeably behind many competitors, but the card found its place thanks to a good 2D picture and the beginnings of hardware video acceleration.

NVIDIA was only a year older than 3Dfx, and as early as 1995 it released its first, disastrous product. The NV1 adapter was ambitiously conceived, combining a 2D adapter, a 3D accelerator, a sound card, and even a Sega Saturn gamepad port. The product was far from cheap, and its 3D acceleration unit had a very unusual architecture: the 3D scene was built not from polygons but from quadratic surfaces. Game developers were wary of such an original approach, fraught with many difficulties when programming a 3D engine, and the arrival of Direct3D, which works with polygons, finally put an end to the NV1.

NVIDIA must be given its due: having lost a great deal of money and many employees, it managed by 1997 to release a new, completely different product. The NVIDIA Riva 128, based on the NV3 chip, carried 4 MB of SDRAM (8 MB in the Riva 128ZX version) on a 128-bit bus and ran at 100 MHz. With a solid 3D section comparable in performance to Voodoo Graphics, the Riva 128 was produced in both PCI and AGP versions (something the "monsters" could not boast), and it managed to pull NVIDIA out of the financial abyss. However, parity with the 3Dfx product was rather conditional: the two were equal only in Direct3D, which at the time was still little used...
