SydLexia.com Forum Index
"Stay awhile. Stay... FOREVER!"

Knyte's VGM #17 - Computer Gaming Top Tens


Knyte
2010 SLF Tag Champ*
Title: Curator Of The VGM
Joined: Nov 01 2006
Location: Here I am.
PostPosted: Dec 21 2008 02:48 am

After a long hiatus, I've decided to do another VGM article, which is officially number 17. (Trust me on this.)

There have been countless lists dedicated to the best game consoles ever made and the best (insert genre here) games, but what about us PC gamers? Where are our lists? So today I give you the first of three articles I have planned, dedicated to the technologies that make PC gaming what it is.

Let's get started with:

Top Ten Video Cards:

Computer games would still be nothing more than MUDs and text adventures without the video graphics technologies that have arisen over the last 30 years. Here we look at the ones that made the biggest impact on the industry.

10. IBM Color/Graphics Adapter (CGA) - 4 Colors!

Image

The first sighting of color on your home PC. Introduced in 1981, this was IBM's first color graphics card, and the first color computer display standard for the IBM PC.

The standard IBM CGA graphics card was equipped with 16 kilobytes of video memory, and could be connected either to a NTSC-compatible monitor or TV via an RCA jack, or to a dedicated RGBI interface CRT monitor, such as the IBM 5153 color display.

Built around the Motorola MC6845 display controller, the CGA card featured several graphics and text modes. The highest resolution of any mode was 640×200, and the highest color depth supported was 4-bit (16 colors).

CGA offers two commonly-used graphics modes:

320×200 pixels, as with the 40×25 text mode, but with each pixel independently addressable. The tradeoff is that only 4 colors can be displayed at a time, and only one of those four (the background) can be freely chosen from the 16 CGA colors; there are just two official palettes for this mode:

# Palette 1
0 background (default black)
1 cyan
2 magenta
3 white

# Palette 2
0 background (default black)
1 green
2 bright red
3 yellow

The second mode, 640×200 pixels, is monochrome: one foreground color, chosen from the 16 CGA colors, drawn on black.

It wasn't much, but it was a step in the right direction.
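For the curious, here's a quick Python sketch (my own illustration, not from any official CGA documentation) of how that 320×200 mode packs pixels: the 16 KB framebuffer at segment B800h interleaves even and odd scanlines, with four 2-bit pixels per byte.

```python
# Sketch of how a pixel maps into CGA 320x200 4-color video memory.
# Even scanlines start at offset 0x0000, odd ones at 0x2000; each row
# is 80 bytes, each byte packs 4 pixels at 2 bits apiece, leftmost
# pixel in the highest bits.

def cga_pixel_address(x, y):
    """Return (byte_offset, bit_shift) for pixel (x, y)."""
    offset = (y % 2) * 0x2000 + (y // 2) * 80 + x // 4
    shift = (3 - (x % 4)) * 2          # leftmost pixel lands in bits 7..6
    return offset, shift

def put_pixel(vram, x, y, color):
    """Write a 2-bit color (0-3) into a bytearray modeling CGA memory."""
    offset, shift = cga_pixel_address(x, y)
    vram[offset] = (vram[offset] & ~(0b11 << shift)) | ((color & 0b11) << shift)

vram = bytearray(0x4000)               # 16 KB of CGA video memory
put_pixel(vram, 0, 0, 3)               # color 3 (white in palette 1), top-left
put_pixel(vram, 1, 1, 2)               # color 2, on an odd scanline

print(hex(vram[0]))                    # 0xc0 -> pixel 0 in bits 7..6
print(hex(vram[0x2000]))               # 0x20 -> odd rows live 8 KB in
```

Two pixels per write is why CGA software of the era leaned so heavily on byte-aligned blits: anything not on a 4-pixel boundary needs the read-modify-write dance above.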

9. GeForce 8800 GTX - DirectX 10 has now been unleashed.

Image

The arrival of the 8800 GTX was also the arrival of the first graphics card capable of using Microsoft's DirectX 10, one of the key selling features of Windows Vista.

It was also a complete rethinking of the graphics card.

The GeForce 8 series arrived with NVIDIA's first unified shader Direct3D 10 Shader Model 4.0 / OpenGL 2.1 architecture. The design was a major shift for NVIDIA in GPU functionality and capability, the most obvious change being the move from the separate functional units (pixel shaders, vertex shaders) within previous GPUs to a homogeneous collection of universal floating point processors (called "stream processors") that can perform a more universal set of tasks.

At the time of its release, the G80 was the largest commercial GPU ever constructed: 681 million transistors covering a 480 mm² die built on a 90 nm process. (The full G80 design actually totals ~686 million transistors, but due to process limitations and yield feasibility on 90 nm, NVIDIA split it into two chips: the main shader core at 681 million transistors and an NV I/O core of roughly 5 million.)

8. Radeon R700 (4850/4870) - A new way of thinking.

Image

In the same way the 8800 GTX was a complete redesign of Nvidia's graphics cards, the R700 is a complete rethinking of AMD/ATI's graphics cards and the way they market them.

The Radeon R700 is the engineering codename for a Graphics Processing Unit series released by AMD Graphics Product Group. The foundation chip, codenamed RV770, was released on June 25, 2008.

Starting with the R700 cards, AMD is shifting how it releases products. With the performance market in mind, the foundation GPU for the entire family of graphics products is targeted at the performance segment; enthusiast-market products are essentially two foundation GPUs on a single PCB, while mainstream and entry-level products are derivatives and cut-down versions of that GPU. This is a significant contrast to the previous strategy, which developed the enthusiast-segment GPU first, with all other segments getting cut-down versions of it.

AMD/ATI is also using a new cadence for the graphics market. Future GPU architectures will undergo small updates (presumably a die shrink to a fabrication half-node, minor architectural changes, improvements to performance and power consumption, and probably newer API support where available) 6 months after first release, meaning the first GPU cores have only a 6-month product cycle. For the mainstream and value segments, the product cycle will instead be 12 months without architectural alterations.

7. GeForce 2 MX - Affordable 3D for all!

Image

3D gaming for the masses. A gaming-quality 3D graphics card for $100 was, and still is, a hard deal to beat.

The most successful GeForce2 part was the budget-model GeForce2 MX. This was due to its popularity with OEM system builders, like its predecessor the RIVA TNT2 M64. The combination of low cost and a complete 3D feature set made it possible to equip a PC with full 3D acceleration at a much lower price point than with a GeForce2 GTS. To reduce costs, Nvidia removed two 3D pixel pipelines and halved the GTS card's memory bandwidth by using cheaper SDR SDRAM instead of DDR SDRAM. Nvidia did, however, add true dual-display support to the MX. In comparison, the GTS and subsequent non-MX models could drive a separate TV encoder, but this second display was tied to the primary desktop.

The prime competitors to GeForce2 MX were ATI's Radeon VE and Radeon SDR. Radeon VE had the advantage of somewhat better dual-monitor display software, but it did not offer hardware T&L, an emerging 3D rendering feature of the day that was the major attraction of Direct3D 7. Radeon VE also had only a single rendering pipeline, meaning it had substantially less pixel fill rate than GeForce2 MX. The Radeon SDR, however, was the full original Radeon simply equipped with SDR SDRAM instead of DDR SDRAM. This card was released some time after GeForce2 MX, but was faster at 32-bit "truecolor" 3D rendering. Radeon SDR had slightly more powerful pixel pipelines (3 TMUs each) and more efficient overall operation due to HyperZ. Radeon SDR lacked multi-monitor support, however.

Nvidia eventually created a wider range of GeForce2 MX pricing options with 3 additional configurations; the MX400, MX200, and MX100. The MX400, like the original MX, had a 128-bit SDR memory bus which could also be configured as 64-bit DDR. The MX400 outperformed the original MX because of an increased core clock rate. The MX200 was equipped with a cheaper 64-bit SDR memory bus, resulting in significantly less memory bandwidth than the original MX, a critical requirement for 3D rendering performance. The cheapest model, the MX100, was equipped with a narrow 32-bit SDR memory bus.

The GeForce2 MX was later used by Nvidia as an integrated graphics processor within the nForce line of motherboard chipsets. It also saw an implementation as a mobile graphics chip for use in notebooks, called the GeForce2 Go, which reduced peak power usage to 2.6 W.

6. GeForce 7950 GX2 - Who wants a video card sandwich?

Image

Since the advent of SLI (Scalable Link Interface) technology, which allows two graphics cards to function as one, SLI has become the buzzword for a cutting-edge gaming experience. Naturally, two graphics cards are more powerful than one, and Nvidia took that approach to the next level, creating a single card with two graphics cores.

This strange arrangement allowed people who didn't have SLI-compatible motherboards to still enjoy the benefits of running two cards. And if you did have an SLI-capable board, you could run Quad SLI! (Which really didn't improve framerates much and turned out to be a big letdown. Though with the release of Intel's Core i7 CPUs, we can now see that the CPU had been the bottleneck in multi-card configs, as Triple SLI and CrossFire X performance has jumped up quite a bit on the new i7s.)

The 7950 GX2 was the retail consumer version of the 7900 GX2, a hardware-vendor-only card that was a monster in size (requiring an ATX-E case to even fit).

This technology has since grown and matured, and today we have the HD 4870 X2 and the soon-to-be-released GTX 295 as the current entries in the "two video cards on one board" market.

5. 3dfx Voodoo 2 - It's fun to play together.

Image

In 1998, 3dfx released Voodoo's successor, the popular Voodoo2. The Voodoo2 was architecturally similar, but the basic board configuration added a second texturing unit, allowing two textures to be drawn in a single pass.

A problem with the Voodoo2 was the fact that it required three chips and a separate VGA graphics card, whereas new competing 3D products, such as the ATI Rage Pro, Nvidia RIVA 128, and Rendition Verite 2200, were single-chip products. Despite some shortcomings, such as the card's dithered 16-bit 3D color rendering and 800×600 resolution limitations, no other manufacturers' products could match the smooth framerates that the Voodoo2 produced. It was a landmark (and expensive) achievement in PC 3D-graphics. Its excellent performance, and the mindshare gained from the original Voodoo Graphics, resulted in its success. Many users even preferred Voodoo2's dedicated purpose, because they were free to use the quality 2D card of their choice as a result. Some 2D/3D combined solutions at the time offered quite sub-par 2D quality and speed.

The most important advance Voodoo2 introduced was Scan-Line Interleave (SLI) to the gaming market. In SLI mode, two Voodoo2 boards were connected together, each drawing half the scan lines of the screen. For the price of a second Voodoo2 board, users could easily improve 3D throughput. A welcome result of SLI mode was an increase in the maximum resolution supported, now up to 1024×768. Despite the high cost and inconvenience of using three separate graphics cards (two Voodoo 2 SLI plus the general purpose 2D graphics adapter), the Voodoo2 SLI scheme was the pinnacle of gaming performance at the time.
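The scan-line split itself is simple enough to sketch in a few lines of Python. This is purely my own toy illustration (render_scanline is a hypothetical stand-in for a board's rasterizer, not real Glide/3dfx code); it just shows how two boards divide a frame between them:

```python
# Toy illustration of the Voodoo2's Scan-Line Interleave idea: each of
# two boards renders every other scanline, and the merged output forms
# the full frame.

def render_scanline(board, y, width):
    # Hypothetical per-board renderer: tag each pixel with the board id.
    return [(board, y, x) for x in range(width)]

def sli_frame(width, height):
    frame = []
    for y in range(height):
        board = y % 2                  # board 0 takes even lines, board 1 odd
        frame.append(render_scanline(board, y, width))
    return frame

frame = sli_frame(4, 4)
print([line[0][0] for line in frame])  # [0, 1, 0, 1] -> alternating boards
```

The appeal is that each board only fills half the scanlines per frame, so fill rate roughly doubles without either board needing to know what the other is drawing.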

Having since acquired 3dfx, Nvidia in 2004 reintroduced the SLI brand (with SLI now standing for Scalable Link Interface) in their GeForce 6 Series. ATI Technologies has also since introduced its own multi-chip implementation, dubbed "CrossFire". Although Scalable Link Interface and Crossfire operate on the original SLI principle, the algorithms used are now totally different.

4. IBM Enhanced Graphics Adapter (EGA) - 12 more reasons to game on PC.

Image

Introduced in 1984 by IBM for its new PC-AT, EGA produced a display of 16 colors at a resolution of up to 640×350 pixels. The EGA card included a 16 kilobyte ROM to extend the system BIOS for additional graphics functions and included the Motorola MC6845 video address generator.

Each of the 16 colors could be assigned a unique RGB color code via a palette mechanism in the 640×350 high-resolution mode; EGA let you choose the displayed colors out of a total of 64 palette colors (two bits per pixel for red, green and blue). EGA also included full 16-color versions of the CGA 640×200 and 320×200 graphics modes; only the 16 CGA/RGBI colors are available in these modes. The original CGA modes are also present, though EGA isn't 100% hardware compatible with CGA.

Suddenly games started to look good, more comparable to what was in the arcades and on the home video game consoles. Now we were getting somewhere in the rise of PC gaming.

3. Radeon 9700 Pro - The Wrath of "Khan"*

Image

ATI had held the lead for a while with the Radeon 8500 but NVIDIA retook the performance crown with the launch of the GeForce 4 Ti line. A new high-end refresh part, the 8500XT (R250) was supposedly in the works, ready to compete against NVIDIA’s high-end offerings, particularly the top line Ti 4600. Pre-release information listed a 300 MHz core and RAM clock speed for the "R250" chip. ATI, perhaps mindful of what had happened to 3dfx when they took focus off their "Rampage" processor, abandoned it in favor of finishing off their next-generation R300 card. This proved to be a wise move, as it enabled ATI to take the lead in development for the first time instead of trailing NVIDIA. The R300, with its next-generation architecture giving it unprecedented features and performance, would have been superior to any R250 refresh.

The R3xx chip was designed by ATI's west coast team (formerly ArtX Inc.), and the first product to use it was the Radeon 9700 PRO (internal ATI code name: R300 - internal ArtX codename: Khan), launched in August 2002. The architecture of R300 was quite different from its predecessor, Radeon 8500 ("R200"), in nearly every way. The core of 9700 PRO was manufactured on a 150 nm chip fabrication process, similar to the Radeon 8500. However, refined design and manufacturing techniques enabled a doubling of transistor count and a significant clock speed gain.

One major change with the manufacturing of the core was the use of flip chip packaging, a technology not used previously on video cards. Flip chip packaging allows far better cooling of the die by flipping it and exposing it directly to the cooling solution. ATI thus could achieve higher clock speeds. The Radeon 9700 PRO launched clocked at 325 MHz, ahead of the originally projected 300 MHz. With a transistor count of 110 million, it was the largest and most complex GPU of the time. Despite that, the Radeon 9700 PRO was clocked significantly higher than the Matrox Parhelia 512, a card released just months before the R300 and considered the pinnacle of graphics chip manufacturing (with 80 million transistors at 220 MHz) up until R300's arrival.

Radeon 9700's advanced architecture was very efficient and, of course, more powerful compared to its older peers of 2002. Under normal conditions it beat the GeForce4 Ti 4600, the previous top-end card, by 15–20%. However, when anti-aliasing (AA) and/or anisotropic filtering (AF) were enabled it would beat the Ti 4600 by anywhere from 40–100%. At the time, this was quite astonishing, and resulted in the widespread acceptance of AA and AF as critical, truly usable features. id Software technical director John Carmack had the Radeon 9700 run the E3 Doom 3 demonstration.

The performance and quality increases offered by the R300 GPU are considered to be among the greatest in the history of 3D graphics, alongside the achievements of the GeForce 256 and Voodoo Graphics. R300 would become one of the GPUs with the longest useful lifetimes in history, allowing playable performance in new games at least 3 years after its launch.

*"Khan" was the in-house codename for the R300.

2. 3dfx Voodoo Graphics PCI - Now showing in 3D!

Image

Originally, 3dfx made its Voodoo 3D graphics adapters for arcade game cabinets. But after a fortuitous drop in EDO DRAM prices due to the volatile DRAM market, Voodoo Graphics cards became feasible for the consumer PC market. The Voodoo 1, as the Voodoo Graphics would later be known, was notable for its lack of an onboard VGA controller. As such, a Voodoo-equipped PC still required a separate VGA graphics card, meaning it was very expensive to have both 3D and 2D acceleration. The Voodoo 1 occupied a separate PCI slot and only engaged when the host PC ran a 3D game programmed to use the card. A pass-through VGA cable daisy-chained the VGA card to the Voodoo 1, which was itself connected to the monitor. Although this was a cumbersome arrangement that somewhat hurt the analog signal quality of the separate 2D card, PC gamers were willing to put up with it to gain what was, at the time, the best in 3D graphics.

The Voodoo heralded a new era in 3D graphics. Prior to that, games such as Doom and Quake had compelled video game players to move from their 80386s to 80486s, and then to the Pentium. Intel had developed their MMX instruction set that would increase 3D and multimedia performance, but adding in a Voodoo graphics card would offer a substantial gain in speed, making the graphics card more important than the CPU for 3D games.

1. IBM's PS/2 Video Graphics Array - Wow, that looks real!

Image

Introduced as the internal video component in the PS/2 line of computers by IBM in 1987, VGA was the last graphical standard introduced by IBM that the majority of PC clone manufacturers conformed to, making it today (as of 2008) the lowest common denominator that all PC graphics hardware supports before a device-specific driver is loaded into the computer. For example, the Microsoft Windows splash screen appears while the machine is still operating in VGA mode, which is the reason that this screen always appears in reduced resolution and color depth.

VGA is referred to as an "array" instead of an "adapter" because it was implemented from the start as a single chip, replacing the Motorola 6845 and dozens of discrete logic chips covering a full-length ISA board that the MDA, CGA, and EGA used. This also allowed it to be placed directly on a PC's motherboard with a minimum of difficulty (it only required video memory, timing crystals and an external RAMDAC), and the first IBM PS/2 models were equipped with VGA on the motherboard.

With a color palette of 262,144 values, and able to display 256 of them at once, the PC suddenly blew everything else out of the water. The Amiga and Atari ST could only do 32 colors, and the home consoles couldn't match the color or resolution of the PC anymore.
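A quick Python sketch (my own illustration, assuming VGA's famous 256-color mode 13h) shows where those numbers come from: 6 bits per DAC channel gives 262,144 possible palette entries, and the framebuffer is a flat 64,000-byte array at segment A000h.

```python
# Sketch of VGA mode 13h: a flat 320x200, one-byte-per-pixel framebuffer
# where each byte indexes a 256-entry palette whose entries are 6 bits
# per channel (2^18 = 262,144 possible colors).

PALETTE_DEPTH = 6                      # bits per DAC channel
print((2 ** PALETTE_DEPTH) ** 3)       # 262144 possible palette colors

def mode13h_offset(x, y):
    """Linear offset of pixel (x, y) in the A000h framebuffer."""
    return y * 320 + x

vram = bytearray(320 * 200)            # one byte per pixel, 64,000 bytes
vram[mode13h_offset(10, 5)] = 15       # palette index 15 at (10, 5)
print(mode13h_offset(319, 199))        # 63999 -> last byte of the frame
```

Compare that with the CGA sketch further up: no interleaving, no bit-packing, just pixel = byte, which is a big part of why VGA was such a joy for game programmers.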

In 1989, a few games supported VGA, but usually only for a better palette for their 16-color graphics. But then things moved surprisingly fast: in 1990 VGA became standard, and by 1991 EGA was more or less a thing of the past. A few companies, notably Sierra On-Line, re-released many of their classic games in VGA (King's Quest, Police Quest, Leisure Suit Larry, and such).

From the moment VGA was introduced, the PC has been at the top of the gaming food chain, and still is to this day. (Maybe not in popularity, but in hardware ability.)
Greg the White
Joined: Apr 09 2008
Location: Pennsylvania
PostPosted: Dec 21 2008 03:08 am

Awesome. There really needs to be more PC history stuff, and more stuff like this. Congratulations on really knowing your shit.


So here's to you Mrs. Robinson. People love you more- oh, nevermind.
 
MonsterOfTheLake
Joined: Dec 02 2008
PostPosted: Dec 21 2008 10:17 am

Cool


Wahey

Image
 
UsaSatsui
Title: The White Rabbit
Joined: May 25 2008
Location: Hiding
PostPosted: Dec 21 2008 01:05 pm

Wow. Alley Cat. Awesome game.
TheRoboSleuth
Title: Sleuth Mark IV
Joined: Aug 08 2006
Location: The Gritty Future
PostPosted: Dec 21 2008 10:37 pm

Mang that sucks why not more lists of stupid stuff that happened in teh fideogames that I might of have played? I wants to get my nostalgia on.

No, but seriously, thank you for not doing another list that every other site on the internet has and instead doing something original and informative.


Image
 
Greg the White
Joined: Apr 09 2008
Location: Pennsylvania
PostPosted: Dec 22 2008 12:34 am

RobotGumshoe wrote:
Mang that sucks why not more lists of stupid stuff that happened in teh fideogames that I might of have played? I wants to get my nostalgia on.

No, but seriously, thank you for not doing another list that every other site on the internet has and instead doing something original and informative.

Yeah, right? "IT'S TIME FOR THE TOP 10 NES GAMES, BECAUSE WE ONLY EVER PLAYED 10 ON THE EMULATORS"


So here's to you Mrs. Robinson. People love you more- oh, nevermind.
 
GPFontaine
Joined: Dec 06 2007
Location: Connecticut
PostPosted: Dec 22 2008 09:04 am

R.I.P. Voodoo Banshee

3DFX Voodoo 2 was overrated.

GeForce 3 should have been on there. Programmable vertex shaders = WIN



 
Cattivo
Joined: Apr 14 2006
Location: Lake Michigan
PostPosted: Dec 22 2008 10:53 am

Good old VGA :D
IceWarm
Joined: Dec 22 2008
Location: Breckenridge, Colorado
PostPosted: Dec 22 2008 07:15 pm

I remember my first 3D card...3DFX Voodoo 3 with a whopping 16 megs of video RAM! That was back in 1999. VGA and EGA both rocked also. Played a lot of my first PC games on that kind of hardware.


"Anybody who ever built an empire, or changed the world, sat where you are now. And it’s because they sat there that they were able to do it."

"Fighting in a basement offers a lot of difficulties, number one being, you're fighting in a basement."

"You're Not So Tough Without Your Veggie!"
 