GeForce FX 5200
The new GeForce FX 5200 chips, for all their novelty, actually had less memory bandwidth than the budget GeForce4 MX440-8x. However, the FX 5200 had twice as many texture units, so in applications with multitexturing it should have been faster: unlike the MX440, it did not suffer a performance drop when switching to multitextured 3D rendering.
The NV34 (GeForce FX 5200) brought quite a few innovations. First of all, there is support for DirectX 9, which nVidia had been striving for since the release of the Radeon 9700/9500 and which everyone had been waiting for since the release of 3DMark03. nVidia was a little late, but it did bring support for pixel and vertex shaders version 2.0 to the mainstream market. The new chip also inherited features from the GeForce4 MX, such as a built-in MPEG-2 decoder for hardware DVD playback. A TMDS transmitter was built into the NV34 core to drive DVI panels, which made video cards based on the GeForce FX 5200 even cheaper to produce. A TV encoder for output to a television was also integrated into the NV34 core, making the chip an almost completely self-contained solution.
But, once again, nVidia created some confusion. Users are accustomed to referring to video chips by their code names: until a chip's release and the official announcement of its name, all the talk is about the internal designation - NV30, NV28, and so on. Usually, the larger the number in the chip designation, the larger the number in the official name. One would have expected the NV31 to be called GeForce FX 5200 and the NV34 to be called GeForce FX 5600, but in reality it turned out the other way around: the lower-numbered NV31 received the more "senior" name, GeForce FX 5600, while the NV34 became the GeForce FX 5200.
Making the chip and the cards cheaper required sacrifices. Here are the main features of the older NV30 that the NV34 lacks:
0.13-micron process technology - made it possible to fit more transistors on the die and raise the frequency of the 256-bit core. The FX 5200 series is built on a 0.15-micron process instead.
Intellisample technology - a new anti-aliasing technology that smooths jagged edges in images 50% better than before. It also allowed the color gamut to be adjusted for the difference between how the eye perceives light and color and how a monitor reproduces them. In addition, this technology included new, improved anisotropic filtering, which reduced texture distortion by dynamically adjusting the filtered image. The FX 5200 lacked the Z-compression and lossless color compression that are part of this technology - and it could not have had them, as the chip simply did not have enough power to implement them.
8 pixel pipelines - output of up to 8 pixels per clock. In the 5200's case, only 4.
400 MHz RAMDAC - in the 5200 series, the RAMDAC (the digital-to-analog converter that drives analog displays) ran at 350 MHz.
DDR II memory - instead of the newer DDR II, the FX 5200 used ordinary DDR.
Characteristics of NVIDIA GeForce FX 5200
| Name | GeForce FX 5200 |
| --- | --- |
| Core | NV34 |
| Process technology (µm) | 0.15 |
| Transistors (millions) | 47 |
| Core frequency (MHz) | 250 |
| Memory frequency, DDR (MHz) | 200 (400 effective) |
| Memory bus and type | 128-bit DDR |
| Bandwidth (GB/s) | 6.4 |
| Pixel pipelines | 4 (2) |
| TMUs per pipeline | 1 (2) |
| Textures per clock | 4 |
| Textures per pass | 16 |
| Vertex pipelines | 1 |
| Pixel shaders | 2.0 |
| Vertex shaders | 2.0 |
| Fill rate (Mpix/s) | 1000 |
| Fill rate (Mtex/s) | 1000 |
| DirectX | 9.0 |
| Anti-aliasing (max) | SS & MS, 4x |
| Anisotropic filtering (max) | 8x |
| Memory capacity | 128 / 256 MB |
| Interface | AGP 8x / PCI |
| RAMDAC | 2 × 350 MHz |
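The fill-rate and bandwidth figures in the table follow directly from the clocks and the pipeline configuration. As a sketch (the constants below simply restate the table's values), the arithmetic looks like this:

```python
# Back-of-the-envelope calculation of the GeForce FX 5200's headline
# throughput figures from its published specifications.

CORE_CLOCK_MHZ = 250      # NV34 core clock
PIXEL_PIPELINES = 4       # pixels written per clock
TMUS_PER_PIPELINE = 1     # one texture unit per pipeline
MEM_CLOCK_MHZ = 200       # physical memory clock
DDR_MULTIPLIER = 2        # DDR transfers data twice per clock -> 400 MHz effective
BUS_WIDTH_BITS = 128      # memory bus width

# Pixel fill rate: pixels emitted per second (Mpix/s).
pixel_fill_mpix = CORE_CLOCK_MHZ * PIXEL_PIPELINES

# Texel fill rate: texture samples per second (Mtex/s).
texel_fill_mtex = CORE_CLOCK_MHZ * PIXEL_PIPELINES * TMUS_PER_PIPELINE

# Memory bandwidth: effective clock times bus width in bytes (GB/s).
bandwidth_gb = MEM_CLOCK_MHZ * DDR_MULTIPLIER * (BUS_WIDTH_BITS / 8) / 1000

print(pixel_fill_mpix, texel_fill_mtex, bandwidth_gb)  # 1000 1000 6.4
```

With one TMU per pipeline, the pixel and texel rates coincide at 1000 M/s; the MX440-8x's 2×2 layout gives the same texel rate from half the pixel rate, which is why the two cards trade blows in multitextured workloads.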
This chip could fairly be called a GeForce4 MX440-8x with DirectX 9 support. It was indeed a good update to nVidia's lineup, but at the time it was of little use: games with DirectX 8 support could be counted on one hand, and DirectX 9 titles were still a long way off and arrived much later. By the time they appeared, the GeForce FX 5200 series had become irrelevant due to its low performance, and these video cards dropped significantly in price.