General information about video cards
A video card (also known as a graphics card, graphics adapter, or video adapter) is a device that converts an image stored in the computer's memory into a video signal for the monitor. Usually a video card is an expansion card inserted into an expansion slot, either universal (PCI-Express, PCI, ISA, VLB) or specialized (AGP), but it can also be built into (integrated on) the system board. Modern video cards are not limited to simply displaying an image: they have a built-in graphics microprocessor that can perform additional processing, offloading these tasks from the computer's central processor. For example, all modern NVIDIA and AMD (ATi) video cards support OpenGL applications at the hardware level.
History
One of the first graphics adapters for the IBM PC was the MDA (Monochrome Display Adapter) in 1981. It worked only in text mode with a resolution of 80x25 characters (physically 720x350 pixels) and supported five text attributes: normal, bright, inverse, underlined and blinking. It could not transmit any color or graphic information; what color the letters would be was determined by the monitor model used, usually white, amber or emerald green. In 1982, Hercules released a further development of the MDA, the HGC (Hercules Graphics Controller) video adapter, which had a graphics resolution of 720x348 pixels and supported two graphics pages. But it still did not allow working with color.
The first color video card was the CGA (Color Graphics Adapter), released by IBM, which became the basis for subsequent video card standards. It could work either in text mode with resolutions of 40x25 and 80x25 characters (with an 8x8 character matrix) or in graphics mode with resolutions of 320x200 or 640x200 pixels. In text modes, 256 character attributes were available: 16 character colors and 16 background colors (or 8 background colors and a blink attribute); in the 320x200 graphics mode there were four palettes of four colors each, and the 640x200 high-resolution mode was monochrome. As a development of this card, the EGA (Enhanced Graphics Adapter) appeared, with a palette expanded to 64 colors and an intermediate buffer. The resolution was raised to 640x350, which made an 80x43 text mode possible with an 8x8 character matrix; for the 80x25 mode a larger 8x14 matrix was used, and 16 colors could be displayed simultaneously, chosen from the palette of 64. The graphics mode likewise allowed 16 colors from a palette of 64 at a resolution of 640x350. The EGA was compatible with CGA and MDA.
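For illustration, here is a minimal C sketch (an assumption-level model, not any adapter's actual firmware interface) of how such a text-mode attribute byte packs those choices: bits 0-3 select one of the 16 character colors, bits 4-6 one of the 8 background colors, and bit 7 is either the blink flag or, if blinking is disabled, a 16th background color.

    #include <stdint.h>

    /* Pack a CGA-style text attribute byte:
     *   bits 0-3: foreground (character) color, 16 choices
     *   bits 4-6: background color, 8 choices
     *   bit 7:    blink flag (or 16th background color, per adapter setup) */
    static uint8_t make_attribute(uint8_t fg, uint8_t bg, int blink)
    {
        return (uint8_t)((fg & 0x0F) | ((bg & 0x07) << 4)
                         | (blink ? 0x80 : 0x00));
    }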
It is worth noting that the monitor interfaces of all these video adapters were digital. MDA and HGC transmitted only whether a dot was lit or not, plus an additional brightness signal for the "bright" text attribute. Similarly, CGA transmitted the main video signal over three channels (red, green, blue) and could additionally transmit a brightness signal, which together gave 16 colors. The EGA had two transmission lines for each primary color, so each primary could be displayed off, at 1/3, at 2/3, or at full brightness, giving a maximum of 64 colors in total.
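This arithmetic can be made explicit in a short C sketch; the 2/3 and 1/3 weighting of the two lines per channel is the conventional description of the EGA scheme:

    /* Each EGA primary is driven by two digital lines, a 2/3-weight line
     * and a 1/3-weight line, so every channel takes one of 4 intensities. */
    double channel_intensity(int hi_bit, int lo_bit)
    {
        /* yields 0, 1/3, 2/3 or 1 (full brightness) */
        return hi_bit * (2.0 / 3.0) + lo_bit * (1.0 / 3.0);
    }
    /* 4 intensities per channel and 3 channels: 4 * 4 * 4 = 64 colors. */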
In the early IBM PS/2 computer models, a new graphics adapter appeared: the MCGA (Multicolor Graphics Adapter). The text resolution was raised to 640x400, which made it possible to use an 80x50 mode with an 8x8 matrix and an 8x16 matrix in the 80x25 mode. The number of colors was increased to 262,144 (64 brightness levels for each primary color); for compatibility with EGA in text modes, a color table was introduced through which the 64-color EGA space was mapped into the MCGA color space. A 320x200x256 mode appeared, in which each pixel on the screen was encoded by a corresponding byte in video memory, with no bit planes; compatibility with EGA therefore remained only in text modes, while compatibility with CGA was complete. Because of the large number of primary-color brightness levels, an analog color signal became necessary; the horizontal scanning frequency rose to 31.5 kHz.
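The byte-per-pixel layout of the 320x200x256 mode makes pixel addressing trivial, as this minimal C sketch shows; the frame-buffer base pointer is an assumption here, since on a real-mode DOS machine access would go through segment 0xA000 or an OS-provided mapping:

    #include <stdint.h>

    /* Byte-per-pixel addressing in the 320x200x256 mode: video memory is
     * a flat array with one byte per pixel and no bit planes, so a
     * pixel's offset is simply row * width + column. */
    static void put_pixel(uint8_t *vram, int x, int y, uint8_t color)
    {
        vram[y * 320 + x] = color;
    }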
Then IBM went even further and made the VGA (Video Graphics Array), an extension of MCGA compatible with EGA, introduced in mid-range PS/2 models. It has been the de facto video adapter standard since the late 1980s. It added a 720x400 text resolution for MDA emulation and a 640x480 graphics mode accessible via bit planes. The 640x480 mode is notable for using a square pixel: the ratio of horizontal to vertical pixel counts, 640:480, matches the standard 4:3 screen aspect ratio. Then came the IBM 8514/a with resolutions of 640x480x256 and 1024x768x256, and the IBM XGA with a 132x25 text mode (1056x400) and increased color depth (640x480x65K).
Since 1991, the concept of SVGA (Super VGA) has existed: an extension of VGA with higher modes and additional services added, for example the ability to set an arbitrary frame rate. The number of simultaneously displayed colors grew to 65,536 (High Color, 16 bits) and 16,777,216 (True Color, 24 bits), and additional text modes appeared. Among the service functions, support for VBE (VESA BIOS Extension) appeared. SVGA has been perceived as the de facto video adapter standard since about mid-1992, after the adoption of version 1.0 of the VBE standard by the Video Electronics Standards Association (VESA). Until that moment, almost all SVGA video adapters were incompatible with each other.
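For illustration, here is a hedged C sketch of the start of the VBE controller information block that a DOS-era program obtains via INT 10h with AX = 4F00h (AX = 4F02h then sets a mode by VESA number). Field names follow the VBE specification, but the layout shown is simplified; real code would need a packed, 256-byte structure placed in low memory:

    #include <stdint.h>

    struct VbeInfoBlock {
        char     VbeSignature[4]; /* "VESA" on successful return         */
        uint16_t VbeVersion;      /* BCD, e.g. 0x0102 for VBE 1.2        */
        uint32_t OemStringPtr;    /* far pointer to the OEM name string  */
        uint32_t Capabilities;
        uint32_t VideoModePtr;    /* far pointer to the mode-number list */
        uint16_t TotalMemory;     /* video memory in 64 KB units         */
        /* ... reserved fields pad the block out to 256 bytes */
    };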
The graphical user interface, which appeared in many operating systems, stimulated a new stage in the development of video adapters: the "graphics accelerator," a video adapter that performs some graphics functions at the hardware level. These functions include moving large blocks of the image from one area of the screen to another (for example, when moving a window), filling areas of the image, drawing lines, arcs and fonts, hardware cursor support, and so on. The direct impetus for such a specialized device was the fact that while the graphical user interface is undoubtedly convenient, its use demands considerable computing resources from the central processor; a graphics accelerator is designed precisely to take over the lion's share of the calculations needed for the final display of the image on the screen.
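A software version of the most basic of these operations, the screen-to-screen block move (blit), might look like the following C sketch; a hardware 2D accelerator performs the same copy without involving the central processor. The flat frame-buffer layout and 8-bit pixels are assumptions:

    #include <stdint.h>
    #include <string.h>

    /* Copy a w x h pixel rectangle from one position in the frame buffer
     * to another, row by row. 'pitch' is the length of one scanline in
     * bytes. */
    static void blit(uint8_t *fb, int pitch,
                     int src_x, int src_y, int dst_x, int dst_y,
                     int w, int h)
    {
        for (int row = 0; row < h; row++) {
            /* memmove tolerates overlap within a row; a fuller version
             * would also pick the row iteration order when the two
             * rectangles overlap vertically. */
            memmove(fb + (dst_y + row) * pitch + dst_x,
                    fb + (src_y + row) * pitch + src_x,
                    (size_t)w);
        }
    }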
Design
A modern video card consists of the following parts:
Graphics processor (graphics processing unit, GPU) — handles the calculations for the displayed image, relieving the central processor of this duty, and performs the calculations for processing three-dimensional graphics commands. It is the core of the graphics board, and the speed and capabilities of the entire device depend on it. Modern graphics processors are not much inferior in complexity to a computer's central processor, and often surpass it both in the number of transistors and in computing power, thanks to a large number of universal computing units. However, the GPU architecture of the previous generation usually assumes several separate processing units: a 2D graphics unit and a 3D graphics unit, the latter in turn usually divided into a geometric core (plus vertex cache) and a rasterization unit (plus texture cache), etc. A minimal sketch of the geometric stage appears after this list of parts.
Video controller — responsible for forming the image in video memory; it issues commands to the RAMDAC to generate scanning signals for the monitor and processes requests from the central processor. In addition, there is usually an external data bus controller (for example, PCI or AGP), an internal data bus controller, and a video memory controller. The width of the internal bus and of the video memory bus is usually greater than that of the external one (64, 128 or 256 bits versus 16 or 32); many video controllers also have the RAMDAC built in. Modern graphics adapters (ATI, nVidia) usually have at least two video controllers that operate independently of each other, each simultaneously driving one or more displays.
Video memory — acts as a frame buffer in which the image is stored that is generated and constantly changed by the graphics processor and displayed on the monitor screen (or several monitors). Video memory also stores intermediate image elements not visible on the screen and other data. Video memory comes in several types, differing in access speed and operating frequency. Modern video cards are equipped with DDR, DDR2, GDDR3, GDDR4 or GDDR5 memory. It should also be borne in mind that, in addition to the video memory located on the video card, modern graphics processors usually use part of the computer's general system memory, direct access to which is organized by the video adapter driver via the AGP or PCIE bus. A short sketch of frame-buffer size arithmetic follows this list.
Digital-to-analog converter (DAC, RAMDAC — Random Access Memory Digital-to-Analog Converter) — converts the image generated by the video controller into color intensity levels fed to the analog monitor. The possible color range of the image is determined only by the RAMDAC parameters. Most often the RAMDAC has four main blocks: three digital-to-analog converters, one for each color channel (red, green, blue), and SRAM for storing gamma correction data. Most DACs have a bit depth of 8 bits per channel, which gives 256 brightness levels for each primary color, or 16.7 million colors in total (and thanks to gamma correction, the original 16.7 million colors can be mapped into a much larger color space). Some RAMDACs have a bit depth of 10 bits per channel (1024 brightness levels), which allows more than 1 billion colors to be displayed at once, but this feature is practically never used. A second DAC is often installed to support a second monitor. It is worth noting that monitors and video projectors connected to the digital DVI output of the video card use their own digital-to-analog converters to convert the digital data stream and do not depend on the characteristics of the video card's DAC. A sketch of the RAMDAC lookup path also follows this list.
Video ROM — a read-only memory chip in which the video BIOS, screen fonts, service tables, etc. are stored. The ROM is not used directly by the video controller; only the central processor accesses it. The video BIOS stored in the ROM ensures the initialization and operation of the video card before the main operating system loads, and also contains system data that can be read and interpreted by the video driver during operation (depending on how responsibilities are divided between the driver and the BIOS). Many modern cards are equipped with electrically reprogrammable ROMs (EEPROM, Flash ROM), which allow the video BIOS to be rewritten by the user with a special program. A sketch of the ROM signature check follows this list as well.
Cooling system — designed to maintain the temperature of the video processor and video memory within acceptable limits.
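As a rough illustration of the "geometric core" stage mentioned for the graphics processor above, here is a minimal C sketch of the per-vertex work of a fixed-function pipeline, essentially a 4x4 matrix transform; the matrix contents and the rasterization stage that follows are outside the sketch's scope:

    /* Transform one vertex by a 4x4 matrix, the basic operation of the
     * geometric core in a fixed-function 3D pipeline. */
    typedef struct { float x, y, z, w; } Vec4;

    static Vec4 transform(const float m[4][4], Vec4 v)
    {
        Vec4 r;
        r.x = m[0][0]*v.x + m[0][1]*v.y + m[0][2]*v.z + m[0][3]*v.w;
        r.y = m[1][0]*v.x + m[1][1]*v.y + m[1][2]*v.z + m[1][3]*v.w;
        r.z = m[2][0]*v.x + m[2][1]*v.y + m[2][2]*v.z + m[2][3]*v.w;
        r.w = m[3][0]*v.x + m[3][1]*v.y + m[3][2]*v.z + m[3][3]*v.w;
        return r;
    }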
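For the video memory entry, a quick way to see why frame-buffer size matters is the per-frame arithmetic: width x height x bytes per pixel. A trivial C sketch; the example mode and the 4-byte storage of True Color pixels are illustrative assumptions:

    /* One frame needs width * height * bytes-per-pixel of video memory.
     * For example, 1024x768 in True Color (24 bits, commonly stored as
     * 4 bytes per pixel) takes 1024 * 768 * 4 = 3 MB. */
    static long frame_bytes(long width, long height, long bytes_per_pixel)
    {
        return width * height * bytes_per_pixel;
    }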
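For the RAMDAC entry, the data path described above can be sketched in a few lines of C: each 8-bit channel value passes through a lookup table (modeling the gamma-correction SRAM; the 2.2 exponent is an assumption) before driving the 8-bit DAC, and 256 levels per channel give 256 x 256 x 256 = 16,777,216 colors:

    #include <math.h>
    #include <stdint.h>

    /* Lookup table standing in for the RAMDAC's gamma-correction SRAM. */
    static uint8_t gamma_lut[256];

    static void init_gamma_lut(double gamma)
    {
        for (int i = 0; i < 256; i++)
            gamma_lut[i] = (uint8_t)(255.0 * pow(i / 255.0, gamma) + 0.5);
    }
    /* Usage: init_gamma_lut(2.2); then gamma_lut[value] feeds the DAC. */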
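Finally, for the video ROM entry: a PC expansion ROM, the video BIOS included, begins with the signature bytes 0x55, 0xAA, followed by its length in 512-byte units; the ROM is traditionally mapped at address 0xC0000. A minimal C check (how to obtain a pointer to that region is left as an assumption, since it requires operating system support):

    #include <stdint.h>

    /* An expansion ROM header starts with 0x55, 0xAA; the next byte
     * holds the ROM size in 512-byte units. */
    static int is_expansion_rom(const uint8_t *rom)
    {
        return rom[0] == 0x55 && rom[1] == 0xAA; /* rom[2] = size / 512 */
    }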
Correct and fully functional operation of a modern graphics adapter is ensured by the video driver — special software supplied by the video card manufacturer and loaded during operating system startup. The video driver functions as an interface between the system (with the applications running in it) and the video adapter. Just like the video BIOS, the video driver organizes and programmatically controls the operation of all parts of the video adapter through special control registers, which are accessed via the corresponding bus.