
Age of Empires IV - PC performance and graphics benchmarks of graphics cards and processors

BASIC INFORMATION ON THE GAME


Release date: October 28, 2021
Genre: Strategy, RTS
Developer: Relic Entertainment
Publisher: Xbox Game Studios

Step into the medieval world: install Age of Empires IV, lead one of its many nations, manage resources, train warriors, and survive while rewriting the history of the world.

The developers decided to continue the theme of the second numbered entry in the series and once again send players to the Middle Ages. Lead one of the eight available nations and develop your civilization: guide your people through four eras, fight for the right to rule your territories, survive and build cities, and enjoy the polished gameplay and updated feature set.

THE GRAPHICAL PART

This section of our review covers the game's main graphical aspects. Particular attention is paid to the version of the graphics engine, the graphics API, the graphics settings, and the quality of the main visual elements.

SUPPORTED OS AND GRAPHICS APIS

Age of Empires IV supports Windows 10 and Windows 11.


The primary graphics API for Age of Empires IV is DirectX 12.

GAME ENGINE

Age of Empires IV is built on the Essence Engine 5.0 game engine.


Essence Engine is a game engine originally developed by Relic Entertainment for the computer game Company of Heroes.

SETTINGS

Age of Empires IV has a sufficient range of graphics settings.


Below we have provided screenshots of the game at various graphics settings, where our readers can see the difference between the minimum and maximum graphics quality settings.

QUALITY MODES

TECHNOLOGIES

The graphics in Age of Empires IV look very pleasant.


Now let's move on to the gaming tests and determine what impact this game has on modern computer hardware.

TEST PART
Test configuration
Test stands

GIGABYTE G1 Sniper Z97

GIGABYTE GA-X99-Gaming 7 WIFI

GIGABYTE GA-Z270-Gaming K3

GIGABYTE Z370 AORUS Gaming 7

GIGABYTE GA-AX370-Gaming 5

GIGABYTE X470 AORUS GAMING 7 WIFI

AORUS RGB M.2 NVMe SSD 256GB

MSI MPG Z490 GAMING CARBON WIFI

Multimedia equipment

Monitor Philips 326M6VJRMB/00

Power Supply Seasonic PRIME TX-1000

RAM: Transcend JM2666HLE-16Gx2pcs

SSD: Transcend NVMe SSD MTE220S 512 GB

Software configuration
Operating system Windows 10 20H2
Graphics driver

NVIDIA GeForce Driver Release 496.49

AMD Radeon Software Adrenalin Edition 21.10.3

Monitoring programs

MSI Afterburner

Action!

FRAPS

 
All video cards were tested at maximum graphics quality, with monitoring done in MSI Afterburner. The purpose of the test is to determine how video cards from different manufacturers behave under identical conditions. Below is a video of the test segment:

Our video cards were tested at resolutions of 1920x1080, 2560x1440 and 3840x2160 at maximum graphics quality settings.
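The average and minimum FPS figures quoted below are reductions of a frame-time log recorded over the test segment. A minimal sketch of that reduction in Python (the sample values are invented for illustration, and a 99th-percentile frame time is used here as a stand-in for the "minimum FPS" the charts report; this is not the exact formula used by the testing tools):

```python
# Reduce a frame-time log (milliseconds per frame) to average FPS and a
# "minimum FPS" proxy (the 1% low, i.e. the 99th-percentile frame time).
# The sample data below is invented for illustration.

def fps_stats(frame_times_ms):
    if not frame_times_ms:
        raise ValueError("empty frame-time log")
    avg_ms = sum(frame_times_ms) / len(frame_times_ms)
    ordered = sorted(frame_times_ms)
    worst_ms = ordered[int(len(ordered) * 0.99)]  # 99th-percentile frame time
    return 1000.0 / avg_ms, 1000.0 / worst_ms

avg_fps, low_fps = fps_stats([16.7] * 98 + [33.3, 40.0])
print(round(avg_fps), round(low_fps))  # 58 25
```

Averaging is done in frame times rather than instantaneous FPS so that long frames are weighted by how long they actually stay on screen.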

NVIDIA GeForce RTX and GTX graphics cards provided by LLC Business Development Center.

GPU TEST


In the video card test the default resolution is 1920x1080; other resolutions can be added or removed manually, as can any of the video card entries. You can also pick any of our test processors from the drop-down list to compare its performance against the given video card results (the fastest solution is selected by default). Testing is performed on the fastest CPU, and the results are scaled to other processors based on their testing with NVIDIA and AMD video cards.
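The scaling described above can be pictured as each CPU imposing a frame-rate ceiling on every GPU result. A toy model of that idea (all numbers and the helper `scaled_fps` are invented for illustration; this is not GameGPU's actual methodology):

```python
# Toy model of GPU-to-CPU scaling: the displayed FPS for a GPU+CPU pair is
# limited either by the GPU result (measured on the fastest CPU) or by the
# CPU's own ceiling (measured on the fastest GPU). All numbers are invented.

gpu_fps = {"GeForce RTX 3080": 112, "Radeon RX 6800": 98}   # on fastest CPU
cpu_cap = {"Core i9-11900K": 130, "Ryzen 5 1600": 61}       # on fastest GPU

def scaled_fps(gpu, cpu):
    return min(gpu_fps[gpu], cpu_cap[cpu])

for cpu in cpu_cap:
    for gpu in gpu_fps:
        print(f"{cpu} + {gpu}: {scaled_fps(gpu, cpu)} fps")
```

In this model a slow CPU flattens all fast GPUs to the same number, which is exactly the CPU-bound behavior discussed later in the comments.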

Age of Empires IV GPU benchmark
[interactive chart: maximum settings; resolutions and video cards are selectable]

At a resolution of 1920x1080, an average FPS of 25 frames was shown by video cards at the level of the Radeon RX 590 or GeForce GTX 1070; cards of the same level provide a minimum FPS of at least 25 frames, and a comfortable average of 60 frames.

At a resolution of 2560x1440, an average FPS of 25 frames was shown by video cards at the level of the Radeon RX 590 or GeForce GTX 1070; the same level of card provides a minimum FPS of at least 25 frames, while a comfortable average of 60 frames requires solutions at the level of the Radeon RX Vega 64 or GeForce RTX 2060 SUPER.

At a resolution of 3840x2160, an average FPS of 25 frames was shown by video cards at the level of the Radeon RX 590 or GeForce GTX 1070; a minimum FPS of at least 25 frames requires cards at the level of the Radeon RX 6600 XT or GeForce RTX 2070 SUPER, while a comfortable average of 60 frames requires solutions at the level of the Radeon RX 6800 or GeForce RTX 3060 Ti.

VIDEO RAM CONSUMPTION


The game's video memory consumption was measured with MSI Afterburner. Results were taken on AMD and NVIDIA video cards at different screen resolutions (1920x1080, 2560x1440 and 3840x2160) with various anti-aliasing settings. By default, the chart displays the most current solutions; other graphics cards can be added to or removed from the chart at the reader's request.
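The single per-card figure shown in the chart is a summary of many Afterburner samples taken over the run. A sketch of that final step, assuming the megabyte samples have already been extracted from the log (the parsing itself is omitted and the sample values are invented):

```python
# Summarize a run of VRAM samples (in MB) into the kind of single figure
# shown in the chart. Peak usage is the usual choice for "consumption";
# the sample values here are illustrative, not real log data.

def vram_summary(samples_mb):
    return {
        "peak_mb": max(samples_mb),
        "avg_mb": round(sum(samples_mb) / len(samples_mb)),
    }

print(vram_summary([4720, 4850, 4994, 4980, 4910]))
```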

Age of Empires IV VRAM test (maximum settings, values in megabytes)

Video card                    3840x2160   2560x1440   1920x1080
GeForce RTX 2070 Super 8GB    6175        5316        4994
GeForce RTX 2080 Super 8GB    6225        5359        5035

[interactive chart: other video cards can be added or removed; source: GameGPU]

At a resolution of 1920x1080, video memory consumption is about 5000 MB on 8 GB video cards, 5400 MB on 10 GB cards, 5700 MB on 12 GB cards, and 5900 MB on 16 GB cards.

At a resolution of 2560x1440, consumption is about 5400 MB on 8 GB cards, 6000 MB on 10 GB cards, 6200 MB on 12 GB cards, and 6400 MB on 16 GB cards.

At a resolution of 3840x2160, consumption is about 6200 MB on 8 GB cards, 7000 MB on 10 GB cards, 7100 MB on 12 GB cards, and 7300 MB on 16 GB cards.

CPU TEST


Testing was carried out at a resolution of 1920x1080. In the processor test you can remove or add any processor entries, and select any tested video card from the drop-down list to compare against the given processor results (the fastest NVIDIA solution is selected by default). Testing is performed on the fastest NVIDIA and AMD video cards, and the results are scaled down to lower models.

Age of Empires IV CPU test
[interactive chart: maximum settings; processors are selectable]
When using NVIDIA video cards, an acceptable minimum of 25 frames was achieved by processors at the level of the Ryzen 5 1400 or Core i7-4770, while an FPS of at least 60 frames requires solutions at the level of the Ryzen 5 1600 or Core i7-4770.

When using AMD video cards, an acceptable minimum of 25 frames was achieved by processors at the level of the Ryzen 5 1400 or Core i7-4770, while an FPS of at least 60 frames requires solutions at the level of the Ryzen 5 1600 or Core i7-4770.

Age of Empires IV CPU core load test
[interactive chart: per-core and per-thread load for each test processor at 1920x1080, %; source: GameGPU]

The game can load up to 8 threads. It uses 6 cores most efficiently.
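The "loads up to 8 threads, uses 6 cores most efficiently" conclusion comes from per-core load charts like the one above. One simple way to condense such percentages into an effective core count is to sum the loads and divide by 100; a sketch with invented load values:

```python
# Estimate how many cores a game effectively uses from per-core load
# percentages: the sum of all loads divided by 100. Loads are invented
# for illustration and do not come from the actual test charts.

def effective_cores(core_loads_pct):
    return sum(core_loads_pct) / 100.0

loads = [94, 67, 58, 40, 35, 30, 20, 12]  # eight threads carrying work
print(f"effective cores: {effective_cores(loads):.2f}")
```

A game can touch many threads while still concentrating almost all of its work on a few of them, which is what this metric exposes.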

RAM TEST


The test was conducted on the base Core i9-11900K configuration with 16 GB of DDR4-3733 memory installed. Total RAM used by the system was taken as the indicator. The system-wide RAM test was carried out on various video cards with no extraneous applications (browsers, etc.) running. In the charts you can add and remove any resolutions and video cards as desired.

Age of Empires IV RAM test (maximum settings, values in megabytes)

Video card                    3840x2160   2560x1440   1920x1080
GeForce RTX 2070 Super 8GB    10688       10661       10610
GeForce RTX 2080 Super 8GB    10775       10747       10696

[interactive chart: other video cards and resolutions can be added or removed; source: GameGPU]

At a resolution of 1920x1080, system RAM consumption is about 10700 MB with an 8 GB video card, 10600 MB with a 10 GB card, 10400 MB with a 12 GB card, and 11200 MB with a 16 GB card.

At a resolution of 2560x1440, consumption is about 10700 MB with an 8 GB card, 10800 MB with a 10 GB card, 10500 MB with a 12 GB card, and 11300 MB with a 16 GB card.

At a resolution of 3840x2160, consumption is about 10700 MB with an 8 GB card, 10800 MB with a 10 GB card, 10700 MB with a 12 GB card, and 11400 MB with a 16 GB card.

HARDWARE SPONSORS
AMD, Seasonic, NVIDIA, Philips, Transcend
  

 

Testing: rating 66% [22 votes]. Graphics: rating 60% [20 votes]. Optimization: rating 55% [24 votes].

 



Comments (51)


RadeonForce
A 10% margin over the 6800 XT is normal. I expected 15%, though. But don't forget that the 6900 XT has 12% more cores, higher limits and factory clocks, and is on average 8% faster than the 6800 XT (not linearly, since both have 128 ROPs and the same memory - only the CUs are cut), so there is a clear difference.

That requires evidence; so far there hasn't been a single 6900 XT in these tests. I can also tell you that I'll put it under water plus a BIOS with extended TDP limits, and it will sit at a constant 2100-2200 MHz.
Gamegpu compares throttling reference 6800, 6800 XT and 6900 XT cards. Even I beat the latter by 10% on average with a 6800 XT, while every single Ampere there is non-reference - a 3080 Ti with 3x8-pin, an AORUS. With the Germans, most likely, either all the cards are reference or all are non-reference. Here the AMD representative hands out only reference cards (except the 6600 and 6700), while NVIDIA has supplied exclusively Gigabyte for three generations in a row (one more reason I don't trust the AMD + Gigabyte pairing).

I trust no one without monitoring and a test segment, so that everything can be checked and everything is transparent. I haven't seen anyone like that yet.
PS I'd like to buy a 6900 XT for the sake of fairness and put it to the test; there are too many claims without evidence.
Moreover, I checked: top 6900 XT cards here go for 1500-1600 euros (the Red Devil for 1550 euros).

zmey

in the profile thread on the overclocking forum, the guys with 6900 XTs at ~2700-2750 MHz outpace me by 8-10%, which is natural - the standard difference between X and non-X Radeons, which is effectively what the 6900 XT and 6800 XT are. As for the 6800, it is cut by a quarter not only in CUs but also in ROPs - the first time AMD has done this in a non-X model - so it is cut down quite seriously, and it is also held back by a 1.05 V voltage limit (the 6800 XT gets 1.15 V, the 6900 XT 1.175 V, and on the new XTXH it seems the limit can be raised to 1.2 V), so it won't even catch a stock 6800 XT. I would rather call the 6800 XT the "6900", a sort of non-X flagship, while the 6800 is already a lower-class card.

By the way, 6800 XT @ 2650 MHz = 6900 XT @ 2380 MHz in terms of FLOPS. But in that case the 6800 XT has an 11% higher pixel fillrate (ROPs), because its core clock is higher, and that matters especially at high resolutions.

I trust no one without monitoring and a test segment, so that everything can be checked and everything is transparent. I haven't seen anyone like that yet.

right now, wherever you look, it's all NVIDIA-sponsored publications. In the Russian-speaking segment, apart from gamegpu and i2hard, there is no one left
PS I'd like to buy a 6900 XT for the sake of fairness and put it to the test; there are too many claims without evidence.

Well, why not - I regularly post videos. By configuration the 6900 XT is ahead of the 6800 XT exactly as much as the 290X was ahead of the 290, the 480 of the 470, the Vega 64 of the Vega 56, the 5700 XT of the 5700. And the difference at equal clocks was 6-7% on average everywhere. Given the 6900 XT's higher clocks, that is where the ~10% difference comes from.
Moreover, I checked: top 6900 XT cards here go for 1500-1600 euros (the Red Devil for 1550 euros).

they are noticeably cheaper than their NVIDIA counterparts because they mine worse, but the prices are still absurdly sky-high. If you do buy one, take the version with the XTXH chip: its limits are unlocked and the memory can be pushed above 2150 MHz (on the regular chip there is a silly limit - the slider simply won't go higher). I know for sure that the 6900 XT ASRock Formula has that chip.

RadeoForce

RadeonForce
Interesting results were obtained - exactly the opposite of what was expected.
1080p
https://ibb.co/rGqTJVm
1440p
https://ibb.co/8868BvX
4K
https://ibb.co/vq99ZRh
1440p Ultra + RT
https://ibb.co/FgP1gD6
PS How can I post a picture here?

zmey

Interesting results were obtained - exactly the opposite of what was expected.

but that's not so - it just confirmed exactly what I was talking about. Both of us are CPU-bound (except in 4K); your CPU is faster, hence the higher overall FPS. But here's the thing: look at the "GPU game" metric:

3090 vs 6800XT

1080p: 236 fps vs 273 fps (+15.6%)
1440p: 173 fps vs 193 fps (+11.5%)
2160p: 106 fps vs 107 fps (+1%)

In total, what we have: parity in 4K, and a 15% advantage in favor of the 6800 XT at 1080p.

Well, much as I hoped this game would lean red in rasterization (given the Huang logos), we're not picking a winner right now - I just wanted to demonstrate this gradation by resolution: where there is parity in 4K, there is +15% at 1080p and +10% at 1440p. And it will be plus or minus the same in most games.


In Horizon, we also ran comparisons on the forum:

1440p - 154 against 151 in your favor (+2%)
2160p - 96 against 85 in your favor (+13%)

That is, at 1080p I would most likely have the expected advantage of about 5%.

And note: between 1440p and 4K, again the same ~10%.

Why I remembered the GCN syndrome: on Vega in Detroit: Become Human there was parity with the 1080 Ti in 4K, a slight lag behind the 1080 at 1080p, and 1070-level performance at 720p. In all three cases the bottleneck was the GPU. Against GCN, Pascal showed the same thing that NVIDIA's architectures now show against RDNA2: better scaling down to low resolutions. That is, at low loads the efficiency of the architecture is higher. A little spoiler: at 8K the difference in your favor will be around 20-30%. De facto, Ampere is stronger than RDNA2, just as in its day any GCN card was stronger than its direct competitor, but that could only be seen at maximum loads, for example in 4K - and of the Radeons only Vega more or less handled 4K, and before that the 290X, though you had to compare non-reference cards, because the reference ones went into deep throttling at those screen resolutions: power consumption grew along with the clocks as the resolution rose.

This is the very reason NVIDIA used to be more relevant for high-refresh-rate players. Now everything has flipped: at high resolutions NVIDIA shows itself better, at low ones AMD does.

That's why I say the 3080 is a competitor for the 6800 XT only in 4K; at lower resolutions it will be weaker. Unless, of course, you compare a reference card against a non-reference one, as gamegpu does.

1440p Ultra + RT
https://ibb.co/FgP1gD6

buddy, your card seems to be throttling. Ray tracing adds its 50 watts, and the card couldn't fit into its power limit. People get at least 115-116 fps there; Gan77 hit something like 120 fps.
Perhaps for the same reason you didn't overtake me in 4K in Lara.
PS How can I post a picture here?

on that image host there are "embed codes" below the picture; select "BB-code for full size image", copy the link, and paste it here

RadeoForce

buddy, your card seems to be throttling.

No, the card is not throttling. There is a problem: the card runs at PCIe x8 Gen3. I bought the motherboard on eBay, and the trouble surfaced afterwards. Maybe speed is lost through the riser; I'm not going to do anything about it now. I'm waiting for a 12900K and will sell this setup.
Gan77 seems to hit 120 fps.

He has the same setup as me. Maybe he can advise something about Gen3 x8.
https://i.ibb.co/c3LmvGt/rrr.jpg

zmey

No, the card is not throttling. There is a problem: the card runs at PCIe x8 Gen3. I bought the motherboard on eBay, and the trouble surfaced afterwards. Maybe speed is lost through the riser; I'm not going to do anything about it now. I'm waiting for a 12900K and will sell this setup.

ahh, well, for ray tracing the full PCIe bandwidth does matter, at least in theory. That's probably why it fluctuates.
He has the same setup as me. Maybe he can advise something about Gen3 x8.

I think he will advise when he reads this.

Well, in general, did you catch the essence of what I wanted to convey to you? The above doesn't mean at all that the 6800 XT will be the fastest in every game. The point is that where the lag in 4K is 15%, at 1080p there will most likely be parity, and at 1440p about a 5% lag. And of course, in some games this gradation from 1080p to 4K can be 10% or 20%, but it will certainly be there, due to the features of both architectures

That's why I say the competition in 4K and the competition at 1080p and 1440p are two different competitions. Actually, the charts from third-party publications confirm this - although there, too, it's impossible to tell which cards are reference and which are not, a similar trend can be traced (don't look at the 8 GB cards: they run out of memory at ultra presets in 4K, so instead of pulling ahead of a plain 6800 they only fall further behind)

https://i.ibb.co/tC54tJb/1080.png
https://i.ibb.co/qFtFhN4/1440.png
https://i.ibb.co/419Tmyb/4k.png

Actually, when testing against you in 4K I deliberately choose the worst-case scenario for the 6800 XT, and even there it doesn't really falter.

RadeoForce

Well, in general, did you catch the essence of what I wanted to convey to you?

The meaning is clear, but I don't think it will affect me at 3440x1440. All this is certainly informative, but I believe in hard, game-specific numbers.
Well, of course, in some games this gradation from 1080p to 4K can be 10% or 20%, but it will certainly be there, due to the features of both architectures

Therefore, I wouldn't derive any patterns; there will always be exceptions to the rules.
Actually, the charts from third-party publications confirm this.

I don't take those charts seriously; in 4K the gap between the greens and the reds is not 1%. 10-15% I would still believe. Even on the local charts the gap is bigger.

PS I think the memory bandwidth chokes in 4K and with rays

zmey

by and large we didn't need rays for this experiment, but the solution to your problem, I think, lies somewhere on the surface. Naga gets normal results with rays on a 3090, which means he definitely knows something. That's a topic for the forum.
In any case, thanks for participating. I have raised this topic many times, but I think most people didn't understand what I meant, or assumed it was about CPU dependence. Now it has literally been shown step by step, with the example of two cards. I think most of those who read that wall of text understood everything

To be fair, it's no fun competing under favorable conditions, and loading this much GPU power at 1080p would be a losing game even with Zen 3 - even the lower CPU dependence under DX12 won't help here (at 1440p, I think, it will "open up"). But "on enemy territory", in 4K, where you can actually get beaten - that's the real thing

I don't take those charts seriously; in 4K the gap between the greens and the reds is not 1%. 10-15% I would still believe. Even on the local charts the gap is bigger.

A 10% margin over the 6800 XT is normal. I expected 15%, though. But don't forget that the 6900 XT has 12% more cores, higher limits and factory clocks, and is on average 8% faster than the 6800 XT (not linearly, since both have 128 ROPs and the same memory - only the CUs are cut), so there is a clear difference.
Even on the local charts the gap is bigger.

Gamegpu compares throttling reference 6800, 6800 XT and 6900 XT cards. Even I beat the latter by 10% on average with a 6800 XT, while every single Ampere there is non-reference - a 3080 Ti with 3x8-pin, an AORUS. With the Germans, most likely, either all the cards are reference or all are non-reference. Here the AMD representative hands out only reference cards (except the 6600 and 6700), while NVIDIA has supplied exclusively Gigabyte for three generations in a row (one more reason I don't trust the AMD + Gigabyte pairing).

I don't fully trust these Germans' tests myself - more often than not I don't - but they show the same gradation as resolution increases, so at least they actually tested something. But when publications like TPU roll out results with no difference in CPU dependence at 1080p under DX12, doubts about the reliability of the test appear immediately.

RadeoForce

Zmey
As expected, the processor did not fully load the card at these screen resolutions, but "GPU game" shows 194 and 273 fps respectively. Preset "max", TAA.

Let's start:

1440p

https://i.ibb.co/Bt76RXT/Shadow-of-the-Tomb-Raider-2021-10-30-17-10.png

1080p

https://i.ibb.co/gMpCP70/Shadow-of-the-Tomb-Raider-2021-10-30-17-22.png

I'll add a 4K result too - 107 fps - since you'll most likely press me on that.

https://i.ibb.co/R0bTKh6/Shadow-of-the-Tomb-Raider-2021-10-30-17-29.png

It's funny: a couple of years ago Lara was scolded for lousy performance, but for the current flagships it turns out to be light even in 4K. It feels unusual - I've always associated it with punishing games. At 1080p it essentially turned into a CPU test: "GPU dependence" 0%, that is, during the benchmark performance was limited by the video card 0% of the time. At 1440p the GPU is the bottleneck 41% of the time, the CPU the rest.

RadeoForce

Why can't I see graphics settings?

zmey

at the top - "max", under the word "graphics". If you change anything in the preset, "max" is replaced by "custom"

RadeoForce

Why not with everything maxed out, without RT?

zmey

We were talking about rasterization, weren't we? Rays are more or less clear in this game. Well, here's 1440p all maxed + DXR, the same anti-aliasing, the same game version (there is even a video of this result, if needed). You will get around 120 fps here, or a little more. In 4K with rays there will be about a 20% advantage.
https://i.ibb.co/XWgTJQf/Shadow-of-the-Tomb-Raider-2021-10-20-20-27-1.png

But right now, specifically about rasterization, to dot the i's: I just want to demonstrate the scalability of both architectures so you understand what I have been talking about for a long time.

RadeoForce

ok, I understand you, now I'll post JPEGs at the maximum preset

zmey

It sucks that Radeons sag like that in 4K...

mfuZ

Take the CPU limit out of the equation and they would fall behind at low screen resolutions too...

Undertaker

Zmey

ok. Let's calmly compare several popular games and draw conclusions on whether it is critical at 1080p/1440p or can be ignored.

so in 4K it's not critical, but usually when someone wins by 10% it's "a crushing gap, the best performance on the market", and when it loses it's "not critical". I try to stick to consistent criteria in such matters.
Initially I wrote that the distance between the tops and the near-tops is insignificant. In fact, if the 3080 had 12 GB, I would quietly see nothing wrong with it. I think at 1080p the 3080 will be about 10% slower, in 4K plus or minus the same. The point is that in rasterization the 3080 is a competitor to the 6800 XT only in 4K (when it has enough memory); at 1080p and 1440p it acts more like the weaker side. Rays are a separate story: in NVIDIA-sponsored ray-traced titles it is basically either slightly weaker or slightly faster than the 3080, while in cross-platform ray tracing there are even wins, as in Myst, Far Cry and Dirt, or parity, as in Deathloop. Let me remind you, Myst and Deathloop are not AMD-partnered games.
