AMD Radeon RX 5700 XT Stuttering at 1080p

Quick note before I go into any details: I did not find a solution for this problem, unfortunately. I’ll be explaining what happened and showing frame time graphs as proof.

So, with that out of the way, let’s get into it. I’m certainly not the only one with this issue. If you employ the search engine of your liking you will find many threads covering that topic (like here and here and here and here and so on). Some managed to get it working, some did not. I’m obviously in the latter category.

What happens? From what I found in my research it seems like the RX 5700 XT GPU aggressively tries to save energy if it is not fully utilized. If you run MSI’s Afterburner or any other monitoring software, then you’ll see the GPU load and frequency being all over the place. In general, this is a good thing – if it does not affect perceived performance. And this is where it fell apart for me.

My system specs:

  • Ryzen 5 2600
  • Sapphire Pulse RX 570 8GB, to be replaced by a PowerColor Red Dragon RX 5700 XT
  • 16GB 3000MHz CL15 RAM

Clearly, the CPU is the bottleneck, I know that. The GPU was chosen so that a CPU upgrade later in 2020 would unlock additional performance. In an ideal world, the 5700 XT is around twice as fast as the RX 570, as shown in this benchmark graph from the reputable German technology site Computerbase. That ideal world, however, is built around a 9900K.

I only have four games installed right now, and one of them did manage that jump: The Division 2 in DX12 mode. In that game I saw a boost from 48 fps at 30% CPU utilization to 102 fps at 73% CPU utilization at Ultra settings. However, that’s not what I’m playing at the moment; it’s Assassin’s Creed Odyssey and Far Cry 5 (and I just finished Ghost Recon: Wildlands, which I will also use to verify improvements in key areas where performance was atrocious with the RX 570).

To make it short: FPS were higher in all those games, though not as spectacularly as in The Division 2, but perceived performance was terrible. I’ll be using ACO and the built-in benchmark to make my point because it provides a nice frame time graph and conveniently exports the results to a file.

Here’s how it looks at Very High to Ultra settings (the exact settings don’t really matter) at 1080p.

As can be seen, the frame times are all over the place. There are more than 50 shades of gray in there.

Now the same settings with a resolution scale of 140%.

Granted, it’s a subtle difference for sure and it is way more obvious in the in-game graph (which, oddly enough, is not exported), but the number of shades has dropped below 50. Putting a higher load on the GPU resulted in better frame times without actually reducing the average FPS number.

Lastly, the same settings with the RX 570.

Now those are stable frame times! Yes, the performance is lower, not by much mind you, but the overall experience is waaaaaay better. As a result, I’ve returned the 5700 XT and will go with NVIDIA next time.

Wolfenstein: Youngblood – Coop Review

Wolfenstein: Youngblood follows in the footsteps of its three predecessors, which successfully revived the series starting in 2009. Having liked Wolfenstein, The New Order and The New Colossus, I thought that sharing that kind of game with a friend in Coop would be even better. This is the first installment in the series that allows you to do that, and I’m a big fan of Coop gameplay. And by Coop I mean playing the regular campaign with a fellow gamer, not some unrelated multiplayer map or basic PvP action. I want to experience the story with somebody, have a ton of fun and discuss the game while playing it.

Read More »

Ryzen Master not Resetting to “Auto” Control Mode

Recently I set out to figure out how much clock speed I can squeeze out of my Zen+ based Ryzen 5 2600. To make life easier I figured I’d use Ryzen Master, so I could change the settings from within Windows and wouldn’t have to reboot every time I increased the clock speed. This worked nicely up to the point where I had found the viable maximum. The next step was to dial those numbers "into hardware", meaning setting the options in the BIOS so that Ryzen Master is no longer required. And this is where my issues started to appear.

The Error

First, here’s a screenshot of the message Ryzen Master was giving me. After that I’ll explain what had happened.

In order to set the CPU multiplier you have to change from automatic to manual mode in Ryzen Master. I wanted to reset all options to their defaults after setting the overclock in the BIOS, but I kept getting the message that Ryzen Master wanted to restart Windows because the setting had been changed to "Manual" – which it hadn’t, but more on that later. So I did as it asked, multiple times, with the same outcome every time. Effectively, I was doing a boot loop manually.

So, how did I get there?

The Journey

In brief:

  • Find a stable overclock in Windows using Ryzen Master.
  • Reboot to BIOS and set the overclock closer to the hardware.
  • Reboot to Windows and reset everything in Ryzen Master.
  • Manual "Boot Loop" a few times.
  • Notice CPU always at 4GHz, no more Cool’n’Quiet operation mode.
  • Undo overclock in BIOS.
  • Still see overclock in Windows.
  • Uninstall Ryzen Master.
  • Still see overclock in Windows.
  • Ryzen Master still not resetting.
  • Manual "Boot Loop" a few more times.
  • Get pissed and search the Internet – apparently I was not alone.
  • More reboots and tests with BIOS settings.

The Fix

It was the frickin’ BIOS! Ryzen Master was not to blame.

I have an ASRock B450 Gaming mITX mainboard with the latest BIOS that predates Matisse (Ryzen 3000) support; upgrading beyond that is not recommended unless a Ryzen 3000 CPU is actually installed. There’s a weird bug in this BIOS that still applies the overclock even if the setting is set to "Auto by AMD CBS" (or something like that). Two things helped:

  1. Load BIOS defaults.
  2. Enable manual control and set the correct CPU base frequency at 3400MHz.

With the overclock applied at 4000MHz it effectively ran at 4GHz all the time, even at idle. With 3400MHz set, it properly clocked down and also boosted as an R5 2600 should. The same setting, only with a different clock value, produced different behavior. And unless the BIOS defaults are loaded, the "Auto" mode doesn’t do what you expect – if you’ve set an overclock previously.

Curiously enough, booting Fedora Linux from a USB stick did properly scale the CPU frequency based on load, even with the overclock applied. Apparently only Windows or AMD’s drivers failed to do that. Booting Linux helped me rule out Ryzen Master as the cause of the permanently applied overclock, even though the BIOS setting was at the default Auto mode.
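Just to illustrate how this can be observed (a minimal sketch in Kotlin, since that’s what I use elsewhere on this blog; any monitoring tool or a plain grep on /proc/cpuinfo shows the same thing), the per-core clocks on Linux can be printed like this:

import java.io.File

fun main() {
    // On Linux, /proc/cpuinfo contains one "cpu MHz" line per logical core
    File("/proc/cpuinfo").readLines()
        .filter { it.startsWith("cpu MHz") }
        .forEach(::println)
}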

The takeaways:

  • Don’t overclock on this mainboard.
    • The OC options for the CPU are laughable at best. No way to set the multiplier per core.
  • Next time buy a higher-end mainboard for overclocking (ITX is expensive though…).

Kotlin Object Expression – What more can object do?

In a previous post I explained what Kotlin Object Declarations are. This time around it’s about the declaration’s sibling, the Object Expression.

An object is not just a glorified static replacement or a singleton. object can be used where Java usually utilizes anonymous inner classes. Let’s look at a more realistic scenario: a JButton and an ActionListener or a MouseListener.
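As a rough sketch of what that looks like (the Swing wiring here is purely illustrative and not taken from the full post), an object expression steps in exactly where Java would reach for an anonymous inner class:

import java.awt.event.MouseAdapter
import java.awt.event.MouseEvent
import javax.swing.JButton

fun attachHoverListener(button: JButton) {
    // Object expression: an unnamed, ad-hoc subclass of MouseAdapter,
    // the Kotlin counterpart to a Java anonymous inner class
    button.addMouseListener(object : MouseAdapter() {
        override fun mouseEntered(e: MouseEvent) {
            button.text = "Mouse is hovering"
        }

        override fun mouseExited(e: MouseEvent) {
            button.text = "Click me"
        }
    })
}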

Read More »

Kotlin Object Declarations – The fake-static

Instead of implementing my own backup application as I had planned a long time ago, I’m wandering off (re)learning Kotlin after a long absence from that language. In my defense though, I’m doing it in the context of the backup app, which will not be Java as originally intended (or maybe later for comparison, who knows, I obviously can’t be trusted with my plans). Putting that aside, the most confusing concept of Kotlin for a Java developer is the object. What is that thing doing that a class can’t do, and how do we declare static fields and methods? I know it’s nothing new, but that part seems to have changed a bit since I used Kotlin about two (?) years ago. So, for me this is a refresh of old information and also something new, and by writing about it I will engrave it in my brain once and for all. And your confusion will hopefully turn into some productive… fusion… of some sort… or so.
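To give a tiny preview of where this is going (the classes below are made up purely for illustration), an object declaration is a lazily created singleton, and a companion object is what stands in for Java’s static members:

// An object declaration: exactly one instance, created lazily on first access
object Config {
    val appName = "backup-tool"
}

class Temperature(val celsius: Double) {
    companion object {
        // Reads like a static constant at the call site: Temperature.ABSOLUTE_ZERO
        const val ABSOLUTE_ZERO = -273.15

        // Reads like a static factory method: Temperature.fromFahrenheit(98.6)
        fun fromFahrenheit(fahrenheit: Double) = Temperature((fahrenheit - 32) / 1.8)
    }
}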

Read More »

Red Dead Redemption 2

I’ve been a gamer for a very long time – it’s easily been twenty years or more (yes, I’m old). But, in the past year or so, my excitement has been waning. I mentioned in another blog post that I was planning to replace my big tower PC with a notebook for (mobile) coding and writing – which I have done – and, in the short to mid term, get a gaming console to replace the video gaming part of the PC with something more casual and affordable. That day has finally come, and the first game I played was Red Dead Redemption 2. This game marked many firsts for me:

  • First non-digital game since Steam launched. I bought it in a retail store on a Blu-ray disc.
  • First full-price video game at 60€. Before that, I had always shopped for special offers and discounts.
  • First console game.

I think Red Dead Redemption 2 is something very special and I will try to explain why. One thing is for sure: it has rekindled the fire within me to play a video game on end, without pause. Unlike the other game reviews/experience reports I have written so far, this one is a bit different. I started writing when I was about 40% through the game and added to it at different stages of progress. In short: it’s like a diary.

Read More »

Unwanted JUnit 4 Dependency with Kotlin and JUnit 5

I ran across this issue only by accident because I was investigating a completely different problem. I wrote a quick test to debug my issue and was wondering why my custom serializers and deserializers were not being registered with the Jackson ObjectMapper. I had a nice init() function that was annotated with @Before. So, what the hell?

Let’s back up a bit for some context.

  • Kotlin Project
  • Runs on Java 12
  • JUnit 5 as test engine
  • AssertK for assertions (just for the sake of completeness)

I’m used to JUnit 4, so in my test I used @Before to annotate a setup method. It was one of the many options IntelliJ presented to me.

@Before
fun init() {
    val module = SimpleModule()
    module.addDeserializer(Instant::class.java, InstantDeserializer())
    module.addSerializer(Instant::class.java, InstantSerializer())
    mapper.registerModule(module)
}

The method wasn’t called, however. But it’s annotated! Well, it’s the wrong annotation if you’re using JUnit 5. The correct one is @BeforeEach. Both @Before (now @BeforeEach) and @BeforeClass (now @BeforeAll) were renamed between version 4 and 5 to make their meaning more obvious.
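For reference, the JUnit 5 flavor of the setup method only swaps the annotation; the body stays exactly as in the snippet above (mapper and the Instant serializer classes are the same custom pieces as before):

// JUnit 5 (Jupiter) annotation instead of JUnit 4's org.junit.Before
import org.junit.jupiter.api.BeforeEach

@BeforeEach
fun init() {
    val module = SimpleModule()
    module.addDeserializer(Instant::class.java, InstantDeserializer())
    module.addSerializer(Instant::class.java, InstantSerializer())
    mapper.registerModule(module)
}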

But that’s beside the point. The question is: where does this @Before come from then?

A look at the dependency tree (mvn dependency:tree, since this is a Maven project) quickly reveals the culprit.

It’s the official JetBrains Kotlin JUnit test artifact. Although it doesn’t hurt me to have it in my project, it certainly caused some confusion, and I’d like to avoid that in the future. Hence, I excluded the old version of JUnit from this dependency in my POM file.

<dependency>
    <groupId>org.jetbrains.kotlin</groupId>
    <artifactId>kotlin-test-junit</artifactId>
    <version>${kotlin.version}</version>
    <scope>test</scope>
    <exclusions>
        <exclusion>
            <groupId>junit</groupId>
            <artifactId>junit</artifactId>
        </exclusion>
    </exclusions>
</dependency>

Problem solved.