Alan Wake Remastered Review (PC)

The gamer thought he knew what to expect. After all, he had watched a playthrough of the Xbox 360 version so many years ago. In truth, he had no idea.

Alan Wake had been a successful game. It sold over three million copies. Critics loved it. Players loved it. There was a huge fanbase around it. Still, the remastered version was slightly less well-received. It was a technological overhaul, more suitable for modern systems, while the gameplay was still the same old same old.

As the gamer fought his first battles, a realization set in. The controls were clunky, even odd at times. He had to retrain his brain to make things work. Dodging and sprinting were activated by the same key. "Why would the game do such a thing?", the gamer wondered. The way he controlled Alan was unlike anything he had ever played. Alan was a writer, not a superhero. He was less athletic than a boulder chasing a famous fictional adventurer through narrow caves. Running was a futile endeavor. Any such attempt was quickly met with heavy panting from the protagonist, Alan Wake.

The game did not want to be fast-paced, did not want to be a shooter. It was a supernatural mystery thriller with action elements. The beam of Alan’s flashlight was the game’s version of a reticle. The gamer thought this was a clever idea. He also didn’t like it as it made him feel vulnerable, not always in control. That’s what the game wanted him to feel.

Not in control.

Helpless.

The game’s story evoked similar feelings on an intellectual level. It was deliberately convoluted. It contained a meta-narrative that foretold the story while the gamer experienced it on screen. He was dreading the moments of intense combat the game foretold. But how much of it was real? Was any of it real? This was unlike anything the gamer had seen before. He was wondering how the pieces fit together, how it all made sense. Would there be a happy ending?

Not in the know.

Clueless.

Read More »

Base64 PowerShell Cmdlet Via Advanced Functions

Among the many valuable command line utilities on a Linux system is base64, which encodes to and decodes from the Base64 encoding scheme. As much as I like PowerShell…

(Yes, you read that correctly)

…it sorely lacks a base64-equivalent utility, or cmdlet as they are called in PowerShell land. The only solution was to create one myself. Cmdlets are usually written in C#, but you can also employ the concept of advanced functions, which is what I have done.
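For context, this is the round trip the Linux base64 utility provides out of the box (the sample string is mine):

```shell
# Encode a string to Base64 and decode it again with the Linux utility.
printf 'Hello, World!' | base64
# SGVsbG8sIFdvcmxkIQ==

printf 'SGVsbG8sIFdvcmxkIQ==' | base64 --decode
# Hello, World!
```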

Here’s the code for decoding Base64-encoded data back into strings. The function supports receiving data from a pipeline, or you can call it directly and pass the value as a parameter. More on the usage later.

Function ConvertFrom-Base64
{
    [CmdletBinding()]
    param (
        [Parameter(ValueFromPipeline = $true)]
        [string] $Base64
    )
    
    Process 
    {
        if (-not [string]::IsNullOrEmpty($Base64)) 
        {
            $Bytes = [Convert]::FromBase64String($Base64)
            Write-Output ([System.Text.Encoding]::UTF8.GetString($Bytes))
        }
        }
        else 
        {
            Write-Error "No base64-encoded data provided."
        }
    }
}
Read More »

Horizon Forbidden West PC Technology Discussion (Linux + Windows Benchmarks)

Horizon Forbidden West was one of the most visually impressive titles on the PlayStation 4 and PlayStation 5. Two years later, the same sentiment repeats on the PC. Despite no fancy raytracing features, Guerrilla’s Decima Engine still produces a stunning game world and character rendering. I gushed enough about the visuals when I wrote my PS4 Pro and Burning Shores expansion reviews. Therefore, I will not elaborate further on this topic here. Instead, I will focus on the performance aspects. If you are interested in seeing a lot of screenshots, please read my reviews. The PC version is essentially the PS5 version, with a slightly higher level of detail in the distance and some PC-specific improvements, as Digital Foundry discovered.

The most significant benefit of this PC version is the “unlimited” performance that is unshackled from console hardware and the free choice of input peripherals. I played with a keyboard and a mouse because of RSI issues in my thumbs that operating a controller’s thumbsticks worsened. A mouse was also more accurate when aiming with the bow, but I would still have preferred a controller during traversal and combat. The proximity of all buttons to the fingers would have made coordinating the various skills and movement patterns much more effortless. Apart from that, PC’s stalwart input methods worked very well and did not hold me back much. I made up for what I lost in convenience with comfort and precision.

Unlike other modern releases that ate hardware for more or less good reasons, Horizon Forbidden West performed admirably. The YouTube channel eTeknix did tremendous work testing 40 GPUs at three different resolutions. Nixxes did an excellent job making the game scalable, and Guerrilla’s initial work optimizing for the weaker consoles also paid off immensely. Even my former RTX 3060 would have been enough to enjoy the Forbidden West with some help from DLSS upscaling.

Read More »

Uncharted – The Legacy of Thieves Collection Review (PS5 + PC)

In my review of The Nathan Drake Collection, I hinted at wanting to play the Legacy of Thieves Collection at some point. That day has come, or rather has passed, and I have some thoughts about Uncharted 4: A Thief’s End and The Lost Legacy.

Both titles were unmistakably Uncharted games and improved on many aspects of the previous trilogy. The biggest one pertained to the controls, which were more accurate in the fourth entry of the series and its spinoff. This eliminated all the unintentionally hilarious deaths I frequently endured in the first three titles. Combat also benefitted greatly from these improvements and was more precise this time around. Thanks to the PlayStation 5 overhaul, stable high framerates and the inclusion of DualSense controller features made for more fluid and immersive gameplay. The PC version reached even higher framerates, elevating the feel of gunplay further.

Despite all the mechanical grievances being eliminated, Uncharted 4 still contained moments of downtime. Naughty Dog’s trademark storytelling and character moments were stretched to their limits on a couple of occasions. These moments no doubt accurately captured the mood of the situation. This is undeniable. I only wished the writers had opted for a more concise presentation to make it more delightful. Long-winded traversal and climbing sections slowed down progress and put a damper on the enjoyment more than necessary. This sentiment does not apply to The Lost Legacy, however. Its pacing gave me no cause for complaint while still managing to deliver strong character moments that portrayed the human side of Chloe and Nadine and their relationship.

On the other side of that ancient coin were the high-quality character interactions that emanated from these design and gameplay decisions, something we have come to expect from contemporary Naughty Dog games. The tenuous relationship between Nathan and Elena, the love and admiration between the two brothers, and the friendship of the group and Sully; all were exceptionally well written. That even extended to Nadine, one of the two antagonists. In typical Uncharted fashion, the primary adversary was just a bad dude. Arrogant and spoiled. A rich douchebag. Someone to dislike.

Read More »

Are Game Discounts Bad Or Should You Pay Full Price?

I regularly watch a YouTube channel named SkillUp and listen to several IGN podcasts for weekly video game and gaming-industry-related news and updates. Almost every week since 2024 started, the weekly shows have contained reports about layoffs throughout the gaming industry. Many were due to bad management or lackluster sales of mediocre and unsuccessful titles. Around 7000 people have already lost their jobs in the first two months of this year alone, following the more than ten thousand lost last year. This is compounded by the fact that game development is becoming an ever-growing financial undertaking. From recent leaks and the legal battle between Sony and Microsoft, we know that PlayStation’s last AAA first-party masterpieces have broken through the 200- and 300-million-dollar production cost barriers.

The more I think about it, the more I ask myself whether I should be looking for deals rather than paying full price. Relying solely on discounts is incredibly hypocritical of me as I am predominantly a PC gamer. A PC gamer with a very costly gaming computer, I am sad to admit. And despite having the cash for all that fancy hardware, I constantly look for discounts rather than accepting the full price for a video game. How does that make sense? Isn’t that salt in the wounds of all those talented developers and artists who lost their livelihoods? Am I part of the problem and just as despicable as the handsomely paid CEOs who gamble with the lives of their employees in their relentless chase for a slice of that shiny live-service money? I shudder to think what will happen to Rocksteady after the unfortunate release of Suicide Squad.

I will rethink my approach to game purchases, but I alone will not even make a dent. It’s a conscience thing more than anything. It requires more people to do the same and also changes in the industry. Are development cycles that now span the typical lifetime of a console generation sustainable in the long run? Who, other than full-time game reviewers or streamers, maybe, is supposed to have the time to repeatedly spend 100 hours on every single game because the new norm for AAA releases is to be a gigantic time sink? Why do we need countless hours of filler content? Aren’t shorter, more focused games of 40 hours more affordable and simpler to produce? I often feel bad for skipping so much side content because the main story is so engaging or because the side content is not entertaining enough. Sometimes, I want to finish a game because I have already spent so much time in its world that I am ready to move on. That means I skip content somebody poured their sweat and tears into. It feels wrong, yet I want to experience more of the many great titles released yearly, not more of a single one. Maybe it is an attention span issue of mine. I do not know. All I see is budgets skyrocketing and people losing their jobs.

As a passionate gamer and a software developer, I can only hope the gaming industry reaches an equilibrium soon. These massive waves of workforce reductions and studio closures must stop. The industry needs a sustainable approach to high-quality, high-profile cinematic experiences so we can continue playing these incredible stories and even more of them. Studios also need a way to try something new, to take a risk without fear of losing everything. I immensely enjoyed Immortals of Aveum, but Ascendant Studios did not ascend far with their first game. If this industry only knows the two extremes of blinding success or heartbreaking disaster, it will become harder and harder for creative minds to take the plunge and dare raise funds to create something new.

Now, these thoughts mainly apply to high-profile releases. However, the concept of paying full price is even more applicable to double-A or indie productions. Smaller studios need all the money they can get without a giant publisher backing the development.

This is a complex topic with many more points I could discuss if I invested more time. Despite that, I said what I thought was necessary, and I will end this post on that note.

I apologize for the downer. These thoughts did not let me go.

Thank you for reading this sad gamer’s tale.

PS

The header image was created by Microsoft’s Copilot with a minor edit job from me using the following input:

“create an image of a video game store with a bargain bin full of discounted video games next to a stand of triple-a full price titles”

Azure Key Vault Error: The Specified PEM X.509 Certificate Content Is In An Unexpected Format

Microsoft’s Azure Key Vault supports uploading certificates in the PEM format. However, it is a bit picky, and the format must be exact. The documentation contains all the information, but the PEM format has a few nuances that the documentation does not address.

The following is a valid certificate as generated by a PKI.

Subject: CN=The Codeslinger,O=The Codeslinger,C=DE
Issuer: CN=The Codeslinger Intermediate,O=The Codeslinger,C=DE
-----BEGIN CERTIFICATE-----
MIIC...Ivw=
-----END CERTIFICATE-----
Subject: CN=The Codeslinger Intermediate,O=The Codeslinger,C=DE
Issuer: CN=The Codeslinger Root,O=The Codeslinger,C=DE
-----BEGIN CERTIFICATE-----
MIIB...Rps=
-----END CERTIFICATE-----
Subject: CN=The Codeslinger Root,O=The Codeslinger,C=DE
Issuer: CN=The Codeslinger Root,O=The Codeslinger,C=DE
-----BEGIN CERTIFICATE-----
MIIB...aA==
-----END CERTIFICATE-----
-----BEGIN RSA PRIVATE KEY-----
MIIE...12Us
-----END RSA PRIVATE KEY-----

However, Key Vault will not accept it. Instead, it throws the dreaded error: “The specified PEM X.509 certificate content is in an unexpected format. Please check if certificate is in valid PEM format.”

As you can see in the documentation, the PEM file must not have metadata about the certificate and issuing authorities. You can remove this information, and the PEM will look like the following.

-----BEGIN CERTIFICATE-----
MIIC...Ivw=
-----END CERTIFICATE-----
-----BEGIN CERTIFICATE-----
MIIB...Rps=
-----END CERTIFICATE-----
-----BEGIN CERTIFICATE-----
MIIB...aA==
-----END CERTIFICATE-----
-----BEGIN RSA PRIVATE KEY-----
MIIE...12Us
-----END RSA PRIVATE KEY-----

You are not done yet, though, as the key must be in the PKCS#8 format. The following OpenSSL command will do the trick if you store your key in a file.

openssl pkcs8 -topk8 -nocrypt -in private-key.pem

This works for RSA keys, as shown above, and Elliptic Curve keys.

-----BEGIN EC PRIVATE KEY-----
MHcC...8g==
-----END EC PRIVATE KEY-----

The output will be the following.

-----BEGIN PRIVATE KEY-----
MIGH...vnry
-----END PRIVATE KEY-----

Putting it all together, Key Vault will now accept the certificate.

-----BEGIN CERTIFICATE-----
MIIC...Ivw=
-----END CERTIFICATE-----
-----BEGIN CERTIFICATE-----
MIIB...Rps=
-----END CERTIFICATE-----
-----BEGIN CERTIFICATE-----
MIIB...aA==
-----END CERTIFICATE-----
-----BEGIN PRIVATE KEY-----
MIIE...8kjt
-----END PRIVATE KEY-----
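The whole preparation can be scripted. Here is a sketch using throwaway, self-signed material (the file names are mine; a real certificate chain comes from your PKI, as shown above):

```shell
# Create a throwaway self-signed certificate and key for illustration.
openssl req -x509 -newkey rsa:2048 -nodes -days 1 \
    -subj "/CN=The Codeslinger" \
    -keyout private-key.pem -out certificate.pem

# Convert the private key to PKCS#8, the format Key Vault expects.
openssl pkcs8 -topk8 -nocrypt -in private-key.pem -out private-key-pkcs8.pem

# Concatenate the bare certificate block(s) and the PKCS#8 key.
# No Subject/Issuer metadata lines may appear in the result.
cat certificate.pem private-key-pkcs8.pem > keyvault-upload.pem

head -n 1 private-key-pkcs8.pem
# -----BEGIN PRIVATE KEY-----
```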

I hope this helps.

Thank you for reading.

Starfield Review – Jack Of All Trades, Master Of Some (PC)

Microsoft desperately needed a win after all the negative press surrounding its Xbox gaming brand in 2023 and the year before. 2022, in particular, was starved of major first-party releases as many titles were pushed further into the future. When one of those much-anticipated games finally arrived, it was a stake in the heart of a vampire. Redfall’s release in May was a disaster, and all eyes were now on Starfield, Bethesda Game Studios’ space opera and first new IP in ages. “Skyrim in space” was the most succinct description, one even given by the game’s director.

I like Skyrim, and I enjoy science fiction in space. The more Bethesda revealed about the game, especially in its Starfield Direct presentation, the more my interest was piqued. It went so far that I took advantage of AMD’s sponsorship deal that bundled a Radeon graphics card with the game’s Premium edition. It was actually the only way for any of the current-gen GPUs to be a sensible purchase. Based on early benchmarks, it also was a necessary upgrade from the RTX 3060 to get enjoyable performance in this game. Although the performance at launch was still imperfect, it ran well most of the time outside major cities.

The game itself was, in many aspects, a typical Bethesda title, offering more of the gameplay loop that we all have come to enjoy. But Bethesda’s ambitions have been grand, vast as the depths of space, so they have added 1000 planets for players to explore. Could we be talking about delusions of grandeur instead?

Regarding scope, Bethesda’s games have always offered the player an enormous amount of content, worth several hundred hours if you want it. Starfield’s 1000 planets certainly have that potential, although I doubt it would be exhilarating. Worlds only contained a small, curated (maybe randomized) number of locations of interest, and traveling between them was… a perfect opportunity to listen to space podcasts. And even those locations existed primarily for looting purposes. If you love the gameplay loop, all the power to you. If I remember correctly, I only visited and explored planets during the events of a quest. Therefore, take my words with a grain of salt.

The big story questlines were where Starfield shined, not the number of planets or auxiliary game systems. Combined with the addictive Bethesda storytelling, looting, fantastic art design, and entertaining combat, I had a delightful time. However, it is essential to know what to expect from Starfield. It could have been a better space exploration game. But it was a terrific story-based first-person shooter with RPG elements, Bethesda-style.

Performance was a mixed bag, and it still depends on the hardware. Starfield prefers AMD graphics cards and Intel processors. Big cities like New Atlantis or Akila will murder low-end CPUs, and performance dips must be expected. Indoor areas ran well, whereas outdoor regions varied based on the location (dense forest vs. barren planet surface). However, not all graphics options must be cranked to eleven to enjoy the artwork.

Read More »

Starfield PC Technology Discussion (Linux + Windows Benchmarks)

In my game reviews, I usually include a section I call “The Nerdy Bits” to examine a game’s technology. I have decided to separate this content from the game review to keep the size manageable. My Marvel’s Midnight Suns review showed me how an expansive technology section can inflate the blog post and maybe even distract from discussing the gameplay, the content, and the story, or potentially deter and intimidate readers because of the total length.

(This blog post is dangerously close to 3000 words 😉.)

I firmly believe that technology is a crucial aspect of a video game. Still, sometimes, I can get carried away and focus too much on it. Other people may not be as interested in that or as curious as I am, and they prefer an overview of the gameplay and a brief summary of the visual fidelity.

For me, a badly running game can break the immersion. Take Elden Ring on the PlayStation 5, for example. My sister bought the game and thinks it runs fine, like many others who believe it to be the greatest thing since sliced bread. I took a 10-second look, turned the camera around once, and concluded it ran like crap, and I did not want to play this way. Playing for ten to fifteen more minutes solidified this initial perception. This technology discussion is for gamers like me who are also interested in the technical aspects of a video game and base their purchasing decisions on that.

With this explanation out of the way, let me discuss what I think of Starfield’s technology. I will touch on the art style, the visual fidelity and technology, audio, and performance on Windows and Linux.

Please note that this is not a Digital Foundry-level inspection. For that, click here and here.

Read More »

How To Execute PowerShell And Bash Scripts In Terraform

The first thing to know is what Terraform expects of the scripts it executes. It does not work with regular command line parameters and return codes. Instead, it passes a JSON structure via the script’s standard input (stdin) and expects a JSON structure on the standard output (stdout) stream.

The Terraform documentation already contains a working example with explanations for Bash scripts.

#!/bin/bash
set -e

eval "$(jq -r '@sh "FOO=\(.foo) BAZ=\(.baz)"')"

FOOBAZ="$FOO $BAZ"
jq -n --arg foobaz "$FOOBAZ" '{"foobaz":$foobaz}'
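Saved as, say, foobaz.sh (a name I made up), the documented example can be exercised outside Terraform by piping the query JSON in manually:

```shell
# Write the documented example to a file (hypothetical name) and feed it the
# JSON that Terraform's external data source would pass on stdin.
cat > foobaz.sh <<'EOF'
#!/bin/bash
set -e

eval "$(jq -r '@sh "FOO=\(.foo) BAZ=\(.baz)"')"

FOOBAZ="$FOO $BAZ"
jq -n --arg foobaz "$FOOBAZ" '{"foobaz":$foobaz}'
EOF

echo '{"foo": "Hello", "baz": "World"}' | bash foobaz.sh
# {
#   "foobaz": "Hello World"
# }
```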

I will replicate this functionality for PowerShell on Windows and combine it with the OS detection from my other blog post.

The trick is handling the input. There is a specific way to do it because Terraform calls your script through PowerShell, something like this: echo '{"key": "value"}' | powershell.exe script.ps1.

$json = [Console]::In.ReadLine() | ConvertFrom-Json

$foobaz = @{foobaz = "$($json.foo) $($json.baz)"}
Write-Output $foobaz | ConvertTo-Json

You access the C# Console class’ In property representing the standard input and read a line to get the data Terraform passes through PowerShell to the script. From there, it is all just regular PowerShell. The caveat is that you can no longer call your script as usual. If you want to test it on the command line, you must type the cumbersome command I have shown earlier.

echo '{"json": "object"}' | powershell.exe script.ps1
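On the Terraform side, the script is wired into the configuration through the external data source. A sketch under assumptions (the locals mirror the names that appear in the error output below; the query values are illustrative):

```terraform
locals {
  # Assumed OS detection result; see my other blog post for the real check.
  is_windows  = true
  shell_name  = local.is_windows ? "powershell.exe" : "bash"
  script_name = local.is_windows ? "ps-script.ps1" : "script.sh"
}

data "external" "script" {
  program = [
    local.shell_name, "${path.module}/${local.script_name}"
  ]

  # Key/value pairs passed to the script as a JSON object on stdin.
  query = {
    foo = "Hello"
    baz = "World"
  }
}

output "foobaz" {
  # The script's JSON output is exposed as a map of strings.
  value = data.external.script.result.foobaz
}
```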

Depending on how often you work with PowerShell scripts, you may bump into its execution policy restrictions when Terraform attempts to run the script.

│ Error: External Program Execution Failed
│
│   with data.external.script,
│   on main.tf line 8, in data "external" "script":
│    8:   program = [
│    9:     local.shell_name, "${path.module}/${local.script_name}"
│   10:   ]
│
│ The data source received an unexpected error while attempting to execute the program.
│
│ Program: C:\Windows\System32\WindowsPowerShell\v1.0\powershell.exe
│ Error Message: ./ps-script.ps1 : File
│ C:\Apps\Terraform-Run-PowerShell-And-Bash-Scripts\ps-script.ps1
│ cannot be loaded because running scripts is disabled on this system. For more information, see
│ about_Execution_Policies at https:/go.microsoft.com/fwlink/?LinkID=135170.
│ At line:1 char:1
│ + ./ps-script.ps1
│ + ~~~~~~~~~~~~~~~
│     + CategoryInfo          : SecurityError: (:) [], PSSecurityException
│     + FullyQualifiedErrorId : UnauthorizedAccess
│
│ State: exit status 1

You can solve this problem by adjusting the execution policy accordingly. The quick and dirty way is to allow all scripts as is the default on non-Windows PowerShell installations. Run the following as Administrator.

Set-ExecutionPolicy -ExecutionPolicy Unrestricted -Scope LocalMachine

This is good enough for testing and your own use. If you regularly execute scripts that are not your own, you should choose a narrower permission level or consider signing your scripts.

Another potential pitfall is the version of PowerShell in which you set the execution policy. I use PowerShell 7 by default but still encountered the error after applying the unrestricted policy. That is because the version executed by Terraform is Windows PowerShell 5.1. That is what Windows starts when you type powershell.exe in a terminal.

PowerShell 7.4.1
PS C:\Users\lober> Set-ExecutionPolicy -ExecutionPolicy Unrestricted -Scope LocalMachine
PS C:\Users\lober> Get-ExecutionPolicy
Unrestricted
PS C:\Users\lober> powershell
Windows PowerShell
Copyright (C) Microsoft Corporation. All rights reserved.

Install the latest PowerShell for new features and improvements! https://aka.ms/PSWindows

PS C:\Users\lober> Get-ExecutionPolicy
Restricted
PS C:\Users\lober> $PsVersionTable

Name                           Value
----                           -----
PSVersion                      5.1.22621.2506
PSEdition                      Desktop
PSCompatibleVersions           {1.0, 2.0, 3.0, 4.0...}
BuildVersion                   10.0.22621.2506
CLRVersion                     4.0.30319.42000
WSManStackVersion              3.0
PSRemotingProtocolVersion      2.3
SerializationVersion           1.1.0.1

Once you set the execution policy in the default PowerShell version, Terraform has no more issues.

A screenshot that shows the Windows Terminal output of the Terraform plan command.

And for completeness’ sake, here is the Linux output.

A screenshot that shows the Linux terminal output of the Terraform plan command.

You can find the source code on GitHub.

I hope this was useful.

Thank you for reading.

How To Detect Windows Or Linux Operating System In Terraform

I have found that Terraform does not have constants or functions to determine the operating system it is running on. You can work around this limitation with some knowledge of the target platforms you are running on. The most common use case is discerning between Windows and Unix-based systems to execute shell scripts, for example.

Ideally, you do not have to do this, but sometimes, you, your colleagues, and your CI/CD pipeline do not utilize a homogeneous environment.

One almost 100% certain fact is that Windows addresses storage devices with drive letters. You can leverage this to detect a Windows host by checking the project’s root path and storing the result in a variable.

locals {
  is_windows = length(regexall("^[a-z]:", lower(abspath(path.root)))) > 0
}

output "absolute_path" {
  value = abspath(path.root)
}

output "operating_system" {
  value = local.is_windows ? "Windows" : "Linux"
}

The output values are for demonstration purposes only. All you need is the regex for potential drive letters and the absolute path of the directory. Any path would do, actually.

The regexall function returns a list of all matches, and if the path starts with a drive letter, the resulting list contains more than zero elements, which you can check with the length function.
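The same pattern extends to other path prefixes. A sketch (these locals are my own illustration, not part of the original configuration):

```terraform
locals {
  # Heuristics: assume the checkout lives in a user's home directory,
  # which may not hold on CI/CD runners.
  is_linux = length(regexall("^/home/", abspath(path.root))) > 0
  is_macos = length(regexall("^/Users/", abspath(path.root))) > 0
}
```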

You could also check for “/home” to detect a Linux-based system or “/Users” for a macOS computer. In those instances, the source code must always be located somewhere in a user’s directory during execution. That may not be the case in a CI/CD pipeline, so keep that in mind. Here is the result on Windows.

A screenshot that shows the Windows Terminal output of the Terraform plan command.

And here on Linux.

A screenshot that shows the Linux terminal output of the Terraform plan command.

You can find the source code on GitHub.

I hope this was useful.

Thank you for reading.

Fedora 39/40 Switch Desktop Environment KDE Plasma To Gnome Shell

If you want to do the reverse operation and switch from Gnome Shell to KDE Plasma, I also have a blog post on that.

Replacing KDE Plasma on Fedora 39 requires only a couple of dnf and systemctl commands to convert the Fedora KDE spin into the Default Fedora Workstation with Gnome Shell. It might also work on earlier and later versions.

I have verified these steps on a fresh installation. Be sure to check the console output to avoid accidentally uninstalling any required software if you perform the desktop swap on a production system.

Start with upgrading all packages. It is generally a good idea when performing such a massive system change.

sudo dnf upgrade

Next, you change the type of the Fedora installation. This is required because Fedora uses package groups and protected packages. You allow removing the KDE package groups by swapping them with the Gnome package groups.

$ sudo dnf swap fedora-release-identity-kde fedora-release-identity-workstation

Last metadata expiration check: 0:19:04 ago on Tue 02 Jan 2024 08:37:17 AM CET.
Dependencies resolved.
==============================================================================
 Package                              Architecture  Version  Repository  Size
==============================================================================
Installing:
 fedora-release-identity-workstation  noarch        39-30    fedora      11 k
Installing dependencies:
 fedora-release-workstation           noarch        39-30    fedora      8.2 k
Removing:
 fedora-release-identity-kde          noarch        39-34    @updates    1.9 k
Downgrading:
 fedora-release-common                noarch        39-30    fedora      18 k
 fedora-release-kde                   noarch        39-30    fedora      8.2 k

Transaction Summary
==============================================================================
Install    2 Packages
Remove     1 Package
Downgrade  2 Packages

Total download size: 45 k
Is this ok [y/N]:

And the second command.

$ sudo dnf swap fedora-release-kde fedora-release-workstation

Last metadata expiration check: 0:20:04 ago on Tue 02 Jan 2024 08:37:17 AM CET.
Package fedora-release-workstation-39-30.noarch is already installed.
Dependencies resolved.
==============================================================================
 Package                              Architecture  Version  Repository  Size
==============================================================================
Removing:
 fedora-release-kde                   noarch        39-30    @fedora     0  

Transaction Summary
==============================================================================
Remove  1 Package

Freed space: 0  
Is this ok [y/N]:

Next, fetch the Fedora Workstation packages and dump them on your storage drive (omitting output for brevity).

sudo dnf group install "Fedora Workstation"

Now that the Gnome Shell packages are installed, disable SDDM and enable the GDM login manager on boot.

sudo systemctl disable sddm
sudo systemctl enable gdm

At this point, I would log out or reboot and log into the Gnome Shell.
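To verify which desktop session you landed in after logging back in, a quick sanity check (not part of the original steps) is the session environment:

```shell
# Prints the desktop environment of the current session, e.g. "GNOME" or "KDE".
echo "$XDG_CURRENT_DESKTOP"
```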

As the final step, you remove the KDE spin packages and the remaining stragglers.

sudo dnf group remove "KDE Plasma Workspaces"
sudo dnf remove *plasma*
sudo dnf remove kde-*
sudo dnf autoremove

Be careful not to mistype sudo dnf remove kde-*! If you instruct dnf to remove kde*, it will catch more packages than you would like.

That is all there is to turn the Fedora KDE spin installation into the default Fedora Workstation with the Gnome Shell.

Read More »

Fedora 39/40 Switch Desktop Environment Gnome Shell To KDE Plasma

If you want to do the reverse operation and switch from KDE Plasma to the Gnome Shell, I also have a blog post on that.

Replacing the Gnome Shell on Fedora 39 requires only a couple of dnf and systemctl commands to convert the default Fedora Workstation into the KDE spin. It might also work on earlier and later versions.

I have verified these steps on a fresh installation. Be sure to check the console output to avoid accidentally uninstalling any required software if you perform the desktop swap on a production system.

Start with upgrading all packages. It is generally a good idea when performing such a massive system change.

sudo dnf upgrade

Next, you change the type of the Fedora installation. This is required because Fedora uses package groups and protected packages. You allow removing the Gnome package groups by swapping them with the KDE package groups.

sudo dnf swap fedora-release-identity-workstation fedora-release-identity-kde

And the second command.

sudo dnf swap fedora-release-workstation fedora-release-kde

Next, fetch the KDE spin packages and dump them on your storage drive (omitting output for brevity).

sudo dnf group install "KDE Plasma Workspaces"

Now that the KDE packages are installed, disable GDM and enable the SDDM login manager on boot.

sudo systemctl disable gdm
sudo systemctl enable sddm

At this point, I would log out or reboot and log into the KDE session.

As the final step, you remove the Fedora Gnome packages and the remaining stragglers.

sudo dnf group remove "Fedora Workstation"
sudo dnf remove *gnome*
sudo dnf autoremove

That is all there is to turn the default Fedora Gnome installation into the Fedora KDE spin.

Read More »

My Year In Video Gaming 2023 – Game Of The Year And More

It is my third time doing a write-up of my gaming year. The third time’s the charm, right? Before I get into the games I played, I want to reflect on the year and comment on a few notable developments from it.

First of all, 2023 has been an unbelievable year when looked at just in terms of high-profile releases. There have been so many great titles I cannot possibly remember them all, and as you will see later, I barely even played any of them. Despite that exciting time for game consumers, countless layoffs have shaken the gaming industry. So many people lost their livelihoods because of what often was mismanagement or just greed. This Polygon article summarizes the situation. It is an excellent and somber read.

Another sad topic, although minor compared to the layoffs, is the quality of PC ports. Cynical voices may call it business as usual and no different from other years. Even if that were correct, it would not make it acceptable. This year’s worst offender is most likely Star Wars Jedi Survivor, an otherwise highly praised game. Although benchmarks generally show high framerates, the moment-to-moment experience is probably not always flawless. Please note that I cannot speak from experience; I have avoided this title because of its technical issues. From what I have gathered so far, the 30fps mode on consoles might be the most consistent and fluid experience of them all. That sounds wrong, does it not? Just before I published this blog post, Digital Foundry posted their worst PC ports of 2023 video, summarizing which games started as bad releases and were later fixed, and which are still bad.

(Guess which game is still in the latter category.)

Adding to the 2023 pile of sad topics, there is no way to get around the current GPU market, and the subpar price-to-performance ratio NVIDIA and AMD have graced us with. NVIDIA is greedy, and AMD does not know how to or does not want to take advantage of the situation. Looked at in isolation, the performance of available GPUs is good to crazy fast. But products do not exist in isolation, and last year’s models in the mid-range are barely slower. Vendors have plenty of stock now, but shopping for graphics cards is still not fun. New system builders are probably better off than upgraders – depending on the hardware age, of course.

Lastly, I need help understanding the buzz around Call of Duty. How can it be that this franchise is a top seller every year? The campaign is short, bombastic, and action-oriented at less than 10 hours, so people must be interested in the multiplayer component. But how does that warrant a 60-to-70-dollar purchase every year when offshoots like CoD Warzone exist as ongoing service games? In any case, I hope that Microsoft’s acquisition of Activision Blizzard King positively affects the company’s work culture.

Read More »

How To Add Microsoft Store Games To Steam

As a gamer, you likely prefer Steam as a game launcher over everything else, notably the Microsoft Store. Steam supports adding non-Steam games, but Microsoft makes it stupidly complicated to run their store content from anywhere else – at least if you do not like Windows shortcuts.

I wanted to add Gears of War 4 to Steam, a game only available in the Microsoft Store. Here is what I did and what should also work for other titles or applications.

First, there is no way around a shortcut. However, it is only temporary and serves as the starting point. If you are lucky, it is all you need. You can delete the shortcut after all is said and done.

Read More »

Upgrade Intel Core i5 12400F DDR4 to AMD Ryzen R5 7600 DDR5 – Worth It?

Intel’s Core i5 12400F was and still is a capable budget gaming CPU. When I bought this chip at the end of summer 2022, DDR5 memory was still costly, and the benefit in gaming was not worth the price by a long shot. In 2023, Intel’s 13th-gen CPUs benefit significantly from faster memory, and DDR5 prices have reached the equilibrium DDR4 was at last year. I could have simply slotted in a Core 13000 model, maybe even a 14000 variant, but I am too much of a tech enthusiast to ignore the performance I could be leaving on the table with DDR4.

As you will see, I probably would not have noticed the difference, except perhaps in the one game that triggered the upgrade thoughts in the first place. I recently took advantage of AMD’s Starfield bundles and received a new GPU with my game purchase. I knew of all the discussion around this game’s performance profile. Intel owns this game despite it being an AMD-sponsored title. Nevertheless, the 12400 had issues in CPU-heavy areas like New Atlantis.

(It appears that AMD or Bethesda forgot that AMD also makes CPUs, which is baffling since AMD makes the Xbox chips and Xbox owns Bethesda…)

Anyway.

The Ryzen 7600 should walk all over the 12400 with DDR4. The Intel chip is roughly equivalent to a Ryzen R5 5600X, and compared to that processor, the R5 7600 is 30% faster in games on average, according to Hardware Unboxed’s testing published on Techspot.

I performed several gaming benchmarks that compare the i5 12400F to the R5 7600 when paired with a Radeon RX 7900 XT.

Read More »