Take (Game) Screenshots On Linux Every X Minutes In Python

Almost four years ago, I explained how I automated taking screenshots of video games on Windows. I integrated this code into a GUI two years later to simplify starting and stopping the capture. It was not a pretty application, but it did the job. Since I resurrected my Linux experiment a while ago, gaming on Linux, and even general day-to-day use, has caught up to the point where Linux is a serious contender for daily driving. Therefore, I was looking for a way to automate taking a screenshot while I am gaming on Linux. My previous approach would not work since it relied on Windows’ infamous WinAPI, and Wine was not something I wanted to dabble with for something so small.

My initial idea was to use a Wayland API to do something low-level, but it seems this is impossible by design in the name of security. I wanted to substantiate this with links to official statements or documentation; however, I could only find user messages in various forums and StackOverflow-like services saying precisely what I just did, without providing a source.

The most viable solution I found was using DBus and calling into the XDG Desktop Portal to capture the screen. From my understanding, the desktop environment’s compositor, e.g., Gnome’s Mutter, implements this specification and serves the request.

The solution I present here is based on this StackOverflow response. All of the credit goes to that user. I added a bit of context and explanation in this blog post. Note that this is not a DBus tutorial, although I implicitly tackle some core concepts when explaining the code. I would direct you to the Freedesktop tutorial on DBus for a high-level overview. I am not a DBus specialist, and some aspects still elude me.

The complete example code is in my GitHub repository. I only show the bare minimum here for the explanations.
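To give you a flavor of the approach before the full walkthrough, here is a minimal sketch of the portal call from Python. It assumes the python-dbus and PyGObject packages are installed; the code in the repository is structured differently.

import dbus
from dbus.mainloop.glib import DBusGMainLoop
from gi.repository import GLib

# Attach the session bus to a GLib main loop so we can receive the portal's signals.
DBusGMainLoop(set_as_default=True)
loop = GLib.MainLoop()
bus = dbus.SessionBus()

def on_response(response, results):
    # 0 means success; the portal returns the URI of the saved screenshot.
    if response == 0:
        print("Screenshot saved at:", results["uri"])
    loop.quit()

# Subscribe to the Response signal before making the request.
bus.add_signal_receiver(on_response,
                        signal_name="Response",
                        dbus_interface="org.freedesktop.portal.Request")

portal = bus.get_object("org.freedesktop.portal.Desktop",
                        "/org/freedesktop/portal/desktop")
screenshot = dbus.Interface(portal, "org.freedesktop.portal.Screenshot")

# Empty parent window handle; whether a permission dialog appears depends on the portal backend.
screenshot.Screenshot("", {"interactive": dbus.Boolean(False)})
loop.run()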

Read More »

Black Myth: Wukong PC Technical Discussion (Linux + Windows Benchmarks)

The hype around this game may have subsided, but I thought it would still be fascinating to test Black Myth: Wukong on Linux and Windows and see how both operating systems fare. Game Science’s handy benchmark utility was a perfect tool for the task: I was not interested in realistic gameplay benchmarks, so a built-in canned benchmark suited me just fine. What started as a simple run of the benchmark with different settings turned into a discovery of some unexpected behaviors that piqued my curiosity.

Read More »

Horizon Forbidden West PC Technology Discussion (Linux + Windows Benchmarks)

Horizon Forbidden West was one of the most visually impressive titles on the PlayStation 4 and PlayStation 5. Two years later, the same holds true on the PC. Despite the lack of fancy raytracing features, Guerrilla’s Decima Engine still produces a stunning game world and character rendering. I gushed enough about the visuals when I wrote my PS4 Pro and Burning Shores expansion reviews, so I will not elaborate further on this topic here. Instead, I will focus on the performance aspects. If you are interested in seeing a lot of screenshots, please read my reviews. The PC version is essentially the PS5 version, with a slightly higher level of detail in the distance and some PC-specific improvements, as Digital Foundry discovered.

The most significant benefits of this PC version are the “unlimited” performance, unshackled from console hardware, and the free choice of input peripherals. I played with a keyboard and a mouse because of RSI issues in my thumbs, which operating a controller’s thumbsticks made worse. A mouse was also more accurate when aiming with the bow, but I would still have preferred a controller during traversal and combat. The proximity of all buttons to the fingers would have made coordinating the various skills and movement patterns much easier. Apart from that, the PC’s stalwart input methods worked very well and did not hold me back much. I made up for what I lost in convenience with comfort and precision.

Unlike other modern releases that devoured hardware for more or less good reasons, Horizon Forbidden West performed admirably. The YouTube channel eTeknix did an enormous amount of work testing 40 GPUs at three different resolutions. Nixxes did an excellent job making the game scalable, and Guerrilla’s initial work optimizing for the weaker consoles also paid off immensely. Even my former 3060 would have been enough to enjoy the Forbidden West with some help from DLSS upscaling.

Read More »

Starfield PC Technology Discussion (Linux + Windows Benchmarks)

In my game reviews, I usually include a section I call “The Nerdy Bits” to examine a game’s technology. I have decided to separate this content from the game review to keep the size manageable. My Marvel’s Midnight Suns review showed me how an expansive technology section can inflate a blog post, distract from discussing the gameplay, the content, and the story, or even deter and intimidate readers because of the total length.

(This blog post is dangerously close to 3000 words 😉.)

I firmly believe that technology is a crucial aspect of a video game. Still, I can sometimes get carried away and focus too much on it. Other people may not be as interested or as curious as I am and may prefer an overview of the gameplay and a brief summary of the visual fidelity.

For me, a lousy-running game can break the immersion. Take Elden Ring on the PlayStation 5, for example. My sister bought the game and thinks it runs fine, like many others who believe it to be the greatest thing since sliced bread. I took a 10-second look, turned the camera around once, and concluded that it ran like crap and that I did not want to play this way. Playing for ten to fifteen more minutes solidified this initial impression. This technology discussion is for gamers like me who are also interested in the technical aspects of a video game and base their purchasing decisions on them.

With this explanation out of the way, let me discuss what I think of Starfield’s technology. I will touch on the art style, the visual fidelity and technology, audio, and performance on Windows and Linux.

Please note that this is not a Digital Foundry-level inspection. For that, click here and here.

Read More »

How To Execute PowerShell And Bash Scripts In Terraform

The first thing to know is what Terraform expects of the scripts it executes. It does not work with regular command line parameters and return codes. Instead, it passes a JSON structure via the script’s standard input (stdin) and expects a JSON structure on the standard output (stdout) stream.

The Terraform documentation already contains a working example with explanations for Bash scripts.

#!/bin/bash
set -e

eval "$(jq -r '@sh "FOO=\(.foo) BAZ=\(.baz)"')"

FOOBAZ="$FOO $BAZ"
jq -n --arg foobaz "$FOOBAZ" '{"foobaz":$foobaz}'

I will replicate this functionality for PowerShell on Windows and combine it with the OS detection from my other blog post.

The trick is handling the input because Terraform calls your script through PowerShell, roughly like this: echo '{"key": "value"}' | powershell.exe script.ps1.

# Read the JSON object Terraform writes to standard input and parse it.
$json = [Console]::In.ReadLine() | ConvertFrom-Json

# Build the result and return it as JSON on standard output.
$foobaz = @{foobaz = "$($json.foo) $($json.baz)"}
Write-Output $foobaz | ConvertTo-Json

You access the .NET Console class’ In property, which represents the standard input, and read a line to get the data Terraform passes to the script through PowerShell. From there, it is all regular PowerShell. The caveat is that you can no longer call your script as usual. If you want to test it on the command line, you must type the cumbersome command I have shown earlier.

echo '{"json": "object"}' | powershell.exe script.ps1

Depending on how often you work with PowerShell scripts, you may bump into its execution policy restrictions when Terraform attempts to run the script.

│ Error: External Program Execution Failed
│
│   with data.external.script,
│   on main.tf line 8, in data "external" "script":
│    8:   program = [
│    9:     local.shell_name, "${path.module}/${local.script_name}"
│   10:   ]
│
│ The data source received an unexpected error while attempting to execute the program.
│
│ Program: C:\Windows\System32\WindowsPowerShell\v1.0\powershell.exe
│ Error Message: ./ps-script.ps1 : File
│ C:\Apps\Terraform-Run-PowerShell-And-Bash-Scripts\ps-script.ps1
│ cannot be loaded because running scripts is disabled on this system. For more information, see
│ about_Execution_Policies at https:/go.microsoft.com/fwlink/?LinkID=135170.
│ At line:1 char:1
│ + ./ps-script.ps1
│ + ~~~~~~~~~~~~~~~
│     + CategoryInfo          : SecurityError: (:) [], PSSecurityException
│     + FullyQualifiedErrorId : UnauthorizedAccess
│
│ State: exit status 1

You can solve this problem by adjusting the execution policy accordingly. The quick and dirty way is to allow all scripts, which is the default on non-Windows PowerShell installations. Run the following as Administrator.

Set-ExecutionPolicy -ExecutionPolicy Unrestricted -Scope LocalMachine

This is good enough for testing and your own use. If you regularly execute scripts that are not your own, you should choose a narrower permission level or consider signing your scripts.
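For example, RemoteSigned scoped to your user account is a sensible middle ground: it runs local scripts as they are but requires downloaded scripts to carry a valid signature.

Set-ExecutionPolicy -ExecutionPolicy RemoteSigned -Scope CurrentUser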

Another potential pitfall is the version of PowerShell in which you set the execution policy. I use PowerShell 7 by default but still encountered the error after applying the unrestricted policy. That is because Terraform executes Windows PowerShell 5.1, which is what Windows starts when you type powershell.exe in a terminal.

PowerShell 7.4.1
PS C:\Users\lober> Set-ExecutionPolicy -ExecutionPolicy Unrestricted -Scope LocalMachine
PS C:\Users\lober> Get-ExecutionPolicy
Unrestricted
PS C:\Users\lober> powershell
Windows PowerShell
Copyright (C) Microsoft Corporation. All rights reserved.

Install the latest PowerShell for new features and improvements! https://aka.ms/PSWindows

PS C:\Users\lober> Get-ExecutionPolicy
Restricted
PS C:\Users\lober> $PsVersionTable

Name                           Value
----                           -----
PSVersion                      5.1.22621.2506
PSEdition                      Desktop
PSCompatibleVersions           {1.0, 2.0, 3.0, 4.0...}
BuildVersion                   10.0.22621.2506
CLRVersion                     4.0.30319.42000
WSManStackVersion              3.0
PSRemotingProtocolVersion      2.3
SerializationVersion           1.1.0.1

Once you set the execution policy in the default PowerShell version, Terraform has no more issues.

A screenshot that shows the Windows Terminal output of the Terraform plan command.

And for completeness’ sake, here is the Linux output.

A screenshot that shows the Linux terminal output of the Terraform plan command.

You can find the source code on GitHub.

I hope this was useful.

Thank you for reading

How To Detect Windows Or Linux Operating System In Terraform

I have found that Terraform does not have constants or functions to determine the operating system it is running on. You can work around this limitation with some knowledge of your target platforms. The most common use case is distinguishing between Windows and Unix-based systems when executing shell scripts, for example.

Ideally, you do not have to do this, but sometimes, you, your colleagues, and your CI/CD pipeline do not utilize a homogeneous environment.

One almost 100% certain fact is that Windows addresses storage devices with drive letters. You can leverage this to detect a Windows host by checking the project’s root path and storing the result in a variable.

locals {
  is_windows = length(regexall("^[a-z]:", lower(abspath(path.root)))) > 0
}

output "absolute_path" {
  value = abspath(path.root)
}

output "operating_system" {
  value = local.is_windows ? "Windows" : "Linux"
}

The output values are for demonstration purposes only. All you need is the regex for potential drive letters and the absolute path of the directory. Any path would do, actually.

The regexall function returns a list of all matches, and if the path starts with a drive letter, the resulting list contains more than zero elements, which you can check with the length function.

You could also check for “/home” to detect a Linux-based system or “/Users” for a macOS computer. In those instances, the source code must always be located somewhere in a user’s directory during execution. That may not be the case in a CI/CD pipeline, so keep that in mind.
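A minimal sketch of that variant, with made-up local names, could look like this.

locals {
  # Hypothetical check: classify Unix-like hosts by where their home directories live.
  is_linux = length(regexall("^/home/", abspath(path.root))) > 0
  is_macos = length(regexall("^/Users/", abspath(path.root))) > 0
}

Here is the result on Windows.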

A screenshot that shows the Windows Terminal output of the Terraform plan command.

And here on Linux.

A screenshot that shows the Linux terminal output of the Terraform plan command.

You can find the source code on GitHub.

I hope this was useful.

Thank you for reading

Fedora 39/40 Switch Desktop Environment KDE Plasma To Gnome Shell

If you want to do the reverse operation and switch from Gnome Shell to KDE Plasma, I also have a blog post on that.

Replacing KDE Plasma on Fedora 39 requires only a couple of dnf and systemctl commands to convert the Fedora KDE spin into the default Fedora Workstation with Gnome Shell. It might also work on earlier and later versions.

I have verified these steps on a fresh installation. Be sure to check the console output to avoid accidentally uninstalling any required software if you perform the desktop swap on a production system.

Start with upgrading all packages. It is generally a good idea when performing such a massive system change.

sudo dnf upgrade

Next, you change the type of the Fedora installation. This is required because Fedora uses package groups and protected packages. You allow removing the KDE package groups by swapping them with the Gnome package groups.

$ sudo dnf swap fedora-release-identity-kde fedora-release-identity-workstation

Last metadata expiration check: 0:19:04 ago on Tue 02 Jan 2024 08:37:17 AM CET.
Dependencies resolved.
==============================================================================
 Package                              Architecture  Version  Repository  Size
==============================================================================
Installing:
 fedora-release-identity-workstation  noarch        39-30    fedora      11 k
Installing dependencies:
 fedora-release-workstation           noarch        39-30    fedora      8.2 k
Removing:
 fedora-release-identity-kde          noarch        39-34    @updates    1.9 k
Downgrading:
 fedora-release-common                noarch        39-30    fedora      18 k
 fedora-release-kde                   noarch        39-30    fedora      8.2 k

Transaction Summary
==============================================================================
Install    2 Packages
Remove     1 Package
Downgrade  2 Packages

Total download size: 45 k
Is this ok [y/N]:

And the second command.

$ sudo dnf swap fedora-release-kde fedora-release-workstation

Last metadata expiration check: 0:20:04 ago on Tue 02 Jan 2024 08:37:17 AM CET.
Package fedora-release-workstation-39-30.noarch is already installed.
Dependencies resolved.
==============================================================================
 Package                              Architecture  Version  Repository  Size
==============================================================================
Removing:
 fedora-release-kde                   noarch        39-30    @fedora     0  

Transaction Summary
==============================================================================
Remove  1 Package

Freed space: 0  
Is this ok [y/N]:

Next, fetch the Fedora Workstation packages and dump them on your storage drive (omitting output for brevity).

sudo dnf group install "Fedora Workstation"

Now that the Gnome Shell packages are installed, disable SDDM and enable the GDM login manager on boot.

sudo systemctl disable sddm
sudo systemctl enable gdm

At this point, I would log out or reboot and log into the Gnome Shell.
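To confirm which session you are in after logging back in, you can check the desktop environment variable; it should now report GNOME instead of KDE.

echo $XDG_CURRENT_DESKTOP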

As the final step, you remove the KDE spin packages and the remaining stragglers.

sudo dnf group remove "KDE Plasma Workspaces"
sudo dnf remove *plasma*
sudo dnf remove kde-*
sudo dnf autoremove

Be careful not to mistype sudo dnf remove kde-*! If you instruct dnf to remove kde* instead, it will catch more packages than you would like.

That is all it takes to turn the Fedora KDE spin installation into the default Fedora Workstation with the Gnome Shell.

Read More »

Fedora 39/40 Switch Desktop Environment Gnome Shell To KDE Plasma

If you want to do the reverse operation and switch from KDE Plasma to the Gnome Shell, I also have a blog post on that.

Replacing the Gnome Shell on Fedora 39 requires only a couple of dnf and systemctl commands to convert the default Fedora Workstation into the KDE spin. It might also work on earlier and later versions.

I have verified these steps on a fresh installation. Be sure to check the console output to avoid accidentally uninstalling any required software if you perform the desktop swap on a production system.

Start with upgrading all packages. It is generally a good idea when performing such a massive system change.

sudo dnf upgrade

Next, you change the type of the Fedora installation. This is required because Fedora uses package groups and protected packages. You allow removing the Gnome package groups by swapping them with the KDE package groups.

sudo dnf swap fedora-release-identity-workstation fedora-release-identity-kde

And the second command.

sudo dnf swap fedora-release-workstation fedora-release-kde

Next, fetch the KDE spin packages and dump them on your storage drive (omitting output for brevity).

sudo dnf group install "KDE Plasma Workspaces"

Now that the KDE packages are installed, disable GDM and enable the SDDM login manager on boot.

sudo systemctl disable gdm
sudo systemctl enable sddm

At this point, I would log out or reboot and log into the KDE session.

As the final step, you remove the Fedora Gnome packages and the remaining stragglers.

sudo dnf group remove "Fedora Workstation"
sudo dnf remove *gnome*
sudo dnf autoremove

That is all it takes to turn the default Fedora Gnome installation into the Fedora KDE spin.

Read More »

Fedora Linux 35 Beta Install NVIDIA Driver (Works on Fedora 36 Too)

This is a quick one because the installation works in the same way as it did in Fedora 34.

First, I added the RPM Fusion repositories as described here.

sudo dnf install \
  https://download1.rpmfusion.org/free/fedora/rpmfusion-free-release-$(rpm -E %fedora).noarch.rpm
sudo dnf install \
  https://download1.rpmfusion.org/nonfree/fedora/rpmfusion-nonfree-release-$(rpm -E %fedora).noarch.rpm

Next, I installed the akmod-nvidia package as explained on this page.

sudo dnf update
sudo dnf install akmod-nvidia

One reboot later, the NVIDIA module was up and running.

$ lsmod | grep nvidia
nvidia_drm             69632  4
nvidia_modeset       1200128  8 nvidia_drm
nvidia              35332096  408 nvidia_modeset
drm_kms_helper        303104  1 nvidia_drm
drm                   630784  8 drm_kms_helper,nvidia,nvidia_drm
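If you also want to know which driver version the akmod built, modinfo can tell you without installing anything extra.

modinfo -F version nvidia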

For completeness: my computer has an NVIDIA GT 1030.

I hope this helped you, and thank you for reading.

KDE Plasma Remap Meta/Windows Key From App Launcher to KRunner

Sometimes I want to run applications that I do not have pinned to the quick-launch bar of my operating system/desktop environment of choice. To do that, I am used to pressing the Windows/Meta key, typing a few characters, and hitting Enter. This is muscle memory and hard to get rid of. It does not matter much which UI opens, but I do not need the full-blown KDE Application Launcher, Gnome Shell, or Windows Start Menu. The amount of UI that pops up and changes while searching for the app is distracting.

Therefore, I wondered whether I could remap the Meta/Windows key from opening the Application Launcher to opening KRunner. And you can, but only on the command line.

Remove the key mapping from the Application Launcher.

kwriteconfig5 --file kwinrc --group ModifierOnlyShortcuts --key Meta ""

Open KRunner instead.

kwriteconfig5 --file kwinrc --group ModifierOnlyShortcuts --key Meta "org.kde.krunner,/App,,toggleDisplay"

Apply the changes to the current session.

qdbus org.kde.KWin /KWin reconfigure
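If you ever want to undo the change, point the shortcut back at the Plasma launcher and reconfigure KWin again. The value below should be the stock mapping, but I cannot guarantee it matches every Plasma version.

kwriteconfig5 --file kwinrc --group ModifierOnlyShortcuts --key Meta "org.kde.plasmashell,/PlasmaShell,org.kde.PlasmaShell,activateLauncherMenu"
qdbus org.kde.KWin /KWin reconfigure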

I hope this helps you. Thank you for reading.

Qt6 QtCreator Crash After Install on Ubuntu 21.04

While hopping Linux distributions, I arrived at Ubuntu 21.04, and one of the first things I always do is install Qt manually. I have described the process in a previous blog post on Linux Mint, and it is the same for Ubuntu, except for a tiny detail: on Ubuntu, the bundled QtCreator immediately crashes and triggers a "Send Diagnostic" dialog.

$ /opt/Qt/Tools/QtCreator/bin/qtcreator
qt.qpa.plugin: Could not load the Qt platform plugin "xcb" in "" even though it was 
found. This application failed to start because no Qt platform plugin could be 
initialized. Reinstalling the application may fix this problem.

Available platform plugins are: eglfs, linuxfb, minimal, minimalegl, offscreen, vnc, 
xcb.

The fix is simple.

sudo apt install libxcb-xinerama0

I hope this helps. Thank you for reading.

Customize Nautilus Default Bookmarks

Nautilus is the default file manager in basically all Gnome-based distributions. Given that wide adoption, I wonder why I cannot configure the default bookmarks in the left panel through a context menu. Is there no demand?

I managed to achieve my goal by editing two files: one for the user and one for the system. I have not tried multiple user accounts, but I assume the system file affects everyone who uses the computer.

I wanted to remove "Desktop", "Public", "Templates", and "Videos" because I never need them. I also ended up changing the locations of "Documents", "Music", and "Pictures" to point to their respective OneDrive equivalents. That saves me from creating symbolic links, as I explained in one of my OneDrive posts.

First, the user file.

vim ~/.config/user-dirs.dirs

#XDG_DESKTOP_DIR="$HOME/Desktop"
XDG_DOWNLOAD_DIR="$HOME/Downloads"
#XDG_TEMPLATES_DIR="$HOME/Templates"
#XDG_PUBLICSHARE_DIR="$HOME/Public"
XDG_DOCUMENTS_DIR="$HOME/OneDrive/Files"
XDG_MUSIC_DIR="$HOME/OneDrive/Music"
XDG_PICTURES_DIR="$HOME/OneDrive/Pictures"
#XDG_VIDEOS_DIR="$HOME/Videos"

Next, the system file. If you only remove entries from the user file, they will be added again the next time you log in. My tests showed that it is enough to customize the location in the user file. The other way around does not work, however.

sudo vim /etc/xdg/user-dirs.defaults

#DESKTOP=Desktop
DOWNLOAD=Downloads
#TEMPLATES=Templates
#PUBLICSHARE=Public
DOCUMENTS=Files
MUSIC=Music
PICTURES=Pictures
#VIDEOS=Videos
# Another alternative is:
#MUSIC=Documents/Music
#PICTURES=Documents/Pictures
#VIDEOS=Documents/Videos

Finally, you need to log out and log in again for this change to take effect.

I hope this helps. Thank you for reading.

OneDrive Sync On Linux Part 3, With abraunegg/onedrive As Daemon

In a previous blog post, I showed another way of syncing OneDrive folders on Linux as an alternative to using RCLONE: the open-source project “onedrive” by GitHub user “abraunegg” (a fork of an abandoned project by user “skilion”). One thing I was having trouble with was the installation as a daemon, so I used an @reboot crontab workaround to achieve my goal instead. However, I was not satisfied, so I went back to the documentation to see if I had missed something. And miss I did. In my defense, the other instructions I had tried omit a necessary detail required to make it work.

I have mentioned the installation in the other post, but I also left out a thing or two that I came across. That is why I will include the setup process again, this time in more detail, and refer you to the other blog post for configuration tips. That is the part I will skip here.

My test system is the same Fedora 34 distribution, and I have also tested the steps on Pop!_OS, which means they should work on other Ubuntu derivatives as well.

Read More »

OneDrive Sync On Linux Part 2, With abraunegg/onedrive

Edit: There is a part 3 that solves the daemon problem.

It has been about a year since my first blog post about syncing Microsoft’s OneDrive cloud storage on Linux. The last time around, I used RCLONE, which required a more hands-on approach. I have found a new tool that I think is better because it can sync automatically in the background without scripting or manual hacking. It is aptly called onedrive, and you can find it on GitHub.

Its name might suggest that Microsoft finally ported their Windows and Mac clients to Linux, but, unfortunately, that is not the case. I would still like to see this happen, and if there is ever a time for Microsoft to do it, it is probably now.

Let me briefly explain how I have installed and configured the onedrive tool to suit my needs. Thanks to good default values, it is straightforward, and you might not need any configuration at all.

(I wonder how I managed not to find this tool a year ago.)

Read More »

OpenRGB – An RGB Software I Want to Use (It Runs on Linux!)

If you are in the market for anything gaming PC or gaming laptop related, chances are you have come across the industry-wide trend of RGB-illuminated hardware and peripherals. Everything is RGB, from the graphics card to the RAM to your headset (because you can see the lights when you wear it 🙄), and much, much more. I am not against RGB lighting per se, but if you follow the industry as a PC hardware enthusiast, it is evident that in some respects, this has gone too far.

Quick side note: after a rant about RGB software, I will show examples of using OpenRGB on Windows and Linux. If you are interested in only that, skip the rant and scroll to the bottom.

Read More »