Sunday, November 28, 2021

Controller Button Layout

The PlayStation controller, especially the DualShock 4 from the PS4, is my preferred controller.  It has a typical four-action-button layout that uses symbols to identify each button.  




I've developed muscle memory for what I expect each button to do, which makes switching between PlayStation and Nintendo games a bit of a pain since they have the confirm/decline buttons swapped from each other.  Fortunately, each lets you remap the button behavior.  While this works for system settings, each game might have a different behavior assigned to each button, so I normally try to map them all to a consistent behavior.  My typical button configuration is as follows:

△ - heavy hit/special attack/switch characters

O - normal hit

X - jump/OK/confirm

▢ - cancel/decline/action

R1 - dash/evade

Controllers from Nintendo, Microsoft (Xbox) and Google (Stadia) use letters (A, B, X, Y) for their buttons.  While Xbox and Stadia use the same button layout, Nintendo has a different one, which means that I never remember which is the A & B and which is the X & Y.



Xbox / Stadia:

    Y
X       B
    A

Nintendo:

    X
Y       A
    B


Monday, November 22, 2021

Cleaning and Removing Mold with Borax

This is a reminder post for myself on the mixture for using Borax to clean and remove mold.

  • 1 cup of borax
  • 1 gallon of hot water
Mix the borax with the water and spray on the affected area; scrub, wipe and dry the area.  Do another light spray on the area and let it dry on its own.

Sunday, November 21, 2021

Thoughts on Shang-Chi and the Legend of the Ten Rings


Shang-Chi and the Legend of the Ten Rings got a lot of positive buzz, so I watched it when it came out on Disney+.  As expected of a superhero movie in the MCU, it delivers its share of action and comedy.  It is notable as a movie centered on a hero of Asian descent, with a story line that stays within an Asian sphere and a predominantly Asian cast.  The film aimed to respect Chinese culture and avoided the use of outdated stereotypes.  Multiple articles have pointed out the cultural significance of this movie.

Shang-Chi will be most identifiable to Asian-Americans (or those of Asian descent who grew up in Western cultures) because the film still looks at Asian culture through a Western perspective.  For example, although respectful in its view of the philosophy and people of a hidden village that is meant to represent traditional Chinese culture (even though technically the people are aliens), there are many mannerisms that are clearly Western (hugging as a greeting).  But that is still reflective of the Asian-American experience, so it's not something to criticize. 

While I enjoyed the movie for what it is, a superhero film, I also felt it was too rushed and wasn't able to imbue Shang-Chi with the inner qualities of a hero before the movie ended.  Instead, Shang-Chi is the superhero of the movie because he got super powers, used those powers and won a fight against the big baddie.

As a person, Shang-Chi never overcame blaming his father nor displayed the empathy and support that those around him showed him.  While he is angry at his father for making him train in the martial arts, the movie made it very clear that his father was not cruel and did not raise him through fear as Thanos did with his own "children".  The movie also made it clear that his father made a concerted effort to atone for his past, and while that was recognized by Shang-Chi's mother, Shang-Chi does not attempt to consider it even though he is grown up and has had distance to reflect.  I'm not saying that the father is absolved of his wrongdoings, and he did fall back to his old ways, but this is a movie about Shang-Chi and not about his father.

Instead, Shang-Chi goes from saying that having to kill made him run away from his father to saying that he must kill his father, in the same scene.  I found this lack of personal accountability and this blaming of others especially surprising, as that difference is what was shown to separate Captain America and the Patriot in The Falcon and the Winter Soldier.

Ultimately, this made me a little unsure about the character of Shang-Chi.  As a movie, I like the actors, the cinematography, the respect it showed Asian culture, as well as the action and comedy that the MCU brings.  I do look forward to seeing more and hope that they will take more time in the next movie to develop the real "hero" aspects of Shang-Chi.

Upgrading to Fedora 35

Used the Fedora DNF upgrade method to go from Fedora 34 to 35 and didn't notice any issues with the upgrade.  I decided to do the upgrade even though I upgraded to Fedora 34 just a couple of weeks ago; since I had some free time this week, I'd rather upgrade now while I have time to deal with any issues.  
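For my own reference, the steps I use for the DNF upgrade method are roughly as follows (this is the documented dnf-plugin-system-upgrade flow; adjust --releasever as needed):

```shell
# Fully update the current release, then download and apply the new one.
sudo dnf upgrade --refresh
sudo dnf install dnf-plugin-system-upgrade
sudo dnf system-upgrade download --releasever=35
sudo dnf system-upgrade reboot   # reboots and performs the upgrade offline
```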

Tuesday, November 2, 2021

Upgrading to Fedora 34

Although Fedora 35 just came out, I decided to first upgrade to Fedora 34 and give Fedora 35 a little bit of time to bake.  I've had good success using the regular Fedora DNF upgrade method, but this is the first time I've upgraded Fedora on my current machine, since it was a new system and had only ever had Fedora 33 installed.

Didn't run into any immediate issues as far as I can tell.

Monday, November 1, 2021

Unable to access the internet when using PiHole?

PiHole added rate-limiting for DNS queries with a very low default of 1000 queries per minute, enabled even when updating an existing installation.  To change the rate limit (or turn it off), edit /etc/pihole/pihole-FTL.conf and add/edit the line: 

RATE_LIMIT=1000/60
The format is [# of queries]/[seconds] so to set a limit of 1000 queries per hour would be 1000/3600.
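As a concrete sketch (RATE_LIMIT is the pihole-FTL.conf setting in question; the values here are examples):

```shell
# Allow 1000 queries per hour per client, then reload FTL:
echo 'RATE_LIMIT=1000/3600' | sudo tee -a /etc/pihole/pihole-FTL.conf
sudo pihole restartdns

# To turn rate limiting off entirely, use:
#   RATE_LIMIT=0/0
```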

Friday, April 9, 2021

Keep Go Module Directory Clean with GOMODCACHE

Go makes downloading projects and their dependencies very easy.  In the beginning there was go get, which downloads a project's source code and its dependencies to $GOPATH/src.  With modules, all the dependencies are downloaded to $GOPATH/pkg/mod.  The ease of downloading and the lack of management controls in the go command mean that it is easy for the two directories to grow in size and to lose track of which project led to the download of a particular package.

I recently started to play around with the Fyne UI toolkit.  I didn't initially know what other packages it would download, so I wanted to have Fyne and its dependencies in their own area.  The go command has a -pkgdir flag that is shared by several of its subcommands.

The build flags are shared by the build, clean, get, install, list, run, and test commands:


-pkgdir dir

install and load all packages from dir instead of the usual locations. For example, when building with a non-standard configuration, use -pkgdir to keep generated packages in a separate location.

This didn't work as I expected; it didn't seem to do anything at all.  Using the command

go build -pkgdir /tmp

resulted in all the downloaded packages still going to $GOPATH/pkg/mod.

What did work (thanks to seankhliao) is setting the GOMODCACHE variable, which sets not just the cache location but also the package location:

GOMODCACHE=/tmp go build

All the downloaded dependency packages will now go to /tmp rather than $GOPATH/pkg/mod.  
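A sketch of how I might isolate an experiment like the Fyne one (the cache path is just an example):

```shell
# Keep this experiment's dependencies in their own module cache:
export GOMODCACHE="$HOME/sandbox/fyne-modcache"
go build ./...

# Verify where the go command now thinks the cache is:
go env GOMODCACHE

# The cache is written read-only, so remove it through the toolchain:
go clean -modcache
```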

Honestly, I'm not really sure what -pkgdir is supposed to do.  Maybe it is only for artifacts that the build command generates?  What does it do when used with go get?

Wednesday, April 7, 2021

Local Go module file and Go tools magic

I really value that when working with Go there is no "hidden magic" in source code.  Go source code is essentially WYSIWYG.  You don't see decorators or dependency injection that might change behavior after the code is compiled or run, which would require you to not only understand the language and syntax but also learn additional tools' effects on the source code.  While this is true of the language, it is not true of the go command and Go's module system.

I've personally found Go modules to be more confusing than the original GOPATH.  I understand that it solves some of the complaints about GOPATH and also addresses the diamond dependency problem, but it adds complexity to the developer workflow and under-the-hood magic.  Maybe that's to be expected when it goes beyond source code management and adds a whole package management layer on top, but I'd be much happier to deal with this added complexity and burden if the solution were complete (how about package clean-up so my mod directory isn't growing non-stop?)!

Modules add the go.mod file, which tracks all of a project's dependencies and their versions.  This introduces a problem when one is developing both applications and libraries, since the developer may have both the released production version and an in-development version of a library locally.  To point your application at the local library without constantly changing the import path in source code, the replace directive can be used.  But when committing the code, it is not ideal to submit the go.mod with the replace directives in it, as they will likely break the build for someone else checking out the code and can expose private data (the local path might contain the user name).

Now developers have to add the replace directives locally, remove them right before submission and then put them back (without typos!).  Fortunately, in Go 1.14, the go commands (build, clean, get, install, list, run, and test) got a new flag, -modfile, which lets developers point at an alternative go.mod file.  This allows the production go.mod file to go unmodified during development/debugging, with a local dev version of go.mod that can be excluded from being committed (i.e. .gitignored).  

This can be done on a per-project level by adding -modfile=go.local.mod to go [build | clean | get | install | list | run | test]:

go build -modfile=go.local.mod main.go

Note that whatever the file name is, it still has to end in .mod, since the tool derives the name of the local go.sum file from the local mod file by renaming the extension from .mod to .sum.

To apply the use of go.local.mod globally, update "go env":

go env -w GOFLAGS=-modfile=go.local.mod

go env -w will write the -modfile value to where Go looks for its settings:

Defaults changed using 'go env -w' are recorded in a Go environment configuration file stored in the per-user configuration directory, as reported by os.UserConfigDir.

So the flow that Jay Conrad pointed out in this bug thread would be as follows:

  1. Copy go.mod to go.local.mod. 
  2. Add go.local.mod to .gitignore.
  3. Run go env -w GOFLAGS=-modfile=go.local.mod. This tells the go command to use that file by default.
  4. Add any replace and exclude directives or other local edits.
  5. Before submitting and in CI, make sure to test without the local file: go env -u GOFLAGS or just -modfile=. 
  6. Probably also go mod tidy.
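That flow sketched as shell commands (the example.com module path and the ../mylib checkout are hypothetical):

```shell
# Steps 1-3: create the local mod file, ignore it, make it the default.
cp go.mod go.local.mod
printf 'go.local.mod\ngo.local.sum\n' >> .gitignore
go env -w GOFLAGS=-modfile=go.local.mod

# Step 4: local-only edits, e.g. a replace directive to a local checkout.
go mod edit -replace=example.com/mylib=../mylib

# Steps 5-6: verify against the real go.mod before submitting.
go build -modfile= ./...   # an empty -modfile overrides the GOFLAGS default
go mod tidy -modfile=
```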

Tuesday, April 6, 2021

Listing installed packages on Fedora with DNF

To list the packages that are user installed:

dnf history userinstalled

To list all installed packages:

dnf list installed
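When rebuilding a machine, I like to snapshot those lists to files first (the file names are just examples):

```shell
# Save both lists for later reference:
dnf history userinstalled > userinstalled.txt
dnf list installed > all-installed.txt
```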

Saturday, March 27, 2021

ASUS PN50 4300U as a portable system

I needed a computer for a remote location.  It would be used only for short periods of time a few times a year, so it is not worth investing in a high-end system, but it still needs to be powerful enough for my work (including basic gaming for the kids).  In more normal times, I might simply build a basic system for about $500, but the shortage of electronic components means many items are simply not available or have skyrocketed in price, so even low-end systems now cost too much to build to be worth it.

I considered getting a laptop but decided that a mini-PC fits this need better.  I am able to use it at home but can easily transport it to the remote location when needed.  My experience with the Asus PN50 has been very good, so I decided to go with it again.  This time I opted for the lowest-end model, which uses the Ryzen 4300U with 4 cores/4 threads running at 2.7 GHz (base)/3.7 GHz (boost).  I went with just a single stick of 8 GB 3200MHz Crucial SODIMM and a 500GB M.2 NVMe SSD.

For the monitor, I got a Lepow Z1-Gamut (2021) portable USB-C monitor.  It is easy to transport but also able to use at home.

  • $330 (PN50) 
  • $75 (storage)  
  • $46 (memory)
  • $160 (monitor)

Total cost came to $611 with no OS.  The price is reasonable given the current state of the world and the portability it provides.  Note that no keyboard or mouse is included, so you'll need to provide your own.  I'm using it with a Logitech K400 Plus wireless keyboard that has a track pad built in.  The keyboard is light and portable as well.

I had no problem setting up my previous PN50 with Windows and Linux, but I did run into an issue with this particular unit when installing Windows 10.  During the installation, the lower part of the monitor was distorted.  I tried this on both the Lepow and on an existing monitor that I know works.  The problem appeared on both.  I was still able to see the menu and install choices, and the problem went away once the installer rebooted into the configuration screen.  

Initially, Windows complained that it couldn't install to the drive, and I worried that either the drive or the memory was bad.  The ASUS BIOS doesn't normally show memory, and it doesn't make clear what drives it sees, so it wasn't helpful in determining where the problem was.  I just took everything out and re-inserted everything.  This time, the installation worked.

After Windows finished installing, I couldn't access the internet through the Ethernet connection to download anything, and Windows itself couldn't get any updates or drivers.  Ping/tracert from the command line worked, but apps that tried to access the internet did not.  I searched on Google, but none of the solutions I found worked for me.

I had to use WiFi, which worked to get all the updates, drivers and patches.  Once those were all installed, the Ethernet connection worked.  With Linux, I didn't experience this problem.

The Lepow monitor worked through both USB-C and HDMI.  Using USB-C, the video and power can be handled with a single cable between it and the PN50.  A problem occurred when it started to display something: there was a whining sound coming from it, but it goes away if the brightness is turned up to 100% (the default setting is 30% - 50%).  Besides this issue, everything else worked as expected.

I'm not really sure why this PN50 was more troublesome to install.  Besides the CPU, the only other difference from the PN50s I already own is the BIOS version, so it is possible that the drivers on the Windows installation tool don't match this PN50 or its BIOS version.  I wish the BIOS settings were more descriptive, but fortunately everything ended up working and I didn't have to exchange any of the parts.

Boot time has been surprisingly slow from when the power button is pushed to when the Windows spinner shows up.  I can play Genshin Impact at medium graphics settings and run Tomb Raider at normal settings.  Although I don't think another 8 GB will help much for running games on this CPU, it might help with running multiple apps in Windows, so I'm thinking of getting another 8 GB to fill out the memory slot; then hopefully I'll never have to open up the case again!  Update:  I ended up getting another 8 GB, so it's now running 16 GB of memory.

The Ryzen CPU supports up to 3200 MHz memory and that's what I got, but based on some benchmarks that others have published, the 4300U doesn't seem to benefit much from it vs 26xx MHz memory.  In this case, the 3200 was actually a few dollars cheaper than the 26xx ones, so I went with 3200 anyway.

Sunday, January 17, 2021

My Systems (2021)

Updated 5/21/2023 with Beelink system

2021 brings upgrades to the computers in the house, which have been fairly static over the past 7-8 years.  I got a couple of new systems and repurposed some parts from the old systems, so this post is mainly to inventory the new configurations for my own reference.

System 1 (Asus PN50 4800U) [replaces system 3]

  • Ryzen 7 4800U [Zen2] (8 cores / 16 threads, base clock 1.8GHz, max 4.2GHz - 8 GPU cores - RX Vega 8, 15W)
  • 32 GB Crucial DDR4 3200MHz RAM (2x16GB)
  • 1TB Samsung 970 EVO Plus (M.2 NVMe interface) SSD
  • 500GB Crucial MX500 SATA SSD (2.5")
  • Intel WIFI 6, BT 5.0
  • 2 Dell U2311H (existing) monitors
  • 1 Dell U2421HE (24" 1080p) monitor
  • Dell AC511M soundbar
  • Unicomp Ultra Classic keyboard (2009)
  • Logitech Wheel Mouse

The PN50 replaced my trusty Shuttle DS87 as my daily driver.  All the components are new except for the two Dell U2311H monitors, keyboard and mouse.  I added a third monitor, Dell U2421HE, because it has ethernet and a USB-C interface that can connect with the PN50 for DisplayPort, USB hub and ethernet.  This lowered the number of cables between the PN50 and peripherals and reduced clutter on my desk.

Despite its small size, the PN50 is still very high performing, but you do pay a price premium for something small AND fast.  I did lose some connectivity (fewer USB ports and only 1 Ethernet port).  The USB ports can be addressed with a hub, and I use the monitor's built-in Ethernet port to make up for the one I lost since the PN50 only has one.

The Dell AC511M is a USB soundbar and can be attached to the monitor stand (not to the monitor itself).  It draws its power from the USB connection but I found three flaws with it: 
  1. It has no power button so it turns on when the PC turns on.  To use the audio-in jack and the speaker means the PC must be turned on. 
  2. The speaker has a hiss to it like many speakers but with no power button the hiss is always there.  I had to plug something into the headphone jack so I don't hear it.
  3. When something is plugged into the audio-in jack, no audio goes through USB.  If there are two audio sources (e.g. PC and music player), they need to share a connection.  I have two PCs connected to the monitor (one on DisplayPort and one on HDMI) and I can't have one play through USB and one through the audio-in without plugging and unplugging the audio-in cable.  Instead, I run a cable from the monitor's audio-out to the soundbar's audio-in, and each machine plays through the DP/HDMI outputs.
My ideal system would still be something like the DS87 housing with a Ryzen 4700G (or 5700G), but those CPUs aren't readily available and there are no DS87-like small form factor cases for them.  Update:  The day after I posted this, Shuttle announced this exact PC.  ^_^;  With my new monitor, though, I would like the new Shuttle to have a USB Type-C connection, but at least if I want more power I know the option is now out there!

System 2 (Asus PN50 4500U) (replaces system 4)

  • Ryzen 5 4500U [Zen2] (6 cores / 6 threads, base clock 2.3GHz, max 4.0GHz - 6 GPU cores - RX Vega 6, 15W)
  • 2x 8GB 3200 DDR4 so-dimm by SK hynix
  • Intel 660p Series m.2 500GB SSD
  • Intel WI-FI 6 (GIG+) + BT 5.0
  • *Crucial 128GB m4 2.5" SSD

This system replaces my wife's Shuttle XH61 system and is an upgrade across the board over its predecessor.

System 3 (Shuttle DS87)

  • Shuttle PC DS87
  • Intel Core i7-4790S Processor (4 cores / 8 threads, 8M Cache, base clock 3.2 GHz, max 4.0GHz, 65W)
  • Samsung 850 EVO 500GB 2.5-Inch SATA III Internal SSD (MZ-75E500B/AM)
  • 2 x Crucial 16GB Kit (8GBx2) DDR3 1600 MT/s (PC3-12800)
  • *Intel Network 7260.HMWG WiFi Wireless-AC 7260 H/T Dual Band 2x2 AC+Bluetooth HMC
  • *Samsung 840 EVO Series 120GB mSATA3 SSD

This Shuttle had been my reliable daily driver for over 6 years running Linux.  I repurposed a Samsung SSD and an Intel wireless card from my Asus VivoMini to install Windows and add WiFi and Bluetooth to the system.  The antennas that were in the VivoMini were hard to extract, so I took the antennas from an old ASUS Chromebook laptop that wasn't being used anymore.  

The VivoMini was being used for the kids' remote/distance learning but was a bit under-powered for some of the video conferencing features, so this system will now take its place.

System 4 (Shuttle XH61)

  • Intel Core i7-2600S Processor (4 cores / 8 threads, 8M Cache, base clock 2.8 GHz, max 3.8GHz, 65W)
  • *Seagate 300GB 7200RPM HDD (replaced the Crucial MX500 CT500MX500SSD1 500GB 2.5in SATA 6Gbps SSD that moved to System 2)
  • TP-Link USB WiFi Adapter for Desktop PC, AC1300Mbps USB 3.0 WiFi Dual Band Network Adapter with 2.4GHz/5GHz High Gain Antenna, MU-MIMO
  • 8GB RAM

This system was originally put together in 2012 (with an SSD), and even in 2020 it was a perfectly good system for most tasks.  Running Windows 10 or some basic games (Minecraft, Don't Starve), it still felt pretty snappy.  I wouldn't try running any graphics-intensive games on it.  

The SSD from this system was moved to the PN50-4500U (system 2) and replaced with a 2.5" Seagate 300GB 7200RPM hard disk drive that I pulled out of the same Chromebook laptop I took the antennas from.  After switching to the mechanical disk drive, the system felt noticeably sluggish.  A solid state drive makes a big difference!  

I'm keeping this system around for schooling.

System 5 (ASUS PN50 4300U)

  • Ryzen 3 4300U [Zen2] (4 cores / 4 threads, base clock 2.7GHz, max 3.7GHz - 5 GPU cores - RX Vega 5, 15W)
  • 16 GB Crucial (CT8G4SFRA32A) DDR4 3200MHz RAM (2x8 GB)
  • 500GB Samsung 970 EVO Plus (M.2 NVMe interface) SSD

This system is meant to be a more portable system for when I'm working at another location.  I paired it with a portable monitor rather than getting a laptop, since I don't need a mobile system but one that I can easily transport.

System 6 (Beelink SER5)

This latest addition was added in 2023 as a secondary gaming PC.  The specs are decent given the price of under $300 including Windows.

  • Ryzen 5 5500U (6 cores / 12 threads, base clock 2.1GHz, max 4.0 GHz, 7-core GPU @ 1800 MHz, 15W TDP)
  • 16 GB DDR4
  • 500GB NVME M.2 SSD
  • WiFi 6
  • BT 5.2
The system came with Windows 11 Pro.


The ASUS VivoMini UN62 is a wonderfully small and quiet barebones system with very good build quality.  It was this system that gave me the confidence to get the ASUS PN50.  I actually own 3 of these systems and use them for different purposes, which have changed over time (e.g. media station, always-on Minecraft server, etc.).  More recently, however, the Raspberry Pi 4 has replaced the VivoMinis for some of the tasks.

The specs for my UN62s are:
  • Intel i3-4030U (2 cores / 4 threads, 1.9 GHz, 3 MB cache, 15W)
  • 16GB Crucial (2x8 GB DDR3-1600) 204-pin sodimm
  • Samsung 840 EVO 128GB mSATA3 SSD
  • Intel Network 7260.HMWG WiFi Wireless-AC 7260 H/T Dual Band 2x2 AC+Bluetooth HMC
Two served as the kids' computers until I upgraded their setup.  One was repurposed as the machine for schooling when remote/distance learning was put in place due to COVID-19.  This system was replaced by System 3, and its drive and wireless card were moved to that system.

Raspberry Pi 4

  • Broadcom BCM2711, Quad core Cortex-A72 (ARM v8) 64-bit SoC @ 1.5GHz
  • 4GB LPDDR4-3200 SDRAM
  • 2.4 GHz and 5.0 GHz IEEE 802.11ac wireless, Bluetooth 5.0, BLE
  • Gigabit Ethernet
  • 2 USB 3.0 ports; 2 USB 2.0 ports.
  • Raspberry Pi standard 40 pin GPIO header (fully backwards compatible with previous boards)
  • 2 × micro-HDMI ports (up to 4kp60 supported)
  • 2-lane MIPI DSI display port
  • 2-lane MIPI CSI camera port
  • 4-pole stereo audio and composite video port
  • H.265 (4kp60 decode), H264 (1080p60 decode, 1080p30 encode)
  • OpenGL ES 3.0 graphics
  • Micro-SD card slot for loading operating system and data storage
  • 5V DC via USB-C connector (minimum 3A*)
  • 5V DC via GPIO header (minimum 3A*)
  • Power over Ethernet (PoE) enabled (requires separate PoE HAT)
  • Operating temperature: 0 – 50 degrees C ambient
  • Raspberry Pi ICE Tower Cooler, RGB Cooling Fan (excessive but looks cool on the desk).

The Raspberry Pi 4 is a small wonder of a machine that replaces what I originally used the ASUS VivoMini for and is significantly cheaper.

Friday, January 1, 2021

Installing Windows after Linux On Separate Disks

*** Update Feb. 28, 2021 ***

After having gotten both Linux and Windows dual-booting, I had to wipe and reinstall Windows 10.  This time, however, Windows 10 refused to install and gave a message saying "We couldn't create a new partition or locate an existing one."

This is caused by Windows getting confused because there is another drive with a primary partition.  Even though you can tell Windows which drive to install on, it still can't figure out how to install itself.  After trying a few different things, the solution was to unplug the other drive, install Windows, and then plug the other drive back in.

If you are using GRUB as the boot manager the Windows entry will need to be updated or you'll get an error message when trying to boot into Windows.  To update the GRUB configuration with the proper Windows values you'll need to run grub2-mkconfig and write the results to the configuration file.

For MBR setups:

sudo grub2-mkconfig -o /boot/grub2/grub.cfg

For EFI setups:

sudo grub2-mkconfig -o /boot/efi/EFI/fedora/grub.cfg 

Leaving out the -o option will print the config to screen so you can see it first before overwriting the config.
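A sketch of that preview-then-write flow (Fedora's EFI path shown; the grep is just to spot the Windows entry):

```shell
# Preview the generated config and check that Windows was detected:
sudo grub2-mkconfig | grep -i 'menuentry.*windows'

# Once it looks right, write it for real (EFI layout on Fedora):
sudo grub2-mkconfig -o /boot/efi/EFI/fedora/grub.cfg
```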

*** Original Post ***

For my new system, I also added a second SSD that is intended for installing Windows 10.  My primary daily driver is Fedora Linux so I installed it first and it's on the primary drive.  

I don't use Windows very often, but I might use it to play some games with the family, so I'm okay with dual booting for this purpose.  I've not played with dual-booting Linux and Windows in a very long time.  It's not something I recommend to someone who switches between the two operating systems frequently, nor to someone who very rarely uses one of them.  Rarely using one operating system means it isn't updated, so when it is needed there might be a lot of time lost getting the operating system updated and running again.

Setting up dual booting can also be a pain, as one operating system might mess with the booting of the other (Windows tends to be the more frequent offender here, as it doesn't really like to recognize non-Windows systems).  Most documentation I found suggests installing Windows first and then Linux, so that the Linux boot manager (GRUB) can find Windows and add it to the boot options.  However, there's still a possibility that a Windows update can mess up GRUB, which then needs to be restored.  Another slight disadvantage is the additional boot time, since there needs to be a pause to let users pick the OS they want to run.

What I did was install Linux first on one drive and then install Windows on a completely separate drive, so the boot manager of each drive is not affected by the other operating system.  I rely on the BIOS (yes, I know, force of habit to call it the BIOS) to select which drive to boot.  

After Linux was installed on disk "1", I installed Windows on disk "2" (which Windows called disk 0).  Whenever it rebooted, I made sure to tell the BIOS to boot off the Windows drive and not the default Linux drive.

Fortunately, the Asus BIOS's boot options include a Boot Override option to boot a specific drive without permanently changing the boot order.  The other nice thing is that GRUB has the option to fall back to the BIOS boot menu if I miss hitting the DEL key to get into the BIOS.  When I run Linux it is a no-op, and when I need to run Windows it's an extra 1-2 keystrokes during boot.

The only issue I had was that after Windows was installed there was an error message because of Secure Boot.  The simplest work-around is to disable Secure Boot in the BIOS so it can continue the boot process into either Linux or Windows.  I'm still learning about Secure Boot to see how to make it work with this configuration.

2021 PC - Asus PN50 4800U

Although I was very tempted to build a new desktop PC and get access to all the power goodness of the latest AMD Ryzen, I was hesitant giving up the small form factor that I had with my Shuttle PC DS87.  When the Asus PN50 with the AMD Ryzen 4800U became available I took the plunge.

The specs comparison between the previous and new PCs:

New PC:

  • Ryzen 7 4800U [Zen2] (8 cores / 16 threads, base clock 1.8GHz, max 4.2GHz - 8 GPU cores - RX Vega 8, 15W)
  • 32 GB Crucial DDR4 3200MHz RAM (2x16GB)
  • 1TB Samsung 970 EVO Plus (M.2 NVMe interface) SSD
  • 500GB Crucial MX500 SATA SSD (2.5")
  • Intel WIFI 6, BT 5.0

Previous PC:

  • Shuttle PC DS87
  • Intel Core i7-4790S Processor (4 cores / 8 threads, 8M Cache, base clock 3.2 GHz, max 4.0GHz, 65W)
  • Samsung 850 EVO 500GB 2.5-Inch SATA III Internal SSD (MZ-75E500B/AM)
  • 2 x Crucial 16GB Kit (8GBx2) DDR3 1600 MT/s (PC3-12800)

There are enough sites publishing benchmarks, so I'm not going to repeat what they've done, but I wanted something to show myself a tangible performance improvement.  It is generally during compilation that I wish things would go faster, so why not compare compilation between the two systems?  The higher core (8 vs 4) and thread (16 vs 8) counts should benefit compilation even if the base clock of the 4800U is 1.8GHz while the i7's is 3.2GHz.  I expect a modern CPU to also be more efficient per clock cycle than a 6-year-old CPU.

I decided to time the compilation of OpenCV using the following:

wget -O
mkdir -p build && cd build
cmake ../opencv-master/
time cmake --build .

i7 Results

real   28m57.219s
user   26m48.466s
sys     2m01.402s

4800U Results

real     36m48.166s
user     34m54.722s
sys       1m52.574s

How did this happen?  Was the 3.2-4.0 GHz range too much for the 1.8-4.2 GHz to overcome?  It did seem like during compilation all of the i7's cores were running at around 3.6 GHz, but I suspected that the build was not actually taking advantage of all the cores of the 4800U.
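In hindsight, the likely culprit is that cmake --build . runs make with a single job unless told otherwise; an explicit job count would have parallelized the build without switching generators (nproc reports the hardware thread count):

```shell
# Parallel make build, one job per hardware thread:
time cmake --build . -- -j"$(nproc)"
```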

I tried again using Ninja, which automatically configures the build to use all the CPU cores.

make clean
cmake -GNinja ../opencv-master/
time ninja

i7 Results

real	11m28.741s
user	85m39.188s
sys	 3m23.310s

4800U Results

real      6m39.268s
user     99m03.178s
sys       4m8.597s

This result looks more like what I expected.  More total CPU time was used on both the i7 and the 4800U as more cores and threads were utilized, but the real time was much shorter.  This just shows that for a lot of consumers, fewer but faster cores might be better for desktops (laptops and battery life add another dimension), since they rely on applications being programmed to take advantage of multiple cores.  That's why gamer systems usually give up more cores for faster clock speeds, since games aren't known for utilizing many cores.