Compare commits

...

11 Commits

Author SHA1 Message Date
33e316b8b4 Its cute.. (but its wrong). 2025-01-21 19:19:11 +01:00
49f79b698e Update of blog post 2025-01-21 19:06:24 +01:00
0fe5cf0883 Draft: Sudoku blog post 2025-01-21 01:56:54 +01:00
1173d1c5c4 pre-commit 2025-01-20 23:29:30 +01:00
bd7c00b9eb Added Taskfile and pre-commit-config files 2025-01-20 23:07:11 +01:00
58afa1d4f3 Gaming Edition! 2023-01-09 20:37:07 +01:00
f231c4b3e5 Update the updatescript 2022-10-30 21:43:46 +00:00
9582d8c896 New Blogpost 2022-10-30 22:39:17 +01:00
19ba3c3b12 Added shellscript, because I am lazy. 2022-10-20 18:30:40 +00:00
7cc34dfe83 Correct baseURL 2022-10-20 20:23:05 +02:00
61e93f0ec3 We need themes. 2022-10-20 20:14:40 +02:00
16 changed files with 324 additions and 12 deletions

6
.gitignore vendored

@@ -12,8 +12,8 @@ crash.log
crash.*.log
# Exclude all .tfvars files, which are likely to contain sensitive data, such as
# password, private keys, and other secrets. These should not be part of version
# control as they are data points which are potentially sensitive and subject
# password, private keys, and other secrets. These should not be part of version
# control as they are data points which are potentially sensitive and subject
# to change depending on the environment.
*.tfvars
*.tfvars.json
@@ -47,4 +47,4 @@ hugo.darwin
hugo.linux
# Temporary lock file while building
/.hugo_build.lock
/.hugo_build.lock

10
.pre-commit-config.yaml Normal file

@@ -0,0 +1,10 @@
# See https://pre-commit.com for more information
# See https://pre-commit.com/hooks.html for more hooks
repos:
  - repo: https://github.com/pre-commit/pre-commit-hooks
    rev: v5.0.0
    hooks:
      - id: trailing-whitespace
      - id: end-of-file-fixer
      - id: check-yaml
      - id: check-added-large-files

32
Taskfile.yaml Normal file

@@ -0,0 +1,32 @@
version: '3'

tasks:
  init:
    cmds:
      - terraform init
  validate:
    cmds:
      - terraform validate
  plan:
    cmds:
      - terraform plan
  apply:
    cmds:
      - terraform apply
  fmt:
    cmds:
      - terraform fmt
  lint:
    cmds:
      - tflint
  precommit:
    cmds:
      - pre-commit autoupdate
      - pre-commit run --all-files

6
site/.gitignore vendored

@@ -24,8 +24,8 @@ crash.log
crash.*.log
# Exclude all .tfvars files, which are likely to contain sensitive data, such as
# password, private keys, and other secrets. These should not be part of version
# control as they are data points which are potentially sensitive and subject
# password, private keys, and other secrets. These should not be part of version
# control as they are data points which are potentially sensitive and subject
# to change depending on the environment.
*.tfvars
*.tfvars.json
@@ -45,4 +45,4 @@ override.tf.json
# Ignore CLI configuration files
.terraformrc
terraform.rc
terraform.rc


@@ -3,4 +3,3 @@ title: "{{ replace .Name "-" " " | title }}"
date: {{ .Date }}
draft: true
---


@@ -1,4 +1,4 @@
baseURL = "blog.ligthert.net"
baseURL = "https://blog.ligthert.net"
languageCode = "en-us"
title = "Sacha's Blog"
theme = "m10c"
@@ -29,4 +29,4 @@ theme = "m10c"
# darkColor = "#ffffff"
# lightColor = "#000000"
# lightestColor = "#000000"
# primaryColor = "#000000"
# primaryColor = "#000000"


@@ -0,0 +1,175 @@
---
title: "Exploration, fun, and process cycles of Sudoku"
date: 2025-01-20T23:33:06+01:00
draft: false
---
# The idea
I like to play games, and I like to solve puzzles. But if I can create a puzzle solver for puzzles like Sudoku, I can write an algorithm to solve them once and for all, and I will never have to play them ever again.
[Sudoku puzzles](https://en.wikipedia.org/wiki/Sudoku) have been around for a while, and with them [algorithms to solve them](https://en.wikipedia.org/wiki/Sudoku_solving_algorithms). Most of them revolve around throwing random numbers at the grid to see what sticks, applying intelligent guesswork, and finally arriving at a solution. I wanted to take a different approach: brute-force all possible solutions and store them in a database. Then, whenever I want a Sudoku puzzle solved, I just query the database and it returns all possible solutions.
This idea had been in the back of my mind for close to a decade, and late last year, during a week off, I decided to take a shot at it. I dusted off my trusty Go language skills, as I a) wanted to learn the language a bit better, b) wanted to use Go routines to easily (ab)use all my CPU cores in this new quest of mine, and c) am terribad at math, so I am working with the tools I have got.
(...and this looked like something fun to do. 😏)
# Lay of the land
Classic Sudoku puzzles have 9 blocks in a 3x3 grid, with each block consisting of 3x3 cells and containing all the digits from 1 to 9. The puzzle setter provides a partially completed grid, and it's up to you to solve it; a proper puzzle usually has only a single solution.
Example puzzle:
![Example Sudoku Puzzle](/static/Sudoku_Puzzle_by_L2G-20050714_standardized_layout.svg.png)
( _Image honestly stolen from Wikipedia._ )
The first step was to come up with all these unique blocks, as they are the puzzle pieces I need to work with. What I did was the following:
1. Iterate from the lowest possible number (`123456789`) to the highest possible number (`987654321`).
2. Check if every digit from 1 to 9 was present exactly once.
3. If the block was valid, store it somewhere.
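A minimal sketch of that scan in Go (not the author's actual code; the full loop over all ~865 million candidates takes a little while to run):

```go
package main

import "fmt"

// validBlock reports whether n consists of the digits 1-9,
// each appearing exactly once.
func validBlock(n int) bool {
	var seen [10]bool
	for i := 0; i < 9; i++ {
		d := n % 10
		if d == 0 || seen[d] {
			return false // a zero or a repeated digit disqualifies the block
		}
		seen[d] = true
		n /= 10
	}
	return n == 0 // exactly nine digits
}

func main() {
	fmt.Println(validBlock(123456789), validBlock(123456788)) // true false
	// Scanning the whole range 123456789..987654321 with this check
	// yields the 362880 (= 9!) valid blocks.
}
```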
This resulted in [this file](https://gitea.ligthert.net/golang/sudoku-funpark/src/branch/trunk/blocks.csv). [Load the file](https://gitea.ligthert.net/golang/sudoku-funpark/src/branch/trunk/solver/blocks.go#L11-L34) into Go as a slice of ints and we can work with it from there on out. (This is faster than adding 363 thousand lines to your source code and appending them to a slice one by one; after 20 minutes the compiler still wasn't finished, so I stopped it. Loading a CSV was faster.)
Inspecting the file showed `362880` possible blocks. It was only later that I noticed that this is the same as `9!` (9 factorial, aka `9 * 8 * 7 * 6 * 5 * 4 * 3 * 2 * 1`). It wasn't entirely a surprise that the number 9 turned up in a mathy game like Sudoku, with 9 digits per block in a grid of 9 blocks. As far as I can tell, this was the last time the number 9 showed up in the maths further down the line.
# Making it fit.
The next step was writing code that would make a block like this:
```
123
456
789
```
next to a block like:
```
912
345
678
```
This became a mess, as I had to ensure that:
1. The block was unique (done!)
2. Every horizontal line was unique
3. Every vertical line was unique
The code for this became a lengthy headache, as I needed to work with multi-dimensional arrays and make sure elements 3, 4, and 5 did not conflict with elements in other blocks. I ended up with some kind of mapping structure ensuring that there was no overlap. It was tedious to design, create, and test, and very heavy on the processor to properly analyse everything.
It was at this point I had an epiphany and realized that:
1. All blocks have to be unique
2. No column may contain repeating digits
3. No row may contain repeating digits
So, instead of comparing 3x3 grids, why not use rows to populate the puzzle?
```
123456789
912345678
...etc...
```
There are benefits to this approach:
1. It does not change the end result: 3 unique rows that do not violate the constraints of the puzzle form 3 unique and valid blocks in a 1x3 band.
2. It is easier to compare the columns of all rows than to compare each 3x3 block against its adjacent 3x3 blocks.
3. It saves on precious process cycles, which ultimately would speed up the entire process.
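The cheaper row-wise check can be as small as this (a sketch, with rows as 9-character strings; note that the 3x3 block constraint between rows of the same band still needs its own check):

```go
package main

import "fmt"

// compatible reports whether two rows can appear in the same grid
// as far as the column constraint goes: no column may hold the
// same digit in both rows.
func compatible(a, b string) bool {
	for i := range a {
		if a[i] == b[i] {
			return false
		}
	}
	return true
}

func main() {
	fmt.Println(compatible("123456789", "912345678")) // true: every column differs
	fmt.Println(compatible("123456789", "123456789")) // false: every column clashes
}
```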
# Generate a (costly) solution
With this piece of the puzzle, it was time to put this into practice and see what I would get out of it.
I found some random and easy Sudoku puzzle and put this into my code:
```
row1 := "769104802"
row2 := "154800060"
row3 := "002700150"
row4 := "600900308"
row5 := "045328670"
row6 := "328670945"
row7 := "597410280"
row8 := "006283090"
row9 := "200590006"
```
I substituted empty entries with a `0`, and used this to find possible substitutions with the remaining numbers.
I will take `row1` as an example for the next bit:
```
row1 := "769104802"
```
Replacing the null values with the remaining numbers, I wrote an algorithm that finds all possible completions of row one. Missing only two digits, this leaves just two compatible entries:
```
769134852
769154832
```
I put them into a slice and moved on to the next row, repeating this until all 9 rows had a slice of candidate rows (row1s, row2s, row3s, etc.).
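That substitution step can be sketched like this (a hypothetical helper, not the author's code):

```go
package main

import "fmt"

// candidates returns every completion of a row in which each `0` is
// replaced so that the row contains the digits 1-9 exactly once.
func candidates(row string) []string {
	// Collect the digits already present and the positions of the zeros.
	present := map[byte]bool{}
	var zeros []int
	for i := 0; i < len(row); i++ {
		if row[i] == '0' {
			zeros = append(zeros, i)
		} else {
			present[row[i]] = true
		}
	}
	var missing []byte
	for d := byte('1'); d <= '9'; d++ {
		if !present[d] {
			missing = append(missing, d)
		}
	}
	// Try every ordering of the missing digits over the zero positions.
	var out []string
	var permute func(rem, cur []byte)
	permute = func(rem, cur []byte) {
		if len(rem) == 0 {
			b := []byte(row)
			for j, p := range zeros {
				b[p] = cur[j]
			}
			out = append(out, string(b))
			return
		}
		for i := range rem {
			next := append(append([]byte{}, rem[:i]...), rem[i+1:]...)
			permute(next, append(cur, rem[i]))
		}
	}
	permute(missing, nil)
	return out
}

func main() {
	fmt.Println(candidates("769104802")) // [769134852 769154832]
}
```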
The next step was to compare all 9 slices, taking every combination of entries and validating every possible solution. This resulted in [a nesting 9 levels deep](https://gitea.ligthert.net/golang/sudoku-funpark/src/commit/16de7dda97747812eb99ef14088656e5f413b090/solver/processing.go#L45-L71):
* Iterate through row1 and take an element
* Iterate through row2 and take an element
* repeat 7 more times
* Validate all the 9 different elements.
If a combination validates, print the solution. If it doesn't, discard it and move on.
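The validation at the innermost level then only has to check columns and blocks, since each candidate row is already a permutation of 1-9. A sketch:

```go
package main

import "fmt"

// validGrid assumes each row is already a permutation of 1-9 and
// checks the remaining constraints: distinct digits in every column
// and in every 3x3 block.
func validGrid(rows [9]string) bool {
	for c := 0; c < 9; c++ { // columns
		var seen [256]bool
		for r := 0; r < 9; r++ {
			if seen[rows[r][c]] {
				return false
			}
			seen[rows[r][c]] = true
		}
	}
	for br := 0; br < 9; br += 3 { // 3x3 blocks
		for bc := 0; bc < 9; bc += 3 {
			var seen [256]bool
			for r := br; r < br+3; r++ {
				for c := bc; c < bc+3; c++ {
					if seen[rows[r][c]] {
						return false
					}
					seen[rows[r][c]] = true
				}
			}
		}
	}
	return true
}

func main() {
	// A known solved grid (the one from the Wikipedia Sudoku article).
	solved := [9]string{
		"534678912", "672195348", "198342567",
		"859761423", "426853791", "713924856",
		"961537284", "287419635", "345286179",
	}
	fmt.Println(validGrid(solved)) // true
}
```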
This worked great. It took a poor single core on my computer only ~2.5 hours to solve a simple Sudoku puzzle.
# Go Routines and Speedbumps
This was probably the time to utilize Go routines: a handy way to give a function a task, run it somewhere in the background, and then spawn some more, ensuring that I use all my CPU cores and grind my poor computer to a halt.
This was my first foray into serious Go routine usage, and I learnt something. I was fortunate that the goroutines could share memory directly, so I didn't need to resort to channels (which would have added complexity and slowed things down). I ran the validation step at the end of the 9th level of nesting as a Go routine, with roughly thousands of Go routines running at the same time. This reduced the computation time from ~2.5 hours to ~1.5 hours. All things considered, that wasn't bad.
Wanting to increase the brute-force performance, I tried spawning a Go routine at the 8th nesting level for every element of the row8 slice, each of which would in turn spawn Go routines for every element in the row9 slice. With the number of simultaneously running Go routines in the hundreds of thousands, all my cores were at 100%, my desktop was rendered useless, processing time went up, and overall the approach was detrimental.
My top wasn't too happy with this:
![Top output](/static/sudokufunparktop.png)
The lesson of this exercise was that I needed to put a brake on the Go routines and manage them.
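One common way to put that brake on (a sketch, not the code from the repo) is to use a buffered channel as a semaphore, so no more than `runtime.NumCPU()` goroutines run at once:

```go
package main

import (
	"fmt"
	"runtime"
	"sync"
	"sync/atomic"
)

// boundedSum squares the numbers 0..n-1 using at most NumCPU workers
// at a time; the buffered channel acts as a counting semaphore.
func boundedSum(n int) int64 {
	sem := make(chan struct{}, runtime.NumCPU())
	var wg sync.WaitGroup
	var total int64
	for i := 0; i < n; i++ {
		wg.Add(1)
		sem <- struct{}{} // blocks while all worker slots are busy
		go func(v int64) {
			defer wg.Done()
			defer func() { <-sem }() // free the slot when done
			atomic.AddInt64(&total, v*v) // stand-in for the validation work
		}(int64(i))
	}
	wg.Wait()
	return total
}

func main() {
	fmt.Println(boundedSum(1000)) // 332833500
}
```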
# Further (possible) optimisation
At this stage I had been at it for roughly a week and was happy with the intermediate results, but things could be optimized further. I haven't taken the time yet, but I intend to work on this in the future, so I would like to explain my thinking and possible solutions.
Comparing rows like the ones below is costly:
```
123456789
912345678
...etc...
```
You compare every digit with the digit in the lower rows, and do this for every possible solution. This is CPU-intensive and therefore a costly way to validate all possible solutions. To combat this I would like to compare something more abstract: since I already have a slice of blocks, I can use their indexes.
I want to compare the slice of blocks with itself, validating each pair of entries and storing the incompatible ones:
1. Take blocks[1]
2. Validate it with blocks[2]
3. If it is invalid, store the pair
4. Repeat step two, but with blocks[3]
Once step 2 has been exhausted, replace blocks[1] in step 1 with blocks[2], validate against blocks[3], and work its way through the slice. Keep in mind that if blocks[2] and blocks[3] are incompatible, blocks[3] and blocks[2] are also incompatible, so each pair only needs to be checked once. This would hopefully reduce the time required to process all combinations. (Otherwise I would need to make 362880 * 362880 = 131681894400 comparisons.)
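A sketch of that precomputation; the `conflict` predicate is a placeholder, and only pairs with i < j are stored because incompatibility is symmetric:

```go
package main

import "fmt"

// incompatiblePairs records every pair i < j for which blocks[i] and
// blocks[j] can never appear together. Storing only i < j halves the
// work, since a conflict between i and j is also one between j and i.
func incompatiblePairs(blocks []string, conflict func(a, b string) bool) map[[2]int]bool {
	pairs := map[[2]int]bool{}
	for i := 0; i < len(blocks); i++ {
		for j := i + 1; j < len(blocks); j++ {
			if conflict(blocks[i], blocks[j]) {
				pairs[[2]int{i, j}] = true
			}
		}
	}
	return pairs
}

func main() {
	// Toy example: call two blocks incompatible when they start
	// with the same digit.
	blocks := []string{"12", "13", "24"}
	sameStart := func(a, b string) bool { return a[0] == b[0] }
	fmt.Println(len(incompatiblePairs(blocks, sameStart))) // 1
}
```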
Once the set of invalid combinations has been established, I can iterate through the (9!)^9 possible combinations (109110688415571316480344899355894085582848000000000).
Why pairs of incompatible indices?
Because, if I render an abstract notation of a possible solution using blocks[] index numbers:
> 123:345:910:789:684:24:738:182:102
If I know that indices `123` and `910` are not compatible with each other, I can discard this potential solution and move on. It doesn't matter where in the possible solution these indices are placed; we know it will never validate.
What I am not sure about is whether this is more efficient than brute-forcing the (9!)^9 solutions: comparing possible solutions against the set of invalid pairs may be just as costly, if not more so.
Although, once the set of incompatible pairs has been generated, it may be easier to spread the work over separate machines by giving each machine an index for the first row and letting it generate and compare the rest.
🤔 Thinking about this a bit more, the ordering of possible solutions shouldn't really matter, which may speed the process up a bit.
# The numbers
It was somewhere at this stage I started looking into other solutions and numbers:
* As you have seen earlier, comparing 9 rows with 9! candidates each results in (9!)^9 = 109110688415571316480344899355894085582848000000000 combinations to check.
* [Looking at Wikipedia again](https://en.wikipedia.org/wiki/Mathematics_of_Sudoku), there are only 6670903752021072936960 valid Sudoku solutions.
* This is going to consume a lot of a) time, b) energy, and c) storage.
And the latter starts adding up: storing the solutions as efficiently as possible (a string of 81 bytes each) would require at least 540 zettabytes, let alone the upfront costs and infrastructure required to host such a database.
# I give up...
I tried, I bit off more than I could chew, I learned a lot, and it was fun. I can sleep well knowing I did my best, challenged myself, and can cross something off my todo list that has been living rent-free in the back of my head for the better part of a decade.


@@ -0,0 +1,66 @@
---
title: "From Redhat to Ubuntu to Arch"
date: 2022-10-28T22:41:20+02:00
draft: false
---
# From RedHat to Ubuntu to Arch
## Summary/TL;DR
I've worked extensively with [Ubuntu](https://ubuntu.com/), both personally and professionally. As time passed it became increasingly bloated, stale, and outdated, so I switched to [Arch Linux](https://archlinux.org/) and [Manjaro](https://manjaro.org/) for their frequent updates.
## In the beginning (1998 - ~2006)
As a teenager in the late '90s I started to run Linux on my old Pentium 75. This was [RedHat v5.2](https://www.redhat.com/en/about/press-releases/press-redhatlinux52), which I bought in a box with a manual and a CD, and installed on my computer in a multi-boot setup. The latter was a requirement because I still wanted to play video games on my Windows 98 install, and until recently that set a trend that didn't change for decades.
After RedHat, I turned to [Slackware](http://www.slackware.com/), and from Slackware I got interested in the BSDs even running [OpenBSD](https://www.openbsd.org/) and [FreeBSD](https://www.freebsd.org/) on my desktop.
Around this time I bought a cheap PC, installed OpenBSD on it, and chucked it into a data center at the age of 18 (Thanks BillSF! 🙂). Ever since, I've had a unix box online somewhere from which I organized my digital life and ran IRC and services for friends.
I briefly ran a small hosting company on several FreeBSD servers. The OS is great, the experience was valuable, but not worth repeating.
## My career and Windows (~2007 - 2018)
During this period I didn't touch any of the BSDs (except for OS X). It was around this time that [Ubuntu](https://ubuntu.com/) became my goto Linux flavour for both personal and professional projects. Anything related to work ended up as an Ubuntu server; any remote server or laptop for personal use ended up running Ubuntu.
And running Ubuntu made sense: you installed it on something and you had a functioning desktop or server instantly available. Updates of packages? Just run `apt`, easy! Upgrading to a new version of Ubuntu? Run `do-release-upgrade` and hope it doesn't break! Packages by the boatload, available at your fingertips. Problems? Chances are you weren't alone, and a nifty search on your favorite search engine would have an answer for you. Even software not in the repos catered to you with its own repos. It was great!
However, my main desktop became a dedicated Windows box because I liked to play video games for hours on end, and with the limited availability of video games on Linux, it wasn't really an appealing platform on which to waste my time and escape real life, or work.
During this time I wasn't sitting still:
* I got certified with Linux ([LPIC1](https://www.lpi.org/our-certifications/lpic-1-overview)), [BSD Specialist](https://www.lpi.org/our-certifications/bsd-overview), [Solaris 10](https://education.oracle.com/oracle-solaris/solaris-10-administration/product_296), VMware 4, VMware 5, and also some AWS. I liked it, though it may have been more of a hobby, because I doubt some of it was useful to my then employer.
* At work I briefly ran a [Fedora](https://getfedora.org/) install as my main desktop, while it worked, I wasn't really happy with it. I felt weird using it, and it felt out of place. Also the software seemed outdated. Luckily this was a short-lived experiment.
* My remote unix box moved from data center to data center to data center, enjoying different hardware and newer Ubuntu installs during this time.
* For several years I was running a VMware ESXi 4 server (colocated in [Eweka](https://www.eweka.nl/) when they still did colocation) with on it VMs running Ubuntu, [Solaris](https://www.oracle.com/solaris/solaris11/), briefly OpenBSD, and as one would expect my remote unix box.
While Windows was my main OS for my gaming, [ArmA 3](https://arma3.com/), and [flightsim](https://www.falcon-bms.com/) [habits](https://www.digitalcombatsimulator.com/en/), I still ran different flavours of Unix, Ubuntu being my mainstay.
## Shuffling things around (2019-2022)
A lot has happened in the years prior. So in random order:
* There has been a tremendous amount of improvements to Linux and the desktop environment.
* Gaming through emulation via [Steam's Proton](https://github.com/ValveSoftware/Proton) rendered [a large portion of my Steam Library playable in Linux](https://www.protondb.com/)
* [There were distros specifically focused on gaming](https://linuxstans.com/best-linux-distro-gaming/)
* [Ubuntu has gotten to the point it was the basis for other distros](https://en.wikipedia.org/wiki/List_of_Linux_distributions#Ubuntu-based)
* New universal package managers ([snap](https://snapcraft.io/) and [flatpak](https://flatpak.org/), but I prefer plain AppImages)
And the last two are where things with Ubuntu started to become stale. For me this hinged on two things:
1. Lack of up-to-date software
2. Snap
### Lack of up-to-date software
I noticed as time progressed that Ubuntu didn't offer a state-of-the-art desktop. It lacked up-to-date software, configuration, and codecs, and the user-friendly aspect took a nosedive. With [Canonical](https://canonical.com/) doing its own thing, it became obvious that it kept moving further and further away from the average Linux desktop user. The popularity of [Linux Mint](https://linuxmint.com/) and a plethora of other distros based on Ubuntu was, IMHO, symptomatic of this. And whether as a desktop user or someone managing servers, the packaged software you received from the repos was, except for security updates, fairly out of date. As a server admin you had to resort to creating your own packages of recently updated software and distributing those if you needed a recent feature. I recall Mint having the same issue, but at least the desktop looked nicer. 🤷
### Snap
I think it is bloated. I think it is slow. It doesn't make sense to me to have multiple package managers in one install. Apt worked fine; why exclusively distribute certain software via snap, which requires an entirely different upgrade process? With the traditional distribution model and dynamically linked libraries, one can update a library without having to update the software that depends on it. With the bundled model, an application won't receive security updates until the bundled library is also updated, and it is up to the package maintainer to keep track of this.
### Arch Linux and Manjaro
At this stage in my career I am working rather intensively with the command line and tools that need to remain up-to-date. And I like my applications to be up-to-date with all the features, fixes, and security updates. I was done with Windows and needed something different. With Steam Proton allowing me to play video games on Linux, and having always used open-source software for my day-to-day applications, the shift shouldn't be hard. So I decided to migrate away from Windows.
At some point I bought a small M.2 SSD and installed [SparkyLinux GameOver Edition](https://sparkylinux.org/sparkylinux-4-0-gameover/) (based on Debian Testing) to use as my main Linux desktop, multi-booting between Linux and Windows when needed. With SparkyLinux being unstable and virtually unusable, I briefly tried Linux Mint, but found it outdated most of the time. I still ended up booting Windows to play video games, and staying there for months on end.
And this is where Arch Linux and [Manjaro](https://manjaro.org/) (a desktop distro based on Arch Linux) come in. They follow a [rolling-release model](https://en.wikipedia.org/wiki/Rolling_release), provide the latest stable versions of most software, have an excellent wiki and lots of support online, and just seem to be what I wanted: bleeding edge, without the sharp pointy bits.
So I got a 1TB M.2 SSD on sale, replaced the old one, installed Manjaro, and haven't looked back since; I think that was a year or two ago. The trend continued: my laptop got an install of Manjaro, and my VMs slowly migrated from Ubuntu to Arch Linux. Except for a massive file-storage box at home running Debian, everything is running something based on Arch.
## In conclusion
As my career progressed, technology progressed, and my desire to work with unix and to play video games remained throughout the years. As everything changed, doors opened and I stepped through. What the future brings, I do not know. But where I sit, I am sitting very comfortably.
PS: I forgot to mention that I've been working with a Mac Mini and MacBook Pros in a professional capacity for the past 17 years. Technically it's based on BSD, so I think I am good. 😅


@@ -0,0 +1,21 @@
---
title: "Open-Source Games"
date: 2023-01-09T20:18:46+01:00
draft: false
---
# My addiction
As some of you might know, I have played a lot of video games, and as you can see by [looking at my Steam Profile](https://steamcommunity.com/id/Ligthert/home/), I still do. And while I love Steam and what it did for the Linux gaming community by integrating [Proton](https://en.wikipedia.org/wiki/Proton_(software)) into Steam ([My profile here](https://www.protondb.com/users/1960692688)), there is also a non-emulated side thanks to native support and Open-Source. I am using this post to dump a bunch of links stuck in my browser tabs.
# Existing lists and sites
* https://github.com/Trilarion/opensourcegames
* https://libregamewiki.org/Main_Page
* https://osgameclones.com/
* https://github.com/ligurio/awesome-ttygames
* https://trilarion.github.io/opensourcegames/
* https://github.com/leereilly/games
# Personal favorites
* [OpenMW](https://openmw.org/), it is technically an engine reimplementation of _The Elder Scrolls III: Morrowind_, but I cannot get enough of it. With v0.48 on the horizon I am curious what the future will bring. 2090 will see its v1.0 release. ;-)
* [Veloren](https://veloren.net/), Cube World botched it, but Veloren (which just released v0.14) picked up that banner and seems to be doing it right. It is still early days, but it is fun to play already.
* [Beyond All Reason](https://www.beyondallreason.info/), if you like Supreme Commander or games with a streaming economy then this game is for you.
* [OpenRA](https://www.openra.net/), again an engine reimplementation that turned into its own thing. Like the old '90s C&C games? Then this is for you.


@@ -48,7 +48,7 @@ This part is based on an [blog post on the AWS blog about exactly what I need](h
Easy peasy.
# Terraform
# Terraform
Terraform allows me to do essentially the above in AWS and create all the things automagically from the command line. I've written [HCL](https://gitea.ligthert.net/Sacha/blog.ligthert.net/src/branch/trunk/terraform.tf) that allows me to do just that. I executed this from the command line and it created what I needed in one go.
# Wrapping up
@@ -61,4 +61,4 @@ The only thing that needs to happen is actually uploading these files.
aws s3 sync public/ s3://blog.ligthert.net/ --delete
```
Easy!
Easy!

Binary file not shown.

After

Width:  |  Height:  |  Size: 19 KiB

Binary file not shown.

Before

Width:  |  Height:  |  Size: 36 KiB

Binary file not shown.

After

Width:  |  Height:  |  Size: 48 KiB

1
site/themes/m10c Submodule

Submodule site/themes/m10c added at bd6e4228fd


@@ -15,7 +15,7 @@ provider "aws" {
region = "eu-west-1"
}
# Global for Certificates
# Global for Certificates
provider "aws" {
region = "us-east-1"
alias = "global"

8
update_blog.sh Executable file

@@ -0,0 +1,8 @@
#!/bin/sh
git pull
cd site/
hugo -D
aws s3 sync public/ s3://blog.ligthert.net/ --delete
sleep 5
aws cloudfront create-invalidation --distribution-id E13C4ILGFG6IB2 --path "/*"