
Blog Question Challenge 2025

Why did you start blogging in the first place?

Part of me wants to just record my thought-process and solidify my thinking.

Another part likes sharing things I learn in case they help others.

And sometimes I just want to rant.

What platform are you using to manage your blog, and why do you use it?

In the late 90s, I wrote raw HTML files and just posted them on the RedHat web-server that my college CS department offered. Good ol' table-based layout and all that.

Once I got my own domain, I looked around at Static Site Generators (SSGs), and Nikola topped my list. I wanted an SSG that

  • could ingest pure HTML fragments (which I prefer) rather than forcing me to use something like Markdown or AsciiDoc. I don't mind using them for casual prose (like this) but for technical work, I prefer the markup-control that I get from raw HTML

  • was written in Python (which I use as part of $DAYJOB, so I felt more comfortable poking under the hood)

  • ran on all the platforms I used (Linux at the time; now FreeBSD and OpenBSD)

  • had metadata features that fit my requirements

  • had a built-in web-server for viewing the site locally

  • made it easy to create an rsync deploy hook to send my files up to my web-server (see the sketch after this list)
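
For what it's worth, the deploy hook itself doesn't need to be fancy. Nikola simply runs whatever commands you configure when you invoke nikola deploy, and mine amounts to little more than an rsync invocation along these lines (the user, host, and paths here are invented for illustration rather than my real setup):

rsync -rav --delete output/ me@myhost.example:/var/www/htdocs/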

However, Nikola has a lot of churn, meaning I frequently have to revisit my configuration files or regularly suffer the wrath of warnings and errors at every upgrade. It also processes my HTML source through lxml which in turn mangles certain cromulent markup, meaning my input doesn't show up in output. And system upgrades of Python seem to throw my virtual-env into a bit of a frenzy.

So I've started a Makefile-based SSG that largely sticks to BSD make and POSIX awk. Stay tuned for when that comes to fruition.
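
Nothing to show off yet, but the core of the idea fits in a BSD make suffix rule. A hypothetical sketch of where it's headed (header.html and footer.html are placeholder names, and the recipe line needs a literal tab):

.SUFFIXES: .htm .html

# wrap each raw HTML fragment in a shared header and footer
.htm.html:
	cat header.html ${.IMPSRC} footer.html > ${.TARGET}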

How do you write your posts?

A plain ol' text editor.

Sometimes vi/vim, sometimes ed(1).

Do you normally publish immediately after writing, or do you let it simmer a bit?

Most posts take me a while to compose, so those get set to Draft status, and the publishing process uploads the static files but doesn't include them in indexing or the RSS feed. This lets me read a draft on my site (optionally sharing the pre-release URL with others) and revise it accordingly. Once I feel it's done, I remove the Draft status to publish it.

Have you blogged on other platforms before?

If you count "microblogging", then I've blogged on Twitter/X and have largely moved my microblogging participation to Mastodon.

However, for the most part, it's just been my own site/blog.

When do you feel most inspired to write?

A couple triggers:

  • I'm working through something complex and want to document it for myself

  • I've found a fun little shell trick and want to share it

  • I've experienced some frustration and just need to rant

What’s your favorite post on your blog?

Probably my post on using remind. However, it's starting to show its age, so I should revisit the article and update it.

Any future plans for the blog?

Keep on emitting a steady trickle of whatever interests me.

The fall of the User Agent

In the beginning

Similar to some Mail User Agents (MUAs) and NNTP clients, the specifications for HTTP since at least version 0.9 have included a User-Agent header, and it still exists in modern HTTP standards.

This header lets the server know what software made the request. But it also serves as a reminder that the software exists to act on behalf of the user.
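
You can see the header on the wire for yourself. Here's a request typed out by hand as a quick illustration (the agent string is invented, and example.com stands in for whatever server you like):

printf 'GET / HTTP/1.0\r\nHost: example.com\r\nUser-Agent: Lynx/2.9.2\r\n\r\n' | nc example.com 80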

Abuse

Sadly, server-side software started to abuse the User-Agent header. Based on its value, a web-server would respond with different output, tailored to what it thought the other side could handle. By accommodating broken clients and making presumptions about how they would behave, this User-Agent sniffing led to a fractured web.

Lowe's rant

Sigh. Lowe's doesn't seem to be able to competently ship a part that still shows as in-stock on their site. Rant documenting the drama follows.

tl;dr order online, they cancel. order online again, they cancel. Call support, get told to place the order with the local store. Call local store, get told they can't. Call support, get told to physically go into the store to order. Order gets placed. Order arrives, installer notes it's missing parts. Call support, get told they'll rush out a replacement. Replacement arrives. Check box, still missing same parts. Call support, rush delivery of another replacement with notes for the shipper to check the contents. Third order arrives with the same contents as the previous two. To be continued…

Read more…

Planning the day on the CLI with tsort

I had a bunch of items on the todo list for the day and wanted to arrange them in order, but I needed certain items to come before others:

  • Drop donations off at the resale store, but can't do that before 10:00 when they open
  • Get the daughter ready for soccer camp before dropping her off
  • Drop the daughter off at camp before taking paperwork up to her school
  • Cut the son's hair before he takes a shower and cleans his bathroom
  • Pick up daughter at camp after she's started camp (duh) and before 10:45
  • Take the kids to the library after picking up daughter at camp
  • For obvious reasons, 8:30 → 8:45 → 9:00 → 10:00 → 10:30 → 10:45
So I expressed this in a file where each row contains "X needs to come before Y":
cat todo.txt
10:00 resale
resale 10:30
sunblock take_to_camp
pack_snack take_to_camp
pack_water take_to_camp
pack_towel take_to_camp
take_to_camp 8:30
8:30 soccer_camp
soccer_camp 8:45 
soccer_camp paperwork
8:45 home
home 9:00
paperwork home
home cut_hair
home clean_bathroom
home shower
cut_hair shower
cut_hair pick_up
shower pick_up
clean_bathroom pick_up
cut_hair clean_bathroom
clean_bathroom 10:00
10:45 pick_up
8:30 8:45 
8:45 9:00 
9:00 10:00 
10:00 10:30 
10:30 10:45 
pick_up library

Now it just became a matter of passing these requirements to tsort

tsort todo.txt
pack_towel
pack_water
pack_snack
sunblock
take_to_camp
8:30
soccer_camp
paperwork
8:45
home
cut_hair
9:00
shower
clean_bathroom
10:00
resale
10:30
10:45
pick_up
library
to sift them into the required order of my todo list for the day.
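
Two small extras that weren't part of this exercise but that I find handy: piping the result through nl numbers the steps, and tsort complains on stderr if the constraints contradict each other (the exact wording of the cycle warning varies by implementation):

tsort todo.txt | nl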

Old Computer Challenge

History

This is the third year in which Solene has run the Old Computer Challenge and I've tried to participate in each of them.

Participating in these requires a little flexibility since I work remotely for $DAYJOB. The low-end computer challenges posed less of an issue since neither the VPN nor rdesktop required much in the way of resources. Using the smaller screen-dimensions on the older laptops made it a bit more challenging, but I made it work.

None of these challenges shifted any usage to my phone. I prefer not to use my phone for anything beyond the barest of essentials: phone-calls, texting, podcasts, timers, lists, and weather.

First year (2021)

The first year and third year both focused on limited hardware.

I chose a Gateway Solo 1200 as my primary machine for the first year. It boasted an 800MHz Celeron processor, a 120GB spinning-rust HDD, and a 10Mbit wired LAN connection (it also had an internal wi0 wireless card and a PCMCIA Intel wifi option, but both only supported WEP rather than WPA2, so I stuck with the wired option), and I had upgraded it to its maximum of 320MB of RAM. This machine infamously arrived on 9/11, with the UPS driver delivering it as I watched the twin towers fall on TV.

The machine ran the latest release of OpenBSD without issue. The limited CPU & RAM notably constrained my choice of software. Fortunately, other than web-browsing, much of what I do happens at the command-line.

Email
I had used Claws Mail (a GUI mail program) for many years, but it started using more and more system resources, so I had aspired to switch to mutt or neomutt. The Old Computer Challenge gave me the kick I needed. Dealing with multiple accounts and catch-all mailboxes posed the worst pain-points. Otherwise, it ran fine within the limited system resources, and it provides a lot of power for mowing through piles of email.
Music
I've long used cmus for playing my music collection, and pianobar for streaming. Both ran fine even on this ancient hardware.
Calendar
I've moved my calendaring to remind and it runs fine at the CLI. It did have a noticeable lag on startup but I suspect that my 3000+ reminders/events cause that. When I winnow it down to a much more sensible volume of reminders, it runs in a blink.
Coding
All my coding happens at the CLI using a mix of vi, vim, and ed for editing, with version-control in git or rcs, so not much changed here. I did find notable startup lag both in starting vim and in executing Python code. It made me appreciate the fast startup times of utilities that compile down to native code. I also found myself using awk in a lot of places since it had a faster startup time than Python.
RSS
I've long used rss2email to gather my RSS feeds and deliver them to my inbox, reducing the RSS-reader issue to a mail issue. I experienced no disruption here, since mutt let me keep reading my feeds just as I had done in Claws.
Social media
I accessed Twitter with Rainbowstream, Mastodon with Tootstream, and Reddit with rtv (now obsolete). For text posts and commenting, I loved them all. But for image/video posts, they fell short. I wish I had a quality CLI interface for Facebook to keep in touch with friends & family who only share things there.
Office stuff
Thankfully, I don't have to deal with Office documents often, and almost never outside of $DAYJOB, where I could use Word or Excel remotely. I did install Abiword for the occasional MS-Word document and Gnumeric for the occasional spreadsheet. Both provided reasonable fidelity and speed while running within the confines of the limited hardware.
Gaming
I don't game much, so this didn't impact me much. I think I played a couple rounds of cribbage(6) and atc(6) as a proof-of-concept, but certainly no high-end FPS games here.

Web browsing hurt the most. Firefox & Chromium? Completely unusable without gobs of RAM. For some basic browsing, lynx and dillo provided lightweight options, while Epiphany clocked in at barely-usable (but still better than Firefox & Chromium) for sites requiring JavaScript.

Second year (2022)

The second year focused more on limiting network usage (both total-time and bandwidth).

I had to segregate life here: $DAYJOB requires remoting into my work machine, so I didn't count that time against my allotted 1hr.

I didn't know how to count my cron job that downloads my podcasts nightly since I don't have much control over how long they run or how much data they download. I decided that, since the challenge only ran for a week, and I batch podcasts roughly every three weeks, I could load a fresh batch to my player before the challenge, disable the cron job for the week, and then re-enable the cron job after completing the challenge. Not quite the spirit of the challenge, but also a lot like how I would download things in high-school, where I would walk to the local campus library to download large files and bring them home.

Email didn't pose a great concern, since OfflineIMAP let me batch download my emails from the server, and my local MTA would batch up outbound emails until I reconnected, sending them all to my smart-host mail-server in one go.

However, the second year really cut into social-media usage. Its model simply doesn't accommodate offline use well.

Third year (2023)

Similar to the first year, the tools remained largely the same. However, this year I did the challenge while on vacation. Cheating? Maybe. But also self-enforcing, since I didn't take any other laptop. This time I took a Dell Mini10 netbook with me. This hand-me-down came to me with 2GB of RAM, but I'd made a few upgrades:

  • replaced the 120GB HDD with a 60GB SSD, giving a bit of extra pep
  • replaced the rubbish Broadcom wireless half-height PCI card with an Atheros chipset
  • installed OpenBSD 7.3 in place of Windows Vista

The netbook has no fan, relying on passive cooling instead, so I used apm -L to keep the system running cool. I could manually run apm -H to get the full 1.x GHz, but it came at a warm price, discouraging me from doing so.

The tiny 1024×600 screen resolution added even greater constraints when remoting into $DAYJOB, but that helped me stay in vacation-mode rather than try to sneak in hours. Additionally, X seemed to think the display offered 1024×768, so everything rendered with a squish/scale that ruined friends' pictures. Equally bad, the Poulsbo chipset lacked support in X, so it fell back to very slow VESA rendering. At times I could watch text render character-by-character and could type full paragraphs before the first couple of words appeared on the screen. With better graphics support, I suspect it would have felt notably snappier.

Future challenges

After returning from that vacation, I purchased a new laptop for travel, and got rid of four of my old junker laptops (my beloved rejoices at fewer laptops on my desk). I still have the Mini10 and a PPC iBook G4 running OpenBSD, so I can participate in future challenges.

Closing out the books in ledger(1)

I put this here mostly because I forget how to do it and have to make multiple starts to get it right.

# archive 2021's transactions into their own file
ledger -f ledger.txt print \
  -e 2022-1-1 > 2021.txt

# start the new file with opening balances as of 2022-1-1
ledger -f ledger.txt equity \
  -e 2022-1-1 > tempfile.txt

echo >> tempfile.txt

# append the transactions from 2022 onward
ledger -f ledger.txt print \
  -b 2022-1-1 >> tempfile.txt

mv tempfile.txt ledger.txt
Closing out the books for 2021
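
One extra step worth suggesting (not part of the original recipe): run a balance report before and after the swap and confirm the asset and liability totals still line up, since the equity entries should reproduce them exactly:

ledger -f ledger.txt balance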

Read more…

Using mail(1)

Intro

CONGRATULATIONS! Your OpenBSD install has been successfully completed!

When you login to your new system the first time, please read your mail
using the 'mail' command.
OpenBSD's post-install message

Oh, no! You have a fresh install, and the only way to read your welcome message from Theo is to use mail.
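
If you just want a quick taste before reading on: at mail's & prompt, p 1 prints the first message (Theo's welcome) and q quits, so the session looks roughly like this:

$ mail
& p 1
& q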

Read more…

chrooted SFTP

Creating chroot SFTP accounts

For $DAYJOB I had to create user accounts for customers and give them access to SFTP files to/from secured areas of our server. We wanted to use chroot functionality to ensure that no customer could see other customers' data, and prevent them from poking around potentially sensitive areas of the server. After a bit of trial-and-error, I've listed the lessons-learned here in a cook-book fashion so that in case I ever have to do it again, I have the steps documented.

This post was spurred into existence by this Reddit post asking about creating an encrypted FTP server on OpenBSD, so my reply there became the basis for this post.
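
As a teaser, the heart of the setup lives in sshd_config. The stanza looks roughly like this sketch (the group name and chroot path here are illustrative, not necessarily what we used), while the full post covers the users, groups, and filesystem-permission gotchas:

Match Group sftponly
    ChrootDirectory /home/%u
    ForceCommand internal-sftp
    AllowTcpForwarding no
    X11Forwarding no

Keep in mind that sshd requires every component of the ChrootDirectory path to be owned by root and not writable by group or others, which is a common source of trial-and-error.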

Read more…

CLI Tricks: Spongebob Sarcasm in awk

Sometimes you want to turn some text into "sarcastic Spongebob" text, so this little fragment of awk will make that transformation for you:

#!/usr/bin/awk -f
BEGIN { srand() }
{
    n = split($0, a, //)
    for (i=1; i<=n; i++) {
        c=a[i]
        printf("%c", rand() < 0.5 ? toupper(c) : tolower(c))
    }
    print ""
}
sarcasm.awk
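
Assuming you've saved it as sarcasm.awk, usage looks something like this (the output differs from run to run, since the casing is random):

echo "this is fine" | awk -f sarcasm.awk
tHiS iS FinE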

Read more…