Monday, September 8, 2025

Konsulatstermine für Reisepässe

Hand holding four German Reisepässe

Upon acceptance of a Staatsangehörigkeit § 5 declaration, making one a German citizen, the next step is generally to order a German passport, called a Reisepass. This requires filling out the form and bringing the Urkunde über den Erwerb der deutschen Staatsangehörigkeit durch Erklärung and passport photos to the responsible Consulate for a passport appointment.

It can be difficult to get a passport appointment. You keep checking the site and there are never any appointment slots available.

German Consulates around the world add new appointments every weekday at midnight in Germany. For example, that is 3pm in California. If you start polling the appointment site at 2:59pm on Sunday, you have the best chance of seeing new appointments appear and grabbing one before they are all gone. Note that Daylight Saving Time shifts by several weeks between Europe and the US, so the two are not the same number of hours apart all year.
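Rather than memorizing the offset, one can compute it. A minimal Go sketch (my own illustration, nothing the Consulates publish):

package main

import (
    "fmt"
    "time"
)

// Print the California wall-clock time of the next midnight in Germany,
// which is when new appointment slots appear. A sketch for illustration.
func main() {
    berlin, _ := time.LoadLocation("Europe/Berlin")
    pacific, _ := time.LoadLocation("America/Los_Angeles")

    now := time.Now().In(berlin)
    // Midnight in Germany at the start of the next day; time.Date
    // normalizes Day()+1 across month boundaries.
    drop := time.Date(now.Year(), now.Month(), now.Day()+1, 0, 0, 0, 0, berlin)

    fmt.Println("next slots appear at:", drop.In(pacific))
}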

There are Honorary Consuls in a number of cities who can make copies of your documentation and forward it to the Consulate; an Honorary Consul might be easier to get an appointment with if your Consulate is swamped.

Tuesday, September 2, 2025

LLMs to Blaze a Trail

As an engineering executive there are a few ideas and practices which I reinforce via repetition to the team, either explicitly at the start of a recurring meeting or implicitly by bringing them up whenever relevant. The first of these ideas is:

The product is not the code, not the features, not the designs.
    The product is that people can use the service for things that are important to them.
The business is not the code, not the features, not the designs.
    The business is that people can use the service for things that are valuable to them.


But the topic of this post is not that. The topic is the second thing I frequently reinforce via repetition:

Get something, anything, working end to end as quickly as you can.
Not even a minimum viable thing. Any thing.


It has been my experience that, as developers, we tend to focus on one area of a system, exploring its requirements and building it out until we feel confident we understand what else will need to be done, before moving on to the next piece. This results in a system where the understanding and the plan for development grow by accretion, each piece layered atop the previous and left undisturbed by later developments. We might go back and harmonize all of them later... maybe.

It has also been my experience that everything starts progressing more quickly once the system does something, anything, end to end.

  1. We gain perspective on how the whole system will work and apply it to everything we do subsequently.
  2. One can make a change and see it function all the way through. Enthusiasm improves productivity.
  3. It is far more effective when multiple people work on a system in parallel if they can all see the impacts of each other's work.

Thus:

Get something, anything, working end to end as quickly as you can.
Not even a minimum viable thing. Any thing.


LLMs to Blaze a Trail

With the maturing capabilities of LLM code generation, I tried an experiment with Claude Code. At Google, one of the classes in orientation was to construct a web scraper. I asked Claude Code to build a scraper, but an even simpler one: scrape a single metric.

In a new scraper directory, create a go program which will scrape a web page formatted
in prometheus metrics format, and extract a floating point value labeled "example"

Create an SQL schema for a timeseries, with columns for a timestamp and a floating point value.

Have scraper connect to a Postgres database and write each sample it collects to the database.

In a new webui/frontend directory, create a web page using React and typescript which will
poll a backend server for changes in a loop and display rows of timeseries data with timestamp,
sample name, and value.

In a new webui/backend directory, create a go program which will handle queries from
webui/frontend and fetch timeseries data from the postgres database.

A classic hacker stock photo in a darkened room sitting in front of a laptop wearing a hoodie and mask, except the person typing is a robot

It produced a small, functional implementation.

scraper     123 lines   Go
backend     169 lines   Go
frontend    127 lines   TypeScript
            100 lines   CSS
             43 lines   HTML
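The heart of the scraper is small. Here is a sketch of what the extraction step amounts to, assuming the Prometheus text format and a metric named "example" (my illustration, not Claude Code's output):

package scraper

import (
    "fmt"
    "strconv"
    "strings"
)

// extractMetric scans a Prometheus text-format payload for a metric with
// the given name and returns its floating point value.
func extractMetric(body, name string) (float64, error) {
    for _, line := range strings.Split(body, "\n") {
        line = strings.TrimSpace(line)
        if line == "" || strings.HasPrefix(line, "#") {
            continue // skip blank lines and HELP/TYPE comments
        }
        fields := strings.Fields(line)
        if len(fields) == 2 && fields[0] == name {
            return strconv.ParseFloat(fields[1], 64)
        }
    }
    return 0, fmt.Errorf("metric %q not found", name)
}

A real scraper also has to handle labels and timestamps on the sample lines, but this is the essential loop.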

A few interesting tidbits:

  • It produced no unit tests for the Go code. I didn't tell it to.
  • It did produce unit tests for the TypeScript code, even though I did not tell it to. I think this speaks well for the TypeScript community: the training data is infused with testing as an expected practice.
  • I wish wish wish that Claude Code would automatically populate a .gitignore for node_modules. Not for the first time, I checked 437 Megabytes of code into git and had to rewrite the history to remove it.

 

Unit Tests

Having no tests at all sets a bad example. I don't actually want to encourage the construction of large system test suites at this stage of a project, as the effort to keep updating a large suite as the system evolves is likely to outweigh its value this early. Yet I do want to set the example by ensuring there is something.

In the scraper directory, keep the main() function in main.go but move the rest of the code
to a scrape.go file. Write tests for scrape.go with a local prometheus server and in-memory
database. Check that metrics are correctly stored in the database.

Claude Code generated 377 lines of test cases, including cases for scraping a single value and for several values. Most of the code was setup: an in-memory database using sqlite and a local Prometheus server.
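For flavor, the shape of such a test, compressed far below 377 lines. The helper names here (openInMemoryDB, scrapeOnce) are hypothetical stand-ins for whatever the generated code actually called them:

package scraper

import (
    "fmt"
    "net/http"
    "net/http/httptest"
    "testing"
)

func TestScrapeStoresMetric(t *testing.T) {
    // A local server standing in for the Prometheus endpoint.
    srv := httptest.NewServer(http.HandlerFunc(
        func(w http.ResponseWriter, r *http.Request) {
            fmt.Fprintln(w, "example 42.5")
        }))
    defer srv.Close()

    db := openInMemoryDB(t)                         // hypothetical sqlite helper
    if err := scrapeOnce(srv.URL, db); err != nil { // hypothetical scrape entry point
        t.Fatal(err)
    }

    var got float64
    if err := db.QueryRow("SELECT value FROM samples").Scan(&got); err != nil {
        t.Fatal(err)
    }
    if got != 42.5 {
        t.Fatalf("stored value = %v, want 42.5", got)
    }
}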

The cost of the first prompt to generate the system and the second prompt to add unit tests: 93 cents.


 

Non-trivial example

That example was pretty contrived. How about an example of a more realistic system which:

  1. Implements a protocol connecting to a legacy communications system.
  2. Implements a set of modern protocols connecting to current Internet communications infrastructure, to forward messages to and from the legacy protocol.
  3. Has a management layer watching all of the connections and can stop or restart them as needed.
  4. Has a dashboard and console showing the status and configuration of the system.

Can it produce this? Well... not exactly. I kind of cheated: this is the first thing I attempted; I made up the contrived example later.

The problem was that first step. Claude Code was not much help in producing the first piece, connecting to the legacy system. The tasks there were more like engineering archaeology:

  • Trying variations on the digest hash function until the remote system suddenly returned 200 OK.
  • Figuring out what portions of the poorly documented header fields were actually implemented.
  • Diagnosing failures when the only indication we get is "Invalid" with no further information about what was invalid.

There just isn't any training data for this, so trying to rapidly get to a functioning end-to-end system entirely via code generation didn't work. I was able to work on the management layer and the dashboard and so on while still debugging the first piece, but the system only started working end to end once that first piece was done.

Could I have set that first piece aside with a mockup and worked on the rest? Probably, but it was just me, not a team, and the first piece was the biggest risk. I focused on eliminating that risk.

In an engineering team, I think I would approach this with a small team whose job is to sketch out the overall system. It might be entirely senior engineers, or at least led by a quite senior engineer, and tasked with identifying and quantifying risks and planning out the system. That team could multiply its efforts by using LLMs to generate the better-understood portions of the system.

Monday, September 1, 2025

On the Persistence of Human Memory

Tell me this looks wrong to you, too.

A screenshot of green Save and red Cancel buttons where the Save button is quite obviously lower on the screen than Cancel

Claude Code doesn't see it. I mean, of course Claude Code doesn't see it; it has no eyes or other senses. Nonetheless I tried to get Claude Code to fix it by leading it to a solution.

In frontend/ in the Delivery page, align the Save and Cancel buttons vertically.
In frontend/ in the Delivery page, remove the height property from the Save and Cancel
buttons. Put both buttons inside a div, and set the height of the div to 40px.

Neither of these fixed it, because these were not the problem. The actual problem was:

.save-button,
.add-button {
  background-color: #48bb78;
  margin-top: 1rem;
}

This was left over from when the Save button lived elsewhere on the page, and was not removed when the button moved to sit next to Cancel. Poking around with Chrome's Developer Tools and inspecting the elements on the page identified it.


On the Persistence of Human Memory

One thing I am finding is that my memory of code generated with the help of an LLM fades much more quickly. Some portions of this system were not amenable to help from Claude Code: things which involve low-level interoperability with existing and legacy systems. There is no relevant material in the training set, so Claude Code could not help with the iterative debugging, staring at the errors from the legacy system to figure out what to do next.

Those portions of the codebase, those developed with blood and sweat and tears, remain clear in my memory. Even months later I can predict how they will be impacted by other changes and what will need to be done.

That is not true of the portions which the LLM generated. Continuing the analogy of treating it as an early career developer: I only reviewed that code, I didn't write it. As with any code review, the memory of how the code works fades much more quickly than if one had actually dug in and done the work.

(This is still better than Claude Code, though, which retains no memory at all of how the code has evolved and instead rediscovers it afresh at the start of each session.)

Treating an LLM like an early career programming partner can provide large increases in productivity, but it also means that one has less personal recollection of the winding path the code took to get to its current state. One must be able to go spelunking. This isn't that much different from a codebase which one has worked on over a long period: little detailed memory of specific portions of the code remains, but an overall sense of the codebase is retained much longer.

Monday, August 25, 2025

Claude Code's 19 cent Parser

A brief prompt:

In authheader.go write a function to parse a SIP WWW-Authenticate header for Digest
authentication. It should return a map[string]string of key:value pairs which are
present. It should handle the case of valueless parameter with no "=" by populating
an empty string in the map.

Write unit tests, including these WWW-Authenticate headers:
1. WWW-Authenticate: Digest algorithm=MD5,realm="example.com",nonce="abcd="
2. WWW-Authenticate: Digest realm="example.com", nonce="efgh=", opaque="1234__", algorithm=MD5, qop="auth"

A classic hacker stock photo in a darkened room sitting in front of a laptop wearing a hoodie and mask, except the person typing is a robot

From this, Claude Code generated quite reasonable parsing code for a SIP WWW-Authenticate header. It did this in approximately one minute of wall-clock time at a cost of 19 cents, considerably faster and cheaper than I could have produced a similar function.

I made one manual fix: the string comparisons for "Digest" and for parameter field names are supposed to be case-insensitive. I fixed that and added unit tests for it. I hadn't specified this in the prompt, and Claude Code didn't figure it out from the mention of SIP.
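To make the discussion concrete, here is a sketch of the shape such a parser takes, with the case-insensitive comparisons applied. This is my illustration of the approach, not the code Claude Code produced:

package sip

import "strings"

// parseWWWAuthenticate parses the value of a WWW-Authenticate header for
// Digest authentication into a map of lowercased parameter names to values.
// Valueless parameters map to the empty string.
func parseWWWAuthenticate(header string) (map[string]string, bool) {
    scheme, rest, _ := strings.Cut(strings.TrimSpace(header), " ")
    if !strings.EqualFold(scheme, "Digest") { // scheme match is case-insensitive
        return nil, false
    }
    params := make(map[string]string)
    for _, part := range splitOutsideQuotes(rest, ',') {
        part = strings.TrimSpace(part)
        if part == "" {
            continue
        }
        key, value, found := strings.Cut(part, "=")
        key = strings.ToLower(strings.TrimSpace(key)) // field names are case-insensitive
        if !found {
            params[key] = "" // valueless parameter, no "="
            continue
        }
        params[key] = strings.Trim(strings.TrimSpace(value), `"`)
    }
    return params, true
}

// splitOutsideQuotes splits s on sep but ignores separators inside double
// quotes, so a comma within a quoted value survives intact.
func splitOutsideQuotes(s string, sep byte) []string {
    var parts []string
    start, inQuotes := 0, false
    for i := 0; i < len(s); i++ {
        switch {
        case s[i] == '"':
            inQuotes = !inQuotes
        case s[i] == sep && !inQuotes:
            parts = append(parts, s[start:i])
            start = i + 1
        }
    }
    return append(parts, s[start:])
}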

I remain of the opinion that vibe coding can be a force multiplier for expertise, not a complete replacement for expertise.


 

Wisdom

Returning to an earlier topic: does the code which Claude Code generated exhibit wisdom? Did it have shortcomings which would be harmful? Claude Code came up with the following test cases and wrote a Go table-driven test for them.

  1. The two I explicitly gave it.
  2. Header with valueless parameter
  3. Header with unquoted values
  4. Empty header
  5. Header with comma in quoted value
  6. Header with extra spaces

I looked into the handling of unquoted values. The SIP standard says that fields like algorithm or qop which are enumerated in specifications can be left unquoted. What Claude Code generated would allow any field to be unquoted, including arbitrary text strings like realm.

The spec says these values must be quoted. Yet there is also the Robustness Principle: be liberal in what you accept and strict in what you send.


 

Postel's Law Considered Harmful

Nowadays I think this principle has ultimately done more harm than good. Over time we end up with a protocol which is only partially specified, where real implementations require a never-ending accumulation of quirk handling to work around the behaviors of widely deployed yet incorrect implementations which others have liberally accepted. For new protocols I'm a fan of being strict in what you send and strict in what you accept, so that quirks never accumulate. Like barnacles, quirks slow forward progress over time and tend to cause standards to bog down and eventually stop even trying to evolve.

But SIP is ancient. In Internet years it is a centenarian. What should one do about SIP? Being strict in what one accepts would lead to a series of relaxations being added during deployment, when engineering philosophy meets the harsh reality that there are a lot of barely-compliant production services run by vendors far too large to care what some Internet Rando thinks of their implementation.


 

Epilogue

I did consider whether to just leave it this way and allow unquoted strings for all fields. Life is too short to fight the weight of Internet Protocol Inertia... but I couldn't do it. That would make my little corner of the SIP world part of the problem. I made it accept unquoted values only for algorithm and qop, the two enumerated fields which my system deals with, as sketched after the prompts below.

In authheader.go:parseWWWAuthenticate() fields named “algorithm” or “qop” may be
quoted or unquoted. Any other field name must have its value quoted to be accepted.

In authheader_test.go add test cases:
1. fields named “algorithm” or “qop” may be quoted or unquoted.
2. Any other field name must have its value quoted to be accepted.
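In the parser, the stricter rule boils down to a check like this sketch (my illustration, not the generated code):

// unquotedAllowed reports whether a Digest parameter value may appear
// unquoted: only the enumerated fields this system deals with qualify.
// Values for any other field are rejected unless quoted.
func unquotedAllowed(field string) bool {
    switch strings.ToLower(field) {
    case "algorithm", "qop":
        return true
    }
    return false
}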

Monday, August 18, 2025

Training Gemma3-270m for German Q-and-A

Google recently introduced Gemma3-270M, a smaller Gemma3 model with "only" 270 million parameters instead of billions.

The most interesting aspect of this model to me is that it is explicitly intended to run locally, without requiring highly specialized infrastructure: well within what is achievable outside of specialized datacenters. The potential to run the model air-gapped, isolated from the outside world, would be interesting for some future stuff I'm working on.

The eventual uses would involve communication in the German language, so I decided to see about adding training to answer questions in German specifically. I started from an existing Colab notebook which uses Gemma3-270M to predict chess moves. Chess as an application for LLMs isn't especially interesting to me personally, since we have better ways to use neural networks to play chess, but the training flow is the same.

We start by loading dependencies and instantiating the gemma-3-270m-it model.

%%capture
import os
if "COLAB_" not in "".join(os.environ.keys()):
    !pip install unsloth
else:
    # Do this only in Colab notebooks! Otherwise use pip install unsloth
    !pip install --no-deps bitsandbytes accelerate xformers==0.0.29.post3 peft
    !pip install --no-deps trl triton cut_cross_entropy unsloth_zoo
    !pip install sentencepiece protobuf "datasets>=3.4.1,<4.0.0" "huggingface_hub>=0.34.0" hf_transfer
    !pip install --no-deps unsloth


from unsloth import FastModel
import torch
max_seq_length = 2048
model, tokenizer = FastModel.from_pretrained(
    model_name = "unsloth/gemma-3-270m-it",
    max_seq_length = max_seq_length, # Choose any for long context!
    load_in_4bit = False,  # 4 bit quantization to reduce memory
    load_in_8bit = False, # [NEW!] A bit more accurate, uses 2x memory
    full_finetuning = False, # [NEW!] We have full finetuning now!
    # token = "hf_...", # use one if using gated models
)

We set it up to accept training data in a chat format using the Huggingface deepset/germanquad dataset, a curated set of training data drawn from the German Wikipedia and various academic sources.

model = FastModel.get_peft_model(
    model, r = 128,
    target_modules = ["q_proj", "k_proj", "v_proj", "o_proj",
                      "gate_proj", "up_proj", "down_proj",],
    lora_alpha = 128, lora_dropout = 0, bias = "none",
    use_gradient_checkpointing = "unsloth",
    random_state = 3407, # Seems pretty random
    use_rslora = False, loftq_config = None,
)

from unsloth.chat_templates import get_chat_template
tokenizer = get_chat_template(tokenizer, chat_template = "gemma3")

from datasets import load_dataset
dataset = load_dataset("deepset/germanquad", split = "train[:10000]")

def convert_to_chatml(example):
    return {
        "conversations": [
            {"role": "system", "content": example["context"]},
            {"role": "user", "content": example["question"]},
            {"role": "assistant", "content": example["answers"]["text"][0]}
        ]
    }
dataset = dataset.map(convert_to_chatml)

def formatting_prompts_func(examples):
   convos = examples["conversations"]
   texts = [tokenizer.apply_chat_template(convo,tokenize = False,
       add_generation_prompt = False).removeprefix('<bos>') for convo in convos]
   return { "text" : texts, }
dataset = dataset.map(formatting_prompts_func, batched = True)

from trl import SFTTrainer, SFTConfig
trainer = SFTTrainer(
    model = model, tokenizer = tokenizer,
    train_dataset = dataset, eval_dataset = None,
    args = SFTConfig(
        dataset_text_field = "text",
        per_device_train_batch_size = 8,
        gradient_accumulation_steps = 1,
        warmup_steps = 5, num_train_epochs = 1,
        max_steps = 100, learning_rate = 5e-5,
        logging_steps = 1, optim = "adamw_8bit",
        weight_decay = 0.01, lr_scheduler_type = "linear",
        seed = 3407, output_dir="outputs",
        report_to = "none",
    ),
)

from unsloth.chat_templates import train_on_responses_only
trainer = train_on_responses_only(
    trainer,
    instruction_part = "<start_of_turn>user\n",
    response_part = "<start_of_turn>model\n",
)

We then train the model. This took about three minutes on Google Colab using a Tesla T4 GPU.

trainer_stats = trainer.train()

Now, the real test: can it give good answers to questions not in its training data?

messages = [
    {'role': 'system','content': 'Bielefeld'},
    {"role" : 'user', 'content' : 'Gibt es Bielefeld?'}
]
text = tokenizer.apply_chat_template(
    messages,
    tokenize = False,
    add_generation_prompt = True, # Must add for generation
).removeprefix('<bos>')

from transformers import TextStreamer
_ = model.generate(
    **tokenizer(text, return_tensors = "pt").to("cuda"),
    max_new_tokens = 125,
    temperature = 1, top_p = 0.95, top_k = 64,
    streamer = TextStreamer(tokenizer, skip_prompt = True),
)

<bos><start_of_turn>user
Gibt es Bielefeld?
<end_of_turn>

<start_of_turn>model
Ja
<end_of_turn>

Indeed yes, it can!

If that interaction doesn't make much sense: it is a German joke, alleging that the city of Bielefeld doesn't actually exist. Wikipedia has an explanation in English.

The trained model says that Bielefeld does exist. Clearly it has no sense of humor.

Sunday, August 17, 2025

Iceland Carbfix Tour

In July 2025 we took a tour of the Geothermal Exhibition at Hellisheiðarvirkjun in Iceland, all about geothermal power. My spouse is a Professional Geologist, for whom this was an especially interesting tour.

We took a slightly more extensive version of the tour which included the CarbFix plant, a carbon capture and sequestration project where carbon dioxide is injected deep underground to mineralize.

Near the Carbfix injection site is the Climeworks Mammoth plant, a direct air carbon capture facility. We didn't get to go inside; we could only see it from a distance.

large steam pipes running over a low hill and across the field
Steam pipes from the geothermal vents back to the power plant.
steam pipes within the power plant
Steam pipes within the power plant.
turbine within the power plant
Turbine within the power plant.
Building with a very large number of fans to pull air through
Climeworks Direct Air Capture facility.
Carbfix piping driving H2S and CO2 deep underground
Carbfix H2S+CO2 pumping facility.
Carbfix H2S and CO2 meters
Carbfix H2S+CO2 meters.

Saturday, August 16, 2025

Survey of Germany-related blog posts

A gothic building with a huge animatronic clock

My spouse's German mother emigrated to the United States in 1958. Until 1975, German mothers did not pass on citizenship to children born in wedlock. My spouse was not born a German citizen for this reason. The modern state of Germany has decided that this gender-discriminatory policy was unconstitutional, and defined a declaration process called Staatsangehörigkeit § 5 (StAG5) by which descendants of such persons can declare their German citizenship.

Hand holding four German Reisepässe

Our journey in this area began in 2020 with genealogical research, then filing declarations of citizenship for my spouse and our children, and finally taking trips to Germany as new German citizens. I've written a number of blog posts on this topic, roughly categorized below.


 

German Genealogy




German Citizenship




Other Topics of Interest to Americans, Concerning Germany




Our European Experiences

Monday, August 11, 2025

High School German with UCScout On Demand

I've written about our journey to German citizenship for my wife and our children. Yet merely holding a German passport, being Passdeutsche, isn't our goal: we want our children to be able to function comfortably in Europe if they choose to do so at any point in their lives. That means learning to speak German conversationally, if not fluently.

UCScout logo

Two of our kids are in High School. My school in Missouri lo these many years ago offered French, Spanish, and German, but times and school funding levels were different back then. Our High School now offers Spanish, the most widely used language in California after English, but we'd prefer they use this time to learn German instead.

Last year we started taking an online German course from UCScout, which is run by the University of California. The UCScout On Demand courses are self-paced but have an instructor available to assist, grade assignments, and conduct sessions in German. The On Demand courses cost $399 per semester, are accredited high school courses, and meet California's A-G requirements.

We have opted out of the Spanish class offered at school and instead enrolled in the UCScout German course, for 10th and 11th grade so far. At the end of each two-semester course UCScout sends a report which our school incorporates into their regular transcript. There won't be a separate report card for the German classes when they apply for admission to college; it will all be part of their High School transcript.

A gothic building with a huge animatronic clock

This has worked out quite well for us. During the school year they use the hour which would otherwise have been the Spanish class to work on their German. They've also taken a class over each of the last two summers while we were in Germany. Being able to work on their own schedule lets them do the classwork in the evenings after we're done for the day.

If you choose to do something like this, start early. It took the entire first year of high school to get agreement that the kids would be allowed to drop Spanish and take German instead. It helped that the school had used UCScout during the pandemic to offer their Spanish course, they already had a way to incorporate the grades into their system.

Monday, August 4, 2025

Germany trip 7.2025

Reprising last year's trip, we spent another July in Europe this year.

One somewhat less pleasant aspect of last year's trip was the flights, particularly the return from Frankfurt to San Francisco where we spent 13 hours in the air. This year we broke up the time in the air:

  1. San Francisco -> Pittsburgh, to visit family
  2. Pittsburgh -> Iceland
  3. Iceland -> Munich, Germany
  4. Munich -> Potsdam, near Berlin
  5. Potsdam -> Hannover
  6. Hannover -> Hamburg
  7. Hamburg -> Reykjavik, Iceland
  8. Iceland -> New York City
  9. New York -> San Francisco

 

Pittsburgh

We mainly visited family in Pittsburgh, but saw a few sights like the Duquesne Incline.

Duquesne Incline in Pittsburgh, a funicular railway with a rail car climbing a steep slope
Duquesne Incline

 

Iceland Geothermal Exhibit

We rented a car in Iceland and went to the Geothermal Exhibit, all about geothermal power. My spouse is a Professional Geologist, for whom this was an especially interesting tour.

Turbines and steam pipes
Geothermal power plant at Hellisheiðarvirkjun
Carbon capture system

 

Munich

We spent four days in Munich; a highlight was watching a performance of the Glockenspiel.

A gothic building with a huge animatronic clock
Munich Rathaus Glockenspiel

 

Potsdam

Last year we stayed in Berlin and didn't find time to make it down to Potsdam, though we wanted to. So this year we spent four days in Potsdam. We toured Sanssouci Palace.

Sanssouci Palace

 

Hannover

My wife's family is from Hannover; we visit each time we are there. This year we went to the Maschsee and the Herrenhäuser Garten.

Panoramic shot of a lake
Maschsee
Statue of a man lying in the lap of a woman
Herrenhäuser Garten

 

Hamburg

We loved Hamburg. Hamburg and Potsdam were our favorite cities on this trip, mainly because of the water. It reminded us of the San Francisco Bay.

Miniature replica of a city
Miniatur Wunderland

 

Reykjavik

We went back to Iceland on the return trip, staying in Reykjavik. We visited the Hallgrímskirkja church.

Hallgrímskirkja

 

New York City

I've been to New York a number of times but the rest of the family had not been, so this was a special treat. We took tours of the United Nations and of the Empire State Building.

View of a large room with two concentric seating areas, the UN Security Council chamber
United Nations
View of Manhattan from high above
View from the Empire State Building

Wednesday, July 30, 2025

Personal View of NYC Congestion Pricing

In the 2010s I managed an engineering organization with teams in California and New York. I travelled to NYC a number of times, typically staying near Chelsea Market.

The Maritime on 16th Street was my usual lodging, next to Google's NYC office and with the 14th Street subway station nearby. I recall the blaring of car horns being ever-present, continuing late into the night.

We brought the whole family to New York City in July, the first time I have been there in almost 10 years. We stayed in Manhattan in the Financial District, and went for pizza very near the Google building. The streets were very clear, nowhere near the level of traffic I remember.

view from high above the streets of Manhattan, with almost no cars visible driving on the roads
(view from the Empire State Building)
ground level view of an empty intersection in New York City
(on the way to pizza)

In January 2025, New York City implemented a congestion pricing mechanism, increasing tolls for cars entering the city. It had an almost immediate impact in reducing traffic.

The Federal government, always eager to increase fossil fuel consumption, has revoked the needed authorizations and demanded that NYC end the congestion pricing mechanism. The two parties will present their arguments in court in October 2025.

I hope congestion pricing stays. The city is better for having it in place.

Friday, July 18, 2025

Tello Android settings for VoWifi

Tello is a T-Mobile MVNO in the US which offers good support for Voice-over-Wifi, whereby voice and SMS can be sent over an Internet connection while overseas, without requiring expensive roaming minutes. We were able to use VoWifi successfully on our recent trip to Europe:

  • SMS text messages arrived
  • SMS messages we sent were delivered
  • incoming calls rang the phone
  • outgoing calls worked, carrying our usual US number as the caller ID

The most important setting we needed to enable was "Automatic data switching." My Tello plan includes no roaming minutes at all, so I had installed a travel eSIM from Roamless.

Friday, July 11, 2025

Vibe Coding and Wisdom

I started experimenting with Claude Code a while ago. I am not the first person to make this observation, but thinking of Claude Code as an early career developer, one whom you are mentoring and need to guide to a solution, is a good mental model. It is pretty impressive in what it can do.

Asking it to produce something the size of what one would want to see in a code review from an early career developer produces good results, far faster than I could write myself, at a cost of a few cents.

For example:

Add a command line utility written in Go in cmd/adduser. It takes command line arguments for email, phone, imsi, realm, remsim, gpp_hostname, ns, slack_app_token, slack_channel_id, and smtp_list. It encrypts the slack_app_token, slack_channel_id, and smtp_list using the code in internal/db/encryption.go. It opens a connection to the database using the code in internal/db/server.go, and adds a new row to the accounts table using the command line arguments it was given.

This resulted in a quite functional command line program which did what I asked.

func main() {
        var (
                email          = flag.String("email", "", "Email address (required)")
                phone          = flag.String("phone", "", "Phone number (required)")
                imsi           = flag.String("imsi", "", "IMSI (required)")
                realm          = flag.String("realm", "", "Realm (required)")
                remsim         = flag.String("remsim", "", "Remsim (required)")
                gppHostname    = flag.String("gpp_hostname", "", "GPP hostname (required)")
                ns             = flag.String("ns", "", "NS (required)")
                slackAppToken  = flag.String("slack_app_token", "", "Slack app token")
                slackChannelID = flag.String("slack_channel_id", "", "Slack channel ID")
                smtpList       = flag.String("smtp_list", "", "SMTP list")
        )
        //...omit the rest but it was straightforward code...

 

Unit Testing

Claude Code doesn't seem to produce unit tests as a regular part of its development... like some engineers I've worked with I suppose. However it can be prompted to do so and produces a reasonable result.

In cmd/forwarding create a unit test which starts a server on localhost and uses an http client to connect and POST JSON with input arguments.
 
In cmd/user-management create a unit test which starts a server on localhost and uses an http client to fetch /api/account/<uuid>/db

Claude Code produced a quite reasonable table-driven Go test, including creating an in-memory sqlite3 database to mock the Postgres database.

+       testCases := []struct {
+               name           string
+               uuid           string
+               requestBody    ForwardRequest
+               expectedStatus int
+               setupDB        func(*sql.DB, string)
+       }{
+               {
+                       name: "successful forward",
+                       uuid: testUUID,
+                       requestBody: ForwardRequest{
+                               Sender:   "test@example.com",
+                               Received: "2023-12-01T10:00:00Z",
+                               Content:  "test message",
+                       },
+                       expectedStatus: http.StatusOK,
+                       setupDB: func(database *sql.DB, uuid string) {
+                               slackToken, _ := db.Encrypt("test-slack-token")
+                               slackChannel, _ := db.Encrypt("test-channel")
+                               smtpList, _ := db.Encrypt("test@smtp.com")
+
+                               query := `INSERT INTO accounts (id, account_number, email, phone_number,
+                                         imsi, realm, remsim, gpp_hostname, ns, slack_app_token,
+                                         slack_channel_id, smtp_list)
+                                                 VALUES (?, 1001, 'test@example.com', '+1234567890',
+                                                         '123456789012345', 'test.realm', 'test-remsim',
+                                                         'test-gpp', 'test-ns', ?, ?, ?)`
+                               database.Exec(query, uuid, slackToken, slackChannel, smtpList)
+                       },
+               },

Unwise Approaches

Attempting to get Claude Code to generate larger systems with less guidance didn't go as well. What it produced was functional and quite impressive, but unwise.

I wanted to try out an admin panel with a login link sent to an email address. Not suitable for all environments, but sufficient for many services which rely on the user's email.

In the admin sub-directory, generate a web user interface for an admin console using typescript and react, with a backend server written in Go.
 
The login screen has a text box to enter an email address. When the Submit button is pressed, the backend server should generate a 128 bit random string and use os.exec to run an email.sh process. The backend server should redirect the user to an interstitial page which says "Please click the login link sent to <email address>."
 
Once logged in, the main page has ...


Claude Code generated a quite functional admin console. One could submit an email address and it would fork the script to send email. It maintained a map of pending login tokens in the Go backend. When one clicked the link in the email, the backend would respond with ok if it found that token in its active table, otherwise failure. Quite exhilarating to see all of that working within a couple minutes of starting on it.

However this means the client code, itself, was deciding the success or failure of the login. If it got an ok from the backend, it would proceed to the URL for the admin panel. The backend would serve up whatever it was asked for; there was no enforcement in the backend.

Anyone capable of understanding the client JavaScript could figure out the URL of the admin panel for any user. The login link only provided the illusion of protection. It was trivial to bypass.
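What was missing is server-side enforcement: something like this sketch, where the backend checks a session before serving any admin route. The sessions map and cookie name are assumptions for illustration, not what the generated code contained:

// requireSession wraps admin routes so that the backend, not the client,
// decides whether a request is authenticated.
func requireSession(sessions map[string]bool, next http.Handler) http.Handler {
    return http.HandlerFunc(func(w http.ResponseWriter, r *http.Request) {
        cookie, err := r.Cookie("admin_session") // hypothetical cookie name
        if err != nil || !sessions[cookie.Value] {
            http.Error(w, "unauthorized", http.StatusUnauthorized)
            return
        }
        next.ServeHTTP(w, r)
    })
}

Every admin handler would be registered behind it, e.g. mux.Handle("/admin/", requireSession(sessions, adminHandler)), so that knowing the admin URL is worthless without a valid session.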

One can observe that Claude Code generated exactly what I told it to, which is a fair observation. One might also observe that Claude Code just regurgitates its training set, meaning that human developers have done similar things in large numbers. This is also a fair observation.

Nonetheless it reinforces that vibe coding is best used as a multiplier, not a substitute, for actual expertise.

Friday, July 4, 2025

Your Parent Did Not Give Up German Citizenship at 18

map of Germany

There have been a large number of US troops stationed in Germany for decades, since the end of World War II. As happens in these circumstances, a fair number of US servicepeople have started families with spouses who moved with them from the United States, with children born while stationed in Germany.

Some things which are commonly believed amongst US military families who have been stationed in Germany:

  • Children born to US servicepeople on German soil will be dual citizens of the US and Germany.
  • At the age of 18 or 21 or 23, those dual citizens will have to choose which citizenship they will keep and forfeit the other.

Unfortunately neither of these is true. German citizenship is not like US citizenship: being born on German soil does not make one a German citizen. One is German if one's parent is German, or if one naturalizes. So a child born to two US citizens stationed in Germany is not German. If the US serviceperson marries a German, then any children could be dual citizens.


 

Certificate of Citizenship

This story is reinforced because children born in Germany will have either a German birth certificate called a Geburtsurkunde or, less often, paperwork from the US military hospital where they were born. Neither of these is acceptable as proof of US citizenship, which the child needs when they return to the US.

It is quite common for parents to order a Certificate of Citizenship for their children, documenting that the child is a US citizen. This often happens at age 18, when the child registers to vote or finds a job which requires them to prove their right to work. The Certificate of Citizenship contains language forswearing other allegiances, which reinforces the belief that the child had to choose one citizenship or the other at age 18.

In reality the presence of that language on the Certificate of Citizenship has no effect: other countries do not recognize the US document as binding upon their own citizenship practices. If one actually was born a dual German and US citizen, the issuance of a US Certificate of Citizenship has no impact on the German citizenship. They remain a German citizen.


 

Impact

The impact of these misconceptions works in both directions:

  1. People who mistakenly believe they are German citizens, or were German citizens, and try to get that citizenship back.
  2. Perhaps more tragically, people with a German parent who believe they forfeited their German citizenship at 18 or 21 or 23 and never pursued it further, when in reality they remained citizens throughout their lives. They could have made different choices had they known.

If you wonder whether you are in this situation, Reddit's /r/GermanCitizenship can help you figure it out. I spend time on that subreddit as well, helping people understand the declaration processes which we navigated.

Tuesday, June 24, 2025

Paragon mechanical timer 4004-71M vs 4004-71

Our pool pump uses an Intermatic timer which stopped working a few weeks ago. The mechanical timer mechanism is labelled as a Paragon Electric 4004-71M. After scouring eBay for a few weeks with no 4004-71M timers appearing but several 4004-71 models... I bought one, hoping it would fit.

It looks almost identical and mechanically does fit into the housing, with mounting tabs in the right places. However the original 4004-71M has a through hole at the bottom where a mounting screw secures it to the metal box. The 4004-71 has a smaller diameter hole which doesn't go all the way through, intended only to hold a cover over the wiring.

Paragon timer 4004-71M has a through hole for a mounting screw

As my father would surely have said in this situation: "You can have a through hole anywhere, if you want it badly enough."

The timer chassis is Bakelite, which drills cleanly if one takes it slowly. A few minutes of drilling resulted in a hole suitable to mount the 4004-71 into the housing which originally held the 4004-71M.

Adding to the amassed knowledge of the Internet: the 4004-71 is not a direct replacement for the 4004-71M, but can be modified to work.

Tuesday, April 29, 2025

HomeAssistant Voice Preview Edition poweron

I powered on two HomeAssistant Voice Preview Edition devices, trying to replace our use of Google Home. The setup is self-hosted in a HomeAssistant VM running on a quite old Dell T320 server under Proxmox, with a Xeon E5-2450 v2 (8 cores, 20MB cache, 2.5 GHz). The HA-OS VM gets two of those cores.

Pros:

  • has an announce function, one of the most common things we use Google Home for. Yes, we use Google Home primarily as an overly complicated intercom.
  • entirely self-hosted, voice doesn't leave the home

Cons: haven't yet figured out the other common things we use Google Home for.

  • set timer for N minutes
  • play music from YouTube
  • recurring alarms every weekday/Thursdays/etc

Monday, April 28, 2025

National Climate Assessment team disbanded

A colored band with blue on the left and gradually shifting to red across to the right, with a sudden vertical bar of very dark red on the extreme right.
By Ed Hawkins, climate scientist.
CarlinMack created this version.
Three weeks ago contracts for the National Climate Assessment were defunded and work stopped.

Today the 400 people working on it were disbanded.

Production of the report is funded and mandated by law. Presumably in 2028, AI will write something.

Thursday, April 24, 2025

Finding a Role in Climate

Climate Week is drawing to an end, not yet done but one can see the close approaching.

I have spent a bit over a year now on my own, doing some consulting work while looking for longer-term opportunities but also taking downtime away from the industry. I’m very motivated to work on climate, building on earlier efforts:

  • two years as a Senior Fellow at Project Drawdown
  • several years coaching climate community members starting their careers
  • Cohort 5 of the ClimateBase Fellowship
  • all of that coming after several decades in the Tech industry, at three startups (Dominet Systems, ConSentry Networks, Tailscale Inc) and two large companies (Sun Microsystems, Google). I held roles from ASIC designer to software manager to VP of Engineering.

I’m starting to focus again on finding the right long term opportunity, not just consulting. What I’d request of those whom I’ve worked with or had the pleasure to meet along the way is introductions at the right stage, for roles with:

  • a focus on climate as the primary mission. Energy is the best match for my skillset, but I believe that land use and agricultural tech need more effort and I have relevant experience with satellite imagery.
  • a position which is substantially leadership, from Director at a large company to Founder / Founding Engineer or VP at an earlier stage. I can help hire, evolve organizations, and build a product.
  • a technical component which is not zero. Managers should manage, but I believe managers who completely lose touch with the reality of the engineering work become less effective as leaders. I would seek an opportunity where there would be suitable opportunities to contribute technically, and believe it is important that the team see those contributions.
  • an organization with a European connection. We enjoy Europe, have travelled in Germany several times, and have substantial family connections there.

These sorts of opportunities are mostly not posted publicly. I have responded to a few public postings over the past year; that is not an effective way to proceed. I'd ask for warm introductions to opportunities you may be aware of, early in the process, perhaps when founders are talking about a new venture or considering a new project which needs leadership.

Thank you so much for any connections you can provide.

Tuesday, April 22, 2025

SF Climate Week Opening Keynote

As I did last year, I took the train to get to SF Climate Week. In this area that means taking Caltrain up the Peninsula before switching to the Bay Area Rapid Transit (BART) to the Embarcadero, then walking to Climate Week at the Exploratorium.

Both of those train systems have been substantially improved since last year:

  • Caltrain completed a years-long electrification project, replacing all of the diesel trains.
  • BART finished deployment of a new generation of cars, retiring all of the 25 year old rolling stock.

From this one might infer a renaissance of mass transit deployment in urban areas... but one would be wrong. Indeed, in nearly every area of climate action where the Inflation Reduction Act had spurred progress, the new administration of the last three months has attempted to roll it all back.




Former Vice President Al Gore presented the opening keynote speech, fiery and powerful.

Monday, April 21, 2025

SF Climate Week 2025

This is SF Climate Week! The opening keynote with former Vice President Al Gore, long-time Speaker of the House Nancy Pelosi, & SF Mayor Daniel Lurie is this afternoon at The Exploratorium in San Francisco.

San Francisco Climate Week in green on a black background

I'll be in SF this week as a volunteer helping keep things running, hope to see you there.

Wednesday, April 9, 2025

German Mothers and the Year 2031

Until 1975, German mothers did not pass on citizenship to children born in wedlock; only German fathers did. To address this historic gender discrimination in citizenship practices, Germany has defined a declaration process called Staatsangehörigkeit § 5. I wrote about our experience with this process, which we completed in 2023.

In the 20th century several million Germans emigrated to the United States. Staatsangehörigkeit § 5 is applicable to a very large number of their descendants today. From a post on r/GermanCitizenship about an April 2025 visit to the German Consulate:

The caseload has increased exponentially in the past 4 months. He said that aside from all the appointments each day, they get between 80 and 90 inquiries a day in the Chicago office alone.

Hand holding four German Reisepässe

The Staatsangehörigkeit § 5 process will be open for ten years. Having started in August 2021, declarations will be accepted until August 2031. The current wait time in the processing queue is about 2.5 years, and is likely to grow with the number of Americans now applying.

If you were born to a German mother prior to 1975 and a declaration of German citizenship is something you'd consider doing, I'd advise starting on it soon. Applications received by 8/2031 should all be processed, but the queue is likely to be years long.

Tuesday, April 8, 2025

Coal Mining Policies

New coal policies are invariably announced in front of a group of workers wearing hard hats with lights affixed, and often in Pennsylvania for good measure. One might assume the mining profession is a huge economic force and under constant threat which must be fended off to preserve families and livelihoods.

As a profession, coal mining employs about 40,000 people in the US.

Graph from the Federal Reserve Bank of St. Louis showing employment in coal mining over time, which started at 177,800 in 1985 and declined to about 40,000 by the year 2020. Employment has been relatively flat at 40,000 since the start of the COVID-19 pandemic in March 2020.

Source: FRED (Federal Reserve Economic Data).




Construction Management requires similar levels of education and experience and, according to employment statistics, enjoys a similar pay scale. There are 10x to 20x more Construction Managers in the US.

Graph from the Federal Reserve Bank of St. Louis showing employment in construction management over time, which started at 335,000 in 2000 and had grown to 785,000 by 2024.

Source: FRED (Federal Reserve Economic Data).




Coal policies are not driven by concern for workers. Coal policies are driven by concern for fossil fuel profits, which have only been made possible by externalizing the cost of the damage to human health and acceleration of global warming.

Sunday, April 6, 2025

RSS Feed Likely to Break

The FeedBurner logo, a stylized flame with a yellow upward facing crescent moon center surrounded by dull red flames, perched on a circular blue floor.

Over a decade ago I configured this Google Blogger site to use FeedBurner. This blog never generated ad revenue and I turned ad insertion off, but left the feed still going through FeedBurner.

I'm making progress in moving the blog off of Google Blogger. I am actively trying to reduce my use of big tech companies, limiting them to easily-replaced commodified services wherever possible. I have a Jekyll site working locally, with all existing posts and images imported. I expect to serve the generated static site from somewhere like GitHub Pages or Cloudflare Pages so as to not operate a public-facing site myself, but retain the content and publishing infrastructure locally. The static hosting can be moved easily.

However: I expect the RSS feed will break, with a discontiguous update making it look like more than 400 posts have suddenly been published. The Jekyll site will not generate an identical feed to Google Blogger's. I also don't intend to use FeedBurner with the new site, as Google began shuttering the service several years ago.

Looking at the feed today, it is three posts behind. I don't know why, but I guess I'm heartened that it is not more. I'm posting this now in hopes that it will be published to any remaining subscribers of the RSS feed before the changeover happens.

Friday, April 4, 2025

Farewell, Google Charts API

Nearly 14 years ago I wrote a joke post about the Holtzman shields from Frank Herbert's Dune, complete with impressive-looking but nonsense equations like this one:

LaTeX T = \frac{(0.09\frac{m}{sec})^2(0.0289644\frac{kg}{mol})}{(3)(8.3145\frac{m^2\cdot kg}{sec^2\cdot mol\cdot K})}

That equation was created using LaTeX:

T = \frac{(0.09\frac{m}{sec})^2(0.0289644\frac{kg}{mol})}{(3)(8.3145\frac{m^2\cdot kg}{sec^2\cdot mol\cdot K})}

 

At the time the post was written in 2011, Google offered a Charts API which would accept URL-encoded LaTeX and render it on the fly. The original posting from back then just embedded the Charts API URL as the source for the image, confident that Google would supply a suitable PNG:

https://chart.googleapis.com/chart?chs=239x83&cht=tx&chl=%0AT%20%3D%20%5Cfrac%7B(0.09%5Cfrac%7Bm%7D%7Bsec%7D)%5E2(0.0289644%5Cfrac%7Bkg%7D%7Bmol%7D)%7D%7B(3)(8.3145%5Cfrac%7Bm%5E2%5Ccdot%20kg%7D%7Bsec%5E2%5Ccdot%20mol%5Ccdot%20K%7D)%7D%0A

One can see the LaTeX code in the `chl` parameter.


 

The joke post turned into a joke on me: Google announced the deprecation of the Charts API the following year, and turned it off altogether in 2019. My post from 2011 has been broken for almost 6 years, without me knowing.

I am currently endeavoring to reduce my use of Big Tech services, turning to alternatives over which I have more control. Importing that 2011 post into Jekyll repeatedly failed because the image link was broken. I was able to recover the original LaTeX from the URLs and fix the old post by generating PNGs myself.

I think this reinforces the desire to not depend upon Big Tech. Google kills services every day, especially ones like the Charts API which didn't have their own monetization path.

Wednesday, April 2, 2025

Preparing for Offsite Backup

Apple Time Capsule, a thin white device with rounded corners and a single power light on the right side.

For many years, too many years, my family computer backup plan was an aging Apple Airport Time Capsule paired with the fervent hope that nothing would ever fail. That worked pretty well in that we haven't lost anything important, but Backup Theater is honestly worse than just admitting there is no real backup.

Last year I decided that Adulting should include ensuring that family data remains safe and the kids don't lose schoolwork, or the custom Doom WADs they've developed, or what have you. The Adulting Plan for Backups consists of:

  • Android and iOS devices should be backed up somewhere outside of the home.
  • Windows and macOS laptops should be backed up somewhere outside of the home.
  • Proxmox VMs and LXCs should be backed up somewhere outside of the home.

Repetitive and boring, perhaps, but that is how a backup plan should be: replicated and safe.


 

Android and iOS

The mobile devices were simplest: they already backed themselves up, Android to Google Drive and iOS to iCloud. Downloading all iCloud photos to immich allowed us to drop to a less expensive iCloud+ storage plan while still using it for device backups.

One downside of using the mechanisms which Google and Apple provide is that the backups are not encrypted against outside access: Google and Apple can read the contents of the device backups. I hope to come back and re-examine these backup plans in the future with something we have more control over.


 

Windows and macOS

After some searching, we paid for Arq Backup Premium, which provides one license for each of our five laptops. Each laptop is configured to back itself up twice:

  1. To the cloud storage which Arq Premium provides.
  2. Using SFTP over Tailscale to the fileserver within our home.

The backup files for all of the laptops together come to a bit over 800GB, nicely fitting within the 1TB of Google Cloud storage from Arq Premium. The backups are encrypted using a key which only we have, neither Arq nor Google can read the contents.


 

Proxmox

The Proxmox server within the home has 10 terabytes of ZFS storage. It provides the SFTP backup which the laptops are configured to reach via Tailscale, and it backs up its own VMs and LXCs to ZFS using vzdump. I'm working on offsite replication for this and might post again when that is done.

Monday, March 31, 2025

ZFS Spooky Failure at a Distance

I use Proxmox with a ZFS array to run a number of self-hosted services. I have been working on setting up zrepl for offsite backup, replicating encrypted ZFS datasets which the remote system will be able to store but not decrypt.


 

While working through all of this, the new 28TB disk intended for the remote system appears to have failed.

root@zfsremote:~# zpool status
  pool: pool1
 state: DEGRADED
status: One or more devices has experienced an unrecoverable error.  An
        attempt was made to correct the error.  Applications are unaffected.
action: Determine if the device needs to be replaced, and clear the errors
        using 'zpool clear' or replace the device with 'zpool replace'.
   see: https://openzfs.github.io/openzfs-docs/msg/ZFS-8000-9P
config:

        NAME        STATE     READ WRITE CKSUM
        pool1       DEGRADED     0     0     0
          sdb       DEGRADED     0    35     0  too many errors

 

Indeed, there are kernel messages about disk errors:

Mar 31 07:16:33 zfsremote kernel: I/O error, dev sdb, sector 23368396833 ...
Mar 31 07:16:33 zfsremote kernel: I/O error, dev sdb, sector 23368399137 ...
Mar 31 07:16:33 zfsremote kernel: I/O error, dev sdb, sector 23368397089 ...
Mar 31 07:16:33 zfsremote kernel: I/O error, dev sdb, sector 23368401697 ...
Mar 31 07:16:33 zfsremote kernel: I/O error, dev sdb, sector 23368401441 ...
Mar 31 07:16:33 zfsremote kernel: I/O error, dev sdb, sector 23368399393 ...
Mar 31 07:16:33 zfsremote kernel: I/O error, dev sdb, sector 23368402721 ...
Mar 31 07:16:33 zfsremote kernel: I/O error, dev sdb, sector 23368402465 ...
Mar 31 07:16:33 zfsremote kernel: I/O error, dev sdb, sector 23368402209 ...
Mar 31 07:16:34 zfsremote kernel: I/O error, dev sdb, sector 23368401953 ...

 

It seems odd, though. I had run `badblocks` destructive tests for weeks before moving on to creating the ZFS pool. After all that, it would choose this moment to begin uncorrectable failures?

Quite suspiciously, 07:16:33 is also the very instant when I sent a kill signal to a vzdump process running on the Proxmox host.

116: 2025-03-31 07:14:31 INFO:  29% (7.4 TiB of 25.5 TiB) in 9h 37m 2s
116: 2025-03-31 07:16:33 ERROR: interrupted by signal
116: 2025-03-31 07:16:33 INFO: aborting backup job

As I now know, trying to kill vzdump with a signal is not the right thing to do. `vzdump -stop` is the right way to interrupt it.

The OpenZFS docs say: "the following cases will all produce errors that do not indicate potential device failure: 1) A network attached device lost connectivity but has now recovered"

So far as I can tell, this is the explanation for this failure. Me sending a signal to vzdump interrupted the stream of ZFS operations, which manifested as a failed array on the other end. I have to say that I'm not fond of array failure as the way to report network errors. I've cleared the failure using `zpool clear` and will hope that zrepl will sort out bringing the two ZFS filesystems back into sync.

I plan to give it a day, then restore the remote dataset and check whether the file contents are sensible. The remote system does not, and will never, have the encryption key to be able to check the contents of the datasets it holds. I'll have to transfer them back to be able to access them.

Saturday, March 29, 2025

Stadtarchiv Hannover bis 2026 geschlossen

I received a Sterbeurkunde (death certificate) from Stadtarchiv Hannover on 28 March 2025, with the following note in the email signature:

Von März bis Jahrsende 2025 verlagert das Stadtarchiv seinen Standort in das neue Sammlungszentrum an der Vahrenwalder Straße 321. Der Lesesaal ist geschlossen, die Bearbeitung von Anfragen eingestellt.

Bei der Erreichbarkeit unserer Kolleg*innen und unseres Funktionspostfachs stadtarchiv@hannover-stadt.de kann es zeitweilig zu Verzögerungen kommen. Wir bitten um Verständnis und freuen uns, Ihnen voraussichtlich ab Jahresbeginn 2026 am neuen Standort wieder im vollen Umfang zur Verfügung zu stehen.

Bitte beachten Sie hierzu auch die Informationen auf unserer Homepage unter www.stadtarchiv-hannover.de.


From March until the end of 2025, the city archive is relocating to the new collection center at Vahrenwalder Straße 321. The reading room is closed and the processing of inquiries is suspended.

There may be temporary delays in reaching our colleagues and our functional mailbox stadtarchiv@hannover-stadt.de. We ask for your understanding and look forward to being fully available to you again at the new location from the beginning of 2026.

Please also refer to the information on our homepage at www.stadtarchiv-hannover.de.

In 7/2023 a request to Stadtarchiv Hannover would usually be answered within a week, but then something happened. Since 2024, response times have been 6-8 weeks. A post on their website mentioned a challenging staffing situation. I'm hopeful that in the long term, moving to a larger facility will help.

Imagery from the indexes was added to Arcinsys last year; those should still be available in the interim.


 

Update 8/2025: I sent a request for a Sammelakte, a marriage file, to the Hannover Stadtarchiv in 8/2025 and received a response that the archive is indeed closed for relocation for the rest of the year.

leider müssen wir Ihnen mitteilen, dass das Stadtarchiv Hannover seine Serviceangebote wegen des ab März 2025 stattfindenden Archivumzugs eingestellt hat. Unsere Bestände werden verpackt und sind nicht benutzbar. Der Lesesaal ist geschlossen und öffnet erst wieder zum Jahresbeginn 2026 im neuen Sammlungszentrum an der Vahrenwalder Straße 321 (Haltestelle Wiesenau).


Unfortunately, we have to inform you that the Hanover City Archives has suspended its services due to the archive relocation taking place from March 2025. Our holdings are being packed up and are not available for use. The reading room is closed and will not reopen until the beginning of 2026 in the new collection center at Vahrenwalder Straße 321 (Wiesenau stop).

Thursday, March 27, 2025

MacBook Air M1 USB-C Port Replacement

My MacBook Air M1 was gradually developing some kind of impairment in its USB-C ports where I'd have to jiggle or put actual pressure on a USB-C cable to get it to be recognized — and since it has no MagSafe port for charging, this meant it would switch to and from battery as its charging cable periodically lost contact.

Searching turned up people reporting similar issues, especially that the rear port starts having problems first and eventually the front port does as well. There wasn't a consensus solution, but a replacement USB-C board from iFixit came up several times. For only $20, I ordered one.

Innards of a Macbook Air M1, with the old USB-C board off to the side and the new board installed

The original USB-C board is off to the right side in this picture. One can see some corrosion and dirt, and also a bit of blackening on what I assume is a power pin. I believe that carbon buildup is likely the primary issue. I'll scrub it off with some alcohol on a cloth and put the board away for the future; it would probably work again if needed.

Wednesday, March 26, 2025

Venmo Public Transactions

Venmo pushes hard for transaction activity to be Public. It doesn't say whether any past payments were actually public, and puts up an interstitial to confirm a change to Private.

This selection does have a benefit for the user, making it more straightforward for friends to find each other and to make payment arrangements. However the choice has a larger impact on Venmo's user growth, and comes with downsides for users, like making public activities which they assumed were not.

Venmo Privacy settings page with options for Public, Friends, and Private. The current selection is Private. Below are buttons to change past transactions to Friends or to Private. Venmo confirmation to really change past transactions to Private?

Presumably Venmo has data on how much of a network effect they get from having payment information be Public, drawing in friends and family and acquaintances and randos. Venmo appears to allow this data to impact their UI design to steer users toward the choice most beneficial to the company.