Book notes: We Just Build Hammers - Coraline Ada Ehmke
In “We Just Build Hammers”, Coraline Ada Ehmke explores historical parallels to moral issues in technology and software development. Drawing on fictional, historical, and scientific sources, the book moves in four parts toward the present day.

I start off with a summary of the book, followed by a short reflection.
Summary
In the first part, H.G. Wells’ writings set the scene. Wells has an almost scientific approach to his writing. Ehmke writes:
Wells’s future histories shined a light on what was possible and gave people permission to imagine alternate futures. But he insisted on the need for rational approaches for bringing these futures about or averting their catastrophe.
His literature is compared to events that were developing in that same era. Leo Szilard is quoted saying:
In so far as the present discoveries in physics are concerned, the forecast of the writers may prove to be more accurate than the forecast of the scientists.
Szilard was the one who convinced the pacifist Einstein to urge Roosevelt to start researching what would become the atomic bomb. But while the research was intended to build knowledge and/or serve as a deterrent, the scientists were unable to prevent the bomb from being deployed.
The second part is set in the early years of commercial computing, and the fiction this time is The Parable of the Locksmith by Neil Macdonald, published in Computers and Automation in 1958. A locksmith is asked by a stranger to open a safe, and is offered anything he desires in return. While unsure about the contents of the safe, the locksmith reasons: if I don’t take the job, the second-best locksmith will. The locksmith becomes exceedingly rich, but not much later the stranger shows up as ruler of the world: using the drawings from the safe, he has built the best weapon in the world, capable of instantly destroying any living person.
Referencing the Nuremberg trials, Macdonald concludes the locksmith didn’t act ethically and was complicit in war crimes.
The magazine Computers and Automation was central to debates around ethics in computing at that time. Edmund Berkeley, editor of the magazine and one of the cofounders of the ACM, used it to publish the outcomes of the ACM Committee on the Social Responsibilities of Computer People, highlighting the need for ethical consideration of a technology as powerful as computers. But as time moved on and politics in the US changed rapidly, there was little room left for ethics in computing, nor for Berkeley, who was hounded as guilty by association during the anti-communist years.
In the third part Ehmke explores hacker culture. It is introduced using Neal Stephenson’s Snow Crash (1992), known as the book that introduced the term metaverse and promoted the idea of avatars in virtual worlds. While discussing the book, Ehmke introduces us to hacker “ethics” such as meritocracy (even though the merits in question often seem arbitrary), and GPL guru Richard Stallman’s four freedoms:
the source code of the software must be inspectable and changeable, copyable, and distributable, and modified or extended versions of the software must be allowed to be made available. But the primary condition, Freedom Zero, insisted that the software be free to use for any purpose, without any restriction
Another influential hacker in the open source world has been Eric S. Raymond, author of The Cathedral and the Bazaar (http only), in which he promotes an open, iterative, and collaborative style of development over a big-design-up-front style. It was during these times that the engine underpinning the popular game Doom was open sourced, as was Netscape (which became Mozilla, which became Firefox). In these same years the open source operating system Linux gained more and more traction, becoming a threat to Microsoft’s dominance.
But this ethic centred on unrestricted freedom neglected accountability, and Ehmke points out: “This moral indifference toward broader outcomes and social impact doesn’t meet Berkeley’s ethical criteria”. Developers hid behind fallacies such as the inevitability argument, the hammer argument (‘tech is neutral’), or treating Stallman’s freedoms “as fundamental human rights”.
Worse, open source has since been captured by corporations, and hacker culture by capitalist ideas about 60-hour work weeks, working through the night, and not questioning the nature or impact of one’s work, taking pride in this Protestant work ethic.
In the fourth and last part, Ehmke explores more recent times. Central to this chapter is the book Trouble on Triton by Samuel R. Delany (1976). With increased attention to feminism, diversity, and social justice came a rise in regressive actions. DEI efforts are cancelled, and Ehmke reminds us of alt-right uproars such as Gamergate, which doxed women and transgender technologists. And Raymond, of Bazaar fame, called Social Justice Warriors a “problem” in open source.
But things did improve slowly. Conferences adopted codes of conduct to address harassment and other unsafe behaviour, and the Contributor Covenant, which was drafted by Ehmke, was increasingly adopted by open source projects. But not without cost to the warriors, Ehmke included.
But that didn’t mean the end of the culture wars. We saw #NoTechForICE, walkouts, and the Hippocratic License (again originally drafted by Ehmke, and again attacked by Raymond). Many corporations resisted actual change or implemented DEI measures superficially, but others embodied it: Ehmke credits Mitchell Baker’s leadership at Mozilla in writing the “Pledge for a Healthy Internet”, highlighting their commitment to being for everyone, for civil discourse, and for the common good.
So where are we now? Where only 1.1% of FLOSS contributors were female in 2002, statistics now put the figure between 5% and 10%. Not enough, but better.
Reflection
Coraline Ada Ehmke’s book is an interesting exercise in aligning futuristic literature from the past with developments that largely played out a few decades later. As an observer of, and later even a participant in, the actual events, Ehmke was able to select the stories that lined up with actual developments. “We didn’t know” is not an argument.
But the main point of the book is that we can’t hide behind fallacies like ‘we just build hammers’. If we create yet another hammer, we really should ask questions like Wells did: “What would happen if this thing were true?” Or, when thinking about what to add to the world of software, perhaps we could start by imagining what world we want to shape and try to answer the question that Ehmke derives from the approach used in Trouble on Triton: “What would need to be true for this thing to happen?” This could be code written for machines, but perhaps our industry needs more codes for humans. Such codes may state the obvious, but at least they give us rules that people can be held accountable to.
We should also be reminded that the frontrunners who exercise these forces for good are often dismissed, verbally attacked, or worse, doxed and threatened. It happened to Berkeley, but women like Ehmke and members of minorities have faced even worse responses.
We now live in a world where technology companies dominate the economy and are increasingly aligned with extreme capitalist, if not fascist, ideals. DEI efforts are being cancelled because a befriended Führer wants it. It is therefore up to the developers, engineers, and project leaders who enable software to exist to make sure it is not used for ill (and, if it is, to reject cooperation and refuse complicity), and to imagine how our powerful tools can be used to create a better, more inclusive, more sustainable, more peaceful world. Ehmke’s book is hence a recommended read. Histories repeat in slightly different ways, and it is possible to think through the futures that lie in front of us. We need to exercise that muscle.