Memoria [EN] Nr 83 | Page 20

skyscrapers, global communications networks, and highly automated assembly lines.

Recently, many have lauded large language models for their broad applicability, which promises widespread growth and more efficient task completion. In a world that seems to strive for bigger and faster everything, scale and efficiency have become key measures of performance.

Yet while we celebrate these achievements, we often overlook technology's capacity to enable harm at equally large scales.

Technology did not create Nazi prejudice. But it did allow for atrocities at scales hitherto unfathomable.

On January 20, 1942 in a Berlin suburb, fifteen Nazi party officials discussed how to handle the approximately 11,000,000 Jews in Europe. This cold bit of calculation formed part of what is now referred to as the Wannsee Conference.6 Different officials in attendance raised concerns about the logistical difficulties of “evacuating” (a euphemism for murder) such a large number of people. The Nazis had a problem. Technology promised a “solution” in the form of gas chambers. While the Nazis were already committing mass murder prior to the Wannsee Conference, the subsequent scale of murder was made possible in large part due to new, fiendish technologies.

What, then, is good about scale and efficiency?

Moreover, these questions apply not only to how we might think about technology design but also to how we might reflect on our own individual roles as technologists. Raised on rhetoric about engineers saving the world, many of us set out to create large-scale change through our work. Indeed, we may even find ourselves motivated by one of the National Academy of Engineering's 14 Grand Challenges (the promise of personal and professional grandeur embedded even in their name). From global pandemics to worsening climate change, we all feel a sense of urgency to create change—and fast!

Yet chasing such an impact often means ceding control of the shape our labor takes. The technologies we develop eventually leave the lab (or, more often now, an open-concept office space) and permeate society, entwining themselves in global problems and existing power structures. The larger the scale, the less we may be able to adjust and the more harm may come. We have our entire careers to work toward positive change. We should consider starting out by focusing on smaller-scale effects or slowing down to create change iteratively—and ideally collaboratively.

Lesson 3: Examine whose voices are left out of the design process and find ways to engage with them.

What we know about our impact as technologists depends on who we care about enough to talk to. For example, people who are not "users" of a product are often left out of user-research studies, even if they are affected by the product. What's more, the diversity of those included in such studies can vary greatly based on how much time the study is given, who is contacted, and who can afford to participate. Unintended consequences arise when designers fail to consider the perspectives of people who are not "target users," and these consequences often disproportionately harm minority communities.

While studying the Holocaust, we were struck by the importance of knowing the impact of one's work. During this period, gas chambers were kept in remote locations, largely hidden from society. In these chambers, a chemically engineered pesticide named Zyklon B enabled the Nazis to murder with speed and at scale. At the beginning of WWII, Degesch, the company that manufactured Zyklon B, sold the pesticide to concentration camps to prevent the spread of infections and disease. These chemicals eventually became a means of mass extermination. Carl Wurster, the chairman of the Degesch Board of Directors, was acquitted of all charges in the Nuremberg Trials. The website of BASF, a company descended from Degesch, states that "the records still preserved and witness accounts give no indication that Carl Wurster knew of