ASU Learning Sparks
The Ethics of Technological Design
Technological design choices carry many implications. Engineers play a large role in shaping technology, and it is important that they incorporate social concerns into their design process. The "Problem of Many Hands" compounds this: in complex technological networks, responsibility for decisions is difficult to assign. Governance, critique, and advocacy are also important in addressing technological design issues. The ultimate goal is to promote the greater good of humanity and minimize negative impacts.
While the Industrial Revolution visibly caused air and water pollution, humanity had no idea that reliance on fossil fuels could lead to irrevocable changes to our planet's climate, ecological collapse, and mass extinctions. Unpredicted, unintended, and undesirable, this is just one example of our global society's difficulty in predicting the consequences of technological design choices. How technology impacts society is complex, but I think it's helpful to discuss how issues at each stage of development can result in negative impacts.
Every technology is an idea translated into design, and many are designed by engineers. Despite decades of incorporating ethics into engineering education, a 2013 survey found that university students became less concerned with social issues the longer they were enrolled in an undergraduate engineering program. Dr. Erin Cech suggests "things like public welfare considerations get defined out of engineering problems … [and] the broader social impacts of technology are often explicitly excluded from the purview of engineers' day-to-day design activities." Engineering might be technical, but the attitudes, values, and biases of engineers shape the systems they design and the extent to which social concerns are prioritized. One example is the choice to design satellites for End-of-Life. There are extra costs associated with designing a satellite for de-orbit, and not only for the additional systems required: at the end of a satellite's useful life, it is often still functioning and can still generate revenue with little upkeep cost. But a satellite operating past its End-of-Life is more likely to become space debris, which endangers everyone's ability to use or access space.
Technologies impact so many people that it’s hard for anyone to know how their work might affect myriad others. This would be difficult with even a single designer, but modern technologies are designed, manufactured, and sold by networks of organizations involving a convoluted association of not just engineers, but lawyers, financial professionals, artists, researchers, and more, each with different concerns and agendas.
This creates the “Problem of Many Hands,” where responsibility for a decision becomes near-impossible to pinpoint or control. Consequences often result from seemingly minimal behaviors of unsuspecting individuals, creating a need for governance – laws, regulations, and institutions at a scale capable of pushing for positive outcomes across this network of actors.
Innovation doesn’t stop with the developer. Users of technologies often re-invent how they are used and their importance. Who knew a decade ago that I would conduct my personal banking entirely through satellite communication technologies?
A special complication occurs when technologies really catch on. Something invisible to the designer of a new technology may become noticeable when it reaches significant scale. The transportation sector now accounts for roughly one fourth of climate emissions, yet Henry Ford couldn't have predicted that lightweight engines, reliant on cheap, energy-dense fuel, would at tremendous scale alter the very chemical composition of the world. Now our global society is literally designed around these technologies. This tradeoff between predictability and control is referred to as the Collingridge Dilemma: when a technology's design could still easily be modified, the specific modifications needed are unpredictable; once the changes are apparent, it's difficult to course correct, because the surrounding infrastructure makes design modifications extremely onerous.
It's important to center human values in technological design, but that's only one piece of the puzzle. The other pieces are critique, advocacy, and governance. Critique highlights where technology has issues. Critiques might be categorized by the societal perspective they represent: a racial critique of how technologies (such as a racist AI algorithm) impact people of a particular race, a feminist critique of how a technology might impact women, or a colonial critique highlighting the way a technology has maintained and reinforced geopolitical power imbalances. Once harm is identified, change must be forced at scale, something we would normally refer to as advocacy. And, as I mentioned earlier, the best way to push for change given the organizational complexities of an entrenched technology is through a similarly large and complex institution: governance.
In the end, the goal of any technology should be to promote the greater good of all of humanity and reduce harm, whether in space or on our own planet.