If you write code for a living, there’s a chance that during your career, someone will ask you to write something a little dodgy – if not outright unethical. The question is not “What will you do?” The question is “Will you recognise an unethical request before it’s too late?”
As software takes over more of our lives, the ramifications of decisions made by programmers only become greater. Code is in our cars, our security systems, our children’s toys, and our back pockets. Lack of foresight, lazy programming, or dubious decision-making has the potential to haunt humanity for years to come.
Why do we trust the people who write this code? Many professions are governed by a code of ethics, and when the moral imperative isn’t enough to keep professionals in check, there is legislation in place to protect customers. What is keeping professional programmers from putting tens of thousands of people at risk through sloppy coding, poor testing, or malicious intent?
The debate about ethics in software development has raged for as long as the profession has been around, but the impact of poor programming decisions hits a little closer to home these days.
Technology has evolved with such momentum that whether something can be built has become secondary to whether it should be built. Realistically speaking, it can be nearly impossible to assess all the potential future applications of a technology, so does that mean developers should be exempt from blame when their creations are used to do harm?
Not all unfortunate incidents stem from the morally ambiguous decisions of one person. What makes the ethics debate more challenging is that software is seldom built all at once or by one person, but rather over time and by several developers. Let’s not forget the influence that business analysts, project managers, and stakeholders have over the design and functional requirements of these systems. All these factors allow questionable code to be written incrementally and for a variety of reasons.
Furthermore, it can be incredibly challenging to navigate what is right and wrong when you are under pressure to meet deadlines, compete in a swelling market, and pick up the kids at five.
If a developer is pressured to either deliver or lose their job, which option do you think they’d choose most often? How are programmers expected to navigate these murky waters?
Would it be easier for professional programmers if there was a clear code of ethics available that provided context and a framework for them to fall back on? Possibly.
How has code negatively impacted the world?
By now you’re probably trying to think of examples of how code has negatively impacted your world...
Sometimes it's deliberate...
Over the years there have been several highly publicised incidents where developers or their companies have been embroiled in scandals.
Volkswagen’s diesel dupe scandal – Volkswagen used a crafty algorithm to detect when one of their cars was undergoing emissions testing. It ran the engine cleanly during tests and switched off emissions control during normal driving conditions. The algorithm allowed these cars to emit 40 times the U.S. Environmental Protection Agency’s maximum pollution levels. When Volkswagen faced a Congressional hearing over the diesel emissions, Michael Horn, the CEO of VW's American division, claimed the defeat devices were put in place by a few rogue software engineers. Horn stated, "This was a couple of software engineers who put this in for whatever reasons."
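To see how little code a scheme like this takes, here is a deliberately simplified, hypothetical sketch of the "defeat device" pattern. Every name and heuristic below is invented for illustration – none of it comes from Volkswagen's actual software:

```python
# Hypothetical illustration of a "defeat device" pattern.
# All function names, signals, and thresholds are invented;
# this is NOT Volkswagen's actual code.

def looks_like_dyno_test(speed_kmh: float, steering_angle_deg: float) -> bool:
    """On a test bench, rollers spin the wheels while the steering
    wheel barely moves -- a tell-tale signature a program can detect."""
    return speed_kmh > 0 and abs(steering_angle_deg) < 0.5

def emissions_mode(speed_kmh: float, steering_angle_deg: float) -> str:
    """Switch emissions behaviour based on whether a test is suspected."""
    if looks_like_dyno_test(speed_kmh, steering_angle_deg):
        return "full_emissions_control"    # run clean for the regulator
    return "reduced_emissions_control"     # pollute on the open road

print(emissions_mode(50.0, 0.0))    # bench-like conditions
print(emissions_mode(50.0, 15.0))   # ordinary road driving
```

The unsettling part is how ordinary this looks: a single conditional, indistinguishable in form from any legitimate mode switch, is all it takes.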
The Aleksandr Kogan, Facebook and Cambridge Analytica debacle – In 2014, a researcher named Aleksandr Kogan created a personality quiz app that was installed by Facebook users. When users installed the app, they granted access to not only their personal data but that of their friends. Kogan reportedly sold this data to Cambridge Analytica without the consent of these users. Cambridge Analytica is a political analysis firm that claims to build psychological profiles of voters to help its clients win elections. The firm now stands accused of breaking US election law and has filed for insolvency.
Were these incidents motivated by concerns over losing a job, greed, or a desire to outsmart regulators?
Sometimes it's accidental...
Radiation overdoses – There have been several incidents of radiotherapy going horribly wrong because of errors in the systems used to control the treatment. Both the Therac-25 radiation therapy machine and the treatment planning system at the Instituto Oncológico Nacional in Panama failed to work as required, resulting in patients receiving massive overdoses of radiation. These errors caused severe radiation poisoning in patients and several deaths.
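The Therac-25 investigations famously identified, among other causes, a race between fast operator edits and the machine's hardware setup. The following is a deliberately simplified, hypothetical sketch of that class of "check-then-act" bug – the names and logic are invented and are not the machine's actual code:

```python
# Hypothetical sketch of a "check-then-act" ordering bug of the
# kind reported in the Therac-25 investigations. All names are
# invented; this is NOT the Therac-25's actual code.

class TherapyMachine:
    def __init__(self):
        self.mode = "electron"        # low-power electron beam
        self.hardware_ready = True    # hardware matches current mode

    def edit_prescription(self, new_mode: str):
        """Operator changes the treatment mode."""
        self.mode = new_mode
        self.hardware_ready = False   # hardware must be repositioned

    def setup_hardware(self):
        """Slow task that repositions hardware for the new mode."""
        self.hardware_ready = True

    def fire(self) -> str:
        # BUG: fires as soon as it is asked, trusting that setup has
        # finished instead of enforcing it before enabling the beam.
        if not self.hardware_ready:
            return "OVERDOSE: beam fired before hardware was ready"
        return "treatment delivered in " + self.mode + " mode"

m = TherapyMachine()
m.edit_prescription("xray")
# A fast operator edit lets fire() run before setup_hardware():
print(m.fire())
```

In the real machines the ordering depended on concurrent tasks and operator typing speed, which is exactly why the fault surfaced so rarely in testing and so catastrophically in the field.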
Averting nuclear wars – You may not have heard of Soviet Air Defence officer Stanislav Petrov, but his gut instinct is credited with saving many people’s lives. In September 1983 the Soviet early warning system malfunctioned and erroneously reported the launch of multiple USAF Minuteman intercontinental ballistic missiles from bases in the United States. Petrov identified these warnings as erroneous. Had the data from the warning system been acted upon, the Cold War stalemate could have escalated into a full-scale nuclear war.
Could these errors have been averted if more forethought or diligent testing had gone into the development of these systems?
Programming errors and unethical developers have been causing chaos since the beginning. The biggest difference between the 19th century and now is that there are more than 7.2 billion smart devices around the world running our lives. Like it or not, digital transformation is here.
From avalanche transceivers to banking apps, from rail transport and private vehicles to IoT toys, we are all reliant on software in one shape or form.
Cue the obligatory quote, used most recently by Uncle Ben: “With great power comes great responsibility.” Trite as this statement has become in recent years, it should serve as a very real reminder to everyone who writes code to consider the impact of their creations before unleashing them on the world. Because the reality is that if programmers don’t start to govern themselves, legislators will do it for them – dictating everything about their jobs, down to the programming languages they can use.
Ethics isn’t simply about obeying the law and not killing users, it’s about being mindful of how your actions impact others. Will your code be used to do harm? If you write shoddy code, how many late nights will your colleagues need to put in to fix your errors? How will your faulty code affect your customer’s business and reputation? These are the questions developers should be asking themselves.
Ethical issues have no Boolean (yes/no) answer. Like all professionals, programmers need to exercise due diligence and then use their own judgment to do what is right.
Article takeaway?
Think past your delivery deadline and consider how your code will be used in the future.