Pitfalls in Ethical Decision-Making
Implementing ethical decision-making can be a complex process. Beyond the many actions and decisions the process requires, several common pitfalls can severely hamper your ethical decision-making efforts.
Biases and prejudices can undermine the ethical decision-making of both individuals and groups. Seeing things as they really are is not the same as seeing them as we want them to be, or as we have always supposed them to be. It is therefore worth remembering that it is generally easier to believe things that already fit our worldview, and that this ease does not make them true.
A bias is a prejudice in favor of or against a person, group, or thing compared with another, usually in an unfair way. Biases come in different forms, but what they all have in common is that they skew our sense of how important something is. Being biased towards or against something essentially means treating it as more or less important or relevant than it really is. Gender bias, for example, often means that a person’s gender is treated as relevant in some way that typically disadvantages them.
Biases are a huge problem for moral reasoning because they skew our view both of what should count as a reason and of how we should weigh different kinds of reasons. In this sense, checking our biases essentially means making sure that we do not mistakenly perceive things as reasons that are not really reasons at all.
One important bias that is specific to the dynamics of reasoning in groups is groupthink. Groupthink occurs when group members fail to voice opposing views. While agreement and harmony are positive features of groups in many contexts, groupthink is a danger to moral reasoning. Recall that good moral decision-making means working through different reasons and ending up with the strongest ones. It therefore involves challenging the reasons we come up with, to test how strong they really are. Since this kind of challenge cannot happen if everyone simply agrees, groupthink undermines moral reasoning.
Here are some strategies to avoid groupthink:
Assign at least one group member the role of devil's advocate, rotating at each meeting. A devil's advocate expresses a contrary opinion to provoke debate or to test the strength of an opposing argument.
Discourage leaders from expressing an opinion early in the discussion, and even encourage them to absent themselves from group meetings to avoid excessively influencing the outcome.
Set up several independent groups to work on the same problem.
Invite outside experts into meetings.
Ethical reasoning is an essential skill for those developing emerging technologies. But skill alone is often not sufficient. As we have seen, different courses of action are sometimes each supported by good reasons. To be in a position to trade off competing reasons is to exercise power. Who should exercise this power?
For example, should the passenger be the one who decides whether a self-driving car should sacrifice a pedestrian to save the passenger’s life? Or should it be the government? Or the company that produced the car?
Sometimes, what it means to do the right thing does not depend only on which reasons we have; it is also a question of who decides. Ensuring that the decision is legitimate is therefore crucially important. To ensure that the decision is legitimate, and is also perceived as such by stakeholders, organizations need a credible decision-making process. Markers of a sound process are:
Decisions are grounded in ethical principles that the organization consistently applies to difficult cases.
Affected groups have fair opportunities for input into the decision-making process.
The process is informed by relevant information and expertise.
The reasons for the selected option are clearly articulated.
The decision-making process is transparent.
Decisions can be criticized and appealed.
Future decisions adapt to feedback and changing contexts.
Failing to include the affected parties risks situations that are either paternalistic or technocratic.
Paternalism is often used to characterize situations where decisions about a person’s own interests are made by someone else. This is often appropriate when the subject is a child, an animal, or a severely mentally impaired adult. But when a person is capable of looking after her own interests, paternalism is a violation of respect: a failure to respect her autonomy.
Technocracy refers to rule by experts. It often carries a pejorative connotation and is used in contrast with democracy, rule by the many. A common complaint about emerging technologies is that they have enormous influence over contemporary life, but that this influence is directed inequitably: decisions are dominated by designers, engineers, and executives, at the expense of users, third parties, and governments. Complaints about the distribution of power are also common within technology companies, where opportunities for voice may be, or may be perceived to be, unfairly distributed.
Ethics washing occurs when an organization engages in ethical reasoning without being motivated to do the right thing. Instead of engaging in genuine reflection, making real commitments, and taking decisive action, such organizations merely make superficial statements, gestures, and empty promises. Companies sometimes put significantly more effort into portraying themselves as highly sensitive to ethical issues (for example, by setting up ethics committees without any real power) than into effective regulation, practices, and policies for which they could meaningfully be held accountable.
A good way to avoid the charge of ethics washing is to share or give up decision-making power over ethical issues. Organizations that benefit from developing an emerging technology will rarely be able to avoid the charge of conflict of interest when making important ethical decisions by themselves. Genuinely sharing or ceding that power is one way of being seen as sincerely interested in reaching good ethical decisions.