Many recent questions about technology ethics have centered on the role of algorithms in various aspects of our lives. As artificial intelligence and machine learning become more complex, it’s reasonable to wonder how the algorithms powered by these technologies will react when human lives are at stake. Even someone who has never heard of a neural network or a social network may have pondered whether a self-driving car should crash into a barricade, killing its occupant, or run over a pregnant woman to save that occupant.

As technology has entered the criminal justice system, the discussions have become less theoretical and more difficult: algorithms are now used for everything from providing sentencing guidelines to predicting crime and prompting preemptive intervention. Researchers, ethicists, and citizens have questioned whether these algorithms are biased by race or ethnicity.

The issues of racial and demographic bias in algorithms are serious and must be addressed. Everything from insufficient or one-sided training data to the skillsets and backgrounds of the people designing an algorithm can result in unintended consequences. As leaders, it is our responsibility to understand these potential pitfalls and mitigate them by structuring our teams appropriately, including skillsets beyond the technical aspects of data science, and ensuring appropriate testing and monitoring.

Just as importantly, we must understand and attempt to mitigate the unintended consequences of the algorithms we commission.

The Wall Street Journal recently published an intriguing series on Facebook’s behemoth algorithms, highlighting all manner of unintended consequences. The terrifying outcomes reported range from fostering suicidal ideation among some teenage girls who use Instagram to facilitating human trafficking.

In nearly all cases, the algorithms were created or modified to drive a benign-sounding metric: user engagement, and thereby revenue. In one case, changes intended to reduce negativity and emphasize content from friends instead became a vehicle for quickly spreading misinformation and amplifying angry posts. According to the WSJ series and the subsequent backlash, a noteworthy detail of the Facebook case is the amount of painstaking internal research, with forthright conclusions highlighting these negative effects, that leadership appeared to ignore or downplay. Facebook seems to have had the right tools in place to identify unintended consequences, but its leaders did not act.

How does this apply to your business? Something as simple as changing the equivalent of “Likes” in your company’s algorithms could have far-reaching unintended consequences. Given the complexity of modern algorithms, it may not be possible to predict every outcome of these kinds of tweaks, but our role as leaders requires that we consider the possibilities and put monitoring mechanisms in place to catch unforeseen negative outcomes.
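One lightweight form such monitoring can take is a simple statistical watchdog on the metrics an algorithm change might distort. This is only an illustrative sketch, not a prescription: the metric names and thresholds here are hypothetical, and real deployments would use proper anomaly-detection tooling. The idea is to compare each day's value of a tracked signal (say, counts of angry reactions or user reports) against a trailing baseline and flag sharp deviations for human review.

```python
from statistics import mean, stdev

def flag_anomalies(daily_values, window=7, threshold=3.0):
    """Flag days where a tracked metric deviates sharply from its recent baseline.

    daily_values: list of floats (e.g., daily counts of angry reactions,
    reshares of flagged content, or user reports -- hypothetical signals).
    Returns the indices of days whose value lies more than `threshold`
    standard deviations from the trailing `window`-day mean.
    """
    anomalies = []
    for i in range(window, len(daily_values)):
        baseline = daily_values[i - window:i]
        mu, sigma = mean(baseline), stdev(baseline)
        # Flag only when the baseline has some spread and the new value
        # falls well outside it; a flag is a prompt for review, not a verdict.
        if sigma > 0 and abs(daily_values[i] - mu) > threshold * sigma:
            anomalies.append(i)
    return anomalies

# Example: a sudden spike on day 8, shortly after a hypothetical algorithm tweak
series = [100, 102, 98, 101, 99, 103, 100, 97, 250]
print(flag_anomalies(series))  # -> [8]
```

The point of a sketch like this is not the statistics; it is that the flagged signal lands in front of a person with the authority to act on it.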

Perhaps more difficult is mitigating unintended consequences once they have been discovered. As the WSJ series on Facebook implies, many of Facebook’s algorithm changes were made to meet business objectives, and history is littered with businesses and leaders who prioritized financial performance over societal impact. There are shades of gray along this spectrum, but consequences such as suicidal ideation and human trafficking do not require an ethicist or much debate: they are fundamentally wrong regardless of any benefit to the business.

Hopefully, few of us will face problems of this magnitude. But as you increasingly rely on algorithms to drive your business, delegating oversight to technicians or focusing solely on demographic bias can leave unintended, and sometimes harmful, consequences unchecked. It’s all too easy to dismiss the Facebook story as a problem for big companies or tech companies; your job as a leader, whether at a Fortune 50 company or a small business, is to be aware of these issues and address them head on. If your company is unwilling or unable to meet this obligation, it may be time to reconsider some of these complex technologies, regardless of the business outcomes they drive.