
Cloud technology and machine learning are at the center of the 4th Industrial Revolution, and the challenge for today’s business leaders who care about diversity, equity and inclusion is to keep racial, gender and ethnic biases out of artificial intelligence.

Toni Tomlin, of the Federal Aviation Administration’s Office of Human Resources, told NJBIA’s Diversity, Equity & Inclusion Council on Friday that it’s wrong to assume computers are inherently unbiased, because the people who program them are not. There are already products and software in use today that discriminate against people of certain races and ethnicities.

For example, a pulse oximeter is a device placed on a person’s fingertip that uses a beam of light to measure oxygen levels in the bloodstream. It is widely used today in hospitals and sold to consumers for at-home use. However, researchers have found that it is more accurate on a person with white skin than on a person with dark skin.
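
How would such a disparity be detected? A minimal sketch, using made-up numbers rather than real clinical data, shows one way researchers quantify device accuracy by group: compare each device reading against a trusted reference measurement and see whether the average error differs by skin tone.

```python
# Illustrative only: synthetic readings, not real clinical measurements.
# Compares hypothetical pulse-ox readings against a reference value and
# reports the average error for each group.
import statistics

# (device reading, reference value, group) -- hypothetical values
samples = [
    (97, 96, "light skin"), (95, 95, "light skin"), (98, 97, "light skin"),
    (96, 92, "dark skin"),  (97, 93, "dark skin"),  (95, 91, "dark skin"),
]

errors = {}
for device, reference, group in samples:
    errors.setdefault(group, []).append(abs(device - reference))

for group, errs in errors.items():
    print(f"{group}: mean absolute error {statistics.mean(errs):.1f} points")
```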

“This becomes more relevant because as we are dealing with COVID one of the ways that we try to understand if we’re ill, or if there’s a challenge with our respiratory system, is by using a pulse-ox machine,” Tomlin said. “Clearly this isn’t the greatest news for people of color.”

Another example of bias in artificial intelligence is the software used in hundreds of courts nationwide as an assistive tool to predict the likelihood that a criminal defendant will commit another crime. The software, known as COMPAS, weighs various factors to produce a “risk score” for the defendant and is used by judges in bail and sentencing decisions. The problem is that the software routinely assigns Black defendants higher risk scores than white defendants, she said.
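
A brief sketch, again with hypothetical records rather than COMPAS data or the COMPAS algorithm itself, illustrates the kind of audit that surfaces this problem: checking whether defendants who did not go on to reoffend were nonetheless labeled “high risk” at different rates across groups.

```python
# Illustrative only: made-up records and a made-up threshold, not COMPAS.
# Measures how often non-reoffenders in each group were scored "high risk."

# (group, risk score on a 1-10 scale, reoffended)
records = [
    ("Black", 8, False), ("Black", 7, False), ("Black", 9, True), ("Black", 6, False),
    ("white", 3, False), ("white", 4, False), ("white", 8, True), ("white", 2, False),
]

HIGH_RISK = 7  # hypothetical cutoff for a "high risk" label

for group in sorted({g for g, _, _ in records}):
    non_reoffenders = [score for g, score, reoffended in records
                       if g == group and not reoffended]
    flagged = sum(score >= HIGH_RISK for score in non_reoffenders)
    rate = flagged / len(non_reoffenders)
    print(f"{group}: {rate:.0%} of non-reoffenders labeled high risk")
```

If one group’s non-reoffenders are flagged far more often than another’s, the tool is producing unequal outcomes even when actual behavior is the same.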

To avoid these problems, high-tech businesses need to make sure that they have a diverse workforce involved in AI development.

“The goal is to get to a place where all or most of our actions are automated, but if we’re not careful, what happens is the biases of the programmers find their way into the automation,” Tomlin said. “We need to take a strong look at that and try to fix that problem.”

To watch Tomlin’s entire presentation, go here.

The next virtual meeting of the NJBIA Diversity, Equity and Inclusion Council is scheduled for 11 a.m., Oct. 21. For more information, go to https://njbia.org/events.