Since allegations of gender discrimination in Apple Card credit limits surfaced last week, Goldman Sachs has been in damage control mode.
The financial institution, which partnered with Apple (AAPL) to issue the newly launched credit card, insists its approval process doesn’t favor male applicants, despite user reports to the contrary. It’s safe to say the public isn’t buying Goldman’s claim of innocence; even Apple co-founder Steve Wozniak noted that his wife received a considerably lower Apple Card credit limit than he did.
“What Goldman Sachs (GS) failed to take into account is that machine learning algorithms excel at finding latent features in data,” explained Lux Research analyst Cole McCollum. “These are the features that aren’t directly used in training a machine learning model, but are inferred from other features that are.”
McCollum told Observer that this kind of bias, baked into historical datasets, is one of the biggest challenges in implementing machine learning. It’s also why companies need to explore bias detection and artificial intelligence (AI) explainability tools that can help alleviate potential discrimination. “In the case of the Apple Card’s spending limits, even if gender was not specifically considered (as Goldman Sachs claims), other related features in the dataset can still embed those biases and lead to unfair decisions,” he said.
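To make the mechanism concrete, here is a minimal sketch in Python, using NumPy and scikit-learn on entirely synthetic data (the feature names and numbers are hypothetical illustrations, not anything drawn from Goldman Sachs’ actual model). The gender column is deliberately withheld from training, yet because the historical labels were themselves biased and a correlated proxy feature remains, approval rates still diverge by gender.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
n = 10_000

# Synthetic population; gender is NEVER shown to the model.
gender = rng.integers(0, 2, n)  # 0 / 1, hypothetical groups

# A proxy feature correlated with gender (e.g., a spending-category score).
proxy = gender + rng.normal(0, 0.5, n)

# A legitimate feature, independent of gender.
income = rng.normal(50, 10, n)

# Historical approvals that were themselves biased: past decisions
# penalized the proxy, so the labels already encode the discrimination.
labels = (income - 8 * proxy + rng.normal(0, 5, n)) > 45

# Train using only income and the proxy -- no gender column.
X = np.column_stack([income, proxy])
model = LogisticRegression().fit(X, labels)
approved = model.predict(X)

# The model never saw gender, yet approval rates still diverge.
print("approval rate, group 0:", approved[gender == 0].mean())
print("approval rate, group 1:", approved[gender == 1].mean())
```

Even though gender never appears in the training matrix, the model effectively reconstructs it from the correlated feature and reproduces the historical disparity, which is the latent-feature effect McCollum describes.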
Apple’s security-focused titanium card is meant to be an innovative consumer finance product. While Goldman Sachs’ application process still needs technical tweaking, the Apple Card could come out the other side of this controversy intact thanks to its appealing perks, like its daily cash back feature.
“This incident should serve as a warning for companies to invest more in algorithm interpretability and testing in addition to executive education around the subtle ways that bias can creep into AI and machine learning projects,” McCollum concluded.