The artificial intelligence system Apple uses to manage its credit card is alleged to be less inclined to extend credit to women. An investigation has been opened in New York, the US state's financial services regulator confirmed to AFP on Saturday.

Claims of discrimination against tech giant Apple and Goldman Sachs, its partner on the Apple Card payment card, have led to the opening of an investigation in New York, the state's financial services regulator confirmed to AFP on Saturday.

"The Apple Card is really a fucking sexist program," David Heinemeier Hansson, an American entrepreneur, tweeted on Thursday, "My wife and I are jointly filing our taxes and we have been married for a very long time. Black box "Apple thinks I have the right to a credit limit 20 times higher than it", he continued.

The @AppleCard is such a fucking sexist program. My wife and I have been married for a long time. Yet Apple's black box algorithm thinks I deserve 20x the credit limit she does. No appeals work.

- DHH (@dhh) November 7, 2019

The entrepreneur has since posted a series of tweets recounting his fruitless exchanges with Apple's customer service and lamenting the impossibility of finding out why the algorithm decided his wife was less eligible for credit than he is. The term "black box" refers to artificial intelligence systems whose decisions cannot be explained.

An open investigation

We will "investigate whether the New York law has been broken and ensure that all consumers are treated equally regardless of gender," said a spokesperson for Linda Lacewell, director of financial services of New York.

Apple has been offering the Apple Card since March with two partners, Mastercard and Goldman Sachs. "Our credit decisions are based on customers' creditworthiness and not on gender, race, age, sexual orientation or any other factor prohibited by law," said Andrew Williams, a spokesperson for Goldman Sachs.