Last night, Goldman Sachs issued a second statement dimly alluding to the algorithmic voodoo behind the Apple Card’s allegedly discriminatory determinations of creditworthiness, claiming that the banking giant “[has] not and never will make decisions based on factors like gender.”
The company again hedged with the theory that women (presumed to be married women) are getting lower credit limits than their partners because they’ve been supplemental account holders under their spouses’ names and, therefore, have not accrued the same personal credit history.
“Some of our customers have told us they received lower credit lines than they expected,” the statement reads. “In many cases, this is because their existing credit cards are supplemental cards under their spouse’s primary account — which may result in the applicant having limited personal credit history. Apple Card’s credit decision process is not aware of your marital status at the time of the application.”
Goldman Sachs says more broadly that it makes no determinations based on gender, though it has previously said it looks forward to remedying this problem by allowing family members to hold joint accounts (with their, say, husbands). It adds in the latest statement that customers can ask to have their credit lines reevaluated, should they choose to jump through another flaming hoop for an already inconvenient card with middling perks.
Goldman Sachs has been scrambling to justify its determinations of creditworthiness in its Apple Card approval process after software engineer David Heinemeier Hansson alleged in a viral Twitter thread that Apple awarded him a credit limit 20 times higher than his wife’s. Among numerous commenters with similar stories, Apple co-founder Steve Wozniak chipped in to say that he and his wife, with whom he shares all accounts and assets—“financially we are ONE in every regard,” he tweeted—received wildly different credit limits. Nearly overnight, the New York State Department of Financial Services announced an investigation.
That it took two wealthy and high-profile men to get the New York State Department of Financial Services’ attention did not escape the main subject of the thread herself, Jamie Heinemeier Hansson. Hansson, who identifies herself as a millionaire, wrote in a blog post that while she is an extremely private person and “slightly mortified” to see her name in the news, she’s glad that the episode at least made regulators sit up and pay attention to discriminatory banking. “... I hear the frustration of women and minorities who have already been beating this drum loudly and publicly for years without this level of attention. I didn’t wish to be the subject matter that sparked these fires, but I’m glad they’re blazing.” (Again, we’re not surprised: Gizmodo’s Victoria Song has heard from numerous readers reporting mysterious discrepancies in their credit limits and interest rates.)
Hansson adds that her credit history in the U.S. is “far longer” than her husband’s, that she has never made a late payment, that her credit score is higher, that she had a successful career prior to their relationship, and that she pays off her credit in full monthly. She says that she and David share all financial accounts—what differentiates her is her status as “homemaker.”
A Goldman Sachs spokesperson told Gizmodo on Tuesday that the bank never at any point, before or after issuing an Apple Card, collects data on employment status, marital status, gender, or whether the customer has children. They assured Gizmodo that the bank works with a third-party consulting firm, Charles River Associates, to ensure that the algorithm does not discriminate against protected classes. Charles River Associates was not immediately available for comment.
That could be absolutely true, but algorithms act on the garbage data and implicit biases of the institutions that build them (and Goldman Sachs does not have a strong track record of taking gender discrimination claims seriously). An extensive report from the Brookings Institution offers numerous examples of bad data tainting algorithms: Amazon’s hiring algorithm filtered out resumes containing the word “women’s,” because it was trained on a dataset of resumes primarily from white men; Harvard researcher Latanya Sweeney found that an algorithm targeted ads for arrest-record services at people with African-American names; ProPublica uncovered an algorithm that judges used to assign a “risk score” to potential repeat offenders, which typically gave African-Americans higher risk scores than white counterparts who were equally “risky” offenders.
A Goldman Sachs representative claimed to be unaware of what data, if any, Apple shares with Goldman Sachs but pointed us to an FAQ sheet that merely states that the bank does not share information such as Social Security numbers, account balances, transactions, purchase history, transaction history, and payment history with nonaffiliates (including Apple)—but this only applies after the card has been approved.
Apple remains conspicuously quiet. We will update the post if we hear back from the company.