- Discriminatory lending practices have made it more difficult for certain individuals to qualify for mortgages and small-business loans
- Lending discrimination occurs when lenders base credit decisions on factors other than an applicant’s creditworthiness
- Biased algorithms, programmed by banks, financial institutions, and other corporate entities, may contribute to unfair lending practices
- Consumer protection laws have been established to prohibit unfair lending practices
Compiling consumer data and using sophisticated AI-powered lending programs creates an opportunity to transform how companies allocate credit and risk, and ultimately to build a more transparent and fair credit system. However, when a financial institution programs bias into its loan algorithm, the software can amplify existing bias, leading to more lending discrimination and unfair lending practices.
The Lyon Firm is investigating consumer protection violations that may involve biased lending algorithms and other discriminatory lending practices. Any corporate act of lending discrimination based on race, color, religion, sex, national origin, handicap, familial status, age, or amount of public assistance income may be grounds for legal action, and a potential consumer protection lawsuit.
The following federal laws are meant to offer protection against lending discrimination:
- The Fair Housing Act (FHA)
- The Equal Credit Opportunity Act (ECOA)
- The Community Reinvestment Act (CRA)
But even with existing laws that forbid such practices, lending discrimination is far more common than many think. In fact, with lenders able to program risk analysis software with potentially biased algorithms, unfair lending may be as pervasive as ever.
Joe Lyon is an experienced Consumer Protection Attorney, reviewing Digital Discrimination, Biased Algorithms, and Unfair Lending cases for plaintiffs nationwide.
Types of Discriminatory Lending
Consumer advocates have suggested that some banks and financial institutions have programmed their lending algorithms to be biased, in potential violation of several state and federal statutes. By law, lenders may not base credit decisions on any of the following:
- Race or Color: Studies show that both online and face-to-face mortgage lenders charge higher interest rates to borrowers of color. Lenders are prohibited from asking about a person’s race or ethnicity on a loan application; that information may only be disclosed voluntarily. Race cannot be used in any way by a lender to make a credit decision.
- Religion: It is illegal for lenders to discourage applicants from applying for credit or reject a loan application because of a religious affiliation.
- National origin: Lenders cannot inquire about a person’s ancestry, the origin of their surnames or discriminate against non-English speakers. Creditors do have the right to ask about immigration status and whether an applicant has the right to remain in the country long enough to repay any outstanding debts.
- Sex: There have been many instances of gender discrimination in which loan officers rate a woman’s credit lower than a man’s, all other things being equal, including occupation. Many women have also had credit denied because their income includes part-time work, alimony, or child support, income sources historically associated with women. Many women have reported loan officers asking about their husband’s job, but not vice versa. Pregnant women have also faced discrimination because of the possibility of maternity leave.
- Marital status: Lenders may ask specifically if an applicant is married, single, or separated, but not if they are divorced or widowed. Lenders, however, can require spousal information if they will be permitted to use, or be liable for, an account, or if the applicant lives in a community property state.
- Age: Lenders cannot discriminate against borrowers solely based on age, but can use age to determine the possibility of impending retirement.
- Source of income: Lenders cannot discriminate against applicants because of income derived from public assistance, child support, alimony, Social Security, part-time employment, pensions or annuities.
- Sexual orientation: There are currently no federal laws prohibiting credit discrimination based on sexual orientation, though some state laws exist.
Mortgage Lending Discrimination
Despite the many protections in place to prohibit such bias, Black and Hispanic Americans are still denied mortgages, or offered loans at higher rates, more often than their white counterparts. Data collected under the Home Mortgage Disclosure Act (HMDA) and compiled by the Consumer Financial Protection Bureau (CFPB) shows not only disparities in application approval rates but also that many borrowers of color end up with higher-priced loans than comparable white applicants.
This modern-day redlining has been documented in almost every major metro area in America. No matter a loan applicant’s location, many describe disproportionate denials and unfair loan practices.
Mortgage lending discrimination, in which loans are denied based on race, color, sex, religion, or national origin, is one of the most common forms of lending bias. When a bank or other lender receives a mortgage application and bases its decision on factors other than creditworthiness, the victims may file a claim and contact an attorney for review.
Small Business Loan Bias
Digital discrimination goes beyond mortgage lending. A report from The Business Journals found White neighborhoods receive about twice as much per person in small-business loans compared with Black neighborhoods.
Some concerning credit reports note that in recent years the number of loans made to Black-owned businesses has decreased dramatically, despite positive trends in overall loans awarded. Some consumer protection firms have investigated, looking for evidence of discrimination in small-business lending.
A report by the New York Times showed that 75 percent of the government’s initial round of Paycheck Protection Program loans went to businesses in White-majority areas. Citi, Bank of America, JPMorgan, and Wells Fargo—America’s four largest banks—made 91 percent fewer small-business loans to Black-owned businesses in 2019 than in 2007. Whether this is due to relatively new algorithmic bias or other factors is not entirely clear, but it is certainly cause for suspecting unfair lending practices.
Clear Signs of Lending Discrimination
Some obvious warning signs of discriminatory lending practices may include:
- You are denied credit that you qualify for
- You are approved for a loan but at a higher rate than what you applied for (when you should qualify for a lower rate)
- A lender advises you not to apply for credit without a good reason
- You are treated differently over the phone than in person
- You hear discriminatory language
- An online lending program asks questions about race, sex, or national origin
What is Digital Discrimination?
Digital discrimination is a relatively new form of bias in which software treats applicants differently based on personal data processed by an algorithm. Digital discrimination often mirrors an existing thread of discrimination, inheriting the biases of prior decision-makers or of those who programmed the algorithm.
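The inheritance described above can be illustrated with a minimal, hypothetical sketch: a scoring rule that never sees a protected attribute can still discriminate when it relies on a proxy, such as ZIP code, that correlates with race in historical lending records. All data, ZIP codes, and thresholds below are invented for illustration; they are not drawn from any real lender's system.

```python
# Hypothetical training data: (zip_code, repaid_loan). Because of past
# redlining, the "45201" area has few favorable historical records,
# while "45202" has many.
history = [
    ("45202", True), ("45202", True), ("45202", False), ("45202", True),
    ("45201", False), ("45201", True),
]

def repayment_rate_by_zip(records):
    """Summarize historical repayment rates per ZIP code."""
    totals, repaid = {}, {}
    for zip_code, ok in records:
        totals[zip_code] = totals.get(zip_code, 0) + 1
        repaid[zip_code] = repaid.get(zip_code, 0) + int(ok)
    return {z: repaid[z] / totals[z] for z in totals}

def decide(zip_code, rates, threshold=0.6):
    """Naive 'model': approve only if the applicant's ZIP code has a
    historical repayment rate of at least the threshold. Race never
    appears as an input, yet the proxy carries the old bias forward."""
    return rates.get(zip_code, 0.0) >= threshold

rates = repayment_rate_by_zip(history)
print(decide("45202", rates))  # True  - historically favored area
print(decide("45201", rates))  # False - historically redlined area
```

The point of the sketch is that removing the protected attribute from the inputs does not remove the discrimination: the algorithm reproduces past decisions because its training data encodes them.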
Consumer protection attorneys are demanding more transparency and more liability for companies that negligently serve consumers with biased algorithmic risk analysis software. The Lyon Firm feels strongly that companies have a duty to ensure their systems are free of bias and discrimination.
This is a complex area of law, and while it may be difficult to prove that companies willfully create biased algorithms and embrace digital discrimination, there may be enough evidence to prove negligence in overseeing the shortcomings of technologies like automated decision-making.
How are Lending Algorithms Biased?
Acknowledging the existence and causes of bias in lending software is a logical first step in solving the problem. Bias in algorithms may be born of incomplete training data, the reliance on flawed information, or historical inequalities.
Biased algorithms are not always the product of intentional discrimination or some malicious effort to hold certain minority groups down; they can result from a lack of data or simply a poorly programmed lending tool. According to a recent study from the University of California, Berkeley, computer-generated pricing systems may discriminate against minority borrowers because those borrowers tend to comparison-shop less than white borrowers.
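The comparison-shopping mechanism described above can be sketched in a short, hypothetical example: a lender that prices against competition will quote higher rates to applicants who gather fewer competing offers, even though the pricing rule never looks at race. If one group shops less on average, that group pays more. The base rate, markup, and function below are invented for illustration only.

```python
BASE_RATE = 5.0  # percent; illustrative only

def quoted_rate(num_competing_quotes, markup_per_missing_quote=0.25,
                max_quotes_considered=4):
    """Price toward the ceiling when the applicant has little leverage:
    the fewer competing quotes an applicant holds, the higher the markup."""
    missing = max(0, max_quotes_considered - num_competing_quotes)
    return BASE_RATE + missing * markup_per_missing_quote

# An applicant who comparison-shops four lenders gets the base rate;
# one who obtains a single quote pays 0.75 points more.
print(quoted_rate(4))  # 5.0
print(quoted_rate(1))  # 5.75
```

Over the life of a mortgage, a fraction-of-a-point difference like this compounds into thousands of dollars, which is why a facially neutral pricing rule can still produce a discriminatory outcome in aggregate.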
Either way, lenders have a responsibility to provide all clients with the same opportunity for loans.
Why Should I Hire The Lyon Firm?
The experienced attorneys at The Lyon Firm have the knowledge and resources to tackle novel legal claims such as those involving biased algorithms and digital discrimination.
Discriminatory lending is a complex area of law that requires the attention of an experienced lawyer. When compounded with additional issues such as biased algorithms and inherently biased systems, it can become even more complex. Allow The Lyon Firm to investigate your unique case, and fight on your behalf following an overt act of discrimination.