{"id":6608,"date":"2022-05-27T15:07:05","date_gmt":"2022-05-27T20:07:05","guid":{"rendered":"https:\/\/nwfl4sale.com\/lenders-cant-deny-loans-and-blame-technology\/"},"modified":"2022-05-27T15:07:05","modified_gmt":"2022-05-27T20:07:05","slug":"lenders-cant-deny-loans-and-blame-technology","status":"publish","type":"post","link":"https:\/\/nwfl4sale.com\/lenders-cant-deny-loans-and-blame-technology\/","title":{"rendered":"Lenders Can\u2019t Deny Loans and Blame Technology"},"content":{"rendered":"
<\/p>\n
CFPB: Lenders are obligated to tell buyers why their loan application was turned down. They can\u2019t plead ignorance and say an algorithm made the determination.<\/span><\/span><\/p>\n<\/div>\n WASHINGTON \u2013 According to an announcement by the Consumer Financial Protection Bureau (CFPB), federal anti-discrimination law requires lenders to give applicants specific reasons for denying an application for credit. CFPB says it\u2019s required \u201ceven if the creditor is relying on credit models using complex algorithms.\u201d<\/span><\/span><\/p>\n CFPB published a Consumer Financial Protection Circular<\/a> to remind lenders of their adverse action notice requirements under the Equal Credit Opportunity Act (ECOA).<\/span><\/span><\/p>\n \u201cCompanies are not absolved of their legal responsibilities when they let a black-box model make lending decisions,\u201d says CFPB Director Rohit Chopra. \u201cThe law gives every applicant the right to a specific explanation if their application for credit was denied, and that right is not diminished simply because a company uses a complex algorithm that it doesn\u2019t understand.\u201d<\/span><\/span><\/p>\n Thanks to data harvesting \u2013 a broad collection of diverse data culled from browsers, websites, stores and more \u2013 lenders often have highly detailed customer information before they ever interact with someone. Many firms today rely on these detailed datasets to power algorithmic decision-making, which is sometimes marketed as artificial intelligence or AI. 
Data harvesting has a broad range of commercial uses, such as targeted advertising and credit decisions.<\/span><\/span><\/p>\n Some financial companies have long used advanced computations to help make credit decisions, but they still gave borrowers an explanation if their loan was denied.<\/span><\/span><\/p>\n However, some creditors today may make credit decisions based on complex algorithm outputs, sometimes called \u201cblack-box\u201d models. The reasoning behind some of these models\u2019 outputs \u2013 the basis for accepting or denying a loan \u2013 may be unknown to the model\u2019s users and sometimes even the model\u2019s developers.<\/span><\/span><\/p>\n \u201cWith such models, adverse action notices that meet ECOA\u2019s requirements may not be possible,\u201d CFPB said in a statement. \u201c\u2026 To help ensure a creditor does not discriminate, ECOA requires that a creditor provide a notice when it takes an adverse action against an applicant, which must contain the specific and accurate reasons for that adverse action. Creditors cannot lawfully use technologies in their decision-making processes if using them means that they are unable to provide these required explanations.\u201d<\/span><\/span><\/p>\n CFPB credits whistleblowers with flagging companies that use technology in ways that violate ECOA and other federal consumer financial protection laws. It encourages anyone who believes their rights have been violated to contact the bureau through the CFPB Whistleblower Program webpage<\/a>.<\/span><\/span><\/p>\n