In this episode of Payments Pros, our hosts Keith Barnett, Carlin McCrory, and Josh McBeain join their colleague Chris Willis to discuss the Consumer Financial Protection Bureau’s (CFPB) larger participant rule for consumer payments mentioned in its 2023 semiannual rulemaking agenda. During this podcast, they examine a myriad of topics concerning this rule, including the following:

Continue Reading CFPB’s Larger Participant Rule for Consumer Payments

As shown by a new report, the Consumer Financial Protection Bureau (CFPB or Bureau) is focusing its fair lending work on mortgage origination and pricing, small business lending, redlining, and the use of artificial intelligence (AI) and machine learning models.

On June 29, the CFPB released its annual Fair Lending Report (Report) to Congress describing its fair lending enforcement and supervisory activities, guidance, and rulemaking for calendar year 2022. The Report satisfies the CFPB’s statutory responsibility to report annually to Congress on public enforcement actions taken pursuant to the Equal Credit Opportunity Act (ECOA).

Continue Reading CFPB’s 2022 Fair Lending Report Focuses on Bias in Mortgage Lending, Redlining, Home Appraisals, Small Business Lending, and Shows Continued Skepticism of AI

Do companies that use workplace surveillance tools to make hiring and firing decisions risk violating the Fair Credit Reporting Act (FCRA)? According to the Consumer Financial Protection Bureau (CFPB or Bureau) in a recent comment, the answer to that question is yes. The Bureau’s official comment comes in response to a request for information issued by the White House’s Office of Science and Technology Policy on the impact of automated tools used by employers to monitor and evaluate workers. The CFPB’s position that the FCRA applies to automated worker surveillance tools is consistent with the Bureau’s March 2023 request for information on data brokers, discussed here, to determine whether the FCRA applies to modern data surveillance practices.

As background, the FCRA provides protections related to consumer reports. The FCRA defines “consumer report” to include “any written, oral, or other communication of any information by a consumer reporting agency bearing on a consumer’s credit worthiness, credit standing, credit capacity, character, general reputation, personal characteristics, or mode of living which is used or expected to be used or collected in whole or in part for the purpose of serving as a factor in establishing the consumer’s eligibility for … employment purposes.”

The gathering and use of information from worker surveillance technologies to support retention and promotion decisions has greatly increased in recent years, in large part because of COVID-19 work-from-home arrangements. But in a now-familiar refrain, the CFPB cautions that such “automated technologies may produce incomplete or inaccurate information or exacerbate biases.”

The CFPB makes clear that the FCRA may apply when employers use worker surveillance information to make decisions about hiring, firing, promotion, reassignment, retention, and compensation. The CFPB also expressed interest in exploring how the information employers obtain about their employees through such technologies can find its way into the data broker market, as well as how such information is used for employment background screening and other decisions that could affect consumers.

Furthermore, the CFPB expressed significant concerns about whether entities offering evolving worker surveillance technologies to employers are complying with applicable law. The CFPB stated that a company’s choice to use new technologies does not absolve it from its legal obligations.

Our Take:

The CFPB presumably is not going to transform itself into an employment regulator, but its public announcement does signal an interest in protecting employee rights from the types of monitoring discussed by the Bureau in its comment. We expect the Bureau to consider enforcement action on this subject if a company engaged in such practices comes to its attention, which means that employers should consider their use of such technologies to ensure that they are in a defensible position. The subject matter of the Bureau’s statements lies at the intersection of employment, privacy, and consumer protection laws, and we believe that all three need to be taken into account in assessing this issue. That’s why the three of us — representatives of Troutman Pepper’s Labor & Employment, Privacy, and Consumer Financial Services groups — wanted to write on this jointly and will be watching this issue together for further developments.

After analyzing public feedback on pandemic-related forbearance programs and ways to automate and streamline long-term loss mitigation assistance, the Director of the Consumer Financial Protection Bureau (CFPB or Bureau), Rohit Chopra, issued a blog post indicating the CFPB will be proposing ways to “simplify and streamline” mortgage servicing rules.

According to the blog post: “Many commenters noted that borrowers seeking help on their mortgages can face a paperwork treadmill that hurts both homeowners and mortgage servicers. According to commenters, the temporary pandemic-related changes we made to the mortgage servicing rules helped alleviate this problem and get borrowers accommodations more quickly.” Commenters also expressed concern over incurring servicing fees and negative credit reporting while waiting for mortgage servicers to review loss mitigation options.

Against the backdrop of these concerns, Director Chopra stated the CFPB will propose streamlining servicing rules “only if it would promote greater agility on the part of mortgage servicers in responding to future economic shocks while also continuing to ensure they meet their obligations for assisting borrowers promptly and fairly.” Director Chopra also stated that the Bureau continues to welcome petitions on potential amendments to the servicing rules. Stay tuned for additional developments on the servicing rules.

On June 20, the Consumer Financial Protection Bureau’s (CFPB or Bureau) Office of Servicemember Affairs published its Annual Report analyzing complaints submitted by servicemembers, veterans, and their families in 2022. The report found that in 2022, servicemembers submitted over 66,400 complaints, representing a 55% increase from 2021, and a 62% increase from 2020. As in prior years, credit reporting remained the top issue for servicemembers, followed by debt collection and credit cards. Nonetheless, much of the report focused on the rising number of complaints from servicemembers related to payment app fraud and recommended steps the industry can take to address this issue.

Continue Reading CFPB’s Office of Servicemember Affairs Issues Annual Report Highlighting Complaints Related to Payment App Fraud

The Consumer Financial Protection Bureau (CFPB or Bureau) has signaled that it intends to propose a rule that would allow it to exercise supervisory authority over a greater number of nonbank financial companies that participate in the consumer payments market.

Under the Dodd-Frank Wall Street Reform and Consumer Protection Act (Dodd-Frank), the CFPB has authority to examine “larger participants” in various consumer financial products and services markets that the Bureau designates by rule — past examples include consumer reporting, debt collection, student loan servicing, international remittances, and auto finance. The CFPB’s proposed rule would “further the scope” of the CFPB’s supervision over the consumer payments market to include nonbank larger participants.

The agenda states that a notice of proposed rulemaking is expected in July 2023.

This proposal represents the culmination of the CFPB’s attempts to regulate the payments industry, which began around the time Rohit Chopra took over as Director of the CFPB. In October 2021, the CFPB issued orders to collect information on the business practices of large technology companies operating payments systems in the United States. At the time, the CFPB said that it needed the information to “better understand how these firms use personal payments data and manage data access to users so the Bureau can ensure adequate consumer protection.” More recently, Director Chopra and members of Congress discussed possible amendments to Regulation E to remove barriers that prevent consumers from recovering money from banks and payments companies when bad actors fraudulently induce consumers into sending money to the bad actors through peer-to-peer apps. Although payments companies are already required to comply with state and federal laws, CFPB supervisory examinations will be a new experience if the CFPB follows through with the proposed rule.

We also note that this forthcoming larger participant rule comes at a time when the CFPB seems to be using its supervisory authority in a more expansive way than in the past. Numerous nonbanks have recently received information requests from the CFPB inquiring whether they are larger participants in markets already subject to CFPB jurisdiction, which suggests that the Bureau intends to begin examining more companies in those markets. Further, the Bureau has notified a number of nonbanks that they are being considered for supervision under the “risks to consumers” provision in Dodd-Frank, an authority the Bureau announced in 2022 that it planned to use more extensively. Supervision has become an even more active venue for the Bureau to advance its policy objectives, and it now appears that the CFPB intends to bring this authority to bear on the payments industry.

On June 8, the Consumer Financial Protection Bureau (CFPB) announced that it had entered a consent order against medical debt collector Phoenix Financial Services for alleged violations of the Fair Credit Reporting Act (FCRA) and Fair Debt Collection Practices Act (FDCPA).

According to the CFPB, Phoenix sent collection letters to consumers who had disputed the validity or accuracy of the debts without first conducting a reasonable investigation of the dispute or receiving verification of the debt. Specifically, the CFPB alleged that when investigating disputes, Phoenix typically did not have any documentation supporting the purported debt and did not obtain additional information after the dispute was submitted — whether in writing or verbally — to supplement the limited documentation received when the debt was initially placed for collection. According to the CFPB, Phoenix’s own policies instructed employees to only perform a cursory review of the limited data points already in its system to resolve a dispute. In particular, Phoenix’s procedures directed employees and contractors to compare only the consumer’s name, social security number, and date of birth provided in the dispute to the data in Phoenix’s system of record when resolving disputes, which the CFPB determined would be insufficient in a number of factual scenarios.

The CFPB further alleged that Phoenix did not employ enough employees or contractors to handle the volume of disputes it received. In reaching that conclusion, the CFPB utilized the same mathematical approach it used when developing the meaningful attorney involvement standard: “For example, on one day in 2018, one [Phoenix] employee or contractor responded to 722 disputes, spending less than 30 seconds per dispute on average, and potentially as little as 10 seconds per dispute on average.”
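
The arithmetic behind that observation is simple to check. A minimal sketch follows; the number of hours actually spent handling disputes is our own assumption for illustration, since the consent order does not specify it:

```python
# Back-of-the-envelope check of the CFPB's per-dispute timing figures.
# DISPUTES comes from the consent order; the hours-worked values are
# illustrative assumptions, not figures from the CFPB.
DISPUTES = 722  # disputes handled by one employee or contractor in one day

def seconds_per_dispute(hours_worked: float) -> float:
    """Average seconds available per dispute, given hours spent on disputes."""
    return hours_worked * 3600 / DISPUTES

# A full 8-hour day allows roughly 40 seconds per dispute; the CFPB's
# "less than 30 seconds" figure implies about 6 hours or fewer spent on
# disputes, and "as little as 10 seconds" corresponds to roughly 2 hours.
for hours in (8, 6, 2):
    print(f"{hours} h -> {seconds_per_dispute(hours):.1f} s per dispute")
```

Whatever workday assumption one makes, the averages fall far short of the time a meaningful dispute investigation would require, which is the point the CFPB was driving at.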

Under the terms of the consent order, the collector agreed to:

  • Not make any representation that a consumer owes a debt or as to the amount owed, including by sending collection letters, unless it can substantiate the representation.
  • Establish and implement written policies and procedures to ensure that it conducts reasonable investigations of disputes about information furnished to consumer reporting agencies.
  • Refund any amounts consumers paid on an unverified debt.
  • Pay a penalty of $1,675,000 to the CFPB, which will be deposited in the victims relief fund.

Please join Troutman Pepper Partners Chris Willis and Jason Cover as they discuss the Consumer Financial Protection Bureau’s (CFPB) recent special edition Supervisory Highlights focused on “junk fees.” Chris and Jason dive into the report and talk about how this fits into the CFPB’s broader initiative on junk fees, what exactly constitutes a junk fee, the types of fees the CFPB identifies as problematic, if this means that creditors can’t charge any of these fees, and steps to take to mitigate risk when imposing fees.

Continue Reading CFPB’s War on Junk Fees

Today, the Consumer Financial Protection Bureau (CFPB) issued a report analyzing the use of chatbots in consumer finance and the impact on customer service. The report notes that financial institutions are increasingly using chatbots to reduce the costs of human customer service agents and are moving away from simple, rules-based chatbots toward more sophisticated technologies, such as large language models, generative chatbots, and other tools marketed as artificial intelligence (AI).

The report found that while chatbots may be useful for answering basic questions, their effectiveness lessens as the questions become more complex. According to the CFPB, “[r]eview of consumer complaints and of the current market show that some people experience significant negative outcomes due to the technical limitations of chatbots functionality.” Additionally, the CFPB warns that financial institutions may risk violating federal consumer protection law when deploying chatbot technology. In addition to potential privacy and security risks, the CFPB states that “[w]hen chatbots are poorly designed, or when customers are unable to get support, there can be widespread harm and customer trust can be significantly undermined.”

Chatbots are computer programs that mimic human interaction by processing a user’s input to produce an appropriate output. “Rule-based chatbots use either decision tree logic or a database of keywords to trigger preset, limited responses. These chatbots may present the user with a set menu of options to select from or navigate the user between options based on a set of keywords and generate replies using predetermined rules … More complex chatbots use additional technologies to generate responses. Specifically, these chatbots may be designed to use machine learning or technology often marketed as ‘artificial intelligence’ to simulate natural dialogue.” Chatbots have been widely adopted by banks, mortgage servicers, and debt collectors. In 2022, over 98 million consumers interacted with a bank’s chatbot, and that number is projected to grow to 110.9 million users by 2026.
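
The keyword-triggered design the report describes can be sketched in a few lines. The keywords and canned responses below are purely illustrative assumptions, not drawn from any real institution’s bot:

```python
# Minimal sketch of a rule-based chatbot: a keyword table triggers preset,
# limited responses, as the CFPB report describes. All keywords and replies
# here are hypothetical examples.
RULES = {
    "balance": "Your current balance is shown under Accounts > Summary.",
    "dispute": "To dispute a transaction, reply with the date and amount.",
    "hours": "Branches are open 9 a.m. to 5 p.m., Monday through Friday.",
}
FALLBACK = "Sorry, I didn't understand. Type 'agent' to reach a representative."

def reply(message: str) -> str:
    """Return the first preset response whose keyword appears in the message."""
    text = message.lower()
    for keyword, response in RULES.items():
        if keyword in text:
            return response
    return FALLBACK

print(reply("I want to dispute a charge"))  # keyword "dispute" matches
print(reply("My card was charged twice"))   # no keyword matches -> fallback
```

The second example illustrates the limitation the CFPB highlights: a consumer plainly raising a dispute, but without the trigger word, gets only the fallback reply, which is how repetitive “doom loops” arise.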

According to the CFPB, while the use of chatbots has increased, so have the consumer complaints, including complaints concerning:

  • Difficulties in recognizing and obtaining dispute resolution.
    • Chatbots recognize a dispute and begin the resolution process only when specific words or syntax are used, so their ability to discern disputes may be limited.
    • Even when chatbots do recognize a dispute, their ability to reach a resolution can be limited. “In some cases, customers are disputing transactions or information that is incorrect. Chatbots that are limited to simply regurgitating the same system information that the customer is attempting to dispute back to them are insufficient.”
    • Importantly for fair lending concerns, a chatbot’s limited syntax may be problematic for consumers with limited English proficiency.
  • Difficulties obtaining accurate or sufficient information.
    • Studies have shown that chatbots sometimes generate inaccurate data that go undetected by some consumers. Specifically, researchers have found that chatbots are ill-suited for tasks that require logic, specialized knowledge, or current data.
    • According to the CFPB, “[w]hen a chatbot is backed by unreliable technology, inaccurate data, or is little more than a gateway into the company’s public policies or FAQs, customers may be left without recourse. Providing reliable and accurate responses to people with regard to their financial lives is a critical function for financial institutions.”
  • Difficulties obtaining meaningful customer service.
    • A chatbot’s scripted responses may fail to answer a customer’s questions and instead lead to “doom loops” or “continuous loops of repetitive, unhelpful jargon or legalese without an offramp to a human customer service representative.”
  • Difficulties obtaining intervention from human customer service representatives.
    • Consumers complain about the lack of access to or unreasonably long wait times for human customer service representatives. The limitations of chatbots and lack of access to a human customer service representative may not be apparent to customers when they are initially signing up with a specific financial institution.
    • Moreover, chatbots may be less likely to waive fees or negotiate on prices.
  • Difficulties keeping personal information safe.
    • Fake chatbots can be used by fraudsters to conduct phishing attacks.
    • Customers must validate themselves as the owner of a specific account by providing personally identifiable information. These chat logs therefore provide another avenue for privacy attacks.

The report concludes by warning financial institutions that the difficulties listed above can not only lead to dissatisfied customers, but also to violations of federal consumer protection laws. “Deficient chatbots that prevent access to live, human support can lead to law violations, diminished service, and other harms. The shift away from relationship banking and toward algorithmic banking will have a number of long-term implications that the CFPB will continue to monitor closely.”

Our Take:

We believe this is the first regulatory statement about the use of chatbots by financial services companies, and it follows the now-familiar theme of hostility to AI and algorithms, underscoring that relationship banking is at risk of being replaced by algorithmic banking. Although the statement certainly appears aimed at causing financial institutions to pay close attention to the problems asserted by the CFPB, it remains to be seen whether the CFPB will truly elevate what seem like customer service issues to alleged violations of law. We’ll be watching this issue closely.