CFPB Issue Spotlight Analyzes “Artificial Intelligence” Chatbots in Banking

Poorly deployed chatbots can impede customers from resolving problems

The Consumer Financial Protection Bureau (CFPB) released a new issue spotlight on the expansive adoption and use of chatbots by financial institutions. Chatbots are intended to simulate human-like responses using computer programming and to help institutions reduce the cost of human customer service agents. These chatbots sometimes have human names and use popup features to encourage engagement. Some chatbots use more complex technologies, marketed as “artificial intelligence,” to generate responses to customers.

The CFPB has received numerous complaints from frustrated customers trying to receive timely, straightforward answers from their financial institutions or to raise a concern or dispute. Working with customers to resolve a problem or answer a question is an essential function for financial institutions – and is the basis of relationship banking.

“To reduce costs, many financial institutions are integrating artificial intelligence technologies to steer people toward chatbots,” said CFPB Director Rohit Chopra. “A poorly deployed chatbot can lead to customer frustration, reduced trust, and even violations of the law.”

Approximately 37% of the United States population is estimated to have interacted with a bank’s chatbot in 2022, a figure that is projected to grow. Among the top ten commercial banks in the country, all use chatbots of varying complexity to engage with customers. Financial institutions advertise that their chatbots offer a variety of features to consumers like retrieving account balances, looking up recent transactions, and paying bills. Much of the industry uses simple rule-based chatbots with either decision tree logic or databases of keywords or emojis that trigger preset, limited responses or route customers to Frequently Asked Questions (FAQs). Other institutions have built their own chatbots by training algorithms with real customer conversations and chat logs, like Capital One’s Eno and Bank of America’s Erica. More recently, the banking industry has begun adopting advanced technologies, such as generative chatbots, to support customer service needs.

Financial products and services can be complex, and the information being sought by people shopping for or using those products and services may not be easily retrievable or effectively reduced to an FAQ response. Financial institutions should avoid using chatbots as their primary customer service delivery channel when it is reasonably clear that the chatbot is unable to meet customer needs.

The spotlight found the use of chatbots raised several risks, including:

  • Noncompliance with federal consumer financial protection laws. Financial institutions run the risk that when chatbots ingest customer communications and provide responses, the information chatbots provide may not be accurate, the technology may fail to recognize that a consumer is invoking their federal rights, or it may fail to protect their privacy and data.
  • Diminished customer service and trust. When consumers require assistance from their financial institution, the circumstances could be dire and urgent. Instead of finding help, consumers can face repetitive loops of unhelpful jargon. Consumers also can struggle to get the response they need, including an inability to access a human customer service representative. Overall, their chatbot interactions can diminish their confidence and trust in their financial institutions.
  • Harm to consumers. When chatbots provide inaccurate information regarding a consumer financial product or service, there is potential to cause considerable harm. It could lead a consumer to select a product or service that does not meet their needs. Consumers could also be assessed fees or other penalties if they receive inaccurate information about making payments.

Federal consumer financial protection laws place a variety of relevant legal responsibilities on financial institutions, such as obligations to respond to consumer disputes or questions or otherwise competently interact with customers about financial products or services. When market participants deploy new technologies, they should do so in ways that comply with existing law and, ideally, increase the quality of customer care.

The CFPB is actively monitoring the market and expects institutions using chatbots to do so in a manner consistent with their customer and legal obligations. The CFPB also encourages people who are having trouble getting answers to their questions due to a lack of human interaction to submit a consumer complaint with the CFPB.
