Banks have embedded Gen-AI in their operations, and it’s now a co-pilot in decision-making across the board, but it’s still far from replacing your friendly banker.
Has your loan application been rejected? Do you feel upset about it? What can be more upsetting is that it wasn’t a human but a machine that aided that decision.
Today, generative AI provides rich data sets that enable the ‘human in the loop’ – the bank officer – to approve or reject your loan. Welcome to the era of AI-driven banking.
Banks in India today are steadily embedding generative AI (Gen-AI) into their operations, from chatbots and fraud detection to customer onboarding and even loan origination. Gen-AI is also behind those personalised messages you now receive via email or WhatsApp, tailored using your past behaviour and spending history.
“We’re just beginning to scratch the surface,” said Deepak Sharma, CXO advisor and entrepreneur. “AI is fast becoming the co-pilot in decision-making across many parts of banking. It isn’t replacing humans. It’s amplifying their ability to serve.”
Sharma, who sits on the boards of Experian India and Suryoday Small Finance Bank and was formerly chief digital officer at Kotak Mahindra Bank, told The Federal that Gen-AI is driving change in both front-end engagement and back-end efficiency.
“The most valuable use cases right now are in customer acquisition, retention, and cross-selling of financial products like mutual funds or insurance. Over time, I see it extending to product design, market prediction, and even real-time policy adaptation,” he said.
However, there’s a caveat. While banks are eager to embrace Gen-AI, fintech partners and tech advisors caution that in India, a heavily regulated market, Gen-AI is yet to find a place in “core banking functions” like payments and reconciliations. Most deployments today focus on enhancing the periphery, not the engine.
Banking on bots
According to Kishan Sundar, CTO at Maveric Systems, which works with banks globally on AI integration, the most immediate impact has been on “risk management, customer engagement, and automation”.
Take fraud detection, for instance. AI tools developed by Maveric scan enormous volumes of transactional data to identify unusual patterns: a sudden flurry of logins from different locations, or microtransactions meant to test stolen cards.
“The scale and speed at which AI can surface red flags is far beyond traditional rule-based systems,” Sundar explained. “It’s not just about spotting fraud after the fact; it’s about preventing it in real time.”
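One of the patterns Sundar mentions, a sudden flurry of logins from different locations, can be caught even by a simple sliding-window rule; production systems layer learned models on top. The sketch below is purely illustrative: the event format, window size, and threshold are assumptions, not Maveric’s actual logic.

```python
from collections import defaultdict

# Illustrative sketch: flag users with logins from "too many" distinct
# locations inside a short window. Thresholds are hypothetical.
WINDOW = 600          # 10-minute sliding window, in seconds
MAX_LOCATIONS = 3     # more distinct locations than this is suspicious

def flag_suspicious_logins(events):
    """events: iterable of (user_id, timestamp_seconds, location)."""
    by_user = defaultdict(list)
    for user, ts, loc in events:
        by_user[user].append((ts, loc))
    flagged = set()
    for user, logins in by_user.items():
        logins.sort()
        for i, (ts, _) in enumerate(logins):
            # distinct locations seen inside the window starting at ts
            locs = {loc for t, loc in logins[i:] if t - ts <= WINDOW}
            if len(locs) > MAX_LOCATIONS:
                flagged.add(user)
                break
    return flagged

events = [
    ("alice", 0, "Mumbai"), ("alice", 120, "Delhi"),
    ("alice", 200, "Lagos"), ("alice", 300, "Kyiv"),
    ("bob", 0, "Chennai"), ("bob", 3600, "Chennai"),
]
print(flag_suspicious_logins(events))  # alice trips the rule; bob does not
```

A real deployment would replace the fixed threshold with a model score and feed in far richer features (device, amount, merchant), but the sliding-window shape of the check is the same.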
Diagnostic assistant
Another area gaining traction is incident management, a vital but often invisible part of customer experience. Maveric’s AI models help banks trace the root cause of system outages or failed transactions, enabling faster resolution and less customer frustration.
“Think of it as a diagnostic assistant that sees across systems and suggests fixes before an issue escalates,” Sundar pointed out. But one of the most powerful use cases lies in customer segmentation and targeted marketing, something generative AI is now making dynamic and real-time.
By analysing a blend of personal financial behaviour, spending patterns, and digital interaction data, AI can cluster customers far more precisely than conventional segmentation. “Instead of treating 10,000 customers as one group, you can have 10,000 micro-strategies,” said Sundar.
Banks can now tailor offers — from insurance to savings products — based on predicted customer behaviour, not just demographics.
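The clustering Sundar describes can be pictured with a minimal k-means pass over behavioural features. Everything here is an assumption for illustration: the two features (normalised spend and digital activity), the customer data, and the fixed starting centroids.

```python
# Minimal k-means sketch of behaviour-based segmentation.
# Each customer is a point: (monthly_spend_norm, digital_activity_norm).
def kmeans(points, centroids, iters=10):
    clusters = [[] for _ in centroids]
    for _ in range(iters):
        clusters = [[] for _ in centroids]
        # assign each point to its nearest centroid (squared distance)
        for p in points:
            dists = [sum((a - b) ** 2 for a, b in zip(p, c)) for c in centroids]
            clusters[dists.index(min(dists))].append(p)
        # recompute each centroid as the mean of its cluster
        centroids = [
            tuple(sum(xs) / len(xs) for xs in zip(*cl)) if cl else c
            for cl, c in zip(clusters, centroids)
        ]
    return centroids, clusters

customers = [(0.1, 0.2), (0.15, 0.1), (0.9, 0.8), (0.85, 0.95), (0.5, 0.9)]
cents, groups = kmeans(customers, centroids=[(0.0, 0.0), (1.0, 1.0)])
# low-spend, low-activity customers land in one segment; heavy users in the other
```

In practice banks would run this (or a richer model) over dozens of features, which is what turns “10,000 customers as one group” into 10,000 near-individual micro-segments.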
Indian languages
For fintech startups like Devnagri, Gen-AI is solving a completely different problem: language.
Founded in 2019, Devnagri uses a proprietary NLP model to translate and contextualise content across 22 Indian languages, with banking among its biggest sectors. Its clients include several of India’s top banks, insurance companies, and mutual funds.
“Language is a big unlock for BFSI,” Nakul Kundra, CEO and co-founder, told The Federal.
“Banks want to reach the next 500 million users. But those customers speak in their native tongue, not English. And when it comes to banking, context is everything.”
Devnagri’s Gen-AI engine doesn’t just translate; it interprets. For example, it understands that the word ‘home’ (ghar in Hindi) could mean a literal house, but that the Union Home Minister is not the ghar ka minister. Similarly, ‘watch’ could mean a wristwatch or surveillance.
“We built domain-specific models so the AI knows whether it’s handling BFSI content or a political speech,” Kundra explained. The company also enables real-time chat and voice bots for banks, many of which are integrated into websites and call centres. These bots are trained to handle up to 40 per cent of routine queries in local languages before escalating complex issues to human agents.
“And because Gen-AI can detect tone and emotion, it can also express empathy,” Kundra said. “If a customer is angry or grieving, the bot won’t reply with a generic message. It can respond with sensitivity.”
Red lines and firewalls
Yet, for all its promise, there are boundaries to where AI is allowed to operate, especially in India.
“Core banking is sacrosanct,” Kundra said. “RBI regulations prohibit third-party Gen-AI systems from touching real-time banking data, whether it’s credit scoring, payment reconciliation, or KYC approval.
“This means that while AI can translate a loan document, it cannot currently approve one. It can remind you about an EMI, but it cannot evaluate whether your credit profile justifies one,” he added.
That said, Sharma believes this will evolve. “In countries like Singapore, regulators are far more flexible. India will get there, but we’ll take a cautious path — rightly so, given the sensitivity of financial data,” he said.
Policing the prompt
Both Sharma and Kundra emphasise the importance of guardrails when deploying Gen-AI in banking. Left unchecked, AI can go rogue, offering unsolicited advice or interpreting sarcasm as fact.
Kundra shared an anecdote. “During testing, someone jokingly asked the bot who would win the India-Pakistan match. The bot politely refused to answer, reminding the user that its purpose was to assist with banking queries,” he said. In another case, a bot offered a link to a job portal when a user said they had lost their job.
“That could’ve backfired. So we introduced boundaries: predefined responses when the AI detects sarcasm, jokes, or out-of-scope questions.”
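A guardrail layer like the one Kundra describes typically sits in front of the generative model: out-of-scope questions get a predefined reply, and sensitive cues route straight to a human. The keyword lists and responses below are hypothetical, not Devnagri’s actual rules, and real systems would use trained classifiers rather than keyword matching.

```python
# Hypothetical guardrail sketch: filter queries before they reach the LLM.
ESCALATE = {"died", "death", "passed away", "grieving"}   # sensitive cues
OFF_TOPIC = {"match", "cricket", "movie", "election"}     # out-of-scope cues

def route(query: str) -> str:
    q = query.lower()
    if any(word in q for word in ESCALATE):
        return "escalate_to_human"       # empathy can't be optional
    if any(word in q for word in OFF_TOPIC):
        return "I'm here to help with banking queries only."
    return "answer_with_llm"             # safe to hand to the generative model

print(route("Who will win the India-Pakistan match?"))
print(route("My husband died, what about the home loan?"))
print(route("What is my EMI due date?"))
```

The ordering matters: the escalation check runs first, so a grieving customer is never bounced with a canned off-topic reply.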
In customer service, these nuances matter, he said. “If a person calls the bank and says, ‘my husband died, and I don’t know what to do about the home loan’, the AI needs to escalate that immediately to a human,” he explained. “Empathy can’t be optional.”
Internal intel
Interestingly, Gen-AI is also finding a role in “internal knowledge management”.
Sharma shared how, during his time at Kotak, they used AI to help employees stay updated with RBI circulars, compliance checklists, and internal product policies. “It became like a ready reckoner,” he admitted.
“With RBI releasing updates almost every day, no one could possibly keep up. So AI made that accessible,” he said.
Devnagri is also exploring this space. Several of its clients now use AI to train new staff, ensure compliance, and even assist in employee certification exams. With global partnerships like those Maveric has with Databricks, and startups like Devnagri gaining traction in multilingual automation, the future of Gen-AI in Indian banking looks expansive, albeit tightly regulated.
Assistive to predictive
Sharma is optimistic. “AI will increasingly move from assistive to predictive. It’ll tell you what the customer wants before they know it. It’ll help banks become proactive, not reactive.”
Yet, he cautions that trust will be key. “No AI should ever deny a loan or freeze an account without a clear explanation and a human override. Transparency, explainability, and ethics: those are non-negotiable,” Sharma added.
He believes the day isn’t far when AI audits will become mandatory, much like the financial and IT audits banks undergo today. For now, Gen-AI may not be running the bank. But it’s certainly becoming its smartest assistant.
Article originally published in The Federal