Graceful Degradation in banking

Beyond revenue losses and trust erosion, the recent Facebook outage brings urgent lessons. For banks aspiring to grow CX capabilities at scale, it is a clarion call to revisit “Graceful Degradation”, according to Muraleedhar Ramapai, executive director at Maveric Systems.

An attention-deficit planet is unlikely to forget the memorable Monday moan in a hurry. For six hours, Facebook and its app family of Instagram, Messenger, and WhatsApp went down. Beyond the flood of FOMO memes, the impact on Facebook was tangible: shares fell 4.9 percent and its US ad revenues bled $545,000 each hour, not to mention the inconvenience to the millions of businesses that rely on Facebook Pay to access e-commerce sites. Using specific indicators from the World Bank and other agencies, the Cost of Shutdown Tool (COST) estimated the hit to the global economy at $160m.

An internal Facebook blog pointed to a cascade of mistakes, but the effect grows ominous when one considers how outages at brands like Target, Amazon Web Services and Microsoft influence market leaders across BFSI and fintechs. After all, with massive investments in digital banking, banks want to replicate their scale and customer agility, and laudably so.

To be fair, not all of us are FOMO sapiens. But the fear of missing out does influence the billions of dollars’ worth of resiliency agendas pursued across global enterprises. That fear is, of course, rooted in rising global uncertainty, fluctuating geopolitical risks, an increased frequency of natural disasters, and recurring large-scale outages and security breaches.

Resilience is an easy word to proffer, tricky to pin down, harder to promise. But the singular difference between Big Techs and banking is the essential nature of the services. Not being able to stream a favourite movie, upload a picture, or refresh a feed all sit lower on the “anxiety spectrum” than a banking system going down. The sweet spot? Combine the agility and innovation of Big Techs with the resilience and reliability of banking systems.

While regulatory oversight shields customers’ data and deposits to a certain extent, the number of outages at banks is far from negligible; in fact, it is just the opposite. Consider the ten outages a month at Barclays, or the recent disruptions at Bank of America and Visa.

How serious is the resiliency imperative?

In an age where every company aspires to go pure cloud, the resiliency imperative is alive not only for the FAANGs (Facebook, Amazon, Apple, Netflix, and Google). According to McKinsey research, companies report that disruptions lasting a month or more occur every 3.7 years, resulting in losses worth almost 45 percent of one year’s EBITDA over the course of a decade.

So, what does resiliency mean for our hyper-connected world? Simply put, whereas quarantines and social distancing work for humans, systems need a different fail-safe. It is called Graceful Degradation.

The business need for Graceful Degradation

The theory and practice of Graceful Degradation (GD) is captured by the question: “If everything were to fail, what is the most important thing that needs to work?” For network engineers, product managers, UI designers, and CX professionals, as we shall shortly discover, the answer to that question is neither straightforward nor facile.

Let’s say you are in the middle of an online moment: booking an airline seat, withdrawing cash at the ATM, locking in a stock market transaction, browsing Netflix recommendations. Then the network breaks, or latency hits a threshold, or there is a power outage, or the system simply behaves in an unexpected way. Complex systems, after all, are often fragile systems, where macro-level issues of saturation, latency, and excessive workloads are failures with more than one root cause.

What happens next? Does the seat get locked out? Is the card retained by the ATM? Does Netflix offer generic selections instead of your personalised ones? How much failure status does the system communicate to the user, and at what stages? At the point of failure, how much information counts as good customer service without creating silent anxiety? Does the “broken” system offer alternatives to delight the customer or not? If yes, how? Does it shed workloads, time-shift them, reduce service quality, or add more capacity? Does the system prioritise between functions, perhaps onboarding new users smoothly over tolerating latency for users already on the platform?

These are all answers to the primary question: how should the system gracefully degrade?

The ability to maintain limited functionality even when a large portion of the system is rendered inoperative, thereby preventing catastrophic failure, is how large-scale enterprise applications generate the power of resiliency.

Designing a Graceful Degradation system

In the age of CX dominance, the first aspect of designing a gracefully degradable system is probably obvious: the twin design features of failing fast (setting aggressive timeouts so that failing components will not make the entire system grind to a halt) and falling back (designs that allow a fallback to lower quality). Once the failure has been “injected” comes the critical part: test it.
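
A minimal Python sketch of the twin features, with a hypothetical recommendations endpoint and fallback list (an illustration, not an implementation): the aggressive timeout makes the call fail fast, and the exception branch falls back to generic selections, the Netflix scenario above, rather than an error page.

    import requests

    # Hypothetical endpoint and fallback list, for illustration only.
    RECOMMENDER_URL = "https://api.example.com/recommendations"
    GENERIC_PICKS = ["Popular Title A", "Popular Title B", "Popular Title C"]

    def get_recommendations(user_id, timeout_s=0.5):
        """Fail fast on a slow dependency, then fall back to lower quality."""
        try:
            resp = requests.get(
                RECOMMENDER_URL,
                params={"user": user_id},
                timeout=timeout_s,  # fail fast: one slow component must not stall the page
            )
            resp.raise_for_status()
            return resp.json()["titles"]
        except (requests.RequestException, KeyError, ValueError):
            # Fall back: degrade to generic selections instead of an error page.
            return GENERIC_PICKS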

Graceful Degradation may begin with causing the failure to see what happens, but it is equally about thinking through what to expect before it happens, and what was designed to happen when it does.
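
Testing that design can be as simple as injecting the failure deliberately and asserting what was designed to happen, reusing the hypothetical sketch above:

    from unittest.mock import patch

    import requests

    def test_recommendations_degrade_on_timeout():
        # Inject the failure: force every requests.get call to time out.
        with patch("requests.get", side_effect=requests.Timeout):
            titles = get_recommendations("user-42")
        # The designed behaviour under failure: generic picks, not an exception.
        assert titles == GENERIC_PICKS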

How to introduce GD in banking user journeys

  • Rather than declaring the entire app down, build portions (especially information vending) that can be served from multiple sources of the same truth.
  • For transactions, even where the promise and intent are to provide straight-through processing, the architecture should be message-driven, switching between request/response at full availability and publish/subscribe at lower service levels (a minimal sketch follows this list).
  • Create secure embedded data stores within customer apps that are not network-dependent and do not have to call the enterprise app(s) for every function.
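
As a rough illustration of the second bullet, the sketch below switches a payment submission between a synchronous request/response path and a queued publish/subscribe path when the core system is degraded. The names and the in-process queue are hypothetical stand-ins, not a reference architecture.

    import queue

    pending_payments = queue.Queue()  # stand-in for a durable message broker topic

    def process_synchronously(payment):
        # Stand-in for a request/response call to the core banking system.
        return {"status": "completed", "ref": payment.get("id")}

    def submit_payment(payment, core_banking_up):
        if core_banking_up:
            # Full availability: straight-through, request/response processing.
            return process_synchronously(payment)
        # Degraded mode: publish the message and acknowledge acceptance;
        # a subscriber drains the queue once the core system recovers.
        pending_payments.put(payment)
        return {"status": "accepted", "note": "queued for later processing"}

    print(submit_payment({"id": "TXN-1", "amount": 250.0}, core_banking_up=False))

In a real deployment the in-process queue would be a durable broker, so that accepted payments survive restarts and are not lost.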

Seizing the future – learning the Netflix way

No matter the cause, the fact that technology will disappoint can be countered by the questions: how can we handle the failure gracefully, and what stops us from planning ahead to keep our customers happy? Ahead, not after the feature is built, the product tested, and the version released. The profits of heeding GD lessons come to us from another iconic brand: Netflix. After its 2012 Christmas Eve outage, when programming went off-stream across parts of the US, Canada, and Latin America, the company put in place a slew of GD measures. Netflix today regularly uses external services to simulate service failure, automates zone failover and recovery, and de-risks critical dependencies.

Eventually, like Netflix, if banks are to build GD into their operational systems, they must begin by rethinking their resiliency philosophy.
Disclaimer: Originally published on Bobsguide

DataOps: The new DevOps for Analytics

Understanding the similarities and differences between DataOps and DevOps, and the relevance of DataOps in present-day banking

DataOps, while often described as the ‘new DevOps for Analytics’, is a collaborative data management practice focused on communication, integration and automation of data flows across an organization.

DevOps and DataOps serve different people in the organization, with different expectations. DevOps serves software developers, who embrace the complex details of code creation, integration and deployment. DataOps users, on the other hand, are usually data scientists and analysts focused on building and deploying models and visualizations. The DataOps mindset centres on domain expertise and is interested in making models more predictive or deciding how best to render data visually.

Using DevOps (Continuous Integration and Continuous Delivery), leading companies (Amazon, Google, Netflix, Facebook, Apple) have accelerated their software build lifecycle (earlier called ‘release engineering’) to reduce deployment time, decrease time to market, minimize defects, and shorten the time required to resolve issues. DataOps, in turn, seeks to reduce the end-to-end cycle time of data analytics, from idea origination to value creation through charts, graphs and models.

The Similarities between DataOps and DevOps

DevOps and DataOps both rely on similar architectural principles (cloud delivery) for Continuous Integration and Delivery, and both harvest cross-team (development, operations, analysts, architects, scientists, quality monitoring and customers) collaborative energies that drive value creation and innovation.

In fact, as modern organizational cultures promote ‘data literacy’, newer approaches (like self-service data preparation tools) come equipped with their own built-in data operations. As a result, data practitioners today not only collaborate and co-develop insights in a zero-code environment but also streamline work delivery across the organization.

Another significant end purpose that unites the two is large-scale global consumption and provisioning. As DevOps and DataOps function in high-speed, multi-geography scenarios while accommodating large numbers of users, both need a unified management environment where monitoring and cataloguing can happen concurrently.

The ‘Factory-model’ in DataOps

Whereas Agile and DevOps relate to analytics development and deployment, data analytics additionally manages and orchestrates a data pipeline. Data continuously enters on one side of the pipeline, progresses through a series of steps and exits in the form of reports, models and views. The data pipeline is the ‘operations’ side of data analytics and is called the ‘data factory’: just as on a manufacturing line, quality, efficiency, constraints and uptime need to be managed.

The intellectual heritage of DataOps

DataOps combines agile philosophies, DevOps principles, and statistical process control (a lean engineering tool), balancing four critical data aspects, namely engineering, integration, security and quality, to uniquely manage an enterprise-critical data operations pipeline. This formulation is associated with Andy Palmer, one of the earliest proponents of DataOps.
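
As a hedged illustration of the statistical process control leg, the sketch below applies classic control limits to a single pipeline metric, the daily row count; the history, metric and three-sigma threshold are illustrative assumptions.

    import statistics

    def within_control_limits(history, todays_count, sigmas=3.0):
        # Statistical process control on a pipeline metric: flag today's load
        # if it drifts beyond limits derived from recent history.
        mean = statistics.mean(history)
        stdev = statistics.stdev(history)
        return (mean - sigmas * stdev) <= todays_count <= (mean + sigmas * stdev)

    history = [10120, 9980, 10305, 10050, 9870, 10210, 10090, 9940, 10160, 10020]
    if not within_control_limits(history, todays_count=4200):
        print("Row count outside control limits; hold the pipeline for review.")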

Banking and DataOps

Broadly speaking, DataOps targets improving data and analytics quality, reducing cycle times for creating new analytics, and increasing the productivity of the data organization exponentially.

Moreover, banks have an additional criticality in adopting DataOps-powered infrastructure: preventing, and bouncing back from, system breaches, platform outages, and process malfunctions that inevitably erode consumer experience and trust.

As DataOps-powered platforms start to play a more pivotal role, banking organizations will realise that the data needed to fuel innovation cannot be left siloed in legacy applications. Accurate data for precise decisions at the right speed will not only combat bottlenecks and outages; DataOps platforms that combine on-demand data delivery with data compliance will also boost customer experience transformation and innovation, especially as more customers shift to online and mobile banking channels in the current pandemic crisis.

Banks Are Embracing Cloud & Open-Source

Founded in 2000, Maveric Systems is a software engineering services company that works across financial platforms, banking solutions, data technologies and regulatory systems. The firm has offices around the globe to serve its banking partners across 15 countries, along with dedicated offshore delivery and research centres in Bengaluru, Chennai, and Singapore.

The company initially offered testing services to banks and financial institutions, but by 2012 Maveric Systems had expanded into other areas like software development, analytics and digital platform architecture, with a banking domain specialisation.

We connected with Muraleedhar Ramapai, Executive Director of Data at Maveric Systems, to learn more about the innovations that banks worldwide are exploring, including data analytics and open-source machine learning frameworks. Here are the excerpts:

Analytics India Magazine: How has the company been impacted due to the ongoing pandemic?

Muraleedhar Ramapai: We have been growing despite the pandemic, and that’s one of the key things. We are finding good takers for our services among newer businesses as well as older ones, especially in software and data analytics.

Initially, we did functional validation of the computerisation that used to happen in banks; then, once the packages came in, we did their installation, testing, quality and engineering for banks. Five years ago, we added three more businesses, and we are now implementation partners for a lot of banks in building microservices-based architecture and integrating it with most of the other parts of the legacy banking systems.

Data analytics is the newest of the businesses. Within data analytics, we’re helping banks move from their traditional models to newer ways. So analytics is comparatively a small footprint, maybe 10-15% of our business, but the larger piece is data engineering. What we’re realising is that banks will have to redo their approach to managing data to be able to go multi-channel and to move away from a batch approach to data towards a streaming analytics approach.
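
To make the batch-versus-streaming contrast concrete, here is a toy sketch (an editorial illustration, not Maveric’s implementation) of the same aggregate computed once over a full batch versus incrementally as each event arrives:

    def batch_total(transactions):
        # Batch approach: wait for the full day's file, then aggregate once.
        return sum(transactions)

    def streaming_totals(transactions):
        # Streaming approach: emit an updated aggregate per event as it arrives.
        running = 0.0
        for amount in transactions:
            running += amount
            yield running

    events = [120.0, -45.5, 300.0]
    print(batch_total(events))                 # one answer at the end of the day
    for total in streaming_totals(events):
        print(f"running total: {total:.2f}")   # an answer after every event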

AIM: What have been the biggest demands from banking customers during the pandemic in terms of software?

There is a lot of demand in terms of microservices and related aspects, and also around content development, whether using newer modern technologies or traditional ones to build web-facing assets. There is also a massive requirement for data integration and for quickly moving data across business units, sites and applications into data warehouses. Also, central banks and governments are coming out with various consumer-friendly and SME-friendly digital products, and new systems are being set up with completely new workflows. Finally, there is a demand for data movements to support new regulatory requirements around the world.

AIM: What have been the biggest focus areas for innovation in the last year or so in the banking sector?

There is a lot of focus on how to lend to SMEs; especially after COVID, the government is focusing on ensuring that SMEs return to health. Another focus is fintechs: there is a lot of focus in Europe on open banking, in collaboration with fintech innovation and open APIs. Finally, there is an emphasis on the use of data analytics as opposed to traditional methods, whether for customer identification, prospect identification or the use of AI for KYC.

AIM: Maveric is involved in the whole technology ecosystem from IT to data analytics and software development. What are some of the main focus areas for Maveric?

Our traditional focus has been on banks, where we implement large enterprise transformation programmes as banking systems get upgraded and replaced. Enabling digital services is another area for us: here we build microservices architecture, which helps banks work with open banking.

When I talk about open banking, we are not creating the mobile apps for fintechs but the data integration with the bank. We are focusing on data and analytics in two areas: regulatory, and anti-money laundering and fraud. There are packages and algorithms for how to integrate and power such analytics systems.

AIM: In terms of exploring open-source and cloud, what kind of technologies are the banks looking for?

There is a lot of cloud adoption, at least among medium-sized companies in some geographies. It is perhaps less in America, but in Asia and Europe companies are adopting the cloud. Companies are moving towards cloud data warehouses, and even unstructured data is coming into them.

The second is a move away from the traditional way in which analysis was shared within banks: there is more democratisation of data and of data-enabled documents.

The third is the adoption of open source, especially in the advanced analytics space, such as machine learning. Here, a lot of ML systems are GPU-based, running tasks like computer vision using open-source frameworks based on Python.

There is immense value in using Python libraries, for example, to build a machine learning model, even in regulated sectors such as banking. Banks are connecting with community support, such as from academia and open-source ML products, to carry out innovations.
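
As a hedged illustration of that point, the sketch below trains a toy credit-risk classifier with the open-source scikit-learn library; the synthetic data and the model choice are assumptions made for illustration, not a banking-grade model.

    from sklearn.datasets import make_classification
    from sklearn.ensemble import RandomForestClassifier
    from sklearn.metrics import roc_auc_score
    from sklearn.model_selection import train_test_split

    # Synthetic stand-in for anonymised loan features and default labels.
    X, y = make_classification(n_samples=2000, n_features=10, random_state=0)
    X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

    model = RandomForestClassifier(n_estimators=100, random_state=0)
    model.fit(X_train, y_train)
    print("AUC:", roc_auc_score(y_test, model.predict_proba(X_test)[:, 1]))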

AIM: Do you also have an internal team of data scientists who help develop and deploy these models?

We do have a small team. We are working with one large bank to enable some of their data and analytics routes. But our work is primarily focused on enabling data scientists through engineering.

AIM: A lot of clients run on legacy systems; how far behind do you think these banks are when you compare them to state-of-the-art fintech companies?

Not having legacy certainly gives fintechs an advantage, but that does not mean that banks have poor technology. Before releasing a product, banks need to ensure many things that a peer-to-peer lender may not, since such lenders don’t have to adhere to various norms. But we’re seeing a lot of banks modernising quickly. A lot of them are investing in technology, and a lot of them are also open to collaborating with fintechs.

So the comparison of fintechs versus banks is, in my opinion, not right. It is more about what is in it for the customer, and if the customer demands a good experience, banks will have to invest heavily to modernise. For example, when you look at North America, and even parts of Europe, there are still transaction systems running on so-called legacy mainframes. There are many beautiful integration technologies available these days to upgrade banks’ digital offerings.

AIM: Are you looking to hire more talent at the moment?

Talking about talent, we have labs where our team members work on streaming data coming from stock exchanges to mine signals from that data. We are also using social media data to advise banks on what they should build into their mobile apps, where they should focus, and more. These tasks require high-end talent; we are establishing sizable data science projects and also expanding our data engineering team.

This interview was originally published on Analytics India Magazine website and is being reproduced here.

How Maveric Systems is using AI to find clients

While the onset of the Covid-19 outbreak has transformed how organizations operate, every corporate honcho believes it has accelerated digital transformation.

In a conversation with ETCIO, Muraleedhar Ramapai, Executive Director of Data at engineering services firm Maveric Systems, busts some myths about traditional ways of working, talks about leveraging data analytics, and much more.

Ramapai believes that the good old theory of human work motivation and management, which assumes the typical worker has little ambition, avoids responsibility and is individual-goal oriented, is simply not true.

The senior management at Maveric had early discussions on possible productivity snags and remedial counter-measures. To their surprise, the leadership found that when goals are clear and projects do not require elaborate white-boarding, productivity actually increased while working from home.

In terms of capacity-building approaches, Ramapai feels that organizations will no longer be in a hurry to build, say, a 200-member team under one roof.

“Irrespective of how great an impact Agile methodology has had on the industry, I feel it has done a disservice to one of the most admirable achievements of human collaboration: the ‘Open Source movement’. Moreover, across organizations, future workflow examinations will include rethinking the importance given to face-to-face collaboration. Pre-lockdown, quite a few companies practised a 4:1 rhythm (work from office : work from home). I will not be surprised if we see that ratio inverted post-lockdown,” he maintained.

Fueling growth with data analytics

It is imperative for any enterprise to acquire clients for growth. To pursue this, Maveric Systems is now leveraging advanced analytics and data engineering to derive the nuances of the voice of its clients’ customers.

“We listen keenly for insights, we understand the customer’s pain points, and we spot features and offers which give our clients the competitive edge. Let me describe this in a little detail,” said Ramapai.

Before Maveric approaches its clients, it works out the many parts of this equation via advanced analytics.

At the outset, the company employs a three-way analytic assurance framework. It starts by using automation to collate bank customers’ historical feedback from qualified public-domain and social media sources, and separates out the key themes using NLP and machine learning.

“The key dissatisfiers and delighters are validated through our detailed engineering analysis of the channel technologies. First-hand experience of the channels is scientifically analysed to complete the third leg of analytics,” he added.
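
As a hedged sketch of that first leg (illustrative only, not Maveric’s actual framework), a few lines of open-source Python, TF-IDF plus NMF topic modelling, can already separate feedback into themes:

    from sklearn.decomposition import NMF
    from sklearn.feature_extraction.text import TfidfVectorizer

    feedback = [
        "app crashes every time I try to transfer money",
        "love the new card controls in the mobile app",
        "transfer failed twice and support never replied",
        "card controls make freezing my card so easy",
    ]

    tfidf = TfidfVectorizer(stop_words="english")
    matrix = tfidf.fit_transform(feedback)

    topics = NMF(n_components=2, random_state=0).fit(matrix)
    terms = tfidf.get_feature_names_out()
    for i, weights in enumerate(topics.components_):
        top_terms = [terms[j] for j in weights.argsort()[-3:]]
        print(f"theme {i}: {top_terms}")  # e.g. transfer failures vs card-control praise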

According to Ramapai, the Chennai-based Maveric Systems’ core belief is that data analytics should ultimately be tangibly usable and converted into actionable decisions.

AI for finding suitable candidates

Continually finding appropriate talent and making sense of changing workforce trends are high on Maveric’s agenda. The company is experimenting with ML and AI to locate appropriate candidates for various roles.

Initial screening is done by machines and data: resumes are scanned, and relevant technical assessments are then served over the cloud for candidates to take and submit for evaluation. However, this initiative is currently in its infancy.

“Joining the dots for a robust and fool-proof talent recruitment programme will happen as we replace the ‘all-powerful human-selection mindset’ with objective decision-making models on AI and ML platforms. The expectation is that algorithms will enable us to complete role fitment through situational analysis,” added Ramapai.

Cloud Computing

According to Ramapai, all of the company’s computing and storage needs are federated and distributed. The systems are built to scale up by up to 25% of capacity at any time, as requirements demand. Maveric Systems embraced the cloud as a natural evolution strategy well before COVID took over.

Talking about application modernization plans for data centres, Ramapai said, “Luckily for us, we have been in a rather modernized (low-legacy) position. Most of our corporate applications (like code repositories) are ‘SaaS-ified’ and already hosted on the cloud. Some labs were local, for which remote access was provided. We took an internal decision that future capacity enhancements (be it data storage or compute) for R&D purposes will happen only on the cloud.”

This was originally published on Economic Times website and is being reproduced here.

Data Labs at Maveric

Research to Results by redefining the intersections between people and technology

The Data Technology Innovation Lab is the Maveric way to solve today’s existential problems by reimagining the customer’s tomorrow-value paradigm.

Banks that capture the zeitgeist of the ’20s will be the ones driving adoption of maturing digital technology that is more commoditized and accessible, while also harnessing scientific advancements that leverage the industry’s ecosystem.

The experiments at the Maveric Data Laboratory merge our deep belief in client reliability with precocious talent across the applied data sciences. As institutions scan the banking ecosystem to either seize the approaching technology shift or negotiate the veiled disruptive threat, Maveric Data Labs prepares you with a potent techno-domain edge to convert disruptions into competitive advantages faster than before.

The steps involved in Maveric-led digital transformations are neither incremental nor about finding the next ‘silver bullet’. Rather, in a contextualized plunge between blue-sky thinking and applied technologies, the experiments at the Data Innovation Lab build solutions that are agnostic to open-source or commercially licensed technologies.

The modus operandi for running experiments here is straightforward.

Guided by a set of concrete business benefits, banks engage with Maveric Data Labs. The engagement can spark in any number of ways, but it needs a problem statement articulated at the outset in specific terms: through an intended business impact, a big-picture challenge to transcend, a tangible human experience to co-create, or even an agreement on bottom-line metrics as the key measure of the experiment’s success.

The problem statement is then parsed into functional and non-functional elements. Thereafter, related experiments are set up, run and packaged as Proofs of Concept. The presented PoCs come with appropriate choices of solution architecture and recommendations for the underlying technologies (open source and/or commercially licensed).

At Maveric Data Labs, innovation missions typically work across a 2-to-8-week window on experiments that aim to put technology back into specific business situations.

A 4-to-8-member agile team of SMEs, domain experts and technology specialists works on naturally scalable, cloud-based solutions across synthetic data (domain models and statistics) and public data (Indian market, social sites, and open data).

A few of the open projects at DataLabs@Maveric include:

  • Live streaming of market data
  • Automated valuation model for real estate
  • E-commerce price comparison website
  • Hotel search and price comparison website
  • Loan processing solution
  • Credit card fraud detection system
  • Targeted marketing based on card transactions
  • Fraud detection in cross-border financial transactions
  • Market-based analysis platform
  • Centralized data hub for a bank

If you are keen to explore the B-I-G question that accelerates the next step in research and development across DATA transformations, we at Data@Maveric-systems.com are listening!

About Maveric Data Technology Practice

Backed by domain, driven by technology, and validated at each step, we are committed to accelerating your business through precise decisions, using on-demand data engineered for accuracy.

 
