Data Labs at Maveric

Research to Results by redefining the intersections between people and technology

The Data Technology Innovation Lab is the Maveric way to solve today’s existential problems by reimagining the customer’s tomorrow-value paradigm.

Banks that capture the zeitgeist of the 2020s will be the ones driving adoption of maturing digital technology that is ever more commoditized and accessible, while also harnessing scientific advancements that leverage the industry’s ecosystem.

Experiments at the Maveric Data Laboratory merge our deep belief in client reliability with precocious talent across the applied data sciences. As institutions scan the banking ecosystem to either seize the approaching technology shift or negotiate the veiled disruptive threat, Maveric Data Labs prepares you with a potent techno-domain edge to convert disruptions into competitive advantages faster than before.

The steps involved in Maveric-led digital transformations are neither incremental nor about finding the next ‘silver bullet’. Rather, as a contextualized plunge between blue-sky thinking and applied technologies, the experiments at the Data Innovation Lab build solutions that are agnostic to open source or commercially licensed technologies.

The modus operandi for running experiments here is straightforward.

Guided by a set of concrete business benefits, banks engage with Maveric Data Labs. The engagement can spark in any number of ways, but it needs a problem statement articulated at the outset in specific terms: an intended business impact, a big-picture challenge to transcend, a tangible human experience to co-create, or even agreed bottom-line metrics as the key measure of the experiment’s success.

The problem statement is then parsed into functional and non-functional elements. Thereafter, related experiments are set up, run, and packaged as Proofs of Concept. The presented PoCs come with appropriate choices of solution architecture and recommendations for underlying technologies (open source and/or commercially licensed).

At the Maveric Data Lab, innovation missions typically work across a 2-to-8-week window on experiments that aim to put technology back into specific business situations.

A 4-to-8-member agile team of SMEs, domain experts, and technology specialists works on naturally scalable, cloud-based solutions across synthetic data (domain models and statistics) and public data (the Indian market, social sites, and open data).

A few of the open projects at DataLabs@Maveric include:

  • Live streaming of market data
  • Automated valuation model for real estate
  • Ecommerce – price comparison website
  • Hotel Search & Price Comparison website
  • Loan processing solution
  • Credit Card Fraud Detection system
  • Targeted marketing based on card transactions
  • Fraud Detection in cross border financial transactions
  • Market basket analysis platform
  • Centralized Data Hub for a Bank

If you are keen to explore the B-I-G question that accelerates the next step in research & development across data transformations, we at Data@Maveric-systems.com are listening!

About Maveric Data Technology Practice

Backed by domain, driven by technology, and validated at each step, we are committed to accelerating your business through precise decisions using on-demand data engineered for accuracy.



A Data Readiness Manifesto in the times of COVID

In a world distanced socially, it appears our work focus has intensified. With fewer interruptions and the universality of video-collaboration applications, the nature of work itself is metamorphosing for millions. And so is the way we do business.

Across geographies and industries, the mounting importance of going digital was never in doubt. But it is now heightened by our changing behaviors forced by COVID.

The pandemic is on a path to radically alter the ways consumers interact with businesses. For instance, a May 2020 BCG banking industry report points out that one in four customers is planning to use branchless banking or stop visiting branches altogether after the crisis.

Returning to the beginning of this piece: I realized last month that staring at my laptop for long(er) hours is a pain in the neck. Literally. A pain in my neck. After an exhaustive hypothesis-elimination exercise, the culprit surfaced: my laptop, via its various inadequacies, was causing the pain (and the watering eyes).

So in earnest began my shopping expedition for a desktop monitor.

In the COVID period, I quickly found out that my favorite computer peripherals stores were either locked down or barely functional. My trusted product advisors were missing. And gone too was a ‘large swathe of decision-making wealth’: the fact-finding, needs analysis, product recommendations, and sales discounts.

For the customer in me, all this counted as an experiential loss. After all, a buyer’s trust-bond is created over many happy hours of exchanging insights and trends.

Questions for digital enterprises (including ones that are planning to go online):

  • Is your business recreating a similar customer experience dynamic?
  • Does your business listen to and engage with your virtual buyers as well as it does those in the brick-and-mortar world?
  • Sure, you are industriously and enthusiastically posting across a host of social media, but are you able to listen to the customer, probe their needs, zero in on a buyer’s pain points, perceive buying behavior and demographic preferences, lay out product nuances, detail the price comparators, and address the numerous other variables that close sales?

If my virtual purchase adventure described below is anything to go by, I am guessing the answers to the above are negative.

Therein lies the first differentiator (and challenge) precipitated by the COVID times: ‘the human-buying-experience’ has to be replicated (and possibly enhanced) digitally.

The neck pain unrelenting, I took my search online, little knowing that a secondary data-avalanche waited ahead.

Tamr, the data mastering company, in its 2020 survey on the state of data and digital transformation, surveyed 300 C-suite executives at US financial institutions with revenue of $1 billion or greater. Here are the key findings:

  • Only one percent of those surveyed are not pursuing digital transformation
  • 47 percent say the main driver of digital transformation is keeping up with the competition
  • 63 percent say data management is a drag on their transformation efforts (because data is unreliable, disorganized, or unusable)
  • One in four are dissatisfied with their current methods for managing data velocity, volume, and variety
  • 75 percent can’t keep up with constant changes in data over time
  • 55 percent say non-scalable data management practices are causing choke-by-data
  • One in two cite ‘speed to insight’ and ‘unifying data across the enterprise’ as their primary weakness
  • More than fifty percent say they aren’t utilizing their current data to its full potential because it is siloed throughout the organization

The derived macro-picture is clear in its summary:

Managing data has notable systemic impediments; inadequate data readiness brings significant drag to digital transformation efforts; and finally, there is a strong need for solutions to data volume, variety, and velocity.

I called up a few trusted and knowledgeable friends. Soon after, I started with the organization whose name is now a global ‘verb’: Google.

I feed in the phrases and search. And dozens of websites tuned to the latest SEO algorithms show up on my screen immediately.

I proceed to spend many knowledge-filled hours on the sites of global (and national) e-retailers. I try the filters: brand, display size, display technology, monitor resolution type, horizontal and vertical resolution, energy consumption certifications, and average customer reviews. I compare and contrast, I analyze and assimilate, I deduce and deliberate.

Slowly and surely, I learn more and more about anything remotely connected to a computer monitor.

Next, I turn to various product blogs and their expert recommendations: nifty pictures, neat summaries of features, advantages, and benefits, and yet more information. I wade in deeper – refresh rates, response times, aspect and contrast ratios, panel technology, bezel design, curved or flat screen, built-in webcam and speakers, viewing angle – and factor in variables unique to my condition, namely the monitor stand adjustments (height, tilt, swivel, pivot, and their permutations).

Needless to say, all facts and information are calibrated against my budget.

All things considered, am I close to making a decision? Not yet.

I find myself asking:

Saved-by-data or death-by-data?

In the absence of a trusted sales professional who listens and diagnoses my needs to offer a slew of recommendations, how does a buyer join the dots himself?

Especially for products that lie outside of his area of expertise.

Questions for digital enterprises (including ones that are planning to go online):

  • Does your data readiness strategy actively engage the virtual buyer (is their feedback collated and their observations correlated)?
  • Is your product and service data consistent, well structured, in a readable format, and aligned to industry aggregators and search engines?
  • Does your data enable real-time integration with other sources for analysis?
  • Can the data be accessed through repeatable, automatic methods?
  • Can the data be tied back to the physical sources involved in production?
  • Does your organization have the resources to institutionalize the knowledge that comes out of data relationships and real-time models?
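The “well-structured, readable format” question above can be made concrete with a small sketch: validating product records against a minimal schema before they are published to aggregators or search engines. The field names and rules here are illustrative assumptions, not a prescribed standard.

```python
# Hypothetical minimal schema for a product record; real aggregator
# feeds would define many more fields and constraints.
REQUIRED_FIELDS = {"sku": str, "name": str, "price": float, "category": str}

def readiness_issues(record: dict) -> list[str]:
    """Return a list of data-readiness problems found in one product record."""
    issues = []
    for field_name, field_type in REQUIRED_FIELDS.items():
        if field_name not in record:
            issues.append(f"missing field: {field_name}")
        elif not isinstance(record[field_name], field_type):
            issues.append(f"wrong type for field: {field_name}")
    return issues

# A clean record passes; a sloppy one surfaces three issues
# (missing name, price stored as a string, missing category).
good = {"sku": "MON-27", "name": "27-inch monitor", "price": 199.0, "category": "displays"}
bad = {"sku": "MON-24", "price": "149"}
```

Running a check like this automatically on every outbound feed is one example of the repeatable, automatic access the checklist alludes to.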

As ‘data’ continues to systemically restructure our society, our economy, and our institutions in ways unseen since the industrial revolution two and a half centuries ago, businesses have to accept that data lies at the structural sweet spot between technology, process, and people.

Before enterprises commit to digital investments, they have to consider the various aspects of data governance, namely, integrity, security, availability, and usability.

And even before that, digital (or soon to be) enterprises have to honestly ‘see’ their data readiness.

Just as a holistically deliberated and uniquely picked monitor is critical to seeing your work (and keeping your health).

This was originally published on the www.emergingpayments.org website and is being reproduced here.


R&D in the Banking Sector: Making the case for Innovation Data Labs

Ranked by industry percentages, what position do you think the banking industry occupies in global research and development spending?

Computing & Electronics, Healthcare, Auto, and Software & Internet are the top four (22%, 21%, 16% and 15% respectively). Banking does not occupy the fifth (Industrials), sixth (Chemicals and Energy) or even seventh (Consumer) spot.

In fact, R&D spending in the banking industry is so low that it is lumped in with “others” at 1.9%.

To cut a long story short: banks have a long way to go in growing their R&D initiatives, especially in today’s shifting landscape, where increasing regulatory pressures, rising customer expectations, innovative technologies, and nimbler challengers regularly combine to disrupt the financial services sector.

Let us next examine in some detail a variable that has admittedly had the largest impact on a BFSI organization’s R&D capabilities, namely open source technology.

The Evolutionary Impetus

Our appetite for far-reaching technology changes is matched (and fuelled) by the incredible leverage open source technologies bring across industries.

(Think of the ‘Unicorns’ formed around Open Source Technologies: Red Hat, MuleSoft, Databricks, Elastic NV, Confluent, Hashicorp)

In fact, the Fintech Open Source Foundation (FINOS), in its 2018 white paper, makes the central argument succinctly: “The question is not whether to use open source but how to do it more strategically, efficiently, and extensively than your competitors. With digital disruption handled collectively by technology solutions that become de facto industry standards, financial services companies become defined not by their software, but by execution and differentiation in customer service.”

Traditionally, large banks protected all technology as intellectual property and the driver of competitive advantage, with large-scale engineering teams building all software from scratch. In the last decade this has changed rapidly.

Open source technology (and other turnkey solutions) has made serious inroads in financial services.

Back-end technologies: Servers supporting the massive compute landscape, data storage and processing, and trading infrastructure – essentially all back-end software capabilities – largely run on open source Linux platforms.

Engineering layer: Financial services software development has been commoditized with the large-scale use of open source for network communications, database storage, workflow management, web application development and much more

“As a Service” offerings: While not necessarily open source, these range from infrastructure and compute power to software and entire platform offerings that greatly reduce the customization and integration effort required by banks.

Regulator-mandated openness & standardization: Regulators have started mandating the industry to open up, standardize and become more transparent, with Europe leading the drive with PSD2 and Open Banking in the UK.

Consequently, plugging in open source solutions means Banks today can free up precious resources to focus on integration and more importantly, concentrate on building their unique business value.

As the BFSI industry turns its attention to fintechs to meet its digitization challenges, the eventual target areas for its R&D efforts do not waver: develop and deploy new technologies to better serve B2B banking customers, increase profits, improve compliance and security preparedness, and reduce infrastructure costs.

If the end-goals are similar, then where does the Banking R&D differentiation come from?

In one word: Reliability

To infuse reliability as a core rubric in its R&D paradigm means Banks have to check a number of boxes.

Firstly, banks and technology teams need to bedrock ‘reliability-as-a-yardstick’ in their partnerships: across vendors, across geographies, across platforms.

Secondly, Reliability is built over time by adopting a divergent approach.

The traditional ‘hire-and-instruct-engineers-on-a-project’ mode does not produce optimal results, because harnessing advanced technologies necessitates an experimental mindset as opposed to the erstwhile engineering approach.

Finally, reliability comes at a cost.

Experimenting with production in real time comes at a sizable expense. One, the cost of errors can be high (especially when teams start implementing a project only to realise the underlying thinking is incorrect and it has to begin anew). Two, the multifarious skill base that runs these R&D experiments is rare to come by (it has to be cherry-picked and cut its teeth on multiple R&D propositions). Both require investment.

Unsurprisingly, then, barring a few major banks, in-house ‘sandbox environments’ have largely been the domain of a few ivy-league academic institutions and an elite start-up/incubator ecosystem, neither option conducive to supporting projects of large scale or variable scope.

Banks today primarily focus on banking processes and not on creating horizontal pieces of technology. Rather than hiring technologists, they are beginning to partner with technology companies that bring the wherewithal to deliver on their high-value R&D outcomes.

Successful innovation labs (data or digital) walk a tightrope between the real-life opportunities of today and the possibilities of tomorrow; between applied technologies and blue-sky thinking.

The operative words here are “balance” and “focus”.

Guided by a set of concrete business benefits, banks seek out innovation labs, which, given a specific problem statement, pare it down to related experiments. Thereafter the labs set up, run, and package Proofs of Concept (PoCs). Then, in quick turnaround, they come back with appropriate choices of solution architecture and recommendations for underlying technologies that either address the specific business challenge or seize the marketplace’s quantum opportunities.

This is the ‘playpen-mindset’: Flexible in approach but committed to the outcome; Balanced and Focussed.

This was originally published on Express Computers website and is being reproduced here.


Enabling Indian Economy to recover faster from the Covid-19 impact

The case for creating banking data architecture that allows seamless information exchange by re-imagining existing data assets.

Along with the corporate and household sectors facing lockdowns, the banking sector is witnessing humongous losses, especially in the financial markets. Decreased productivity, income slowdowns, reduced investor sentiment, supply chain disruptions, and manufacturing hindrances are unprecedented strains with significant economic repercussions this year, and these may overflow into the next.

Governments globally, along with their central banks and supervisory institutions, are rolling out diverse combative measures, be they stimulus funds, liquidity injections, targeted loans to severely affected industry sectors, policy rate cuts, or repayment moratoriums for borrowers of all commercial banks, including housing finance companies and microfinance institutions.

Much of the globe’s economic rebuilding success – on the premise that the virus will plateau in the near future – will hinge on how swiftly governments’ and regulators’ policy responses are converted into financial transactions. The speed of money will determine the speed of recovery.

The need of the hour is minds that look at the situation calmly and clearly, minds that ask precise questions and then seek out pragmatic answers, especially by leveraging existing data ecosystems and assets to forge relatively inexpensive, ready-to-deploy solutions without compromising on data security standards.

There are quite a few challenge areas that governments and banks can rapidly de-risk and de-clog. But the most important one, which will determine success, is the ability to seamlessly onboard new customers and exchange information: digital approvals, payments, and settlements.

Apart from payment processing, the current work-from-home scenario is adversely impacting multiple banking functions that require physical contact and are non-payment related, like customer onboarding and the underwriting and approval of mortgages, loans, lines of credit, credit cards, etc.

In these Covid-19 times that mandate specific containment strategies, banks are forced to work on skeletal staffing, and their associates cannot have physical contact with customers. This fact, combined with strict adherence to regulatory procedures, means banks have to think of migrating and operationalising their customer onboarding processes on a digital-only infrastructure.

Let’s play out this scenario further for customer verification processes. Presently, for higher-value loans, applicants need to present themselves physically at a bank’s premises; for issuing a credit card, a bank’s representative meets applicants to collect multiple verification proofs: residence, income, and other KYC information.

Extending our case in point, why do we not think of piggybacking this transaction dynamic onto our existing data assets? As in, institutionalizing a complete e-onboarding process by creating ‘KYC sharing platforms’. What this essentially implies is that, rather than going back to customers for laborious verification exercises, the proposed data architecture will re-utilize already verified and enriched data assets (AADHAR, PAN, Voter ID, Ration Card, Driving License, Passport, KYC data held by the telecom authority, GST company information, etc.) to generate socially distant but well-connected and secure transactions. The advanced data can ensure that this is achieved without compromising the accuracy or security of the socially connected process.
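As a purely illustrative sketch of this pooling idea (the registry names, record fields, and the two-source rule below are assumptions, not an actual platform API), e-onboarding could look up a customer reference across pre-verified data assets instead of collecting fresh documents:

```python
# Stand-ins for already verified data assets (national ID, tax records,
# telecom KYC, etc.); a real platform would call secure registry APIs.
REGISTRIES = {
    "national_id": {"C123": {"name": "A. Kumar", "address": "Chennai"}},
    "tax_id":      {"C123": {"name": "A. Kumar", "pan": "ABCDE1234F"}},
    "telecom_kyc": {"C123": {"phone": "+91-9000000000"}},
}

def e_onboard(customer_ref: str, min_sources: int = 2) -> dict:
    """Pool KYC attributes from pre-verified registries.

    Onboarding is marked verified only when the reference appears in
    at least `min_sources` independent registries.
    """
    profile, sources = {}, []
    for registry, records in REGISTRIES.items():
        if customer_ref in records:
            sources.append(registry)
            profile.update(records[customer_ref])
    return {
        "customer_ref": customer_ref,
        "verified": len(sources) >= min_sources,
        "sources": sources,
        "profile": profile,
    }

result = e_onboard("C123")  # verified via three independent sources
```

In practice each registry lookup would of course be consent-gated, encrypted, and audited.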

Along with pooling customer identification data (from the mentioned sources), the architecture will need newly created information workflows. These workflows will vitally integrate small businesses that may not have bank accounts or even be registered as business entities. In fact, without this last-mile connectivity, governmental benefits – soft loans and other relief measures – would not reach the intended recipients.

The clear idea here is to architect a data solution for seamless information exchange that accelerates delivery of financial services to both banked and non-banked population. By integrating data assets and pooling them in a way that does not compromise privacy and security, banking can become truly contactless to ensure government schemes and benefits are reaching the needy faster.

And all this without reinventing the wheel by leveraging the significant investments already made in creating GST infrastructure, the UIDAI architecture and the existing seamless payment provisioning processes.

Understandably, the proposed concept will raise regulatory and jurisdictional questions that we must diligently work through; but given the unprecedented nature of the global challenge knocking on our doors, it is time that public-private partners unite across the social, economic, and political spectrum to find pragmatic solutions that revitalize the economy, one definite step at a time.

In summary, digital-only information flows and workflows are a prerequisite for benefits to flow into deserving wallets and ensure a faster economic recovery.

This was originally published on ET CIO website and is being reproduced here.


How data scientists can bolster the future of fintech industry

Just like in the famous Gold Rush of 1849, businesses today are dipping their toes into the data mine to seek value out of it. This deluge of data is pushing the fintech and banking industries to unleash the hidden gems that data analytics can deliver.

One can’t deny that banks and financial institutions generate astronomical amounts of data in the form of customer transactional and non-transactional data. Reports state that 2.5 quintillion bytes of data are generated every day. Have we ever asked whether this data is a promise or a peril? It is no surprise that conventional data processing fails to manage this volume of data and yields insights that are far from reality.

Realising the value of big data requires an analytical eye and technologies such as big data analytics, AI, and machine learning. These help churn data down into meaningful information, thus minimising the risk of decisions based on intuition.

That’s where the role of the data scientist comes into the picture. A data scientist has mastered this treasure hunt: the job requires knowing exactly what information to look for to boost cross-selling and customer satisfaction. In the banking industry, a data scientist can help develop customer profiles, predict behaviours, and track trends, to name a few applications.

According to a survey, the banking and financial services sector is the biggest market for analytics and data science professionals with 44 per cent of all jobs created in this domain. This percentage will grow in the coming years as this sector is actively using data to derive business insights and improve scalability.

The emerging role of data scientists

Over the past few years, the banking industry has achieved new heights through innovative means of meeting evolving customer expectations of personalisation and convenience.

Earlier, banks and other financial institutions used to follow a one-size-fits-all strategy, where every customer was treated with the same approach irrespective of their needs and interests.

Gone are those days when customers would visit banks for every single service like depositing, checking account balance, etc. Customers now use their mobile phones to check their account balances, deposit checks, pay bills, and transfer money.

According to research commissioned by Relay42, a data management platform (DMP), “Digital banking is growing in popularity with 53 per cent of consumers using or willing to move to an online or mobile only bank — 27 per cent have moved already, while 26 per cent are considering the switch”.

There was a time when it would take a few years to build a framework that helped banks gather an overall picture of their customers. As online banking gains popularity, adopting big data analytics becomes all the more important. All this has given room to the new and ever-growing career of the data scientist. A data scientist gives meaning to raw data and uses it to draw insights for better analysis. They help banks establish a 360-degree view of their customers through the analysis of:

  • Customer spending patterns
  • Customer segmentation
  • Risk management processes
  • Customised product offerings
  • Customer loyalty
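The first two items above can be sketched in a few lines. This is a deliberately simplified, rule-based illustration with made-up thresholds and segment names; a practicing data scientist would typically fit a clustering or predictive model instead.

```python
from statistics import mean

def segment_customer(monthly_spend: list[float]) -> str:
    """Bucket a customer by average monthly card spend (illustrative thresholds)."""
    avg = mean(monthly_spend)
    if avg >= 2000:
        return "premium"
    if avg >= 500:
        return "mass-affluent"
    return "mass-market"

# Three customers' last three months of card spend
customers = {
    "C1": [2500, 3100, 2800],  # consistently high spend
    "C2": [600, 450, 700],     # mid-range spend
    "C3": [120, 90, 150],      # low spend
}
segments = {cid: segment_customer(spend) for cid, spend in customers.items()}
# segments -> {'C1': 'premium', 'C2': 'mass-affluent', 'C3': 'mass-market'}
```

Even a toy rule like this shows how raw transaction data becomes a segmentation that product and loyalty teams can act on.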

In addition to this, data scientists help banks in designing, building and maintaining the complex data flows, tools and solutions that are needed from the bank’s data systems to analytics environments.

Moreover, data scientists tend to toil away in advanced analysis tools such as R, SPSS, and Hadoop, and in advanced statistical modeling, easing the process of generating valuable information from piles of data and providing insights on key metrics along with suggested best practices.

It can be rightly said that the fintech domain has benefited from the emergence of analytics.

The path forward

It’s high time that banks adopted big data analytics to remain relevant and profitable in this hyper-competitive business environment. Experts like data scientists will give banks an edge in these growing trends and will bolster the future of the fintech industry.

This career has never-ending benefits and promises a bright future for the data science space, as data has become the new oil driving decision-making. One of the biggest challenges faced by the modern banking industry is legacy systems that aren’t equipped to handle the big data revolution. So banks will need to align their people, processes, and technology platforms to provide highly personalised customer experiences by extracting insights from data.


This article has been published in Express Computer