Data makes the difference in modern banking. Banks rely on data to build a picture of their customers and deliver the fast, seamless, and tailored digital experiences customers expect today. The faster data is consolidated and made actionable, the more relevant and personal banks can make the experience.
Confluent is a data streaming platform that banks use to consume, join, and enrich data from multiple sources – like core banking systems, card platforms, customer-facing apps, and marketing platforms – into one central place for analytics.
In this article, we'll explore how we use Confluent at 10x and the benefits of event-driven architectures in banking.
How we use Confluent at 10x
We've built Confluent into SuperCore®, our next-generation core banking platform, to give banks real-time access to their data. Extracting and transforming data is a huge challenge for traditional banks. Typically, by the time data becomes actionable insight, it's already out of date, making it hard to offer contextual promotions and pricing or make timely decisions.
It's difficult for banks to action their critical data because it's locked up in mainframe banking systems or proprietary vendor databases. Data is extracted in bulk, usually daily or weekly, through batch processes. While mainframes are highly dependable, they aren't architected to deliver data in real time for analytical purposes – they pre-date the digital banking era and were built for very different infrastructure requirements.
For more information on our architecture and how we use Confluent, check out this infographic.
Building Confluent into our core banking platform removes batch processing – banks can unlock insights and capitalize on the vast quantities of data flowing through their systems much faster.
For example, if a customer spends in an airport, a push notification could be sent to promote travel insurance or offer discounted lounge access. To make this happen, the transaction must be processed in real time, and data from multiple domains – including the device and core banking domains – must be combined. Since these data points don't typically sit within the same platform, the data solution needs to be open-source and easily integrated.
This wouldn't be possible with a mainframe – the bank wouldn't know their customer was at the airport until 24 hours later, and pulling all the data together would be resource-intensive. By then, the customer would have already boarded their flight and reached their destination.
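To make this concrete, below is a minimal Kafka Streams sketch of how the airport scenario could be wired up on a streaming platform. It is an illustration under assumptions, not our production topology: the topic names (card-transactions, device-locations, contextual-offers), the JSON layout, and the airport check are all hypothetical.

```java
// Hypothetical sketch: enrich real-time card transactions (core banking domain)
// with the latest device location per customer (device domain) and emit a
// contextual offer. Topic names and payload fields are illustrative assumptions.
import org.apache.kafka.common.serialization.Serdes;
import org.apache.kafka.streams.KafkaStreams;
import org.apache.kafka.streams.StreamsBuilder;
import org.apache.kafka.streams.StreamsConfig;
import org.apache.kafka.streams.kstream.Consumed;
import org.apache.kafka.streams.kstream.KStream;
import org.apache.kafka.streams.kstream.KTable;
import org.apache.kafka.streams.kstream.Produced;

import java.util.Properties;

public class ContextualOfferTopology {
    public static void main(String[] args) {
        StreamsBuilder builder = new StreamsBuilder();

        // Core banking domain: card transactions, keyed by customer ID, JSON values.
        KStream<String, String> transactions =
                builder.stream("card-transactions", Consumed.with(Serdes.String(), Serdes.String()));

        // Device domain: latest known device location per customer.
        KTable<String, String> deviceLocations =
                builder.table("device-locations", Consumed.with(Serdes.String(), Serdes.String()));

        // Join the two domains per customer, keep transactions made at an airport,
        // and publish a travel-insurance offer event for the notification service.
        transactions
                .join(deviceLocations, (txn, location) ->
                        "{\"txn\":" + txn + ",\"location\":" + location + "}")
                .filter((customerId, enriched) -> enriched.contains("\"venue\":\"airport\""))
                .mapValues(enriched -> "{\"offer\":\"travel-insurance\",\"context\":" + enriched + "}")
                .to("contextual-offers", Produced.with(Serdes.String(), Serdes.String()));

        Properties props = new Properties();
        props.put(StreamsConfig.APPLICATION_ID_CONFIG, "contextual-offers-app");
        props.put(StreamsConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092");
        props.put(StreamsConfig.DEFAULT_KEY_SERDE_CLASS_CONFIG, Serdes.StringSerde.class);
        props.put(StreamsConfig.DEFAULT_VALUE_SERDE_CLASS_CONFIG, Serdes.StringSerde.class);

        new KafkaStreams(builder.build(), props).start();
    }
}
```

Because the join happens continuously as events arrive, the offer can reach the customer while they are still in the departure lounge, rather than after an overnight batch run.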
Why move to an event-driven architecture?
"Confluent is brilliant for bringing things together and adding richness to banking data," explains Stuart Coleman, Director of Data and Analytics at 10x. "Many banks use it to get data out of their legacy system and into a place where they can do more interesting things."
While this offers banks a way to work around their mainframe, it doesn't address the reliance on batch processes or tackle the complexity of their architecture. "Even though my data ends up in a better place, if I want to go in and fix an issue, I still need to go in and fix the mainframe," explains Stuart.
In an event-driven architecture, there are fewer dependencies between systems. Instead, individual systems produce and consume event streams without needing to be hardwired. This enables banks to iterate their tech stack far more quickly.
"If one platform has an outage, it doesn’t matter. Each platform can even consume data at different rates if they want to. The whole architecture becomes decoupled, making the innovation rate much faster. Now, if a bank needs to change one component, they don't need to worry about 20 other connections. The whole thing becomes far less brittle."
Event-driven architecture in action: improving credit card fraud controls
"It's not uncommon for banks to have two or three systems in use that don't speak to each other," explains Stuart. "You can't make unified decisions without a joined-up view. A customer could have a known fraud on their debit card, but a similar attack on their credit card three hours later could get through."
An event-driven architecture would solve this. By joining data from these separate systems with Confluent, the bank can create a single stream that provides a complete picture of the customer, so fraudulent activity can be blocked faster across all channels.
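As a rough sketch of what that joined-up view could look like on Confluent (with hypothetical topic names and payloads), the Kafka Streams topology below joins confirmed debit-card fraud alerts with credit-card transactions for the same customer inside a three-hour window and emits a block instruction for each match.

```java
// Hypothetical sketch: correlate debit-card fraud alerts with credit-card
// transactions for the same customer within a three-hour window, so a known
// fraud on one channel can block a similar attack on the other.
import org.apache.kafka.common.serialization.Serdes;
import org.apache.kafka.streams.KafkaStreams;
import org.apache.kafka.streams.StreamsBuilder;
import org.apache.kafka.streams.StreamsConfig;
import org.apache.kafka.streams.kstream.Consumed;
import org.apache.kafka.streams.kstream.JoinWindows;
import org.apache.kafka.streams.kstream.KStream;
import org.apache.kafka.streams.kstream.Produced;
import org.apache.kafka.streams.kstream.StreamJoined;

import java.time.Duration;
import java.util.Properties;

public class CrossChannelFraudTopology {
    public static void main(String[] args) {
        StreamsBuilder builder = new StreamsBuilder();

        // Both topics are keyed by customer ID so the join lines up per customer.
        KStream<String, String> debitFraudAlerts =
                builder.stream("debit-fraud-alerts", Consumed.with(Serdes.String(), Serdes.String()));
        KStream<String, String> creditTransactions =
                builder.stream("credit-card-transactions", Consumed.with(Serdes.String(), Serdes.String()));

        // Emit a block instruction whenever a credit-card transaction and a confirmed
        // debit-card fraud alert for the same customer fall within three hours of each other.
        creditTransactions
                .join(debitFraudAlerts,
                        (creditTxn, fraudAlert) ->
                                "{\"action\":\"block\",\"reason\":\"recent-debit-fraud\",\"txn\":" + creditTxn + "}",
                        JoinWindows.ofTimeDifferenceWithNoGrace(Duration.ofHours(3)),
                        StreamJoined.with(Serdes.String(), Serdes.String(), Serdes.String()))
                .to("card-block-instructions", Produced.with(Serdes.String(), Serdes.String()));

        Properties props = new Properties();
        props.put(StreamsConfig.APPLICATION_ID_CONFIG, "cross-channel-fraud-app");
        props.put(StreamsConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092");

        new KafkaStreams(builder.build(), props).start();
    }
}
```

Neither card platform needs to know the other exists – each simply publishes its events, and the fraud topology consumes both.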
Conclusion
Banks can move from retrospective data analysis to real-time events with our cloud-native core banking platform. As a result, it becomes easier to be more customer-centric (through unified data) and deliver the personalized experiences digital consumers expect.
Got any questions regarding data, migrating to a 10x core, or doing more with the insights your bank generates? Talk to us here.