This article is an extract from our whitepaper, The Power of Data. Download your copy today.
Banks have always recognized the value of their data. The challenge is that legacy infrastructure makes extracting and transforming data into actionable insights difficult and resource-intensive. When the data is finally processed and ready, it's often stored in undocumented structures within systems that aren't accessible in real time.
With greater access to and control over their data, banks can enhance their product set, provide contextual offers, and add more value to the lives of customers. In this blog, we look at how to avoid data siloes, how traditional microservices compare with event-driven ones, and why data governance matters.
One potential risk with traditional microservices is that data related to a specific business function (e.g. transactions) can end up siloed. With event-driven microservices, it is possible to create ‘aggregate’ functions (or domains, as they are often called) that marry data from different business domains to create new microservices.
Take customer recommendations, for instance. Such a service would combine, in real time, data from the transaction domain and the customer domain to recommend products or services based on previous transactions and customer information such as demographics. Aggregate domains therefore depend on data from other domains, as they have no raw data of their own to draw from.
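To make this concrete, here is a minimal sketch in Python of an aggregate domain built on an event-driven backbone. The in-memory EventBus stands in for a real broker such as Kafka, and the topic names, event fields, and recommendation rule are all illustrative assumptions rather than details of any particular core banking system:

```python
from collections import defaultdict
from typing import Callable

class EventBus:
    """Minimal in-memory stand-in for a real event broker (e.g. Kafka)."""
    def __init__(self) -> None:
        self._subscribers: dict[str, list[Callable[[dict], None]]] = defaultdict(list)

    def subscribe(self, topic: str, handler: Callable[[dict], None]) -> None:
        self._subscribers[topic].append(handler)

    def publish(self, topic: str, event: dict) -> None:
        for handler in self._subscribers[topic]:
            handler(event)

class RecommendationAggregate:
    """Aggregate domain: owns no raw data, only projections built from
    events published by the customer and transaction source domains."""
    def __init__(self, bus: EventBus) -> None:
        self.customers: dict[str, dict] = {}
        self.spend: dict[str, dict] = defaultdict(lambda: defaultdict(float))
        bus.subscribe("customer.updated", self.on_customer)
        bus.subscribe("transaction.posted", self.on_transaction)

    def on_customer(self, event: dict) -> None:
        self.customers[event["customer_id"]] = event

    def on_transaction(self, event: dict) -> None:
        self.spend[event["customer_id"]][event["category"]] += event["amount"]

    def recommend(self, customer_id: str) -> str:
        spend = self.spend.get(customer_id)
        if not spend:
            return "current-account"
        top_category = max(spend, key=spend.get)  # toy rule: top spending category
        return {"travel": "travel-insurance",
                "dining": "cashback-card"}.get(top_category, "savings-account")

bus = EventBus()
recs = RecommendationAggregate(bus)
bus.publish("customer.updated", {"customer_id": "c1", "age_band": "25-34"})
bus.publish("transaction.posted",
            {"customer_id": "c1", "category": "travel", "amount": 420.0})
print(recs.recommend("c1"))  # -> travel-insurance
```

The key property is that the aggregate never queries the source domains directly; it simply maintains its own projections from the events they publish.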
This is another way banks can be more nimble. In a traditional microservices architecture, the application connects directly to the relevant service via an API and extracts the data it needs. This is problematic because the aggregate domain (in this case, customer recommendations) has no way of knowing when something has taken place in the source domain (in this case, the transactions domain). It therefore has to request batches of data from the source domain's API at set intervals, making it harder to access the data in real time.
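To make the contrast concrete, here is a rough sketch, reusing the EventBus from the previous example. The poll_forever function and its fetch_batch parameter are hypothetical stand-ins for a batch request against the source domain's API:

```python
import time

def handle(tx: dict) -> None:
    print("processing transaction", tx["id"])

# Old approach: the aggregate cannot know when something has happened in
# the source domain, so it requests batches from the API at set intervals.
def poll_forever(fetch_batch, interval_seconds: int = 300) -> None:
    last_seen = None
    while True:
        # fetch_batch stands in for e.g. GET /transactions?since=<last_seen>
        for tx in fetch_batch(since=last_seen):
            handle(tx)  # by the time it arrives, the data may be minutes old
            last_seen = tx["id"]
        time.sleep(interval_seconds)

# Event-driven approach: the source domain pushes each event as it occurs,
# so the aggregate reacts immediately and never has to guess when to ask.
bus = EventBus()  # the EventBus from the previous sketch
bus.subscribe("transaction.posted", handle)
bus.publish("transaction.posted", {"id": "tx-1", "amount": 9.99})
```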
Given that financial institutions operate under strict rules on data and governance, it's important that data is published in a federated manner so that all the different business domains within the core banking system share the same view of the world. For instance, all domains need to share the same definition of what a transaction is; you can't have one domain defining it one way and another defining it differently. That means when banks publish data on the mesh, there should be common rules about how the data is published: for example, every transaction should always carry a transaction key, or it should always have an amount attached.
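Here is a minimal sketch of what such a common rule might look like at the publishing boundary. The field names and the hand-rolled check are illustrative assumptions; in practice a data mesh would typically enforce contracts through a schema registry format such as Avro or JSON Schema:

```python
# A shared contract for the 'transaction' event, agreed across all domains.
TRANSACTION_CONTRACT = {
    "transaction_key": str,  # every published transaction must carry a key...
    "amount": float,         # ...and an amount
}

class ContractViolation(Exception):
    pass

def validate(event: dict, contract: dict) -> dict:
    for name, expected_type in contract.items():
        if name not in event:
            raise ContractViolation(f"missing required field: {name}")
        if not isinstance(event[name], expected_type):
            raise ContractViolation(f"{name} must be {expected_type.__name__}")
    return event

def publish_to_mesh(bus, topic: str, event: dict) -> None:
    # Every domain publishes through the same gate, so every domain
    # works with the same definition of what a transaction is.
    bus.publish(topic, validate(event, TRANSACTION_CONTRACT))
```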
An event-driven approach also helps with record keeping and compliance because every event carries a schema, and that schema is tagged with metadata, for instance whether or not the data contains personally identifiable information. For GDPR requests, any data that could identify a customer is clearly tagged, making it easier to retrieve or delete as needed.
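As a sketch, assuming a simple dictionary-based schema format (the field names and layout are illustrative, not a real schema registry's API), PII tagging and a right-to-erasure request might look like this:

```python
# Each field in the event schema carries metadata, including a PII flag.
TRANSACTION_SCHEMA = {
    "transaction_key": {"type": "string", "pii": False},
    "amount":          {"type": "number", "pii": False},
    "customer_name":   {"type": "string", "pii": True},
    "iban":            {"type": "string", "pii": True},
}

def pii_fields(schema: dict) -> set[str]:
    """Fields flagged as personally identifiable information."""
    return {name for name, meta in schema.items() if meta["pii"]}

def erase_pii(event: dict, schema: dict) -> dict:
    """Serve a GDPR erasure request: strip only the PII fields,
    keeping non-identifying data intact for record keeping."""
    flagged = pii_fields(schema)
    return {k: ("[REDACTED]" if k in flagged else v) for k, v in event.items()}

event = {"transaction_key": "tx-1", "amount": 9.99,
         "customer_name": "Jane Doe", "iban": "GB00BANK00000000000000"}
print(erase_pii(event, TRANSACTION_SCHEMA))
# -> {'transaction_key': 'tx-1', 'amount': 9.99,
#     'customer_name': '[REDACTED]', 'iban': '[REDACTED]'}
```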
This article is an extract from our whitepaper "The Power of Data". To read on, please download the full whitepaper via the button below.