One of the largest institutional banks in the U.S.
The client wanted to build a Kafka-based data pipeline that sources data from multiple databases into a single operational data store.
We created a data pipeline to:
• Ingest raw data from multiple source systems in a predefined format
• Apply transformation services that convert the raw data into business events
• Deliver the resulting business events to an operational data store (a minimal sketch follows this list)
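Below is a minimal sketch of what one stage of such a pipeline could look like as a Kafka Streams application in Java. The topic names (raw-source-records, business-events), the string payloads, and the toBusinessEvent mapping are illustrative assumptions; the case study does not disclose the client's actual schemas or topology.

```java
import java.util.Properties;

import org.apache.kafka.common.serialization.Serdes;
import org.apache.kafka.streams.KafkaStreams;
import org.apache.kafka.streams.StreamsBuilder;
import org.apache.kafka.streams.StreamsConfig;
import org.apache.kafka.streams.kstream.KStream;

public class BusinessEventPipeline {

    public static void main(String[] args) {
        Properties props = new Properties();
        props.put(StreamsConfig.APPLICATION_ID_CONFIG, "raw-to-business-events");
        props.put(StreamsConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092");
        props.put(StreamsConfig.DEFAULT_KEY_SERDE_CLASS_CONFIG, Serdes.String().getClass());
        props.put(StreamsConfig.DEFAULT_VALUE_SERDE_CLASS_CONFIG, Serdes.String().getClass());

        StreamsBuilder builder = new StreamsBuilder();

        // 1. Ingest raw records published to Kafka by the source systems.
        //    "raw-source-records" is a hypothetical input topic name.
        KStream<String, String> raw = builder.stream("raw-source-records");

        // 2. Transform each raw record into a business event,
        //    dropping null/tombstone records first.
        KStream<String, String> events = raw
                .filter((key, value) -> value != null)
                .mapValues(BusinessEventPipeline::toBusinessEvent);

        // 3. Publish business events to an output topic; a sink
        //    connector would load this topic into the operational data store.
        events.to("business-events");

        KafkaStreams streams = new KafkaStreams(builder.build(), props);
        Runtime.getRuntime().addShutdownHook(new Thread(streams::close));
        streams.start();
    }

    // Placeholder transformation (assumption): in practice this would map
    // the source-system schema onto the business-event schema.
    private static String toBusinessEvent(String rawValue) {
        return "{\"event\":" + rawValue + "}";
    }
}
```

In this layout, source connectors publish raw database changes to the input topic, the streams application handles the raw-to-business-event transformation, and a sink connector loads the output topic into the operational data store, so no system integrates point-to-point with any other.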
Using this pipeline, the client successfully:
• Synchronized data in near real time
• Fully decoupled the architecture, avoiding any point-to-point integrations
• Established a scalable framework for the customer
• Designed a distributed architecture for high throughput and optimized performance