A Canadian logistics company owning 20,100 kilometers (12,500 mi) of track across seven Canadian provinces and extending into the United States.
THE CHALLENGE / REQUIREMENT
The client sourced data from multiple business applications, and because that data arrived with significant latency, the actual movements and loading operations of their fleet were not reflected in an up-to-date view. This led to under-utilization of the fleet and to delays, resulting in lost revenue.
The client was therefore looking for a centralized data hub: a single store updated on a low-latency schedule. The challenge was to meet end-to-end latency targets while performing complex transformation and validation. An additional challenge lay in capturing business events from legacy applications and publishing them to the messaging platform in canonical form.
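As an illustration of that last step, the sketch below wraps a legacy-application event in a canonical envelope and publishes it to Kafka. The broker address, topic name (fleet-events), and event fields are assumptions made for the example, not the client's actual schema.

```java
import org.apache.kafka.clients.producer.KafkaProducer;
import org.apache.kafka.clients.producer.ProducerRecord;
import org.apache.kafka.common.serialization.StringSerializer;

import java.util.Properties;

/**
 * Minimal sketch: wrap a legacy-application event in a canonical JSON
 * envelope and publish it to a Kafka topic. Topic name, field names, and
 * JSON layout are illustrative assumptions, not the client's schema.
 */
public class CanonicalEventPublisher {

    public static void main(String[] args) {
        Properties props = new Properties();
        props.put("bootstrap.servers", "localhost:9092"); // assumed broker address
        props.put("key.serializer", StringSerializer.class.getName());
        props.put("value.serializer", StringSerializer.class.getName());

        try (KafkaProducer<String, String> producer = new KafkaProducer<>(props)) {
            // Hypothetical canonical envelope for a "unit loaded" event from a legacy SoR.
            String canonicalEvent = """
                {
                  "eventType": "FLEET_UNIT_LOADED",
                  "sourceSystem": "LEGACY_TMS",
                  "unitId": "CAR-12345",
                  "locationCode": "YARD-03",
                  "occurredAt": "2023-01-15T08:30:00Z"
                }
                """;

            // Key by unit id so all events for one unit land in the same partition,
            // preserving their order for downstream consumers.
            producer.send(new ProducerRecord<>("fleet-events", "CAR-12345", canonicalEvent));
            producer.flush();
        }
    }
}
```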
THE SOLUTION
We built a common, canonical data model that met the client's requirements for demand planning and operations. We implemented an enterprise messaging platform (Kafka) and developed and deployed multiple streaming jobs (Apache Spark, built and released through Azure DevOps CI/CD pipelines) to consume business events from various systems of record (SoRs) and transform them into the common data model. Finally, we exposed the Data Hub's data through domain data services (Java Spring Boot), dashboards (Angular), planning tools, and other consumers.
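A minimal sketch of one such streaming job is shown below: it reads events from a Kafka topic with Spark Structured Streaming, parses them against an assumed canonical schema, and writes them to a Parquet sink standing in for the Data Hub store. The topic, schema fields, broker address, and paths are illustrative assumptions; the client's actual jobs, sinks, and canonical model are not reproduced here.

```java
import static org.apache.spark.sql.functions.col;
import static org.apache.spark.sql.functions.from_json;

import org.apache.spark.sql.Dataset;
import org.apache.spark.sql.Row;
import org.apache.spark.sql.SparkSession;
import org.apache.spark.sql.streaming.StreamingQuery;
import org.apache.spark.sql.types.DataTypes;
import org.apache.spark.sql.types.StructType;

/**
 * Minimal sketch of one streaming job: read business events from Kafka,
 * parse them against an assumed canonical schema, and persist the result.
 * Topic, schema, broker address, and sink paths are illustrative only.
 */
public class FleetEventStreamJob {

    public static void main(String[] args) throws Exception {
        SparkSession spark = SparkSession.builder()
                .appName("fleet-event-stream")
                .getOrCreate();

        // Assumed canonical schema for fleet events (not the client's actual model).
        StructType canonicalSchema = new StructType()
                .add("eventType", DataTypes.StringType)
                .add("sourceSystem", DataTypes.StringType)
                .add("unitId", DataTypes.StringType)
                .add("locationCode", DataTypes.StringType)
                .add("occurredAt", DataTypes.TimestampType);

        // Consume raw events from Kafka and map them onto the canonical schema.
        Dataset<Row> events = spark.readStream()
                .format("kafka")
                .option("kafka.bootstrap.servers", "localhost:9092") // assumed broker
                .option("subscribe", "fleet-events")                 // assumed topic
                .load()
                .selectExpr("CAST(value AS STRING) AS json")
                .select(from_json(col("json"), canonicalSchema).as("event"))
                .select("event.*");

        // Persist the canonical events; a Parquet sink stands in for the Data Hub store.
        StreamingQuery query = events.writeStream()
                .format("parquet")
                .option("path", "/data-hub/fleet-events")            // assumed location
                .option("checkpointLocation", "/data-hub/_checkpoints/fleet-events")
                .start();

        query.awaitTermination();
    }
}
```

Domain data services (for example, Spring Boot REST endpoints) and Angular dashboards would then query the persisted canonical data rather than the source systems directly.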
THE RESULT
With a single, low-latency, and reliable data hub for transport management, the client was able to achieve:
Enhanced visibility of vehicle demand and supply, resulting in improved efficiency and productivity
Higher utilization and optimal loading of their fleet, resulting in up to an 80% improvement in on-time departures