Journey from Telematics Platform to Connected Platform
In the automotive industry, telematics has come a very long way: from basic, general-purpose Track & Trace (T&T) applications to complex Integrated Vehicle Health Management (IVHM) platforms fueled by advanced data analytics. This evolution is in line with the industry's focus on connected, electric and autonomous vehicles, with ADAS projects at various levels of autonomy underway across the industry. These developments, however, present a unique challenge for enterprises running legacy or third-party telematics systems.
Question#1: Journey from Track & Trace to Connected Platform
How can organizations seamlessly transform legacy telematics systems, architected to serve T&T requirements, into platforms for connected, electric and autonomous use cases that ingest millions of data points every second and demand much faster data processing and decision-making, whether in the cloud or at the edge?
Question#2: How do we achieve this transformation securely?
With the sophistication of attackers and the volume of attacks having grown multifold over the years, making the existing T&T platform secure and robust against cyber-attacks remains one of the top challenges in this transformation.
An Approach
As in most situations, there are multiple technical solutions to achieve the above objectives. In this article, however, we will focus on one approach: building on the traditional Track & Trace platform.
Track & Trace Platform
Conventionally, a Track & Trace platform is used by organizations for basic track-and-trace operations, trip usage analysis and route analysis. The typical characteristics of a conventional T&T platform are:
- Monolithic Design
- Virtual machine-based deployment
- RDBMS
Integrated Vehicle Health Management (IVHM)
Integrated vehicle health management (IVHM) is the unified capability of systems to assess the current or future state of member-system health and integrate that picture of system health within a framework of available resources and operational demand. IVHM requires a holistic approach and deep integration with the Original Equipment Manufacturer's (OEM's) ERP and IT systems. The typical characteristics of an IVHM platform are:
- Microservices-driven architecture
- Container-based deployment topology
- NoSQL / data lake storage
- Native API support
- Native data streaming support
Transformation Approach
We propose a two-step process to achieve this transformation, powered by an open-source, open-standards-based framework that supports evolving the legacy telematics platform into a complete IVHM stack in line with ongoing business demand.
Step#1: A brownfield step in which a data analytics / digital enablement platform (DEP) layer is plugged into the existing Track & Trace platform. This step lets OEMs continue serving existing customers while processing real-time telematics data to find actionable insights, which is the bedrock of an IVHM.
The DEP will be able to ingest data into a data lake via traditional message queues as well as more modern Kafka streams.
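As an illustration, below is a minimal sketch of such an ingestion producer publishing TCU data points to a Kafka topic. The broker address, topic name and payload fields are assumptions made for illustration only.

```python
# Minimal sketch: publish telematics data points to a Kafka topic.
# Broker address, topic name and payload fields are illustrative assumptions.
import json
import time

from kafka import KafkaProducer  # pip install kafka-python

producer = KafkaProducer(
    bootstrap_servers="kafka.example.internal:9092",
    key_serializer=lambda k: k.encode("utf-8"),
    value_serializer=lambda v: json.dumps(v).encode("utf-8"),
)

def publish_data_point(vehicle_id: str, lat: float, lon: float, speed_kmph: float) -> None:
    """Send one telematics record, keyed by vehicle ID so each vehicle's data stays ordered."""
    record = {
        "vehicle_id": vehicle_id,
        "lat": lat,
        "lon": lon,
        "speed_kmph": speed_kmph,
        "ts": int(time.time() * 1000),
    }
    producer.send("telematics.raw", key=vehicle_id, value=record)

publish_data_point("VECV-TRUCK-001", 28.6139, 77.2090, 54.2)
producer.flush()
```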
- By following an ELT (Extract, Load and Transform) design paradigm rather than the more traditional ETL (Extract, Transform and Load), data can be stored in the integrated data lake in its as-is format (a minimal landing sketch follows this list).
- Automotive-ready algorithms running on a Kubernetes container management platform can then mine this data for deep, actionable insights in an efficient manner.
- This information can then be made available through API management platforms/frameworks such as Django, WSO2, IBM API Connect, AWS API Gateway, etc.
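To make the ELT point above concrete, the following is a minimal sketch of the "load" step: raw Kafka messages are landed in an object-store data lake exactly as received, with all transformation deferred. The bucket, prefix and topic names are assumptions.

```python
# Minimal ELT "load" sketch: land raw Kafka messages, untransformed, in the data lake.
# Bucket, prefix and topic names are illustrative assumptions.
import datetime
import uuid

import boto3                     # pip install boto3
from kafka import KafkaConsumer  # pip install kafka-python

s3 = boto3.client("s3")
consumer = KafkaConsumer(
    "telematics.raw",
    bootstrap_servers="kafka.example.internal:9092",
    group_id="dep-raw-landing",
)

for message in consumer:
    # Store each payload as-is; transformation happens later (ELT, not ETL).
    day = datetime.date.today().isoformat()
    key = f"raw/telematics/dt={day}/{uuid.uuid4()}.json"
    s3.put_object(Bucket="oem-integrated-data-lake", Key=key, Body=message.value)
```

In practice messages would be micro-batched before writing, but the one-object-per-message loop keeps the load/transform split easy to see.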
Step#2: In the second step, the message queues and web service endpoints at the ingestion layer are replaced by a Kafka broker, and Telematics Control Unit (TCU) lifecycle management is taken over by an HSM-backed device management platform.
TCU devices with HSM modules can be managed by a device management platform that supports, at a bare minimum, certificate-based device onboarding. Key management, including aspects such as key rotation, becomes very important here. This securely enables ingestion of data points directly into the data lake after device and endpoint authentication and authorization.
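Below is a minimal sketch of how a TCU-side producer could authenticate to the ingestion endpoint with mutual TLS. The certificate paths and broker address are assumptions; in a real deployment the device certificate is issued during onboarding and its private key is protected by the HSM.

```python
# Minimal sketch: a TCU-side producer authenticating to the Kafka broker with mutual TLS.
# Certificate paths and broker address are illustrative assumptions; the device certificate
# would be issued during onboarding and rotated by the device management platform.
from kafka import KafkaProducer  # pip install kafka-python

producer = KafkaProducer(
    bootstrap_servers="ingest.example.internal:9093",
    security_protocol="SSL",
    ssl_cafile="/etc/tcu/certs/oem-root-ca.pem",    # trust anchor for the broker
    ssl_certfile="/etc/tcu/certs/device-cert.pem",  # per-device client certificate
    ssl_keyfile="/etc/tcu/certs/device-key.pem",    # key material backed by the HSM
)

producer.send("telematics.raw", b'{"vehicle_id": "VECV-TRUCK-001", "soc_pct": 76}')
producer.flush()
```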
Apache Kafka
Apache Kafka is used for real-time streams of data, to collect big data, or to do real-time analysis (or both). Kafka is used with in-memory microservices to provide durability, and it can be used to feed events to CEP (complex event processing) systems and IoT/IFTTT-style automation systems.
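As a simple illustration of the CEP/IFTTT-style usage mentioned above, the sketch below applies one rule to the stream in real time. The topic, field names and threshold are assumptions.

```python
# Minimal CEP-style rule over the Kafka stream: flag coolant over-temperature in real time.
# Topic name, field names and the threshold are illustrative assumptions.
import json

from kafka import KafkaConsumer  # pip install kafka-python

COOLANT_TEMP_LIMIT_C = 105

consumer = KafkaConsumer(
    "telematics.raw",
    bootstrap_servers="kafka.example.internal:9092",
    group_id="ivhm-overtemp-rule",
    value_deserializer=lambda v: json.loads(v.decode("utf-8")),
)

for message in consumer:
    point = message.value
    if point.get("coolant_temp_c", 0) > COOLANT_TEMP_LIMIT_C:
        # In a real deployment this would raise an alert or trigger a maintenance workflow.
        print(f"Over-temperature on {point['vehicle_id']}: {point['coolant_temp_c']} C")
```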
In parallel, other data sources such as Enterprise Resource Planning (ERP) data, Customer Relationship Management (CRM) data and other historical data can be pointed to the integrated data lake.
Apache Spark
Apache Spark is an open-source, distributed processing system used for big data workloads. It utilizes in-memory caching and optimized query execution for fast analytic queries against data of any size. It provides development APIs in Java, Scala, Python and R, and supports code reuse across multiple workloads: batch processing, interactive queries, real-time analytics, machine learning and graph processing.
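A minimal PySpark sketch of how such analytic queries could run over the raw telemetry landed in the data lake is shown below; the paths and field names are assumptions.

```python
# Minimal PySpark sketch: read raw telemetry from the data lake and compute per-vehicle
# health aggregates. Paths and field names are illustrative assumptions.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("ivhm-health-aggregates").getOrCreate()

raw = spark.read.json("s3a://oem-integrated-data-lake/raw/telematics/")

vehicle_health = (
    raw.groupBy("vehicle_id")
       .agg(
           F.avg("coolant_temp_c").alias("avg_coolant_temp_c"),
           F.max("speed_kmph").alias("max_speed_kmph"),
           F.count("*").alias("data_points"),
       )
)

# Example fleet-wide correlation between operating temperature and speed.
print(raw.stat.corr("coolant_temp_c", "speed_kmph"))

vehicle_health.write.mode("overwrite").parquet(
    "s3a://oem-integrated-data-lake/curated/vehicle_health/"
)
```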
More advanced correlation analysis can then be run by augmenting the already running models hosted in containers, as additional data is now available. Once mature, the legacy T&T application can be sunset.
API Management Platform
Telematics data such as location data and vehicle health data can be shared with various stakeholders, such as e-commerce companies, fleet management operators, load aggregators, government authorities, etc., via APIs. We propose a complete API management platform to manage the lifecycle of all these APIs.
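As an illustration, here is a minimal sketch of one such endpoint, assuming a Django project (Django being one of the frameworks mentioned earlier). The URL pattern, response fields and the data-access helper are hypothetical.

```python
# Minimal sketch of a read-only vehicle-health endpoint, assuming an existing Django project.
# The URL pattern, response fields and get_latest_health_snapshot() are hypothetical.
from django.http import JsonResponse
from django.urls import path

def vehicle_health(request, vehicle_id):
    """Return the latest health snapshot for one vehicle to an authorized API consumer."""
    # get_latest_health_snapshot() is a hypothetical data-access helper backed by the
    # curated layer of the data lake.
    snapshot = get_latest_health_snapshot(vehicle_id)
    return JsonResponse(
        {
            "vehicle_id": vehicle_id,
            "avg_coolant_temp_c": snapshot["avg_coolant_temp_c"],
            "max_speed_kmph": snapshot["max_speed_kmph"],
        }
    )

urlpatterns = [
    path("api/v1/vehicles/<str:vehicle_id>/health", vehicle_health),
]
```

Authentication, rate limiting and monetization policies would sit in front of this endpoint in the API management layer rather than in the application code.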
Object Storage Layer
An object storage layer can be used for data storage as well as data archival. Object storage can also serve as the OLAP layer.
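A minimal sketch of how archival could be automated on an S3-compatible object store with a lifecycle rule is shown below; the bucket name, prefix and retention periods are assumptions.

```python
# Minimal sketch: a lifecycle rule that archives raw telemetry after it has been curated.
# Bucket name, prefix and retention periods are illustrative assumptions.
import boto3  # pip install boto3

s3 = boto3.client("s3")

s3.put_bucket_lifecycle_configuration(
    Bucket="oem-integrated-data-lake",
    LifecycleConfiguration={
        "Rules": [
            {
                "ID": "archive-raw-telematics",
                "Filter": {"Prefix": "raw/telematics/"},
                "Status": "Enabled",
                # Move raw data to a colder storage class after 90 days,
                # then expire it after two years.
                "Transitions": [{"Days": 90, "StorageClass": "GLACIER"}],
                "Expiration": {"Days": 730},
            }
        ]
    },
)
```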
K8s-based Track & Trace Platform
The legacy Track & Trace application can then be migrated to the Kubernetes container management platform.
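As a sketch, the migrated service could be rolled out with the official Kubernetes Python client as below; the image name, namespace and replica count are assumptions (in practice this is usually expressed as YAML manifests or Helm charts).

```python
# Minimal sketch: deploy the containerized Track & Trace service to Kubernetes using the
# official Python client. Image, namespace and replica count are illustrative assumptions.
from kubernetes import client, config  # pip install kubernetes

config.load_kube_config()  # use config.load_incluster_config() when running in-cluster

deployment = client.V1Deployment(
    metadata=client.V1ObjectMeta(name="track-and-trace"),
    spec=client.V1DeploymentSpec(
        replicas=3,
        selector=client.V1LabelSelector(match_labels={"app": "track-and-trace"}),
        template=client.V1PodTemplateSpec(
            metadata=client.V1ObjectMeta(labels={"app": "track-and-trace"}),
            spec=client.V1PodSpec(
                containers=[
                    client.V1Container(
                        name="track-and-trace",
                        image="registry.example.internal/telematics/track-and-trace:1.0.0",
                        ports=[client.V1ContainerPort(container_port=8080)],
                    )
                ]
            ),
        ),
    ),
)

client.AppsV1Api().create_namespaced_deployment(namespace="telematics", body=deployment)
```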
Web Application Firewall
A web application firewall will help protect both the Track & Trace platform and the APIs.
The rest of the architecture remains as-is.
Benefit
This framework can not only help automotive companies accelerate their data monetization journey, but also secure the legacy telematics platform and its historical data (both in transit and at rest), while accelerating the adoption of streaming data, big data, containers, unstructured data, data lakes, etc. It will also help automotive enterprises achieve faster go-to-market for connected services such as usage-based insurance, fleet-as-a-service, warranty analytics, etc.
Author:
Ajay Tiwari
Connected Services Lead
Volvo Eicher Commercial Vehicles
Published in Telematics Wire