Building a data-centric approach in a digital transformation programme
Helvetia Environnement was founded in 2005, as heir to Swiss family businesses, to provide a comprehensive response to waste management in Switzerland. The company is active in waste collection for municipalities, businesses and industry; in waste sorting and recovery; and in generating energy from waste (Waste-to-Energy). Helvetia Environnement employs more than 500 people across the cantons of Geneva, Vaud, Fribourg, Basel and Solothurn.
The Swiss waste treatment and management sector is accelerating its transformation, driven by the digitisation of processes, innovation, the use of data and the potential of new technologies (AI, IoT, Big Data, Blockchain, etc.).
To meet the sector's new challenges, Helvetia Environnement launched a transformation programme in 2019 with the following objectives:
- Refine the management of the overall performance of the group, its entities and its activities
- Computerise all business data flows (collection, sorting, processing, trading, services)
- Dematerialise and then digitise selected company functions
- Simplify and deploy business processes
- Set up an information system with an architecture conducive to future developments
- Put data governance at the heart of processes
In this context, we assist Helvetia Environnement to:
- Understand the current and future state of Helvetia Environnement's main data domains in order to define the group's "Data Centric" strategy
- Design the data architecture that will support the programme and its objectives
- Support the teams in the implementation of data management
- Implement the programme, providing leadership and project resources
To carry out this mission, we divided our intervention into several phases.
The first phase was organisational: it enabled us to move from a project vision to a programme vision by building the streams that would make up the programme. In this vision, treating data as the backbone of the digital transformation proved essential and allowed us to lay the first bricks of the data foundations, both technical and functional.
The second phase was an analysis of Helvetia Environnement's IT landscape. It allowed us to align the building blocks of a data platform with the technical constraints and business challenges, and thus to design and deliver a genuine data strategy for our client.
This analysis forged a conviction: to propose, beyond the classic data lake or data warehouse architectures, an architecture inspired by distributed systems. To do this, we treated data domains as the priority and applied our "Data Thinking" methodology to create a self-service data infrastructure and to treat data as a product.
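To make "data as a product" concrete, the idea can be sketched as a small descriptor that gives each domain's dataset an owner, a published schema and a self-service discovery address. This is an illustrative sketch only; all names (domains, fields, the endpoint scheme) are hypothetical, not Helvetia Environnement's actual model.

```python
from dataclasses import dataclass

@dataclass
class DataProduct:
    domain: str                # business domain, e.g. "waste-collection"
    name: str                  # product name within the domain
    owner: str                 # team accountable for the product
    schema: dict               # field name -> type: the published contract
    freshness_hours: int = 24  # maximum acceptable data age (an SLA)

    def endpoint(self) -> str:
        """Self-service address where consumers discover the product."""
        return f"/data-products/{self.domain}/{self.name}"

# A hypothetical product for daily collection tonnage
tonnage = DataProduct(
    domain="waste-collection",
    name="daily-tonnage",
    owner="operations",
    schema={"site_id": "str", "date": "date", "tonnes": "float"},
)
print(tonnage.endpoint())  # → /data-products/waste-collection/daily-tonnage
```

The point of the descriptor is that consumers depend on the contract (schema, freshness, endpoint), not on the producing system's internals.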
The third phase was the instantiation of the programme. Cross steers the programme with the various stakeholders, carries out the technical design of the DataHub, lays down the principles of data management and, together with Helvetia Environnement, defines the implementation of the data-centric organisation.
The objective of our approach was to ensure that the four pillars of the data foundation were aligned so that they could effectively and sustainably support Helvetia Environnement's programme, projects and data needs.
We therefore built an architecture with a long-term vision, able to accommodate the future challenges Helvetia Environnement will face, while the agile implementation of the project lets us build the solution in line with current needs.
This approach allowed us to move from point-to-point integration to a centralised data architecture, which we call the DataHub.
A DataHub approach makes it possible to offer a centralised service that connects all systems, whether they are web applications, IoT sensors, SaaS solutions or more operational solutions, such as a CRM or ERP.
This building block of the data architecture manages the connections between systems and orchestrates the flow of data between them, giving us a flexible architecture that can evolve over time.
With a DataHub, a single connection is established with each system or component to be integrated, and that connection is shared with every other system that needs to interact through the hub. Data services can be exposed and published consistently, allowing better data integration between systems and reducing the need for data replication.
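The mechanics described above can be sketched as a minimal publish/subscribe hub: each system connects once, and the hub fans published records out to every subscriber, so N systems need N hub connections instead of up to N×(N−1) point-to-point links. This is an assumed design sketch, not the actual implementation; topic names and payloads are invented.

```python
from collections import defaultdict
from typing import Callable

class DataHub:
    """Minimal mediation sketch: routes records from producers to subscribers."""

    def __init__(self) -> None:
        self._subscribers: dict[str, list[Callable[[dict], None]]] = defaultdict(list)

    def subscribe(self, topic: str, handler: Callable[[dict], None]) -> None:
        """A system declares interest in a topic once."""
        self._subscribers[topic].append(handler)

    def publish(self, topic: str, record: dict) -> None:
        """The hub fans the record out to every subscriber of the topic."""
        for handler in self._subscribers[topic]:
            handler(record)

hub = DataHub()
erp_inbox: list[dict] = []
warehouse_inbox: list[dict] = []
hub.subscribe("weighbridge.ticket", erp_inbox.append)        # e.g. an ERP
hub.subscribe("weighbridge.ticket", warehouse_inbox.append)  # e.g. a warehouse loader

# One publication reaches both consumers; the producer knows neither of them.
hub.publish("weighbridge.ticket", {"site": "GE-01", "tonnes": 12.4})
print(len(erp_inbox), len(warehouse_inbox))  # → 1 1
```

Adding a new consumer is one `subscribe` call; no existing producer or consumer changes, which is what makes the architecture evolvable.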
This DataHub approach also simplifies the implementation of a data governance policy, since data is stored and centralised. Data can easily be transformed and distributed to other endpoints, such as data warehouses, data lakes (cloud or on-premise) and data consumption tools (BI, advanced analytics or data science tools).
This is what distinguishes it from traditional data platform architectures: the DataHub is a building block for mediation, storage and exchange of consistent data. It is a component of the data architecture, not an end point like a data warehouse or a data lake.
The added value of this architecture is that it gives each component a clearly defined role.
- The data warehouse, to structure data for analytical purposes;
- The data lake, to collect big data for exploratory analysis;
- The DataHub, to provide semantic mediation between systems.
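This division of roles can be illustrated with a routing step inside the hub: raw payloads land in the lake for exploratory analysis, while only curated, well-formed facts are projected into the warehouse. A hypothetical sketch; field names and routing rules are invented for illustration.

```python
warehouse_rows: list[dict] = []  # structured, analytics-ready records
lake_objects: list[dict] = []    # raw payloads kept for exploration

def route(record: dict) -> None:
    """The hub's mediation step: land everything raw in the lake,
    project a curated subset into the warehouse."""
    lake_objects.append(record)  # the lake keeps every raw record
    if "tonnes" in record:       # only complete facts reach the warehouse
        warehouse_rows.append({"site": record["site"], "tonnes": record["tonnes"]})

route({"site": "GE-01", "tonnes": 12.4, "raw_sensor": "0xF3"})
route({"site": "VD-02", "note": "calibration run"})  # raw only, not curated
print(len(warehouse_rows), len(lake_objects))  # → 1 2
```

The hub itself keeps only the mediation logic; the warehouse and the lake remain the end points, each serving its own kind of analysis.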
For CROSS, the DataHub is the most mature architecture for meeting our client's data needs. Moreover, it is fully compatible with emerging architectures such as the data mesh, which will allow us to absorb the rapid growth of new functional domains as they emerge at our client.
To meet the challenges of tomorrow, where data remains a critical asset, data architectures must be dynamic and flexible, enabling companies to evolve at the pace of their environment and to innovate rapidly.