Migrating away from Google Analytics – Getting it Right
Google’s Universal Analytics (UA) will sunset in 2023. Whether companies opt for the very different replacement (Google Analytics 4) or invest in a new analytics platform, what is the best approach to minimise disruption, especially to business decision-makers? From data model to reporting tools, Nicolas Hinternesch, senior solutions engineer at Piano, outlines key steps to keep data-driven operations up and running.
Immediate Action
The pressure is on companies to plan for the end of UA. 2023 is only a few months away, and with data-driven businesses often reliant upon year-to-year comparisons, that means ensuring 13 months of data is collected and ready to go before the end date. The idea of stakeholders logging into the dashboard and discovering no data is the stuff of nightmares for any data analytics team. And since data capture is not the starting point of an analytics implementation (planning, data modelling and tagging come first), the onus is on companies to get moving, fast.
Understanding Data
The right data model is the first concern. A flexible model, with the right balance of standardised and customisable components, will make the initial migration easier and simplify the adjustments needed as business needs evolve. Modern data models are event-driven, which means all data streams must be migrated into an event-based schema. Data consistency for all data stakeholders is critical, so look for a product that serves a single unified data model consistently across all tools, APIs and reporting interfaces.
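As a rough illustration of what an event-based schema means in practice, the sketch below (in TypeScript, with entirely hypothetical event and property names) models every interaction as a named event carrying a flat set of properties, rather than UA's distinct hit types.

```typescript
// Illustrative event-based schema: every data point is an event with a name
// and a flat set of properties, rather than a fixed hit type.
interface AnalyticsEvent {
  name: string;                                          // e.g. "page.display", "product.purchase"
  properties: Record<string, string | number | boolean>; // standard and custom dimensions alike
  timestamp: string;                                     // ISO 8601
}

// A former UA "pageview" and a former UA "event" both become plain events.
const pageDisplay: AnalyticsEvent = {
  name: "page.display",
  properties: { page: "/pricing", site_section: "marketing" },
  timestamp: new Date().toISOString(),
};

const newsletterSignup: AnalyticsEvent = {
  name: "newsletter.signup",
  properties: { form_id: "footer", success: true },
  timestamp: new Date().toISOString(),
};
```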
The next step is to create a tagging plan that identifies every data point required to feed the defined business metrics. A great way to speed up this process is an incremental tagging plan: standard events can be implemented and feeding reports within a few hours, with more detailed tagging added along the way to meet the need for more sophisticated analysis.
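A minimal sketch of what incremental tagging could look like, assuming a hypothetical sendEvent SDK call rather than any specific vendor's API:

```typescript
// Hypothetical SDK call; the real method name depends on the chosen vendor.
declare function sendEvent(name: string, properties: Record<string, unknown>): void;

// Phase 1: standard events only, so core reports are populated within hours.
sendEvent("page.display", { page: "/checkout" });
sendEvent("product.purchase", { product_id: "SKU-123", revenue: 49.9 });

// Phase 2, added in a later iteration: the same events enriched with custom
// properties for more sophisticated analysis, without changing the schema.
sendEvent("page.display", {
  page: "/checkout",
  ab_test_variant: "B",
  logged_in: true,
});
```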
Accelerating Implementation
A Tag Management System (TMS) can also speed up the migration by reusing the existing data layer, as well as existing tags, triggers and variables, wherever possible. This way, the analytics team can retain specific technical elements and migrate seamlessly without redesigning every aspect of the implementation. Data quality tools also play an important role in debugging, stream inspection, and transparent data mapping and validation. Without a TMS, additional support from technical teams will be required, which could slow the migration.
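As a sketch of how an existing GTM-style data layer might be reused rather than rebuilt, the snippet below maps data-layer pushes onto a hypothetical newAnalytics.track() call; the data layer shape and the SDK name are assumptions, not any specific product's API.

```typescript
// Assumed shape of the existing data layer and of the new vendor's SDK.
declare const dataLayer: Array<Record<string, unknown>>;
declare const newAnalytics: {
  track(name: string, properties: Record<string, unknown>): void;
};

// Re-map existing data-layer pushes onto the new event schema instead of
// redesigning every trigger and variable from scratch.
for (const push of dataLayer) {
  if (push.event === "purchase") {
    newAnalytics.track("product.purchase", {
      product_id: push.productId,
      revenue: push.revenue,
    });
  }
}
```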
It is also worth thinking about data privacy compliance up front, as this will save time further down the line as regulations evolve. The tracking solution should provide immediate technical support for all consent levels and their ramifications, as well as, in certain markets, a tracking exemption that allows limited audience measurement without prior consent. Within a flexible data model, it is easy to add a flag to any data that is considered Personally Identifiable Information (PII), so user-sensitive information can be managed alongside user-agnostic information.
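To make the PII flag concrete, here is a minimal sketch of what it could look like inside an event-based model; the pii field and the stripPii helper are illustrative, not a particular vendor's implementation.

```typescript
// Each property carries a flag marking it as user-sensitive (PII) or user-agnostic.
interface EventProperty {
  value: string | number | boolean;
  pii: boolean;
}

interface PrivacyAwareEvent {
  name: string;
  properties: Record<string, EventProperty>;
}

const signup: PrivacyAwareEvent = {
  name: "account.signup",
  properties: {
    plan: { value: "pro", pii: false },               // user-agnostic
    email: { value: "jane@example.com", pii: true },  // user-sensitive
  },
};

// Depending on the visitor's consent level, flagged properties can be dropped
// (or hashed) before the event is stored.
function stripPii(event: PrivacyAwareEvent): PrivacyAwareEvent {
  const kept = Object.entries(event.properties).filter(([, prop]) => !prop.pii);
  return { ...event, properties: Object.fromEntries(kept) };
}
```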
Choosing a vendor with a privacy-first approach will allow you to build a sustainable solution. The right vendor will continuously adapt to the changing privacy landscape and provide the technical tools and resources to manage full compliance, thereby reducing business risk.
Retaining Reporting
The priority within this migration is to minimise the impact on decision-makers: leaving businesses without access to this vital data for days, even weeks, would be dire. The new analytics solution therefore has to work with your existing reporting workflow, not the other way around.
If reporting is based on third-party business intelligence (BI) and dashboarding tools, the new solution has to provide the export and API functionality needed to swap out the data source behind them and keep the reporting flow uninterrupted. If reporting mainly means stakeholders accessing the analytics tool's own interface, however, the new solution has to come with a strong set of out-of-the-box reporting, dashboarding and analysis functionality.
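As a sketch of what the export side might look like, the snippet below pulls aggregated rows from a hypothetical reporting API so they can be loaded into whatever the BI tool already reads from; the endpoint, metric names and authentication are placeholders, not a real product's API.

```typescript
// Shape of one aggregated row as the BI dashboard expects it.
interface ReportRow {
  date: string;
  pageViews: number;
  visitors: number;
}

// Pull daily traffic from a placeholder reporting endpoint.
async function fetchDailyTraffic(from: string, to: string, apiKey: string): Promise<ReportRow[]> {
  const url =
    `https://api.example-analytics.com/v1/reports` +
    `?metrics=page_views,visitors&from=${from}&to=${to}`;
  const response = await fetch(url, { headers: { Authorization: `Bearer ${apiKey}` } });
  if (!response.ok) {
    throw new Error(`Reporting API returned ${response.status}`);
  }
  return response.json() as Promise<ReportRow[]>;
}

// The rows can then be written to the warehouse table, CSV drop or push API the
// dashboard already consumes, so reports keep refreshing with no visible change.
```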
Fitting the Existing Big Data Stack
Analytics is likely to be just one part of the overall big data technology stack, so during any migration it is important to consider which other parts of the business might be affected. How is information distributed to the wider team? If an external dashboarding or API tool is already in place, a new analytics solution with the connectors and API endpoints to integrate seamlessly into the wider tech stack is hugely valuable, minimising the need for additional integration work.
Conclusion
There is no simple or set timeline for an analytics migration or integration. But when organisations are compelled to move fast, it is amazing what can be achieved with the right approach. From incremental tagging onwards, designing the implementation around the current situation, and around the need to be ready before UA disappears, will help keep the project focused and on track.