Google’s Universal Analytics (UA) will sunset in 2023. Whether companies opt for its very different replacement (Google Analytics 4) or invest in a new analytics platform, what is the best approach to minimize disruption, especially for business decision-makers? From data models to reporting tools, Nicolas Hinternesch, senior solutions engineer at Piano, outlines key steps to keep data-driven operations up and running.
The pressure is on companies to plan for the end of Universal Analytics (UA). 2023 is only a few months away – and with data-driven businesses often reliant upon year-to-year comparisons, that means ensuring 13 months’ data is collected and ready to go before the end date. The idea of stakeholders logging into the dashboard and discovering no data is the stuff of nightmares for any data analytics team. Since data capture is not the starting point in any data analytics implementation – indeed, it can be some way down the line – the onus is on companies to get moving, fast.
The right data model is the first concern. A flexible model—with the right balance of standardized and customizable components—will make the initial migration easier and simplify the adjustments needed as business needs evolve. Modern data models are event-driven, which means all existing data streams must be migrated into an event-based schema. Data consistency for all data stakeholders is critical, so look for a product that serves every tool, API and reporting interface from a single, unified data model.
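To make the idea of an event-based schema concrete, here is a minimal sketch of mapping a legacy UA-style hit into a generic event object. The event names ("page.display") and property shapes are illustrative assumptions, not any specific vendor's schema.

```typescript
// Hypothetical legacy hit shape, loosely modeled on UA's hit types.
interface LegacyHit {
  hitType: "pageview" | "event";
  page?: string;
  eventCategory?: string;
  eventAction?: string;
}

// Generic event-based schema: one name plus flat, customizable properties.
interface AnalyticsEvent {
  name: string;
  properties: Record<string, string>;
}

// Map every legacy hit type into the single event schema.
function toEvent(hit: LegacyHit): AnalyticsEvent {
  if (hit.hitType === "pageview") {
    return { name: "page.display", properties: { page: hit.page ?? "" } };
  }
  return {
    name: `${hit.eventCategory ?? "custom"}.${hit.eventAction ?? "action"}`,
    properties: {},
  };
}
```

Once every stream emits events in this shape, the same model can back reporting interfaces and APIs alike.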
The next step is to create a tagging plan that identifies all of the elements required to achieve the defined business metrics. A great way to speed up this process is an incremental tagging plan: standard events can be implemented and begin feeding reports within a few hours, while more detailed tagging is added along the way as the need for more sophisticated analysis arises.
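An incremental tagging plan can be expressed as a simple, phased specification. The sketch below is a hypothetical structure; the event names, required properties, and two-phase split are assumptions chosen for illustration.

```typescript
type Phase = 1 | 2;

// One entry per tag: the event it fires and the properties it must carry.
interface TagSpec {
  event: string;
  requiredProps: string[];
  phase: Phase; // 1 = standard events (day one), 2 = detailed tagging (later)
}

const taggingPlan: TagSpec[] = [
  { event: "page.display", requiredProps: ["page"], phase: 1 },
  { event: "click.navigation", requiredProps: ["target"], phase: 1 },
  { event: "article.read", requiredProps: ["articleId", "scrollDepth"], phase: 2 },
];

// Roll out only the tags belonging to the phases implemented so far.
function tagsForPhase(plan: TagSpec[], upTo: Phase): TagSpec[] {
  return plan.filter((t) => t.phase <= upTo);
}
```

Phase 1 tags can go live immediately and feed reports, while phase 2 tags are layered on later without reworking what already ships.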
A Tag Management System (TMS) can also speed up the migration by reusing the existing data layer as well as existing tags, triggers, and variables wherever possible. This way, the analytics team can retain specific technical elements, enabling a seamless migration without having to redesign every aspect of the implementation. Data quality tools also play an important role in debugging, stream inspection, and transparent data mapping and validation. Without a TMS, additional support from technical teams will be required, which could slow the migration.
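The reuse pattern can be sketched in miniature: a TMS "variable" reads from the existing data layer, and a "trigger" maps pushed entries onto the new vendor's event call. The data layer shape and the send() destination below are assumptions, not a specific vendor's API.

```typescript
type DataLayerEntry = Record<string, unknown>;

// The existing page data layer is kept as-is.
const dataLayer: DataLayerEntry[] = [];

// A TMS "variable": read the latest value of a key from the data layer.
function dlVariable(key: string): unknown {
  for (let i = dataLayer.length - 1; i >= 0; i--) {
    if (key in dataLayer[i]) return dataLayer[i][key];
  }
  return undefined;
}

// A TMS "trigger": on push, forward the entry to the new vendor's tracker.
function push(
  entry: DataLayerEntry,
  send: (name: string, props: DataLayerEntry) => void
): void {
  dataLayer.push(entry);
  if (typeof entry.event === "string") {
    send(entry.event, entry);
  }
}
```

Because the page keeps pushing to the same data layer, only the send() side changes during migration.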
It is also worth thinking about data privacy compliance up front, as this will save time further down the line as and when regulations evolve. The tracking solution should provide immediate technical support for all consent levels and their ramifications, and potentially even a tracking exemption (in certain markets) that allows certain audience measurement without prior consent. Within a flexible data model, it is easy to add a flag to any data considered Personally Identifiable Information (PII). This way, user-sensitive information can be managed easily alongside user-agnostic information.
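A minimal sketch of that PII flag, assuming illustrative consent levels and property names (neither taken from a specific regulation or vendor):

```typescript
type ConsentLevel = "none" | "exempt" | "full";

// Each property in the data model carries a PII flag, set once at design time.
interface Property {
  key: string;
  value: string;
  pii: boolean;
}

// User-agnostic properties always pass; PII requires full consent.
function applyConsent(props: Property[], consent: ConsentLevel): Property[] {
  if (consent === "full") return props;
  return props.filter((p) => !p.pii);
}
```

With the flag living in the data model itself, every downstream tool and export applies the same rule automatically.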
Choosing a vendor with a privacy-first approach will allow you to build a sustainable solution. The right vendor will continuously adapt to the privacy landscape and provide the right technical tools as well as resources to manage full compliance and therefore diminish any business risk.
The priority within this migration is to minimize the impact on decision-makers – the implications of leaving businesses without access to this vital data for days, even weeks, are dire. The new analytics solution therefore has to work with your existing reporting workflow, not vice versa.
If reporting is based on third-party business intelligence (BI) and dashboarding tools, the new solution has to provide the export and API functionality needed to swap out the data source behind them and keep the reporting flow uninterrupted. If reporting is mainly based on stakeholders accessing the analytics tool's GUI, however, then the new solution has to come with a strong set of out-of-the-box reporting, dashboarding, and analysis functionality.
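One way to make the data-source swap painless is a thin adapter: the BI layer depends only on a small reporting contract, and the vendor behind it can change without touching the dashboards. The interface, endpoint, and stubbed response below are hypothetical.

```typescript
interface ReportQuery {
  metric: string;
  from: string;
  to: string;
}

interface ReportRow {
  date: string;
  value: number;
}

// The BI tool depends only on this contract, not on a specific vendor.
type ReportSource = (q: ReportQuery) => ReportRow[];

// New vendor's implementation of the same contract. In practice this would
// call the vendor's export API over HTTP; a stub keeps the shape clear.
const newVendorSource: ReportSource = (q) => {
  return [{ date: q.from, value: 0 }];
};

// Dashboard code is unchanged whichever source is plugged in.
function rowCount(source: ReportSource): number {
  const rows = source({ metric: "page.displays", from: "2022-07-01", to: "2022-07-31" });
  return rows.length;
}
```

Swapping UA out then becomes a one-line change of which ReportSource is passed in, rather than a dashboard rebuild.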
Fitting the Existing Big Data Stack
Analytics is likely to be just one part of the overall big data technology stack. During any migration, it is important to consider what other aspects of the business might be affected. How is information distributed to the wider team? If an external dashboard or API tool is already in place, a new analytics solution that has the connectors and API endpoints to integrate seamlessly with the wider tech stack is hugely valuable, minimizing the need for additional integration work.
There is no simple or set timeline for an analytics migration or integration. But when organizations are compelled to move fast, it is amazing what can be achieved with the right approach. From incremental tagging onwards, designing the implementation around the current situation, and the hard deadline of UA's disappearance, will help keep the project focused and on track.
Senior Solutions Engineer, Piano
Piano’s Digital Experience Cloud empowers organizations to understand and influence customer behavior. By unifying customer data, analyzing behavior metrics and creating personalized customer journeys, Piano helps brands launch campaigns and products faster, strengthen customer engagement and drive personalization at scale from a single platform. Headquartered in Philadelphia with offices across the Americas, Europe and Asia Pacific, Piano serves a global client base, including Air France, the BBC, CBS, IBM, Kirin Holdings, Jaguar Land Rover, Nielsen, The Wall Street Journal and more. For more information, visit piano.io.