Case studies

Business Intelligence (BI) case studies showcasing real-world examples of how tailored BI solutions help companies achieve measurable results with their data.

Product Analytics and AB Testing

Business Intelligence Infographic displaying the User Personas Analysis Plan, divided into five sections:

Seasonality: Acquisition volumes follow product seasonality, while churn volumes do not.
Subscription Lifecycle: Includes subscription cancellation characteristics, understanding users with multiple cancellations, churn characteristics, renewals, and user lifetime across different plans.
User Behaviour: Examines demographics, device patterns, behavior patterns, and different product usage patterns.
Conclusions: Final ideal user definition based on plan, device, renewals, days of product usage, and number of sessions.
Outcome: Results in targeted AB testing, a customized product experience, and an alert system for internal stakeholders.

Product strategy definition with a series of analyses

The aim of the project was to explore the user journey, identify patterns, and find user segments based on their common characteristics. The findings of the analysis served as a foundation for a new, data-backed product strategy.
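As a rough illustration of the segmentation work in this case study, the snippet below sketches how the "ideal user" definition from the Conclusions step above could be expressed in pandas. The file name, column names, and thresholds are hypothetical placeholders, not the client's actual data model.

```python
import pandas as pd

# Hypothetical export of subscription data: one row per user.
users = pd.read_csv("subscriptions.csv")  # assumed columns: plan, device, renewals, active_days, sessions

# Flag the "ideal user" profile from the Conclusions step: a paid plan,
# at least one renewal, and sustained product usage (thresholds are illustrative).
users["ideal_user"] = (
    (users["plan"] != "free")
    & (users["renewals"] >= 1)
    & (users["active_days"] >= 14)
    & (users["sessions"] >= 10)
)

# Share of ideal users per plan/device segment, e.g. to prioritise targeted AB tests.
segment_summary = (
    users.groupby(["plan", "device"])["ideal_user"]
    .mean()
    .sort_values(ascending=False)
)
print(segment_summary)
```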

From Spreadsheets to BI

The client lacked any centralised analytics infrastructure. Reporting was entirely manual, scattered across spreadsheets, and data lived in disconnected sources. There were no dashboards and no warehouse, and key business metrics were calculated by hand, making decision-making slow and error-prone.

Business Intelligence Infographic displaying the AB testing process framework, broken down into five stages:

Initiation: Identify seasonality and peak periods to create an ideal testing calendar.
Roadmap: Synchronize testing plans across departments to avoid running tests with overlapping metrics, and define scopes for multiple tests.
Planning: Involve data analysts from the early idea phase to ensure comprehensive analysis.
Preparation: Data analysts collaborate with developers on event specifications, QA, and implementation.
Closure: Use automated analysis processes with Python (a minimal sketch follows this case study), providing monetization summaries and strategic recommendations to C-level executives based on AB test results.

Restructuring AB testing processes

My client wanted to restructure their AB testing processes: the previous setup was cumbersome, with long experiment runtimes, long development times, and limited insight into the full impact of each test. The main goal was to improve test result quality and enhance data-driven decision-making.
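The Closure stage of the framework above relies on automated analysis in Python. As a minimal sketch of what such an analysis could look like, the snippet below runs a two-proportion z-test on a hypothetical two-variant conversion test using statsmodels; the numbers and variable names are illustrative, not results from the client's experiments.

```python
import numpy as np
from statsmodels.stats.proportion import proportions_ztest, proportion_confint

# Illustrative numbers only: conversions and sample sizes per variant.
conversions = np.array([480, 532])   # control, variant
users = np.array([10_000, 10_050])

# Two-sided z-test on the difference in conversion rates.
z_stat, p_value = proportions_ztest(count=conversions, nobs=users)

# Confidence intervals per variant, useful for the monetization summary.
ci_low, ci_high = proportion_confint(conversions, users, alpha=0.05)

rates = conversions / users
print(f"control: {rates[0]:.2%} ({ci_low[0]:.2%}-{ci_high[0]:.2%})")
print(f"variant: {rates[1]:.2%} ({ci_low[1]:.2%}-{ci_high[1]:.2%})")
print(f"p-value: {p_value:.4f} -> {'significant' if p_value < 0.05 else 'not significant'} at 95%")
```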

Data Visualisation

Business Intelligence dashboard sample

Creating an interactive reporting hub

The company faced issues with report management and data accessibility, slowing down effective decision-making. The existing dashboards were not interactive and slow to load, and building anything new was a lengthy process. There was no core framework, which caused inconsistencies in how certain KPIs were calculated.

Business Intelligence Infographic outlining key items for a project focusing on self-service analytics and Looker tool migration:

Need for Self-Service Analytics: Migrate existing Tableau dashboards and create relevant explores to enable self-service analytics.
Develop a Central Project: Create a central project that can be used across departments, calculating company-wide KPIs, and extendable by departments for additional KPIs.
Knowledge Sharing: Train in-house analysts on Looker, providing them with the technical knowledge to support tool migration.
Handle Admin Responsibilities: Personally handle admin tasks such as creating projects, user groups, and datagroups, adding new users, and managing access.
Execution Time: Complete the project in under 8 months to avoid renewing Tableau licenses.
Onboard the Product Team: Train the product team to use Looker’s self-service features for ad-hoc analysis, supporting data-driven decision-making.

Company-wide Looker transition

The company was undergoing a full transition from Tableau to Looker. The project was driven by the need for a self-service analytics platform and comprised setting up a comprehensive Looker infrastructure, migrating all existing analytics processes, executing department-specific projects, and training the in-house analysts.
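The admin responsibilities listed above (creating user groups, adding users, managing access) can be scripted against the Looker API. Below is a minimal sketch using the looker_sdk Python package; the group name, user email, and credentials file are hypothetical, and this is not the exact workflow used in the project.

```python
import looker_sdk
from looker_sdk import models40 as models

# Authenticate using API credentials stored in a looker.ini file (hypothetical path).
sdk = looker_sdk.init40("looker.ini")

# Create a group for a department, e.g. to grant access to the central cross-department project.
group = sdk.create_group(body=models.WriteGroup(name="Finance Analysts"))

# Look up an existing user by email and add them to the new group.
users = sdk.search_users(email="analyst@example.com")
if users:
    sdk.add_group_user(
        group_id=group.id,
        body=models.GroupIdForGroupUserInclusion(user_id=users[0].id),
    )
```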

Infographic illustrating a data pipeline from various data sources to dashboards through a central data warehouse:

Data Sources: Include applications like QuickBooks, Culture Amp, Google Analytics, Monday.com, Notion, Airtable, Google Ads, ActiveCollab, Yomly, Sprinklr, Todoist, SharePoint, social utilization templates, sales pipeline P&L, briefs and proposals, personal drives, and banks.
Data Warehouse: The purpose is to create a single source of truth where company data is accessed in a unified way. Data loads are automated weekly, daily, or hourly. The system can automate manual workflows and supports AI model development if desired.
Dashboards:
Finance & Sales: Metrics for financial stability, such as forecast vs. budget vs. actuals, cash flow, and sales pipeline.
HR: Tracks the employee lifecycle, headcount, leaves, on/off-boarding, satisfaction, and retention rates.
Operations: Monitors project progress, timesheet data, productivity, margin, and profitability.
Social Dashboard: Compares industry trends and benchmarks against client performance.
Delivery: Focuses on resource allocation, AB testing, ROI, team performance, and financial standings.
Paid Media: Evaluates marketing strategies, including campaign performance, customer acquisition, and AB testing.
High-Level Overview: Supports decision-making for business leads, account directors, and department heads.

Data Automation and Dashboard Implementation

The project focused on building a comprehensive business intelligence solution, automating data extraction processes, and developing interactive Tableau dashboards to centralise and standardise reporting across the organisation. The automation ultimately saved over 40 hours per month per department and eliminated the need for more than 50 manual reports.
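As a rough illustration of the extraction automation described above, here is a minimal sketch of one scheduled extract-and-load job in Python. The API endpoint, table name, and warehouse connection string are hypothetical; the real project connected many more sources (QuickBooks, Monday.com, Notion, and others).

```python
import pandas as pd
import requests
from sqlalchemy import create_engine

# Hypothetical source API and warehouse connection string.
SOURCE_URL = "https://api.example.com/v1/projects"
engine = create_engine("postgresql://user:password@warehouse-host:5432/analytics")

def load_projects() -> None:
    # Extract: fetch raw records from the source system.
    records = requests.get(SOURCE_URL, timeout=30).json()

    # Light transform: flatten to a table and keep only the reporting columns.
    df = pd.json_normalize(records)[["id", "name", "status", "updated_at"]]
    df["loaded_at"] = pd.Timestamp.now(tz="UTC")

    # Load: overwrite the staging table; a scheduler (hourly/daily/weekly) runs this job.
    df.to_sql("stg_projects", engine, if_exists="replace", index=False)

if __name__ == "__main__":
    load_projects()
```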

Data infrastructure and Data Management

Infographic presenting three main options to build a scalable data warehouse (DWH) and related pipelines:

Solution 1 - No-code integration to BigQuery through Fivetran (ELT pipeline):

Description: Cloud-based data integration platform for data extraction, loading, and transformation.
Pros: Automated ELT processes, pre-built connectors (e.g., Monday.com, QuickBooks), custom connectors via APIs and Google Cloud Functions, easy to scale data pipelines, direct integration with Google Analytics, and built-in data-quality checks.
Cons: Requires custom connectors via Google Cloud Functions, as not all tools have pre-built Fivetran connectors.
Solution 2 - Direct integration to BigQuery through APIs:

Description: Direct API connection to data sources and loading into BigQuery using custom Python scripts or programs.
Pros: No need for intermediary tools, direct connection with Google Analytics, automated data transfer via Zapier, and integration with SharePoint/Excel files and Notion.
Cons: Significant coding required, high investment in development and maintenance, managing multiple API integrations can be complex, and no built-in data-quality cross-checks.
Solution 3 - Developing an ETL pipeline:

Description: Custom ETL (Extract, Transform, Load) pipeline for full control over data extraction and transformation processes.
Pros: Complete control over data processing, recommended for architectures with complex dependencies, scalable to growing data needs, and direct integration with Google Analytics.
Cons: Time-consuming to develop, requires constant development expertise, and data-quality checks are not default but can be added manually.

Building a unified data warehouse

The client used a wide range of software tools in their business, so critical business data was dispersed across many platforms, hindering decision-making. The focus was on creating a unified data warehouse that would serve as a centralised repository.
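Solution 2 above relies on custom Python scripts that call source APIs and load the results into BigQuery. A minimal sketch of the loading half, using the google-cloud-bigquery client library, is shown below; the project ID, dataset, and DataFrame contents are placeholders.

```python
import pandas as pd
from google.cloud import bigquery

# Hypothetical destination table in BigQuery.
TABLE_ID = "my-project.finance.stg_invoices"

client = bigquery.Client(project="my-project")

# In a real pipeline this DataFrame would come from a source API (e.g. QuickBooks).
df = pd.DataFrame(
    {"invoice_id": ["INV-1", "INV-2"], "amount": [1200.0, 850.5], "status": ["paid", "open"]}
)

# Overwrite the staging table on each run; downstream models read from it.
job_config = bigquery.LoadJobConfig(write_disposition="WRITE_TRUNCATE")
job = client.load_table_from_dataframe(df, TABLE_ID, job_config=job_config)
job.result()  # wait for the load job to finish

print(f"Loaded {client.get_table(TABLE_ID).num_rows} rows into {TABLE_ID}")
```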

Business Intelligence Infographic illustrating a six-step data management process:

Data Management: Restructured the company’s data management processes to transform data into valuable insights.
Data Migration: Migrated all company data from text files into Azure SQL Data Warehouse, with automated pipelines refreshing data in daily batches.
KPI Definition: Redefined core metrics to ensure that stakeholders have a clear view of the company’s performance, regardless of region or underlying system.
BI Tools: Implemented business intelligence tools, understood business needs, and created insightful analyses to maintain the company’s competitive position and profitability.
Reporting: Took governance over 15 weekly, monthly, and quarterly reports, ensuring that they are scalable and always up-to-date.
Consolidation: Centralized data-driven processes like customer satisfaction metrics and capacity planning, enabling departments to communicate using the same data language.

Building a data-driven culture

My client faced challenges with business productivity, decision-making, and identifying new opportunities. The underlying issues were inefficient data management processes, a lack of insights from data, data stored in non-scalable text or Excel files, inconsistent and outdated reports, and confusing KPI definitions. I implemented a full-scale business intelligence solution.
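As a rough sketch of the Data Migration step above, the snippet below shows how a daily batch of text files could be loaded into Azure SQL using pandas and SQLAlchemy. The folder layout, delimiter, connection string, and table name are hypothetical.

```python
import glob
import pandas as pd
from sqlalchemy import create_engine

# Hypothetical Azure SQL connection string (ODBC Driver 18) and export folder.
engine = create_engine(
    "mssql+pyodbc://user:password@company.database.windows.net:1433/dwh"
    "?driver=ODBC+Driver+18+for+SQL+Server"
)

def load_daily_batch(folder: str = "exports/2024-05-01") -> None:
    # Read every pipe-delimited text file exported for the day into one DataFrame.
    frames = [pd.read_csv(path, sep="|") for path in glob.glob(f"{folder}/*.txt")]
    batch = pd.concat(frames, ignore_index=True)

    # Append the batch to the warehouse table; the pipeline runs once per day.
    batch.to_sql("sales_orders", engine, if_exists="append", index=False)

if __name__ == "__main__":
    load_daily_batch()
```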

Infographic showing the four key stages of project planning:

Discovery:
Understand the existing data infrastructure, tools, and sources.
Explore options for data ingestion or ETL processes.
Perform a thorough data quality assessment.
Data Storage:
Evaluate the current data storage solutions and determine if they meet the needs.
Implement new processes defined in earlier stages.
Data Model:
Define a data model that supports analytical requirements.
Map the relationships between data from different tools.
Create a data layer for reporting purposes.
Dashboards:
Conduct workshops with key stakeholders.
Design and develop interactive dashboards.
Ensure the dashboards provide actionable insights.

Data infrastructure optimisation

My client struggled with data scattered across multiple platforms, hindering efficient reporting and decision-making. The project focused on building a unified data warehouse to serve as a centralized repository, enabling streamlined data management and enhanced insights for better business outcomes.
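The Discovery stage above includes a thorough data quality assessment. As a minimal sketch of what such a check could look like in Python, the snippet below profiles a table for missing values, duplicate keys, and unparseable or future dates; the file and column names are hypothetical.

```python
import pandas as pd

def assess_quality(df: pd.DataFrame, key: str = "order_id", date_col: str = "order_date") -> dict:
    """Return a few simple data-quality indicators for one source table."""
    dates = pd.to_datetime(df[date_col], errors="coerce")
    return {
        "rows": len(df),
        "duplicate_keys": int(df[key].duplicated().sum()),
        "null_share_per_column": df.isna().mean().round(3).to_dict(),
        "unparseable_dates": int(dates.isna().sum() - df[date_col].isna().sum()),
        "future_dates": int((dates > pd.Timestamp.now()).sum()),
    }

# Example usage with a hypothetical export from one of the source platforms.
orders = pd.read_csv("orders_export.csv")
print(assess_quality(orders))
```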

Data tracking

Infographic showing the stages of event tracking implementation:

Participate in Research and Planning: Support game designers with recommendations, ensuring a full understanding of the goal of the new feature and user actions.
Prepare Tracking Specifications: Based on the mockups, prepare detailed event specifications for developers, defining events for each screen and action (e.g., screen_display, button_press, user_action); a simplified spec sketch follows this case study.
Development Support: Assist the development team during implementation by providing clarifications. Monitor changes in the feature scope and adjust tracking specifications accordingly.
Tracking QA: Test with Firebase and prototype app versions to ensure the tracking is implemented correctly and functions as expected.
Event Implementation: Implement events into the data warehouse and build the corresponding underlying reports.

Mobile game launch data flow setup (Android, iOS)

The challenge was to create a streamlined and accurate data capture system for the worldwide launch of the game without having access to actual production events or data. Additionally, the organization needed a solid foundation for data analysis, which they could use from day one.
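As a rough illustration of the tracking-specification and QA steps above, the sketch below expresses a small event spec as Python data and validates captured events against it. The event and parameter names are simplified examples in the spirit of screen_display / button_press, not the actual game's taxonomy.

```python
# Simplified event specification: event name -> required parameters.
EVENT_SPEC = {
    "screen_display": {"screen_name", "session_id"},
    "button_press": {"screen_name", "button_id", "session_id"},
    "user_action": {"action_type", "session_id"},
}

def validate_event(event: dict) -> list[str]:
    """Return a list of problems found in one captured event (empty list = valid)."""
    name = event.get("name")
    if name not in EVENT_SPEC:
        return [f"unknown event: {name!r}"]
    missing = EVENT_SPEC[name] - set(event.get("params", {}))
    return [f"{name}: missing params {sorted(missing)}"] if missing else []

# Example: events captured from a prototype build during tracking QA.
captured = [
    {"name": "screen_display", "params": {"screen_name": "home", "session_id": "abc"}},
    {"name": "button_press", "params": {"screen_name": "home"}},  # missing button_id, session_id
]
for event in captured:
    print(event["name"], "->", validate_event(event) or "OK")
```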

Business Intelligence Infographic illustrating a data-driven marketing strategy in five steps:

Pixel Tracking: Implement pixel tracking on the website, focusing on ads placement and entry point banners.
GTM Tracking: Define, implement, and track crucial events, campaigns, and pixels through Google Tag Manager to capture user interactions and conversions.
User Analyses: Identify trends and analyze user behavior and campaign performance to create user segments for targeted marketing.
AB Testing: Conduct A/B tests to enhance performance and ROI using tools like Google Optimize and Firebase.
Dashboards: Create an interactive custom dashboard using Tableau for stakeholders to quickly assess campaign performance.

Data-driven marketing strategy

The client was running paid media campaigns; however, the existing tracking setup and pixel placements were not optimised, resulting in insufficient data collection and suboptimal campaign performance. This inconsistency in tracking led to missed opportunities.
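As a small illustration of the User Analyses step above, the snippet below computes basic campaign KPIs (click-through rate, conversion rate, cost per acquisition) from a hypothetical campaign export in pandas; the file and column names are placeholders.

```python
import pandas as pd

# Hypothetical daily campaign export, e.g. assembled from GTM and ad platform data.
campaigns = pd.read_csv("campaign_performance.csv")  # assumed columns: campaign, impressions, clicks, conversions, spend

kpis = campaigns.groupby("campaign").agg(
    impressions=("impressions", "sum"),
    clicks=("clicks", "sum"),
    conversions=("conversions", "sum"),
    spend=("spend", "sum"),
)
kpis["ctr"] = kpis["clicks"] / kpis["impressions"]
kpis["cvr"] = kpis["conversions"] / kpis["clicks"]
kpis["cpa"] = kpis["spend"] / kpis["conversions"]

# Campaigns with the highest acquisition cost are candidates for AB testing or re-targeting.
print(kpis.sort_values("cpa", ascending=False).round(3))
```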

Illustration of a unified tracking setup process for Android, iOS, and web platforms, presented in four steps:

Review existing tracking: Map all events, identify discrepancies, and catalog where differences lie. Evaluate redundant events.
Framework Creation: Define consistent naming conventions for events and parameters. Standardize key event attributes and eliminate any duplication or inconsistency in event definitions.
Framework Implementation: Update tracking libraries and event triggers, conduct QA in sandbox environments, and roll out the updated tracking structure incrementally.
Implement Monitoring: Set up automated alerts to detect discrepancies, ensure quality assurance, and resolve issues to maintain the integrity of the tracking system.

Unified tracking setup – Android, iOS, web

The organization was facing challenges with inconsistent data tracking across Android, iOS, and web platforms. Event names and structures were inconsistent, leading to unreliable data for analysis. The project aimed to review the existing tracking setup, develop a unified event taxonomy, and implement changes to ensure consistent data capture across all platforms from day one.
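As a minimal sketch of the framework-creation and monitoring steps above, the snippet below checks event names against a hypothetical snake_case naming convention and flags events that are not tracked consistently across Android, iOS, and web; the event lists are illustrative only.

```python
import re

# Hypothetical naming convention: lowercase snake_case, at most three words.
NAME_PATTERN = re.compile(r"^[a-z]+(_[a-z]+){0,2}$")

# Events currently reported by each platform (illustrative data).
events_by_platform = {
    "android": {"screen_view", "purchase_complete", "signUp"},
    "ios": {"screen_view", "purchase_complete", "sign_up"},
    "web": {"screen_view", "purchaseComplete", "sign_up"},
}

# 1. Flag names that violate the convention.
for platform, events in events_by_platform.items():
    bad = sorted(e for e in events if not NAME_PATTERN.match(e))
    if bad:
        print(f"[{platform}] non-conforming event names: {bad}")

# 2. Flag events missing on at least one platform (a typical automated alert condition).
all_events = set().union(*events_by_platform.values())
for event in sorted(all_events):
    missing = [p for p, ev in events_by_platform.items() if event not in ev]
    if missing:
        print(f"'{event}' is not tracked on: {', '.join(missing)}")
```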