Why should brands invest 1% of revenue in data initiatives?

Krishna

Co-Founder

If you are a brand making $10M in revenue, we recommend budgeting 1%-2% of that, i.e., $100K-$200K, for your data initiative to get the best results. That covers the cost of the tools needed to move data, host it in a data warehouse, and visualize it in a BI tool, plus the cost of the personnel who make data work for the business.

The costs associated with a data initiative can vary widely depending on several factors such as the size and complexity of the data, the technological infrastructure required to process and store the data, the skills and expertise of the personnel involved, and the overall goals and scope of the initiative. 

Here are some of the costs organizations may need to consider when embarking on a data initiative:

  • Data acquisition: the costs of acquiring and collecting data from various sources, such as purchasing data from third-party providers or building data collection systems.
  • Data storage and processing: the costs of storing and processing the data, including purchasing and maintaining servers or cloud-based infrastructure, as well as software licensing fees.
  • Data analysis: the costs of analyzing and visualizing the data, such as hiring data analysts or purchasing analytics software.
  • Data quality: the costs of ensuring the accuracy and consistency of the data, such as investing in data cleaning tools or hiring data quality specialists.
  • Training and personnel: the costs of training personnel on data-related skills and technologies, and of hiring and retaining data professionals.
  • Opportunity costs: the costs of missed opportunities or delayed decision-making due to a lack of timely and accurate data.

Learn the key insights for hiring a head of data.

Overall, the costs associated with a data initiative can be significant, but the potential benefits, such as improved decision-making and increased efficiency, can often outweigh these costs in the long run. 

The question is, what is a good budget for an emerging brand to consider and plan for its data initiatives?

We believe that 1%-2% of annualized revenue is a good place to start for companies making $5M-$50M in revenue. That would pay for the costs of the tools needed to move data, host that data in a data warehouse, and visualize it in a BI tool, plus the cost of personnel to make data work for the business. 
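The rule of thumb is simple arithmetic; here is an illustrative sketch in Python applying it across that revenue band:

```python
# Illustrative only: the 1%-2% rule of thumb applied across the
# $5M-$50M revenue band discussed in this article.

def data_budget(revenue: float, low: float = 0.01, high: float = 0.02):
    """Return the (low, high) annual data budget for a given revenue."""
    return revenue * low, revenue * high

for revenue in (5_000_000, 10_000_000, 50_000_000):
    lo, hi = data_budget(revenue)
    print(f"${revenue:,.0f} revenue -> budget ${lo:,.0f} to ${hi:,.0f} per year")
# $5,000,000 revenue -> budget $50,000 to $100,000 per year
# $10,000,000 revenue -> budget $100,000 to $200,000 per year
# $50,000,000 revenue -> budget $500,000 to $1,000,000 per year
```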

Cost Breakdown

Let’s look at the cost breakdown for building and automating reporting and analytics.

Gathering Business Requirements

All businesses are unique in some way, so a turnkey reporting solution often does not address all your requirements and can end up costing you more over time: you pay for the tool, and then you also pay for personnel to do the work the tool is not doing. This is why you will rarely find an enterprise-level business relying on a turnkey BI tool that gives you metrics out of the box.

To have an owned analytics solution, someone in the company, typically an analyst, must work with the various stakeholders to document business requirements: the data they need regularly to make effective decisions and to analyze the impact of those decisions.

The cost of this activity is the cost you pay your people to think through their business requirements and document them: essentially, time from executives and time from one or more business analysts.
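One lightweight way to capture those requirements consistently is a structured template. A minimal sketch in Python; the fields are our own illustration, not a standard:

```python
from dataclasses import dataclass

@dataclass
class MetricRequirement:
    """Hypothetical template for documenting one reporting requirement."""
    name: str               # e.g. "Blended CAC"
    business_question: str  # the decision this metric supports
    owner: str              # the stakeholder who consumes it
    sources: list           # the systems the data comes from
    refresh_cadence: str    # how often it must be up to date
    definition: str         # the exact business rule, agreed in writing

cac = MetricRequirement(
    name="Blended CAC",
    business_question="Are we acquiring customers profitably?",
    owner="Head of Growth",
    sources=["Facebook Ads", "Google Ads", "Shopify"],
    refresh_cadence="daily",
    definition="total ad spend / new customers, trailing 30 days",
)
```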

Automating Reporting and Analytics

Once you have the requirements, you can start automating reporting and analytics. The costs break down into five components:


  • Data Ingestion
  • Data Warehouse
  • Data Modelling or Data Engineering
  • Data Visualization
  • Personnel

Data Ingestion

Tools like Daton, Stitch, and Fivetran help you move data from various sources into a cloud data warehouse like Snowflake or BigQuery without an analyst or data engineer having to write a single line of code. When a source you need is not supported by these tools, an analyst or data engineer has to write code to automate extraction from that source into the warehouse. Alternatively, if the source data is not used frequently, the analyst can periodically download it manually and move it into the warehouse.

So, the total cost is the price paid for the software plus the cost of the time the analyst spends on manual work.

With Daton, our eCommerce data pipeline, we are attempting to reduce the manual work of the analyst to zero by building custom connectors for our customers to use. 
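When no connector exists, the fallback described above is a short extraction script. A minimal sketch, assuming a hypothetical REST endpoint and a BigQuery destination (the URL and table name are placeholders):

```python
import requests
from google.cloud import bigquery  # pip install google-cloud-bigquery

API_URL = "https://api.example.com/v1/orders"  # hypothetical source API
TABLE_ID = "my-project.raw.orders"             # placeholder destination table

def extract_and_load() -> None:
    # Pull the raw records from the source system.
    response = requests.get(API_URL, timeout=30)
    response.raise_for_status()
    rows = response.json()  # assumes the API returns a JSON array of objects

    # Append the rows into the warehouse. The schema is auto-detected here
    # for brevity; production jobs should pin an explicit schema.
    client = bigquery.Client()
    job = client.load_table_from_json(rows, TABLE_ID)
    job.result()  # wait for the load job to finish
    print(f"Loaded {len(rows)} rows into {TABLE_ID}")

if __name__ == "__main__":
    extract_and_load()
```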

Data Warehouse

A data warehouse acts as the single source of truth for business data. There are many data warehouses available in the market, and most have pay-as-you-go pricing.

The cost here is what customers pay to load data into the warehouse and to query it.
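Because query cost is driven by the bytes a query scans, it can be estimated before anything is billed. A sketch using BigQuery's dry-run mode; the table name is a placeholder, and the on-demand rate is an assumption you should check against current pricing:

```python
from google.cloud import bigquery

USD_PER_TIB = 6.25  # assumed on-demand rate; verify against current pricing

def estimate_query_cost(sql: str) -> float:
    """Dry-run a query and estimate its on-demand cost in USD."""
    client = bigquery.Client()
    config = bigquery.QueryJobConfig(dry_run=True, use_query_cache=False)
    job = client.query(sql, job_config=config)  # nothing is billed on a dry run
    tib_scanned = job.total_bytes_processed / (1024 ** 4)
    return tib_scanned * USD_PER_TIB

cost = estimate_query_cost(
    "SELECT order_id, total FROM `my-project.analytics.orders`"  # placeholder
)
print(f"Estimated cost: ${cost:.4f}")
```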

Data Modelling or Data Engineering

Beyond storage and compute, the second aspect of the costs is the effort needed to build the necessary data models in the warehouse using tools like dbt or Airflow. These models codify the business rules and convert the raw data into usable metrics and dimensions that can then be visualized in BI tools like Tableau, Looker, or Power BI.

The cost here is what you pay for data modeling. Typically, an analyst who is adept at SQL or Python, or a data engineer, can undertake this activity, so the cost is essentially their time. The tooling cost is negligible, as dbt has a decent free tier and is something we often recommend.
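To make "codifying a business rule" concrete: in a dbt project the rule would live in a models/*.sql file; the sketch below runs an equivalent query directly from Python for illustration, with placeholder table names:

```python
from google.cloud import bigquery

# Illustrative business rule: net revenue and order count per day.
# In dbt this SQL would be a model file; it is run directly here for brevity.
MODEL_SQL = """
CREATE OR REPLACE TABLE `my-project.analytics.daily_revenue` AS
SELECT
  DATE(created_at)                  AS order_date,
  SUM(total_amount - refund_amount) AS net_revenue,
  COUNT(DISTINCT order_id)          AS orders
FROM `my-project.raw.orders`
GROUP BY order_date
"""

client = bigquery.Client()
client.query(MODEL_SQL).result()  # materialize the modeled table
```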

Data Visualization

Once your requirements are ready, data has been ingested with a tool like Daton, and an analyst has modeled it using a service like dbt on a warehouse like BigQuery, the next step is to build automated dashboards and reports.

The costs here are the licenses for tools like Tableau and the time the analyst spends building dashboards.

Personnel

Finally, the quarterback stitching all these activities together is a capable analyst or, ideally, a data team. An experienced analyst or data team can help through the entire lifecycle: finding the right tools for your data initiative, building the data stack, keeping it up and running, ensuring data quality, and surfacing insights that support decision-making.

Learn how to approach your DTC brand's data and data needs.

Conclusion

In summary, if you are a brand making $10M in revenue, we recommend budgeting 1%-2% of revenue, i.e., $100K-$200K, for your data initiative to get the best results. In our experience over the last seven years, working with over 100 brands, the companies that invested in the right data stack and data team (analyst, data engineer, data scientist) grow much faster and are more profitable.

Discover how investing just 1% of annual revenue in data initiatives resulted in a 50x return on investment!