Financial Reporting for Startups (Part Two)
Don’t Just Create a “Dashboard”
We’ve covered the role that data collection plays in effective management, and why financial reporting only feels like a painful necessity when data collection is not effectively integrated into your organizational philosophy. Now it’s time to address the perils that come with trying to fix this problem.
Many businesses, once they get serious about data collection and begin to see the returns on running a data-driven business, naturally begin to ask how far they can take a good thing. As they invest in better data preparation, these businesses begin to realize just how much information they are collecting, and the potential of this information.
It’s completely understandable that founders would want this information to be as available to them as possible. If closing last month’s books in a few days is good, then wouldn’t having yesterday’s data on tap as of 12:01 AM be better?
The answer, as it often is, is “sometimes,” and if you’re ready for it. Here’s what to understand first before you sign up for an expensive Business Intelligence (“BI”) service or start your accountant on a new financial dashboard project.
Upstream vs. Downstream Data
Your data is mostly just a bunch of numbers and associations, but those numbers and associations have a lifecycle they go through as they make their way from raw information to something more meaningful.
This shouldn’t come as a surprise, but the farther downstream you go with your data, the more time that data takes to process. This goes double for anything being prepared manually, but even automated connections take a few hours to a day to fully sync.
There’s a direct connection between a datapoint’s preparation time and the minimum period for refreshing it. You can’t have a daily report on data that takes a week to prepare unless you want to employ 7 separate teams working in parallel, each handling a day of the week.
There’s also an obvious ROI calculation to be made here. Your team’s time is valuable; taking that time away from other tasks is only worthwhile if the value of the data they report is greater than the opportunity cost of that time. To square this, businesses either sacrifice report frequency (the standard monthly close) or create high-frequency reports on upstream data that’s more immediately available. Many BI tools will largely automate this process, though there can be some drawbacks.
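That ROI calculation can be made concrete with some back-of-the-envelope math. The sketch below is purely illustrative; every figure in it (hours, rates, decision value) is a made-up placeholder you would replace with your own numbers:

```python
# Back-of-the-envelope check: does a report pay for itself at a given cadence?
# All figures below are hypothetical placeholders, not benchmarks.

def report_roi(prep_hours: float, hourly_cost: float,
               runs_per_month: int, value_per_run: float) -> float:
    """Net monthly value of producing a report at a given frequency."""
    monthly_cost = prep_hours * hourly_cost * runs_per_month
    monthly_value = value_per_run * runs_per_month
    return monthly_value - monthly_cost

# A standard monthly close: 16 accountant-hours, informs big decisions.
monthly = report_roi(prep_hours=16, hourly_cost=75,
                     runs_per_month=1, value_per_run=5000)

# The same downstream report forced daily: prep cost scales 30x,
# but each individual run informs far fewer decisions.
daily = report_roi(prep_hours=16, hourly_cost=75,
                   runs_per_month=30, value_per_run=300)

print(monthly)  # 3800.0 -- comfortably positive
print(daily)    # -27000.0 -- deeply negative
```

The asymmetry is the whole point: daily cadence only makes sense on upstream data whose preparation cost is near zero, which is exactly the trade-off BI tools automate.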
Context is King
If you’ve found an acceptable midpoint between waiting for monthly reports and injecting raw binary straight into your eyeballs, you still need to make sure you can understand the data you receive. The context of your numbers can be just as important as the numbers themselves, and understanding context is one of the things humans still do much better than machines.
Your mileage may vary depending on the business you run. Let’s say that you’ve linked up a tool that gives you gross receipts data daily through an online dashboard. If you’re selling a single eCommerce product, that may be highly relevant data. If you run retail partnerships through other businesses, it may leave you with dozens of other questions. What partner made the sales? Were they high-margin or low-margin products?
This isn’t just about upstream vs. downstream data, though that is certainly part of the equation. There’s an often-overlooked fact about the people who collect and prepare our data for us: they show up to the discussion call and answer our questions. “What happened in March?” “Why is this double last year?” BI and dashboarding tools spend a lot of time and money on data visualization, trying to make up for this lack of context with visual flair.
Just because we’re a data-driven business doesn’t mean data is all we need. Make sure you start your Business Intelligence project based on a desire for a specific, actionable, and achievable dataset. Whatever you do, do not solve for an achievable dataset based on your desire for a Business Intelligence project.
Architects and Firefighters
Let’s talk a bit about that word, “actionable.” You just got the sales report for yesterday. Your summer intern just blasted into your office, sweat dripping, with a laminated chart in his hands (to protect against the sweat). Great news! Sales were up 20% yesterday!
What do you do, besides feel slightly pleased? There aren’t a whole lot of executive decisions that can be made on such a granular datapoint. In fact, as a rational data-minded executive you know that you need a sufficient sample size before you can start to draw conclusions. Say, 30 days of sales data. The same data you’d have gotten before you started your BI project.
Let’s describe two different management archetypes, the architect and the firefighter. The architect builds for the future, the firefighter solves problems with urgency. When the house is burning down, you don’t want an architect pointing out all the structural solutions that could have prevented the fire. But when you’re building that house, you don’t want a firefighter locking the site down every time sparks appear.
Architects don’t ask for a weather report at the building site every morning; they want an understanding of all the stresses their site might experience over time. Firefighters, on the other hand, need pretty granular, if highly specific, data. Is the house on fire right now?
Granular data encourages firefighting tendencies. We look for problems we can fix right now and can tend to over-manage if the decision to act is at all vague. Many well-intentioned BI projects wind up like lousy smoke alarms – throwing up flags every time we fry an egg. All this noise can lead us away from the architectural work we should be focused on.
The best stats are ones that are attached to a clear action. Site downtime, support response time, customer cancellations, days of cash remaining. If your site goes down, there’s no need for additional context, it’s time to fight that fire!
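The “clear action” test can be expressed as a simple rule: a metric earns an alarm only if crossing its threshold maps to a specific response. The sketch below is a hypothetical illustration; the metric names and thresholds are invented for the example, not recommendations:

```python
# A sketch of "alert only when the number maps to an action".
# Metric names and thresholds are illustrative placeholders.

FIREFIGHT_THRESHOLDS = {
    "site_down_minutes": 1,        # any downtime -> fight the fire now
    "support_response_hours": 4,   # SLA breach -> reassign tickets
    "days_of_cash": 60,            # below runway floor -> act on financing
}

def alerts(metrics: dict) -> list:
    """Return only the metrics that cross a clear action threshold."""
    fired = []
    for name, value in metrics.items():
        limit = FIREFIGHT_THRESHOLDS.get(name)
        if limit is None:
            continue  # no threshold defined -> architectural data, not an alarm
        # days_of_cash alarms when it falls BELOW its limit; the others, above.
        breached = value < limit if name == "days_of_cash" else value >= limit
        if breached:
            fired.append(name)
    return fired

result = alerts({
    "site_down_minutes": 0,
    "days_of_cash": 45,
    "daily_sales_change": 0.20,  # yesterday's 20% bump: interesting, not actionable
})
print(result)  # ['days_of_cash']
```

Note what never fires: the 20% daily sales bump. It has no threshold because no executive action hangs on it, which is exactly how a dashboard avoids becoming a lousy smoke alarm.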
Who is Ready for Business Intelligence?
There is absolutely a place for BI within successfully run businesses. But getting there requires that a few data fundamentals be in place.
First – sort out your data pipeline. Upstream data is easier to get quickly and automatically, but that does you no good if your upstream data is useless on its own. Many a reporting process requires an accountant to hack together several chaotic data sources to turn them into something comprehensible.
That’s a downstream solution to an upstream problem. While solving this problem at its source alongside a BI implementation can be daunting, getting your data sources in harmony both provides you with better source data, and reduces the reporting load at the end of the pipeline. It also paves the way for more automation later.
Second, identify your functional datapoints and degrees of precision. Some KPIs just don’t mean as much minute-by-minute as they do monthly, and that’s fine. Once you have these targets in mind, you’ll have a much easier time identifying the critical path to increasing their availability. Targetless projects simply looking for “more data” usually yield limited returns, and rarely get referenced once the novelty wears off.
Finally, determine how you’re going to use (and not misuse) the data. Humans are creatures of habit, and implementing the use of your new insight will be just as crucial as getting the numbers there in the first place. A daily report that you only remember to check twice a month just won’t get you the value you are looking for.
With these steps in place, a comprehensive solution is a phenomenal upgrade for a business. But don’t let the gloss of a financial dashboard convince you that you’ve already reached data nirvana. There’s work to be done first.
This article was written by Ben Coleman