One of the most important questions when running a business is: which metrics should I track?
Whether it is team performance, pipeline performance, competitor activity, or product performance, it takes a handful of metrics to maintain business visibility.
How quickly will I achieve my goals?
Do I need to speed up my sales process?
Is there a team that needs more attention?
Is there an account that I’m losing?
Are we improving?
Metrics should be shown on a dashboard, big and visible! They can be shown on a TV or other large displays.
So, here are 12 sales metrics that you need to show on your Sales dashboard.
The metrics are: Leads by source, Pipeline, Sales cycle, Closed opportunities, New business vs. upsell, Win/loss rate, Product gaps, Open opportunities, Open activities, Open cases, Opportunities past due, and Sales by closed date.
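Most of these metrics are simple aggregates over your deal records. As a minimal sketch of two of them (the field names `stage`, `created`, and `closed` are hypothetical, not a real CRM schema), win/loss rate and average sales cycle could be computed like this:

```python
from datetime import date

# Hypothetical deal records exported from a CRM.
deals = [
    {"stage": "won",  "created": date(2019, 1, 2),  "closed": date(2019, 2, 1)},
    {"stage": "lost", "created": date(2019, 1, 10), "closed": date(2019, 1, 25)},
    {"stage": "won",  "created": date(2019, 1, 15), "closed": date(2019, 3, 1)},
    {"stage": "open", "created": date(2019, 2, 1),  "closed": None},
]

closed = [d for d in deals if d["stage"] in ("won", "lost")]

# Win rate: won deals as a share of all closed deals.
win_rate = sum(d["stage"] == "won" for d in closed) / len(closed)

# Sales cycle: average days from creation to close.
cycle_days = sum((d["closed"] - d["created"]).days for d in closed) / len(closed)

print(win_rate, cycle_days)
```

A BI tool computes exactly these kinds of aggregates for you; the point of the sketch is only that each metric is a small, well-defined calculation over the same deal table.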
Only 12 metrics and you are done with your sales tracking!
Which tool should I use?
How do I extract the data?
How do I plot these?
Which visuals should I use?
How should I show the sales deal-stage funnel?
You will be amazed to see how well the metrics are placed, with the right choice of visuals and layout, in this Power BI report we developed using HubSpot CRM data.
The report is divided into several views:
Overview report – Shows your key metrics and sales funnel. Here we track open revenue, deal owners, and sales cycle time, along with deal details
Deal Geo analysis – Here you can clearly see that the Asian regions are not performing well, with a win rate below 30%
Deal Region analysis – For an executive interested in region-level performance, this view gives a clear picture of where the team needs to focus
Deal Details report – Last but not least, a detailed report to call out specific deals and work with the account owners to understand progress
The CRM source could be HubSpot, Salesforce, or even Microsoft Dynamics.
We have used Microsoft Power BI to pull, model, and visualize the raw data.
We get tons of questions about how to copy data from Facebook Ads, HubSpot, Google Analytics, or any other SaaS platform or API into Power BI.
There are a couple of reasons why you might want to analyze data from multiple sources in Power BI.
Why should I care?
Say you want to analyze how much traffic and how many leads (CRM) are generated by your ad spend on Google and Facebook. And how many of them actually purchase (Stripe or PayPal)?
You can go to each of these service providers’ dashboards and get answers to your questions one by one. Or you can copy the data from these providers into a single system, say Power BI, and analyze it together. Which one do you think will give you a clearer business picture, with much less overhead?
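Once the data is in one place, the cross-source view is essentially a join on a shared key such as the channel or campaign. A toy sketch of the idea (all numbers and field names are invented for illustration):

```python
# Hypothetical per-channel extracts from three services.
ad_spend  = {"google": 500.0, "facebook": 300.0}   # e.g. from Google/Facebook Ads
leads     = {"google": 50,    "facebook": 20}      # e.g. from the CRM
purchases = {"google": 10,    "facebook": 2}       # e.g. from Stripe/PayPal

# With everything in one system, blended metrics are a simple join
# on the channel key -- something no single provider dashboard shows.
report = {
    ch: {
        "cost_per_lead": ad_spend[ch] / leads[ch],
        "lead_to_purchase": purchases[ch] / leads[ch],
    }
    for ch in ad_spend
}

print(report)
```

In Power BI the same join happens through relationships in the data model rather than code, but the underlying operation is the same.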
Clearly the Power BI one!
Power BI already provides data connectors for a good number of services: Google Analytics, Facebook Page, MailChimp, Salesforce, Mixpanel, etc. But there are a number of sources for which data connectors do not exist yet: HubSpot, Facebook Ads, Instagram Ads, LinkedIn Ads, etc.
Update (April 30): We have created a list of popular Power BI custom data connectors. The list includes connectors to HubSpot, Zendesk Support, Facebook Ads, Instagram Ads, LinkedIn and more. Check here.
Of late I have been seeing a lot of requests to copy data from HubSpot to Power BI!
What are my options?
This post talks about two general approaches to copy data to Power BI from SaaS platforms and APIs.
Approach 1: Use Power Query (within Power BI) to get data from online services (through REST APIs), then model and visualize the data in Power BI – self-service, simple, lower cost
Approach 2: Set up data factory pipelines using, say, Azure Data Factory, or set up Azure Functions; copy the data to a database/data warehouse; and use this data layer for visualization in Power BI – enterprise-grade, more control over the data, involves several components
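At its core, Approach 1 amounts to calling the service’s REST API and reshaping the JSON response into a table. Power Query does this in M (with `Web.Contents` and `Json.Document`); the shape of the logic is sketched below in Python, with a stubbed response in place of a live API call and an invented payload layout (this is not the real HubSpot response format):

```python
import json

# Stubbed API response body. In Power Query you would fetch this from
# the service's REST endpoint; the field names here are invented.
response_body = json.dumps({
    "results": [
        {"dealname": "Acme renewal", "amount": 1200},
        {"dealname": "Beta upsell",  "amount": 800},
    ]
})

def to_rows(body):
    """Parse the JSON body into flat rows, ready to load as a table."""
    data = json.loads(body)
    return [(r["dealname"], r["amount"]) for r in data["results"]]

rows = to_rows(response_body)
print(rows)
```

Real APIs add authentication and paging on top of this, but the get-parse-flatten pattern stays the same in either tool.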
Which approach to go with depends on how frequently you want to pull the data and how you want to reuse the data from these services.
How should I decide?
Questions to ask:
Question 0: How frequently do you want to pull data? Every 15 mins? Every 2 hours? Every day?
Question 1: Are you going to use the data only in Power BI/Excel and only for visualization and analysis?
Question 2: Are you going to use the data in another application or going to reuse that data for Machine Learning or other use cases?
Question 3: How big is your data?
If your data pull requirements are every 2 hours or so, your data is small*, it does not have to be used outside of Power BI/Excel, and the purpose is only visualization and analytics – go with Approach 1.
Else go with Approach 2.
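The rule of thumb above can be written down explicitly. The function below just encodes the post’s heuristic; the thresholds are guidelines, not hard limits:

```python
def choose_approach(pull_interval_hours, data_is_small, power_bi_only):
    """Encode the decision rule: Approach 1 only when pulls are
    infrequent, data is small, and usage stays inside Power BI/Excel."""
    if pull_interval_hours >= 2 and data_is_small and power_bi_only:
        return "Approach 1"
    return "Approach 2"

print(choose_approach(2, True, True))     # infrequent, small, BI-only
print(choose_approach(0.25, True, True))  # every 15 mins needs a pipeline
```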
Approach 1 is quick to implement and will incur a lower overall cost.
Approach 2 could take time to implement and involves several components beyond just Power BI (read on).
Approach 2 also gives you the ability to pull data incrementally, while in Approach 1 (for Pro users) you have to pull the full dataset every time.
Here we use Azure Data Factory to pull data from the Google Analytics API.
Note: In the example below, “FlattenJSON” is an Azure Function that simplifies the complex JSON returned by the Google Analytics API.
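The Google Analytics Reporting API nests dimensions and metrics several levels deep, which is awkward to load into a table directly; a flattening function reshapes it into plain rows. Below is a simplified sketch of what such a function might do (the payload mimics the v4 `batchGet` report shape, trimmed for brevity; the post’s actual FlattenJSON implementation is not shown here):

```python
# A trimmed payload mimicking one report from the Google Analytics
# Reporting API v4 response (nested dimensions/metrics per row).
report = {
    "columnHeader": {
        "dimensions": ["ga:country"],
        "metricHeader": {"metricHeaderEntries": [{"name": "ga:sessions"}]},
    },
    "data": {
        "rows": [
            {"dimensions": ["India"], "metrics": [{"values": ["120"]}]},
            {"dimensions": ["USA"],   "metrics": [{"values": ["340"]}]},
        ]
    },
}

def flatten(report):
    """Reshape the nested report into flat dicts, one per row."""
    dims = report["columnHeader"]["dimensions"]
    mets = [m["name"]
            for m in report["columnHeader"]["metricHeader"]["metricHeaderEntries"]]
    rows = []
    for r in report["data"]["rows"]:
        row = dict(zip(dims, r["dimensions"]))
        row.update(zip(mets, r["metrics"][0]["values"]))
        rows.append(row)
    return rows

print(flatten(report))
```

The flattened rows land in the database as an ordinary table, which is exactly the shape Power BI expects in Step 2 below.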
Step 1: Set up data pipelines using, say, Azure Data Factory, and push the data to a database/data warehouse
Step 2: Connect to the database using the DirectQuery approach in Power BI
Step 3: Start visualizing!
I have seen customers go with Approach 1 almost always, since it saves time and cost. Note, though, that this approach may require a gateway for data refreshes.
Which approach have you used in similar scenarios? Let me know
Note 2: Power BI has dataflows, which may replace Azure Data Factory in the scenarios above.
Note 3: The Power BI team is also working to expose the data model behind the scenes as Analysis Services. You can then query the data through SSMS. That feature should come out in preview in Feb 2019. (This date is from the Microsoft Power BI blog/website.)