2 ways to copy data to Power BI from online services!

I get tons of questions about how to copy data from Facebook Ads, HubSpot, or Google Analytics into Power BI.

There are a couple of good reasons why you would want to analyze data from multiple online sources together in Power BI.

Say you want to analyze how much traffic and how many leads (CRM) are generated by your ad spend on Google and Facebook, and how many of those leads actually purchase (Stripe or PayPal).

You can go to each of these service providers' dashboards and get answers to your questions one by one. Or you can copy the data from these providers into a single system, say Power BI, and analyze it all together. Which one do you think will give you a clearer business picture with much less overhead?

Clearly the Power BI one!

Power BI already provides data connectors for a good number of services: Google Analytics, Facebook Pages, MailChimp, Salesforce, Mixpanel, etc. But there are a number of sources for which there are no data connectors yet: HubSpot, Facebook Ads, Instagram Ads, LinkedIn Ads, etc.

Of late I have been seeing a lot of requests to copy data from HubSpot to Power BI!

This post talks about two general approaches to copy data to Power BI from online services for which we do not have data connectors.

Approach 1: Use Power Query to get data from the online services (through their REST APIs), then model and visualize the data in Power BI – self-service, simple, lower cost

Approach 2: Set up data pipelines using, say, Azure Data Factory, copy the data to a database/data warehouse, and use that data layer for visualization in Power BI – enterprise-grade, more control over the data, involves several components

Which approach to go with will depend on how you want to reuse the data from these services.

Questions to ask:

Question 1: Are you going to use the data only in Power BI/Excel and only for visualization and analysis?

Question 2: Are you going to use the data in another application, or reuse it for Machine Learning or other use cases?

Question 3: How big is your data?

If your data is small* and does not have to be used outside of Power BI/Excel and the purpose is only visualization and analytics – go with Approach 1.

Otherwise, go with Approach 2.

Approach 1 is quick to implement and will incur a lower overall cost.

Approach 2 can take longer to implement and involves several components besides Power BI (read on).

Approach 2 also gives you the ability to pull incremental data, while in Approach 1 (for Pro users) you have to pull the full data every time.

*Note 1: I have seen the Power BI experience degrade once the model size grows beyond 300 MB.

Examples to showcase how to use Approach 1.

You can go either the “Get Data” way or the custom connector way. I will show the steps for the “Get Data” way, connecting to HubSpot (CRM).

Step 1: Go to the “Home” tab in the Power BI Desktop ribbon and select “Get Data -> Web”.

Get Data -> Web

Step 2: Select the “Advanced” radio button and enter the API details. The screenshot below shows an example of connecting to the HubSpot ‘Companies’ API endpoint.

Connect to HubSpot Companies API endpoint
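For reference, here is a minimal Power Query (M) sketch of the kind of query the “Advanced” dialog generates. The endpoint path, the hapikey query parameter, and the companies field in the response are assumptions based on HubSpot's Companies API; check HubSpot's API documentation for the exact URL and authentication to use.

```
let
    // Hypothetical HubSpot endpoint and auth parameter – verify against the HubSpot API docs
    Source = Json.Document(
        Web.Contents(
            "https://api.hubapi.com",
            [
                RelativePath = "companies/v2/companies/paged",
                Query = [hapikey = "your-api-key", limit = "100"]
            ]
        )
    ),
    // The response is a JSON record; pull out the list of company records
    Companies = Source[companies]
in
    Companies
```

Splitting the URL into a static base plus RelativePath and Query, instead of building one long dynamic URL, also tends to behave better with scheduled refresh in the Power BI service.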

Step 3: Apply transformations using Power Query

Transform the JSON data into a tabular format using Power Query transformations.
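To give a feel for this step, below is a small, self-contained M sketch. The sample records and field names (companyId, properties, name, industry) are purely illustrative – the real HubSpot response has its own field names – but the pattern of converting a list of records to a table and expanding record columns is the same.

```
let
    // Illustrative sample shaped like a list of company records – in practice this is the output of the previous query
    Companies = {
        [companyId = 101, properties = [name = "Contoso", industry = "Software"]],
        [companyId = 102, properties = [name = "Fabrikam", industry = "Retail"]]
    },
    // Convert the list of records into a single-column table
    AsTable = Table.FromList(Companies, Splitter.SplitByNothing(), {"Record"}),
    // Expand the record fields into columns (field names are illustrative)
    Expanded = Table.ExpandRecordColumn(AsTable, "Record", {"companyId", "properties"}),
    // Nested records such as 'properties' can be expanded the same way
    Result = Table.ExpandRecordColumn(Expanded, "properties", {"name", "industry"})
in
    Result
```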

Step 4: Start visualizing!

Sales Pipeline report we developed from HubSpot data.

Play with Sales Pipeline Report live

Examples to showcase how to use Approach 2.

Here we use Azure Data Factory as a means to pull data from the Google Analytics API.

Note: In the example below, “FlattenJSON” is an Azure Function used to simplify the complex JSON returned by the Google Analytics API.

Step 1: Set up data pipelines using, say, Azure Data Factory and push the data to a database/data warehouse

Azure Data Factory v2 data pipelines to copy data from Google Analytics to Azure SQL

Step 2: Connect to the database using the DirectQuery approach in Power BI

DirectQuery connection to Azure SQL
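For reference, the underlying Power Query (M) expression for an Azure SQL connection looks roughly like the sketch below; the server, database, schema, and table names are placeholders. Whether Power BI uses DirectQuery or Import is chosen in the connection dialog, not in the M code itself.

```
let
    // Placeholder server, database, schema, and table names – replace with your own
    Source = Sql.Database("yourserver.database.windows.net", "AnalyticsDW"),
    // Navigate to the table populated by the Data Factory pipeline
    GaSessions = Source{[Schema = "dbo", Item = "GaSessions"]}[Data]
in
    GaSessions
```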

Step 3: Start visualizing!

Google Analytics data visualized by us using Approach 2

I have seen customers go with Approach 1 almost always, since it saves time and cost.

I have also seen customers go with Approach 2 when more than five or so data sources are involved.

Which approach have you used in similar scenarios? Let me know!

Thanks,

Ranbeer M

Note 2: Power BI has dataflows, which may replace Azure Data Factory in the scenarios above. Dataflows are still in preview.

Note 3: The Power BI team is also working to expose the data model behind the scenes as Analysis Services, so you can then query the data through SSMS. That feature should come out in preview in February 2019. (This date is from the Microsoft Power BI blog/website.)

Note 4: The Power BI team has also released AI features in private preview, so you can apply ML techniques to your data in Power BI! More on that in this blog post.

Note 5: You may contact us if you need similar data movement and analytics for your online services data!
