There are three major steps to setting up Power BI Embedded for your Power BI reports. The same steps apply when embedding dashboards or Q&A (except for the permission settings).
In this blog post we will set up Power BI Embedded for the ISV, or “App owns data,” scenario: the scenario where your users do not need a Power BI license to view reports. You embed Power BI reports in your custom application (say, an e-commerce site or a health app) for your end users.
Here are those three major steps to embed your reports.
Note: Some of the settings differ when you embed for US Gov clouds. Read our brand-new post on settings for embedding in US Gov clouds here
1. Register application
2. Set up permissions in Azure Portal
3. Set up sample code to embed Power BI Reports
Step 1: Register application
Step-by-step guide:
a. Go to https://dev.powerbi.com/apps and register your application. This step is required because the Power BI REST APIs and the .NET SDK require an “application” to connect to Power BI and obtain a token for embedding.
This application is not your custom application (or portal) where the reports will be embedded.
b. Log in with your Power BI account. This is the account that has a Pro or Premium license assigned. It can be your own account or a master (non-human) account.
c. Enter the following settings. You can use any Application Name, but the Application Type should be “Native”.
d. Select the APIs to access. If you only want to “view” reports, select “Read all reports”, “Read all datasets”, and “Create content”. If you want to embed in edit mode, select “Read and write all datasets”, “Read and write all reports”, and “Create content”. We can change these later in the Azure Portal.
Note: We have seen that if you do not select the “Create content” permission, requests to the Power BI APIs fail with a 403 error.
e. Click the “Register” button. You will receive your client id (or application id, as it is called in the Azure Portal). Keep it handy.
Step 2: Set up permissions in Azure Portal
a. Log in to the Azure Portal using your Power BI account (the account used to register the application above).
b. Navigate to Azure Active Directory in the left panel and click App Registrations. If you do not see the app you just created, select “All Apps” instead of “My Apps”.
c. Click your app (“PBIEmbedApp” in our case), then Settings, then Required permissions.
d. Under Required permissions, select Power BI Service. Here you can enable or disable any of the permissions. Click “Save”.
e. After saving, you should see “3” delegated permissions for Power BI Service. With Power BI Service selected, click “Grant permissions”. This grants the permissions to your app.
You are now set to embed your reports! You need three things to embed a report in your application:
a. Your app’s client id (remember, I asked you to keep it handy above). If you missed it, no worries: go to the Azure Portal and grab the “Application Id” for your app under Azure Active Directory. That is your client id.
b. Your Power BI report’s group id and report id. How do you find them? Navigate to your report in the Power BI portal and copy the URL. The segment after “groups/” and before “/reports” is the group_id, and the segment after “reports/” (up to the next “/”) is the report_id. Copy them.
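If you embed many reports, the extraction can be scripted. Here is a minimal Python sketch; the URL and GUIDs below are made-up placeholders:

```python
from urllib.parse import urlparse

def parse_report_url(url):
    """Extract (group_id, report_id) from a Power BI report URL."""
    parts = urlparse(url).path.split("/")
    group_id = parts[parts.index("groups") + 1]    # segment after "groups/"
    report_id = parts[parts.index("reports") + 1]  # segment after "reports/"
    return group_id, report_id

# Hypothetical example URL with placeholder GUIDs:
url = ("https://app.powerbi.com/groups/11111111-2222-3333-4444-555555555555"
       "/reports/aaaaaaaa-bbbb-cccc-dddd-eeeeeeeeeeee/ReportSection")
group_id, report_id = parse_report_url(url)
print(group_id)   # 11111111-2222-3333-4444-555555555555
print(report_id)  # aaaaaaaa-bbbb-cccc-dddd-eeeeeeeeeeee
```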
c. Your Power BI username and password. In ISV or App-owns-data scenarios this is usually a master account. It can be a person’s account like yours or mine, or any other account that nobody (or should I say “no human”) will actually use.
Step 3: Set up sample code to embed Power BI Reports
Download the sample code from Microsoft’s GitHub repo. You will need the “PowerBIEmbedded_AppOwnsData” project.
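The sample project does this in C# with the ADAL library, but the two requests it makes are easy to see language-agnostically. Below is a hedged Python sketch that only builds the request payloads (nothing is sent over the wire); the endpoints are the public Azure AD v1 token endpoint and the Power BI GenerateToken endpoint, and all credentials are placeholders:

```python
# Sketch of the two-step "App Owns Data" token flow, assuming the
# resource-owner-password grant used with a master account.

AAD_TOKEN_URL = "https://login.microsoftonline.com/common/oauth2/token"
POWER_BI_RESOURCE = "https://analysis.windows.net/powerbi/api"

def aad_token_request(client_id, username, password):
    """Payload for step 1: exchange master-account credentials for an AAD access token."""
    return {
        "grant_type": "password",
        "resource": POWER_BI_RESOURCE,
        "client_id": client_id,
        "username": username,
        "password": password,
    }

def embed_token_url(group_id, report_id):
    """URL for step 2: ask Power BI for an embed token for one report."""
    return (f"https://api.powerbi.com/v1.0/myorg/groups/{group_id}"
            f"/reports/{report_id}/GenerateToken")

# Placeholders only; in the sample these come from web.config.
payload = aad_token_request("<client-id>", "master@contoso.com", "<password>")
print(embed_token_url("<group-id>", "<report-id>"))
```

The embed token returned by step 2 is what the JavaScript on your page hands to the Power BI embed container, so your end users never need Power BI accounts.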
Phew! We were finally able to resolve the error “Request is not a valid SAML 2.0” when embedding Power BI reports with federated authentication. It took us some time, but thanks to the wonderful Microsoft support team, who worked with us to debug and isolate the issue.
Our scenario: an enterprise customer with Power BI Premium capacity planning to embed Power BI reports in an internal application using the “App Owns Data” approach. There are reasons why you would embed for enterprises (also called organizational embedding), and reasons why you would choose the “App Owns Data” approach over “User Owns Data”. More about this in another blog post.
OK, so why this error, and how do you solve it?
Why this error:
When you authenticate using a master account, the request goes to a federated server (in this case the customer’s Identity Provider, or IdP). The IdP validates the credentials and sends back a SAML assertion and a TokenType. The Azure AD .NET libraries check the TokenType and assign a grant type; this grant type and the SAML assertion are then sent to Azure AD for confirmation.
In our particular case, the PingFederate identity server returned a TokenType that the Azure AD .NET SDK assumed to be SAML 2.0, so it tagged the grant type as “2.0” (urn:ietf:params:oauth:grant-type:saml2-bearer). But the assertion was not 2.0; it was actually 1.1.
Hence the error: Request is not a valid SAML 2.0 protocol message.
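To make the mismatch concrete, here is an illustrative (not production) Python sketch. The two grant-type URNs are the real OAuth SAML bearer identifiers; the version check and request builder are simplified stand-ins for what the Azure AD library does internally:

```python
import base64

# The two OAuth SAML bearer grant-type URNs.
SAML11_GRANT = "urn:ietf:params:oauth:grant-type:saml1_1-bearer"
SAML2_GRANT = "urn:ietf:params:oauth:grant-type:saml2-bearer"

def grant_type_for(assertion_xml):
    """Pick the grant type that matches the assertion's actual SAML version."""
    if "urn:oasis:names:tc:SAML:2.0:assertion" in assertion_xml:
        return SAML2_GRANT
    return SAML11_GRANT  # SAML 1.1 assertions need the saml1_1-bearer grant

def token_request(assertion_xml, client_id):
    """Build an Azure AD token request body for a SAML bearer assertion."""
    return {
        "grant_type": grant_type_for(assertion_xml),
        "assertion": base64.b64encode(assertion_xml.encode()).decode(),
        "client_id": client_id,
        "resource": "https://analysis.windows.net/powerbi/api",
    }

# A SAML 1.1 assertion (as PingFederate returned) must NOT be sent with the
# saml2-bearer grant type; that mismatch is exactly the error we hit.
saml11 = '<saml:Assertion xmlns:saml="urn:oasis:names:tc:SAML:1.0:assertion"/>'
print(token_request(saml11, "<client-id>")["grant_type"])
# → urn:ietf:params:oauth:grant-type:saml1_1-bearer
```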
How to solve this error?
There are two ways to solve this error:
1. Create a cloud account on the customer’s tenant that is not federated (simple solution), e.g. email@example.com
2. Create the SAML requests manually: send them to your IdP, fix the TokenType in code, and send the request to Azure AD yourself. You will have to bypass the Azure AD libraries and construct your own requests (complex solution).
We went with solution 1, used this cloud account as our master account, and were able to successfully embed the reports in the enterprise’s internal applications.
We all know that data is the new oil (and insights are the new king). With data being generated from innumerable sources (Facebook, Twitter, YouTube, Snapchat, Uber, web traffic, Google searches), data security should not be limited to “contractual terms”.
Here are a few facts about data:
“We create as much information in two days now as we did from the dawn of man through 2003.” – Eric Schmidt
90% of the world’s data was generated over the last two years.
Every second, 40,000 searches are performed on Google.
Every minute, 4.1 million YouTube videos are watched.
I must say, data never sleeps!
With data being the center point of everything, it is a must to secure private and confidential information. We are not trying to solve the world’s data security problem. Instead, through this series of blog posts, we will show techniques to anonymize and secure our customers’ data (while preserving analytic utility; re-read that line).
But wait a second: what are we trying to solve? Through data security techniques, we want to protect end users’ personal information (name, email, phone, national ID) and other confidential and sensitive information such as revenue, salary, internal data, patient health information, trip routes, and personal chat messages.
Removing or encrypting such attributes is not a solution, as it would destroy the data’s analytic utility. Handing out all this information without any control is not a solution either, since it would lead to privacy issues. What should we do then? Ideally, we want to be in the middle of the privacy/utility curve: privacy risk kept at an acceptable level while analytic utility is preserved.
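As a small teaser of the balance we mean, here is a hedged Python sketch combining two classic techniques (the field names, salt, and threshold are made up for illustration): pseudonymize a direct identifier so records remain linkable for analysis without exposing the real value, and top-code an extreme salary so an outlier cannot single anyone out while aggregates stay usable.

```python
import hashlib

SALARY_CAP = 200_000  # illustrative top-coding threshold

def pseudonymize(value, salt="per-dataset-secret"):
    """Replace a direct identifier with a stable pseudonym (still joinable across tables)."""
    return hashlib.sha256((salt + value).encode()).hexdigest()[:12]

def top_code(salary, cap=SALARY_CAP):
    """Clamp extreme values so outliers cannot re-identify individuals."""
    return min(salary, cap)

record = {"email": "jane@contoso.com", "salary": 450_000}
safe = {"id": pseudonymize(record["email"]), "salary": top_code(record["salary"])}
print(safe["salary"])  # 200000
```

Note the trade-off in miniature: the pseudonym still supports joins and counts, and the capped salary still feeds averages, but neither field reveals the original sensitive value.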
Are there methods that can maintain an appropriate balance between privacy protection and analytic utility? That is what we will learn in this series of blog posts.
Here’s how we will structure the next set of posts:
Types of identifiers (direct identifiers and quasi-identifiers)
Methods and techniques to protect these identifiers
Other data protection methods, such as top and bottom coding
Data sharing options
VPN-protected infrastructure
The purpose of this post was to set the premise for how we protect customers’ data and to suggest practical approaches to data anonymization and sharing.
How do you protect this information? What tools and techniques do you use? We would love to know.
Namaste! It’s been a tiring month: working on customer projects, building a product prototype, keeping my team on track. Phew! I’m donning multiple hats. Recently we wrapped up two projects on showing heat streams with Power BI. The projects were challenging, and you know customers bring out the best in you. That happened with us as well…
Heat streams can be very useful for analyzing large data sets and spotting patterns, or “heats”, over a period of time.
Some use cases of heat streams could be:
Analyze call center calls by weekday and time of day: the time of day on the X-axis and weekdays on the Y-axis, with the number of calls as the “heats”
Perform clickstream analysis for website clicks
Analyze patient re-admissions and re-admission types in a hospital over a period of years
Usually, in a heat stream visual, we put the time of day, date, or year on the X-axis and a discrete or continuous value on the Y-axis, and fill the visual with a discrete or continuous value using gradient colors.
The code we developed used ggplot with geom_raster layers, along with various settings for formatting the axes. This R code, combined with Power BI, gave us BI capabilities: the visuals were seamlessly sliced and diced based on the data selected in Power BI. I’m attaching screenshots of the visuals we created using R and Power BI.
Our customers were wowed by the output they saw from their data. Remember: if data is the new oil, then insights are the new king. And we deliver them using interesting, stunning visuals.
Note: You need a large amount of data to get this kind of output. We can further improve these visuals to be interactive; this can easily be done using the plotly and htmlwidgets libraries combined with Power BI.
The biggest challenge you will face in plotting such visuals is handling a large number of data points on the X-axis. You may have to use breaks or cuts to limit the points.
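Whatever the plotting layer, the core of a heat stream is binning events into a time grid before rendering; our R code built such a grid and handed it to geom_raster. Here is a hedged Python sketch of just that binning step for the call-center use case (the call timestamps are made up):

```python
from collections import Counter
from datetime import datetime

WEEKDAYS = ["Mon", "Tue", "Wed", "Thu", "Fri", "Sat", "Sun"]

def heat_grid(timestamps):
    """Count events per (weekday, hour) cell; this grid is what a raster
    layer (e.g. ggplot2's geom_raster) renders as the 'heats'."""
    return Counter((WEEKDAYS[t.weekday()], t.hour) for t in timestamps)

# Made-up call timestamps for the call-center example:
calls = [
    datetime(2018, 5, 7, 9, 15),   # Monday, 09:xx
    datetime(2018, 5, 7, 9, 42),   # Monday, 09:xx
    datetime(2018, 5, 8, 14, 5),   # Tuesday, 14:xx
]
grid = heat_grid(calls)
print(grid[("Mon", 9)])  # 2
```

Binning by hour like this is also one way to apply the breaks/cuts idea: the X-axis carries at most 24 points per day regardless of how many raw events you have.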
Have you plotted heat streams in Power BI/R? What were the most challenging aspects of your project?
We would love to know.
Note: Next week we will start a series of blog posts on how we secure customers’ data with data anonymization and masking techniques. There are some incredible techniques we use that give our customers great confidence in data security.
Do subscribe to our blog posts to not miss our proven data masking techniques and other interesting articles.
Note: There is a custom visual for plotting heat streams in Power BI, but it cannot generate heat streams like the ones we have shown above.