Microsoft Azure is a cloud computing service created by Microsoft for building, testing, deploying, and managing applications and services through Microsoft’s managed data centres.

It provides software as a service (SaaS), platform as a service (PaaS) and infrastructure as a service (IaaS) and supports many different programming languages, tools and frameworks, including both Microsoft-specific and third-party software and systems. Here, I will share some tips & tricks using Azure Data Factory, Azure Databricks, Logic Apps, Power BI and others.

 

Using Logic Apps to send E-Mail from Azure Data Factory

Azure Data Factory doesn’t have a built-in activity for sending e-mail. However, Azure Data Factory can use an Azure Logic App to send e-mail content and attachments: Azure Data Factory triggers the Logic App and passes parameters such as from, to, and attachment content.

    • Add an HTTP activity to trigger the Logic App and pass the parameters from Azure Data Factory
    • Add a Send E-Mail action to send the mail content and attachment to the recipient

 

  • In Azure Data Factory, use a Web activity to trigger the Logic App
  • Add the POST URL and parameters to the Web activity
  • To learn more about triggering a Logic App from Data Factory and passing parameters, please visit this link.
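The steps above amount to a single POST against the Logic App's HTTP trigger. The following Python sketch builds that request; the Logic App URL and the payload fields (to, from, subject, attachmentContent) are placeholders for whatever schema you define in the trigger:

```python
import json
import urllib.request

def build_email_request(logic_app_url: str, payload: dict) -> urllib.request.Request:
    """Build the POST that the ADF Web activity sends to the Logic App's
    HTTP trigger. URL and payload fields are placeholders."""
    return urllib.request.Request(
        logic_app_url,
        data=json.dumps(payload).encode("utf-8"),
        headers={"Content-Type": "application/json"},
        method="POST",
    )

# Example payload mirroring the parameters passed from Data Factory.
req = build_email_request(
    "https://prod-00.eastus.logic.azure.com/workflows/<id>/triggers/manual/paths/invoke",
    {
        "to": "recipient@example.com",
        "from": "sender@example.com",
        "subject": "Pipeline finished",
        "attachmentContent": "<base64-encoded-content>",
    },
)
# Sending it is then a single call: urllib.request.urlopen(req)
```

In ADF itself, the Web activity does exactly this: the trigger URL goes in the URL box, the method is POST, and the JSON payload goes in the body.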

Starting an Azure Databricks Cluster from Azure Data Factory

Azure Databricks will automatically start the cluster when we run a notebook. However, starting the cluster can take a few minutes, typically two to three. If that delay would impact your performance, you can start the cluster through an API call before the primary job begins.

Let’s see how we can do this using the API.

  • Add a Web activity to the Azure Data Factory pipeline
  • In the URL section, use the URL below
    • https://eastus.azuredatabricks.net/api/2.0/clusters/start
    • The region prefix (eastus here) may need to change based on your Azure location
  • Choose Method as ‘POST’
  • In the header section, add an Authorization header and pass the access token generated in Databricks
  • In the body section, pass the Cluster ID of the cluster to start
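For reference, the same call can be sketched in Python; the workspace URL, personal access token, and cluster ID below are placeholders for your own workspace:

```python
import json
import urllib.request

def build_start_request(workspace_url: str, token: str, cluster_id: str) -> urllib.request.Request:
    """Build the POST to the Databricks Clusters API 2.0 start endpoint,
    matching the Web activity settings above. All arguments are placeholders."""
    return urllib.request.Request(
        f"{workspace_url}/api/2.0/clusters/start",
        data=json.dumps({"cluster_id": cluster_id}).encode("utf-8"),
        headers={
            "Authorization": f"Bearer {token}",  # access token generated in Databricks
            "Content-Type": "application/json",
        },
        method="POST",
    )

req = build_start_request(
    "https://eastus.azuredatabricks.net", "<databricks-token>", "<cluster-id>"
)
# Fire it with: urllib.request.urlopen(req)
```

The header and body of this request are exactly what the Web activity's header and body sections carry.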

Using the Azure Key Vault for mounting ADLS in Databricks

Databricks provides an option to mount ADLS into DBFS. Mounting ADLS requires a Service Principal key. The Service Principal key is confidential and should not be shared or hardcoded. We can use Azure Key Vault to store the Service Principal key securely and provide it to Databricks.

The example code below gets the key from Azure Key Vault and mounts ADLS into DBFS.

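A minimal sketch of such a mount, assuming an ADLS Gen2 account, a Key Vault-backed Databricks secret scope (here kv-scope), and placeholder application/tenant IDs; dbutils is passed in explicitly here so the helpers are self-contained, whereas in a notebook it is available globally:

```python
def build_oauth_configs(dbutils, scope, secret_key, client_id, tenant_id):
    """OAuth settings for mounting ADLS Gen2 with a Service Principal whose
    secret is fetched from a Key Vault-backed secret scope."""
    return {
        "fs.azure.account.auth.type": "OAuth",
        "fs.azure.account.oauth.provider.type":
            "org.apache.hadoop.fs.azurebfs.oauth2.ClientCredsTokenProvider",
        "fs.azure.account.oauth2.client.id": client_id,
        # The Service Principal secret never appears in the notebook:
        "fs.azure.account.oauth2.client.secret":
            dbutils.secrets.get(scope=scope, key=secret_key),
        "fs.azure.account.oauth2.client.endpoint":
            f"https://login.microsoftonline.com/{tenant_id}/oauth2/token",
    }

def mount_adls(dbutils, container, storage_account, mount_point, configs):
    """Mount the container into DBFS at mount_point."""
    dbutils.fs.mount(
        source=f"abfss://{container}@{storage_account}.dfs.core.windows.net/",
        mount_point=mount_point,
        extra_configs=configs,
    )
```

In a notebook you would call build_oauth_configs(dbutils, "kv-scope", "sp-secret", "&lt;application-id&gt;", "&lt;tenant-id&gt;") and then mount_adls with your container, storage account, and a mount point such as /mnt/raw.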

Refreshing Power BI Datasets using ADF

There is a common need to refresh your Power BI dataset automatically after data is loaded into your target table or view. Here, we are going to demonstrate calling the Power BI REST API from Azure Data Factory to trigger the refresh.

After your target table is loaded through a Copy activity, add a Web activity and obtain an access token using a native app registration as follows:

URL – https://login.windows.net/common/oauth2/token/

Method – POST

Body – grant_type=password&resource=https://analysis.windows.net/powerbi/api&client_id={Your Native APP Client ID}&username={org_mailid}&password={Password}&scope=openid

Add another Web activity to refresh the dataset using the access token retrieved from the first Web activity.

URL – https://api.powerbi.com/v1.0/myorg/groups/{Group_ID}/datasets/{Dataset_ID}/refreshes

Method – POST

Header – Append the access token from the previous Web activity as a Bearer token in the Authorization header.

Body – Based on your need, you can provide additional information in the body.
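The two Web activities above can be sketched in Python as a hedged outline; the client ID, credentials, group ID, and dataset ID are placeholders, and the password grant mirrors the native-app setup described above:

```python
import json
import urllib.parse
import urllib.request

TOKEN_URL = "https://login.windows.net/common/oauth2/token/"

def token_request(client_id, username, password):
    """First Web activity: password-grant token request (placeholder credentials)."""
    body = urllib.parse.urlencode({
        "grant_type": "password",
        "resource": "https://analysis.windows.net/powerbi/api",
        "client_id": client_id,
        "username": username,
        "password": password,
        "scope": "openid",
    })
    return urllib.request.Request(TOKEN_URL, data=body.encode("utf-8"), method="POST")

def refresh_request(group_id, dataset_id, access_token):
    """Second Web activity: trigger the dataset refresh with the Bearer token."""
    url = (f"https://api.powerbi.com/v1.0/myorg/groups/{group_id}"
           f"/datasets/{dataset_id}/refreshes")
    return urllib.request.Request(
        url,
        data=b"{}",  # body content is up to you
        headers={"Authorization": f"Bearer {access_token}",
                 "Content-Type": "application/json"},
        method="POST",
    )

# Usage (hypothetical IDs):
# token = json.load(urllib.request.urlopen(token_request("<client-id>", "<user>", "<password>")))["access_token"]
# urllib.request.urlopen(refresh_request("<group-id>", "<dataset-id>", token))
```

Chaining the two requests reproduces the pipeline: the token from the first response is appended as the Authorization header of the second.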

 

Learn more about Visual BI’s Microsoft Azure offerings here.
