Process Azure Analysis Services objects from Azure Data Factory v2 using a Logic App

In this blog post I will show how you can orchestrate processing of your Azure Analysis Services objects from Azure Data Factory v2. This makes it possible to process an Analysis Services model right after your Azure Data Factory ETL process finishes, a common scenario.

I will guide you through creating a Logic App that communicates with the Azure Analysis Services REST API and show how you can trigger its execution using a webhook from Azure Data Factory v2. The whole process runs under an Azure Active Directory App registration that has been granted permission to process Azure Analysis Services objects via the REST API.

The following figure gives an overview of the process:

Process Overview

Now let me guide you through the following steps:

  1. Create App Registration (application service principal) in Azure Active Directory
  2. Grant App Registration permissions to process your Azure Analysis Services model
  3. Create the Logic App that will communicate with the Azure Analysis Services REST API to process a model
  4. Create an Azure Data Factory pipeline with a Web Activity that will trigger the execution of the Logic App

Create App Registration (application service principal) in Azure Active Directory
Navigate to the Azure Active Directory page. On the Overview page, click “App registrations” followed by “+ New application registration”.
New application registration

Enter a name for your App registration, select "Web app / API" as the application type, and enter a dummy Sign-on URL, as we don't intend to use web authorization.
Create app registration

In the Settings section for your App registration, click Required permissions.
App Registration Required permissions

In Select an API, type “Azure Analysis Services” into the search box. Then select “Azure Analysis Services”.
App Registration Required permissions

In Select permissions, select “Read and Write all Models” and then click the Select button. Then click Done to add the permissions. It may take a few minutes to propagate.
Select permissions Read and Write all Models

Back in the Settings section for your App registration, now click Keys.
App Registration Keys

Enter a key description, choose a duration and click Save. A new key will be created. Make sure to copy and save it as you won’t be able to retrieve it after you leave this blade.
App Registration Key

The App Registration is ready. Take note of the Application ID in the Settings section for your App registration, as you will need it later when configuring the Logic App.
App Registration Application ID


Azure Active Directory ID (Tenant ID)
When you configure your Logic App to use the App Registration we just created, you also need to provide a value for the Tenant property. This refers to the ID of the Azure Active Directory in which the App Registration was created.

Navigate to the Azure portal, go to the Azure Active Directory page and click Properties. Now take note of the “Directory ID”.
Azure Active Directory Directory ID

Grant App Registration permissions to process your Azure Analysis Services model
Connect to your Azure Analysis Services server using SQL Server Management Studio. To process models using the API, the App Registration needs Server administrator permissions.

Open the Analysis Services Server Properties, click Security and click Add. You can add the App Registration as a manual entry using the Application ID (app guid) and the Azure Active Directory ID (tenant guid) that you saved before. Use the following syntax:

app:<app guid>@<tenant guid>
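As a sketch, this is how the entry could be composed programmatically; the helper name and the GUIDs below are placeholders, not real values:

```python
# Hypothetical helper: compose the manual server-administrator entry
# in the app:<app guid>@<tenant guid> format described above.
def aas_admin_entry(app_id: str, tenant_id: str) -> str:
    return f"app:{app_id}@{tenant_id}"

# Example with placeholder GUIDs:
entry = aas_admin_entry(
    "11111111-2222-3333-4444-555555555555",  # Application ID
    "aaaaaaaa-bbbb-cccc-dddd-eeeeeeeeeeee",  # Tenant (Directory) ID
)
```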

Azure Analysis Services Server administrators

 

Create the Logic App
Create a new Logic App from the Azure portal Marketplace and click Edit.
Edit Logic App

Pick the “When a HTTP request is received” trigger from the list of most commonly used triggers.

Click “+ New step”, “Add an action” and choose the “HTTP – HTTP” action.
Your Logic App Designer now contains all the necessary components to be triggered from Azure Data Factory and to communicate with the Analysis Services REST API.
Logic App Designer

We now need to configure the HTTP action. You can find background information regarding the Uri (Base URL) and Body in the REST API for Azure Analysis Services documentation on the Analysis Services Team Blog.

  • Method: POST
  • Uri: https://northeurope.asazure.windows.net/servers/myaas/models/MyModel/refreshes
    The URL for the Azure Analysis Services REST API to process (refresh) models has the following format:
    https://<rollout>.asazure.windows.net/servers/<serverName>/models/<resource>/refreshes
    In this URL, replace the placeholders with parts of your Azure Analysis Services server name property and your model name. In this example my server name was:
    asazure://northeurope.asazure.windows.net/myaas

    • <rollout>: northeurope
    • <serverName>: myaas
    • <resource>: MyModel (model name)
  • Body: Read the REST API documentation on the Analysis Services Team Blog under "POST /refreshes" for more information. In the example below I will refresh my entire database (model) with refresh Type "Full", but processing specific tables or partitions (objects) is also possible.
    {
     "Type": "Full",
     "CommitMode": "transactional",
     "MaxParallelism": 2,
     "RetryCount": 2,
     "Objects": []
    }
    
  • Authentication: Active Directory OAuth
  • Tenant: Use the Azure Active Directory ID that you saved before
  • Audience: https://*.asazure.windows.net
  • Client ID: Use the App Registration Application ID that you saved before
  • Credential Type: Secret
  • Secret: Use the App Registration Key that you saved before
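For reference, the call the HTTP action makes can be sketched in a few lines of Python. This is a non-authoritative sketch using the example server, model, and body values from above; acquiring the OAuth bearer token is not shown:

```python
import json
import urllib.request

def refresh_url(rollout: str, server: str, model: str) -> str:
    """Build the Azure AS REST API refresh endpoint URL."""
    return (f"https://{rollout}.asazure.windows.net"
            f"/servers/{server}/models/{model}/refreshes")

def refresh_body(refresh_type: str = "Full", max_parallelism: int = 2) -> str:
    """Build the JSON body for a full-database refresh."""
    return json.dumps({
        "Type": refresh_type,
        "CommitMode": "transactional",
        "MaxParallelism": max_parallelism,
        "RetryCount": 2,
        "Objects": [],  # empty list = refresh the whole model
    })

def refresh_request(token: str) -> urllib.request.Request:
    """POST /refreshes with a bearer token obtained for the
    https://*.asazure.windows.net audience (token acquisition not shown)."""
    return urllib.request.Request(
        refresh_url("northeurope", "myaas", "MyModel"),
        data=refresh_body().encode(),
        headers={"Authorization": f"Bearer {token}",
                 "Content-Type": "application/json"},
        method="POST",
    )
```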

Your Logic App HTTP Action should now look like this:
Logic App HTTP Action

Test if everything works as expected by running your Logic App. You can expand the HTTP action to see the REST API response code and message.
Logic App Run

The Logic App works and is ready to be executed from Azure Data Factory. Open the "When a HTTP request is received" trigger and take note of the HTTP POST URL.
HTTP POST URL

Create the Azure Data Factory
Create a new Azure Data Factory v2 from the Azure portal Marketplace. Click the "Author & Monitor" tile in your provisioned v2 data factory blade to open the visual tools for ADF v2.

In the visual tools, create a new pipeline and drag and drop a Web Activity on the pane.
Azure Data Factory v2 Web Activity

Configuring the Web Activity is easy. Open the Settings tab and specify the following property values:

  • URL: Use the HTTP POST URL of the Logic App that you saved before.
  • Method: POST
  • Body: A body is required, but its content doesn't matter for our Logic App, so you can use the sample:
    {"message":"hello world"}
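Equivalently, the Web Activity can be defined directly in the pipeline JSON. A sketch of what such an activity definition could look like; the URL value is a placeholder for your Logic App's HTTP POST URL:

```json
{
    "name": "TriggerLogicApp",
    "type": "WebActivity",
    "typeProperties": {
        "url": "<HTTP POST URL of your Logic App>",
        "method": "POST",
        "body": { "message": "hello world" }
    }
}
```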
    

Your Web Activity Settings should now look like this:
Web Activity Settings

Execute a Test Run and check the Output tab to see if everything works as expected.
Azure Data Factory v2 Web Activity Output

Now open your Logic App and check if it ran successfully.
Logic App Run from ADF

Your Analysis Services model has started processing, triggered from Azure Data Factory! In a future blog post I plan to show how you can enhance the Logic App with a workflow that polls the Analysis Services REST API to check when the processing operation has completed.
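As a taste of that polling idea: the refresh operation exposes a status you can check repeatedly until it reaches a terminal value. A minimal, hypothetical status check; the field and status names here are assumptions based on the refresh output the API returns:

```python
# Sketch of the polling idea: the refresh operation reports a "status"
# field that moves from "notStarted"/"inProgress" to a terminal value
# such as "succeeded" or "failed".
import json

IN_FLIGHT = {"notStarted", "inProgress"}

def is_refresh_done(response_body: str) -> bool:
    """True once the refresh operation reports a terminal status."""
    return json.loads(response_body).get("status") not in IN_FLIGHT
```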

Update: I have created a more advanced version of this Logic App, read about it here: Process Azure Analysis Services objects using a Logic App part 2

31 comments

  1. Hi,
    I ran the steps above and when I run my data factory pipeline I get following error in MyLogicApp

    {
    "code": "Conflict",
    "subCode": 1,
    "message": "Internal error AsyncOperationConflictException_Message1.AliTest",
    "timeStamp": "2018-02-06T16:26:49.4094919Z",
    "httpStatusCode": 409,
    "details": [
    {
    "code": "RootActivityId",
    "message": "3798339c-6cdc-4f75-9b60-7daad4d8fc9b"
    },
    {
    "code": "Param1",
    "message": "AliTest"


    1. Hi Jorg,
      Firstly thanks for this post and your reply. I have not been able to process the cube thus far as it always gives me this same error. We tried running it locally from Logic apps and calling it from data factory and in both cases the job doesn’t process. Is there some setting on the SSAS side which we need to modify ?


      1. Sorry I missed that. I kicked the job off again from data factory and this time is showing me the job succeeded but the cube doesn’t process. Instead in the output section I get the below message:
        "startTime": "2018-02-09T16:15:25.7689228Z",
        "type": "full",
        "status": "notStarted",
        "currentRefreshType": "full",
        "objects": [
        {
        "status": "notStarted"
        }
        I checked the cube to see if it ran but it doesn’t show time changed of last process.


        1. My guess is that the message from data factory gets to the MyLogicApp. Because the MyLogicApp says it is successful. and it just doesn’t send a message to kick it off on SSAS Azure side? Can you advise which layer the issue might be or how to trouble shoot for why the cube isn’t starting even though MyLogicApp is saying it successful?


          1. Ok. I think the issue was that the “database”: “AliTest” command wasn’t kicking off the database but if I did individual tables it did process if specified per table in the ‘Object’ section of the code. Thanks.


  2. Hey, nice blog. I wonder why you use the logic app though? You could pass all the relevant information into your ADF as parameters and call the Azure Analysis services apis using the web activity. That way you could have one ADF pipeline and toggle environments with parameters. Just a thought.


    1. Hi Matt,

      Thank you.

      That’s not possible. The Azure AS REST API requires OAuth, which isn’t supported in ADF’s Web Activity: https://docs.microsoft.com/en-us/azure/data-factory/control-flow-web-activity#authentication

      Next to that, Logic Apps offer more capabilities to build advanced workflows, which is something I plan to write a blog post about soon.

      But to be honest, a Logic App is way better at communicating with APIs than ADF will ever be. Even if OAuth were supported, I would still prefer a Logic App. In my opinion ADF should be an orchestrator that integrates with the other Azure PaaS compute services for activity dispatch, and it should not try to replace what other services are already very good at.

      Regards,
      Jorg


  3. If you are already connecting to a Tabular model's service, the body of the request can be simplified to: {"type": "full", "maxParallelism": 2}. You can also easily modify the trigger to accept parameters so that you can create one Logic App that can refresh lots of models. https://docs.microsoft.com/en-us/azure/logic-apps/logic-apps-http-endpoint


    1. Hi Kevin,

      Thanks for your reply!
      Simplifying the body is a good suggestion and modifying the trigger to accept parameters to refresh multiple different models is indeed one of the many possible scenarios. I plan to write more blog posts on this subject and will definitely mention your suggestion. Thanks!


  4. what if we are not using AAS, but AS on premise? I need to refresh AS on premise after an ADF data load for our warehouse.


    1. Hi Mike,
      I would look into using Azure Automation Hybrid Runbook Workers for this scenario. From ADF you can trigger an Azure Automation runbook using a webhook, just like I did for the Logic App in this blog post. Hybrid Runbook Workers can execute PowerShell scripts that access on-premises resources.

      https://docs.microsoft.com/en-us/azure/automation/automation-hybrid-runbook-worker
      https://docs.microsoft.com/en-us/azure/automation/automation-hrw-run-runbooks

      Let me know if this works for you!


  5. Hi, I do not find “SQL Server Analysis Services” when I search this in the text box. May I know how do I add this in the available services list?


  6. Hi,

    We have followed this article but ends up with huge impact in our tenant. After granting permission to the APP, Azure Analysis services is down which created a impact in our tenant. Can you please clarify below queries.

    1. Granting permission to a AAD app will create any impact in other models in Azure Analysis Service?
    2. How Azure AD permission works?
    3. can we have more than one AAD app for Azure Analysis Service models.

    Thanks,
    K Senthilrajan


    1. Hi, I don’t recognize your issues. In your article I see you not only granted the required Azure Analysis Services API permissions like described in this blog post, but also Windows AD API permissions. I would like to understand your case: why did you do this and could this be the cause of the problems? Did you get any background information from Microsoft support?


      1. We were simply told that your AAD app blocked all permissions to Azure Analysis Service. Really not understood how it happened? As well as our development team does not have admin rights to accept the consents, and the AAD app is not shared with any other team. Any idea?


  7. Really great post!

    Would you recommend this scenario also when refreshing a Power BI dataset. Or should instead Azure automation be considered + a Powershell script that refreshes the API?

    Lastly I saw your comment around oAuth and why logicapps is needed. Where would I be able to find whether Oauth is needed to refresh Power BI datasets?


  8. Excellent post! Is it possible to process a selective table from an Azure Analysis Services tabular model? Can you please share the code?


  9. Is there anyway to keep the web activity to wait until the cube has finished processing before it moves onto the next step in the pipeline?


  10. Hi Jorg,
    I am not able to add the user to the Analysis Server manually. Can you please let me know the exact steps to add the user manually?

    app:<app guid>@<tenant guid>

