Process Azure Analysis Services objects from Azure Data Factory v2 using a Logic App

In this blog post I will show how you can orchestrate processing of your Azure Analysis Services objects from Azure Data Factory v2. This makes it possible to process an Analysis Services model right after your Azure Data Factory ETL process finishes, a common scenario.

I will guide you through creating a Logic App that communicates with the Azure Analysis Services REST API and show how you can trigger its execution using a webhook from Azure Data Factory v2. The whole process will be executed using an Azure Active Directory App registration that has been granted permission to process Azure Analysis Services objects via the REST API.

The following figure gives an overview of the process:

Process Overview

Now let me guide you through the following steps:

  1. Create App Registration (application service principal) in Azure Active Directory
  2. Grant App Registration permissions to process your Azure Analysis Services model
  3. Create the Logic App that will communicate with the Azure Analysis Services REST API to process a model
  4. Create an Azure Data Factory pipeline with a Web Activity that will trigger the execution of the Logic App

Create App Registration (application service principal) in Azure Active Directory
Navigate to the Azure Active Directory page. On the Overview page, click “App registrations” followed by “+ New application registration”.
New application registration

Enter a name for your App registration, select Web App / API for application type and enter a dummy Sign-on URL as we don’t intend to use Web Authorization.
Create app registration

In the Settings section for your App registration, click Required permissions.
App Registration Required permissions

In Select an API, type “SQL Server Analysis Services” into the search box. Then select “Azure Analysis Services (SQL Server Analysis Services Azure)”.
Add Azure Analysis Services API access

In Select permissions, select “Read and Write all Models” and then click the Select button. Then click Done to add the permissions. It may take a few minutes to propagate.
Select permissions Read and Write all Models

Back in the Settings section for your App registration, now click Keys.
App Registration Keys

Enter a key description, choose a duration and click Save. A new key will be created. Make sure to copy and save it as you won’t be able to retrieve it after you leave this blade.
App Registration Key

The Application Registration is ready. Take note of the Application ID in the Settings section for your App registration, as you will need it later when configuring the Logic App.
App Registration Application ID

Azure Active Directory ID (Tenant ID)
When you configure your Logic App to use the App Registration we just created, you will also need to provide a value for the Tenant property. This refers to the ID of the Azure Active Directory in which the App Registration was created.

Navigate to the Azure portal, go to the Azure Active Directory page and click Properties. Now take note of the “Directory ID”.
Azure Active Directory Directory ID

Grant App Registration permissions to process your Azure Analysis Services model
Connect to your Azure Analysis Services server using SQL Server Management Studio. To process models using the API, the App Registration needs Server administrator permissions.

Open the Analysis Services Server Properties, click Security and click Add. You can add the App Registration as a manual entry using the Application ID (app guid) and the Azure Active Directory ID (tenant guid) that you saved before. Use the following syntax:

app:<app guid>@<tenant guid>
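As a sanity check, the identity string can be assembled with a one-line helper. A minimal sketch (the GUIDs below are made up for illustration only):

```python
def aas_admin_identity(app_id: str, tenant_id: str) -> str:
    """Build the manual-entry string for adding a service principal
    as an Azure Analysis Services server administrator."""
    return f"app:{app_id}@{tenant_id}"

# Placeholder GUIDs for illustration only
print(aas_admin_identity(
    "11111111-2222-3333-4444-555555555555",
    "aaaaaaaa-bbbb-cccc-dddd-eeeeeeeeeeee",
))
```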

Azure Analysis Services Server administrators


Create the Logic App
Create a new Logic App from the Azure portal Marketplace and click Edit.
Edit Logic App

Pick the “When a HTTP request is received” trigger from the list of most commonly used triggers.

Click “+ New step”, “Add an action” and choose the “HTTP – HTTP” action.
Your Logic App Designer now contains all the necessary components to be triggered from Azure Data Factory and to communicate with the Analysis Services REST API.
Logic App Designer

We now need to configure the HTTP action. You can find background information regarding the Uri (Base URL) and Body in the REST API for Azure Analysis Services documentation on the Analysis Services Team Blog.

  • Method: POST
  • Uri:
    The base URL for the Azure Analysis Services REST API to process (refresh) models follows this format:
    https://<rollout>.asazure.windows.net/servers/<serverName>/models/<resource>/refreshes

    Replace the placeholders with parts of your Azure Analysis Services Server Name property and your model name. In this example my server name was asazure://northeurope.asazure.windows.net/myaas, which maps to:

    • <rollout>: northeurope
    • <serverName>: myaas
    • <resource>: MyModel (model name)
  • Body: Read the REST API documentation on the Analysis Services Team Blog under “POST /refreshes” for more information. In the example below I refresh my entire database (model) with process Type Full, but processing specific tables or partitions is also possible.
    {
        "Type": "Full",
        "CommitMode": "transactional",
        "MaxParallelism": 2,
        "RetryCount": 2,
        "Objects": [
            {
                "database": "MyModel"
            }
        ]
    }
  • Authentication: Active Directory OAuth
  • Tenant: Use the Azure Active Directory ID that you saved before
  • Audience: https://*.asazure.windows.net
  • Client ID: Use the App Registration Application ID that you saved before
  • Credential Type: Secret
  • Secret: Use the App Registration Key that you saved before
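For reference, what the Logic App’s HTTP action does can be mimicked with the Python standard library. This is a minimal sketch, not part of the setup itself: all GUIDs, names, and keys below are placeholders, and the client-credentials token request against the Azure AD v1 token endpoint is my assumption of what the Active Directory OAuth authentication option does under the hood.

```python
import json
import urllib.parse
import urllib.request

TENANT_ID = "<tenant guid>"       # Azure Active Directory ID (placeholder)
CLIENT_ID = "<app guid>"          # App Registration Application ID (placeholder)
CLIENT_SECRET = "<app key>"       # App Registration key (placeholder)

def refresh_url(rollout: str, server: str, model: str) -> str:
    """Base URL of the Azure AS REST API refresh endpoint."""
    return (f"https://{rollout}.asazure.windows.net"
            f"/servers/{server}/models/{model}/refreshes")

def refresh_body() -> bytes:
    """Same body as the Logic App HTTP action: full refresh of the database."""
    return json.dumps({
        "Type": "Full",
        "CommitMode": "transactional",
        "MaxParallelism": 2,
        "RetryCount": 2,
        "Objects": [{"database": "MyModel"}],
    }).encode()

def get_token() -> str:
    """Acquire a bearer token via the OAuth client-credentials flow."""
    data = urllib.parse.urlencode({
        "grant_type": "client_credentials",
        "client_id": CLIENT_ID,
        "client_secret": CLIENT_SECRET,
        "resource": "https://*.asazure.windows.net",   # the Audience value
    }).encode()
    req = urllib.request.Request(
        f"https://login.microsoftonline.com/{TENANT_ID}/oauth2/token",
        data=data)
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)["access_token"]

def trigger_refresh() -> int:
    """POST the refresh request, as the Logic App HTTP action does."""
    req = urllib.request.Request(
        refresh_url("northeurope", "myaas", "MyModel"),
        data=refresh_body(),
        headers={"Authorization": f"Bearer {get_token()}",
                 "Content-Type": "application/json"})
    with urllib.request.urlopen(req) as resp:
        return resp.status  # 202 Accepted means the refresh was queued
```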

Your Logic App HTTP Action should now look like this:
Logic App HTTP Action

Test whether everything works as expected by running your Logic App. You can expand the HTTP Action to see the REST API response code and message.
Logic App Run

The Logic App works and is ready to be executed from Azure Data Factory. Open the “When a HTTP request is received” trigger and take note of the HTTP POST URL.

Create the Azure Data Factory
Create a new Azure Data Factory v2 from the Azure portal Marketplace. Click the “Author & Monitor” tile in your provisioned v2 Data Factory blade to open the visual tools for ADF v2.

In the visual tools, create a new pipeline and drag and drop a Web Activity on the pane.
Azure Data Factory v2 Web Activity

Configuring the Web Activity is easy. Open the Settings tab and specify the following property values:

  • URL: Use the HTTP POST URL of the Logic App that you saved before.
  • Method: POST
  • Body: A body is required for a POST Web Activity, but our Logic App ignores it, so a sample like this will do:
    {"message":"hello world"}
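For illustration, the Web Activity boils down to a plain POST against the Logic App’s webhook. A minimal Python sketch of that call (the URL below is a placeholder, not a real endpoint):

```python
import json
import urllib.request

# Placeholder: paste the HTTP POST URL copied from the Logic App trigger here
LOGIC_APP_URL = "https://example.logic.azure.com/workflows/placeholder"

def web_activity_payload() -> bytes:
    """ADF requires a body on a POST Web Activity; our Logic App ignores it."""
    return json.dumps({"message": "hello world"}).encode()

def trigger_logic_app() -> int:
    """POST the dummy body to the Logic App webhook, as the Web Activity does."""
    req = urllib.request.Request(
        LOGIC_APP_URL,
        data=web_activity_payload(),
        headers={"Content-Type": "application/json"})
    with urllib.request.urlopen(req) as resp:
        return resp.status  # success indicates the Logic App run was triggered
```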

Your Web Activity Settings should now look like this:
Web Activity Settings

Execute a Test Run and check the Output tab to see if everything works as expected.
Azure Data Factory v2 Web Activity Output

Now open your Logic App and check if it ran successfully.
Logic App Run from ADF

Your Analysis Services model has started processing from Azure Data Factory! In a future blog post I plan to show how you can enhance the Logic App with a workflow that polls the Analysis Services REST API to check when the process operation has completed.


  1. Hi,
    I ran the steps above, and when I run my Data Factory pipeline I get the following error in MyLogicApp:

    {
        "code": "Conflict",
        "subCode": 1,
        "message": "Internal error AsyncOperationConflictException_Message1.AliTest",
        "timeStamp": "2018-02-06T16:26:49.4094919Z",
        "httpStatusCode": 409,
        "details": [
            { "code": "RootActivityId", "message": "3798339c-6cdc-4f75-9b60-7daad4d8fc9b" },
            { "code": "Param1", "message": "AliTest" }
        ]
    }


      1. Sorry I missed that. I kicked the job off again from Data Factory and this time it shows the job succeeded, but the cube doesn’t process. Instead, in the output section I get the message below:

        {
            "startTime": "2018-02-09T16:15:25.7689228Z",
            "type": "full",
            "status": "notStarted",
            "currentRefreshType": "full",
            "objects": [
                { "status": "notStarted" }
            ]
        }

        I checked the cube to see if it ran, but the last-processed time hasn’t changed.


        1. My guess is that the message from Data Factory gets to MyLogicApp, because MyLogicApp says it is successful, but it just doesn’t send a message to kick it off on the SSAS Azure side. Can you advise at which layer the issue might be, or how to troubleshoot why the cube isn’t starting even though MyLogicApp says it is successful?


          1. Ok. I think the issue was that the "database": "AliTest" command wasn’t kicking off the database, but if I specified individual tables in the "Objects" section of the body, it did process. Thanks.


  2. Hey, nice blog. I wonder why you use the Logic App, though? You could pass all the relevant information into your ADF pipeline as parameters and call the Azure Analysis Services APIs using the Web Activity. That way you could have one ADF pipeline and toggle environments with parameters. Just a thought.


    1. Hi Matt,

      Thank you.

      That’s not possible. The Azure AS REST API requires OAuth, which isn’t supported in ADF’s Web Activity.

      Next to that, Logic Apps offer more capabilities to build advanced workflows, which is something I plan to write a blog post about soon.

      But to be honest, a Logic App is way better at communicating with APIs than ADF will ever be. Even if OAuth were supported, I would still prefer to use a Logic App. In my opinion ADF should be an orchestrator that integrates with all other Azure PaaS compute services for activity dispatch, and should not try to replace what other services are already very good at.



  3. If you are already connecting to a Tabular model’s service, the body of the request can be simplified to: {"type": "full", "maxParallelism": 2}. You can also easily modify the trigger to accept parameters so that you can create one Logic App that can refresh lots of models.

