Azure Analysis Services deploy via DevOps

Let’s start from real life: what are the advantages in an Enterprise scenario? We may have multiple teams with different skills and responsibilities, processes to take into account, and multiple people with different permissions to access tools, environments and data. Therefore the person in charge of deploying a new tabular model should be able to do it in one single click, without even knowing how the model has been developed.

And that’s where DevOps comes into play!

Azure DevOps (formerly VSTS) allows you to manage backlog items, work items, task assignments and prioritization, everything via easy-to-use boards.

Azure DevOps board

Besides, it’s possible to maintain, test and deploy every piece of code of your software, from a chatbot to any other application, even a database or a tabular data model.

DevOps is a best practice that could (and should) be applied to every software project.

In this article we will see how to deploy a tabular model in one click without knowing anything about the model (how it is built, what it contains, what the credentials to access the data source are…), assuming that the reader already knows:
– the Azure DevOps environment
– how to manage projects in Visual Studio
– Git concepts

Once you’ve set up your Tabular Model project in Visual Studio, you can sync it to Azure DevOps and you’ll have an organization like the one shown in the picture below:

Repository containing the AAS Project

The .bim file is the core file we need to work with when a new version of the model is developed and needs to be deployed.

If you are a developer you know that you can simply publish a model to the server from Visual Studio, but to make that work you also need the credentials for the data source and administrator rights on the server… and you probably won’t have that information for your customer’s production environment. The general idea is to modularize the process so that those credentials don’t need to be shared across the organization, and so that the person in charge of the deployment won’t need any tools installed on his/her machine.

If you don’t have this extension, install it in your Azure DevOps environment. Once that’s done, let’s set up the Release:

Case in which we already have other pipelines
Case in which we don’t have other pipelines already

Here we can choose Empty job:

Let’s create a simple artifact of type Git, indicating the repository where our project lives. This will later allow us to browse to the path of the .bim file within the job’s task configuration.

Click on the link just below the “Stage 1” block (1 job, 0 task):

In the new screen that will appear we need to click on the “+” sign:

We’ll be asked to add a task, so we type “analysis services” in the search bar as shown below and select Azure Analysis Services (if you haven’t installed it previously, you’ll be asked to install it from the Marketplace, so click on the blue “Get it free” button):

Get it free from the market place
Installation step

Otherwise, if it’s already installed, we’ll see it directly available and we can add it:

Once added, it will show that it needs attention (“Some settings need attention”). Don’t worry, that’s normal. Click on it!

At this point it’s time to add all the server information and credentials, BUT here it’s also possible to use PARAMETERS. In this case we are going to use them and mark them as secret; in other cases, with your customer for example, someone else will insert those credentials and we will just use the corresponding parameters.

Let’s first insert them like this:
– Analysis Services Admin = $(modelAdmin)
– Analysis Services Admin Password = $(modelPassword)
– Source User Login = $(dwhadmin)
– Source Password = $(dwhpwd)
– Data Source Type = Azure SQL
Then let’s indicate the path of the famous .bim file:
– Model file = $(System.DefaultWorkingDirectory)/……./model.bim
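For reference, the same parameterization can be sketched as an Azure DevOps YAML pipeline fragment. This is a minimal sketch only: the task identifier and input key names below are illustrative assumptions, not the extension’s documented schema (the Release UI field names quoted in the comments are the source of truth, and the $(variable) substitution syntax is standard Azure DevOps; check the extension’s Marketplace documentation for the exact YAML shape).

```yaml
# Minimal sketch, NOT the extension's documented schema:
# the task id and input keys are illustrative assumptions.
# Comments map each input back to the UI field named above.
steps:
  - task: AzureAnalysisServices@1          # hypothetical task id
    inputs:
      adminName: '$(modelAdmin)'           # "Analysis Services Admin"
      adminPassword: '$(modelPassword)'    # "Analysis Services Admin Password" (secret)
      sourceLogin: '$(dwhadmin)'           # "Source User Login"
      sourcePassword: '$(dwhpwd)'          # "Source Password" (secret)
      dataSourceType: 'Azure SQL'          # "Data Source Type"
      # "Model file": same path you would browse to in the UI
      modelFile: '$(System.DefaultWorkingDirectory)/<your-repo-folder>/model.bim'
```

The secret values ($(modelPassword), $(dwhpwd)) would be defined as secret pipeline variables, never committed in clear text.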

(It is possible to use a service principal as well, instead of the “Named User” option: to do this, it’s necessary to first add the Service Principal as an Analysis Services Admin, which cannot be done directly from the portal, but by following the official guide.)
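As a hint for that scenario: Azure Analysis Services identifies a service principal in its administrator list using the app:&lt;client-id&gt;@&lt;tenant-id&gt; convention. A sketch of how the corresponding pipeline variable could then look (the GUIDs below are placeholders, not real identifiers):

```yaml
# Placeholder GUIDs for illustration only.
# Format: app:<application-client-id>@<tenant-id>
variables:
  modelAdmin: 'app:00000000-0000-0000-0000-000000000000@11111111-1111-1111-1111-111111111111'
```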

And here is where to set up variables:

Once the pipeline is set up for the deployment, you can go back to Releases, select this pipeline and click on Create Release.

It’s possible to have the stage run directly after the creation of the release, or to just create the release and manually start it later (I personally choose the latter).

At this point you can click on the newly created release and then deploy it… in one click, as promised!

From now on, for the following deployments it will be enough to upload the new .bim file, create a new release and click on deploy again.

To sum up: it is possible to deploy an Azure Analysis Services model without having admin permissions on the server, by giving the desired user access to DevOps. Once the .bim model file is in the folder within DevOps (that is, the directory containing the AAS project and its solution files), it can be deployed with a single click.

  • Operations: the IT group maintaining all the releases of every part of a solution doesn’t need deep technical knowledge of the technologies deployed; that knowledge stays with the development team.
  • Productivity: the deployment becomes a simple click, packaged for the people in charge of this task.
  • Password management: passwords are encrypted, and there is no need to know the data sources’ credentials to perform a deployment.
  • Versioning: the Analysis Services project is not only a model on the server but a proper Visual Studio project (.smproj) and solution, and the .bim file contains the model structure with all its measures, tables, M queries and DAX formulas.

32 thoughts on “Azure Analysis Services deploy via DevOps”

  1. I loved it. you have described in simple and best way. It was awesome, i hope i will get more knowledge from you about different technologies deployment.

    1. Hello Ashish, thanks a lot for your feedback. Happy you found the article useful. Are you used to Azure DevOps? Are there any specific technology deployments you are interested in?


    Thanks for the blog.
    Following your guide for my Azure DevOps for AAS has been quite straightforward up until:
    I’m struggling to understand how to include the path to my model.bim.
    Can you let me know the exact format for the UNC path to model.bim?

    1. lucavallarelli

      Hello Carlton, sure thing: there is a three-dot menu (…) next to the text box. If you click on it you’ll get the directory tree of your project, so you just open the folder containing your .bim file and the path will be generated automatically. In my case, within the repo I have the folder you can see in the second screenshot of the article. In any case, go for the three-dot menu and look for the folder where you have the .bim file.


        Hi Luca, thanks so much for getting back in touch… I didn’t think you would 🙂

        So, I click on the ellipsis you mentioned, however I get a blank screen with the message:

        No artifact sources are linked to this release pipeline. Link artifacts from the pipeline tab.

        However, I have set up my Tabular Model project in Visual Studio and sync’d it on Azure DevOps, and I have an organization similar to the one you have in your demonstration.

        One question, did you create an artifact that wasn’t included in your demo?

        (once again, fantastic blog)

        1. lucavallarelli

          Hello Carlton, thank you very much for your comment. You are totally right, and I’ve just updated the article with a new screenshot showing where to add the artifact. The artifact is of type Git, and it defines exactly what you were missing: the repository of the project within the pipeline. After adding it you’ll also find the path when configuring the job’s task. Hope it will work more smoothly now.


            Hi Luca,

            Thanks again for getting in touch.
            I thought there was a missing piece to the jigsaw puzzle. I’ve been scratching my head all week 🙂

            I even engaged Microsoft to assist me ha ha.

            I will take a look at your update now … hopefully all will go smoothly from here.

            Thanks again mate… if you’re ever in London let me know and I’ll take you out for a beer.


  3. Carlton Patterson

    Hi Luca,

    Just wanted to give you a quick shout out to let you know that after checking out your update everything checked out.

    Thanks again.

  4. I’m still learning from you, but I’m improving myself. I definitely liked reading everything that is posted on your website.Keep the information coming. I liked it!

  5. I’ve learned several good things here. Definitely worth bookmarking for revisiting. I wonder how much effort you put into making such an excellent, informative web site.

  6. Hi,

    Can you please specify the powershell requirements this task would need if I am using custom build agent?

    1. lucavallarelli

      Hello Ashvini, as far as I can tell there shouldn’t be any particular PowerShell requirements for this task. Can you share more details on the configuration you are working with?

  7. Hi Luca,
    Its a great article with screenshots put together.
    I followed it and tried to deploy an aas model, the problem i am facing is the inputs taken as source server and database are not getting reflected to the model.bim. I guess the reason is that there is no build step with the source details taken as we do in Visual studio.
    Could you suggest or provide a fix.


    1. lucavallarelli

      Hello bsingh, reading the doc of the package it states: “Also option to change the connected source datasource during release”. That’s the useful part, actually. Have you tried running the pipeline with system.debug set to true to see more details? You could add a build step, as I’ve done for the Database project, but that means you’ll have to set up the info in your local project in advance. Did you find data from another source, or did you just check the source of the AAS via SSMS after having refreshed the view?

      1. I was able to resolve it with some other plugin – “Replace In Files Between Text”. I replace the connection string to required source server and database before using – “Azure Analysis Service Deployment”.
        The next challenge i am working on is multiple Model deployment with same pipeline.

  8. Shubham Pathak

    Exception setting “name”: “The property ‘name’ cannot be found on this object. Verify that the property exists and can be set.”

    getting above error. followed all the instruction. can you help me with this

    1. lucavallarelli

      Hello Shubham, are we talking about the global variables in DevOps? Have you put them in the format $(variableName)? Where did you put the property “name”?

  10. hey Thanks for posting the blog!

    We are having an error though
    2020-01-09T15:51:15.1228805Z ##[command] Select-AzureRMSubscription -SubscriptionId xxxx-xxx-xxx-xxx-xxxx-TenantId ***
    2020-01-09T15:51:17.8623910Z ##[error]Error during adding automatic firewall rule (A connection cannot be made. Ensure that the server is running.)
    2020-01-09T15:51:17.9489812Z ##[error]A connection cannot be made. Ensure that the server is running.

    Not sure which firewall the error is referring to.

    Thanks again and awesome blog

    1. lucavallarelli

      Hello Jacob, thanks for your comment. Are you using variables, or writing the string directly in the fields? Please double check that the variables are written in the right format, $(varName): I once got the same error and it was due to a typo. Secondly: how many data sources do you have in your model? E.g., do you simply have one data source of SQL type (a DB or DWH), or do you also have other sources like Cosmos DB or anything else? I noticed that, for now, this package works properly only with one data source of SQL type.
      Hope this helps, and do not hesitate to comment back with further info.

  11. I have an Azure Analysis Services tabular model with an Oracle Data Warehouse as the source. Is it possible to use the DevOps pipeline with the Oracle source? Is there any update planned to include Oracle DB as a source?

  12. This is really useful. I am trying to follow the same steps. However while deploying the model using SPN I am facing the below issue :
    Error during adding firewall rule (Could not find server: ‘xxx’ in any resource group in the currently selected subscription: xxxxx. Please ensure this server exists and that the current user has access to it.)
    Could not find server: ‘xxxx’ in any resource group in the currently selected subscription: xxx. Please ensure this server exists and that the current user has access to it.
    Can you please help ?

    1. lucavallarelli

      Hey Monica: did you try running the pipeline with the variable ‘system.debug’ set to true, for more insights?
      Do the AAS and the DB have access to Azure services enabled?

      Also, I would try to take a simple AAS model, just one table in it, and publish it in a “safe environment” to isolate the problem, and understand whether it comes from the account you are using, the Azure config, or actual firewall rules on the server.

