Process SSAS Tabular Tables and Partitions with TMSL

16 Comments



  1. Just one note: using TMSL is fine for small static scripts, but generating dynamic TMSL and/or large scripts is very bad for performance; using TOM from PowerShell or other languages is much better (and simpler).

    1. Author

      That’s great to know. Thanks, Marco!

  2. Hi Chris/Russo,

    I have a tabular model with more than 1 billion records, and we need a daily full-load processing. Is there any way I can achieve parallel partition processing using a PowerShell script?

    1. Author

      Sanjeet, SSAS Tabular 2016 will process multiple table partitions in parallel by default if they are processed in the same operation. You can observe this behavior in SSMS if you select multiple table partitions and perform a Process Data or Process Full.

      See here – Sequence command (TMSL)
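
      For example, a Sequence command along these lines processes several partitions in one operation. This is only a sketch: the database, table, and partition names are placeholders, and maxParallelism is optional.

```json
{
  "sequence": {
    "maxParallelism": 4,
    "operations": [
      {
        "refresh": {
          "type": "dataOnly",
          "objects": [
            { "database": "SalesModel", "table": "FactSales", "partition": "FactSales 2018Q1" },
            { "database": "SalesModel", "table": "FactSales", "partition": "FactSales 2018Q2" }
          ]
        }
      }
    ]
  }
}
```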

  3. Is there any way I can dynamically refer to a partition? I have one partition per quarter, and I want to refresh the last two quarters, whatever those may be, in a name formatted like YYYYQ# (e.g. 2018Q1).

  4. Hello,

    I have a Tabular model and have been trying to get the data to refresh from the queries through TMSL in a SQL Agent job so that I can view updated data. I’ve tried the above with no luck. Any tips on how I can achieve this goal?

    1. Author

      Are you getting an error? Have you tested the TMSL script directly in SSMS? If you can isolate the problem to the SQL Agent, perhaps it’s a permissions issue? The account used to run the SQL Agent job needs to have sufficient permissions to process SSAS models.
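
      For reference, a minimal TMSL full-refresh script, the kind you would paste into a SQL Server Analysis Services Command step of a SQL Agent job, looks like this (the database name is a placeholder):

```json
{
  "refresh": {
    "type": "full",
    "objects": [
      { "database": "MyTabularModel" }
    ]
  }
}
```

      If this script runs successfully in SSMS but fails in the SQL Agent job, that points to the job step's connection or the service account's permissions rather than the script itself.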

  5. Hello,

    I have two TMSL refresh full scripts that refresh two separate objects in a Tabular cube:
    let’s say TMSL1 does a full refresh of a dimension/partition A, and TMSL2 does a full refresh of a dimension/partition B (data-independent objects).

    When I run TMSL1 and TMSL2 at the same time, one waits for the other to finish before it processes.

    We have Tabular 2016, and we know that Tabular 2016 does process in parallel, but only if all tables/partitions are in the same TMSL refresh command.
    We cannot do that because TMSL1 and TMSL2 are executed from different locations (an international context where the tabular model is fed from different hubs).

    Do you have any idea how to fix that? Is there any configuration that could be set up on the server to enable it?

    Many thanks for your reply,

    Regards,
    Lhoussine EL OMARI

    1. Author

      I’m not aware of a way to process multiple objects concurrently in separate transactions.

  6. Hi Chris,
    in your example, is the combination of “data only” and the final “calculate” equivalent to a “full” processing of each table?
    I am currently using “full” for all 20 tables in my Tabular model. This is because I need to pick up all data modifications (new records, deleted records, updated records) from the data source.

    Do you think this is the right choice ?

    Thanks a lot
    Alberto

    1. Author

      Yes, a Process Data followed by a Process Recalc is equivalent to a Process Full. This is a common pattern if you only need to refresh data in certain tables. If you need to refresh every table in a model, or if the model size is small, a Process Full is the easiest way to process it.
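
      As a sketch, the Process Data + Process Recalc pattern in TMSL is a sequence of a dataOnly refresh on the chosen tables followed by a calculate refresh on the database (the database and table names here are placeholders):

```json
{
  "sequence": {
    "operations": [
      {
        "refresh": {
          "type": "dataOnly",
          "objects": [
            { "database": "SalesModel", "table": "FactSales" },
            { "database": "SalesModel", "table": "DimCustomer" }
          ]
        }
      },
      {
        "refresh": {
          "type": "calculate",
          "objects": [
            { "database": "SalesModel" }
          ]
        }
      }
    ]
  }
}
```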

  7. Hello Chris,

    I am trying to do a Process Add. I have the XMLA script for the Process Add, which increments the data daily into a partition. My question is: should I be using a partition for the incremental load, or just incrementally load the table? The processing time for an incremental load on a table and an incremental load on a partition of the table seems to be the same. Is this because a Process Add adds the data to the table or the partition and then recalculates the hierarchies, etc. of the entire database? I am a little confused about what a Process Add is doing when it runs.

    We have a tabular model in SSAS. I’ve read about Process Data and then Process Recalc, but should that be the way to go, or should I just go with a Process Add for the incremental load?

    Thanks!
