Anonymous
Not applicable

Test Automation Framework Using DAX Studio

Hello,

I have been able to write queries and test individual cases in DAX. However, I would like to organize those into a framework so that reusable components can be captured. Is there an established framework in DAX Studio for automation?

 

Thanks,

Vinodh

1 ACCEPTED SOLUTION
technolog
Honored Contributor

Now the useful part. You can still get solid automated tests for DAX by running queries against the model through XMLA and asserting on the results. Think of DAX Studio as your interactive workbench and use a separate runner for automation.

Plan of attack

  1. Pick a test runner you like. Pester in PowerShell works well. Pytest in Python is fine. xUnit or NUnit in C# works too. Any runner that can call a library and compare values will do.
  2. Connect to the model through XMLA. Use the local Desktop instance for fast inner-loop runs or the service endpoint for shared datasets. Use ADOMD.NET in .NET or pyadomd in Python to send DAX and read back a table or a single value.
  3. Organize tests by feature. One assertion per query. Keep tests small and deterministic. Store expected values in plain text such as CSV or JSON.
  4. Write three kinds of checks:

Scalar checks
Send a query that returns one value and compare to an expected number or text. Add a tolerance for floating point values.
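A scalar check might look like the sketch below. `run_scalar` is a hypothetical helper standing in for whatever ADOMD/pyadomd call returns the single cell of the result; it is stubbed here so the example is self-contained.

```python
import math

def run_scalar(query: str) -> float:
    """Hypothetical helper: a real suite would send the DAX query over
    XMLA (e.g. via pyadomd) and return the single cell. Stubbed here."""
    return 1234.5602

def assert_scalar(query: str, expected: float, tol: float = 1e-4) -> None:
    """Run a one-value DAX query and compare with a floating-point tolerance."""
    actual = run_scalar(query)
    if not math.isclose(actual, expected, rel_tol=0, abs_tol=tol):
        raise AssertionError(f"{query!r}: expected {expected}, got {actual}")

# EVALUATE { [Total Sales] } returns a single-cell table in DAX.
assert_scalar("EVALUATE { [Total Sales] }", 1234.5601, tol=0.001)
```

The tolerance keeps floating-point noise from the storage engine out of your failures.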

Small table checks
Send a query that returns a tiny table. Sort columns in the query. Compare to a golden csv. Fail the test if there is any extra row or any mismatch.
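A minimal sketch of the golden-CSV comparison, assuming the rows have already been fetched from an ORDER BY query (the `actual` list here is a stand-in for that result):

```python
import csv
import io

def rows_from_csv(text: str) -> list[list[str]]:
    """Parse a golden CSV into a list of rows (header included)."""
    return [row for row in csv.reader(io.StringIO(text))]

def assert_table(actual_rows: list[list[str]], golden_csv: str) -> None:
    """Fail on any extra, missing, or mismatched row. Assumes the DAX
    query already sorts with ORDER BY, so row order is stable."""
    expected = rows_from_csv(golden_csv)
    if actual_rows != expected:
        raise AssertionError(
            f"table mismatch:\nexpected {expected}\nactual   {actual_rows}"
        )

# Rows as an ordered EVALUATE query might return them (stubbed).
actual = [["Color", "Units"], ["Blue", "10"], ["Red", "7"]]
assert_table(actual, "Color,Units\nBlue,10\nRed,7\n")
```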

Model health checks
Query DMVs such as the TMSCHEMA tables (for example $SYSTEM.TMSCHEMA_MEASURES) to assert that measures exist, are formatted, and use the right data types. This catches silly regressions before they reach users.
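A health check can be a plain assertion over the DMV rows. The rows below are stubbed, and the column names are assumptions based on typical TMSCHEMA output; adjust them to whatever your DMV query actually returns.

```python
# Rows as a DMV query like
#   SELECT [Name], [FormatString] FROM $SYSTEM.TMSCHEMA_MEASURES
# might return them (stubbed; column names are illustrative).
dmv_rows = [
    {"Name": "Total Sales", "FormatString": "#,0.00"},
    {"Name": "Order Count", "FormatString": "#,0"},
]

def assert_measure_healthy(rows, name: str) -> None:
    """Assert the measure exists and has an explicit format string."""
    match = [r for r in rows if r["Name"] == name]
    if not match:
        raise AssertionError(f"measure {name!r} is missing from the model")
    if not match[0]["FormatString"]:
        raise AssertionError(f"measure {name!r} has no format string")

assert_measure_healthy(dmv_rows, "Total Sales")
```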

  5. Add performance budgets that do not depend on DAX Studio. Read the query duration and row count after each test, either from the session DMVs or by timing the call itself. Keep a ceiling per test. If a query crosses the ceiling, fail the test with a clear message.
  6. Put it in continuous integration. Spin up a test dataset, load a small seed of data, run the suite, and publish a short report. Keep logs under version control so that failures are easy to diff.
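The performance budget from step 5 can be one small guard that every test calls. The duration would come from a DMV or from timing the XMLA call itself; here it is just a parameter:

```python
def assert_within_budget(test_name: str, duration_ms: float,
                         ceiling_ms: float) -> None:
    """Fail with a clear message when a query blows its per-test budget."""
    if duration_ms > ceiling_ms:
        raise AssertionError(
            f"{test_name}: query took {duration_ms:.0f} ms, "
            f"budget is {ceiling_ms:.0f} ms"
        )

# A query that finishes under its ceiling passes silently.
assert_within_budget("total_sales_by_color", duration_ms=180, ceiling_ms=500)
```

In CI, a budget failure then reads like any other assertion failure in the report, which keeps regressions easy to spot in the diff.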

Practical tips that save time

Freeze time logic. If your measures depend on TODAY(), add a parameter table with a fixed date for tests and use it inside your time intelligence logic.

Stabilize text and numeric formatting. Compare rounded values when needed. Trim whitespace before comparing text.
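That stabilization step can live in one normalization helper applied to every cell before comparison; a minimal sketch:

```python
def normalize(value, ndigits: int = 2):
    """Round floats and trim whitespace on text before comparing."""
    if isinstance(value, float):
        return round(value, ndigits)
    if isinstance(value, str):
        return value.strip()
    return value

assert normalize(3.14159) == 3.14
assert normalize("  Blue ") == "Blue"
```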

Use tiny fixtures. Seed the model with the smallest data that still exercises the logic. This keeps tests fast and readable.

Be strict about sort order. Always order the result in the query before comparing tables.

 


2 REPLIES 2
Anonymous
Not applicable

You're out of luck, as far as I know. There is no testing framework for DAX that resembles the frameworks for OOP languages, and as far as I can tell DAX Studio can't be automated the way Excel can with VBA.

Best
D