Under the kover of business intelligence – BI, SQL Server and data, by Koen Verbeeck
https://sqlkover.com

Webinar Series – SQL Server Indexing
Tue, 17 Mar 2026 – https://sqlkover.com/webinar-series-sql-server-indexing/


I’m starting a webinar series about SQL Server indexing with the fine folks of MSSQLTips.com. Each “episode” will be about 30 minutes long and will cover a specific indexing topic in SQL Server (or related database engines). The target audience is people new to SQL Server or new to the concepts of indexing. So don’t expect technical deep dives like Brent Ozar or Hugo Kornelis would give, but rather a beginner-friendly introduction to the various concepts 🙂

The first webinar is about clustered indexes. It will be hosted on Wednesday the 25th of March 2026, and you can register for free.


ADF Pipeline Debugging Fails with BadRequest – The Sequel
Fri, 06 Mar 2026 – https://sqlkover.com/adf-pipeline-debugging-fails-with-badrequest-the-sequel/


A while ago I blogged about a use case where a pipeline fails during debugging with a BadRequest error, even though it validates successfully. If you’re wondering, this is the helpful error message that you get:

In that blog post, the issue was that some lingering user properties were configured incorrectly, and removing (or fixing) them resolved the issue. Yesterday, I had the same error again in another pipeline, but I couldn’t find anything wrong with the pipeline or its activities (and yet again, validation didn’t return any errors). So I started removing activities one by one, rerunning the pipeline each time, to find the culprit.

Turns out, the offending activity was an Execute Pipeline activity. In the child pipeline, there was an issue with an expression: not a syntax error, but a reference to the output of an activity that couldn’t be referenced. Fixing that expression resolved the BadRequest error in the parent pipeline. Conclusion: when you have a BadRequest situation on your hands, check the pipeline for broken expressions/references, but also check any child pipelines.
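For illustration, here’s the kind of broken reference that can slip past validation. This is a minimal, hypothetical sketch (the activity and variable names are made up, and your broken expression may look different): a Set Variable activity whose expression reads the output of an activity it has no dependency on.

```json
{
    "name": "Set row count",
    "type": "SetVariable",
    "dependsOn": [],
    "typeProperties": {
        "variableName": "rowCount",
        "value": {
            "value": "@activity('Copy sales data').output.rowsCopied",
            "type": "Expression"
        }
    }
}
```

Because there’s no dependency on “Copy sales data”, its output isn’t available when the expression is evaluated, even though the expression itself is syntactically valid.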

How to Parameterize Fabric Linked Services in Azure Data Factory for Azure DevOps Deployment
Tue, 03 Mar 2026 – https://sqlkover.com/how-to-parameterize-fabric-linked-services-in-azure-data-factory-in-azure-devops/


Quite the title, so let me set the stage first. You have an Azure Data Factory instance (or Azure Synapse Pipelines) with a couple of linked services that point to Fabric artifacts, such as a lakehouse or a warehouse. You want to deploy your ADF instance with an Azure DevOps build/release pipeline to another environment (e.g. acceptance or production). This means the linked services need to change as well, because in those environments the lakehouse or warehouse live in a different workspace (and also have different object IDs).

When you want to deploy ADF, you typically use the ARM template that ADF automatically creates when you publish (assuming your instance is linked to a git repo). More information about this setup can be found in the documentation. To parameterize certain properties of a linked service, you can use custom parameterization of the ARM template. Long story short, I tried to parameterize the properties of the Fabric linked service. The JSON of such a linked service looks like this:

{
    "name": "my_fabric_lakehouse_linkedservice",
    "type": "Microsoft.DataFactory/factories/linkedservices",
    "properties": {
        "type": "Lakehouse",
        "typeProperties": {
            "workspaceId": "someId",
            "artifactId": "anotherId",
            "tenant": "yetAnotherId",
            "servicePrincipalId": "moreId",
            "servicePrincipalCredentialType": "ServicePrincipalKey",
            "servicePrincipalCredential": {
                "type": "AzureKeyVaultSecret",
                "store": {
                    "referenceName": "my_keyvault",
                    "type": "LinkedServiceReference"
                },
                "secretName": "service-principal-secret"
            }
        },
        "annotations": []
    }
}

In this case, authentication is done using a service principal and its secret is stored in an Azure Key Vault (for which there’s also a linked service configured). We’re interested in parameterizing the workspaceId, the artifactId and the servicePrincipalId properties. So I added the following piece of JSON to the arm-template-parameters-definition.json file:

"Microsoft.DataFactory/factories/linkedservices": {
    "*": {
        "properties": {
            "typeProperties": {
                "workspaceId": "=",
                "artifactId": "=",
                "servicePrincipalId": "="
            }
        }
    }
}

However, none of those properties were added to the ARMTemplateParametersForFactory.json file in the adf_publish branch. Turns out, the actual JSON in ADF uses Microsoft.DataFactory/factories/linkedservices, while the JSON in the custom parameterization file needs to use Microsoft.DataFactory/factories/linkedServices (with a capital S). After this little change, the properties were finally added to the ARM parameters file and I could overwrite them in the Azure DevOps pipeline.
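For completeness, this is the working snippet for arm-template-parameters-definition.json — identical to the earlier one except for the capital S in linkedServices:

```json
"Microsoft.DataFactory/factories/linkedServices": {
    "*": {
        "properties": {
            "typeProperties": {
                "workspaceId": "=",
                "artifactId": "=",
                "servicePrincipalId": "="
            }
        }
    }
}
```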

dataMinds Saturday 2026 – Slides
Fri, 20 Feb 2026 – https://sqlkover.com/dataminds-saturday-2026-slides/


On Saturday the 21st of February I’m presenting an introduction to dimensional modelling at dataMinds Saturday. It’s a topic close to my heart, and I’ve always wanted to present on it.

The slides can be found on GitHub.

Hope to see you there!

SSMS 22 still inserting tabs instead of spaces
Mon, 19 Jan 2026 – https://sqlkover.com/ssms-22-still-inserting-tabs-instead-of-spaces/


I’m not trying to start a debate about whether you should use tabs or spaces when indenting code. Personally, I prefer spaces, because when I copy code to another editor the outlining of the code stays the same, which isn’t always the case with tabs (looking at you, Word and Outlook). But I don’t want to hit the spacebar four times whenever I want to indent something, so I use the setting “insert spaces instead of tabs”:

Best of both worlds 🙂 However, recently I was writing some T-SQL code and noticed that tabs were being inserted instead of spaces. I had just installed SSMS 22, so I assumed I had forgotten to change the settings. But that wasn’t the case: the setting was configured correctly. Maybe the issue was with Redgate SQL Toolbelt, which is installed as an extension in SSMS 22, but there too the setting was applied correctly:

However, when hitting the tab key, spaces were still not being inserted.

Then I noticed something small in the bottom right corner of the query window:

When clicking on “TABS”, you get the option to switch between tabs and spaces:

When switching to spaces, all tabs are automatically converted to spaces (or the other way around).

Likewise, you can set the encoding and the line endings there. It seems these query window settings override any settings from the options menu or from extensions. I wasn’t aware that these settings existed in the query window; maybe I accidentally switched from spaces to tabs by hitting some key combination. I checked SSMS 21, and these options are there as well.

Conclusion: if you want to insert spaces instead of tabs and SSMS refuses to play along even though all settings are correct, check the bottom right corner of your query window to see if spaces are selected.

Power BI PBIR Format Admin Setting
Tue, 18 Nov 2025 – https://sqlkover.com/power-bi-pbir-format-admin-setting/


The Power BI Enhanced Report Format (PBIR) will soon become the default, and that’s a good thing, because it makes git integration significantly easier. You can already enable it in the preview features of Power BI Desktop (also enable PBIP and TMDL to make git integration of the model itself much easier).

There’s also a new admin setting in Power BI/Fabric, which is on by default:

I strongly encourage you to keep it on 🙂 It will make certain tasks easier:

  • downloading reports (check out the blog post Export a Power BI Report that cannot be Downloaded on how to do this with the Fabric CLI)
  • fetching properties. For example, using the Fabric CLI you can loop over your reports and extract the ID of the semantic model each report is connected to.
  • working with multiple people on the same report
  • did I already mention git integration?
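As an illustration of the second bullet, here’s a small sketch of the “loop over reports, extract semantic model ID” idea. The payload below is made up; it only mimics the shape of a Power BI report listing (as returned by e.g. the REST API, where each report carries a datasetId), so treat the names and IDs as placeholders:

```python
# Map each report to the semantic model (dataset) it is connected to.
# The payload is a fabricated example shaped like a Power BI report
# listing; in practice you'd get it from the Fabric CLI or the REST API.
reports_payload = {
    "value": [
        {"id": "aaa-111", "name": "Sales Overview", "datasetId": "ds-001"},
        {"id": "bbb-222", "name": "Finance Monthly", "datasetId": "ds-002"},
    ]
}

def report_to_model_map(payload: dict) -> dict:
    """Return {report name: semantic model id} for every report in the payload."""
    return {r["name"]: r["datasetId"] for r in payload.get("value", [])}

print(report_to_model_map(reports_payload))
```

With an inventory like this you can quickly spot, for example, which reports all hang off the same semantic model before a migration.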

Logged in as a member of an Azure AD Group Error while Deploying DACPAC
Fri, 14 Nov 2025 – https://sqlkover.com/logged-in-as-a-member-of-an-azure-ad-group-error-while-deploying-dacpac/


Quite a long title for a short blog post 🙂
While deploying a DACPAC (from a SQL Server Data Tools database project) through Azure DevOps, I got the following error message:

The user attempting to perform this operation does not have permission as it is currently logged in as a member of an Azure Active Directory (AAD) group but does not have an associated database user account. A user account is necessary when creating an object to assign ownership of that object. To resolve this error, either create an Azure AD user from external provider, or alter the AAD group to assign the DEFAULT_SCHEMA as dbo, then rerun the statement.

Guess the SQL Server team didn’t get the memo that Azure AD has been renamed to Entra ID. Anyway, the Azure DevOps pipeline uses a service connection defined in DevOps, and that service connection is configured with a user-assigned managed identity that has Contributor access on the resource group containing the Azure SQL DB. Furthermore, that managed identity is an actual user in the database, so the error message is completely misleading. The error was thrown when the following SQL script was executed:

CREATE SCHEMA myschema AUTHORIZATION dbo;

Turns out, the managed identity didn’t have the CREATE SCHEMA permission, and it isn’t a member of the db_owner role, so the CREATE SCHEMA script failed with the error above. I created the necessary schemas with a more privileged user, and then the deployment pipeline ran without issues.
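If you’d rather let the deployment identity create the schemas itself, a sketch of the required grants looks like this ([my-deploy-identity] is a placeholder for the managed identity’s database user name — adjust to your setup):

```sql
-- Run as a privileged user (e.g. the server admin).
GRANT CREATE SCHEMA TO [my-deploy-identity];
-- Note: CREATE SCHEMA ... AUTHORIZATION dbo additionally requires
-- IMPERSONATE permission on the dbo user when the creator isn't
-- dbo/db_owner, so pre-creating the schemas with a privileged
-- account (as done in this post) may be the simpler route:
CREATE SCHEMA myschema AUTHORIZATION dbo;
```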

Export a Power BI Report that cannot be Downloaded
Thu, 06 Nov 2025 – https://sqlkover.com/export-a-power-bi-report-that-cannot-be-downloaded/


Yes, you’re reading that right: we’re going to download a report that cannot be downloaded. Well, it cannot be downloaded from the user interface, that is. Suppose you have a report in a Power BI workspace (Pro, PPU, Fabric, it shouldn’t matter), and the original Power BI Desktop file has been lost. You try to download the report, but for some reason the GUI doesn’t let you:

There are many reasons why a report can’t be downloaded (you don’t have permissions, for example), but one reason is that the report has a live connection to a semantic model (or an SSAS model). You can find the full list of limitations in the documentation. There are a couple of workarounds (which might or might not work):

  • editing the report in the browser and saving it somewhere else, then trying to download that new report
  • using an API to update the report

Another method is to use the Fabric CLI. This free command line tool allows you to do all sorts of admin stuff in Microsoft Fabric, but also in Power BI. And there’s an export command! Open your command prompt of choice, log in to your Fabric/Power BI tenant and navigate to the desired workspace using the fab cd command. Then try to export the report using the following command:

fab export "my_report.Report" -o "C:\myfolder\" -f

The -o switch specifies the output directory, while the -f switch “forces” the export, meaning the fab CLI doesn’t ask “are you sure? Y/N” (which is useful for automation where no user input is available). In my case (your mileage may vary, let me know in the comments), the report was extracted to a folder in the PBIR format:

There’s no .pbip file, but if you look inside the folder there’s a definition.pbir file, and if you double-click it, it opens in Power BI Desktop.

dataMinds Connect 2025 – Slides &amp; Scripts
Wed, 08 Oct 2025 – https://sqlkover.com/dataminds-connect-2025-slides-scripts/


You can find all the session materials for the presentation “Indexing for Dummies”, presented at the dataMinds Connect 2025 conference, on GitHub.

Cloud Data Driven User Group 2025 – Slides &amp; Scripts
Thu, 25 Sep 2025 – https://sqlkover.com/cloud-data-driven-user-group-2025-slides-scripts/


The slide deck and the SQL scripts for the session “Indexing for Dummies” can be found on GitHub.
