Analytics Data Lake Connector

The Analytics Data Lake Connector enables you to export your Business Central data to Microsoft Fabric – Open Mirroring in a secure and structured way.
This connector is designed for customers who want to perform regular data refreshes. The key difference from traditional API integrations is that the connector makes an outgoing call from Business Central, rather than requiring an incoming call to Business Central.

In addition, deleted records can be tracked more reliably compared to traditional API integrations.
If new tables are added to Business Central, you can easily include them in the connector without requiring support from Boltrics.

Prerequisites

Before you start, make sure you have:

  • An active subscription that includes the Analytics Data Lake Connector.
  • Access to Microsoft Fabric, including available Fabric capacity.
  • Permission to create an App Registration in Entra ID.
  • Sufficient permissions to access Business Central.

Setup Steps

Follow these steps to activate and configure the Analytics Data Lake Connector.

Step 1 — Create an App Registration

  1. In the Azure Portal, search for App registrations.
  2. Click New registration.
  3. Enter a recognizable name (e.g., Boltrics Analytics Connector).
  4. Set the supported account type to Accounts in this organizational directory only.
  5. Once created, copy the Application (Client) ID and Directory (Tenant) ID; you will need these later.
  6. In Certificates & Secrets, create a Client Secret and note the value securely.
Important

The Client Secret is only shown once. Store it securely (e.g., in Azure Key Vault).
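The app registration created above is later used for a client-credentials (service-to-service) sign-in. As a rough sketch of what that authentication looks like — all IDs and the secret below are placeholders, and the exact scope the connector requests is an assumption — the token request sent to Entra ID can be built like this:

```python
# Sketch: building the OAuth2 client-credentials token request for Entra ID.
# All IDs and the secret below are placeholders for illustration only.
from urllib.parse import urlencode


def build_token_request(tenant_id: str, client_id: str, client_secret: str) -> tuple[str, str]:
    """Return the Entra ID token endpoint URL and the form-encoded request body."""
    url = f"https://login.microsoftonline.com/{tenant_id}/oauth2/v2.0/token"
    body = urlencode({
        "grant_type": "client_credentials",
        "client_id": client_id,
        "client_secret": client_secret,
        # Storage-style scope is an assumption here; the connector may request
        # a different resource scope internally.
        "scope": "https://storage.azure.com/.default",
    })
    return url, body


url, body = build_token_request(
    "00000000-0000-0000-0000-000000000000",   # Directory (Tenant) ID placeholder
    "11111111-1111-1111-1111-111111111111",   # Application (Client) ID placeholder
    "<client-secret>")
print(url)
```

If the Client Secret is wrong or expired, this is the request that fails, which makes it a useful first place to look when exports stop authenticating.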

Step 2 — Add Capacity in Microsoft Fabric

If you have not allocated capacity for Microsoft Fabric, follow these steps:

  1. In Microsoft Fabric, go to the Capacity settings page and select Fabric capacity.
  2. Below the list of capacities, click the link Set up a new capacity in Azure.
    This opens the Create Fabric capacity page in Azure in a new browser tab.
  3. On the Azure Create Fabric capacity page, enter:
    • Subscription – Select the Azure subscription to use for the capacity.
    • Resource group – Select the Azure resource group to use.
    • Capacity name – Provide a meaningful name.
    • Region – Select the region for the capacity.
    • Size – Choose the desired capacity size.
    • Fabric capacity administrator – Select one or more admins.
  4. Select Review + create.
  5. Review the details of your capacity and click Create.

Step 3 — Add the Capacity to Your Workspace

  1. Open Microsoft Fabric.
  2. Navigate to the workspace where you want the mirrored data to be stored.
  3. Assign the Fabric capacity you created in Step 2 to this workspace.
Note

If you do not have a workspace yet, create one before assigning capacity.

Step 4 — Grant App Registration Access to the Workspace

  1. In Microsoft Fabric, go to your workspace settings.
  2. Add the App Registration's service principal as a Contributor (or higher).
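Besides the workspace settings UI, role assignments can also be made through the Fabric REST API (Add Workspace Role Assignment). The sketch below only builds the endpoint URL and JSON payload — the workspace GUID and the service principal's object ID are placeholders, and you would still need a valid bearer token to actually send the request:

```python
# Sketch: request payload for the Fabric "Add Workspace Role Assignment" API.
# IDs are placeholders; a real call also needs an Authorization header.
import json


def build_role_assignment(workspace_id: str, principal_object_id: str,
                          role: str = "Contributor") -> tuple[str, str]:
    """Return the Fabric REST endpoint and JSON body for a workspace role assignment."""
    url = f"https://api.fabric.microsoft.com/v1/workspaces/{workspace_id}/roleAssignments"
    body = json.dumps({
        "principal": {"id": principal_object_id, "type": "ServicePrincipal"},
        "role": role,
    })
    return url, body


url, body = build_role_assignment("<workspace-guid>", "<app-object-id>")
print(url)
```

Note that the principal ID used here is the service principal's object ID in Entra ID, not the Application (Client) ID — mixing the two up is a common cause of "principal not found" errors.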

Step 5 — Create a Mirrored Database

  1. Within your workspace, create a Mirrored Database.
  2. Follow the Fabric prompts to complete the setup.

Step 6 — Copy the Landing Zone Path

  1. Open the mirrored database you just created.
  2. Locate the Landing Zone path.
  3. Copy this value; you will need it in Business Central.

Step 7 — Configure the Connector in Business Central

  1. In Business Central, search for 'Data Export - Export to Azure Data Lake Storage'.
  2. Enter:
    • Tenant ID (from Step 1)
    • Landing Zone Path (from Step 6)
    • Client ID (from Step 1)
    • Client Secret (from Step 1)
  3. Save your changes.
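A quick sanity check on the Landing Zone path before saving can catch copy/paste mistakes. The sketch below assumes the typical OneLake URL shape (host ending in onelake.dfs.fabric.microsoft.com with a LandingZone folder in the path); the exact shape in your tenant may differ:

```python
# Sketch: loosely validating a Landing Zone path before pasting it into
# Business Central. The expected URL shape is an assumption, not a spec.
from urllib.parse import urlparse


def looks_like_landing_zone(path: str) -> bool:
    """Return True if the string resembles a OneLake landing-zone URL."""
    parsed = urlparse(path)
    return (parsed.scheme == "https"
            and parsed.hostname is not None
            and parsed.hostname.endswith("onelake.dfs.fabric.microsoft.com")
            and "LandingZone" in parsed.path)


# Hypothetical example path for illustration only.
ok = looks_like_landing_zone(
    "https://onelake.dfs.fabric.microsoft.com/"
    "MyWorkspace/MyMirroredDb/Files/LandingZone")
print(ok)
```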

Step 8 — Select Tables and Fields

  1. On the 'Data Export - Export to Azure Data Lake Storage' page, locate the Tables section.
  2. Click Add, then select the tables you want to export.
  3. For each table, click Choose fields and select the fields that should be included.

Step 9 — Export Schema Definitions

  • Trigger the Export Schema function to generate the necessary structure in the Data Lake.
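For context on what the schema export produces: Open Mirroring expects each table folder in the landing zone to carry a small _metadata.json file describing the table's key columns. The Export Schema function generates this structure for you; the sketch below only illustrates the general shape of such a file (the "keyColumns" field name follows the Open Mirroring landing-zone format, and the column names are hypothetical — the connector's actual output may differ):

```python
# Sketch: the general shape of an Open Mirroring _metadata.json file.
# Column names are hypothetical; the connector generates the real file.
import json

metadata = {"keyColumns": ["systemId"]}
metadata_json = json.dumps(metadata, indent=2)
print(metadata_json)
```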

Step 10 — Export Data or Schedule Exports

  • Either:
    • Run a manual export, or
    • Create a Job Queue entry to export on a recurring schedule.
Tip

For frequent exports, use Job Queue scheduling to automate the process.