NAV2BC Data Bridge is a web-based migration utility designed to facilitate the transfer of data from legacy Microsoft Dynamics NAV (SQL Server) environments to Microsoft Dynamics 365 Business Central (Cloud).
The application functions as a bridge, allowing users to define data mappings between source SQL tables and target Business Central API resources. It handles the extraction, transformation, and loading (ETL) of data through a centralized web interface.
To host and run the application, the following requirements must be met:
- Operating System: Windows, Linux, or macOS.
- Runtime: Python 3.9 or higher.
- Database Driver: ODBC Driver 18 for SQL Server (must be installed on the machine running the application).
- Network: Access to the legacy SQL Server (typically via VPN) and outbound internet access to Microsoft Azure/Business Central APIs.
1. Install system dependencies: ensure the ODBC Driver 18 for SQL Server is installed. This is required for Python to communicate with the MSSQL database.

2. Download the latest nav2bc wheel, for example `nav2bc_data_bridge-0.1.0-py3-none-any.whl`.

3. Prepare the Python environment. Open a terminal or command prompt in the project directory:

   ```shell
   python -m venv venv
   source venv/bin/activate  # On Windows use: venv\Scripts\activate
   pip install nav2bc_data_bridge-0.1.0-py3-none-any.whl  # Use the full name and extension of the downloaded wheel file
   ```

4. Create a `.env` file (or copy the provided `.env.example`) and add a `SECRET_KEY`. You can generate a random key with:

   ```shell
   python -c 'import secrets; print(secrets.token_hex())'
   ```

5. Launch the application by starting the web server, for example with gunicorn:

   ```shell
   gunicorn --workers 4 --bind 127.0.0.1:8000 nav2bc.app:app  # macOS and Linux only (use waitress, etc. on Windows)
   ```
The application workflow is divided into three main stages: Connections, Mappings, and Execution.
The Connections tab establishes links to the source and target systems.
Legacy NAV (Source):
- Server Host/IP: The IP address or hostname of the SQL Server.
- Port: Default is 1433.
- Database Name: The name of the specific SQL database containing NAV data.
- Authentication: Enter the SQL Username and Password. If your environment uses Windows Authentication, leave these fields blank (this requires the application to run on a Windows machine joined to the domain).
- Connect: Click "Connect" to validate settings. Upon success, select the specific NAV "Company" from the dropdown list.
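Under the hood, these settings are typically combined into an ODBC connection string. The sketch below is illustrative only: the function name and the `TrustServerCertificate` setting are assumptions, not the application's exact internals.

```python
def build_conn_str(host, port, database, user=None, password=None):
    """Build an ODBC Driver 18 connection string for the legacy NAV database.

    Illustrative sketch: falls back to Windows Trusted Authentication
    when no SQL username is supplied, matching the behavior described above.
    """
    base = (f"DRIVER={{ODBC Driver 18 for SQL Server}};"
            f"SERVER={host},{port};DATABASE={database};"
            f"TrustServerCertificate=yes;")
    if user:
        return base + f"UID={user};PWD={password};"
    return base + "Trusted_Connection=yes;"  # Windows Authentication

print(build_conn_str("10.0.0.5", 1433, "NAV2018"))
```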
Business Central (Target):
- Tenant ID: The Azure Active Directory Tenant ID.
- Client ID: The Application (Client) ID registered in Azure.
- Company ID: The 36-character GUID of the target Business Central company.
- Environment: Typically "Production" or "Sandbox".
- Authentication: Click "Log In". A device code will be displayed. You must open the provided Microsoft link in a new tab, enter the code, and sign in. The application will automatically detect when authentication is complete.
Note: Successful connections will lock the input fields. Use the "Disconnect" buttons if you need to change connection details.
The Mappings tab defines how data moves from the source to the destination.
Creating Mappings:
- Source Table: Select a table from the dropdown list (populated from NAV metadata).
- Source Column: Select a column to read data from (populated based on the selected table).
- Target Resource: Select the Business Central API endpoint (populated from BC metadata).
- Target Field: Select the field in Business Central to populate.
- Mapping Type:
- DIRECT: Copies the value as-is.
- CONSTANT: Ignores the source and writes a fixed value defined in the "Rule" field.
- TRANSFORM: Applies logic to the source value (e.g., `regex_remove{[a-zA-Z]}` to strip letters, or value replacement maps).
- Add: Click the "Add" button to append the rule to the global mapping list.
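The three mapping types can be illustrated with a small sketch. The `apply_mapping` function and its handling of the transform syntax are assumptions for illustration, not the application's actual code:

```python
import re

def apply_mapping(value, mapping_type, rule=None):
    """Apply a mapping rule to one source value (illustrative sketch)."""
    if mapping_type == "DIRECT":
        return value                       # copy the value as-is
    if mapping_type == "CONSTANT":
        return rule                        # ignore the source, write the rule
    if mapping_type == "TRANSFORM":
        # Handle the regex_remove{<pattern>} transform: strip all matches.
        m = re.fullmatch(r"regex_remove\{(.*)\}", rule)
        if m:
            return re.sub(m.group(1), "", value)
    raise ValueError(f"Unsupported mapping type: {mapping_type}")

print(apply_mapping("AB1234", "TRANSFORM", "regex_remove{[a-zA-Z]}"))  # → 1234
```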
Validation: The system attempts to validate that the selected tables and fields exist in the connected environments. Warnings will be displayed if a mapping references a field that cannot be found in the current metadata.
Profiles:
- Save Profile: Exports the current list of mappings and prerequisites to a JSON file.
- Load Profile: Imports a JSON file to restore a previous configuration.
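As a rough sketch of the save/load round trip (the JSON layout shown is an assumption; the real profile format may differ):

```python
import json
import os
import tempfile

# Hypothetical profile structure: a list of mappings plus prerequisites.
profile = {
    "mappings": [
        {"source_table": "Customer", "source_column": "Name",
         "target_resource": "customers", "target_field": "displayName",
         "mapping_type": "DIRECT", "rule": None}
    ],
    "prerequisites": [{"type": "DATA_EXISTS", "description": "Country Code 'US'"}],
}

path = os.path.join(tempfile.mkdtemp(), "profile.json")
with open(path, "w") as f:
    json.dump(profile, f, indent=2)   # Save Profile

with open(path) as f:
    restored = json.load(f)           # Load Profile

print(restored == profile)  # → True
```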
Optimization: When ready, click "Optimize & Add to Queue". The system analyzes the global mapping list and groups them into logical jobs based on the Source Table and Target Resource. These jobs are then sent to the Execution tab.
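The grouping step can be sketched as follows; the dictionary keys and field names are illustrative assumptions, not the application's internals:

```python
from collections import defaultdict

# Example mapping list: two mappings share a table/resource pair, so they
# collapse into one job; the third becomes a separate job.
mappings = [
    {"source_table": "Customer", "target_resource": "customers",
     "source_column": "Name", "target_field": "displayName"},
    {"source_table": "Customer", "target_resource": "customers",
     "source_column": "Phone No_", "target_field": "phoneNumber"},
    {"source_table": "Item", "target_resource": "items",
     "source_column": "No_", "target_field": "number"},
]

# Group mappings into jobs keyed by (source table, target resource).
jobs = defaultdict(list)
for m in mappings:
    jobs[(m["source_table"], m["target_resource"])].append(m)

for (table, resource), group in jobs.items():
    print(f"{table} -> {resource}: {len(group)} mapping(s)")
```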
The Execution tab manages the running of migration jobs.
Job Queue: Displays the list of optimized jobs waiting to run. You can remove individual jobs or clear the entire queue.
Prerequisites: You can define pre-flight checks to ensure the target environment is ready before data migration begins. Common checks include:
- DATA_EXISTS: Verifies that specific reference data (e.g., a Country Code 'US') exists in Business Central.
- SETUP_CONFIG: A manual checklist item for the user.
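Conceptually, a DATA_EXISTS check reduces to scanning for a reference record. The sketch below operates on already-fetched records for simplicity; the real check would query the Business Central API:

```python
def data_exists(records, field, expected):
    """Return True if any record has the expected value in the given field."""
    return any(r.get(field) == expected for r in records)

# Example: verify that Country Code 'US' exists before migrating customers.
countries = [{"code": "US"}, {"code": "GB"}]
print(data_exists(countries, "code", "US"))  # → True
```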
Running Migration:
- Preview Batch: Click this button to see a summary of the pending operations. It will display which Business Central resources will be targeted and the total number of records to be written to each.
- RUN BATCH: Starts processing the queue. The application uses background threading to process the queue without freezing the user interface.
Debug Console: A collapsible console at the bottom of the screen displays real-time logs, progress updates, and error messages returned by the SQL Server or Business Central API.
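The background-threading model can be sketched as a worker thread pushing progress messages onto a queue that the UI layer drains; the names below are illustrative assumptions, not the application's internals:

```python
import queue
import threading
import time

progress = queue.Queue()

def run_batch(jobs):
    """Process jobs off the main thread, reporting progress as we go."""
    for i, job in enumerate(jobs, start=1):
        time.sleep(0.01)  # stand-in for the actual migration work
        progress.put(f"{i}/{len(jobs)} done")

worker = threading.Thread(target=run_batch, args=(["job-a", "job-b"],), daemon=True)
worker.start()
worker.join()

# The frontend would poll an endpoint that drains this queue.
messages = []
while not progress.empty():
    messages.append(progress.get())
print(messages)  # → ['1/2 done', '2/2 done']
```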
Architecture:
- Backend: Python Flask acts as the web server and API controller. It handles SQL connections using `SQLAlchemy`/`pyodbc` and HTTP requests to Business Central using the `requests` library.
- Frontend: HTML5, Bootstrap 5, and vanilla JavaScript handle the user interface. State is managed via browser `localStorage` to persist form data across reloads.
- Concurrency: Long-running migration tasks are offloaded to background threads. The frontend polls the server for status updates and logs.
Authentication:
- SQL: Supports SQL Server Authentication and Windows Trusted Authentication.
- Business Central: Uses OAuth 2.0 Device Code Flow. Access tokens are stored securely in server-side sessions.
Data Flow:
- Fetch: Data is streamed from SQL Server in chunks (default 100 rows) to minimize memory usage.
- Transform: Python logic applies transformation rules (regex, value replacement) in memory.
- Push: Data is sent to the Business Central API using OData `$batch` requests.
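The chunked fetch can be sketched with a simple generator; this is a minimal illustration of the batching behavior, not the application's actual streaming code:

```python
def fetch_in_chunks(rows, chunk_size=100):
    """Yield successive chunks of rows, mirroring the streamed fetch."""
    for i in range(0, len(rows), chunk_size):
        yield rows[i:i + chunk_size]

# 250 source rows stream through as two full chunks and one partial chunk.
chunks = list(fetch_in_chunks(list(range(250)), chunk_size=100))
print([len(c) for c in chunks])  # → [100, 100, 50]
```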
See Issues for requested features and urgent updates.
1. Environment setup:

   ```shell
   python -m venv venv
   source venv/bin/activate  # On Windows use: venv\Scripts\activate
   ```

2. Install dependencies from the root directory:

   ```shell
   pip install -r requirements.txt
   ```

3. Build the wheel from the root directory:

   ```shell
   python -m build --wheel
   ```

4. Publish the generated `.whl` file under `dist/` as a new release.
- Batch Size: The application is hardcoded to process data in batches of 100 records to comply with Business Central API limits.
- Complex Joins: The application does not support SQL JOINS. Mappings must be one-to-one between a source table and a target resource.
- Network Latency: Performance is heavily dependent on the network speed between the machine running the bridge, the SQL Server, and the Azure Cloud.
- Schema Changes: If the source SQL schema or target Business Central API schema changes, mapping profiles may become invalid and require updating.
- Authentication Expiry: OAuth tokens have a limited lifespan. Long-running batches may require re-authentication if they exceed the token duration.