
Metadata import with interface tables

Interface Tables allow you to import metadata into Dataedo from any source by populating a set of predefined tables in the Dataedo repository. This is useful when Dataedo doesn't have a native connector for your data source.

Interface tables concept

When to use Interface Tables

Use interface tables when you need to:

  • Document a system without a native connector - custom applications, APIs, file-based data, or legacy databases
  • Automate metadata loading - generate metadata from scripts or CI/CD pipelines and load it automatically
  • Combine metadata from multiple tools - merge metadata from various sources into one documentation entry
  • Include external lineage - document data flows that involve objects outside of Dataedo
  • Import reporting objects - document dashboards, visualizations, and datasets
info

For a UI-based approach without SQL, consider using Copy & Paste Import which uses interface tables under the hood.

For a complete working example, see the Multi-database pipeline tutorial which demonstrates documenting a CRM → Ecommerce → Data Warehouse → Reporting pipeline with cross-database lineage.

How it works

  1. You populate the import_* tables with the metadata of your database objects
  2. You use the Interface Tables connector in Dataedo Desktop to import the data
  3. Dataedo reads the interface tables and creates documentation

Think of each database_name as a documentation "snapshot." When you run an import, Dataedo reads all rows for that database_name and merges them into the documentation:

  • Objects are matched by unique keys (described below). Keep those keys stable over time.
  • If a row disappears, removal depends on import settings (whether you choose to remove missing objects).
  • If a row changes, Dataedo updates the documented object on the next import.
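One common staging pattern (a sketch only, not the only workflow; adapt it to your pipeline) is to clear all rows for a database_name and re-insert the current metadata before each import, keeping the unique keys stable so Dataedo matches the same documented objects each time:

```sql
-- Sketch: refresh the staging rows for one documentation entry.
-- The table and schema names are illustrative.
DELETE FROM import_columns WHERE database_name = 'MyDatabase';
DELETE FROM import_tables  WHERE database_name = 'MyDatabase';

-- Re-insert current metadata with the same stable keys
-- (database_name + schema + name + object_type).
INSERT INTO import_tables
(database_name, table_schema, table_name, object_type, object_subtype, description)
VALUES
('MyDatabase', 'sales', 'customers', 'TABLE', 'TABLE',
'Master table storing customer information.');
```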

For reporting objects (reports, dashboards, visualizations), use the Interface Tables Reporting connector to create a Reporting data source.

Running the import

Once the interface tables in the Dataedo repository are populated, you can run the import using the connector.

Interface tables connector
info

Remember that importing comments or descriptions to Custom Fields is available in advanced settings.

Interface tables reference

You should put your data in the following tables:

Table | Description
import_tables | Tables, views, structures, and datasets
import_columns | Columns for tables, views, structures, reports, and datasets
import_tables_keys_columns | Primary keys and unique keys with their columns
import_tables_foreign_keys_columns | Foreign keys with their columns
import_procedures | Procedures and functions
import_parameters | Procedure and function parameters
import_triggers | Triggers
import_reports | Reports, dashboards, and visualizations
import_data_lineage | Column-level and object-level data lineage
import_data_lineage_processes | Attributes for data processor processes (ETL steps/tasks)
import_linked_sources | External systems for building cross-database lineage

Output tables:

  • import_errors - list of errors from the import process

More details about table schemas are available in Dataedo Repository Schema.

Valid object types

Before using interface tables, understand what object types are valid in a given context.

Quick reference

Context | Valid Values
Tables (import_tables.object_type) | TABLE, VIEW, STRUCTURE, DATASET
Reports (import_reports.object_type) | REPORT, DASHBOARD, VISUALIZATION
Procedures (import_procedures.object_type) | PROCEDURE, FUNCTION
Triggers (import_triggers.trigger_type) | TRIGGER
Function types | Scalar-valued, Table-valued, Aggregate

Keys and parameters do not follow the ObjectType/ObjectSubtype pattern:

Context | Field | Valid Values
Key types (import_tables_keys_columns) | key_type | PK, UK
Parameter modes (import_parameters) | parameter_mode | IN, OUT, INOUT
useful tip

For most use cases, set object_subtype to match object_type (e.g., TABLE/TABLE, VIEW/VIEW). Use specific subtypes like MATERIALIZED_VIEW or EXTERNAL_TABLE when you need more granular classification.
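As a concrete illustration (the schema and view names here are hypothetical), a materialized view keeps object_type = 'VIEW' and carries the detail in object_subtype:

```sql
-- Hypothetical example: object_type stays VIEW, the subtype adds granularity
INSERT INTO import_tables
(database_name, table_schema, table_name, object_type, object_subtype, description)
VALUES
('MyDatabase', 'reports', 'mv_daily_sales', 'VIEW', 'MATERIALIZED_VIEW',
'Materialized daily sales aggregate.');
```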

Object subtypes by import table

Database objects → import_tables

object_type | Common Subtypes | Use Case
TABLE | TABLE, EXTERNAL_TABLE, FOREIGN_TABLE, FILE_TABLE, SYSTEM_VERSIONED_TABLE, HISTORY_TABLE, COLLECTION, FLAT_FILE, ICEBERG_TABLE | Database tables
VIEW | VIEW, MATERIALIZED_VIEW, INDEXED_VIEW, DYNAMIC_TABLE, NAMED_QUERY, PROJECTION, EPHEMERAL, INCREMENTAL, SNAPSHOT | Views and virtual tables
STRUCTURE | STRUCTURE, CSV, JSON, XML, PARQUET, ORC, AVRO_RECORD, EXCEL_TABLE, DELIMITED_TEXT, DELTA_LAKE, STAGE | File-based and semi-structured data
DATASET | DATASET, DATASET_TABLE, TABLEAU_PUBLISHED_DATASOURCE, TABLEAU_EMBEDDED_DATASOURCE | BI datasets

Code objects → import_procedures

object_type | Common Subtypes | Use Case
PROCEDURE | PROCEDURE, CLR_PROCEDURE, EXTENDED_PROCEDURE | Stored procedures
FUNCTION | FUNCTION, CLR_FUNCTION, EXTERNAL_FUNCTION | Functions

BI objects → import_reports

object_type | Common Subtypes | Use Case
REPORT | REPORT, POWER_BI_REPORT, TABLEAU_REPORT, SSRS_REPORT, PAGINATED_REPORT, TABLEAU_WORKSHEET | Reports and paginated reports
DASHBOARD | DASHBOARD | Dashboards
VISUALIZATION | VISUALIZATION | Charts, visuals, widgets

Parameter modes

Use in import_parameters.parameter_mode:

Value | Description
IN | Input parameter (data flows into the procedure)
OUT | Output parameter (data flows out of the procedure)
INOUT | Input/output parameter (bidirectional)

Column subtypes

Use in import_columns.object_subtype:

Subtype | Description
COLUMN | Standard column (default)
FIELD | Document or record field
ATTRIBUTE | Object attribute
COMPUTED | Computed/calculated column
MEASURE | BI measure (aggregation)
HIERARCHY | Hierarchy level
USER_TYPE | User-defined data type

Trigger subtypes

Use in import_triggers.trigger_type. The object_type is always TRIGGER:

Subtype | Description
TRIGGER | Standard DML trigger
CLR_TRIGGER | CLR-based trigger (.NET)
RULE | Database rule

Object identification

Objects are identified by a combination of columns that form a unique key:

Object Type | Unique Key
Table/View | database_name + table_schema + table_name + object_type
Column | database_name + object_schema + object_name + object_object_type + column_name + column_path
Procedure | database_name + procedure_schema + procedure_name + object_type
Report | database_name + report_schema + report_name + object_type
Foreign Key | database_name + foreign_table_* + primary_table_* + column names

The database_name groups all objects that belong to the same documentation entry.

Importing tables and views

Use import_tables to document tables, views, structures, and datasets:

INSERT INTO import_tables
(database_name, table_schema, table_name, object_type, object_subtype, description)
VALUES
('MyDatabase', 'sales', 'customers', 'TABLE', 'TABLE',
'Master table storing customer information.'),
('MyDatabase', 'sales', 'orders', 'TABLE', 'TABLE',
'Stores all customer orders with status tracking.'),
('MyDatabase', 'reports', 'vw_order_summary', 'VIEW', 'VIEW',
'Aggregated view showing order totals per customer.');

See Object subtypes for all valid object_type and object_subtype combinations.

Key columns:

  • database_name - groups objects into one documentation entry
  • table_schema - schema/namespace (can be NULL)
  • object_type - TABLE, VIEW, STRUCTURE, or DATASET
  • object_subtype - more specific type (e.g., MATERIALIZED_VIEW, EXTERNAL_TABLE, CSV, JSON)
  • definition - SQL source code for views or other executable objects

Importing columns

Use import_columns to document columns:

INSERT INTO import_columns
(database_name, object_schema, object_name, object_object_type,
column_name, ordinal_position, datatype, nullable, is_identity, is_computed, description)
VALUES
('MyDatabase', 'sales', 'customers', 'TABLE',
'customer_id', 1, 'int', 0, 1, 0, 'Unique customer identifier.'),
('MyDatabase', 'sales', 'customers', 'TABLE',
'email', 2, 'varchar(255)', 0, 0, 0, 'Customer email address.'),
('MyDatabase', 'sales', 'customers', 'TABLE',
'full_name', 3, 'varchar(200)', 0, 0, 0, 'Customer full name.');

Key columns:

  • object_object_type - must match object_type from import_tables
  • ordinal_position - column order (1, 2, 3...)
  • nullable - 0 = NOT NULL, 1 = NULL allowed
  • is_identity - 1 for auto-increment columns

Nested columns for JSON structures

For JSON, XML, or nested structures, use column_path and column_level:

INSERT INTO import_columns
(database_name, object_schema, object_name, object_object_type,
column_name, column_path, column_level, ordinal_position, datatype, nullable, is_identity, is_computed, description)
VALUES
-- Root level (column_level = 1)
('MyDatabase', 'api', 'customer_payload', 'STRUCTURE',
'profile', NULL, 1, 1, 'object', 0, 0, 0, 'Nested profile object'),
-- Inside profile (column_level = 2)
('MyDatabase', 'api', 'customer_payload', 'STRUCTURE',
'first_name', 'profile', 2, 1, 'string', 0, 0, 0, 'First name from profile'),
-- Deeper nesting (column_level = 3)
('MyDatabase', 'api', 'customer_payload', 'STRUCTURE',
'city', 'profile.address', 3, 1, 'string', 1, 0, 0, 'City from address');

Importing keys and relationships

Primary and unique keys

INSERT INTO import_tables_keys_columns
(database_name, table_schema, table_name, table_object_type,
key_name, key_type, disabled, column_name, column_order, description)
VALUES
('MyDatabase', 'sales', 'customers', 'TABLE',
'PK_customers', 'PK', 0, 'customer_id', 1, 'Primary key'),
('MyDatabase', 'sales', 'customers', 'TABLE',
'UK_customers_email', 'UK', 0, 'email', 1, 'Email must be unique');

Foreign keys

INSERT INTO import_tables_foreign_keys_columns
(database_name,
foreign_table_schema, foreign_table_name, foreign_table_object_type,
primary_table_schema, primary_table_name, primary_table_object_type,
foreign_column_name, primary_column_name,
column_pair_order, key_name, description)
VALUES
('MyDatabase',
'sales', 'orders', 'TABLE',
'sales', 'customers', 'TABLE',
'customer_id', 'customer_id',
1, 'FK_orders_customers', 'Links order to customer');

Importing procedures and functions

INSERT INTO import_procedures
(database_name, procedure_schema, procedure_name,
object_type, object_subtype, function_type, language, definition, description)
VALUES
('MyDatabase', 'sales', 'sp_create_order',
'PROCEDURE', 'PROCEDURE', NULL, 'T-SQL',
'CREATE PROCEDURE [sales].[sp_create_order] ...',
'Creates a new order from cart items.');

Procedure parameters

Use import_parameters to document procedure and function parameters:

INSERT INTO import_parameters
(database_name, object_schema, object_name, object_object_type,
parameter_name, ordinal_position, parameter_mode, datatype, description)
VALUES
('MyDatabase', 'sales', 'sp_create_order', 'PROCEDURE',
'@customer_id', 1, 'IN', 'int', 'Customer placing the order'),
('MyDatabase', 'sales', 'sp_create_order', 'PROCEDURE',
'@order_date', 2, 'IN', 'date', 'Order date (defaults to today)'),
('MyDatabase', 'sales', 'sp_create_order', 'PROCEDURE',
'@order_id', 3, 'OUT', 'int', 'Returns the new order ID');

Parameter modes: IN (input), OUT (output), INOUT (both). Function return values are mapped as OUT. See Parameter modes for details.

Importing triggers

Use import_triggers to document table triggers:

INSERT INTO import_triggers
(database_name, table_schema, table_name, table_object_type,
trigger_name, trigger_type, after, on_insert, on_update, definition, description)
VALUES
('MyDatabase', 'sales', 'orders', 'TABLE',
'trg_orders_audit', 'TRIGGER', 1, 1, 1,
'CREATE TRIGGER trg_orders_audit ON sales.orders AFTER INSERT, UPDATE ...',
'Logs order changes to audit table');

Trigger types: TRIGGER (standard DML), CLR TRIGGER (.NET-based), RULE. See Trigger types for details.

Timing flags: Set before, after, or instead_of to 1. Event flags: Set on_insert, on_update, on_delete to 1 for applicable events.

Importing reports and dashboards

Use import_reports for reports, dashboards, and visualizations. Use the Interface Tables Reporting connector to import these.

INSERT INTO import_reports
(database_name, report_schema, report_name, object_type, object_subtype,
external_id, url, description)
VALUES
('MyReports', 'dashboards', 'executive_sales', 'DASHBOARD', 'DASHBOARD',
'dash-001', 'https://bi.example.com/dashboards/executive-sales',
'Executive sales dashboard for leadership.'),
('MyReports', 'dashboards', 'sales_chart', 'VISUALIZATION', 'VISUALIZATION',
'vis-001', 'https://bi.example.com/dashboards/executive-sales#chart',
'Revenue trend chart.');

Key columns:

  • object_type - REPORT, DASHBOARD, or VISUALIZATION
  • external_id - stable identifier (required for lineage matching)
  • parent_object_* - links visualizations to their parent dashboard
info

Dataset placement: DATASET objects go in import_tables (not import_reports), but use the Reporting connector to import them.

See BI objects for all valid subtypes.
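To nest a visualization under its dashboard, point the parent_object_* columns at the dashboard row. This sketch reuses the executive_sales dashboard from the example above; the visualization name and its external_id are illustrative:

```sql
-- Illustrative: link a visualization to its parent dashboard
-- (parent identified by external_id plus schema/name/type)
INSERT INTO import_reports
(database_name, report_schema, report_name, object_type, object_subtype,
external_id, parent_object_external_id,
parent_object_schema, parent_object_name, parent_object_type, description)
VALUES
('MyReports', 'dashboards', 'orders_by_region', 'VISUALIZATION', 'VISUALIZATION',
'vis-002', 'dash-001',
'dashboards', 'executive_sales', 'DASHBOARD',
'Orders-by-region visual on the executive sales dashboard.');
```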

Data lineage

Data lineage tracks how data flows from source to target through a processor (transformation).

Lineage model

[Source] --> [Processor] --> [Target]
  • Source: Where data comes from (table, column)
  • Processor: The object that performs the transformation (view, procedure, report, or ETL step)
  • Target: Where data goes to (table, column)

The processor is required. If you don't have a separate transformation object, use the target object as the processor.

Importing lineage

INSERT INTO import_data_lineage
(source_database_name, source_object_schema, source_object_name, source_object_type, source_column_name,
processor_database_name, processor_object_schema, processor_object_name, processor_object_type, processor_process_name,
target_database_name, target_object_schema, target_object_name, target_object_type, target_column_name,
transformation, description, creation_date)
VALUES
('SourceDB', 'dbo', 'customers', 'TABLE', 'email',
'TargetDB', 'dim', 'dim_customer', 'TABLE', 'Load Customer Dimension',
'TargetDB', 'dim', 'dim_customer', 'TABLE', 'email',
'LOWER(TRIM(email))', 'Email normalized to lowercase', GETDATE());

Key points:

  • processor_* columns are required; rows without processor_object_name are skipped
  • The processor must belong to the database_name you are importing
  • processor_process_name (optional) groups flows into named steps under the same processor
  • Column-level lineage requires both source and target columns; otherwise, it's object-level
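The named step itself can carry its own attributes (language, script, execution order) via import_data_lineage_processes. This sketch reuses the processor and process name from the lineage example above; the script text is illustrative:

```sql
-- Optional: attributes for the named ETL step under the processor
INSERT INTO import_data_lineage_processes
(processor_database_name, processor_object_schema, processor_object_name,
processor_object_type, processor_process_name,
language, script, process_order, description)
VALUES
('TargetDB', 'dim', 'dim_customer', 'TABLE', 'Load Customer Dimension',
'SQL', 'INSERT INTO dim.dim_customer (email) SELECT LOWER(TRIM(email)) FROM dbo.customers;', 1,
'Nightly load step for the customer dimension.');
```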

Object matching for lineage

Objects in lineage are matched using this priority:

  1. If external_id is provided → match by external_id + object_type + database_name
  2. If no external_id → match by schema + name + object_type + database_name
caution

Important: When the target object has an external_id in import_tables or import_reports, the lineage row must include the matching external_id column. Schema+name matching only works when both the lineage row and the target object have NULL external_id. This commonly affects DATASET, DASHBOARD, and VISUALIZATION objects.

Example with external_id:

-- When the target DATASET has external_id = 'ds-001', lineage must include it
INSERT INTO import_data_lineage
(source_database_name, source_object_schema, source_object_name, source_object_type,
processor_database_name, processor_object_schema, processor_object_name,
processor_object_external_id, processor_object_type, processor_process_name,
target_database_name, target_object_schema, target_object_name,
target_object_external_id, target_object_type,
description, creation_date)
VALUES
('Analytics_DW', 'reports', 'rpt_daily_sales', 'VIEW',
'Reporting_Catalog', 'datasets', 'daily_sales_data',
'ds-001', 'DATASET', 'Ingest Sales Data',
'Reporting_Catalog', 'datasets', 'daily_sales_data',
'ds-001', 'DATASET',
'Dataset ingests from warehouse view', GETDATE());

Linked sources

Linked sources represent external databases or systems that participate in lineage but are not fully documented in Dataedo.

INSERT INTO import_linked_sources
(database_name, source_database_id, linked_source_name, external_id, sql_dialect,
connection_details, default_schema, description)
VALUES
('MyDatabase', NULL, 'External_API', 'api-001', 'API',
'https://api.example.com', NULL,
'External API that provides customer data');

Use linked source references in lineage by setting source_database_linked_source_external_id or source_database_linked_source_name with a NULL source_database_name.
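For example (object names are illustrative), a flow from the External_API linked source into a documented table leaves source_database_name NULL and names the linked source instead; since the API has no separate transformation object, the target doubles as the processor:

```sql
-- Illustrative: lineage from a linked source into a documented table
INSERT INTO import_data_lineage
(source_database_name, source_database_linked_source_name,
source_object_schema, source_object_name, source_object_type,
processor_database_name, processor_object_schema, processor_object_name, processor_object_type,
target_database_name, target_object_schema, target_object_name, target_object_type,
description)
VALUES
(NULL, 'External_API',
NULL, 'customer_feed', 'TABLE',
'MyDatabase', 'sales', 'customers', 'TABLE',
'MyDatabase', 'sales', 'customers', 'TABLE',
'Customer records ingested from the external API.');
```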

Loading data via Public API

Instead of inserting data directly into the repository with SQL, you can send it as JSON over HTTP using the Dataedo Public API. No direct database access is needed.

This approach is useful for SaaS users, organizations with strict security requirements, or when integrating with CI/CD pipelines.

The API supports the same entity types as those listed above. For the full endpoint reference, see the Public API documentation.

How it works

A typical automation flow looks like this:

  1. Extract metadata from your source (parse DDL, query catalog, read files, etc.)
  2. Clean the staging area — POST /public/v1/interface-tables/delete
  3. Load metadata — use the corresponding POST endpoints for each entity type, for example:
    • POST /public/v1/interface-tables/tables
    • POST /public/v1/interface-tables/columns
    • POST /public/v1/interface-tables/keys
    • (and so on for foreign keys, procedures, data lineage, etc.)
  4. Trigger the import — choose one of:
    • POST /public/v1/scheduler/tasks/{taskId}/trigger — trigger an existing scheduler task
    • POST /public/v1/data-sources/{id}/import-connector — one-time import by data source ID
  5. Check status — GET /public/v1/scheduler/executions/{executionId}

API details

  • Authentication: JWT Bearer token in the Authorization header
  • Base path: /public/v1/interface-tables/

Importing from Interface Tables

Custom fields

All interface tables include 100 custom fields (field1 through field100) for storing additional metadata. These map to Custom Fields in Dataedo.

Common uses:

  • field1: Status (Active, Deprecated, Planned)
  • field2: Data classification (PII, Confidential, Public)
  • field3: Owner/Team
info

Enable custom field import in Dataedo's advanced import settings.
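As an example (the field-to-meaning mapping below is your own convention, configured in Dataedo, not fixed by the schema), the common uses above could be populated like this:

```sql
-- field1 = status, field2 = data classification, field3 = owning team
-- (an assumed mapping; configure the matching Custom Fields in Dataedo)
INSERT INTO import_tables
(database_name, table_schema, table_name, object_type, object_subtype,
description, field1, field2, field3)
VALUES
('MyDatabase', 'sales', 'customers', 'TABLE', 'TABLE',
'Master table storing customer information.',
'Active', 'PII', 'Sales Data Team');
```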

Resolving errors and warnings

If errors or warnings occur, you can either ignore them or fix the underlying data and run the import again. All errors are displayed during import and also saved to the corresponding import_* tables.

Import errors

Checking for errors

After import, check for errors:

-- Check import_errors table
SELECT * FROM import_errors
WHERE creation_date > DATEADD(hour, -1, GETDATE())
ORDER BY creation_date DESC;

-- Check individual tables for row-level errors
SELECT database_name, table_schema, table_name, error_message
FROM import_tables
WHERE error_failed = 1;

Common errors and warnings

Condition | Result | Solution
Missing required field | Error - row skipped | Populate required fields (database_name, table_name, etc.)
Duplicate unique key | Error - row skipped | Update existing row or use different key values
Processor object not found | Error - lineage row skipped | Add processor object to import_tables, import_procedures, or import_reports
Source/target object not found | Error - lineage row skipped | Add the referenced object or check schema/name/type spelling
FK references table that doesn't exist | Error - FK row skipped | Add the referenced table first
No processor_process_name | Warning - assigned to default process | Add processor_process_name to group flows under named steps
Only source OR target column provided | Warning - object-level lineage only | Provide both source_column_name and target_column_name for column-level lineage
Column not found in existing table | Warning - object-level lineage only | Check column name spelling or add column to parent object
Unrecognized trigger_type value | Warning - stored as-is | Use valid values: TRIGGER, CLR TRIGGER, RULE
Unrecognized object_type value | Warning - stored as-is | Use valid values (TABLE, VIEW, PROCEDURE, etc.)

Sample data

To help you get started, the repository includes a sample interface database so you can try an import right away.

Sample interface database

Quick reference: Required fields

Table | Required Fields | Notes
import_tables | database_name, table_name, object_type | object_type defaults to 'TABLE'
import_columns | database_name, object_name, object_object_type, column_name, nullable, is_identity, is_computed | Bit fields default to 0
import_tables_keys_columns | database_name, table_name, key_name, key_type, column_name | key_type: 'PK' or 'UK'
import_tables_foreign_keys_columns | database_name, foreign_table_name, primary_table_name, foreign_column_name, primary_column_name |
import_procedures | database_name, procedure_name | object_type defaults to 'PROCEDURE'
import_parameters | database_name, object_name, object_object_type, parameter_name |
import_triggers | database_name, table_name, trigger_name | Bit flags default to 0
import_reports | database_name, report_name, object_type | object_type defaults to 'REPORT'
import_data_lineage | source_object_name, source_object_type, target_object_name, target_object_type, processor_object_name, processor_object_type | Database names can be NULL if linked source provided
import_data_lineage_processes | processor_database_name, processor_object_name, processor_process_name |
import_linked_sources | database_name, linked_source_name |

Column reference

import_tables

Column | Type | Required | Description
database_name | nvarchar(500) | Yes | Groups objects into one documentation entry
table_schema | nvarchar(500) | No | Schema/namespace name
table_name | nvarchar(500) | Yes | Object name
external_id | nvarchar(1000) | No | Stable external identifier for matching
object_type | nvarchar(100) | Yes | TABLE, VIEW, STRUCTURE, or DATASET
object_subtype | nvarchar(100) | No | More specific type (CSV, JSON, PARQUET, etc.)
description | nvarchar(max) | No | Object documentation
definition | nvarchar(max) | No | SQL definition or source code
location | nvarchar(max) | No | Physical location or file path
language | nvarchar(100) | No | Programming/query language
dbms_created | datetime | No | Creation timestamp from source system
dbms_last_modified | datetime | No | Last modification timestamp
field1 - field100 | nvarchar(max) | No | Custom fields

import_columns

Column | Type | Required | Description
database_name | nvarchar(500) | Yes | Parent database
object_external_id | nvarchar(1000) | No | External ID of parent object (use when parent has external_id)
object_schema | nvarchar(500) | No | Parent object schema
object_name | nvarchar(500) | Yes | Parent object name
object_object_type | nvarchar(100) | Yes | Parent object type (TABLE, VIEW, REPORT, etc.)
column_name | nvarchar(500) | Yes | Column/field name
column_path | nvarchar(max) | No | Path for nested structures (dot notation)
column_level | int | No | Nesting depth (1 = root)
ordinal_position | int | No | Column order (1-based)
datatype | nvarchar(250) | No | Data type (int, varchar(255), etc.)
nullable | bit | Yes | 0 = NOT NULL, 1 = NULL allowed
is_identity | bit | Yes | 1 = auto-increment/identity
is_computed | bit | Yes | 1 = computed/calculated column
computed_formula | nvarchar(max) | No | Formula for computed columns
description | nvarchar(max) | No | Column documentation
field1 - field100 | nvarchar(max) | No | Custom fields

import_procedures

Column | Type | Required | Description
database_name | nvarchar(500) | Yes | Parent database
procedure_schema | nvarchar(500) | No | Schema name
procedure_name | nvarchar(500) | Yes | Procedure/function name
external_id | nvarchar(1000) | No | External system identifier
object_type | nvarchar(100) | No | PROCEDURE or FUNCTION (default: PROCEDURE)
function_type | nvarchar(100) | No | Scalar-valued, Table-valued, Aggregate
language | nvarchar(100) | No | SQL, Python, etc.
definition | nvarchar(max) | No | Full source code
description | nvarchar(max) | No | Documentation
field1 - field100 | nvarchar(max) | No | Custom fields

import_reports

Column | Type | Required | Description
database_name | nvarchar(500) | Yes | Documentation entry
report_schema | nvarchar(500) | No | Schema/folder/workspace
report_name | nvarchar(500) | Yes | Report name
external_id | nvarchar(1000) | No | External system ID
url | nvarchar(2000) | No | Report URL
embed_url | nvarchar(2000) | No | Embedded report URL
object_type | nvarchar(100) | Yes | REPORT, DASHBOARD, or VISUALIZATION
description | nvarchar(max) | No | Documentation
parent_object_external_id | nvarchar(1000) | No | Parent report/dashboard ID
parent_object_schema | nvarchar(500) | No | Parent schema
parent_object_name | nvarchar(500) | No | Parent name
parent_object_type | nvarchar(100) | No | Parent type
field1 - field100 | nvarchar(max) | No | Custom fields

import_data_lineage

Column | Type | Required | Description
source_database_name | nvarchar(250) | Conditional | Source database (or use linked source)
source_database_linked_source_external_id | nvarchar(1000) | No | Source via linked source external ID
source_database_linked_source_name | nvarchar(500) | No | Source via linked source name
source_object_schema | nvarchar(250) | No | Source object schema
source_object_name | nvarchar(250) | Yes | Source object name
source_object_external_id | nvarchar(128) | No | Source object external ID
source_object_type | nvarchar(100) | Yes | Source object type
source_column_name | nvarchar(250) | No | Source column (for column-level lineage)
source_column_external_id | nvarchar(128) | No | Source column external ID
processor_database_name | nvarchar(250) | Yes | Processor database
processor_object_schema | nvarchar(250) | No | Processor schema
processor_object_name | nvarchar(250) | Yes | Processor object name
processor_object_external_id | nvarchar(128) | No | Processor external ID
processor_object_type | nvarchar(100) | Yes | Processor type
processor_process_name | nvarchar(250) | No | Named process (groups flows)
processor_process_external_id | nvarchar(1000) | No | Process external ID
target_database_name | nvarchar(250) | Conditional | Target database (or use linked source)
target_database_linked_source_external_id | nvarchar(1000) | No | Target via linked source external ID
target_database_linked_source_name | nvarchar(500) | No | Target via linked source name
target_object_schema | nvarchar(250) | No | Target schema
target_object_name | nvarchar(250) | Yes | Target object name
target_object_external_id | nvarchar(128) | No | Target external ID
target_object_type | nvarchar(100) | Yes | Target type
target_column_name | nvarchar(250) | No | Target column (for column-level lineage)
target_column_external_id | nvarchar(128) | No | Target column external ID
transformation | nvarchar(max) | No | Transformation expression/description
description | nvarchar(max) | No | Lineage documentation
script | nvarchar(max) | No | Full SQL/script source
line_from | int | No | Script line start
line_to | int | No | Script line end
char_from | int | No | Character position start
char_to | int | No | Character position end
is_direct | bit | No | 1 = direct lineage (no transformation)
field1 - field100 | nvarchar(max) | No | Custom fields

import_parameters

Column | Type | Required | Description
database_name | nvarchar(500) | Yes | Parent database
object_external_id | nvarchar(1000) | No | Parent procedure external ID
object_schema | nvarchar(500) | No | Parent procedure schema
object_name | nvarchar(500) | Yes | Parent procedure name
object_object_type | nvarchar(100) | Yes | PROCEDURE or FUNCTION
parameter_name | nvarchar(500) | Yes | Parameter name
ordinal_position | int | No | Parameter order (0 for return value)
parameter_mode | nvarchar(10) | No | IN, OUT, INOUT
datatype | nvarchar(250) | No | Parameter data type
data_length | nvarchar(50) | No | Length specification
description | nvarchar(max) | No | Documentation
field1 - field100 | nvarchar(max) | No | Custom fields

import_triggers

Column | Type | Required | Description
database_name | nvarchar(500) | Yes | Parent database
table_schema | nvarchar(500) | No | Table schema
table_name | nvarchar(500) | Yes | Table name
table_object_type | nvarchar(100) | No | Table object type
trigger_name | nvarchar(500) | Yes | Trigger name
trigger_type | nvarchar(100) | No | TRIGGER, CLR TRIGGER, or RULE
definition | nvarchar(max) | No | Trigger source code
before | bit | No | 1 = BEFORE trigger
after | bit | No | 1 = AFTER trigger
instead_of | bit | No | 1 = INSTEAD OF trigger
on_insert | bit | No | 1 = fires on INSERT
on_update | bit | No | 1 = fires on UPDATE
on_delete | bit | No | 1 = fires on DELETE
disabled | bit | No | 1 = trigger is disabled
description | nvarchar(max) | No | Documentation
field1 - field100 | nvarchar(max) | No | Custom fields

import_tables_keys_columns

Column | Type | Required | Description
database_name | nvarchar(500) | Yes | Parent database
table_schema | nvarchar(500) | No | Table schema
table_name | nvarchar(500) | Yes | Table name
table_object_type | nvarchar(100) | No | Object type
key_name | nvarchar(500) | Yes | Constraint name
key_type | nvarchar(100) | Yes | PK (primary) or UK (unique)
description | nvarchar(max) | No | Documentation
disabled | bit | No | 1 = constraint is disabled
column_name | nvarchar(500) | Yes | Column in the key
column_path | nvarchar(max) | No | Path for nested columns
column_order | int | No | Column order in composite key

import_tables_foreign_keys_columns

Column | Type | Required | Description
database_name | nvarchar(500) | Yes | Parent database
foreign_table_schema | nvarchar(500) | No | FK table schema
foreign_table_name | nvarchar(500) | Yes | FK table name (child)
foreign_table_object_type | nvarchar(100) | No | FK table type
primary_table_schema | nvarchar(500) | No | Referenced table schema
primary_table_name | nvarchar(500) | Yes | Referenced table name (parent)
primary_table_object_type | nvarchar(100) | No | Referenced table type
foreign_column_name | nvarchar(500) | Yes | FK column name
foreign_column_path | nvarchar(max) | No | Path for nested columns
primary_column_name | nvarchar(500) | Yes | Referenced column name
primary_column_path | nvarchar(max) | No | Path for nested columns
column_pair_order | int | No | Order in composite FK
key_name | nvarchar(500) | No | FK constraint name
description | nvarchar(max) | No | Documentation

import_data_lineage_processes

Column | Type | Required | Description
processor_database_name | nvarchar(250) | Yes | Database containing process
processor_object_schema | nvarchar(250) | No | Schema
processor_object_name | nvarchar(250) | Yes | Processor object name
processor_object_external_id | nvarchar(1000) | No | Processor external ID
processor_object_type | nvarchar(100) | No | Processor type
processor_process_name | nvarchar(250) | Yes | Process name
processor_process_external_id | nvarchar(1000) | No | Process external ID
description | nvarchar(max) | No | Process documentation
language | nvarchar(100) | No | SQL, Python, etc.
script | nvarchar(max) | No | Process code
process_order | int | No | Execution order

import_linked_sources

Column | Type | Required | Description
database_name | nvarchar(500) | Yes | Interface tables database
source_database_id | int | No | Dataedo database ID to link to
external_id | nvarchar(1000) | No | External system identifier
linked_source_name | nvarchar(500) | Yes | Name for lineage references
connection_details | nvarchar(max) | No | Connection string/URL
sql_dialect | nvarchar(500) | No | Database type
default_schema | nvarchar(500) | No | Default schema
description | nvarchar(max) | No | Documentation
field1 - field100 | nvarchar(max) | No | Custom fields

import_linked_source_configuration

Maps linked sources to existing Dataedo databases.

Column | Type | Required | Description
database_name | nvarchar(500) | Yes | Interface tables database
linked_source_name | nvarchar(500) | Yes | Linked source name
linked_source_external_id | nvarchar(1000) | No | Linked source external ID
source_database_id | int | Yes | Dataedo database ID