
Insert Table

Inserts table data into a Microsoft SQL Server database table. This node is optimized for bulk inserts of structured data.

Common Properties

  • Name - The custom name of the node.
  • Color - The custom color of the node.
  • Delay Before (sec) - Waits in seconds before executing the node.
  • Delay After (sec) - Waits in seconds after executing the node.
  • Continue On Error - Automation will continue regardless of any error. The default value is false.
info

If the Continue On Error property is set to true, no error is caught when the project is executed, even if a Catch node is used.

Inputs

  • Connection Id - The unique identifier of the database connection to use (optional if credentials provided directly).
  • Transaction Id - The unique identifier of a transaction (optional). Use when the insert is part of a database transaction.
  • Table Name - The name of the database table to insert data into. Can include schema (e.g., dbo.TableName).
  • Table Data - The table data to insert, with columns and rows in table structure format.

Options

  • Ignore Conflicts - When enabled, ignores constraint violations without throwing an error. Silently skips rows that violate UNIQUE/PRIMARY KEY or FOREIGN KEY constraints.
  • Credentials - SQL Server database credentials (optional). Use this if you want to insert without a Connect node.

Output

  • None - The data is inserted into the database, but no results are returned.

How It Works

The Insert Table node inserts structured table data into a SQL Server table. When executed, the node:

  1. Retrieves the database connection (from the connection ID, or creates a new one from the supplied credentials)
  2. Optionally retrieves the transaction if a transaction ID is provided
  3. Validates the table name and table data
  4. For each row in the table data:
    • Constructs a parameterized INSERT statement based on the columns (see the sketch after this list)
    • Adds column values as parameters to prevent SQL injection
    • Executes the INSERT statement
    • If Ignore Conflicts is enabled, catches and ignores unique constraint violations (error 2627) and foreign key violations (error 547)
  5. Applies all changes to the database
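
Conceptually, step 4 turns each row into a parameterized statement. A minimal JavaScript sketch of the idea (illustrative only; the node does this internally, and buildInsert is not a helper you can call):

// Illustrative sketch: building a parameterized INSERT for one row
function buildInsert(tableName, columns, row) {
  const columnList = columns.map(c => `[${c}]`).join(', ');
  const paramList = columns.map((_, i) => `@p${i}`).join(', ');
  const parameters = columns.map((c, i) => ({ name: `@p${i}`, value: row[c] ?? null }));
  return {
    sql: `INSERT INTO ${tableName} (${columnList}) VALUES (${paramList})`,
    parameters // values travel as parameters, never concatenated into the SQL text
  };
}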

Requirements

  • Either a valid connection ID from a Connect node or valid database credentials
  • Valid table name that exists in the database
  • Table data with at least one row
  • Table columns must match database table columns
  • Appropriate INSERT permission on the table

Error Handling

The node will return specific errors in the following cases:

  • ErrInvalidArg - Table name is empty or table data is empty
  • ErrNotFound - Connection ID not found
  • ErrConnection - Cannot connect to SQL Server (when using credentials mode)
  • ErrRuntime - Insert operation failed (table doesn't exist, column mismatch, constraint violation, etc.)

Table Data Format

The table data must be in the following structure:

{
  "columns": ["CustomerName", "Email", "City", "Country"],
  "rows": [
    {"CustomerName": "John Doe", "Email": "john@example.com", "City": "New York", "Country": "USA"},
    {"CustomerName": "Jane Smith", "Email": "jane@example.com", "City": "Los Angeles", "Country": "USA"},
    {"CustomerName": "Bob Johnson", "Email": "bob@example.com", "City": "Chicago", "Country": "USA"}
  ]
}

Usage Examples

Insert from CSV

Read CSV → Insert Table

The CSV node outputs data in table format, which can be directly used:

// CSV output is automatically in table format
$item.table = {
  columns: ["ProductName", "Category", "Price", "Stock"],
  rows: [...]
}

Insert from Excel

Excel Open → Excel Read Range → Insert Table

Excel nodes also output table format:

// Excel range data
$item.range = {
  columns: ["CustomerName", "Email", "Phone", "City"],
  rows: [...]
}

Insert from Web Scraping

Browser Scrape Table → Transform Data → Insert Table

// Transform scraped data to table format
const table = {
  columns: Object.keys($item.scrapedData[0]),
  rows: $item.scrapedData
};

Insert from API Response

HTTP Request → Transform to Table → Insert Table

// Convert API JSON array to table format
const apiData = $item.response.data;
const table = {
  columns: ["OrderID", "CustomerID", "Total", "Status"],
  rows: apiData.map(order => ({
    OrderID: order.id,
    CustomerID: order.customer_id,
    Total: order.total,
    Status: order.status
  }))
};

Common Use Cases

  1. Bulk Data Import - Import data from CSV, Excel, or external APIs into SQL Server
  2. Web Scraping Storage - Store scraped data in SQL Server tables
  3. Data Migration - Move data between systems or databases
  4. ETL Operations - Extract, transform, and load data into SQL Server
  5. Report Data Storage - Save generated report data to database
  6. Batch Processing - Insert multiple records efficiently in one operation
  7. Data Synchronization - Sync data from external sources to SQL Server
  8. Archive Old Data - Copy historical data to archive tables

Ignore Conflicts Mode

When the Ignore Conflicts option is enabled, the node silently ignores these SQL Server errors:

  • Error 2627 - Violation of UNIQUE KEY constraint
  • Error 547 - Foreign key constraint violation

This is useful for:

  • Importing data where duplicates may exist
  • Syncing data without failing on existing records
  • Loading data from multiple sources that may overlap
  • Gracefully handling constraint violations

Note: Other errors (syntax errors, permission denied, etc.) will still throw exceptions.
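
Because skipped rows produce no output or count, you may want to detect duplicates in the source data yourself before the insert, for example in a JavaScript node (a minimal sketch; Email is an assumed unique column):

// Count source rows that share the same key (assumes a UNIQUE constraint on Email)
const rows = $item.table.rows;
const seen = new Set();
const duplicates = [];

for (const row of rows) {
  if (seen.has(row.Email)) {
    duplicates.push(row); // these rows would be silently skipped during the insert
  } else {
    seen.add(row.Email);
  }
}

return { duplicateCount: duplicates.length, duplicates };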

Tips for Effective Use

  • Table Format - Ensure data is in proper table format with columns and rows
  • Column Matching - Table data columns must match database table columns exactly
  • Data Types - Ensure data types match SQL Server column definitions
  • Batch Processing - For very large datasets, split into batches to avoid timeouts
  • Ignore Conflicts - Use this option to handle duplicates gracefully
  • Transactions - Use transactions for atomic bulk inserts (all or nothing)
  • Direct Credentials - Use optional Credentials field to insert without Connect node
  • Validation - Validate data before inserting to avoid runtime errors
  • Schema Prefix - Include schema name if table is not in default schema (e.g., sales.Orders)

Working with Different Data Sources

CSV to SQL Server

Read CSV (file: data.csv)

Insert Table
- Table Name: dbo.ImportedData
- Table Data: $item.table

Excel to SQL Server

Excel Open (file: data.xlsx)

Excel Read Range (range: A1:D100)

Insert Table
- Table Name: dbo.ExcelImport
- Table Data: $item.range

JSON Array to SQL Server

// In JavaScript node: transform JSON to table format
const jsonData = $item.apiResponse;

const table = {
  columns: Object.keys(jsonData[0]),
  rows: jsonData
};

return { table };

JavaScript → Insert Table (Table Data: $item.table)

Query to Insert (Copy Data Between Tables)

Query (SELECT * FROM SourceTable WHERE Status = 'Active')

Insert Table
- Table Name: DestinationTable
- Table Data: $item.result

Query from One Database, Insert to Another

Connect (Database A) → conn_a

Query (use conn_a)

Connect (Database B) → conn_b

Insert Table (use conn_b)

Disconnect (conn_a)
Disconnect (conn_b)

Best Practices

Validate Data First

// JavaScript node before Insert
const rows = $item.table.rows;

// Validate required fields
const validRows = rows.filter(row => {
  return row.Email && row.CustomerName && row.Email.includes('@');
});

// Remove nulls and empty values
const cleanRows = validRows.map(row => {
  const cleanRow = {};
  for (const key in row) {
    cleanRow[key] = row[key] ?? ''; // Replace null with empty string
  }
  return cleanRow;
});

// Remove duplicates based on email
const uniqueRows = Array.from(
  new Map(cleanRows.map(row => [row.Email, row])).values()
);

return {
  table: {
    columns: $item.table.columns,
    rows: uniqueRows
  }
};

Handle Large Datasets

For large datasets (10,000+ rows), process in batches:

// Split into batches of 1000
const batchSize = 1000;
const allRows = $item.table.rows;
const batches = [];

for (let i = 0; i < allRows.length; i += batchSize) {
  batches.push({
    columns: $item.table.columns,
    rows: allRows.slice(i, i + batchSize)
  });
}

return { batches };

JavaScript (create batches)

Loop (over batches)

Insert Table (each batch)

Use Transactions for Consistency

Connect

Start Transaction

Insert Table (batch 1)

Insert Table (batch 2)

Insert Table (batch 3)

Commit (all or nothing)

Disconnect

Map Column Names

If source columns don't match database columns:

// Map column names to match database
const columnMapping = {
  'Name': 'CustomerName',
  'Email Address': 'Email',
  'Phone Number': 'Phone',
  'Location': 'City'
};

const mappedRows = $item.sourceData.rows.map(row => {
  const newRow = {};
  for (const [sourceCol, targetCol] of Object.entries(columnMapping)) {
    newRow[targetCol] = row[sourceCol];
  }
  return newRow;
});

const table = {
  columns: Object.values(columnMapping),
  rows: mappedRows
};

Performance Optimization

Use OPENJSON for Set-Based Inserts (Alternative Approach)

For maximum performance with very large datasets, consider using the Non Query node with OPENJSON to insert all rows in a single statement:

-- In Non Query node
INSERT INTO TargetTable (Column1, Column2, Column3)
SELECT Column1, Column2, Column3
FROM OPENJSON('{{jsonData}}')
WITH (
  Column1 NVARCHAR(100) '$.Column1',
  Column2 INT '$.Column2',
  Column3 DATETIME '$.Column3'
)
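
The {{jsonData}} placeholder needs to contain the rows serialized as a JSON string. A minimal sketch of producing it in a preceding JavaScript node (assuming the rows are in $item.table.rows):

// Serialize the table rows into a JSON string for the OPENJSON statement above
const jsonData = JSON.stringify($item.table.rows);

return { jsonData };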

Connection Reuse

Connect (once)

Loop (over data batches)

Insert Table (uses same connection)

Disconnect (once at end)

This is more efficient than opening a new connection for each insert.

Disable Indexes for Bulk Insert

For very large imports, temporarily disable indexes:

Non Query: ALTER INDEX ALL ON TableName DISABLE

Insert Table (bulk data)

Non Query: ALTER INDEX ALL ON TableName REBUILD

Use Bulk Insert for CSV Files

For direct CSV file imports, consider using BULK INSERT via the Non Query node (the file path must be accessible from the SQL Server machine):

BULK INSERT dbo.TableName
FROM '{{csvFilePath}}'
WITH (
  FIELDTERMINATOR = ',',
  ROWTERMINATOR = '\n',
  FIRSTROW = 2,
  TABLOCK
)

Common Errors and Solutions

Insert Operation Failed

Error: ErrRuntime: Insert operation failed

Common Causes:

  • Table doesn't exist
  • Column name mismatch
  • Data type mismatch
  • Constraint violation (NOT NULL, FOREIGN KEY, UNIQUE, CHECK)
  • Permission denied
  • Identity insert not enabled for identity columns

Solutions:

  • Verify table exists: SELECT * FROM sys.tables WHERE name = 'TableName'
  • Check column names: SELECT * FROM INFORMATION_SCHEMA.COLUMNS WHERE TABLE_NAME = 'TableName'
  • Validate data types match column definitions
  • Ensure required (NOT NULL) columns have values
  • Check foreign key references exist in parent tables
  • Verify user has INSERT permission
  • For identity columns, use SET IDENTITY_INSERT TableName ON if inserting explicit values

Table Name Cannot Be Empty

Error: ErrInvalidArg: Table name cannot be empty

Solutions:

  • Provide the table name in the Table Name field
  • Ensure the variable containing the table name has a value
  • Include schema if table is not in default schema: dbo.TableName

Table Data Cannot Be Empty

Error: ErrInvalidArg: Table data cannot be empty. Please provide table data with at least one row to insert

Solutions:

  • Ensure the table data has at least one row
  • Validate the data source (CSV, Excel, etc.) has data
  • Check that data transformation didn't remove all rows
  • Verify the data variable is correctly populated

Violation of PRIMARY KEY Constraint

Error: Violation of PRIMARY KEY constraint. Cannot insert duplicate key

Solutions:

  • Enable the Ignore Conflicts option to skip duplicates
  • Remove duplicates from source data before inserting
  • Check if records already exist in the database
  • Query existing records and filter them out (see the sketch below)
  • Use a different key value generator
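
If a preceding Query node returned the keys that already exist, the filtering step might look like this in a JavaScript node (a sketch; the Email key and the $item.existingRecords name are assumptions):

// Build a set of keys already present in the database
// (assumes a prior Query node returned them as $item.existingRecords)
const existingEmails = new Set(
  $item.existingRecords.rows.map(row => row.Email)
);

// Keep only rows whose key is not already present
const newRows = $item.table.rows.filter(row => !existingEmails.has(row.Email));

return {
  table: {
    columns: $item.table.columns,
    rows: newRows
  }
};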

Violation of FOREIGN KEY Constraint

Error: The INSERT statement conflicted with the FOREIGN KEY constraint

Solutions:

  • Verify referenced records exist in parent table
  • Insert parent records before child records
  • Use transactions to ensure referential integrity
  • Enable Ignore Conflicts to skip records with invalid foreign keys
  • Validate foreign key values before insertion

Column Name Mismatch

Error: Invalid column name 'ColumnName'

Solutions:

  • Ensure column names in table data exactly match database columns
  • Check case sensitivity (SQL Server collation matters)
  • Verify no extra spaces in column names
  • Use sp_help 'TableName' to see actual column names
  • Map source columns to database columns if needed

Data Type Mismatch

Error: Conversion failed when converting value to data type

Solutions:

  • Verify data types match column definitions
  • Convert dates to proper format (ISO 8601: 'YYYY-MM-DD' or 'YYYY-MM-DD HH:MM:SS')
  • Ensure numbers are not passed as strings (or vice versa); see the coercion sketch below
  • Handle NULL values appropriately
  • Check for invalid values (e.g., text in numeric column)
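
A small coercion pass in a JavaScript node can normalize the most common mismatches before inserting (a sketch; the column names and target types are assumptions):

// Coerce common JavaScript values into SQL-friendly forms
const table = {
  columns: ["Quantity", "Price", "IsActive", "OrderDate"],
  rows: $item.table.rows.map(row => ({
    Quantity: parseInt(row.Quantity, 10),            // string → INT
    Price: parseFloat(row.Price),                    // string → DECIMAL/FLOAT
    IsActive: row.IsActive ? 1 : 0,                  // boolean → BIT
    OrderDate: new Date(row.OrderDate).toISOString() // any date input → ISO 8601
  }))
};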

String or Binary Data Would Be Truncated

Error: String or binary data would be truncated

Solutions:

  • Check maximum length of VARCHAR/NVARCHAR columns
  • Truncate or validate string lengths before insertion
  • Increase column length in database if needed
  • Use NVARCHAR(MAX) for very long strings

// Truncate strings to match column length
const table = {
  columns: $item.table.columns,
  rows: $item.table.rows.map(row => ({
    ...row,
    Description: row.Description?.substring(0, 500) // Truncate to 500 chars
  }))
};

Advanced Examples

ETL Pipeline

HTTP Request (fetch data from API)

JavaScript (transform data)

Query (check for existing records)

JavaScript (filter new records)

Insert Table (insert new records only)

Incremental Load

// Check last import timestamp
// Query: SELECT MAX(ImportedAt) FROM TargetTable

const lastImport = $item.lastImportDate;

// Filter source data for new records
const newRecords = $item.sourceData.rows.filter(row => {
  return new Date(row.CreatedAt) > new Date(lastImport);
});

// Add import timestamp
const table = {
  columns: [...$item.sourceData.columns, 'ImportedAt'],
  rows: newRecords.map(row => ({
    ...row,
    ImportedAt: new Date().toISOString()
  }))
};

Multi-Table Insert with Transaction

Connect

Start Transaction

Insert Table (customers)

Query (SELECT SCOPE_IDENTITY() to get new customer IDs)

JavaScript (add customer IDs to orders data)

Insert Table (orders)

Commit

Disconnect
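
The "JavaScript (add customer IDs to orders data)" step might look like this (a sketch; it assumes the preceding Query step returned the new customers as $item.customers with Email and CustomerID columns, and that the raw order rows are in $item.ordersData):

// Map each new customer's email to its generated CustomerID
const idByEmail = new Map(
  $item.customers.rows.map(c => [c.Email, c.CustomerID])
);

// Attach the CustomerID foreign key to every order row
const orders = {
  columns: ["CustomerID", "OrderDate", "Total"],
  rows: $item.ordersData.rows.map(order => ({
    CustomerID: idByEmail.get(order.CustomerEmail),
    OrderDate: order.OrderDate,
    Total: order.Total
  }))
};

return { orders };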

Insert with Identity Column

// If the table has an identity column, don't include it in the data
const table = {
  // Exclude 'ID' column (identity)
  columns: ["CustomerName", "Email", "CreatedDate"],
  rows: $item.sourceData.rows.map(row => ({
    CustomerName: row.Name,
    Email: row.Email,
    CreatedDate: new Date().toISOString()
  }))
};

Insert with Computed Columns

// Add computed values to rows
const table = {
  columns: ["CustomerName", "Email", "RegistrationDate", "Source"],
  rows: $item.sourceData.rows.map(row => ({
    CustomerName: row.Name,
    Email: row.Email,
    RegistrationDate: new Date().toISOString(),
    Source: 'API Import' // Computed value
  }))
};

Data Type Mapping

Ensure JavaScript data types match SQL Server column types:

| JavaScript Type | SQL Server Type | Example |
| --- | --- | --- |
| String | VARCHAR, NVARCHAR, TEXT, CHAR | 'John Doe' |
| Number | INT, BIGINT, DECIMAL, FLOAT, MONEY | 123, 45.67 |
| Boolean | BIT | true → 1, false → 0 |
| Date | DATE, DATETIME, DATETIME2, SMALLDATETIME | '2024-01-15', '2024-01-15 10:30:00' |
| null | NULL | null |
| undefined | NULL | undefined → null |

Date Format Examples

// Format dates for SQL Server
const table = {
  columns: ["OrderDate", "ShipDate", "TimestampColumn"],
  rows: $item.orders.map(order => ({
    // Date only (DATE column): '2024-01-15'
    OrderDate: new Date(order.orderDate).toISOString().slice(0, 10),

    // DateTime (DATETIME/DATETIME2 column): '2024-01-15 10:30:00'
    ShipDate: new Date(order.shipDate).toISOString().slice(0, 19).replace('T', ' '),

    // Full ISO timestamp: '2024-01-15T10:30:00.000Z'
    TimestampColumn: new Date().toISOString()
  }))
};

Handling NULL Values

// Convert undefined to NULL, handle empty strings
const table = {
  columns: ["Name", "Email", "Phone", "Notes"],
  rows: $item.data.rows.map(row => ({
    Name: row.Name || null,
    Email: row.Email || null,
    Phone: row.Phone || null,
    Notes: row.Notes || null // Empty string → null
  }))
};

Using Direct Credentials (Without Connect)

Insert Table Node:
- Table Name: dbo.Customers
- Table Data: $item.table
- Credentials: [Select Database Credential]
- Connection Id: (leave empty)

This creates a temporary connection, inserts the data, and closes the connection automatically. Useful for one-off inserts.

See Also