
Data Import from text/CSV file in SQL Server

Step-by-Step Explanation

1. Checking and Dropping Existing Temporary Tables

Before creating the temporary tables, the script ensures that any previously existing instances of #ImportTable and #LogTable are removed.

SQL
IF OBJECT_ID('tempdb..#ImportTable') IS NOT NULL
BEGIN
    DROP TABLE #ImportTable;
    PRINT 'Temporary table #ImportTable dropped successfully.';
END
ELSE
    PRINT 'Temporary table #ImportTable does not exist. Proceeding to create.';

This step ensures a clean slate for the import process.
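On SQL Server 2016 (13.x) and later, the same check-and-drop can be written more concisely with DROP TABLE IF EXISTS; this is an optional alternative to the OBJECT_ID pattern above:

```sql
-- Drops the table only if it exists; no error is raised otherwise (SQL Server 2016+)
DROP TABLE IF EXISTS #ImportTable;
PRINT 'Temporary table #ImportTable dropped if it existed.';
```

The OBJECT_ID pattern remains useful when the script must also run on older versions of SQL Server.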


2. Creating the Temporary Table

The #ImportTable is created with the necessary columns to store data temporarily.

SQL
CREATE TABLE #ImportTable(
    EmployeeCode VARCHAR(50),
    IfscCode     VARCHAR(50),
    Location     VARCHAR(50),
    LocationCode VARCHAR(50)
);
PRINT 'Temporary table #ImportTable created successfully.';

3. Creating the Logging Table

A second temporary table, #LogTable, is created to store log messages, aiding in debugging and process monitoring.

SQL
IF OBJECT_ID('tempdb..#LogTable') IS NOT NULL
BEGIN
    DROP TABLE #LogTable;
END
CREATE TABLE #LogTable (
    LogID INT IDENTITY(1,1) PRIMARY KEY,
    LogTime DATETIME DEFAULT GETDATE(),
    LogMessage NVARCHAR(MAX)
);

4. Bulk Data Import with Error Handling

The BULK INSERT statement is used to load data from a CSV file (C:\requested.csv) into #ImportTable. Error handling with TRY...CATCH ensures that issues during the import are logged in #LogTable.

SQL
BEGIN TRY
    BULK INSERT #ImportTable
    FROM 'C:\requested.csv'
    WITH (
        FIELDTERMINATOR = ',',
        ROWTERMINATOR = '\n',
        FIRSTROW = 2  -- Skip the header if present
    );

    INSERT INTO #LogTable (LogMessage)
    VALUES ('Bulk insert into #ImportTable completed successfully.');
    PRINT 'Bulk insert completed successfully.';
END TRY
BEGIN CATCH
    INSERT INTO #LogTable (LogMessage)
    VALUES (FORMATMESSAGE('Bulk insert failed. Error: %s', ERROR_MESSAGE()));
    PRINT 'Bulk insert failed. Check #LogTable for details.';
END CATCH

This block ensures:

  • Successful inserts are logged.
  • Errors are captured and recorded for troubleshooting.
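If the source file contains quoted fields (for example, commas inside values), SQL Server 2017 and later offer a dedicated CSV mode for BULK INSERT. A sketch of the same import using that mode, assuming the file follows standard CSV quoting:

```sql
BULK INSERT #ImportTable
FROM 'C:\requested.csv'
WITH (
    FORMAT     = 'CSV',  -- SQL Server 2017+; parses RFC 4180-style CSV
    FIELDQUOTE = '"',    -- character used to quote fields containing commas
    FIRSTROW   = 2       -- skip the header row
);
```

With FORMAT = 'CSV', the FIELDTERMINATOR and ROWTERMINATOR options are usually unnecessary, since standard CSV delimiters are assumed.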

5. Viewing Logs

Logs from #LogTable provide details about the process and any errors encountered.

SQL
SELECT * FROM #LogTable;
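To focus on failures only, the log can be filtered and ordered; this query matches the 'Bulk insert failed...' messages written by the CATCH block above:

```sql
SELECT LogTime, LogMessage
FROM #LogTable
WHERE LogMessage LIKE '%failed%'  -- only error entries
ORDER BY LogTime DESC;            -- most recent first
```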

6. Updating the Main Database

After successful data import, the temporary data can be used to update the main database. Here’s an example (commented out in the script) for updating EmployeeAccounts:

SQL
--UPDATE EA
--SET EA.IFSCCode     = ISNULL(E.IfscCode, ''),
--    EA.Location     = ISNULL(E.Location, ''),
--    EA.LocationCode = ISNULL(E.LocationCode, '')
--FROM EmployeeAccounts EA
--INNER JOIN EmployeeDetails ED ON ED.EmployeeID = EA.EmployeeID
--INNER JOIN #ImportTable E ON E.EmployeeCode = ED.EmployeeCode
--WHERE E.EmployeeCode IS NOT NULL;
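When running the update for real, it is prudent to wrap it in a transaction and record the affected row count in the log table. A sketch, assuming the same EmployeeAccounts and EmployeeDetails tables referenced above:

```sql
DECLARE @Affected INT;

BEGIN TRANSACTION;

UPDATE EA
SET EA.IFSCCode     = ISNULL(E.IfscCode, ''),
    EA.Location     = ISNULL(E.Location, ''),
    EA.LocationCode = ISNULL(E.LocationCode, '')
FROM EmployeeAccounts EA
INNER JOIN EmployeeDetails ED ON ED.EmployeeID = EA.EmployeeID
INNER JOIN #ImportTable E ON E.EmployeeCode = ED.EmployeeCode;

-- Capture the row count immediately; the next statement resets @@ROWCOUNT
SET @Affected = @@ROWCOUNT;

INSERT INTO #LogTable (LogMessage)
VALUES (FORMATMESSAGE('Updated %d EmployeeAccounts rows.', @Affected));

COMMIT TRANSACTION;
```

Reviewing @Affected before committing (or rolling back when the count looks wrong) adds a safety net for bulk updates against production tables.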

7. Viewing the Imported Data

To verify the imported data, the script concludes with a simple SELECT query on #ImportTable.

SQL
SELECT * FROM #ImportTable;

Benefits of This Approach

  • Reusability: Temporary tables facilitate modular and reusable logic for data import tasks.
  • Error Logging: The use of a dedicated logging table ensures that issues can be reviewed and resolved effectively.
  • Data Validation: Temporary tables allow for data validation before integrating it into the main database.
  • Ease of Debugging: Clear messages and structured logs simplify the debugging process.
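The data-validation point above can be sketched with two checks on the staged data, run before any update touches the main tables:

```sql
-- Rows with a missing or blank employee code
SELECT *
FROM #ImportTable
WHERE EmployeeCode IS NULL OR EmployeeCode = '';

-- Employee codes that appear more than once in the file
SELECT EmployeeCode, COUNT(*) AS Occurrences
FROM #ImportTable
GROUP BY EmployeeCode
HAVING COUNT(*) > 1;
```

If either query returns rows, the file can be corrected and re-imported without having modified EmployeeAccounts at all.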
