Bulk Insert from a Generic List into SQL Server with Minimal Code



Entity Framework and even classic ADO.NET are very slow when you need to perform a large number of inserts into the database.

Calling INSERT INTO 1000 times is inefficient, because for every INSERT operation SQL Server also has to write to the transaction log.
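
For illustration, the slow row-by-row pattern looks something like this (a sketch only; connectionString, listPerson, and the dbo.Person table are stand-ins for your own code):

// A sketch of the slow row-by-row approach, assuming a hypothetical
// dbo.Person table with a single Id column.
using (var connection = new SqlConnection(connectionString))
{
    connection.Open();

    foreach (var person in listPerson)
    {
        // One round trip and one transaction-log write per row.
        using (var command = new SqlCommand(
            "INSERT INTO dbo.Person (Id) VALUES (@Id)", connection))
        {
            command.Parameters.AddWithValue("@Id", person.Id);
            command.ExecuteNonQuery();
        }
    }
}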

A nice workaround for this is to use the T-SQL BULK INSERT statement or the SqlBulkCopy class in C#.

However, the SqlBulkCopy method WriteToServer does not accept a List<T>, an IEnumerable<T>, or an IList<T> as a parameter.

Implement an AsDataTable extension method for IEnumerable<T>

A nice workaround is to implement an AsDataTable extension method on IEnumerable<T>, because WriteToServer does accept a DataTable as a parameter.

Here is a simple implementation:

using System;
using System.Collections.Generic;
using System.ComponentModel;
using System.Data;

public static class IEnumerableExtensions
{
    public static DataTable AsDataTable<T>(this IEnumerable<T> data)
    {
        PropertyDescriptorCollection properties = TypeDescriptor.GetProperties(typeof(T));
        var table = new DataTable();

        // Create one column per public property. DataTable columns cannot be
        // typed as Nullable<T>, so unwrap the underlying type and represent
        // missing values with DBNull instead.
        foreach (PropertyDescriptor prop in properties)
            table.Columns.Add(prop.Name, Nullable.GetUnderlyingType(prop.PropertyType) ?? prop.PropertyType);

        // Copy each item into a row, mapping null property values to DBNull.
        foreach (T item in data)
        {
            DataRow row = table.NewRow();
            foreach (PropertyDescriptor prop in properties)
                row[prop.Name] = prop.GetValue(item) ?? DBNull.Value;
            table.Rows.Add(row);
        }

        return table;
    }
}

We can then use this extension method to bulk insert data from a List<T> into SQL Server.
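
The snippet below assumes a simple Person class whose public property names and types match the columns of the destination table; a minimal sketch:

// A minimal sketch of the Person class used in the example;
// property names must match the column names of dbo.Person.
public class Person
{
    public int Id { get; set; }
}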

// Requires references to System.Configuration (for ConfigurationManager)
// and System.Data.SqlClient.
var listPerson = new List<Person>
{
    new Person() { Id = 1 },
    new Person() { Id = 2 }
};

using (var connection = new SqlConnection(ConfigurationManager.ConnectionStrings["SomeConnectionString"].ConnectionString))
{
    connection.Open();
    SqlTransaction transaction = connection.BeginTransaction();

    using (var bulkCopy = new SqlBulkCopy(connection, SqlBulkCopyOptions.Default, transaction))
    {
        bulkCopy.BatchSize = 100;
        bulkCopy.DestinationTableName = "dbo.Person";
        try
        {
            bulkCopy.WriteToServer(listPerson.AsDataTable());

            // Commit inside the try block, so that a failed WriteToServer
            // rolls back instead of reaching Commit on a dead transaction.
            transaction.Commit();
        }
        catch (Exception)
        {
            transaction.Rollback();
            throw;
        }
    }
}
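
Note that transaction.Commit() is called inside the try block, and that there is no explicit connection.Close() in the catch block: the using statement disposes (and closes) the connection in every case, and committing after a rollback would only throw a second exception.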

The optimal BatchSize depends on the number of items you need to insert, the row size, and network bandwidth and latency.

After doing some testing, I can recommend a BatchSize of 100 when you need to insert a few thousand items into the database.

The best way to find the optimal BatchSize value is to experiment yourself.
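
A simple way to run such an experiment is to time the insert for a few candidate values with a Stopwatch (a sketch; BulkInsert is a hypothetical helper that wraps the SqlBulkCopy code above and takes the batch size as a parameter):

// Requires System.Diagnostics for Stopwatch. BulkInsert is a hypothetical
// helper that wraps the SqlBulkCopy code above and takes a batch size.
foreach (int batchSize in new[] { 100, 500, 1000, 5000 })
{
    var stopwatch = Stopwatch.StartNew();
    BulkInsert(listPerson, batchSize);
    stopwatch.Stop();
    Console.WriteLine("BatchSize {0}: {1} ms", batchSize, stopwatch.ElapsedMilliseconds);
}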

If you like this article, don’t forget to subscribe to this blog and make sure you don’t miss upcoming blog posts.

 

  • Nice extension method. I have used the SqlBulkCopy class many times in the past, but my implementations have always been clunkier and required some understanding of the types in question. I like this generic approach.

  • swapnil dane

    Excellent article. I am a SQL Server developer and new to .NET programming. I was looking for something to load a List into SQL Server tables. This worked perfectly for me. Thank you!

  • swapnil dane

    Can I extend this to load a JSON feed into a SQL Server table? I really hope that Microsoft will provide some native support for JSON.

    • You can store JSON in a varchar(max) column as plain text, but you cannot query it. If you need to store JSON documents in the database and query them, you can look at Azure DocumentDB.

  • Danilo R

    Hi, I know this post is a little bit old, but I wanted to comment in case someone else finds this useful. (As I did!)

    The code snippets are pretty handy. However, calling “transaction.Commit();” after the lines in the “catch (Exception)” block will throw another exception, since the connection has already been closed.

    That being said, I would remove “connection.Close();” from the “catch (Exception)” block in order to avoid this. PS: since we have a “using” block, the connection will be closed in any case.

  • Aaron Merritt

    I hate to revive an old topic, but I can’t find the answer to the problem I have with this. This method is literally 34x faster than what I had before, which is really awesome.

    My problem, though, is that when it saves the list of string values into SQL, they’re stored in an integer format, and when I read the values back, they don’t convert back to strings.

    I’m certain that I’m just an idiot, but can anyone shed light on the matter for me?