Bulk Insert from a Generic List into SQL Server with a Minimum of Code



Entity Framework and even classic ADO.NET are very slow when you need to perform a large number of inserts into the database.

Calling INSERT INTO a thousand times is inefficient because, for every INSERT operation, SQL Server also needs to write to the transaction log.
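For comparison, the row-by-row approach looks roughly like this (the table name, column, and connection string are placeholders for illustration):

```csharp
// The slow approach: one round trip and one log write per row.
using (var connection = new SqlConnection(connectionString))
{
    connection.Open();
    foreach (var person in listPerson)
    {
        using (var command = new SqlCommand(
            "INSERT INTO dbo.Person (Id) VALUES (@Id)", connection))
        {
            command.Parameters.AddWithValue("@Id", person.Id);
            command.ExecuteNonQuery(); // one network round trip per item
        }
    }
}
```

With thousands of items, the per-row round trips and log writes dominate the total time.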

A nice workaround for this is to use bulk insert, which C# exposes through the SqlBulkCopy class.

However, the SqlBulkCopy method WriteToServer does not accept List&lt;T&gt;, IEnumerable&lt;T&gt;, or IList&lt;T&gt; as a parameter.

Implement an AsDataTable extension method for IEnumerable&lt;T&gt;

The workaround is to implement an AsDataTable extension method on IEnumerable&lt;T&gt;, because WriteToServer does accept a DataTable as a parameter.

Here is a simple implementation to do this:

 // Requires: using System.ComponentModel; using System.Data;
 public static class IEnumerableExtensions
 {
     public static DataTable AsDataTable<T>(this IEnumerable<T> data)
     {
         PropertyDescriptorCollection properties = TypeDescriptor.GetProperties(typeof(T));
         var table = new DataTable();

         // One column per public property; unwrap Nullable<T> because
         // DataColumn does not support nullable value types directly.
         foreach (PropertyDescriptor prop in properties)
             table.Columns.Add(prop.Name, Nullable.GetUnderlyingType(prop.PropertyType) ?? prop.PropertyType);

         foreach (T item in data)
         {
             DataRow row = table.NewRow();
             foreach (PropertyDescriptor prop in properties)
                 row[prop.Name] = prop.GetValue(item) ?? DBNull.Value;
             table.Rows.Add(row);
         }
         return table;
     }
 }

We can now use this extension method to bulk insert data from a List&lt;T&gt; into SQL Server.

 var listPerson = new List<Person>
     {
          new Person { Id = 1 },
          new Person { Id = 2 }
     };

 using (var connection = new SqlConnection(ConfigurationManager.ConnectionStrings["SomeConnectionString"].ConnectionString))
     {
          connection.Open();

          using (SqlTransaction transaction = connection.BeginTransaction())
          using (var bulkCopy = new SqlBulkCopy(connection, SqlBulkCopyOptions.Default, transaction))
          {
             bulkCopy.BatchSize = 100;
             bulkCopy.DestinationTableName = "dbo.Person";
             try
             {
                 bulkCopy.WriteToServer(listPerson.AsDataTable());
                 transaction.Commit();
             }
             catch (Exception)
             {
                 // Undo any partially written batches; never commit after a failure.
                 transaction.Rollback();
                 throw;
             }
           }
     }
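One caveat worth noting: by default, SqlBulkCopy maps source columns to destination columns by ordinal position. If the property order of T does not match the table's column order, add explicit name-based mappings before calling WriteToServer; a minimal sketch:

```csharp
// Map each source column to the destination column with the same name,
// so column order in the DataTable no longer matters.
DataTable table = listPerson.AsDataTable();
foreach (DataColumn column in table.Columns)
    bulkCopy.ColumnMappings.Add(column.ColumnName, column.ColumnName);

bulkCopy.WriteToServer(table);
```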

Optimal BatchSize depends on the number of items you need to insert, row size, network bandwidth and latency.

After doing some testing, I can recommend a BatchSize of 100 when you need to insert a few thousand items into the database.

The best way to find optimal BatchSize value is to experiment yourself.
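Beyond BatchSize, SqlBulkCopy exposes a few other knobs worth experimenting with during that tuning; a sketch (the destination table name is a placeholder):

```csharp
using (var bulkCopy = new SqlBulkCopy(connection))
{
    bulkCopy.DestinationTableName = "dbo.Person"; // placeholder table name
    bulkCopy.BatchSize = 100;       // rows sent to the server per batch
    bulkCopy.BulkCopyTimeout = 60;  // seconds before the operation times out
    bulkCopy.NotifyAfter = 1000;    // raise SqlRowsCopied every 1000 rows
    bulkCopy.SqlRowsCopied += (sender, e) =>
        Console.WriteLine($"{e.RowsCopied} rows copied so far");

    bulkCopy.WriteToServer(listPerson.AsDataTable());
}
```

Timing a few runs with different BatchSize values against your own table and network is still the most reliable way to pick a value.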

If you liked this article, don’t forget to subscribe to this blog so you don’t miss upcoming posts.