8 ways to improve ASP.NET Web API performance



ASP.NET Web API is a great piece of technology. Writing a Web API is so easy that many developers don’t take the time to structure their applications for great performance.

In this article, I am going to cover 8 techniques for improving ASP.NET Web API performance.

1) Use the fastest JSON serializer

JSON serialization can significantly affect the overall performance of ASP.NET Web API. A year and a half ago, I switched from the JSON.NET serializer to ServiceStack.Text on one of my projects.

I measured around a 20% performance improvement on my Web API responses. I highly recommend that you try out this serializer. Here is a recent performance comparison of popular serializers:

[Chart: performance comparison of popular JSON serializers]

Source: theburningmonk

UPDATE: It seems that Stack Overflow uses an even faster JSON serializer called Jil. You can view some benchmarks on the Jil serializer GitHub page.
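For reference, here is what the raw serialization calls look like. This is a minimal sketch assuming the ServiceStack.Text and Jil NuGet packages; SampleModel is a hypothetical DTO, and wiring either serializer into Web API still requires a custom MediaTypeFormatter.

// Minimal sketch: the same DTO serialized with ServiceStack.Text and Jil
// (SampleModel is a hypothetical class with Id and Name properties).
var model = new SampleModel { Id = 1, Name = "test" };

string withServiceStack = ServiceStack.Text.JsonSerializer.SerializeToString(model);
string withJil = Jil.JSON.Serialize(model);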

2) Manual JSON serialization from the DataReader

I have used this method on a production project and gained performance benefits.

Instead of reading values from the DataReader, populating objects, and then reading values from those objects again to produce JSON with a JSON serializer, you can manually create the JSON string from the DataReader and avoid creating unnecessary objects.

You produce the JSON using a StringBuilder, and in the end you return StringContent as the content of your response in Web API:

var response = Request.CreateResponse(HttpStatusCode.OK);
response.Content = new StringContent(jsonResult, Encoding.UTF8, "application/json");
return response;
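For completeness, here is a rough sketch of how jsonResult might be built directly from a DataReader. The column names are hypothetical, and production code would also need to escape string values properly.

// Build the JSON string straight from the reader, skipping intermediate objects.
// (reader is an open SqlDataReader; Id and Name are hypothetical columns.)
var sb = new StringBuilder();
sb.Append("[");

bool first = true;
while (reader.Read())
{
    if (!first) sb.Append(",");
    first = false;

    sb.Append("{\"Id\":").Append(reader.GetInt32(0))
      .Append(",\"Name\":\"").Append(reader.GetString(1)).Append("\"}");
}

sb.Append("]");
string jsonResult = sb.ToString();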

 

You can read more about this method on Rick Strahl’s blog.

3) Use other formats if possible (Protocol Buffers, MessagePack)

If you can use other formats like Protocol Buffers or MessagePack instead of JSON in your project, do it.

You will get huge performance benefits, not only because the Protocol Buffers serializer is faster, but also because the format is smaller than JSON, which results in smaller and faster responses.
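As a rough illustration, here is a minimal custom media type formatter for Protocol Buffers, assuming the protobuf-net NuGet package. The media type and class name are my own choices, and your model classes would need [ProtoContract]/[ProtoMember] attributes.

using System;
using System.IO;
using System.Net;
using System.Net.Http;
using System.Net.Http.Formatting;
using System.Net.Http.Headers;
using System.Threading.Tasks;

public class ProtoBufFormatter : MediaTypeFormatter
{
    public ProtoBufFormatter()
    {
        SupportedMediaTypes.Add(new MediaTypeHeaderValue("application/x-protobuf"));
    }

    // For a sketch we accept any type; real code might check for [ProtoContract].
    public override bool CanReadType(Type type) { return true; }
    public override bool CanWriteType(Type type) { return true; }

    public override Task<object> ReadFromStreamAsync(Type type, Stream readStream,
        HttpContent content, IFormatterLogger formatterLogger)
    {
        object result = ProtoBuf.Serializer.NonGeneric.Deserialize(type, readStream);
        return Task.FromResult(result);
    }

    public override Task WriteToStreamAsync(Type type, object value, Stream writeStream,
        HttpContent content, TransportContext transportContext)
    {
        ProtoBuf.Serializer.NonGeneric.Serialize(writeStream, value);
        return Task.FromResult(0);
    }
}

You would then register it in WebApiConfig with config.Formatters.Add(new ProtoBufFormatter()); and clients would request it with Accept: application/x-protobuf.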

4) Implement compression

Use GZIP or Deflate compression on your ASP.NET Web API.

Compression is an easy and effective way to reduce the size of response payloads and increase speed.

This is a must-have feature. You can read more about it in my blog post ASP.NET Web API GZip compression ActionFilter with 8 lines of code.
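The linked post has the original implementation; the sketch below is my own rough version of the same idea and assumes clients send Accept-Encoding: gzip.

using System.IO;
using System.IO.Compression;
using System.Net.Http;
using System.Web.Http.Filters;

public class GZipCompressionAttribute : ActionFilterAttribute
{
    public override void OnActionExecuted(HttpActionExecutedContext context)
    {
        var content = context.Response.Content;
        if (content == null) return;

        byte[] uncompressed = content.ReadAsByteArrayAsync().Result;
        var contentType = content.Headers.ContentType;

        byte[] compressed;
        using (var output = new MemoryStream())
        {
            using (var gzip = new GZipStream(output, CompressionMode.Compress, true))
            {
                gzip.Write(uncompressed, 0, uncompressed.Length);
            }
            compressed = output.ToArray();
        }

        // Replace the response body with the compressed bytes and mark the encoding.
        context.Response.Content = new ByteArrayContent(compressed);
        context.Response.Content.Headers.ContentType = contentType;
        context.Response.Content.Headers.ContentEncoding.Add("gzip");
    }
}

You would then decorate actions with [GZipCompression], or register the filter globally.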

5) Use caching

If it makes sense, use output caching on your Web API methods. For example, when many users access the same response, and that response changes maybe once a day.

If you want to implement manual caching, such as caching user tokens in memory, please refer to my blog post Simple way to implement caching in ASP.NET Web API.
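This is not the code from the linked post, just a minimal sketch of manual in-memory caching with System.Runtime.Caching.MemoryCache; the key prefix, expiration, and loadToken delegate are arbitrary choices.

using System;
using System.Runtime.Caching;

public static class TokenCache
{
    private static readonly MemoryCache Cache = MemoryCache.Default;

    // Returns the cached token for a user, loading and caching it on a miss.
    public static string GetToken(string userName, Func<string, string> loadToken)
    {
        string key = "token:" + userName;

        var cached = Cache.Get(key) as string;
        if (cached != null)
            return cached;

        string token = loadToken(userName);
        Cache.Set(key, token, new CacheItemPolicy
        {
            AbsoluteExpiration = DateTimeOffset.UtcNow.AddMinutes(30)
        });

        return token;
    }
}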

6) Use classic ADO.NET if possible

Hand-coded ADO.NET is still the fastest way to get data from the database. If the performance of your Web API is really important to you, don’t use ORMs.

Here is one of the latest performance comparisons of popular ORMs:

[Chart: performance comparison of popular ORMs and data-access approaches]

Dapper and the hand-written fetch code are very fast; as expected, the full ORMs are slower.

LLBLGen with resultset caching is very fast, but it fetches the resultset once and then re-materializes the objects from memory.
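For comparison, a hand-coded fetch looks something like the sketch below; the connection string, table, and Product class are hypothetical.

// Minimal hand-coded ADO.NET fetch (System.Data.SqlClient);
// no change tracking, no materializer overhead.
var products = new List<Product>();

using (var connection = new SqlConnection(connectionString))
using (var command = new SqlCommand("SELECT Id, Name, Price FROM Products", connection))
{
    connection.Open();

    using (var reader = command.ExecuteReader())
    {
        while (reader.Read())
        {
            products.Add(new Product
            {
                Id = reader.GetInt32(0),
                Name = reader.GetString(1),
                Price = reader.GetDecimal(2)
            });
        }
    }
}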

7) Implement async on methods of Web API

Using asynchronous Web API services can increase the number of concurrent HTTP requests your Web API can handle.

Implementation is simple: mark the operation with the async keyword and change the return type to Task (or Task&lt;T&gt;).

[HttpGet]  
public async Task OperationAsync()  
{   
    await Task.Delay(2000);  
}
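For a more realistic I/O-bound case, the sketch below (Web API 2 style, with a hypothetical external URL) awaits an outgoing HTTP call instead of blocking a thread-pool thread:

[HttpGet]
public async Task<IHttpActionResult> GetExternalDataAsync()
{
    // In production you would typically reuse a single HttpClient instance.
    using (var client = new HttpClient())
    {
        string payload = await client.GetStringAsync("https://example.com/api/data");
        return Ok(payload);
    }
}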

8) Return Multiple Resultsets and combined results

Reduce the number of round-trips, not only to the database but to the Web API as well. You should use the multiple-resultsets functionality whenever possible.

This means you can extract multiple resultsets from a DataReader, as in the example below:

// a single command that batches two SELECT statements
// (connection, the SQL text and the Populate helpers are illustrative)
var command = new SqlCommand(
    "SELECT * FROM Suppliers; SELECT * FROM Products", connection);

// read the first resultset
using (var reader = command.ExecuteReader())
{
    // read the data from that resultset
    while (reader.Read())
    {
        suppliers.Add(PopulateSupplierFromIDataReader(reader));
    }

    // read the next resultset
    reader.NextResult();

    // read the data from that second resultset
    while (reader.Read())
    {
        products.Add(PopulateProductFromIDataReader(reader));
    }
}

 

Return as many objects as you can in one Web API response. Try combining objects into one aggregate object like this:

public class AggregateResult
{
    public long MaxId { get; set; }
    public List<Folder> Folders { get; set; }
    public List<User> Users { get; set; }
}
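A hypothetical action that fills the aggregate in a single round-trip might look like this (GetMaxId, GetFolders, and GetUsers stand in for whatever data access you use):

[HttpGet]
public AggregateResult GetDashboard()
{
    // One HTTP request returns everything the client page needs.
    return new AggregateResult
    {
        MaxId = GetMaxId(),
        Folders = GetFolders(),
        Users = GetUsers()
    };
}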

 

This way you will reduce the number of HTTP requests to your Web API.

Thank you for reading this article.

Leave a comment below and let me know what other methods you have found to improve Web API performance.

 
 

  • khorvat2

Nice, really nice. We use some of these libraries, so we can try some of your suggestions. Thanks

    • http://blog.developers.ba Radenko Zec

      Thanks Kristijan. If you get stuck somewhere feel free to contact me.

  • http://blog.ace-dev.fr/ Acesyde

    Very nice, thanks for that !

    • http://blog.developers.ba Radenko Zec

Thanks for the comment.

  • cubski

    Great post Radenko! Looking forward to future posts.

    • http://blog.developers.ba Radenko Zec

      Thanks.

  • B. Clay Shannon

    In tip #2, where/how is jsonResult declared/initialized?

  • http://linkedin.com/in/chrisallenstudio Zias Zias

    nice!

    • http://blog.developers.ba Radenko Zec

      Thanks.

  • http://www.Marisic.Net/ dotnetchris

    Note that 7) Implement async on methods of Web API

    will likely LOWER performance. It should increase THROUGHPUT however.

    Nothing can ever be faster than synchronous code. It’s certainly not efficient however.

  • Vedran Mandić

Nice! Very smart and simple performance tips Radenko. You write an interesting blog, I will definitely keep reading. Cheers.

    • http://blog.developers.ba Radenko Zec

      Thanks Vedran. You can use this link http://eepurl.com/ZfI8v to subscribe to this blog and make sure you don’t miss new upcoming blog posts.

      • Vedran Mandić

        Thanks!! Done.

  • Allan Chadwick

I don’t think point 6 is fully accurate. Since ORMs, specifically Entity Framework, can consume stored procedures with the same performance as classic ADO, the point should be to use stored procedures and avoid dynamically created SQL. Entity Framework and other ORMs are still worth using for their code generation, consistency in the DAL layer, and general popularity decreasing cost of ownership. http://stackoverflow.com/questions/8103379/entity-framework-4-1-vs-enterprise-data-application-block-maximum-performance

    • http://blog.developers.ba Radenko Zec

Hi Allan. Thanks for the comment. I am not sure it is worth using EF if you will use it only to consume stored procedures.
If you don’t use other features like change tracking or LINQ, I would always choose classic ADO.NET.
But that is my opinion; someone else might choose EF or another ORM.

      • http://www.achadwick.com Allan Chadwick

I don’t think enough people realize that LINQ in EF is optional. EF is still a quick code generator for your database which will get you wired up more quickly than hand-writing ADO. Although I can’t blame anybody for avoiding the EF ‘Wizard’ experience or the initial learning curve without any benefit. :) As my buddy told me once.. “It’s the future!” lol.

  • tw37

Good tips Radenko. We happen to run an open BoK (Body Of Knowledge) called Practical Performance Analyst. Take a look at what we do at practicalperformanceanalyst.com and let me know if you are keen to come and help.

  • Evgeny

    Good post!!!
    great tips!!!
    thank you.

    • http://blog.developers.ba Radenko Zec

      Thanks Evgeny.

  • Wil

    Hi Radenko,
    Great post!
I came across it just at the right time in my project. I have, however, decided to keep EF in my project for my internal business logic (i.e. small-volume returns), as I like the code-first method, but use the old data-stream methods for the big database hits. Just wondering if anyone else has experience using a mix of EF and classic ADO.NET in the same project?
    Cheers,
    Wil

    • http://blog.developers.ba Radenko Zec

Hi Wil. Thanks for the comment. It seems very reasonable to use classic ADO.NET only when it is necessary.
I have also done it in a similar way on some of my projects.