HTTP caching and content compression

Monday, Apr 28, 2008 at 15:02


Due to the weight of code now passing between the server and the browser, I was getting concerned about the size and speed of the system for our readers. I decided it was time to revisit the caching and compression system used to deliver content to the browser.

While debugging with Firebug I noticed that the caching of various elements of the system was not 100%, and also that I was burning user bandwidth with bulky, uncompressed js files. To combat some of this I installed a small compression routine in one of my standard system libraries; the code is below:

Public Shared Sub GZipEncodePage()
    If isGZipSupported() Then
        Dim AE As String = HttpContext.Current.Request.Headers("Accept-Encoding").ToString()
        If AE.Contains("gzip") Then
            HttpContext.Current.Response.Filter = New System.IO.Compression.GZipStream(HttpContext.Current.Response.Filter, System.IO.Compression.CompressionMode.Compress)
            HttpContext.Current.Response.AddHeader("Content-Encoding", "gzip")
        Else
            ' Fall back to deflate when the browser supports compression but not gzip
            HttpContext.Current.Response.Filter = New System.IO.Compression.DeflateStream(HttpContext.Current.Response.Filter, System.IO.Compression.CompressionMode.Compress)
            HttpContext.Current.Response.AddHeader("Content-Encoding", "deflate")
        End If
    End If
End Sub

Public Shared Function isGZipSupported() As Boolean
    If IsNothing(HttpContext.Current.Request.Headers("Accept-Encoding")) Then Return False
    Dim AE As String = HttpContext.Current.Request.Headers("Accept-Encoding").ToString()
    If AE.Contains("gzip") Or AE.Contains("deflate") Then Return True
    Return False
End Function

Then all I had to do was add a call to GZipEncodePage in the Page.Load event of each page I want to compress. I did this on all trek notes pages, the trek index and the system home page. This routine cuts the HTML down to around 30% of its original size. Perfect: that solved one problem, so I moved on to caching.
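For reference, the hook-up in each page is just a one-liner (here I'm assuming the routine lives in a shared library class I'll call WebUtils; use whatever class actually holds it):

```vbnet
Protected Sub Page_Load(ByVal sender As Object, ByVal e As System.EventArgs) Handles Me.Load
    ' Compress this page's rendered HTML if the browser advertises support
    WebUtils.GZipEncodePage()
End Sub
```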

There is so much information about this on the net, but it is difficult to sort the good from the rubbish. I decided to cache all scripts (js include files) and all stylesheets. To handle versioning, and to be able to force the browser to load an updated routine, I also needed a way to carry version numbers for these elements. So I used my URL rewriting software to create a new rewrite rule, pointing all css and js downloads at a new program that basically does the following:
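To illustrate the idea (the handler and path names here are made up for the example, not the site's real ones): the markup bakes a version number into each resource URL, so bumping the version forces every browser to re-fetch, and the rewrite rule maps that URL onto the download program's query string:

```text
<!-- page markup references a versioned URL -->
<link rel="stylesheet" href="/resources/3/site.css" />
<script src="/resources/3/main.js"></script>

rewrite rule (conceptually):
  ^/resources/(\d+)/(.+)$  ->  /Utilities/GetResource.aspx?f=$2&v=$1
```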

Protected Sub Page_Load(ByVal sender As Object, ByVal e As System.EventArgs)
    Dim FileName As String = CType(Request("f"), String)
    Dim ServerFileName As String = Server.MapPath("/Utilities/Scripts/" & FileName)

    Dim MimeType As String = "text/javascript"
    If Right(FileName, 3) = "css" Then MimeType = "text/css"

    ' IE6 mishandles compressed stylesheets, so do not compress IE6 CSS files
    If Not (Request.Browser.Type = "IE6" And Right(FileName, 3) = "css") Then
        GZipEncodePage()
    End If

    Response.Cache.SetCacheability(HttpCacheability.Public)
    Response.Cache.SetMaxAge(New System.TimeSpan(30, 0, 0, 0))
    Response.ContentType = MimeType

    Using liveStream As IO.FileStream = New IO.FileStream(ServerFileName, IO.FileMode.Open, IO.FileAccess.Read)
        Dim buffer As Byte() = New Byte(CInt(liveStream.Length) - 1) {}
        liveStream.Read(buffer, 0, buffer.Length)
        Response.OutputStream.Write(buffer, 0, buffer.Length)
    End Using
End Sub

Simple, huh? Well, yes, once you work out all the little points that cause you to stumble, like Private vs Public cacheability and the caching of compressed documents, plus the funnies of IE6 and compressed stylesheets. To cut a long story short, the two routines above are now running live on various elements of the site and are showing huge performance improvements in the delivery of content to the browser.
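For anyone hitting the same stumbling points, the two cache gotchas come down to a couple of lines in the download program (these are the standard ASP.NET HttpCachePolicy calls, shown here as a sketch):

```vbnet
' Public lets intermediate proxies cache the file as well as the browser;
' Private restricts caching to the end user's browser only
Response.Cache.SetCacheability(HttpCacheability.Public)

' Vary the cache by Accept-Encoding, so a cached compressed copy is never
' handed to a browser that did not ask for compression
Response.Cache.VaryByHeaders("Accept-Encoding") = True
```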

Today I have also instructed the ISP to adjust the cache expiration of the image servers, which will make things quicker still, with significant savings in end-user bandwidth. This will particularly help the low-speed and dial-up users who are still out there, unable to get a decent broadband service.
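For illustration, a long expiry on the image servers amounts to response headers along these lines (the one-year figure here is an example, not the exact value the ISP was asked to set):

```text
HTTP/1.1 200 OK
Content-Type: image/jpeg
Cache-Control: public, max-age=31536000
Expires: Tue, 28 Apr 2009 15:02:00 GMT
```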