jQuery Validate, Large Forms and Performance
The Background
Where I work we make heavy use of jQuery with MVC - I think both technologies are great and a really good mix. I’ve got a passion for JavaScript; it was the first commercial programming language I learned, over ten years ago. Back then it would’ve been considered an almost infantile skill, but jump forward ten years and JavaScript is a killer technology that’s fundamental to any web developer’s skill set. Over the years I’ve followed the JavaScript scene, moving from Prototype to MooTools and eventually to jQuery (with some forays into other frameworks like Backbone and, recently, CoffeeScript) - so I think I’ve got a pretty good handle on JS.
The Problem
We were recently tasked with persisting large objects to the page to avoid session state, and this resulted in some significant page bloat - taking our average page to between 1,200 and 1,500 elements, of which 770 were INPUT elements (only 50 of them visible). I know it’s less than ideal, but that was our situation after the work was done, and it worked seemingly well on modern browsers. Our problem, though, was IE mixed with the jQuery Validate library - in Firefox and IE9 it took approximately 17 seconds to load a page like this, and in older versions of IE it would just crash out.
jQuery Validate does some checking when it starts up to find a “cancel” button and a “submit” button in certain circumstances - and it was this finding process that killed the whole thing on page load.
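One way to avoid paying that cost during page load is to defer the expensive lookup until the moment it’s actually needed, and cache the result. This is a rough sketch of the idea in plain JavaScript, not jQuery Validate’s actual code - the names are hypothetical, and the 770-input form is simulated with plain objects rather than real DOM elements:

```javascript
// Wrap an expensive lookup so it only runs on first use, then caches.
function lazy(find) {
  var cached, done = false;
  return function () {
    if (!done) {
      cached = find();
      done = true;
    }
    return cached;
  };
}

// Simulate a form like ours: 770 inputs, one of which is the submit button.
var scans = 0;
var inputs = [];
for (var i = 0; i < 770; i++) {
  inputs.push({ type: i === 5 ? "submit" : "hidden" });
}

// The full scan of the inputs no longer happens at init time - it happens
// on the first call, and every later call hits the cache.
var findSubmit = lazy(function () {
  scans++;
  return inputs.filter(function (el) {
    return el.type === "submit";
  })[0];
});

findSubmit(); // first call performs the scan
findSubmit(); // second call is a cache hit
console.log(scans); // → 1
```

Nothing clever here - it just moves the work off the page-load path, which is exactly where a 770-input form punishes you the most.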
In some circumstances there’s also a lookup on “element.rules()”, which again kills performance, though these cases are less prevalent.
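When the same rules lookup is hit repeatedly for the same field, memoising the result is one way to blunt the cost. A hedged sketch in plain JavaScript - memoiseRules and the stand-in “parser” below are hypothetical illustrations, not jQuery Validate’s real API:

```javascript
// Cache rule lookups by field name so repeated calls don't re-run
// the expensive resolution each time.
function memoiseRules(lookup) {
  var cache = {};
  return function (name) {
    if (!(name in cache)) {
      cache[name] = lookup(name);
    }
    return cache[name];
  };
}

var lookups = 0;
var rulesFor = memoiseRules(function (name) {
  lookups++;
  // Stand-in for the real (expensive) rules resolution.
  return { required: name.indexOf("optional") === -1 };
});

rulesFor("email");
rulesFor("email");
rulesFor("email"); // only the first call does any work
console.log(lookups); // → 1
```

The trade-off is staleness: if rules are added or removed dynamically after page load, the cache would need invalidating, so this only suits forms whose rules are fixed once rendered.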
Analysis
I did some testing on a form with about 12 elements to see if it could be sped up.
Test 1 - 158 calls, 701ms
This is roughly how the code would have executed in jQuery Validate; we were using an old version, so the comparison is a little unfair.
Test 2 - 122 calls, 499ms
This is using the method from the latest jQuery Validate 1.9 library - in fact, before seeing the latest version I was already thinking about adapting our version with this same change.
Test 3 - 107 calls, 464ms
This is a test I dreamt up: it seemed more fitting to have jQuery apply the necessary filters at the time of iterating through the DOM to find the elements, rather than collecting everything first and filtering afterwards. Lots of people overlook this as an option - I know the performance gain here is minimal, but on large forms it can be significant.
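The idea behind Test 3 - filtering while the collection is being walked, instead of building the full collection and then filtering it - can be modelled in plain JavaScript like this. The objects below stand in for DOM elements, and the numbers mirror our 770-input, 50-visible form; it’s a sketch of the principle, not the actual test code:

```javascript
// Build the full collection first, then filter it - two passes and a
// full intermediate copy, analogous to $("input").filter(":visible").
function collectThenFilter(elements, pred) {
  var all = elements.slice(); // intermediate copy of all 770 items
  return all.filter(pred);
}

// Apply the predicate during the single walk - analogous to pushing the
// filter into the selector itself, e.g. $("input:visible").
function filterWhileWalking(elements, pred) {
  var out = [];
  for (var i = 0; i < elements.length; i++) {
    if (pred(elements[i])) {
      out.push(elements[i]);
    }
  }
  return out;
}

// Simulate our form: 770 inputs, only the first 50 visible.
var elements = [];
for (var i = 0; i < 770; i++) {
  elements.push({ visible: i < 50 });
}
var isVisible = function (el) { return el.visible; };

var visible = filterWhileWalking(elements, isVisible);
console.log(visible.length); // → 50
```

On a 12-element form the difference between the two is noise; at 770 elements the avoided copy and second pass start to matter, which matches the spread in the timings above.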
Conclusion
I’m not criticising jQuery Validate - I think it’s genuinely an easy and valuable tool for businesses to use. I also recognise that maybe 1,200 elements is not the best approach for a web page. My criticism, however, is of jQuery - the timings above were taken on a top-end modern-day machine, in a modern-day browser (yes, I love Firefox), for a form that contains just 12 elements.
Even at the best result, 464ms is insanely slow for a page-load operation - could you imagine your enterprise consumer site having its load time delayed by 464ms? It may seem trivial, but in a real-life scenario this delay is likely to be much bigger, and we should remember the research Amazon has done on this kind of thing - that for every 100ms of page-load delay they lost 1% of sales. All of a sudden, it’s not so trivial.
Interesting article, thanks. What advice can you give where the View Model has a lot of properties that need to be posted back but should not be displayed to the user? Is there a more efficient way of posting these back without using many hidden fields?
Good point - there should be few instances where you’d need to render out hidden fields for large domain objects. I’d typically recommend TryUpdateModel (http://msdn.microsoft.com/en-us/library/ee264031.aspx), which will let you bind form values to a pre-populated domain object. I understand there are some instances where a large object may need to be serialised in some way - if that’s the case then I’d recommend the BinaryFormatter mixed with System.Convert.ToBase64String (http://msdn.microsoft.com/en-us/library/system.runtime.serialization.formatters.binary.binaryformatter%28v=VS.100%29.aspx)