Excellent screencasts! Your point about the CPU costs of feature detection is well taken. On mobile devices, though, I thought the real costs were the dearth of RAM and cache space, as well as the abysmal download speeds.
This is just one case, but the Dojo team was able to eliminate nearly three quarters of the code in Dojo Base when they tweaked it to work on WebKit only.
This is why UA Profiling (whether using UA sniffing or not) is critical, imho.
While cache space and RAM are relevant concerns, feature testing in and of itself won’t necessarily strain either. Reducing the size of your library via targeted builds is a great way to address those concerns, and I am a fan of customized builds. However, even with feature profiling you don’t have to jump to UA sniffing as your first option; instead, you can use a handful of feature detections or weak inferences.
Nicely done, that should hopefully give people some perspective in the discussion.
And bonus points for the non-Flash video alternative 🙂
JavaScript Magazine Blog for JSMag » Blog Archive » News roundup: the performance of feature detection, Mobile Firefox 4 performance, W3C Touch Events
[…] Also check out John-David Dalton’s two part video response arguing in favor of feature testing, based on real-world data showing what he argues to be a small […]
Just to reiterate JDD’s second screencast, the order of techniques, from first to last resort, is (see the sketch after the list):
1. Feature Testing: Check for the existence of a method and verify that it returns the correct output
2. Feature Detection: Check for the existence of a method
3. Weak Inference: Check for the existence of an unrelated method
4. User Agent Sniffing: Check the browser’s arbitrary user agent string
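To make the ordering concrete, here is a minimal sketch in JavaScript of all four techniques applied to the same question. The target feature (document.getElementsByClassName) and the querySelectorAll inference are hypothetical examples chosen for illustration, not taken from the screencasts:

```js
// 1. Feature testing: confirm the method exists AND behaves correctly
// by exercising it against a known input.
function supportsGetByClass() {
  if (typeof document.getElementsByClassName != 'function') return false;
  var el = document.createElement('div');
  el.className = 'test-class';
  document.documentElement.appendChild(el);
  var works = document.getElementsByClassName('test-class').length > 0;
  document.documentElement.removeChild(el);
  return works;
}

// 2. Feature detection: only confirm the method exists.
var hasGetByClass = typeof document.getElementsByClassName == 'function';

// 3. Weak inference: infer support from an unrelated method,
// e.g. assuming any browser with querySelectorAll also has it.
var inferredGetByClass = !!document.querySelectorAll;

// 4. UA sniffing: trust the arbitrary (and spoofable) user agent string.
var sniffedGetByClass = /WebKit/i.test(navigator.userAgent);
```

Each step down the list trades reliability for convenience: the feature test is the only one that proves the behavior, while the UA sniff merely guesses at it.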