Andrew Parker and Jeff Atwood have both had great posts recently about performance as a feature, but I think they've each actually stopped short of a powerful point - improving a product experience by 10 or 20 percent through optimization is great, but there's incredible power when you unlock fundamentally different feature sets through radical optimization. This is an area that the code-first-then-optimize process misses entirely, because incremental improvements on the same basic design will never lead to order-of-magnitude performance improvements.
The power of such performance improvements is one of the most important lessons I learned at Google. A couple examples demonstrate the kind of optimization I'm thinking of:
- In 2004, when the standard storage for webmail was 2MB, Google was able to launch Gmail with 1GB of storage, because GFS provided a means for managing disk that was orders of magnitude cheaper than what the other providers were using. Rumor has it that Yahoo went out and gave NetApp millions of dollars to buy storage devices in order to come anywhere close to what Google was offering. Underlying it all was an approach to storage and disk that was far more efficient than what others in the industry were capable of at the time.
- One of the coolest features of Google Maps is the ability to see a route, then grab it with your mouse and drag it to change the route. That feature is possible because Google developed a radically more efficient route-finding algorithm, years ahead of what anyone else in the market can offer. The difference between computing a route in 1 second and computing it in 10 milliseconds means you can compute a hundred routes in the time it used to take to compute one - enough to re-route continuously as the user drags (see the sketch below).
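To make that arithmetic concrete, here's a back-of-the-envelope sketch in Python. The 50ms interactivity budget and the two latency figures are my own illustrative assumptions, not numbers from Google:

```python
# Back-of-the-envelope: how route-computation latency limits interactivity.
# All numbers are illustrative assumptions, not measurements from Google.

INTERACTION_BUDGET_S = 0.05   # ~50 ms before a drag gesture starts to feel laggy

def routes_per_second(route_latency_s: float) -> float:
    """How many route computations fit in one second at a given latency."""
    return 1.0 / route_latency_s

def supports_drag_to_reroute(route_latency_s: float) -> bool:
    """A drag gesture needs a fresh route well inside the interaction budget."""
    return route_latency_s <= INTERACTION_BUDGET_S

for latency in (1.0, 0.01):  # 1 second vs. 10 milliseconds
    print(f"{latency * 1000:6.0f} ms/route -> "
          f"{routes_per_second(latency):5.0f} routes/sec, "
          f"interactive drag: {supports_drag_to_reroute(latency)}")
```

At 1 second per route, dragging a waypoint is hopeless; at 10 milliseconds, you can recompute the route on essentially every mouse movement.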
It's part of the modern software engineering zeitgeist that "premature optimization is the root of all evil," but as I researched this post, I found that the full Knuth quote is a lot more illuminating than that snippet alone. The statement attributed to Knuth is actually, "We should forget about small efficiencies, say about 97% of the time: premature optimization is the root of all evil" (emphasis mine).
So yeah, we can all agree that "performance is a feature," but that phrase fails to convey the real power of high-performance systems. We can talk about caching database results or using a CDN for static content, and everyone should be doing those things, but let's not be afraid to go much, much deeper. Consider the core operations of your service, then imagine that you could speed them up by 100x - what radically new features would that enable? With those radical new features in mind, start working backwards to figure out how to actually make those 100x improvements.
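To be clear about the contrast: the table-stakes optimizations above are the kind you can bolt on after the fact. A read-through cache in front of the database is roughly an afternoon of work - here's a minimal sketch, where `fetch_user_from_db` and the TTL are hypothetical placeholders:

```python
import time

# A minimal read-through cache sketch. fetch_user_from_db and the TTL are
# hypothetical placeholders; the point is that this class of optimization
# bolts on after the fact, unlike a design-level 100x improvement.

CACHE_TTL_S = 60.0  # serve cached results for up to a minute
_cache: dict[str, tuple[float, dict]] = {}

def fetch_user_from_db(user_id: str) -> dict:
    """Stand-in for a real database query."""
    time.sleep(0.05)  # pretend the query takes 50 ms
    return {"id": user_id, "name": f"user-{user_id}"}

def get_user(user_id: str) -> dict:
    """Return a user record, hitting the database only on a miss or expiry."""
    now = time.monotonic()
    cached = _cache.get(user_id)
    if cached and now - cached[0] < CACHE_TTL_S:
        return cached[1]
    record = fetch_user_from_db(user_id)
    _cache[user_id] = (now, record)
    return record
```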
The big challenge is that these order-of-magnitude optimizations are design optimizations, not the kind of changes you can make after the fact. In design discussions, the engineer arguing for keeping the entire datastore in memory is immediately shouted down with the "premature optimization" line, but I think it's time we start fighting back on behalf of design-time optimization.