(Pre)mature optimization and small efficiencies
- Subject: (Pre)mature optimization and small efficiencies
- From: Marcel Weiher <email@hidden>
- Date: Mon, 09 Sep 2013 14:37:22 +0200
On Sep 3, 2013, at 16:54 , Fritz Anderson <email@hidden> wrote:
> On 2 Sep 2013, at 12:47 AM, Marcel Weiher <email@hidden> wrote:
>
>> This gets (mis-)quoted out of context way too much (my emphasis):
>>
>> "We should forget about small efficiencies, say about 97% of the time: premature optimization is the root of all evil”
>>
>> It goes on as follows:
>>
>> "Yet we should not pass up our opportunities in that critical 3%. A good programmer will not be lulled into complacency by such reasoning, he will be wise to look carefully at the critical code; but only ***after*** that code has been identified. It is often a mistake to make ***a priori*** judgments about what parts of a program are really critical, since the universal experience of programmers who have been using measurement tools has been that their intuitive guesses fail.
> ...
>
> This is wisdom. But the aphorism is not even in rhetorical opposition to — it is a reinforcement of — what Knuth restated at length.
Have you read the rest of the paper? It’s about the use of goto in structured programming, almost exclusively for efficiency. Here Knuth gives a little caveat that, yes, some forms of optimization (small efficiencies, applied prematurely) are probably bad (not all the time, mind you, but most of the time). But dammit, it is your duty as an engineer to optimize your code: in any other engineering discipline, leaving even as much as 12% on the table would be considered a crime, let alone 250-350%.
Here’s a longer discussion: http://ubiquity.acm.org/article.cfm?id=1147993
There is lots of gold in there, and my practical experience agrees with it pretty much 100%. The quality/performance of software, including Apple’s, has suffered greatly from the misquotation and misapplication of this idea. I can’t tell you how many times I’ve seen heroic optimization efforts aimed at squeezing 10% performance out of a bad design, when a better design that had considered performance beforehand would not just have been cleaner, but also 10x faster, without any need for heroics. In other words: heroic efforts that made software barely tolerable when it should have been awesome without them.
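To make that concrete, here is a toy sketch (invented for illustration, not from any actual codebase). Tuning the first version can only shave constants off a quadratic design; the second changes the complexity class, and is simpler to boot:

#import <Foundation/Foundation.h>

// Quadratic design: -containsObject: scans the whole array, so
// deduplicating n items costs O(n^2). Heroic tuning of this loop can
// only shave constants off a fundamentally slow shape.
NSArray *UniqueSlow(NSArray *items) {
    NSMutableArray *seen = [NSMutableArray array];
    for (id item in items) {
        if (![seen containsObject:item]) {
            [seen addObject:item];
        }
    }
    return seen;
}

// Linear design: NSMutableSet membership checks are O(1) on average,
// so the whole pass is O(n). Cleaner, and faster by construction.
NSArray *UniqueFast(NSArray *items) {
    NSMutableSet *seen = [NSMutableSet set];
    NSMutableArray *result = [NSMutableArray array];
    for (id item in items) {
        if (![seen containsObject:item]) {
            [seen addObject:item];
            [result addObject:item];
        }
    }
    return result;
}

For large inputs the second version wins by orders of magnitude, no profiler heroics required.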
One example I’ve come across very often is misuse of databases: I’ve built systems 1000x faster simply by leaving out the database (and simpler as well). How much software today shows loading/saving progress bars on tiny documents? Huh? Numbers, when it came out, couldn’t handle 3 pages of, er, numbers. Today it’s no longer comically bad, but shouldn’t it really be excellent or awesome on computers that are probably 100x faster than a Cray 1 supercomputer?
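As a minimal sketch of what leaving out the database can mean for a tiny document (the function names and the dictionary-as-document shape are invented for illustration): serialize the whole thing to a flat file in one shot. One atomic write, no schema, nothing to show a progress bar about.

#import <Foundation/Foundation.h>

// Save a small property-list document (e.g. a title plus rows of
// numbers) with a single encode and a single atomic write.
BOOL SaveDocument(NSDictionary *doc, NSURL *url, NSError **error) {
    NSData *data = [NSPropertyListSerialization dataWithPropertyList:doc
                        format:NSPropertyListBinaryFormat_v1_0
                       options:0
                         error:error];
    if (!data) return NO;
    return [data writeToURL:url options:NSDataWritingAtomic error:error];
}

// Load is the inverse, again in one shot.
NSDictionary *LoadDocument(NSURL *url, NSError **error) {
    NSData *data = [NSData dataWithContentsOfURL:url options:0 error:error];
    if (!data) return nil;
    return [NSPropertyListSerialization propertyListWithData:data
                       options:NSPropertyListImmutable
                        format:NULL
                         error:error];
}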
And even the correct quote probably isn’t as valid these days as it was back then. Profiles are much flatter today, partly due to additional layers of abstraction and indirection, so identifying the performance issues in your software is much harder than just bringing up the profiler and fixing the top 2-3 entries.
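Measuring is still where you have to start, of course. Even a throwaway harness like this one (the Timed helper is my own invention, and no substitute for Instruments) will tell you whether a suspected hot spot matters at all before you spend effort on it:

#import <Foundation/Foundation.h>

// Wrap a block and print its wall-clock time. Crude, but enough to
// confirm or refute an intuitive guess before optimizing.
void Timed(NSString *label, void (^block)(void)) {
    CFAbsoluteTime start = CFAbsoluteTimeGetCurrent();
    block();
    NSLog(@"%@: %.6f s", label, CFAbsoluteTimeGetCurrent() - start);
}

// Usage: Timed(@"dedup", ^{ (void)UniqueFast(someBigArray); });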
Cheers,
Marcel