Scalable solutions and total cost

Chad writes:

On more occasions than I like, I've been asked by pointy-hairs why my initial efforts in a given project take "so long" [1]. My mentality with regards to almost anything development-related is to put forth more effort initially so that I can put forth less later on. If you compound the time savings for subsequent uses of your project, your initial efforts will look very small.

Yes. It's similar to what Eli talked about in his Lazy Programming stories. I touched on something like this in one of my responses, and I'm planning a story on related topics.

I guess in some respects it comes down to being pragmatic and being able to identify what kind of task is likely to be repeated. I had a friend in college who, annoyed at the tedious nature of his mathematics class, wrote a program to do all his homework for him (producing the required worked output, not just the final answers). It took him a fair amount of time up front, but it made doing his homework a much more scalable process.

I've dealt with a similar issue at work recently, though the potential return was a little more obvious there - the project, by design, was creating a process that would be repeated n times. In my mind, it's a throwback to my Theory of Computation and Algorithms courses, and a fairly obvious decision. From an efficiency perspective, I'd much rather trade a cost that grows with n for a fixed, one-time cost instead.

The only trick is being able to recognize when the fixed cost becomes too high to ever make up on the other side (in other words, when n will never grow large enough to make the fixed-cost investment worthwhile).
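The break-even reasoning above can be sketched as a quick calculation. This is a minimal illustration, not anything from the original discussion - the function name and the example numbers are hypothetical:

```python
def break_even_n(fixed_cost: float, manual_cost: float, automated_cost: float) -> float:
    """Smallest n at which automation beats doing the task by hand.

    Total manual cost:     n * manual_cost
    Total automated cost:  fixed_cost + n * automated_cost
    The investment pays off once n exceeds fixed_cost / per_use_saving.
    """
    per_use_saving = manual_cost - automated_cost
    if per_use_saving <= 0:
        return float("inf")  # automation never pays off
    return fixed_cost / per_use_saving

# Hypothetical numbers: 10 hours to build a tool that cuts a
# 2-hour manual task down to 15 minutes per repetition.
n = break_even_n(fixed_cost=10.0, manual_cost=2.0, automated_cost=0.25)
print(n)  # ~5.71, so the tool pays for itself by the 6th repetition
```

The `inf` return captures the trap in the paragraph above: if automating doesn't actually save time per use, no value of n ever justifies the fixed cost.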

Reminds me of a tool that Kent and Duncan have blogged about.