A developer I used to work with disagreed with this on the grounds that calling a function, and potentially passing the necessary data, incurred enough overhead to become significant to execution time.


My rationalization alarm triggered on this one!

Scott is, as usual, right that various interpreted languages (PL/SQL, and I'd add VB in all its incarnations) can certainly load the overhead onto a resource-starved application. However, I work primarily in the real-time embedded space, where we are always starved for resources: not enough processor, not enough ROM, and not enough RAM. In over 27 years, I cannot think of a single instance where modularizing the code resulted in a missed or overtime event, a clobbered interrupt, or anything else that would indicate that "overhead be[came] significant to execution time."

Basically, the guy is spouting bullshit.