I completely agree. My point wasn't to make web development and other distributed development models special (I was merely following the example given).

What that means, though, is that for a particular type of technology (whether that's web development, client/server, enterprise integration, etc.) it takes time for the tools to reach a point where inexperienced or even non-programmers can produce applications quickly and with acceptable quality (what counts as "acceptable" is, of course, debatable).

Until the tools reach that point, developers experienced with the technology will be in high demand (assuming the technology itself is in high demand). From a highly skilled developer's perspective, though, sticking with a particular technology until it's been "tooled out" isn't necessarily a good idea. There are examples both for and against this: COBOL, which everyone claimed was dead, is still fairly big (even after the spike in 1998-9); C (not C++) development has tapered off over the years (replaced mostly by C++/C#/Java), though it still holds a fairly large niche (Linux, Unix, drivers, etc.); VB and Delphi relegated PowerBuilder (and to some extent Visual C++) to niche-sized markets; and, of course, Office, with VBA, macros, Access, etc., has made programmers out of power users. None of this is new, of course.

I guess my point is that, for the past 20 years or so, there have been waves of technologies that required experienced developers, admins, etc. to scratch a business itch. Those who have ridden those waves have done well (financially), but those who have ridden them too long have found it harder to jump onto the next wave as demand for their skills declined.

Dan