
You misremember that article
They weren't burning off natural gas to make a profit. They were burning it off to keep it from exploding, because they have no way to ship it anywhere without pipelines. And they can't build pipelines because any pipeline they build will be blown up.

As for the program, measuring cost and revenue is also the wrong thing to do. What you're really creating with a process like that is reputation. And that is probably impossible to measure.

But let's say that we did it your way. Let's measure cost and revenue. What you'll find very quickly is that switching to processes that run up technical debt shows up well in your measurements. And it will continue to show up well until you have to address that technical debt. But the entire idea of "technical debt" is something that you can't easily measure or estimate. You can't measure the ever-increasing difficulty of development. Good luck determining the cost of turnover, because good programmers can't have pride of ownership of crap.
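Here's a toy model of that effect, with made-up numbers: a "rushed" process ships more features per release, but each release of accumulated debt makes the next feature more expensive to build, while a "careful" process keeps its cost flat. Take a cost-and-revenue snapshot after the first couple of releases and the rushed process looks like the clear winner. Keep measuring and it doesn't.

    # Toy model only: every number here is invented to illustrate the point above.
    RELEASES = 8
    REVENUE_PER_FEATURE = 10_000   # assumed revenue from each shipped feature
    BASE_COST_PER_FEATURE = 4_000  # assumed cost of a feature on a clean code base

    def cumulative_profit(features_per_release, debt_growth):
        """Cumulative (revenue - cost) after each release.

        debt_growth is the fraction by which the per-feature cost rises each
        release as technical debt piles up (0.0 means no debt).
        """
        profit = 0.0
        cost_per_feature = BASE_COST_PER_FEATURE
        history = []
        for _ in range(RELEASES):
            profit += features_per_release * (REVENUE_PER_FEATURE - cost_per_feature)
            cost_per_feature *= 1 + debt_growth  # debt makes every later feature dearer
            history.append(round(profit))
        return history

    careful = cumulative_profit(features_per_release=3, debt_growth=0.0)
    rushed = cumulative_profit(features_per_release=4, debt_growth=0.25)

    for release, (c, r) in enumerate(zip(careful, rushed), start=1):
        leader = "rushed" if r > c else "careful"
        print(f"release {release}: careful={c:>8,} rushed={r:>8,} -> {leader} looks better")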

While I agree that what they're doing leads to some stupidity, what you're suggesting that they do leads in an obvious way to the worst MBA mismanagement practices. Which is a lot worse.

Cheers,
Ben
I have come to believe that idealism without discipline is a quick road to disaster, while discipline without idealism is pointless. -- Aaron Ward (my brother)
Do you think everyone not you is stupid?
You're so quick to point out how I didn't understand what I read that you didn't bother to read what I wrote.
You misremember that article. They weren't burning off natural gas to make a profit.
Really? I misremember? Hmm, let's see what I actually wrote:
The people responsible for oil production have a mandate to make as much money as possible. In doing so, they are screwing the electric plants that could really use all the natural gas being burned off at the wells. *But the oil guys aren't measuring natural gas, they're measuring oil prices.*
See that highlighted part there? That's where I point out that they didn't care about the gas. Nowhere did I suggest that they thought burning it off somehow brought them profit.

And since you seem to have missed it, my point was that if you aren't measuring the eventual impact of your changes you're not measuring the right thing. See how the Iraq story is an example of that? They're measuring their piece of the project and ignoring the bottom line. Gosh, that sounds almost like my point, how you can optimize the part at the expense of the whole.

Okay, so you didn't bother to read everything; at least you wouldn't jump to conclusions about how stupid I want to be. Oh wait! Here's where you do exactly that:
But let's say that we did it your way. Let's measure cost and revenue. ... what you're suggesting that they do leads in an obvious way to the worst MBA mismanagement practices. Which is a lot worse.
Gosh, when you put it that way it does sound like a bad idea to measure only cost and revenue. I wish I hadn't said that. Oh wait (again)! I didn't say that. I said:
Any measurement that doesn't include -- or get included in -- a measurement of the end product is meaningless.
Feel free to expound again on how badly I want to screw things up by ignoring the quality of the tools being developed. Or you could take a fresh approach and address the central point I've been trying to make: that if you fail to measure the output of your new process or application, you can't really measure its success.
No. Only when they ignore important stuff.
For a start, you're referring to an article on Iraq. Perhaps [http://spectrum.ieee.org/feb06/2831/3|page 3] of that article will refresh your memory. It doesn't matter what price the Ministry is willing to offer - there is no way to deliver the natural gas to where it is needed.

So while I understand perfectly well what you were trying to express with that example, you misremembered the facts. The case you noted isn't actually an example of what you thought it was. Sure, screwed up price incentives are a big part of Iraq's energy crisis. But at a different stage - they need to give consumers a realistic price for electricity so that people will conserve a bit. If people conserved just a bit, their electric output would stretch a lot farther than it does.

Now the entire cost versus revenue thing. I used cost and revenue in my post because I was responding to http://z.iwethey.org/forums/render/content/show?contentid=253146 where you said we should measure cost and revenue. Given that I was responding to a post where you bring up cost and revenue as critical factors, it made sense to me to point out how directly measuring cost and revenue is going to lead you astray.

Now let's go to your central point.

The problem with your thesis is that while it sounds great on paper to just measure the right thing, it is impossible in practice to nail down what that is. Try as you like, I guarantee that you won't come up with any unambiguous measurement that measures the right thing. And in the process of trying you'll introduce so many potential fudge factors that it will be impossible to do an apples to apples comparison of anything.

Don't believe me? Well, let me give it an attempt so you can see how it goes.

Any decent accountant would tell you that the problem with measuring cost and revenue directly is that you're doing cash-based accounting. It is very easy to manipulate figures with cash-based accounting, and cash misses a whole ton of important factors. What you actually want to do is accrual accounting.

So let's try to do accrual accounting on software development. Some significant assets we gain from doing a software project are libraries that we can reuse and knowledge that may help future projects. A significant cost we take on is the ongoing maintenance cost of supporting the existing software. All three factors are very important. (For instance, ongoing maintenance typically costs more than initial delivery of the project.) However, when a project finishes, *none* of them can be reliably estimated. So our attempted measurement just winds up as a series of big question marks. Attempt to fill in those question marks, and you'll find that your measurements are completely dominated by the assumptions that went into those estimates. (A rough numeric sketch of this follows at the end of this post.)

And that is the whole point of looking at bug counts. Sure, it is imperfect. But it is better than nothing, and is better than any easy alternative that you're likely to come up with. Furthermore, measuring bug counts has the following very concrete advantages:

1. Bug counts are a good measure of software reliability. This is something that people tend to value fairly highly.
2. Bug counts are a fairly good proxy for the cost of ongoing maintenance. Given that maintenance typically is the bulk of the cost of software, this makes them strongly correlated with the real cost of development.
3. In studies, developers who are asked to optimize for reliability do pretty well in most other measures of the software development process, including development speed and software speed. By contrast, developers who attempt to optimize for other characteristics tend to do well in the chosen metric, but fairly badly in most other metrics.

Therefore reducing bug counts pretty directly improves two key software characteristics (reliability and maintenance cost), while tending to make you reasonably good at other important characteristics (development speed and software speed). To the best of my knowledge, focusing on any other simple metric will give far more complex results. And trying to focus on a complex metric opens up so many grey areas that you have no hope of getting clear understanding or buy-in on what you're trying to improve on. (And little hope that you're actually measuring something that does what you really want to do.)

Therefore focusing on reducing bug counts doesn't sound like a particularly stupid idea to me. (Perhaps I've just read - and believed - too much Steve McConnell...)

Cheers,
Ben
I have come to believe that idealism without discipline is a quick road to disaster, while discipline without idealism is pointless. -- Aaron Ward (my brother)
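A rough numeric sketch of the "dominated by assumptions" point above, with invented figures: book the same finished project three ways, using low, middle, and high guesses for the three things that can't be reliably estimated (reusable libraries, knowledge gained, future maintenance). The cash figures never change, yet the verdict swings from a clear win to a heavy loss purely on the strength of the guesses.

    # Rough sketch only: accrual-style bookkeeping for one finished project,
    # with every figure invented for illustration.
    CASH_REVENUE = 500_000  # revenue actually booked for the delivered project
    CASH_COST = 350_000     # cash actually spent delivering it

    # Guesses for the unmeasurable items:
    # (value of reusable libraries, value of knowledge gained, future maintenance cost)
    scenarios = {
        "optimistic": (150_000, 100_000, 200_000),
        "middling": (50_000, 30_000, 250_000),
        "pessimistic": (10_000, 5_000, 900_000),
    }

    for name, (libraries, knowledge, maintenance) in scenarios.items():
        net = (CASH_REVENUE - CASH_COST) + libraries + knowledge - maintenance
        verdict = "looks like a win" if net > 0 else "looks like a loss"
        print(f"{name:>11}: net value {net:>9,} -> {verdict}")
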
     When 6 Sigma doesn't apply to IT projects - (dbishop) - (16)
         Well, on the flipside of this... - (folkert)
         Don't knock it quite so quickly - (ben_tilly) - (12)
             Answering both you and Greg - (dbishop) - (11)
                 That's not my understanding. - (Another Scott) - (2)
                     I didn't say there's no place for it - (dbishop) - (1)
                         Good points. -NT - (Another Scott)
                 Measure what you want to improve - (ben_tilly) - (7)
                     And that should be cost or revenue - (dbishop) - (4)
                         As I recall, at IBM, peer code reviews to catch - (a6l6e6x)
                         You misremember that article - (ben_tilly) - (2)
                             Do you think everyone not you is stupid? - (dbishop) - (1)
                                 No. Only when they ignore important stuff. - (ben_tilly)
                     Many elements of software development are stochastic - (dws) - (1)
                         One must keep balance. - (ben_tilly)
         Let's start again. - (Another Scott) - (1)
             Your point #2 is pretty much my whole point - (dbishop)
