
New Interesting translator
Of course, any company that says they will translate my COBOL to K & R Java is probably a bit flakey.

Wow. Kernighan and Ritchie Java. I knew they were visionaries, I just didn't know how far they saw.

No, we need to get away from PL/1 also. Or is it PL/I? I'm never sure.

Ageism? Hey, sounds like you are accusing us of age discrimination! We love our old fogeys. Here is a bullet point in my proposal:

*) We would like to retain current staff for as long as possible, to minimize the loss of institutional knowledge.
New Can some of that knowledge be codified?
Technically it'd probably be better to document most of the "business rules" and then re-write from scratch.

For one, it can be very useful to explicitly spell out the assumptions, requirements, etc., though this isn't typically fun or exciting work.

On a much smaller scale, I'm involved with some of this myself: documenting all the measurements one of our machines can do (and discovering a lot of nuances along the way), and re-doing some of the software, getting some 16-bit MSVC 1.52 code to work in a 32-bit environment with very different assumptions. So far I've found that the guys who did the new environment have dropped quite a bit of needed functionality (although it'll probably be added back in before Christmas).
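
For what it's worth, the classic trap in that kind of port (a made-up sketch, not anything from the actual code) is source that quietly assumes int is 16 bits:

    /* Hypothetical example of a 16-bit-era assumption.  Under MSVC 1.52,
       int is 16 bits, so the counter wraps at 65535 and the struct is
       32 bytes; rebuilt for a 32-bit environment, the same source gives
       a different answer and a different on-disk record size. */
    #include <stdio.h>

    struct record {
        unsigned int id;    /* 2 bytes under 16-bit MSVC, 4 bytes under 32-bit */
        char name[30];
    };

    int main(void)
    {
        unsigned int counter = 65535u;
        counter++;   /* wraps to 0 on a 16-bit int, becomes 65536 on 32-bit */
        printf("counter = %u, record size = %u\n",
               counter, (unsigned)sizeof(struct record));
        return 0;
    }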

An accounting system with a 4GL (like AppGen) would be one approach, but you'd still have issues with vendor tie-in and finding qualified people.

Tony
New Re: Can some of that knowledge be codified?
Tony, the problem is that people do not, as a rule, document the mistakes they've made, what they learned from them, and how they fixed them. Same thing applies to assumptions made to bound a problem. So "why things are the way they are" is lost when the developer leaves, even when "what is" is well documented. This makes it tough for someone new on a project to differentiate between what is "holy" and what can be changed. And so, new people often repeat the mistakes once made by their predecessors and re-learn (from the organization's point of view) the lost lesson.

There is probably a tedious process that could be used to minimize this state of affairs, but it must be foreign to human nature.

Experience is knowing you've made the same mistake again. :)
Alex

Men never do evil so completely and cheerfully as when they do it from religious conviction. -- Blaise Pascal (1623-1662)
New Re: Can some of that knowledge be codified?
I have been known, when trying to fix code, to document crap code to the best of my understanding. There are a couple of times when I might have gone to extremes (the comments documenting the fix were at least a screenful in length), but at least whoever comes after has the benefit of what I managed to decipher.
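
To invent an example (the bug, the field, and the function are all made up, not from any real system), the kind of comment I mean reads something like:

    #include <stdlib.h>

    /* WHY THIS LOOKS WRONG BUT ISN'T (deciphered the hard way):
     * the upstream export right-justifies order totals in a fixed-width
     * field and pads the left with NUL bytes instead of spaces.  Feeding
     * that straight to strtod() returns 0.0, because strtod() sees an
     * empty string.  So we copy the field, turn the NUL padding into
     * spaces, and parse the copy.  Do NOT "fix" the export instead: two
     * other downstream programs depend on the padding staying as NULs.
     */
    double parse_total(const char *field, size_t width)
    {
        char buf[64];
        size_t i;

        if (width >= sizeof buf)
            width = sizeof buf - 1;
        for (i = 0; i < width; i++)
            buf[i] = (field[i] == '\0') ? ' ' : field[i];
        buf[width] = '\0';
        return strtod(buf, NULL);
    }
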
"Beware of bugs in the above code; I have only proved it correct, not tried it."
-- Donald Knuth
New Interesting article for you
Yes, it is possible for humans to codify not just the what, but also the why of what they write. Take a look at [link|http://www.fastcompany.com/online/06/writestuff.html|how NASA does it].

The conclusions shouldn't be surprising, though you don't see them often:
*) Programming is a profession, and should be done in a professional manner.
*) Programming isn't "special," in need of gurus and all-nighters.
*) Nothing important should be left to a single person from beginning to end, ever.

Not that this is much direct help for your situation, but a good background against which to make decisions.
We have to fight the terrorists as if there were no rules and preserve our open society as if there were no terrorists. -- [link|http://www.nytimes.com/2001/04/05/opinion/BIO-FRIEDMAN.html|Thomas Friedman]
New No way
If you have an almost infinite budget, and lives at stake, maybe. But the concept of "good enough" is a real one, and it's a grey area. NT bluescreens are good enough for some people, 1 year of uptime for others.

What are you willing to pay for?
What are your customers willing to pay for?
How long are you willing to wait?
What is good enough for the guy writing the check?
End of story.
New The question was, "Is it possible?"
The thing I like about the article is that it questions the assumption that just about everyone seems to make: that IT in general, and software development in particular, is done the way it is because it's the only way it can be done. This article shows that, given the proper incentives, it can be done as "flawlessly" as other "critical" functions.

With very rare exceptions, multi-million dollar projects should not be entrusted to "kids" fresh out of school, as many seem to be. When management is ready to treat IT as a critical function, maybe they will put the money into it.

None of this changes your situation, just a rant. I'm sure I'm not the only one who's tired of being told it has to be right, and there's no more time or money in the budget.
We have to fight the terrorists as if there were no rules and preserve our open society as if there were no terrorists. -- [link|http://www.nytimes.com/2001/04/05/opinion/BIO-FRIEDMAN.html|Thomas Friedman]
New Right on, Drew!
"There's not enough time to do it right the first time". "But, there's always time to do it over again". And, again, and again,...
Alex

Men never do evil so completely and cheerfully as when they do it from religious conviction. -- Blaise Pascal (1623-1662)
New I used to think I was REALLY good
This meant I really cared about getting the design and code right, and bugs were a personal failure.

I accepted that I would have a rare one, and that it would be fixed immediately.

This was in about 30,000 lines of pointer-heavy C and Oracle Pro*C code. Bugs meant core dumps. It was rare that a user would experience it, since the OS would kill my program way before then.

This was when I was responsible for the design and coding of an in-house editorial system that had 30 active editorial users, and produced monthly data dumps that were then published on CompuServe, on CD-ROM, and in a printed book.

I had 1 bad output in 3 years, and it was fixed in 24 hours. My users never lost one byte of data, even in the event of hardware failure.

Now I do junk mail. Dead trees. Marketing databases. Fuzzy, yucky data, where there is a percentage of known bad data, and as long as it is under the magical max, it is ok.

ok.

ok.

ewwwwww.

I gotta go wash my hands.

But:

I've got to hit the print date.

I've got hundreds of people at various companies sitting and idling if I miss the print date.

So as long as it is good enough, it really is "good enough".
New Your exception IMO proves the rule
Someone looks at the cost to produce cleaner data, and where this line crosses the cost of delaying the mailing. At that point, it's "good enough." I strongly suspect most decisions don't have nearly this clear a metric by which to define "good enough," yet the call is made anyway.
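
Just to put made-up numbers on it (every figure below is invented): each extra day of scrubbing costs staff time, recovers some wasted postage, and, once the slack runs out, starts slipping the print date. "Good enough" is the last day where another day of scrubbing still pays for itself:

    /* Hypothetical break-even sketch for the cleaner-data-vs-delay call. */
    #include <stdio.h>

    int main(void)
    {
        const double staff_per_day = 2000.0;   /* invented scrubbing cost */
        const double saved_per_day = 2600.0;   /* invented postage recovered */
        const double delay_per_day = 15000.0;  /* invented cost once the date slips */
        const int    slack_days    = 2;        /* days before the print date moves */

        for (int day = 1; day <= 6; day++) {
            double cost = staff_per_day + (day > slack_days ? delay_per_day : 0.0);
            printf("day %d: marginal cost %8.0f, marginal benefit %8.0f -> %s\n",
                   day, cost, saved_per_day,
                   saved_per_day >= cost ? "keep scrubbing" : "ship it");
        }
        return 0;
    }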

Beyond that, my comments were more directed to the process than the actual decision. If, as I said, someone is willing to trust a project to a wet, pink newbie, they have just proven through action that they don't really think it's that important. You are obviously not a newbie (though I won't comment on your pinkness or wetness). They are willing to pay for you. When they give your job to a PFY to save money, that's when "good enough" is a fool's bargain.
We have to fight the terrorists as if there were no rules and preserve our open society as if there were no terrorists. -- [link|http://www.nytimes.com/2001/04/05/opinion/BIO-FRIEDMAN.html|Thomas Friedman]
New Interesting article indeed. Thanks!
It takes money, time, and discipline to do it right. The process needs a high perceived cost of failure to make it work. Can you imagine being able to sue Microsoft for damages from every Outlook exploit that one suffers? The current license absolves them of responsibility, and there is no associated cost.

However, you may recall the NASA fast-track Mars mission fiasco where one sub-system used the metric system of measurement and another the English (actually just the US and one third-world country) system. (If you read this, control yourself, Andrew!)
Alex

Men never do evil so completely and cheerfully as when they do it from religious conviction. -- Blaise Pascal (1623-1662)
New Re: PL/1 or PL/I? It is PL/I.
The I being a Roman numeral - "pee el one" - Programming Language, One. IBM's ultimate language in the mid-1960s, combining Fortran, COBOL, and who knows what else. All things to all people.

Sorry for the brain fart. It's been a long time since I've given it any thought.
Alex

Men never do evil so completely and cheerfully as when they do it from religious conviction. -- Blaise Pascal (1623-1662)
     Thus the great PHB spake: - (broomberg) - (17)
         Definitely not C++ - (tonytib)
         For one thing COBOL is in some sense a subset of PL/1. - (a6l6e6x) - (12)
             Interesting translater - (broomberg) - (11)
                 Can some of that knowledge be codified? - (tonytib) - (9)
                     Re: Can some of that knowledge be codified? - (a6l6e6x) - (8)
                         Re: Can some of that knowledge be codified? - (wharris2)
                         Interesting article for you - (drewk) - (6)
                             No way - (broomberg) - (4)
                                 The question was, "Is it possible?" - (drewk) - (3)
                                     Right on, Drew! - (a6l6e6x)
                                     I used to think I was REALLY good - (broomberg) - (1)
                                         Your exception IMO proves the rule - (drewk)
                             Interesting article indeed. Thanks! - (a6l6e6x)
                 Re: PL/1 or PL/I? It is PL/I. - (a6l6e6x)
         Domino/Db2 ass end webforms front end - (boxley)
         DB/2 on the MF partition, apps on the Linux partition? - (CRConrad) - (1)
             The lobsters are coming! - (pwhysall)
