Post #20,765
12/8/01 12:54:09 AM
|
Thus the great PHB spake:
Barry, go forth and figure out my alternatives.
We got the mainframe. We will be installing a Linux partition with a dedicated CPU (performance WILL suck, doesn't matter).
We got a GABILLION lines of COBOL / PL1 / BAL / CICS programs. There is much overlap, and the data they use is all over the place. There is no relational repository for corp data.
We have a target of getting rid of COBOL, not because it sucks (it does), but because COBOL programmers are old and dying, and schools aren't churning them out.
We have a target of converting all VSAM and other flat-file processing to storage in a relational database such as DB2.
The word "Websphere" is firmly embedded in PHB's head, so that WILL be the coding environment. We AIN'T got no Java programmers. My limited experience does not count, and I work for a profit center, which means I really don't want to deal with MIS concerns, but if I allow them to screw it up, it could KILL the company, so I won't let them.
My initial reaction was to go to UniKix, but that does not satisfy the escape from COBOL.
I certainly don't trust these people to do C or C++ or Java. This would be at LEAST a 10 person 3 year effort, which really means that the PHB will get to retire just as the project is tanking.
I'd trust them less to do PERL. Well, not exactly. If PERL had COBOL style record layouts, it might be possible. Think they might be added in the next 6 months?
And VB (acch, phhht) was mentioned.
One of the benefits of COBOL is simple record-oriented programming. And that it is incredibly difficult to do complex things, so people usually DON'T, at least in my shop.
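For what it's worth, that record orientation ports fairly directly: a COBOL layout is just offsets and widths. A minimal sketch (in Python, with a made-up order record; the copybook, field names, and widths are all invented for illustration):

```python
# Hypothetical fixed-width layout, mirroring a COBOL record such as:
#   01 ORDER-REC.
#      05 ORDER-ID   PIC X(8).
#      05 CUST-NAME  PIC X(20).
#      05 QTY        PIC 9(5).
FIELDS = [("order_id", 0, 8), ("cust_name", 8, 28), ("qty", 28, 33)]

def parse_record(line):
    """Slice one fixed-width record into a dict, trimming pad blanks."""
    rec = {name: line[start:end].strip() for name, start, end in FIELDS}
    rec["qty"] = int(rec["qty"])  # numeric PIC fields become ints
    return rec

rec = parse_record("ORD00042Acme Widget Co      00017")
print(rec["order_id"], rec["qty"])  # ORD00042 17
```

The point being: whatever the target language, the layouts themselves aren't the hard part of the port.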
So what are my options? Some highly restrictive 4GL? Argue FOR COBOL, at least for a while, in a UniKix like environment, and then go for a slow port away to what language?
Waffle for a bit, get research time, then forget about it and find a new job?
These systems are used for order tracking, inventory, billing, accounting (somewhat, we have PC based accounting, but they talk back and forth with certain data), customer service, etc. There are no industry packages for us because we are one of about 6 companies that do what we do.
|
Post #20,782
12/8/01 11:52:24 AM
|
Definitely not C++
C++ is way too complicated. Java is a better choice than C/C++/VB.
If you think they could learn OO, IBM's VA-ST might be a good option -- and it should work with WebSphere. There are probably tools available for it (and for Java) to make relational database mapping to classes easier.
VB is not a good option -- #1 reason is that it changes whenever MS wants it to, not when you need it to. And, of course, it's extremely limited in its scalability and portability.
The problem with a highly restrictive 4GL is that most of those are proprietary, so you still have the problem of vendor lock-in and problems with finding qualified developers.
Maybe a high level language and CASE tools (to make database mapping easier and make it harder for the developers to screw up) would work better.
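The row-to-class mapping those tools automate is conceptually pretty small. A rough sketch of the idea, using Python with an in-memory SQLite table as a stand-in for DB2 (the CUSTOMER table and its columns are invented for the example):

```python
import sqlite3

class Customer:
    """Plain object mapped from one row of a hypothetical CUSTOMER table."""
    def __init__(self, cust_id, name, balance):
        self.cust_id, self.name, self.balance = cust_id, name, balance

def fetch_customer(conn, cust_id):
    """Map one row to a Customer, or None if the key is absent."""
    row = conn.execute(
        "SELECT cust_id, name, balance FROM customer WHERE cust_id = ?",
        (cust_id,)).fetchone()
    return Customer(*row) if row else None

# Demo against an in-memory table standing in for the real repository.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE customer (cust_id INTEGER, name TEXT, balance REAL)")
conn.execute("INSERT INTO customer VALUES (1, 'Acme Widget Co', 250.0)")
c = fetch_customer(conn, 1)
print(c.name, c.balance)  # Acme Widget Co 250.0
```

The tools mostly just generate and maintain this boilerplate for every table, which is exactly the kind of thing you don't want a shaky team hand-writing a thousand times.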
Good luck,
Tony
|
Post #20,792
12/8/01 6:00:09 PM
|
For one thing COBOL is in some sense a subset of PL/1.
If you want to reduce your language base by 1, porting COBOL applications to PL/1 should be relatively easy. PL/1 is much richer (and therefore more complicated) and can be used for complex things as well. I did image processing with it back in the 1970s. However, PL/1 is mostly single-vendor (IBM, although Unisys may still be in there) and suffers the same ageism problem (vanishing programmers) as COBOL.
Now, moving to Java can even be [link|http://www.mpsinc.com/mpsjava.html|automated], but I would carefully review and test the results. Note that PL/1 was not on that list.
Alex
Men never do evil so completely and cheerfully as when they do it from religious conviction. -- Blaise Pascal (1623-1662)
|
Post #20,796
12/8/01 7:24:10 PM
|
Interesting translator
Of course, any company that says they will translate my COBOL to K & R Java is probably a bit flakey.
Wow. Kernighan and Ritchie Java. I knew they were visionaries, I just didn't know how far they saw.
No, we need to get away from PL/1 also. Or is it PL/I? I'm never sure.
Ageism? Hey, sounds like you are accusing us of age discrimination! We love our old fogeys. Here is a bullet point in my proposal:
*) We would like to retain current staff for as long as possible, to minimize the loss of institutional knowledge.
|
Post #20,801
12/8/01 8:09:47 PM
|
Can some of that knowledge be codified?
Technically it'd probably be better to document most of the "business rules" and then re-write from scratch.
For one, it can be very useful to explicitly spell out the assumptions, requirements, etc, though this isn't typically fun or exciting work.
On a much smaller scale, I'm involved with some of this. I'm documenting all the measurements one of our machines can do, and discovering a lot of nuances along the way. I'm also involved with re-doing some of the software, getting some 16-bit MSVC 1.52 code to work in a 32-bit environment with very different assumptions. So far I've found that the guys who did the new environment dropped quite a bit of needed functionality (although it'll probably be added back in before Christmas).
An accounting system with a 4GL (like AppGen) would be one approach, but you'd still have issues with vendor tie-in and finding qualified people.
Tony
|
Post #20,810
12/9/01 12:09:14 AM
|
Re: Can some of that knowledge be codified?
Tony, the problem is that people do not as a rule document the mistakes they've made, what they learned from them, and how they fixed them. Same thing applies to assumptions made to bound a problem. So "why things are the way they are" is lost when the developer leaves, even when "what is" is well documented. This makes it tough for someone new on a project to differentiate between what is "holy" and what can be changed. And so, new people often repeat the mistakes once made by their predecessors and (again, for the organization) learn the lost lesson.
There is probably a tedious process that could be used to minimize this state of affairs, but it must be foreign to human nature.
Experience is knowing you've made the same mistake again. :)
Alex
Men never do evil so completely and cheerfully as when they do it from religious conviction. -- Blaise Pascal (1623-1662)
|
Post #20,832
12/9/01 3:55:46 PM
|
Re: Can some of that knowledge be codified?
I have been known, when trying to fix code, to document crap code to the best of my understanding. There are a couple of times when I might have gone to extremes (the comments documenting the fix were at least a screenful in length), but at least whoever comes after has the benefit of what I managed to decipher.
"Beware of bugs in the above code; I have only proved it correct, not tried it." -- Donald Knuth
|
Post #20,925
12/10/01 2:34:34 PM
|
Interesting article for you
Yes, it is possible for humans to codify not just the what, but also the why of what they write. Take a look at [link|http://www.fastcompany.com/online/06/writestuff.html|how NASA does it].
The conclusions shouldn't be surprising, though you don't see them often: Programming is a profession, and should be done in a professional manner. Programming isn't "special," in need of gurus and all-nighters. Nothing important should be left to a single person from beginning to end, ever.
Not that this is much direct help for your situation, but a good background against which to make decisions.
We have to fight the terrorists as if there were no rules and preserve our open society as if there were no terrorists. -- [link|http://www.nytimes.com/2001/04/05/opinion/BIO-FRIEDMAN.html|Thomas Friedman]
|
Post #21,036
12/11/01 3:48:58 PM
|
No way
If you have an almost infinite budget, and lives at stake, maybe. But the concept of "good enough" is a real one, and it is a grey area. NT bluescreens are good enough for some people, 1 year uptime for others.
What are you willing to pay for? What are your customers willing to pay for? How long are you willing to wait? What is good enough for the guy writing the check? End of story.
|
Post #21,046
12/11/01 4:26:23 PM
|
The question was, "Is it possible?"
The thing I like about the article is that it questions the assumption that just about everyone seems to make: that IT in general, and software development in particular, is done the way it is because it's the only way it can be done. This article shows that, given the proper incentives, it can be done as "flawlessly" as other "critical" functions.
With very rare exceptions, multi-million dollar projects should not be entrusted to "kids" fresh out of school, as many seem to be. When management is ready to treat IT as a critical function, maybe they will put the money into it.
None of this changes your situation, just a rant. I'm sure I'm not the only one who's tired of being told it has to be right, and there's no more time or money in the budget.
We have to fight the terrorists as if there were no rules and preserve our open society as if there were no terrorists. -- [link|http://www.nytimes.com/2001/04/05/opinion/BIO-FRIEDMAN.html|Thomas Friedman]
|
Post #21,069
12/11/01 5:24:36 PM
|
Right on, Drew!
"There's not enough time to do it right the first time". "But, there's always time to do it over again". And, again, and again,...
Alex
Men never do evil so completely and cheerfully as when they do it from religious conviction. -- Blaise Pascal (1623-1662)
|
Post #21,095
12/11/01 7:39:46 PM
|
I used to think I was REALLY good
This meant I really cared about getting the design and code right, and bugs were a personal failure.
I accepted that I would have a rare one, and that it would be fixed immediately.
This was in about 30,000 lines of pointer heavy 'C' and Oracle Pro-C code. Bugs meant core dumps. It was rare that a user would experience it, since the OS would kill my program way before then.
This was when I was responsible for the design and coding of an in-house editorial system that had 30 active editorial users, and produced monthly data dumps that were then published on Compuserve, CD-ROM, and printed book.
I had 1 bad output in 3 years, and it was fixed in 24 hours. My users never lost one byte of data, even in the event of hardware failure.
Now I do junk mail. Dead trees. Marketing databases. Fuzzy yucky data, where there is a percentage of known bad data, but as long as it is under the magical max, it is ok.
ok.
ok.
ewwwwww.
I gotta go wash my hands.
But:
I've got to hit the print date.
I've got hundreds of people at various companies sitting and idling if I miss the print date.
So as long as it is good enough, it really is "good enough".
|
Post #21,369
12/13/01 3:24:25 PM
|
Your exception IMO proves the rule
Someone looks at the cost to produce cleaner data, and where this line crosses the cost of delaying the mailing. At that point, it's "good enough." I strongly suspect most decisions don't have nearly this clear a metric by which to define "good enough," yet the call is made anyway.
Beyond that, my comments were more directed to the process than the actual decision. If, as I said, someone is willing to trust a project to a wet, pink newbie they have just proven through action that they don't really think it's that important. You are obviously not a newbie (though I won't comment on your pinkness or wetness). They are willing to pay for you. When they give your job to a PFY to save money, that's when "good enough" is a fool's bargain.
We have to fight the terrorists as if there were no rules and preserve our open society as if there were no terrorists. -- [link|http://www.nytimes.com/2001/04/05/opinion/BIO-FRIEDMAN.html|Thomas Friedman]
|
Post #21,060
12/11/01 5:14:52 PM
|
Interesting article indeed. Thanks!
It takes money, time, and discipline to do it right. The process needs a high perceived cost of failure to make it work. Can you imagine being able to sue Microsoft for damages over every Outlook exploit one suffers? The current license absolves them of responsibility, so there is no associated cost.
However, you may recall the NASA fast-track Mars mission fiasco where one sub-system used the metric system of measurements and another the English (actually just US and one third world country) measurement system. (If you read this, control yourself, Andrew!)
Alex
Men never do evil so completely and cheerfully as when they do it from religious conviction. -- Blaise Pascal (1623-1662)
|
Post #20,808
12/8/01 11:23:55 PM
|
Re: PL/1 or PL/I? It is PL/I.
The I being a Roman numeral - "pee el one" - Programming Language, One. IBM's ultimate language in the mid 1960's, combining Fortran, COBOL, and who knows what else. All things to all people.
Sorry for brain fart. It's been a long time since I've given it any thought.
Alex
Men never do evil so completely and cheerfully as when they do it from religious conviction. -- Blaise Pascal (1623-1662)
|
Post #20,867
12/9/01 11:03:28 PM
|
Domino/Db2 ass end webforms front end
Do a process analysis of how the work of the company is done. Once you've documented who does what and why, you will have a better understanding of where you need to go. A web front end consisting of forms, pick lists/drop-downs, and data entry points based on those dataflows should give you a fast, stable stovepipe to the ass end. thanx, bill
tshirt front "born to die before I get old" thshirt back "fscked another one didnja?"
|
Post #20,889
12/10/01 10:48:15 AM
|
DB/2 on the MF partition, apps on the Linux partition?
[link|http://www.borland.com/kylix/|Kylix 2] -- the most NON-restrictive "3-1/2 GL" there is.
(Apart from its Windows sibling, Delphi, and C++ Builder -- which is out of the running because the language is way too convoluted (and because it isn't available on Linux (yet)).)
Once you get the data back-end into (some semblance of) order, build a (bunch of) half-done Data Module(s) and store it (them) in your Object Repository for everyone to inherit and specialize from. That way, you can make sure there aren't any *too* gigantic fuck-ups.
The Object Pascal language is simple, clear, and elegant -- they pretty much can't fuck up without it becoming immediately obvious; and who doesn't know, or can pretty rapidly learn, Pascal? (This is why it's a *good* thing that the language was originally designed as a teaching aid.)
Oh, and remember: Data Modules are just components; you can define new methods, fields, _and properties_ (automagic setter/getter calls that look just like a field access to the user-coder) on them that are inherited just like in any other component.
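(For anyone who hasn't met the property idea: it's an accessor that reads like a plain field but runs through a method, so validation rides along for free. The same mechanism, sketched in Python rather than Object Pascal, with an invented qty field:)

```python
class DataModule:
    """Sketch of the property idea: looks like a field, runs a setter."""
    def __init__(self):
        self._qty = 0

    @property
    def qty(self):
        return self._qty

    @qty.setter
    def qty(self, value):
        if value < 0:  # validation hidden behind field-access syntax
            raise ValueError("qty cannot be negative")
        self._qty = value

m = DataModule()
m.qty = 17        # calls the setter, but reads like a field assignment
print(m.qty)      # 17
```

Which is exactly why inheriting from a half-done module works: the user-coder writes field accesses, and the base class gets to police them.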
Oh sure, I'm our "Official Delphi Zealot"... But I *honestly* can't see how this wouldn't be the best thing someone in your situation could pick.
Christian R. Conrad The Man Who Knows Fucking Everything
|
Post #60,256
10/30/02 2:14:47 PM
|
The lobsters are coming!
Damn them, and their hatstand minions!
Peter [link|http://www.debian.org|Shill For Hire] [link|http://www.kuro5hin.org|There is no K5 Cabal] [link|http://guildenstern.dyndns.org|Blog]
|