
Definitely.
This is *exactly* why I dislike working in bad code.

An old story:

I once had the job of re-implementing a program that loaded cryptographic keys into EFTPOS terminals. The original program was dozens and dozens of pages of 132-column z-fold paper in FORTRAN from an old HP minicomputer that was being decommissioned. It was not modular. Loading such a key is basically a conversation, and you have to tailor your responses according to the few possible states the device can report. Each send/receive sequence was a big block of code that repeated all the same work as every other send/receive sequence. I was really not impressed with our EFT/ATM programmers after seeing that.

I developed a solution using a PC that had cryptographic hardware. Most of it was in C, but I also wrote a serial port driver in Assembly. Because of the nature of the actual key-loading, I also wrote a very simple language to describe the process (hand-compiled into C macros :-). This enabled it to support *all* the different hardware, not just the one device the original program handled (there were separate programs for the other hardware). The program also used things like function pointers and arrays of structures. Perfectly ordinary C things, in other words.
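
To give a feel for what that looks like, here is a minimal sketch of the table-driven style in C. The instruction set, handler names, messages and timeouts are all invented for illustration; the real program's little language, macros and serial I/O were considerably more involved.

    /* A sketch of a table-driven key-load "script": an array of
     * structures, each holding a function pointer and its arguments.
     * All names and values here are made up for illustration. */
    #include <stdio.h>

    typedef enum { STEP_OK, STEP_FAIL } step_result;

    typedef struct instruction {
        step_result (*run)(const struct instruction *);  /* handler for this step */
        const char *arg;                                 /* message to send, if any */
        int timeout_ms;                                  /* reply timeout, if any */
    } instruction;

    static step_result do_send(const instruction *i)
    {
        printf("send: %s\n", i->arg);   /* stand-in for real serial output */
        return STEP_OK;
    }

    static step_result do_wait(const instruction *i)
    {
        printf("wait up to %d ms for a reply\n", i->timeout_ms);
        return STEP_OK;                 /* pretend the reply arrived in time */
    }

    /* The hand-compiled "script": macros expand to rows of the table. */
    #define SEND(msg) { do_send, msg, 0 }
    #define WAIT(ms)  { do_wait, NULL, ms }

    static const instruction key_load_script[] = {
        SEND("RESET"),
        WAIT(2000),          /* second-stage reset: the wait new firmware can outgrow */
        SEND("LOAD KEY"),
        WAIT(500),
        { NULL, NULL, 0 }    /* end of script */
    };

    int main(void)
    {
        /* The interpreter: walk the table, abort the key load on any failed step. */
        for (const instruction *i = key_load_script; i->run != NULL; i++)
            if (i->run(i) != STEP_OK)
                return 1;
        return 0;
    }

The point is that the whole key-load conversation becomes data in one table, so supporting a different terminal is mostly a matter of writing a different table, not another copy of the code.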

The developer who got to maintain it encountered a bug with a terminal with new firmware: its second-stage reset now took longer. *sigh* In trying to find the problem, he completely overlooked the finite state machine that was executing the instructions and identified the problem as the wait-for-reply "statement". :-/ Clearly my efforts were way beyond anything he normally had to deal with.

More recently, developing an internet application in PHP, we hired some outsourced programmers for a few tasks. We were used to pushing what was possible in PHP. They were not. Their code came back as thousands of lines copied over and over, modified slightly for each block. This was not acceptable: the rest of the codebase was fairly heavily modularised, with several levels of abstraction, because we had demonstrably proven this gave us a leg-up in development and maintenance. We tried to get them to understand this. Several times. It was painful to see the ideas of "abstraction" and "modularity" simply fail to register in their minds.

We re-wrote their contribution to a tenth of the size. And it was faster and more versatile.
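
The principle we could not get across is small enough to show in a few lines. A contrived sketch (in C rather than PHP, with invented report names): everything that varied between the copied-and-tweaked blocks becomes a parameter, and the shared logic is written exactly once.

    #include <stdio.h>

    /* One parameterised routine instead of N copied-and-edited blocks. */
    static void render_report(const char *title, const char *source_table, int row_limit)
    {
        printf("== %s ==\n", title);
        printf("(would fetch up to %d rows from %s and format them here)\n",
               row_limit, source_table);
    }

    int main(void)
    {
        /* Each call replaces what was once a full, slightly edited copy of the block. */
        render_report("Daily sales",  "sales",    20);
        render_report("Top products", "products", 10);
        render_report("Refunds",      "refunds",   5);
        return 0;
    }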

Wade.

"Ah -- I take it the doorbell doesn't work?"
My example
Source system: MS Access database on a desktop

Target system: SQL Server on a server

Output: Reports in both PDF and Excel

The table structure on the Access database was nearly identical to that on the SQL Server database.

To move the data, they had built a VB application that did a "SELECT *" from each table and stuffed the results from each into a temporary variable. Then a bunch of cursors to walk the various variables and apply logic to group and subtotal the results. And the output of this whole process was a single flat file. Run time: ~30 minutes.

Then they copied the flat file to the server that SQL Server was on. Then a second VB app to load the flat file and parse out the data to stuff into the schema. Run time: ~30 minutes ... if nothing else was touching the server at the same time.

(Yes, this process was written by mainframe people. Flatfiles and walking your resultset applying logic are how you do things.)

Finally, a standalone VB app for each report, called by a VB interface that the users had to install ... with admin rights. Run time: ~4 minutes per report.

And it was producing bad results in an unpredictable manner.

After trying to debug the VB apps for three weeks, getting closer each time, I finally asked for a copy of the original Access DB, which they had refused to provide until then. With the deadline already slipped, they were desperate enough to let me see it.

TWENTY MINUTES after they gave me the source data, I had written four views to aggregate the data, and an Access frontend to spit out the Excel reports. No round trip to the server required. Run time per report: Too short to measure by hand.

I used my (correct) reports to figure out that the source app -- before it fed into the Access database -- had sent duplicate records. I identified all the bad source data, and flagged the resulting questionable output for manual review. All this within hours of getting the source.

And the business group that owned the app wanted to fix the current system rather than use mine, because it represented a smaller change.
--

Drew
Mainframe programmers...
Or should I say, MVS/JES2/COBOL programmers, because it is that environment that strongly encourages temporary flatfiles, walking resultsets and programming-in-batch.

Back in the days of DOS and DOS-based LANs, we needed an application to "lock" a PC without actually logging out. Banyan didn't provide one. My boss discovered you could fake input to the DOS password-change program from a QuickBASIC program without it showing on the screen. His "LanLock" program looked like he'd written it in COBOL and translated it... My version was quite, quite different.

Wade.

"Ah -- I take it the doorbell doesn't work?"
     Programming language VS mind of coder - (crazy) - (6)
         Agree on the window into the mind of the other person. - (a6l6e6x) - (2)
             Very close to an experience of mine - (crazy)
             some comments you dont want to see in code - (boxley)
         Definitely. - (static) - (2)
             My example - (drook) - (1)
                 Mainframe programmers... - (static)
