IWETHEY v. 0.3.0 | TODO

Possible effects from any drastic changes to IT
I have opened this thread separate from the one in MS Guilty on XML & Web Services, with the specific intention of addressing issues & concerns re the potential impacts on the wider world from the directions an XML Web Services led revolution might take us.

Ashton & others raised these issues but I rudely squashed the theme, yet it is an important topic. So in contrition I am inviting the topic to get thrashed out in its own thread.

So, some aspects that I think are worth mentioning:

XML & Web Services have the potential to squeeze IT and current IT jobs - that is, if Web Services become dominant then you won't need IT staff as we know them today to be analysing & writing as much code as is written today. Business analysts using off-the-shelf tools & new languages such as WFDL (Work Flow Definition Language) will be better able to map business processes to Web Services aggregated into business solutions. The role of IT could merely become writing an initial flood of Web Services (or wrappering existing legacy or EAI apps as Web Services), then withering away as demand for new services either drops away or becomes automated using new-generation tools that are in the mill (in particular ebXML tools for extracting UML models from a registry, feeding the model into the company's software generator tools & creating the application software that can conduct transactions with the target business partner).
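The "analyst wires services together" idea above can be sketched in a few lines. This is only an illustration: the step functions and field names are invented, and a real WFDL-style tool would read the workflow from a definition file rather than a Python list.

```python
# Sketch: a business process as an ordered list of service calls. A flow-definition
# tool could emit this data instead of requiring hand-written glue code.
# All service names and fields below are invented for illustration.

def check_credit(order):
    # Hypothetical credit-check service: approve small orders.
    order["credit_ok"] = order["amount"] < 1000
    return order

def ship(order):
    # Hypothetical shipping service: ship only approved orders.
    order["shipped"] = order["credit_ok"]
    return order

# The 'workflow definition' - in a WFDL-style tool this would come from a file.
workflow = [check_credit, ship]

def run(workflow, order):
    for step in workflow:
        order = step(order)
    return order

result = run(workflow, {"amount": 250})
```

The point is that the *sequence* is data, so changing the business process means editing the definition, not recompiling an application.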

One other reality that has become clear to me is that whilst XML has the broadest possible support from vendors, industry, standards groups, governments etc., Web Services (especially the UDDI/SOAP/WSDL variant that is becoming dominant) is actually being driven by Microsoft and IBM - the other players are either hopping on board for the ride or, like Sun, are being forced on board.

Web Services can be seen in two lights; it is the 2nd one that is going to create controversy. The 1st can be seen as an evolution of the object component model for designing software from re-usable parts, but where the message passing has at last been standardised on a single open technology (XML), and where, because of the XML Schema concept & WSDL, the objects get bound at runtime rather than compile time - which means the components of a solution are 'loosely bound' rather than tightly bound as must be done with CORBA, DCOM & RMI etc.

1) The use of private or no UDDI registry:
Web Services as objects that are provided so that when aggregated, they create a web-distributed solution that can be internal (inside the firewall) or between business partners over Internet-based networks. (Compare this to the Client Server or monolithic model, where the parts of most solutions reside on one computer or a set of high-speed LAN-connected computers, & where incompatible & complex technologies like CORBA, DCOM, RMI etc. have to be prepared & the solution bound at compile time - any changes mean replicating the environment for testing, cloning the CORBA etc. interface code, rewriting & recompiling the whole shebang, then testing & putting into production.) Web Services, because of XML, are able to be bound at run time (the 1st time the application runs) & thus can be changed in ways that could never be contemplated before.
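A minimal sketch of that late-binding idea, in Python for brevity. The class and operation names are hypothetical; in a real Web Services client the list of available operations would be read from the service's WSDL at run time rather than hard-coded.

```python
# Sketch of runtime ("late") binding: the client discovers which operations a
# service offers at run time, instead of compiling against a fixed interface.

class QuoteService:
    """Hypothetical Web Service implementor (names invented for illustration)."""
    def get_quote(self, symbol):
        return {"symbol": symbol, "price": 42.0}

# Pretend this operation list was extracted from the service's WSDL at run time.
discovered_operations = ["get_quote"]

def invoke(service, operation, **params):
    if operation not in discovered_operations:
        raise ValueError(f"service does not advertise operation {operation!r}")
    # Bound at call time, not compile time - the essence of loose binding.
    return getattr(service, operation)(**params)

result = invoke(QuoteService(), "get_quote", symbol="IBM")
```

Contrast this with CORBA/DCOM/RMI stubs, where the interface must be generated and compiled in before the program can link to the service at all.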

2) The use of public UDDI registries:
In this model, the UDDI registry holds an initial entry for a specific Web Service that is identified by an XML-based file containing WSDL code (WSDL is comparatively easy to read, by humans as well as software, compared to DCOM or CORBA IDL code). This 'Service' entry in effect says 'This UDDI registry has a description of a Web Service - the input data (or request name plus required input params) to the service must use the XML format defined herein & the response data will be in the XML format also defined herein'. Once we have a definition of a service, the UDDI registry can then add implementors of that service - these are ISP/ASP/BSP companies that, by registering, commit to abide by the XML-formatted request/response definitions in the specific Web Service definition.

Any Web Service in a UDDI registry will *always* have the Service definition entry (the ServiceName.wsdl entry) and will usually have at least one implementor entry for the service (else why register the service if no one offers it). The Implementor entry describes info about the implementor (service provider) plus includes the web URL where their SOAP port listens for requests (which are routed over the net in SOAP envelopes).
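The shape of such a description can be sketched with a toy document. This is *not* real WSDL (real WSDL uses the http://schemas.xmlsoap.org/wsdl/ namespace and considerably more structure, and UDDI stores the implementor data separately); it only keeps the parts discussed above - the request/response formats and the implementor's SOAP endpoint, all names invented.

```python
import xml.etree.ElementTree as ET

# A drastically simplified, WSDL-flavoured service description.
TOY_WSDL = """
<service name="TourBooking">
  <request>
    <param name="tourId" type="string"/>
    <param name="date"   type="date"/>
  </request>
  <response>
    <param name="bookingRef" type="string"/>
  </response>
  <implementor company="Example Travel Pty"
               endpoint="http://soap.example.com/tours"/>
</service>
"""

root = ET.fromstring(TOY_WSDL)
service_name = root.get("name")
# Both humans and software can read off the required request parameters...
request_params = [p.get("name") for p in root.find("request")]
# ...and the URL where the implementor's SOAP port listens.
endpoint = root.find("implementor").get("endpoint")
```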

By having multiple implementors for a service, one can build interactive web-based apps that allow choices such as which travel agent to use for a given tour service or which delivery company to use for a particular delivery service. The applications that can be built from such combinations are only limited by our imaginations.
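Choosing among implementors is then just a lookup-and-select step. A toy sketch (the registry dict stands in for UDDI lookup results; companies, endpoints and prices are all invented):

```python
# Sketch of choosing among multiple implementors of one registered service.
registry = {
    "ParcelDelivery": [
        {"company": "FastCo",  "endpoint": "http://soap.fastco.example/parcel",  "price": 12.0},
        {"company": "CheapCo", "endpoint": "http://soap.cheapco.example/parcel", "price": 7.5},
    ],
}

def cheapest_implementor(service):
    """Pick one provider of a service - here by price; a real app might
    instead present the list to the user as a choice."""
    return min(registry[service], key=lambda impl: impl["price"])

chosen = cheapest_implementor("ParcelDelivery")
```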

Microsoft are promoting the idea that any company who sets themselves up to provide clusters of core services will be able to charge each time the service is invoked/provided. This is what I call the 'Blackbird' model - the goal Microsoft had back in the mid 1990s when they were attempting to set up their global MSN network (not the MSN we see today; the original was an MS-proprietary packet-switched model - it got trashed by MS competitors who swung all their influence behind the Internet. MS then had to revise their goals & strategies, & today Web Services is part of that revised effort).

Microsoft have an ulterior motive - they want to control access to all emerging web services through their 'Passport' software, which effectively provides a single sign-on to the world of Web Services. If they control the customer's access, they have effectively replaced their eventually declining desktop control with Web Access control, & he who controls the point of entry gets to control much of the market. Just to restate this: if MS succeed with their strategy, everyone wanting to access web services on the internet (indirectly, through the applications you invoke, such as travel agency booking, pizza delivery, etc.) has to provide their name & details directly or indirectly to MS so as to get a web passport, which then means you can gain indirect access to services. *They need this info and control because, based on the services you consume, you will be charged accordingly* & they get to control the charging.

Hey presto - Microsoft has traded their desktop tax & monopoly for a web tax & monopoly.

Cheers

Doug Marker
Edited by dmarker2 June 8, 2002, 11:57:29 PM EDT
This reminds me of COBOL in the 60's.
COmmon Business Oriented Language, for those not familiar with the acronym's derivation.

It is so English (language)-like that almost anyone could write an application. It reads so easily that no documentation is required. Even management could read the code!

Right!

A little skepticism is a good thing.
Alex

"Men occasionally stumble over the truth, but most of them pick themselves up and hurry off as if nothing had happened." -- Winston Churchill (1874-1965)
Re: Just ask a Smalltalk programmer - 'ST is sooo simple'

One ST person once said he was sure his 6 year old could write in Smalltalk because it was so simple.

After 9 years of working with ST I love its OOT elegance, but to call writing ST code simple proved to me that this particular person lived on another planet despite having a body on this one.

I can follow Java a lot easier than I can follow Smalltalk but I prefer to write in Smalltalk.

And although I started out decades ago with COBOL, I still write with it today because it does actually force some degree of self-documentation - but I think the same comments about Smalltalk's simplicity also apply to the simplicity of COBOL (grin).

Cheers

Doug
Sort of
Only nobody sane would want to debug a 20,000-line COBOL program that could have done the same thing in under 2,000 lines of C. :) The debugging was the hard part, because they had to do a memory dump and then do some math to figure out what the values were in some cases. Almost anyone could read, write, and code COBOL, but the few who knew how to debug it were the good ones that they kept on.

The same thing happened with the GUI: almost anyone can use a computer now. In the non-GUI days (CLI, Command Line Interface), users had to remember command words and operators and pipe symbols, etc. Very few could do it, and most ran batch files or programs that someone else wrote for them. With MacOS and MS-Windows, more people got to use computers without knowing how they worked. Those who knew how they worked debugged OS issues and created the IT/IS departments.

Visual BASIC is the easiest language that I know of to learn besides COBOL. Basically anyone can create a VB program by dragging and dropping controls onto a form. The hard part is adding the code to check those forms, do the calculations and formulas, and other stuff that the luddite programmers don't know how to do. Again, the hardest part is the debugging. But at least VB doesn't require a hex dump to debug; it can go step by step over the code and highlight each line before it is executed, which makes debugging a lot easier.

The main threat to IT is the H1B Visa Workers and IT Sweat Shops in other countries taking the IT work away from US Citizens. Nothing against those people, but they are being used by the firms to shaft the other workers. What are you going to do when you have to train an H1B worker to do your job, or they move it overseas to another IT worker?

I am free now, to choose my own destiny.
What is IT, and how does this make them obsolete?
I am asking this as a serious question.

Whenever I hear about these complex standards that are supposed to make traditional IT obsolete, I look at the projects that I have done and draw a big blank on why they would make the work that I did unnecessary. You see, it doesn't matter how sophisticated the ways you can manipulate data are. There is a minimum of necessary work in delivering data. Someone has to figure out how to handle numbers from people who think that "prepaid in full" should be a delinquency code rather than a prepay code. (Hey, at least they report it!) When the system screws up, someone needs to debug it. Users given the choice between a database they can query in any way they want and prepackaged (though slightly configurable) reports choose the reports. And on and on and on.

Yes, standardized interoperability would be great if it could be achieved. Start the data clean and let it flow through in an automated way - sounds great. I don't see it happening.

Allow me to be a bit specific about why.

In the business that I see, there are a lot of complex data standards. Data comes from people who have a legal requirement to deliver it, but don't really want to. It is going to people who don't want to actually see most of it - they just want to know if there is anything they should be concerned about indicated in it. There is a role in there for people who go to the former, collect it, organize it, and then deliver it to the latter at their choice of digestion level.

There are actually more parties involved than just three, but the principle is correct. The data is reporting on loans backing bonds. The people reporting legally have to do it, but have no direct incentive to do it well. Bondholders are very interested in whether there are problems in the loans, but don't really want to see details.

In a business like this, were you the bondholder, would you trust any totally automated reporting process? Or would you want someone there doing various kinds of audits and validity checks, then calling up the trustees, pointing out that X makes no sense, and asking what is really going on? This is pretty much a necessary step. Agree on data formats all you want; it is needed for business reasons, the same as you need auditors. It doesn't make sense for every bondholder to do it. (Every B-piece buyer should, but that is a different story.) But someone needs to.

(Yes, you can streamline that step. We have customers wondering how we can have a finance team a fraction of theirs in size yet get better coverage of more deals. Part of the answer has been work for IT to do...)

In other words it isn't just a question of getting businesses to cooperate on their process. There are very real and important questions of how you handle having players who need to interact whose interests are far from aligned.

With that in mind, read the following [link|http://xml.com/pub/a/2002/05/29/perry.html|article] on the dangers of blindly accepting that which happens to be well-formed and conforming as valid. (Suggestion: It doesn't help that you test for well-formedness and conforming with a validator.)

Cheers,
Ben

PS Yes, I am aware that there is much work that happens in larger organizations under the name "IT" which better tools could make redundant. However, replacing teams of shaven monkeys with competent programmers equipped with productive and reliable tools has been able to achieve those kinds of gains for years. Businesses do not appear, by and large, to have noticed this...
"... I couldn't see how anyone could be educated by this self-propagating system in which people pass exams, teach others to pass exams, but nobody knows anything."
--Richard Feynman
Re: Am sure we agree, issue is the question

It is being said that parts of IT as we know it could be obsoleted, but we have heard this story very strongly since OOT was introduced & it just hasn't happened. I am sure we all recall the threat of the early 1990s when, as a result of emerging global peace, IT development would move to India, Russia & China - that is happening, but hasn't impacted us that I am aware of.

Referring to the analogy of containerization - that sure changed the business of shipping goods. So if XML & Web Services live up to some of the claims, then IT could change dramatically. Applications licensed by subscription from specialist solution crafters. If that takes off, who knows.

I understand, though, that there is a great desire on the part of business to get IT out of the loop as much as possible when it comes to quickly crafting business processes that can use IT. I know that what is really being said is that roles will change (as they have done over the past 30 years) and the question is likely to be who fits where.

Are business analysts who get better & more automated tools for crafting solutions from IT components (such as Web Services) part of a new IT, or are they not? I think the points you are making are really a debate on the potential for who belongs where.

The issue with IT has become crafting processes that suit the business, not adapting the business to suit IT - this is where I see the challenge to IT from aggregations of Web Services.

Cheers

Doug
XML can't kill IT
My sanity check for XML killing IT is this: does this technology involve sufficient complexity and sufficient hidden consequences that becoming an effective user of it is a full-time skill? The answer is yes.

In that case the "priesthood" that masters the technology will de facto be IT. IT with a different skillset than today, but IT nonetheless, because they have to put enough into their craft that it is their primary skill, not the business. This is as opposed to business analysts, who may understand many technologies very well, but for whom that is not their primary skill.

As for the impact that XML will have on the world, I don't know what that will be like. But I suspect that it will be less than the hype has it. Why? Very simply because XML requires all parties involved to look at the same numbers, label them the same way, and think of them the same way. That works very well in a homogenous business, or if a small number of suppliers dominate a vertical market. But it stops working so well where different industries have to interface. And it runs counter to how competition in business proceeds.

An often-missed fact is that businesses generally avoid competing head-on. Instead they attempt to define themselves a currently unoccupied niche, and then aim to expand the impact of that niche. What that means is that if you find a thousand different businesses, they will be trying to do a thousand different things. What is more, even ones that are very similar will be attempting to differentiate themselves, and their top management will generally have a pretty good idea of how they are trying to distinguish themselves from various nearby competitors.

What this means is that attempts to homogenize how businesses interact with each other run counter to the businesses' real need to each be different. Of course there are huge savings in transaction costs in being homogeneous, so businesses walk a fine line between what to be homogeneous on, and what not to be. But the closer you get to the core business processes, the more they each need to be different. (Which is one of the fundamental reasons that most of the software development done in this world is in-house - and why that will remain the case.)

To the extent that XML allows businesses to interact in their own customized ways, it will have an impact. But that impact will be coupled with more and more custom work as it moves from the current low-hanging fruit (i.e. stuff that is easily homogenized) to smaller and smaller niches. And that coupling will mean that somehow the incremental win from its incremental growth never quite meets the old projections, and never quite achieves the overall impact that boosters believed it would.

At least that is my take, as a non-booster...

Cheers,
Ben
"... I couldn't see how anyone could be educated by this self-propagating system in which people pass exams, teach others to pass exams, but nobody knows anything."
--Richard Feynman
Re: I am sure that is agreed (?)


Andy Grygus posted a sanity check a few days back about what was going to kill what, & whilst the question can be raised, I go along with the view that not a lot is likely to change as a result of XML & Web Services. *But* one point I was trying to raise was that containerization changed the shipping of goods beyond recognition; I raise the spectre of whether XML & Web Services *might* be to IT what containerization was to shipping.

Of course the reality will always be that even if change comes, we all change titles & shuffle chairs & learn new buzzwords. That then becomes an additional proof of change :-)

Cheers

Doug
To quote one writer on the subject . .
"XML isn't even a language, it's a syntax."

XML doesn't do one thing to simplify the business logic used by companies or to make their processes more similar or compatible. It does provide a common syntax they can use as a basis for communicating with each other, and that's a giant step.

Extremely important is that software developers are starting to integrate XML into their systems so the systems themselves have some understanding of the new syntax. This will greatly decrease the cost of implementing XML interfaces - to where companies can actually afford to do it (providing IT employment opportunities that do not now exist).

Traditional EDI entirely failed to do this. While a number of my clients sell primarily to large retailers that demand EDI, only one, the largest, has made any attempt to integrate an EDI interface with their internal system, or could afford to. The rest hand-enter invoices and shipping documents into the interface provided by an EDI network / translation service, and hand-enter purchase orders received as email from that service. Sometimes it takes more than one service because none of them handles all forms for all retailers.

Bosses, of course, have extrapolated the hype to believe their receiving clerks will be able to set up XML interfaces with their supply chain using Microsoft Office (an impression Microsoft has done nothing to discourage) and the shipping clerks can handle the customer interface. This is actually a good thing - because if bosses didn't believe this sort of crap, they'd never start the projects.
[link|http://www.aaxnet.com|AAx]
XML is a container for content
which can be used to exchange information/data from one application or company to another. HTML is more for presentation; XML is more for content. Or so I think I understand it?

I am free now, to choose my own destiny.
Re: XML is ... Some of the definitions
(I was hoping it wouldn't come to this)

a tag-based meta-language for creating other languages. The use of a DTD or Schema
allows derived languages to be defined. The DTD or Schema is used to define the TAGs
that are permitted in the sub-language. XHTML is a classic implementation of an
XML-derived language. What is so great about XHTML? It can be modified (for
different display devices) in flight, by XSL (another XML-derived technology).

XML allows the creation of self-describing data. Self-describing data offers significant
benefit in being used to pass documents and messages among otherwise incompatible
systems.

XML tags allow XML <ELEMENTS> to be better identified when a document is
searched for its content. This is near to impossible with EDI documents in their
transmitted form, as it is with messages passed between CORBA, DCOM & RMI.

The concept (like a cargo container) is incredibly simple, but its simplicity doesn't
mean the impact will not be dramatic (as containerization's was).

What is driving the 'need' for XML? It is the need for a simple but powerful mechanism
that can be integrated into existing systems (EAI & legacy) but can also support
entirely new concepts like Web Services, ebXML, XML Query etc.

XML is human-readable, but XML derivative standards support the ability to handle
encryption. XML can support encryption of <ELEMENTS>, and it is entirely
possible for each <ELEMENT> to use a different encryption approach. So, one
might ask, what is the big deal about being human-readable? This ability opens the
Pandora's box of discovery! Discovery of content is one of the greatest inhibitors in
analysing and exploiting data in information systems. Yes, there is a price (lack
of data compression), but already hardware vendors are developing intelligent
routers capable of compressing XML in transit without losing its 'visibility'.
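Element-level protection can be sketched as follows. Note the heavy caveats: base64 here is only a stand-in for a real cipher (it is an encoding, not encryption), and the actual standard, W3C XML Encryption, replaces the element with an `<EncryptedData>` structure rather than just transforming its text. The point illustrated is that one element can be protected while the rest of the document stays readable.

```python
import base64
import xml.etree.ElementTree as ET

# Toy document: one sensitive element among readable ones (fields invented).
doc = ET.fromstring(
    "<order><item>widget</item><cardNumber>4111111111111111</cardNumber></order>"
)

# "Encrypt" only the sensitive element's text (base64 stands in for a cipher).
card = doc.find("cardNumber")
card.text = base64.b64encode(card.text.encode()).decode()

# The item element is untouched and still searchable...
readable = doc.find("item").text
# ...while the protected element no longer shows its plaintext.
protected = doc.find("cardNumber").text
```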

XML has derivative standards that support non-repudiation. An essential
ingredient for electronic document exchange.

XML has a concept (Global DOCID) that supports a unique document ID for
each and every document that chooses to employ it. This offers significant
benefits in managing XML-based documents that travel globally and that
need to be traced to their origins.

Web Services is a concept derived from XML and thus harnesses the
significant benefits already recognised in the technology. Web Services
allows the concept of 'COLLABORATIVE COMPUTING', which is the next logical
step in the evolution of computing use. In essence it is computing that spans
computers and is technology-agnostic. Web Services does this; CORBA, DCOM
& RMI (as methods of linking computing) aren't & can't be.

Web Services can do one other extremely important trick that CORBA, DCOM &
RMI can't. Because of XML and WSDL, they can be bound to in real time. The
other middleware technologies have to be bound together at compile time (this
seems to be so lost on many people who criticise the Web Service concept).
This 'trick' gives Web Services its magic and, like XML's human readability, opens
a Pandora's box of potential that proprietary & non-web-scalable technologies
(CORBA etc.) can never rise to.

Web Services can be said to be the forerunner of a 'plug and play' solutions
capability. This is where the benefits can be passed to business. If businesses
can create solutions by real-time & dynamic binding of services using business
flow languages, then the world of business & IT has moved substantially forward.

IN DEFENCE of CORBA etc:
But there are applications where proprietary technologies like CORBA etc: offer
value that Web Services can't (yet).

Some today say the next big wave will be 'GRID' computing, and for that to
make sense, Web Services needs to proceed, as it will, down its evolutionary path.
No doubt some people have no view whatsoever of computer evolution, just as
some shippers might never have grasped the benefits of containerization.

Brings to mind the famous British story of when King Canute (11th century)
sought to teach some of his aides how not even the King of Scandinavia and
England could command the tide to turn back. Darwin taught us some
powerful principles in his Origin of Species.

XML standards are driven by the W3C (WWW Consortium)
[link|http://www.w3c.org/|W3c web site]
It's a damned shame the above web site is so academic and boring; if it
weren't, maybe many more people would read the details.

W3C has defined a 'family' of XML-related technologies ... (just a few here)
XML, XHTML, DTD, XML Schema,
XML Query, XSL, XSLT, XML Path,
XLL, etc:, etc:, etc:.

XSL is a topic in its own right.

All and more can be read at the link provided.

Cheers - Doug Marker

#1 added more detail
Edited by dmarker2 June 11, 2002, 10:21:57 AM EDT
Edited by dmarker2 June 11, 2002, 10:25:52 AM EDT
Edited by dmarker2 June 11, 2002, 10:37:28 AM EDT
Gracias.
Or in the most recent Prince Valiant comic strip wherein, to pay back a debt, the 'Rus' are offered two choices:

1) Guidance to the (hardly molested) Lost City of Alexander the Great - an artifact (and other) mine of unimaginable value..

OR

2) Instruction in The Alphabet (even pitched positively as "the larger of the two Riches, for what it will mean for your culture" yada yada)

Well.. we can guess Which was chosen.
Kinda like Corporate next-quarter 'profits'.
It's still Prince Valiant time in most-every MBA head-space. And for those exclusively surrounded by those..


Anyway, thanks for much clarification. I have upgraded my XML-lore to: seeing many more possibilities (as well as many more lateral arabesques) certainly aimed to screw up anything approaching danger to:

My-My-My Proprietary Lock-in or Wet Dream Thereof

Y'know? Billy n'Bally have demonstrated starkly how EZ it was to screw Everyone and Still Be Loved by All Those Wannabes. So I deem that your vision (undeniably Sensible as it appears to this incompetent) may need to await a more mature world:

post-Middle-East madness and its ummm er fallout ?? Maturity is sometimes galvanized by a Memorable recent-enough Spanking. OK: Clue-by-6.



Ashton
Re: XML is ..., WebSvcs are ... (part 2) ...

Actually the whole box & dice is very far reaching. If one looks at the bulk of standards being defined at W3C - most are now based on some XML derivative. I have hardly touched on the derived languages that have already been designed and the others in the 'mill'.

There are XML-derived languages for video streaming, voice data, image data etc. (I don't know exactly what they do, but someone saw fit to create them.)

Many of these 'languages' (which are nothing more than a set of TAGs & rules for their appearance in the language, and are defined in a schema) are being defined by industry groups seeking to simplify the exchange of data between partners - this is most noticeable in the Supply-Chain-Management industry, which was of course the industry that drove & benefited from containerization.

XML & Web Services specifically attempt to overcome the problems of EDI in an elegant manner that can be readily picked up by the bulk of SMEs (small to medium enterprises) that today can't afford the complexity of EDI but must do business with the top 5-10% who cannot run their business without EDI. (It pays to remember that governments ran fastest and hardest with EDI & demand EDI protocols for government provisioning. This applies as much to the US as to the EU.)

One other bit of 'magic' that was built into XML is language version control (the problem it solves is akin to DLL Hell on Windows). The issue is: ver 1.0 of a language appears (say it is a dialect used for ordering auto spare parts). Then along comes an upgrade - how does it get implemented without screwing all the apps that have linked to service providers delivering ver 1.0 formatted data? The XML approach is that the DTD or Schema (actually the DTD is doomed to extinction, as it is a pathetically simplistic & non-XML-compliant mechanism that has been eclipsed by the XML Schema) gets published, so it resides at one primary location (which is likely to be replicated), and all users of a document that rely on that Schema include a link to that schema (ver 1.0) in their documents.

To move to ver 2.0, the schema creator only needs to create a new schema that contains the differences & link back to the original schema from the new one. This way we have two schemas published, ver 1.0 & ver 2.0, & ver 2 only contains the changes from ver 1. Thus any software that processes ver 2 documents will always validate against the ver 2 schema, whilst any documents created with ver 1 can still refer to the version 1 schema rules.

This is a bit like object-oriented inheritance. Any new schema inherits the properties of its parent but can override some definitions and add new ones, while all the other stuff is as in the previous version.

So XML offers an elegant, simple and effective version capability. This is a critical requirement for Web Services, as it means a service provider can run the old and new service alongside each other. It is up to the service requestors to determine which version or format they want to use. This means that changes don't have to break applications that are built on older versions of services.
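The side-by-side versioning idea can be sketched like this. The schema URLs and field names are invented, and a real document would declare its schema via `xsi:schemaLocation` against a published XML Schema; the sketch only shows the mechanism of each document carrying a link to the version it was written against.

```python
import xml.etree.ElementTree as ET

# Two published schema versions (hypothetical URLs).
V1 = "http://schemas.example.com/parts/1.0"
V2 = "http://schemas.example.com/parts/2.0"

# Old and new documents coexist; each declares which schema it follows.
doc_v1 = ET.fromstring(f'<order schema="{V1}"><part>P-100</part></order>')
doc_v2 = ET.fromstring(
    f'<order schema="{V2}"><part>P-100</part><express>true</express></order>'
)

def handle(order):
    """Dispatch on the schema the document declares; v2 adds an express flag
    without breaking processing of v1 documents."""
    result = {"part": order.find("part").text}
    if order.get("schema") == V2:  # only v2 documents carry <express>
        result["express"] = order.find("express").text == "true"
    return result

r1, r2 = handle(doc_v1), handle(doc_v2)
```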

Cheers

Doug Marker
Not so analogous to containerization, then.
Doug describes XML:
a tag based meta-language [...] allows the creation of self describing data.
So "XML" per se doesn't mean ready-made standardized "containers" (they're in the DTD/Schema), but more of a *container-construction kit*.

If you look at it that way, are you still so sure it will "revolutionize" IT like containerization did shipping? Do you believe shipping would have been commoditized if there had only been a standard for *how* to build containers, but no actual standard for the resulting containers themselves?

(Needless to say, since you already inferred it from my tone above, I don't think so.)


[EDIT: Typo; "you look" inadvertently contracted to "yook".]
   Christian R. Conrad
Of course, who am I to point fingers? I'm in the "Information Technology" business, prima facie evidence that there's bats in the bell tower.
-- [link|http://z.iwethey.org/forums/render/content/show?contentid=27764|Andrew Grygus]
Edited by CRConrad June 14, 2002, 06:11:40 PM EDT
Re: Actually from several aspects

I have just obtained an excellent article from Harvard Business Review that describes in a good way
the impact of XML / Web Services. When I can get the soft copy I will post a link or an extract.

It puts the impact this way

"End of proprietary computing as we know it"

As regards containerization, there are many comparisons & aspects to an analogy. I have been working
on bringing these together into a presentation.

Cheers

Doug
Re: A couple of points you raise ...
"As for the impact that XML will have on the world, I don't know what that will be like. But I
suspect that it will be less than the hype has it. Why? Very simply because XML requires all
parties involved to look at the same numbers, label them the same way, and think of them
the same way. That works very well in a homogenous business, or if a small number of
suppliers dominate a vertical market. But it stops working so well where different industries
have to interface. And it runs counter to how competition in business proceeds."

But this is why XML Schema was introduced & why the Schema is integral to Web Services.

For this shebang to work, both parties *have* to accept the published schema - not doing so is akin to the sender & receiver of a shipping container arguing over different bills of lading for the one container. The Schema for a service gets published, & subscribing to the service means you have adopted the common schema that defines the contents.

I agree that, for example, the auto industry has many factions & some may decide to adopt a different schema than another faction, say for ordering spare vehicle parts (this is probably going to happen). But even if that occurs, XML provides the XSL family to ease the job of bringing incompatible schema formats together.
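A rough illustration of that bridging job, with hypothetical tag names (`partOrder`, `sparePartRequest`, etc.): in practice one would write an XSL stylesheet, but the same kind of structural mapping can be hand-rolled in a few lines of Python.

```python
import xml.etree.ElementTree as ET

# Hypothetical mapping between two factions' part-ordering vocabularies.
TAG_MAP = {"partOrder": "sparePartRequest", "partNo": "itemCode", "qty": "quantity"}

def translate(xml_text, tag_map):
    """Rename elements per tag_map -- the kind of structural mapping
    an XSL stylesheet would express declaratively."""
    root = ET.fromstring(xml_text)
    for el in root.iter():
        el.tag = tag_map.get(el.tag, el.tag)
    return ET.tostring(root, encoding="unicode")

src = "<partOrder><partNo>AX-7</partNo><qty>4</qty></partOrder>"
print(translate(src, TAG_MAP))
# <sparePartRequest><itemCode>AX-7</itemCode><quantity>4</quantity></sparePartRequest>
```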

XML is extremely well thought out (as were the 2 main container sizes). There are bound to be orgs who want to ignore what becomes popular,
but as in the shipping container world, they may well get trampled in the standards-compliant traffic.

Also, XML is actually a family of technologies - web services is a concept that extends XML and exploits many aspects of the XML family.

POINT 2 *****************

"What this means is that attempts to homogenize how businesses interact with each other
run counter to the businesses real need to each be different. Of course there are huge
savings in transaction costs in being homogeneous, so businesses walk a fine line between
what to be homogenous on, and what to not be. But the closer you get to the core
business processes, the more they each need to be different. (Which is one of the
fundamental reasons that most of the software development done in this world is in-house
- and why that will remain the case.)"

Yes, each business tends to adopt a unique set of processes that differentiates the services it offers. All XML & Web Services are likely to do is speed up the ability of businesses to define new & quicker-adapting processes that provide a faster time-to-market.

There are many analogies in manufacturing - just-in-time ordering of raw materials replaced stockpiling them - it cut costs, provided better business cash-flow, reduced obsolescence etc. - all brought about by IT offering more reliable ordering capability (such as EDI).

Now EDI - that is what is going to be heavily impacted by web services - in effect WS is EDI for the masses & at the same time a vastly more open & superior concept. But EDI is crucial to the big business & supply-chain / logistics industries. Re EDI, it is UN/CEFACT that is driving EDI/ebXML.

Cheers

Doug
Edited by dmarker2 June 10, 2002, 08:48:52 AM EDT
Edited by dmarker2 June 10, 2002, 08:57:38 AM EDT
Well, it won't affect my job
except when I have to interface to the customer's data collection systems.

I think a lot of the article you linked to could be summed up as "Garbage In, Garbage Out" -- and people often forget the importance of garbage detectors.

Tony
Web Services Certainly NOT! Cheaper Labor And Attitudes!
Most of the industries I'm tracking (banking, healthcare, travel) are either not using XML (never mind UDDI/SOAP/etc.) or are only using it for "tactical" and "closed" implementations.

Especially since 9/11, most companies are very rapidly becoming very closed about their transactions, what's in them, and what they do. Open Standards are quickly going out of vogue because companies don't want other companies deciphering their messages (or counting the frequency of how many they get per hour, or figuring out which messages are being sent to what vendors, etc.)

Banks/finance are the most progressive of the IT lot, and they are probably the furthest along in implementing XML. However, with 9/11, RAS (recoverability, availability, security) have become their mantra, so many "new" XML projects are on hold.

In my industry (pharmacy claims), we're about to implement another proprietary protocol, completely ignoring XML. Medicare/Medicaid are about to standardize on some ANSI protocols for HIPAA that have nothing to do with XML. Once these products are built late this year and early 2003, then there will be 10 years of ROI that must be produced by all these pharmacy/health companies before they will change again. (Will XML be around in 10 years for the next round?)

Hospitals are using government ANSI protocols, and the airline industry uses UN/EDIFACT.

So, I think the answer is that with the government's large and growing influence in our lives, web services aren't a threat at all. Web Services will only be implemented where the government decides to standardize on them, or where some commercial implementation with no government involvement lets progressive programmers use them. Since most big software companies aren't very progressive, that leaves dot-coms and startups, and we all know how well those have fared recently. The InfoWorlds of the world can rave on for years about how great XML, SOAP, and UDDI are, but until these protocols become part of government standards, and established industry standards, many companies are just going to ignore them because they aren't "strategic".

Cheap software/IT/programming labor, however, IS the mantra of companies today. The goal is to get software developers and IT laborers out of the white collar (i.e. "professional") labor pool and turn them into modern-day low-level salaried slaves with wages just slightly better than someone answering phones at a call center. (Probably worse if you account for unpaid overtime.) Many folks I know (through church and companies I used to work for) making what I consider to be reasonable wages are currently under fire, either with promises that when their project ends they will NOT get a chance to be redeployed in new roles, or with outright layoffs. Many of these companies want to move the jobs to India, or at least get enough software skills out of work to bring the wage structures down to where software professionals make $40-50K a year.

<rant on>
Funny, we ( a small pharmacy benefits claim processing company ) have put together a really nice Java/Unix/DB2 Medicare claims processing system for a multi-billion dollar pharmacy insurance company, and the multi-billion dollar pharmacy company can't even send us Medicare claims on a regular basis to process. This is occurring because the multi-billion dollar company has few employees left who understand Medicare, who understand pharmacy benefits, and who are paid well enough to care. We spend lots of time training their employees, and begging for these claims, explaining the Medicare process, and showing them cost justification after cost justification for why we process these more efficiently than they do. They don't get it. Many of these multi-billion dollar companies want to pay slave wages and treat people badly, then expect them to care about their business.
<rant off>

The old adage is still true, "You Get What You Pay For", but I would amend it with "If You Know What To Ask For."

<rant on>
I'm astounded by the small number of people who understand or want to learn (you have to learn them to improve them) the business processes that companies use to get their products and services to market. I'm still very afraid that in 50 years we'll end up with the Soviet or Italian economy, where no one knows how to do anything, where everyone has a political agenda, and where 2-3 giant companies pay dreadful wages to demoralized employees who don't really care.
<rant off>

Glen Austin
I think there are two problems here
First is that companies don't have their processes straightened out. In every reasonably large project I've worked on, the greatest bottleneck has been trying to write the business rules. For instance: The company directory that couldn't go online because the phone numbers, department assignments, secretarial assignments, office locations, etc. etc. etc. were all maintained by different people, in different formats ranging from databases to spreadsheets to WordPerfect files. No one would give up their little piece of control (power) over "their" information.

Computer systems rely on concise, or at least unambiguous, rules. When the existing business practice doesn't fit these criteria, it's not ready for automation.

The second problem, which depends on the first, is that software vendors tell companies that they can replace expensive, experienced personnel with some new product. The reason this works is that no business wants to believe they are guilty of the first problem I mentioned. "Since our processes are well thought out, they can be automated."

Your multi-billion dollar pharmacy company has gone to step three -- replace the experienced people with shaved monkeys -- without going through steps one and two -- standardize the process and automate it.
===
Microsoft offers them the one thing most business people will pay any price for - the ability to say "we had no choice - everyone's doing it that way." -- [link|http://z.iwethey.org/forums/render/content/show?contentid=38978|Andrew Grygus]
Perhaps I should have qualified....
"Where companies have their business processes worked out (a small minority)."

The rest of the companies just run what I call chaos centers. A Director or VP who realizes that they need to get enough organization together to survive long enough to collect their extended severance package (golden parachute) applies band-aid after band-aid and hack after hack to existing systems and business processes, surviving until another company hires him/her away or he/she gets laid off with a 2-year severance. No system, just chaos; maybe you get lucky once in a while, and the rest of the time you play the "blame game".

The turf battles are the worst, and I left a great technology beta in 1997 at SABRE, knowing that the turf battles would rage on for years. Three YEARS after I left the company, the company finally accepted my technology as the standard for the entire company. The technology took about a year to create, and three years to adopt. Funny, I wonder if they even still have a programmer left who knows how it works.

I think the second problem you see is the reality at most companies now. The new MBA/VP who arrives is told to reduce costs and just goes after the heads with the largest salaries and bonuses (who don't have some kind of "insider knowledge" on one of the owners of the company). In other words, if you're a good worker, and you're not hoarding technology or keeping the secret of who the boss's "extra" children are, then you're sacrificed.

So, the new MBA/VP cuts the people familiar with the business processes who don't have an "insider card", and leaves the rest, the low-salary new hires, and the Indian outsourcing company with the remaining mess. When it all falls apart after 2 years, he blames his peer departments and business partners who could not provide him with "good service" and walks away with a 2-year severance, only to play golf and scout out his next victim.

Sorry, I'm a little bitter about this now. Instead of a 2 year severance, I think they should get two years in Huntsville or San Quentin (prison).

Glen Austin

Health care
Especially since 9/11, most companies are very rapidly becoming very closed about their transactions, what's in them, and what they do.
Though 9/11 has had an effect, the main impetus for companies guarding their data and processes more closely is simply the business cycle. When times are lean, companies tend to become much more defensive, trying to hold onto whatever advantage they can - real or perceived.

In my industry (pharmacy claims), we're about to implement another proprietary protocol, completely ignoring XML. Medicare/Medicaid are about to standardize on some ANSI protocols for HIPAA that have nothing to do with XML.
I work on the front and back end sides of medical benefits (setting up eligibility and doing the expensing on our self-insured plans), and I just don't see that XML and web services are going to revolutionize the field. For our Medicare people, most of the providers require paper for each and every little thing - having a disdain for anything electronic when it comes to dealing with government agencies. They want that signed piece of paper in their files for their own protection.

The HIPAA requirements aren't so much a standard on how to share information as they are policies for protecting data from prying eyes - privacy. As such, HIPAA regulations are not intended to improve the efficiency of information exchange so much as to provide a means whereby the necessary stuff can be exchanged without divulging too much. That's not to say that it isn't a positive move. In the past, many of our vendors received their data via email or http uploads in raw text files. Nowadays, more and more of them are setting up secure ftp and requiring pgp encryption - a good thing.

As for the file formats themselves, there is currently a certain amount of inefficiency due to each vendor having a different file layout. But even with XML, you have to tailor the data to each vendor because their plans are structured quite differently. You also get into the trap of universality with the XML formats. For example, BCBS used their own proprietary format until very recently, which was described in a 50-page document. Starting next year, they are shifting to an XML format for their eligibility files that is much more generic. Unfortunately, the document for creating these XML files is well over 250 pages long and leaves many unanswered questions about what is (a) required; (b) optional; (c) deprecated or irrelevant.

The beauty of flat files is that they have been trimmed to the bare necessities over the years. With XML, it seems that the vendors are using the kitchen sink approach - because it doesn't cost anything to make them optional in the DTD, they see no need to filter out the inane. Problem is that some of the optional stuff is necessary for our business - so we have to spend an inordinate amount of time dissecting the file layout to determine what is safe to ignore. In the end, you still have to make educated guesses about how the information is used by the vendor in their claims processing software and how it impacts the data that will be returned to you at the other end.
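A small sketch of that dissection work, with a made-up `member` record and field names: pull out the handful of elements the business actually needs, fail loudly if one is missing, and ignore the optional filler the vendor threw in.

```python
import xml.etree.ElementTree as ET

# A verbose (hypothetical) eligibility record: most elements are optional
# filler; only a couple matter to this business.
record = """
<member>
  <id>12345</id>
  <planCode>PPO-2</planCode>
  <maritalStatus/>
  <preferredContactWindow/>
  <legacySystemFlag>N</legacySystemFlag>
</member>"""

REQUIRED = ["id", "planCode"]  # fields our claims processing actually needs

def extract(xml_text, required):
    """Return only the required fields, raising if any is absent or empty."""
    root = ET.fromstring(xml_text)
    out = {}
    for tag in required:
        node = root.find(tag)
        if node is None or not (node.text or "").strip():
            raise ValueError(f"missing required field: {tag}")
        out[tag] = node.text.strip()
    return out

print(extract(record, REQUIRED))  # {'id': '12345', 'planCode': 'PPO-2'}
```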
