
Wednesday, November 2, 2011

Impure Implementations

I am a big fan of using the right tool for the job. When it comes to carpentry or surgery, this concept seems obvious, well understood, and usually followed. But when it comes to large IT projects, I find a real tendency in large organizations to try to solve every problem with whatever single tool they have blessed. Let us call this tendency the Silver Bullet Assumption.

Many organizations are Visual BASIC shops, or Visual C++ shops, or Python shops, etc. This baffles me: there are many excellent tools out there, but there is no tool that is great for every aspect of a large project.

I find that large projects usually have most or all of these aspects:
  • an inbound interface for acquiring data
  • a collection of data processing functions
  • a way to store the data
  • a user interface (UI) to view the data
  • reports and exports to send the processed data down the line
  • an outbound interface for sharing data
Thus most of these projects have some systems-level requirements, some user-level requirements, some low-level data processing and some high-level data manipulation. What single tool is oriented in all these directions at once? If a single tool did encompass all these aspects, could it do so in a consistent manner, or would it be a wrapper on a wide collection of disparate parts?

Even if such a tool existed, who would use it? Someone who understood all those different domains?

In our consultancy, we have a breakdown that I think is pretty common, or used to be: we have systems people (who work mostly under Unix) and apps people (who work mostly under Windows), with Web work lying somewhere in between.

We use MS-Office or web pages to provide UIs on the desktop, web pages and thin clients to provide data entry on the floor, Unix servers to provide print service, file service and web service. It is hard to recall a project of any scope that did not cross these boundaries.

We are constantly asked questions about implementations which assume that everyone does everything: the Unix systems programmer who is supposed to know about MS-Access apps, and vice versa. When we push back, we find that many organizations have the notion of "programmer," or even "systems programmer" versus "applications programmer" versus "server guy," but all these programmers are using the same environment: Windows or Unix, and it is mostly Windows.

Clear as we are about our design philosophy, even we occasionally have requests for "pure" implementations, with the hope that if the technology under a large project is consistent, that large project will be easier for local IT to understand and support.

But this is often a forlorn hope: if your people do not understand bar code grokking or TCP/IP-based protocols, it very likely won't help if the thing they don't understand is implemented in a familiar technology. At worst, they will have a false confidence that will lead them to fiddle where they should not fiddle.

(I speak from bitter experience, of course. Ah, the support phone calls which start by saying that some of our technology does not work and end with them admitting that they "fixed" something else right before the mysterious new failure began.)

I just don't buy the premise that being fluent in systems, apps, networking, infrastructure and databases is a reasonable expectation, let alone the usual case. You know that you need network people, desktop support people, server people, etc. Why do you think that they should all be working in the same environment? What does that even mean, when you think about it: how is a desktop app like a print server?

This illusion of the benefits of purity is encouraged by vendors, so I suppose the customers are not really to blame. The first time I laid hands on Oracle, lo! these many moons ago, I was stunned at all the special-purpose configuration file formats and languages I needed to learn in order to tune the installation. But the client thought of themselves as a pure Oracle shop. This is like saying that all of humanity is part of a pure human language shop--we just use different flavors of language.

Very recently, I worked with a system that was billed as all Windows, all the time. Except that when push came to shove and I needed to debug some of its behavior, I came to find out that the core was a Unix server running a ported COBOL app. Egad! Knowing that it was COBOL through which the data was passing made debugging that systems interface much easier, by the way.

Why tell the customer that they are buying a Windows app running on Windows servers, with some kind of remote back end? I don't know: it must be comforting to someone somewhere.

I prefer to be more upfront with my clients: I will use whatever technology will get the job done, with an eye to accuracy and speed. I want to save my time and their money. I try to use technology that they already own, but I cannot guarantee that--unless they want to pay extra; often LOTS extra.

If I have to use MS-Access at the front, FTP in the middle and Oracle on the back end, then so be it. I find the choice between requiring minimal maintenance, but making local IT uncomfortable, and requiring lots of maintenance, but making local IT (probably falsely) confident, an easy one to make.

Just last month, we shut down a system of ours that had been in continuous operation since early 1984. That's 27 years of service, with an average of under 10 hours of attention per year. This system's impure implementation made local IT nervous, but it also allowed us to adapt to the dramatic infrastructure changes over that time. In the end, it was time to retire it: a 16-bit runtime environment under a 32-bit operating system running on a 64-bit architecture is a bit baroque even for me.

So while nothing lasts forever, I claim that the concept is sound: until there is a single, simple, all-encompassing technology, use what makes sense, even if the final product has multiple technology environments under the hood. There is no silver bullet and there never was.
