Friday, April 3, 2009

Assessing the current state of your IT system

A key step of the initial IT transformation process is to understand the true state of your IT system today.

In medieval times, parchment was so scarce that monks transcribing books were forced to scrape off earlier text and reuse the page. A “palimpsest” is a manuscript page that has been scraped clean and then written over. Chances are that your company’s IT heritage is a palimpsest of Rube Goldberg tactical solutions, deployed over the years.

You can deconstruct your current IT system just as a seasoned archeologist can figure out how different civilizations emerged by analyzing the ground strata and digging up past artifacts. The difference is that while the archeologist’s analysis reveals clues of long-extinct tools and civilizations, your analysis will reveal that nothing has really been laid to rest. Most legacy IT data centers are like the town of Macondo in ‘One Hundred Years of Solitude’: everything remains forever, and the spirits of the past never go away.

There is a traditional pattern to a legacy information system. This pattern will most likely include some form of mainframe or mini-computer complex still faithfully executing relic languages like COBOL or some of those fourth generation languages that were so popular in the last decade—that software being gingerly maintained by the dwindling group of developers still able to remember these computer languages[1].

Located next to that central complex, you are likely to encounter a cluster of mini-computers, so popular in the seventies, executing an obscure network protocol translation or running programs that serve no clear purpose but that everyone is fearful to remove, lest the entire system collapse. Yes, as you dig deep into your eight-plus-year-old system documentation (last partially updated thanks to your company’s internship program), you will find a bunch of PCs still emulating dumb terminals and, in a topsy-turvy fashion, a few dumb terminals emulating PCs.

You will also find out that, unbeknownst to most, the most critical business intelligence reports are coming not from the data-mining OLAP system deployed by expensive consultants, but from a bunch of spreadsheets and an MS Access database on a PC, scripted by a programmer-wannabe in the accounting department.

Compounding this, chances are that your IT environment will carry a legacy of ossified technologies that never made it to prime time and are no longer supported. These are the kind of technologies that author Ray Kurzweil labels “False Pretenders,” such as transatlantic blimps, quadraphonic sound systems, IVRs, and the original Apple Newton PDA.[2]

All this would be amusing if it weren’t the case that this very IT system is being tasked to support the new online world and is expected to handle millions of real-time transactions and very large information volumes. According to U.C. Berkeley, the entire world’s print and electronic media produce about 1.5 exabytes[3] worth of data per year (i.e. 500 billion U.S. photocopies, 610 billion e-mails, 7.5 quadrillion minutes of phone conversations). For comparison, every word spoken by all humans throughout the history of the world could be stored in around 5 exabytes. Consider now that these statistics were compiled prior to the social-networking explosion and the massive data storage growth represented by Web 2.0 features. Today, companies are expected to exploit knowledge of every nuance, preference, detail, and characteristic of a customer, would-be customer, partner, or business event. Ultimately, the IT revolution is all about how to best master that thing called “Information.”
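To put the Berkeley figure in perspective, a quick back-of-the-envelope calculation (a sketch only, using decimal units) shows what 1.5 exabytes per year means as a sustained rate:

```python
# Scale check for the figures above, using decimal (SI) units.
EXABYTE = 10**18  # bytes

yearly_bytes = 1.5 * EXABYTE          # world media output per year
seconds_per_year = 365 * 24 * 3600

# Average rate of new data production, in gigabytes per second.
per_second_gb = yearly_bytes / seconds_per_year / 10**9
print(f"{per_second_gb:.1f} GB of new data every second")  # ~47.6 GB/s

# All human speech ever (~5 EB) versus one year of media output.
ratio = (5 * EXABYTE) / yearly_bytes
print(f"~{ratio:.1f} years of media output equals all words ever spoken")
```

In other words, the world was already generating, on average, tens of gigabytes of new data every second before Web 2.0 took off.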

Utilizing and accessing this monstrous volume of information with legacy systems is simply a non-starter proposition. This is not due to constraints in the mainframe technology, or even in the storage capabilities inherited from distributed file server solutions, but to the haphazard way legacy architectures were constructed, which led to an unstructured and heterogeneous mix of systems and databases.

Now that real industry standards have finally taken hold and technology costs have dropped, it actually makes sense to apply computer resources with improved usability and more enduring flexibility to support future changes. It’s time to begin the arduous process of re-architecting the new systems around a service-oriented architecture (SOA). Unlike earlier “distributed-processing” epochs, be assured that SOA is not a false-pretender technology, but the real thing.
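As a minimal sketch of what that re-architecting looks like in practice (the service name and the legacy routine here are hypothetical, not from any particular system), the SOA approach puts a stable service contract in front of the legacy complex, so that consumers depend on the interface rather than on the mainframe-era implementation behind it:

```python
from abc import ABC, abstractmethod

class CustomerLookupService(ABC):
    """Service contract: consumers code against this, not the legacy system."""

    @abstractmethod
    def find_customer(self, customer_id: str) -> dict:
        ...

class LegacyMainframeAdapter(CustomerLookupService):
    """Adapter fronting a (hypothetical) COBOL routine on the mainframe.

    In a real system this would call the legacy complex through a gateway
    or message queue; here it is stubbed to show the shape of the seam."""

    def find_customer(self, customer_id: str) -> dict:
        # Stand-in for the call into the legacy complex.
        return {"id": customer_id, "name": "ACME Corp", "source": "mainframe"}

# Consumers need only the contract; the mainframe behind it can be
# replaced later without touching this code.
def greeting(service: CustomerLookupService, customer_id: str) -> str:
    customer = service.find_customer(customer_id)
    return f"Hello, {customer['name']}!"

print(greeting(LegacyMainframeAdapter(), "42"))  # prints "Hello, ACME Corp!"
```

The point of the seam is migration: once every consumer talks to the contract, the palimpsest underneath can finally be retired one layer at a time.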

The need for the IT system to support the emergence of new online business can be a starting point in the justification for transformation. But keep in mind that technology alone cannot be the main reason for this investment. Ultimately, the justification for the IT transformation should come from the need to support the business and, indirectly, from other drivers. In my next blog I will cover what those drivers might be…

[1] Actually, remembering a language is easy; what’s hard is keeping current the tools used to develop and compile programs in that particular language.

[2] The Singularity Is Near, by Ray Kurzweil.
Let’s call a spade a spade: Interactive Voice Response (IVR) is today’s king of false pretenders. I will agree that IVR works, but only when it manages to understand my accent!

[3] An exabyte is equivalent to 1,000 petabytes. A petabyte is equivalent to 1,000 terabytes. A terabyte is equivalent to 1,000 gigabytes, which is about what you could get with two external disk drives for less than $200 in 2008.