by Orlando Scott-Cowley
There is no doubt Microsoft Exchange 2010 is a project many CIOs will be tackling this year, but I wonder if they’ve really thought about what it means to be ready to migrate. How many CIOs, IT managers and admins have considered how ‘migration-ready’ their existing environment really is?
When I think about ways to help organizations achieve a smooth Exchange 2010 migration, I often set my sights on the fragmented email environments that hamper this migration readiness. These are the LAN-based email management tools we’ve proudly built up over the years, running on the periphery surrounding Exchange. They are many, and they are a problem; the time has come to declare them, and the fragmentation and complexity they cause, The Enemy!
Why is this? Well, we live in a throw-away society where planned obsolescence is built into everything. The life of our ‘stuff’ is limited. The technology in our lives, whether domestic or professional, is a perfect example of this planned obsolescence; especially if you happen to be a slave to a particular vendor who is prone to releasing a new version every twelve months. I hear the iPad 3 is going to have five cameras, by the way.
We generally try to find a balance between making our technology functional and keeping it up to date. At work, our IT departments are always finding innovative ways to solve problems with what they already have, the ‘doing more with less’ principle. Every once in a while they splash out on a new gadget – usually to solve another problem! This ‘mentality’ is about balancing budget with functionality and effectiveness; it’s how we’ve operated the corporate IT department for the last thirty years. Our IT managers and CIOs have added solutions to the network as and when a new problem occurs (remember your first anti-spam solution – oh the relief!). They’ve added tools on a reasonably regular basis, holding onto the ones they already have, slowly growing the infrastructure.
Of course all these IT infrastructure tools are designed to be replaced too, but finding the time and budget to do that is hard. Patch and replace is another ‘mentality’ that allows us to simply keep on upgrading everything in place – but it doesn’t solve the now-obvious problem of complexity and fragmentation, and it certainly doesn’t help you when it comes to being migration-ready. Take your email management environment as an example – probably a collection of various solutions all working independently in support of a single email platform like Microsoft Exchange.
With many IT managers and CIOs thinking about ways to upgrade to Microsoft Exchange 2010, or perhaps even BPOS or Office 365, it’s natural to take a look at their current setup. They like to ignore it, but the fragmentation that has built up over the years is getting in the way. You could even say the fragmentation is the enemy of an effective migration to Microsoft Exchange 2010. All of these fragmented point solutions simply conspire against you when you try to migrate any key component; they create unnecessary risk, unanticipated cost, and planned (and unplanned) downtime. When we talk about this fragmentation we really do mean it has become the enemy of your migration plans, especially if you are thinking about moving all or parts of your services to the Cloud.
Getting rid of this fragmentation now, before you migrate, will make life a lot easier afterwards, and getting rid of on-premise clutter in favor of the Cloud means you won’t be looking at the same problem again when the next major upgrade is needed. I think in order to have a really effective plan for migrating any solution, especially Microsoft Exchange, you need to find a way of making as many efficiencies as possible during the migration. A piece of core infrastructure like Exchange almost requires you to do the best job you can to make sure you’re not just swapping out to a new version. De-cluttering and life-laundering your current environment is the only way to really make this an effective move; it is the only way to get a clean break from that tangle of technology you’re so entrenched in. I mean, why bother migrating unless you can make the outcome as perfect and brilliant as possible? You’re only selling your users short if you don’t.
Image thanks to Joe Shlabotnik via Flickr
by Orlando Scott-Cowley
When I’m talking to CIOs about their email compliance strategy they usually have a clear idea what their challenges are and the tools they would need to achieve a perfect Zen-like calmness within their environment.
However, they also know that in order to create their own private Zen Garden Server Room, it’s likely they would need a significant increase in IT budget.
We have written about the Cloud vs. DIY approach on this blog many times, out of a desire to raise awareness that the old money way, building it yourself, is increasingly less affordable and achievable, and certainly more complex, than what the Cloud has to offer.
Recently Forrester Research carried out a study of risk management and compliance executives within financial services firms who were once reliant on in-house systems but who are now moving to the Cloud. The headline findings were unsurprising, to me anyway; Forrester found that more than 60% of the respondents said their shift to the Cloud was being driven by the “promise of lower total cost of ownership”, increasingly simple and rapid deployments, and easier management than network-based alternatives. Over half of the respondents said they already had plans to use hosted email management and general compliance solutions by 2012.
Some key findings of the report also included the following:
- E-discovery and data privacy is a struggle: Some 55% said they had problems complying with privacy laws whilst at the same time making data accessible and searchable for e-discovery purposes.
- Compliance has many heads: 75% of decision makers at the polled financial services firms said they worried about the reputational damage of regulatory oversight or investigations. They also reported having major concerns about integrating the various compliance tools in their armory into the network in an efficient manner.
- Content, content, content: Email wasn’t the only driver for taking compliance to the cloud; other sources of content are being considered too. Anything that generates, consumes, removes or manipulates electronic content needs to be considered.
The findings of the report probably make a lot of sense to most people. I expect anyone with a reasonable driver for email compliance tools and services, whether internal or external, will be feeling similar pains.
When you stack up all the bells and whistles needed to achieve compliance with each and every regulation out there, especially if you’re in the financial services sector, that’s a pretty large heap of hardware and software – certainly one that won’t fit in the average Zen Garden. It’ll make a terrible mess of the gravel and bamboo if it does. The reason I said I’m not surprised by the report’s findings is that I’m exposed to people trying to solve these problems every day. Invariably the conversation gets onto how the Cloud can help remove this complexity, and one by one we usually cross the LAN-based compliance solutions off the whiteboard and move them out to the Cloud.
What really interests me, and the study by Forrester confirms this, is that CIOs are no longer thinking about the old money. They are the ones driving their businesses towards the Cloud; it is the CIO clearing out the multitude of tools that clutter their minds and infrastructure. It’s almost as though a sort of Cloud Enlightenment has come upon them.
Either that or the cost and complexity benefits are unbelievably clear.
by Orlando Scott-Cowley
In the last blog post of my Microsoft Exchange mini-series I suggested the idea that the complexity that surrounds a core Microsoft Exchange server might be what is causing some trepidation when it comes to upgrades.
By complexity, I mean the tangle of technology that has grown up around the Exchange server – all of those point solutions that have been added over the years to solve individual problems. All of the extra email management servers, the blinking lights, content filters, archives, disk arrays, AV/AS boxes, email encryption applications and other bells and/or whistles that are needed to keep the Exchange server up, running and efficient.
I understand that my post might have caused some alarm: In fact one IT manager emailed me to say he had never really considered all of those other solutions as a collective complex problem. He told me that each point solution was managed by different members of his team, and even other departments in some cases (like legal and professional standards for example), and that everyone just got on with the job.
Is there a collective noun for email management applications and tools? If there isn’t, I’d like to suggest a few: a “Hectic”, or perhaps a “Chaos”, or maybe a “Delicate”? Answers on a postcard please – the best suggestion wins a Mimecast complexity monster t-shirt.
This complexity we’re talking about is a bit of an old money solution, isn’t it? I often ask CIOs what they would do if they were setting up a greenfield site: would they replicate what they’ve got, or would they do it differently? They usually say something along the lines of “if only I knew then what I know now”, referring back to the problems they’ve had to patch up over the years. Of course with hindsight we wouldn’t re-create the past.
The complexity problem is one that affects many businesses. Very few IT departments can really say they have designed out all their complexity. Upgrading central pieces of infrastructure, like Microsoft Exchange, is undoubtedly an ominous task when you’re surrounded by stacks of supporting applications. Perhaps as you consider a Microsoft Exchange upgrade, hopefully to Exchange 2010, you’ll take a look at all those point solutions in your network, the vast array of blinking lights on that complex email management infrastructure, and you’ll think that there must be a better way of doing this.
Maybe you’ll take this as the best opportunity you’ve got to ‘toss out the tin’ and get rid of those point solutions in an effort to iron out the complexity that has been handed down from generation to generation. The complexity problem isn’t one that is going to go away on its own; in fact the more patching, upgrading, installing and problem solving that goes on, the worse it will get and the larger that sprawl will become.
Now is the time to take a step back and look at what’s ‘become’ – then really & honestly decide what you can do to remove that complexity. What have you got to lose, apart from those pretty blinking lights – the blue ones were my favorite.
by Nathaniel Borenstein
Most people who use email today probably have never heard of punch cards, and although my readers are probably a bit more knowledgeable, I’ll describe them briefly. Punch cards were the primary method of data input and output for the earliest computers. Programmers would write programs, specialized operators would translate them into a series of cards which would be fed into the machine, which would run the programs and deliver its output via a new stack of punched cards.
In their heyday, billions of punch cards were used each year, and every program written for the first generations of computers was tailored to their capabilities. Various sizes of punch card were used and manufactured, but the dominant standard was the IBM/Hollerith 80 column card. By the time alternate input and output mechanisms such as keyboards and screens and printers became common, nearly every program in the world was designed to accept and produce 80-column data.
It was therefore natural that nearly every video display built for the first few decades of that technology’s existence was also geared to 80-column data. The 80-column display was a near universal standard, largely because it made it easier to convert older punch-card oriented programs to work on the newfangled TV-like screens.
Similarly, programs that exchanged data files tended to use 80 columns of data, which meant 80 characters per line. Various conventions were developed to indicate when data was continued on a subsequent line, though these conventions tended to differ from one protocol or application to another.
That is the world for which the first email programs were written. The Internet email protocols which evolved into SMTP were all designed for an 80-column world. Lest you judge the designers too harshly, recall that it was impossible to know what kind of system your file would be transferred to, and it was possible, well into the 1970s, that it would involve punched cards. Limiting the data to 80 columns was simply good, conservative engineering, ensuring that it wouldn’t break when transmitted to those older systems.
The world changed, of course. But changing a protocol that everyone uses every day is no small matter; it has to be done carefully and incrementally. You want to make sure that most mail servers can handle “long-line” email before you start sending it. So, although the standard eventually raised the line length limit to its current 1,000 characters (998 plus the two-character line ending), most mail-sending software still tries to keep lines under 80 characters, so that it won’t have to worry about problems with older mail software that receives it. At this point, no one really knows how many servers would have trouble with longer lines, but nearly everyone thinks it safest to stay under the 80-character limit first defined by IBM punch cards.
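That conservative engineering can be sketched in a few lines of Python. This is just an illustration (the helper name `check_line_lengths` and its limits are my own framing, not part of any mail library) of the two thresholds in play: the hard 998-character ceiling and the softer sub-80-character convention.

```python
# Two limits apply to each line of an email body: the protocol's hard
# ceiling of 998 characters (excluding the CRLF line ending), and the
# conservative ~78-character convention most senders still follow.

def check_line_lengths(body, hard_limit=998, safe_limit=78):
    """For each line, report (within_hard_limit, within_safe_limit)."""
    return [(len(line) <= hard_limit, len(line) <= safe_limit)
            for line in body.split("\r\n")]

msg = "A short greeting\r\n" + "x" * 200   # second line is over-long
print(check_line_lengths(msg))             # [(True, True), (True, False)]
```

The second line here is legal per the modern standard but violates the old 80-column convention, which is exactly the gray zone the paragraph above describes.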
You might also ask, why should there be a line limit at all? Why not just send a binary chunk of data, like most modern software? Again, the answer is in the installed base. Most email software is designed to receive data in line-oriented format, terminated by a standard end-of-line marker (about which I’ll have more to say in a future essay). Binary data would break nearly everything.
In the 1990s the mail gurus (yes, including me) designed an SMTP extension that would allow consenting mail servers to exchange binary data. As useful as that sounds, it has seen remarkably little use, because there is already a convention called base64 for converting binary data into line-oriented data. Base64 enlarges the data by about 33%, which sounds like the kind of thing engineers would want to avoid, but it’s much easier than changing all the mail software in the world.
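The 33% figure follows directly from the encoding: every 3 raw bytes become 4 ASCII characters, and MIME then wraps the output at 76 characters per line to stay comfortably under the 80-column limit. A quick sketch of the overhead, using Python’s standard library:

```python
import base64

# base64 maps every 3 raw bytes to 4 ASCII characters; encodebytes()
# also wraps the output at 76 characters per line, MIME-style, which
# keeps it comfortably inside the old 80-column world.
raw = bytes(range(256)) * 1000        # 256,000 bytes of arbitrary binary data
encoded = base64.encodebytes(raw)     # line-wrapped, newline-terminated ASCII

print(len(raw), len(encoded))         # encoded form is roughly 4/3 the size
print(round(len(encoded) / len(raw), 2))   # ~1.35 once line breaks are counted
```

The newline characters push the real-world overhead slightly past the theoretical 4/3, which is why a 6 MB attachment actually arrives as a bit more than 8 MB on the wire.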
So, we live in a world where my daughter can email me a video of my grandchildren, which still amazes and delights me, but that 6 megabyte video becomes 8 MB for transit because it has to be encoded as 80 column lines that are safe for punch cards, just in case it needs to be printed on them.
Protocols, like animals, evolve to produce solutions that work, not necessarily solutions that are optimal or elegant. We walk upright with a quadruped’s backbone, and email transmits video with a punch card’s line format.
As long as it ain’t broke, we probably won’t fix it.
Image via Luke Sheppard