by Orlando Scott-Cowley
We don’t yet know what the next version of Microsoft Exchange Server will be called. Exchange 15 is an assumption based on the version number, the last being Microsoft Exchange Server 2010, which was version 14 (14.01.0218.015 for SP1, to be specific). Nor do we know when Exchange 15, or Exchange Server 2013 (if the three-year cycle holds), will be released into the wild.
So while we are waiting, here’s a quick look at the versions that have come before:
The Younger Half-Cousin Done Good – Exchange 1.0
Actually not a server product at all, Exchange 1.0, or Windows Messaging to give it its real name, was an email client included in Windows 95, 98 and NT4. The fashion at the time, 1996, was to lack support for Internet email, so if you were looking for SMTP or POP3 support you were out of luck unless you installed the separate Microsoft Plus! pack, codenamed ‘Frosting’. Frosting included other delights such as Space Cadet Pinball, DriveSpace 3 and some space-age screensaver and wallpaper files.
Windows Messaging had us hooked though; we never looked back. In 1996 HTML messages were evil, and our marketing departments hadn’t yet worked out how to brand plain text emails. It was de rigueur to include an obscure quote in your footer, and if you were really out to impress, a string of special characters on the verge of ASCII art. There was no support for international characters either; sorry, rest of the world, you lose.
The First True Server Product – Exchange 4.0
The server side of the business came crashing through the door one day, wanting in on the funky new Windows Messaging action. Exchange Server 4.0 was born as an upgrade to Microsoft Mail 3.5, a product which had itself started life in 1991 as something called Network Courier. Lotus were very excited about acquiring cc:Mail at about the same time, so the race was on.
Our CEO, Peter Bauer, still proudly displays his Microsoft Mail 3.5 certifications on the office wall.
Exchange Server 4.0 was a wholly new system, designed on the X.400 client–server model, supported by a single database and an X.500 directory service, which later morphed into Active Directory.
1997 Microsoft Valued at $261 Billion and Exchange Server 5.0
In the year Microsoft became the world’s most valuable company, Bill Clinton was sworn in for a second term and Steve Jobs returned to Apple, Microsoft released Exchange 5.0 and the new Exchange Administrator console.
Adding the new Internet Mail Connector allowed your shiny new Exchange 5.0 server to communicate over the internet via SMTP, allowing your users to arrange their days around the send/receive button for the first time. Exchange 5.0 also introduced a new-fangled web-based email interface, uninspiringly called Exchange Web Access.
A Few Short Months Later – Exchange Server 5.5
In November 1997 while the tech world was distracted by the $37 Billion merger of WorldCom and MCI Communications, Microsoft snuck out Exchange 5.5, which was sold in two editions, Standard and Enterprise. Standard was limited to a 16GB database which was a throwback to previous versions, whereas Enterprise Edition had a database limit of 16TB. I remember building a business case for Enterprise Edition and having to explain to the business why 16GB wasn’t enough – if only we knew then what we know now.
Drum Roll Please, Ladies and Gentlemen – Exchange 2000 Server
By November 2000 we were on Exchange Server version 6.0, codename Platinum. It was a big leap forward, with changes to support clustering and larger databases. But the upgrade required a complete Microsoft Active Directory infrastructure on the network, as there was no longer a built-in directory. There was also no in-place upgrade from previous versions of Exchange, so consultants and Microsoft Partners made merry with the consulting hours as customers needed both platforms online at once.
Codename Titanium – Exchange Server 2003
Version 6.5 added some useful migration tools that helped companies consolidate their distributed Exchange environments, and I have a true story that demonstrates this perfectly. One client of mine had twenty Exchange Servers, roughly one for each letter of the alphabet, with users distributed by surname and the less common letters like X, Y and Z doubled up. All twenty servers, none of which were virtualized, sat in the same datacenter in central London. How they got to this stage is a long story, but the realization that Exchange Server 2003 could solve the problem saw those twenty servers whittled down to a handful of streamlined clusters in four locations across the globe.
Bells and Whistles and a Big Fanfare – Exchange Server 2007
Exchange Server 2007 was released amid much fanfare and marketing by Microsoft, and rightly so: this version brought some wonderful new technologies and functionality. Sadly, though, some users chose not to upgrade and stayed languishing on Exchange Server 2000 and 2003; I even knew a few still on 5.5!
64-bit support was a bit of a struggle for some customers, but it eventually gave every IT department the budget to replace their old Exchange Server with a new, faster one. The 2007 release was version 8, codename E12, and brought an Enterprise Edition which allowed a whopping 16TB maximum database size.
HA database clustering was given a whole new batch of TLAs: SCC, LCR, SCR and CCR. The Exchange Management Shell arrived, as did Unified Messaging and Outlook Anywhere (which was really just RPC over HTTP).
However, Microsoft announced the death of public folders in the next release, citing what they called a ‘Wild West’ of public folders.
Which Brings us Right up to Today – Exchange Server 2010
November 2009: Exchange Server 2010, or Exchange 14 to those in the know, hit the market brimming with cool new features. Database Availability Groups, or DAGs, replaced the clustering options from Exchange Server 2007 and proved more popular than ever. Server roles became important to Exchange architects everywhere and sparked much debate about where in the network to put the CAS.
Luckily, large mailbox support was extended beyond its initial introduction in Exchange Server 2007, which was good news for most, as end users had been using disk space like there was no tomorrow and databases everywhere were getting rather full. To this day I know an end user who refuses to ‘clean out’ his 32GB mailbox on the basis that he “needs” all of that email and knows “exactly where it all is” – you know who you are, sir, if you’re reading this.
Public folders are included in 2010, not deprecated as planned.
Unfortunately some businesses are still languishing on old versions like Exchange Server 2000 and 2003. You also know who you are, and you know how keen we are to get you to Exchange Server 2010.
Enter the Cloud – Office 365
Luckily for administrators dealing with “Mr. I’ve Got a Huge Mailbox & I Don’t Care”, the Cloud has started to make life easier by offloading some of those Big Data and email management issues. Microsoft have just launched Office 365 and introduced Exchange Online. Although Exchange Online has been around since about 2005, and mainstream since 2008, it’s only now that users are beginning to see the real benefits of the Cloud.
Enter Exchange 15
What Exchange 15, or Exchange Server 2013, will bring is still shrouded in mystery; even the server’s real name is unknown (I’m hoping for “Philip”). But it’s just around the corner, and information is slowly trickling out of Redmond.
Personally, I can’t wait.
The future… if we actually had an endless supply of dilithium crystals or flux capacitors, gadgets like floating skateboards and tricorders might be more common. Sadly they’re not, so the only real prediction I can make for the future (that’s relevant to this blog post, anyway) is that Microsoft are planning to release a new version of their Exchange Server software every three years. We should see the next version, currently being called Exchange 15, towards the end of next year.
Like Christmas, new versions of core server software seem to come round far too quickly, especially for services as valuable as Microsoft Exchange. We’ve previously mentioned the lengthy procurement cycles that keep such services a constant version behind, which generated some good feedback and discussion; many Exchange admins told me those delays adversely impact their deployment plans, which is intensely frustrating for them and often forces their migration projects into the red.
So, rather than roll out the ubiquitous predictions for 2012, I’m going to suggest that, in the absence of 1.21 gigawatts, you take a stab at future-proofing your Exchange environment now, so you’re not left thinking in future:
“I’m migrating again. Surely not? Didn’t I just finish the last upgrade?”
However, the last migration or upgrade you performed was probably a little easier; the requirements were different then, and there was dramatically less data than there is today. The move from Exchange 2003 to 2007 was mostly about the new 64-bit hardware required, but the move to Exchange 2010 is often about the volume of data instead.
As your users have made merry with the disk space allocated to the Exchange stores, their mailboxes have grown and grown. You’re probably wondering how you’re going to move several terabytes of data to the new Exchange platform and, more importantly, when you might have to do it all again. The short-term nature of IT and the constant cycle of upgrades and migrations means you may have to answer those questions sooner than you expected.
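As a rough back-of-the-envelope check, you can estimate how long that terabyte-scale move will take from the store size and the sustained throughput you measure in a pilot migration. The figures below are purely illustrative assumptions, not benchmarks from any real Exchange deployment:

```python
# Back-of-the-envelope migration time estimate.
# All figures are illustrative assumptions: measure your own
# pilot throughput before planning a real migration window.

def migration_hours(store_gb: float, throughput_gb_per_hour: float) -> float:
    """Hours needed to move a store at a sustained throughput."""
    return store_gb / throughput_gb_per_hour

# Assumed example: 4 TB of mailbox data moved at 20 GB/hour.
total_gb = 4 * 1024
rate = 20.0

hours = migration_hours(total_gb, rate)
print(f"Estimated move time: {hours:.0f} hours (~{hours / 24:.1f} days)")
```

Even this crude sum makes the point: at realistic throughputs, multi-terabyte stores turn a weekend migration into a multi-week project.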
One simple solution that future-proofs your migration and upgrade strategy is to deal with the data now by augmenting your on-premises Exchange with a Cloud-based email management solution. The elastic and scalable nature of the Cloud lets you ‘dump’ your oversize email stores into a secure, flexible and resilient service that grows with you, while still giving users direct access to that email data through Outlook as though it were still on Exchange.
Now here’s the part of the plan we don’t talk about very much, but one that provides a great degree of flexibility. When the next migration or upgrade comes around, or if you want to move from one platform to another, having already dealt with the data means your core email service, i.e. Exchange, can be anywhere or anything. Upgrade, downgrade, move to Office 365 and back again, migrate some users or all users – the choice is yours. Augmenting Exchange with the Cloud means you’re not tied to any one solution or version, either today or next year when it’s time to upgrade again.
This week Mimecast has been at the Gartner Data Center Conference 2011, in Las Vegas, with a packed agenda full of insightful discussions and presentations. As expected, the Cloud was a strong trend throughout the week, but I couldn’t help noticing that another trend has emerged since the last summit: Big Data, a topic this blog has written about many times before.
One particularly compelling presentation, by Gartner Research VPs Merv Adrian and Sheila Childs, delved into Big Data. The session was standing room only, so this is obviously a hot topic for people looking for insight to help them solve their own unique problems.
Adrian and Childs identified a shortcoming in the way business and technology leaders talk about Big Data: the emphasis is too often placed on volume alone. They rightly pointed out that
“The most difficult information management issues emerge from the simultaneous and persistent interaction of extreme volume, variety of data formats, velocity of record creation and variable latencies, and the complexity of individual data types within formats.”
By concentrating on the volume of data, we often forget about its velocity, variety and complexity.
Adrian and Childs went on to quantify velocity, which is when I started relating it to email data and Exchange Stores.
Velocity involves streams of data, structured record creation and availability for access and delivery. Velocity means both how fast data is being produced, and how fast the data must be processed to meet demand.
The most important factor when it comes to thinking about Big Data in relation to Microsoft Exchange Server, in my opinion, is velocity. Of course, most Exchange databases won’t hold the sort of Big Data that data center managers have to worry about, but for those of us who manage Exchange Servers, I’ll bet the data therein is one of the largest repositories in the environment. To coin a phrase of our Chief Scientist, you have essentially got a nano-Google’s worth of data: important to you, but nothing that hasn’t been dealt with before. Try telling that to the Exchange administrator planning to migrate the stores from one version of Exchange to another, though.
So what is the velocity of your Exchange Server? If velocity is the stream of data, record creation and availability for access and delivery, I’m sure there must be an equation that would give us an actual figure. But I was thinking about it more in terms of everyday reality, especially if that reality means an upgrade or migration.
The unique Big Data complexity that exists within each Exchange environment is compounded by the velocity of the email environment that surrounds it. The data will continue to grow at a rate determined by a number of local factors: corporate culture, use of email, access to email, and the integration of email into other systems. Again, I’m sure there is a quantitative way to work this velocity out.
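There is no standard formula for it, but a first-order sketch of that velocity figure might look like the following. Every input here is an assumed example for a hypothetical mid-size organisation, not real measurement data:

```python
# First-order sketch of an Exchange store's "velocity": how fast
# new mail data accumulates. All input figures are assumptions.

def daily_growth_gb(users: int, msgs_per_user_day: int,
                    avg_msg_kb: float) -> float:
    """Approximate GB of new mail data per day across all users."""
    return users * msgs_per_user_day * avg_msg_kb / (1024 * 1024)

# Assumed example figures for a mid-size organisation:
users = 2000        # mailboxes
msgs = 100          # messages sent/received per user per day
avg_kb = 75.0       # average message size in KB

per_day = daily_growth_gb(users, msgs, avg_kb)
print(f"~{per_day:.1f} GB/day, ~{per_day * 365:.0f} GB/year")
```

Crude as it is, plugging in your own local factors gives you a growth rate you can plan storage and migration windows around, which is more useful than a vague sense that the stores are "getting rather full".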
When you’re thinking of doing something with your nano-Google Exchange store, I would suggest that getting a grip on the velocity of Exchange is the first step. I doubt very much that you can do anything to throttle that velocity, at least not without upsetting your users. So I’m drawn to the phrase “Just Enough on Site”, one we use at Mimecast to describe an Exchange environment that has been given the benefit of Cloud augmentation to take the Big Data load off the server before, during and after a tricky migration.
I would argue that the amount of ‘online’ data an Exchange Server really needs is pretty minimal, probably about a month or two’s worth. The rest doesn’t need to be offline, but keeping it near-line is far more productive. Remember, velocity is also about how fast the data must be processed to meet demand. Surely putting the older, less accessed data near-line in the Cloud means your Exchange can concentrate on the online velocity of the real-time data?
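To put a rough figure on “Just Enough on Site”, you can work out what fraction of the store a short online window actually represents, assuming (purely for illustration) that mail has accumulated at a steady rate over the life of the store:

```python
# Rough split between "online" and "near-line" data, assuming mail
# accumulates at a steady rate. Both inputs are illustrative.

def online_fraction(window_days: int, store_age_days: int) -> float:
    """Fraction of a steadily-growing store covered by the online window."""
    return min(window_days / store_age_days, 1.0)

# Assumed example: keep 60 days online out of 5 years of mail.
frac = online_fraction(60, 5 * 365)
print(f"Online window covers ~{frac:.1%} of the store")
# The rest sits near-line in the cloud, so Exchange only has to
# serve the hot, recent data.
```

Under those assumptions a two-month window is a small single-digit percentage of the total store, which is the whole argument: the velocity Exchange has to serve in real time is a tiny slice of the data it is currently carrying.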
Mimecast recently commissioned Loudhouse, an independent research consultancy, to take a look at the Exchange migration situation. The research tells us that there is a mass migration of Microsoft Exchange Servers going on right now, something we at Mimecast call ‘The Great Email Migration’, and it has uncovered some interesting facts and figures.
Underneath the headline research figures there is a lot going on that struck me as interesting, if not perplexing and clearly frustrating. I’m often talking to CIOs and IT managers about their email infrastructure and, recently, their plans to migrate to the next version of Microsoft Exchange Server. I always assume they’re planning to upgrade to Exchange 2010 or Office 365, but I’m hearing of more and more choosing to stay a version behind on Exchange 2007, and not for want of trying.
Microsoft, and in fact Mimecast, are desperate to get you all off the old versions of Exchange, away from those Exchange 2000 or 2003 boxes that are still out there, but for so many the upgrade path stops at Exchange 2007. I began to wonder why, and after a quick unofficial straw poll a pattern emerged.
Firstly, I noticed that upgrade plans for Exchange have been in the pipeline for quite a long time. Many people tell me they were planning to upgrade from the 2000/2003 versions of Exchange to 2007 pretty much as soon as they heard about the new release. But given the scale of the upgrade, the project took longer to budget and plan for; most blame their own overly complex internal procurement process, whereby a non-technical procurement employee vetoes or delays the project for trivial reasons.
Secondly, I’ve heard quite a few mentions of a “patch-and-pray” mentality towards upgrades. Let me be clear: this kind of support process only lasts so long before your Exchange admin is facing a late night and a lost weekend thanks to some sort of failure, and that’s the last thing we want. At some point the CIO has to admit that the business and its users have outgrown their email environment and it’s time to look elsewhere; this overly cautious approach, akin to the “if it ain’t broke, don’t fix it” method, means you’ll never be close to the latest version. Fear of change, hesitation and caution are the enemies of new technology.
All of this frustrating behavior adds up to significant delays; delays that leave your IT project plans looking like an airport departures board during a heavy snowstorm. You know you’ll get there in the end, but the wait is agonizing and you would do almost anything just to “get on with it.”
A permanent cycle of delays means your Exchange environment could always be stuck a version behind. Given that Microsoft plan to release a new version of Exchange every three years, I’m always concerned when I hear of project life-cycles that are even longer; how can you possibly take longer to deploy the platform than it took the vendor to write the software in the first place? Don’t answer that; I already know how: project scope, evaluation, planning, more planning, more evaluation, procurement, re-scoping, procurement, deployment planning, re-scoping, procurement, and so on. Initial project evaluation to final deployment for Exchange 2007 could have taken so long that Microsoft released Exchange 2010 in the meantime. And so the cycle continues.
Breaking the upgrade cycle is something I’ve written about before; now is the time. Seriously, Exchange 2010 is worth the effort, especially if you’re still floundering about with old versions like 2000 and 2003.