by Clint Boessen
Microsoft has changed the way Offline Address Book (OAB) distribution works compared with previous versions of the product, removing a single point of failure in the Exchange 2007/2010 OAB generation design. While this new method of generating and distributing the Offline Address Book has its advantages, it also has a disadvantage which can result in a breach of privacy, especially in multi-tenant environments. In this article we will look at how OAB generation worked in the past as opposed to how it works now, highlighting both the good and the bad.
Back in May 2009, I published an article entitled “How OAB Distribution Works” which has received a large number of visits and can be found on my personal blog under the following URL link. That article explains in detail the process behind OAB generation in Exchange 2007 and 2010, and I highly recommend it to anyone who is not familiar with OAB generation in previous releases of the product.
If you have not read the above article, let’s quickly summarise. In Exchange 2007/2010 every OAB has a mailbox server responsible for OAB generation. That server generates the OAB according to a schedule and places it on an SMB share under \\mailboxservername\ExchangeOAB. The Exchange 2007/2010 CAS servers responsible for distributing the Offline Address Book then download the OAB from this share to a folder advertised through Internet Information Services (IIS). Outlook clients discover the path of the IIS website through Autodiscover and download the files located under the OAB IIS folder over HTTP or HTTPS. If you need a more in-depth understanding of this process, again I encourage you to read the blog post above.
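The client side of this flow can be sketched in a few lines: the client asks Autodiscover where the OAB lives, then fetches the files from that URL. The snippet below is a minimal, hypothetical model of that first step, assuming a heavily trimmed Autodiscover-style response (real responses use XML namespaces and carry many more elements; the hostname and GUID path are invented):

```python
# Minimal sketch of how an Outlook-style client might locate the OAB
# download path from an Autodiscover response. The XML below is a
# trimmed, hypothetical example; real responses are namespaced and
# far richer.
import xml.etree.ElementTree as ET

AUTODISCOVER_RESPONSE = """
<Autodiscover>
  <Response>
    <Account>
      <Protocol>
        <Type>EXCH</Type>
        <OABUrl>https://cas01.contoso.com/OAB/1a2b3c/</OABUrl>
      </Protocol>
    </Account>
  </Response>
</Autodiscover>
"""

def find_oab_url(xml_text):
    """Return the OAB URL advertised for the EXCH protocol, if any."""
    root = ET.fromstring(xml_text)
    for protocol in root.iter("Protocol"):
        if protocol.findtext("Type") == "EXCH":
            return protocol.findtext("OABUrl")
    return None

url = find_oab_url(AUTODISCOVER_RESPONSE)
print(url)  # the client would then fetch the OAB manifest and data files from here
```

From the returned URL, the real client downloads the OAB manifest and data files over HTTP or HTTPS, exactly as described above.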
The problem with the above design is that every OAB has one mailbox server hard-coded as the server responsible for performing OAB generation, and this presents a single point of failure. The whole point of Exchange Database Availability Groups is to allow mailbox servers to fail and have their databases fail over to other mailbox servers which are members of the same Database Availability Group. Yet if the server responsible for generating the OAB were to fail, the generation process would not fail over to another server, as the OAB is hard-coded to use that specific mailbox server as its generation server. This means that until an administrator brings back the failed mailbox server, or moves the OAB generation process for that OAB to another mailbox server, the OAB in question will never be updated.
To fix this during the development of Exchange 2013, Microsoft needed a method of allowing any mailbox server to fail without disrupting the OAB generation process; after all, this was the whole idea behind Database Availability Groups: the ability to allow mailbox servers to fail. Instead of spending development time building a failover technology around OAB generation, Microsoft decided to incorporate the OAB generation process into Database Availability Groups. Rather than having one mailbox server generate the OAB and share it out via SMB, the Exchange 2013 server hosting the active mailbox database containing the organization mailbox is now the server responsible for generating the OAB. In fact, in Exchange 2013 the OAB is stored in an organization mailbox, so in the event that a mailbox server fails or a database failover occurs, the OAB moves along with it. This architecture change has removed the OAB generation single point of failure which caused problems for organisations in previous releases of the product.
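The mechanism above can be illustrated with a toy model (this is not an Exchange API; server and database names are invented): the generating server is simply whichever DAG member currently holds the active copy of the database containing the organization mailbox, so a failover moves OAB generation automatically.

```python
# Toy model of why a database failover moves OAB generation with it:
# the generating server is whichever DAG member has the active copy
# of the database holding the organization mailbox. Names are invented.
DAG = {
    "DB01": {"active": "MBX1", "passives": ["MBX2", "MBX3"]},
}
ORG_MAILBOX_DATABASE = "DB01"

def oab_generation_server():
    """The server generating the OAB is the one with the active copy."""
    return DAG[ORG_MAILBOX_DATABASE]["active"]

def fail_over(database):
    """Promote the first passive copy; OAB generation follows automatically."""
    copy = DAG[database]
    copy["passives"].append(copy["active"])
    copy["active"] = copy["passives"].pop(0)

print(oab_generation_server())  # MBX1
fail_over("DB01")
print(oab_generation_server())  # MBX2: generation moved with the database
```

No administrator intervention is needed: the failover of the database is the failover of the OAB generation task.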
Whilst Microsoft removed the single point of failure from the OAB generation process, they introduced a problem into the distribution process. In previous releases a service running on CAS servers, known as the Exchange File Distribution Service, downloaded a copy of the OABs from the various mailbox servers performing the OAB generation task and placed them in a web folder available for clients to download. This allowed companies running multiple OABs to apply NTFS permissions to the OAB folders to restrict who was allowed to download each OAB. This is especially useful in Exchange multi-tenant environments, ensuring each tenant can download only the address book applicable to its organisation.
In Exchange 2013 the Exchange File Distribution Service has been removed from the Client Access Server, and the Exchange 2013 CAS now proxies any OAB download request to the Exchange 2013 mailbox server holding the active organisation mailbox containing the requested OAB. The CAS finds out which mailbox server this is by sending a query to Active Manager. As the Exchange 2013 CAS no longer stores each OAB in a folder under the IIS OAB directory, companies can no longer set NTFS permissions on those folders to restrict who may download each respective OAB. It is also important to note that there is no means provided for organisations to lock down, through access control lists, who can download the OAB held inside each organisation mailbox. This introduces privacy issues for companies offering hosted Exchange services. Someone who knows what they are doing and has a mailbox within the Exchange environment could download OABs belonging to other organisations and, as a result, gather a full list of employee contacts for data-mining purposes. Microsoft’s response to this threat, documented in the multi-tenant guidance for Exchange 2013, is for hosting companies to “monitor the OAB download traffic”; in other words, there is no real solution to prevent this from happening.
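To make the gap concrete, here is a conceptual model (not an Exchange API; tenant and group names are invented) of the per-OAB access check that NTFS folder permissions effectively provided on 2007/2010 CAS servers, and which has no equivalent against the organisation mailbox in 2013:

```python
# Conceptual model of the per-OAB access control that NTFS folder
# permissions provided on Exchange 2007/2010 CAS servers: each
# tenant's security group is granted read access only to its own OAB
# folder, so a cross-tenant download request is refused. Names are
# invented for illustration.
OAB_FOLDER_ACLS = {
    "TenantA-OAB": {"TenantA-Users"},
    "TenantB-OAB": {"TenantB-Users"},
}

def may_download(oab_name, user_groups):
    """True if any of the requester's groups has read access to the OAB folder."""
    allowed = OAB_FOLDER_ACLS.get(oab_name, set())
    return bool(allowed & set(user_groups))

print(may_download("TenantA-OAB", ["TenantA-Users"]))  # True
print(may_download("TenantA-OAB", ["TenantB-Users"]))  # False: cross-tenant request refused
```

It is precisely this kind of check, trivial to express against a folder, that cannot be applied to OAB downloads in the Exchange 2013 proxy model.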
For more information about the Exchange 2013 OAB distribution process I strongly recommend the following article published by the Exchange Product Team.
Clint Boessen is a Microsoft Exchange MVP located in Perth, Western Australia. Boessen has over 10 years of experience designing, implementing and maintaining Microsoft Exchange Server for a wide range of customers, including small- to medium-sized businesses, government, and enterprise and carrier-grade environments. Boessen works for Avantgarde Technologies Pty Ltd, an IT consulting company specializing in Microsoft technologies. He also maintains a personal blog which can be found at clintboessen.blogspot.com.
by Orlando Scott-Cowley
You can’t have failed to notice the brouhaha surrounding LinkedIn’s Intro app. ‘Security consultants’ from far and wide are ranting about the app’s potential to be a huge security hole or a target for hackers, but very few (none that I noticed) have spotted the silver lining here: there is huge untapped potential built into the way we communicate, and in the third-party sources of data we could integrate into those communications.
Your real social network, your most accurate social network, is your email contact list
Firstly, I’m not downplaying the security risk of routing your email through a third party whose job it is to collect as much information about you as possible, not to mention the serious security breaches LinkedIn suffered recently. But it is important to point out that for corporate users, LinkedIn’s app doesn’t (yet) support Microsoft Exchange; the only supported email platforms are Gmail, Google Apps, Yahoo! Mail, AOL Mail and iCloud, so the imminent risk is to personal information rather than corporate data.
Secondly, there is a silver lining to all of this, and if LinkedIn had thought about it, they could have exploited the idea in a much more business-friendly way. Business users are, after all, their entire user base. As Google and Yahoo are finding out through the US legal process, there are some obvious sensitivities to ‘scanning’ customers’ email for content, even if the scanning helps the provider serve up ads. I’m sure LinkedIn thought this through, but the poorly thought-out implementation of Intro does make me wonder.
Back to the silver lining. Think for a second about two groups of users, if you will.
The first is a set of ‘contacts’ you’ve collected over quite some time; a group of people you barely know and communicate with rarely. These will be colleagues, both ex and current, people you ‘meet’ at meetings and trade shows, recruiters, industry friends and so on: in short, a network of people you mutually collect and connect to in the hope they’ll help you find your next employer. The second is the smaller, living group of people you actually email day in, day out.
Of course, the first group is your LinkedIn contacts, and the second is your email contacts. LinkedIn is very much static, and I see Intro as LinkedIn’s way of helping you connect more, or to be blunt, to gain more static contacts for its business model.
On the other hand, your real social network, your most accurate one, is your email contact list. I don’t mean your address book; I mean the real-time, dynamic set of people and email addresses you are communicating with right now: your sent items today, yesterday and the day before, the people you collaborate with the most. This is where the real value is, and where email service providers could really show you who you connect with, when, why and for how long. The dynamic nature of this group also means it’s up to date as of right now and, importantly, if you chose to, you could extend your view of that group down several degrees of connection. Think about who you’ve emailed today, who those people are emailing, and who their contacts are emailing: that’s a real business social network. And all of that without getting a single email from a recruiter.
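The “several degrees of connection” idea above is just a breadth-first walk over a graph built from sent mail. A minimal sketch, with invented sample data standing in for real sent-items records:

```python
# Sketch of the "real" social network described above: build a graph
# from recent sent-mail records and walk it out to a few degrees of
# connection. The sample data is invented for illustration.
from collections import deque

# sender -> recipients, as might be derived from recent sent items
SENT_MAIL = {
    "you@example.com": ["alice@example.com", "bob@example.com"],
    "alice@example.com": ["carol@example.com"],
    "carol@example.com": ["dave@example.com"],
}

def network(start, max_degree):
    """Breadth-first walk: everyone reachable within max_degree hops."""
    seen = {start: 0}
    queue = deque([start])
    while queue:
        person = queue.popleft()
        if seen[person] == max_degree:
            continue  # don't expand beyond the requested degree
        for contact in SENT_MAIL.get(person, []):
            if contact not in seen:
                seen[contact] = seen[person] + 1
                queue.append(contact)
    seen.pop(start)
    return seen  # contact -> degree of connection

print(network("you@example.com", 2))
```

Because the graph is rebuilt from recent sent items, it stays current in a way a static contact list never can, which is exactly the point being made above.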
If LinkedIn Intro had helped us identify those live connections without intruding on the way we connect, that would have been an app genuinely worth having, and in an enterprise or business context it would be infinitely valuable.
by Engin Yilmaz
Instant Messaging (IM), and Microsoft’s Lync in particular, is joining email as a critical communications and file-sharing tool in the modern organization. That means important conversations and data are being shared over Lync that need to be archived for future e-discovery.
Mimecast Archiving for Microsoft Lync IM: Secure, cloud-based interactive archive for all Lync Instant Messages including transferred files, shared presentations and whiteboards during group conversations or conferences
Lync in particular is growing in popularity. According to Microsoft, 90 of the Fortune 100 companies currently use Microsoft Lync IM as part of their employee communications. Analyst firm Ovum said, in a blog post from March this year, that 45 per cent of the enterprises it had met were extending their deployment of Lync.
For regulated businesses, failing to archive IM traffic could be a breach of legal and compliance obligations: IM counts as electronic information, so organizations have a legal responsibility to archive it. It’s not just regulation that drives IM archiving, though; we’re seeing dramatic growth of archiving as a best practice as more legal cases cite email and IM conversations as critical evidence, emphasizing the necessity of having the right tools in place for archiving and eDiscovery.
Most businesses that we talk to are aware that the lack of a strategy for archiving IM traffic is preventing them from complying fully with their obligations, while many organizations believe integrating IM archiving with their existing email and file archive is just too complicated and expensive. There is also anecdotal evidence to suggest that quite a few organizations believe that by simply running Lync they are automatically archiving their messages, which is not the case. So for some, putting off the pain of fixing this for as long as possible is fast becoming a problem, while for others, compliance requirements complicate their archiving strategy further.
Fortunately, your Lync archiving project doesn’t need to be a complex one.
I am happy to say that Mimecast has a solution. Today we released a cost-effective yet easy-to-implement-and-manage Lync archiving solution: Mimecast Archiving for Microsoft Lync IM.
It’s a secure, cloud-based interactive archive for all Lync Instant Messages, including files transferred, presentations shared and whiteboards used during group conversations or conferences. It’s fully integrated with Mimecast’s UEM Archive, so it complements our other archiving products for email and files. The Mimecast Administration Console also gives administrators a simple, unified interface for searches against all electronic content and conversations, now including Lync IM.
So now, Mimecast customers know they have an easy way to archive the growing volume of Lync IM conversations in their organizations. You can find out more about it here.
by Orlando Scott-Cowley
The concept of the interactive archive as the future direction of the Enterprise Information Archive is starting to take shape in the minds of CIOs and IT staff, who are trying to meet the new requirements demanded of their data. It’s a vision driven by the idea that archiving is about more than the simple storage or vault models we’ve become used to.
Visual representation of Forrester Research, Cheryl McKinnon’s presentation in The Archive Unleashed webinar
I’ve been speaking more about the concept recently, but it was ‘The Archive Unleashed’ webinar in early October which highlighted how the idea is evolving. Cheryl McKinnon, Principal Analyst at Forrester Research, who joined us for the discussion, called out the key battlegrounds of ownership of unstructured data and migration of legacy systems as the areas which will define how this vision evolves.
My presentations at the ‘Email Archive Migration Masterclass’ run with Essential Computing, one of Mimecast’s leading technology partners, and at IP EXPO have also helped us refine the idea even further. At Mimecast we talk about an Interactive Archive, and while this isn’t a specific product, it’s a vision that sets us and our customers on a journey to a more useful archive. Interactive Archiving is set apart from today’s EIA solutions by some important characteristics, which we can now define:
- Value inherent in archive data: Most EIA solutions simply store, or vault, information; extracting value from the intellectual property held there is nigh-on impossible. A significant part of our long-term vision for your archive and its data is to help you make the most of that data and the business intelligence locked inside it.
- A new type of cloud: While cloud archiving isn’t new, most cloud solutions are just a point replacement for an on-premise archive; outsourcing the data storage is all they can offer. An Interactive Archive is one that brings all the benefits of the cloud, and the innovation of a cloud vendor, to deliver more access, more use cases, more advancement and more value.
- End users are vital: Today, end users have very little aside from a clunky search. Our concept of an interactive archive comes from how end users consume information: how they access their data, and on which platforms, from their key productivity apps like Outlook through to dedicated mobile OS apps that support them on the go.
The key principles above are hard for existing archive software vendors to deliver on; I’d go as far as to say almost impossible. This is primarily a symptom of the way their software has been designed, and of the fact that there is little or no roadmap for innovation into the future.
We think about our archive data within the confines, both physical and metaphorical, of the word “vault”: data is simply retained, never to be seen again, let alone to provide any value. Sadly, these types of archives are where your data goes to die, never having the chance to become interactive for end users, and struggling at best to deliver results.
At IP EXPO I closed my presentation by encouraging the attendees to forget the romance of the steam age, and to forget their ‘old-style’ on-premise archives. Steam-powered computing, if you will. These archives are killing your data and, worse still, stifling your organization’s competitive edge. It’s time to enlighten your users with the tools they need, empower your administrators with the controls and security they demand, and learn to look for the value in your own corporate big data.
‘Don’t just store information, use it’ is how the most progressive IT teams are translating the Big Data opportunity. It’s a perspective we should all get behind.
by Barry Gill
Over the past two days, 55 of our technical customers in the UK joined us to find out more about Exchange 2013. We organized the event with friends of ours, Nathan Winters and Nicholas Blank, who wrote a new book called ‘Microsoft Exchange Server 2013: Design, Deploy, and Deliver an Enterprise Messaging Solution’. They were joined by other expert speakers, Brian Reid and Carl Holt.
The presenters have the full attention of the audience at the Mimecast Exchange event
We put this event on for our customers as part of our commitment to helping them get the best out of their messaging environment. We have also established a private LinkedIn community for sys admin customers, where we will share content, including video, after the event.
Over the two days we dived deep into the technical detail of Exchange. Day 1 covered mailbox and Client Access architectures, load balancing and publishing, and, most importantly, designing Exchange. Day 2 covered hybrid deployments, high availability and site resilience, and finally migration to Exchange 2013.
The event was extremely well received, with many of the attendees using the two days as an opportunity to rapidly skill up in preparation for pending upgrades.
So check back on our blog over the coming week for highlights from the discussions. Or join our LinkedIn community if you are a technical customer.