Six Email Continuity Mistakes – and How to Avoid Them

The 2014 Atlantic hurricane season runs through November and is in full swing, putting your organization – and mission-critical systems like email – at sudden risk from tropical storms, floods and fires.

Ask yourself: When was the last time you tested your business continuity plan? If the answer is a year ago or longer, you risk significant network downtime, data leakage and financial loss. According to Gartner, depending on your industry, network downtime typically costs $5,600 per minute, or more than $300,000 per hour, on average. Don’t wait for disaster to strike. Treat email like the critical system it is, and avoid making these six mistakes that could jeopardize business continuity – and your job.
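For a sense of scale, the per-minute and per-hour figures line up; here is a quick back-of-the-envelope check using only the Gartner average quoted above:

```python
# Back-of-the-envelope check on the Gartner downtime figure quoted above.
cost_per_minute = 5_600                    # USD per minute of downtime
cost_per_hour = cost_per_minute * 60
print(f"${cost_per_hour:,} per hour")      # -> $336,000 per hour
```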

Combat downtime during hurricane season by planning ahead.

  1. Not testing your continuity solution. You’ve devised and implemented what you believe to be a solid continuity solution, but you’ve never given it a production test. Instead, you cross your fingers and hope that when (and if) the time comes, the solution works as planned. There are two major problems with this. First, things get dusty over time. It’s possible the technology no longer works, or worse, it was never properly configured in the first place. You might also not be backing up critical systems regularly; without testing, you’ll only learn during a restore that the data was never fully backed up. Second, you need a clear chain of command should disaster strike. If your network goes down, you need to know who to call, immediately. Testing once simply is not enough. Test your solution at least once a year; depending on your business’s tolerance for downtime, you’ll likely have to test more frequently, such as quarterly or even monthly. Even a basic automated check of your failover route helps, as shown in the sketch after this list.
  2. Forgetting to test fail back. Testing the failover capabilities of your continuity solution is only half the job. Are you prepared for downtime that could last hours, days or even weeks? The ability to move from the primary data center to the secondary one – and then revert back – is critical, and it needs to be tested. You need to know that data can be restored into normal systems after downtime.
  3. Assuming you can easily engage the continuity solution. It’s common to plan for “normal” disasters like power outages and hardware failure. But in the event of something more severe, like a flood or fire, you need to know how difficult it is to trigger a failover. You also need to know where you need to be. For example, can you trigger the failover from your office or from the data center? It’s critical to know where the necessary tools are located and how long it will take you or your team to reach them. Physical access is critical, so distribute tools to multiple data centers as well as your local environment.
  4. Excluding policy enforcement. When an outage occurs, you must still account for regulatory and policy-based requirements that impact email communications. This includes archiving, continuity and security policies. Otherwise, you risk non-compliance.
  5. Trusting agreed RTO and RPO. In reality, you’ve got to balance risk and budget. When an outage happens, will the email downtime agreed upon by the business really stick? In other words, will the CEO really tolerate no access to email for two hours? And will it be acceptable for customers to be out of touch with you for a full day? The cost of meeting aggressive RTO and RPO targets can leave a gap between what the business expects and what the budget actually delivers. If you budget for a two-day email restore, be prepared that during an outage this realistically means two days without email for the entire organization. As part of your testing methodology, you may discover that you need more or less time to back up and restore data. As a result, you may need to implement more resilient technology – such as moving from risky tape backup to more scalable and accessible cloud storage.
  6. Neglecting to include cloud services. Even when you implement cloud technologies to deliver key services such as email, you still have the responsibility of planning for disruptions. Your cloud vendor will include disaster recovery planning on their end to provide reliable services, but mishaps – and disasters – still happen. Mitigate this risk by stacking multi-vendor solutions wherever possible to ensure redundancy, especially for services like high-availability gateways in front of cloud-based email services, or cloud backups of key data.
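As referenced in mistake 1, even a small scheduled script can catch a failover route that has quietly stopped working. The sketch below is a minimal example, assuming hypothetical continuity MX hostnames (replace them with your own); it only verifies that the hosts resolve and answer SMTP, which is no substitute for a full failover and fail-back test.

```python
# Minimal scheduled check of a continuity/failover mail route.
# The hostnames below are hypothetical placeholders - substitute your own.
import smtplib
import socket
import sys

CONTINUITY_MX_HOSTS = [
    "mx1.continuity.example.com",
    "mx2.continuity.example.com",
]

def check_mx(host: str, port: int = 25, timeout: int = 10) -> bool:
    """Return True if the host still resolves and answers SMTP commands."""
    try:
        socket.getaddrinfo(host, port)                 # DNS still resolves?
        with smtplib.SMTP(host, port, timeout=timeout) as smtp:
            code, _ = smtp.noop()                      # 250 means the service is alive
            return code == 250
    except (OSError, smtplib.SMTPException) as exc:
        print(f"FAIL {host}: {exc}")
        return False

if __name__ == "__main__":
    results = {host: check_mx(host) for host in CONTINUITY_MX_HOSTS}
    for host, ok in results.items():
        print(f"{'OK  ' if ok else 'FAIL'} {host}")
    # A non-zero exit code lets cron or a monitoring tool raise an alert.
    sys.exit(0 if all(results.values()) else 1)
```

Run from a scheduler such as cron, the non-zero exit code can feed whatever monitoring or alerting you already have in place.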

With the proper testing and upfront business continuity preparation, you can significantly reduce – or even prevent – email downtime, data leakage and financial loss after disaster strikes.

Big Data: Focus and Practicality Now Vital

It’s been years in the making and has had its fair share of media hype, but according to Gartner’s August ‘Hype Cycle Special Report for 2014’, the concept of Big Data has now entered its aptly named ‘Trough of Disillusionment’.

And it’s not Gartner alone. Talk to industry stalwarts and a clear message comes back: the honeymoon is over. No longer is it just a positive buzzword in meeting rooms. It’s becoming tangible: real people with real salaries and real job titles are now associated with the discipline of managing and making the most of a company’s big (or small) data, both locally and in the cloud.

We’ve come to realize there are a number of opportunities for big data and its management, as outlined in IBM’s August report, ‘The New Hero of Big Data and Analytics’. In it, a new C-suite role is outlined, along with five areas where a Chief Data Officer (CDO) can optimize and innovate:

  1. Leverage: finding ways to use existing data.
  2. Enrichment: joining existing data with previously inaccessible (fragmented) data, whether internal or external.
  3. Monetization: using data to find new revenue streams.
  4. Protection: ensuring data privacy and security, usually in collaboration with the Chief Information Security Officer.
  5. Upkeep: managing the health of the data under governance.

It’s a great list of general outcomes for those who manage data to plan around over the coming years, but what might be even more useful is a framework to help develop those plans now.

Obviously this framework will evolve, and to some extent there will be a degree of trial and error as organizations try to wrangle increasingly large data sets. But I thought it would be useful to make some suggestions against these outcomes, so I’ve come up with some key questions to help gather information for the CDO’s strategic planning. Answering yes to most, if not all, of these questions is a good indication that a CDO would have a beneficial business impact in your organization.

  1. People: as mentioned in IBM’s report – is the CDO’s office a guiding, enforcing authority? Is the office fully aligned to the business and scalable? Are the skills available appropriate? Is the business giving the CDO authority or permission to operate?
  2. Compliance: not just with regional and industry regulation but with the company culture.
  3. Intelligence: how can the right information reach the right people in a digestible form that catches their attention? Does the information remain useful throughout its lifecycle?
  4. CIA: Confidentiality, Integrity and Availability. These cornerstones of any information security policy are no less important here. Can your CDO guarantee data CIA, and do they have board-level authority to enforce it?
  5. Technology: which technology providers can help support these outcomes today, and well into the future? Does the chosen technology scale in line with the parabolic growth of data, or is its scaling linear, or worse, unpredictable?

It’s by no means a definitive list, but we hope it helps stimulate the conversation around this emerging discipline of curating data to a commercial end. I look forward to sharing ideas with our customers and partners on this over the next few months. And as always, I’d appreciate any comments under this post.

Welcome to Number 10. Mimecast Presents to UK Prime Minister and Guests

Thursday mornings don’t get more exciting than this. Yesterday I was asked to represent Mimecast at a briefing hosted at Number 10 Downing Street by the UK Prime Minister, David Cameron.

We were one of only ten UK technology companies invited to speak to an audience of leaders of some of the world’s biggest companies and other members of the Government – the event was called ‘Pitch 10’. The goal was to showcase the strength and talent of the UK tech scene. It was great for Mimecast to be recognized again for our work in this way and to join other inspiring companies carving their own paths as innovators and businesses.

My brief was pretty simple. Come and tell us about the company and what you have achieved.

Firstly, I’d say that this event, and others like it, show that the UK tech scene is something to be admired. It’s a vibrant and diverse community of innovators and business people right across the country and from around the world. London’s Tech City gets a great deal of the press and plaudits, of course, but it was good to see firms from other parts of the UK represented.

As those who follow us closely will know, we’re a cloud email, security and archiving business. So job number one was to explain our view about the criticality and primacy of email in business.

When you take a moment to think about it, you quickly realize that we all rely on email. Email is the communications and data backbone of all organizations, large and small, private or public sector. It underpins our communication, collaboration and decision making. It carries our ideas, insights and knowledge. It stores and exchanges contracts, orders and business commitments.

Because of this, managing, storing and protecting email (and the valuable data it contains) is a critical consideration for IT teams. This is where we come in. We help customers move to the cloud and solve three critical challenges beyond the mailbox.

We help them:

- Protect their organization by improving email and data security against the growing volume and sophistication of security threats they face every day.

- Ensure, with our continuity services, that their business carries on when the primary email service is out of action.

- Archive the rapidly growing volumes of email communication and associated data safely in the cloud, and off their own on-premises infrastructure.

Traditionally, organizations have put several independent systems on their IT infrastructure to address these email needs, adding considerable cost and complexity. Mimecast’s secure cloud platform enables organizations to protect their corporate email and data, move these security, continuity and archiving services safely off their own IT infrastructure to the cloud, and decommission the additional systems, freeing budget and resources for other priorities.

I’m pleased to say that the reception to our story was very warm and supportive. We’re proud of what we’ve achieved for our customers. We also see a great deal more opportunity and chance for innovation still to be grasped.

So as probably the world’s most famous front door shut behind me, it was straight back to our offices in the City of London and back down to the day job.

Graymail – Mail That You Want, but Just Not in Your Inbox Right Now

The mail you want, but just not right now. That seems like an odd way to talk about email: either you want it or you don’t. For years we’ve been talking about the unwanted types of email, like spam, which have grown to be a pest but have largely been dealt with by effective anti-spam services. Now, though, there’s a less distinct line between good and bad as far as our users are concerned. The email that sits in this middle ground has become known as graymail.

Mimecast’s new Graymail Control automatically categorizes graymail and moves it to a separate folder – allowing end users to review the messages at their leisure and keeping the inbox optimized.

More specifically, graymail is email like newsletters, notifications and marketing email – the type you are bombarded with when you buy something online or use your email address to sign up for something. Normally you are opted in to these marketing emails unless you manage to spot the often well-hidden opt-out tick box. These emails are initially interesting, but grow tiresome quickly.

You’re unlikely to want them all in your inbox right now; you’d rather they went somewhere else that makes them easier to read later. Many consumer-grade email providers offer a way of categorizing graymail, such as Gmail’s Primary inbox and Promotions tabs.

Graymail isn’t new. The idea was first suggested by Microsoft researchers in 2007, at the now defunct CEAS conference. Graymail, or Gray Mail as it was called then, was defined as messages that could be considered either spam or good. It’s fair to say many end users consider newsletters that they opted-in to, mostly unknowingly, as spam even though they could easily unsubscribe from the sender’s distribution lists.

Graymail is also described by the term “bacn” (as in bacon). The term is thought to have been coined at PodCamp Pittsburgh 2, as a way to differentiate between spam, ham and bacn in your inbox.

The unwillingness of end users to unsubscribe, or to accept that the problem is somewhat self-inflicted, has led many enterprise IT teams to look for a solution. As a provider of email security services, Mimecast’s Threat Operations and Spam teams know first-hand how inclined users are to report bacn or graymail as spam. A large percentage of the email submitted to Mimecast for analysis as spam is in fact legitimate marketing email with valid unsubscribe links.

It has become increasingly obvious that end users will continue to be frustrated by this graymail problem. The most straightforward solution is to stem the flow in a way that keeps the enterprise inbox free of bacn so that legitimate business-related emails take priority. Mimecast’s new Graymail Control provides this capability by automatically categorizing graymail and moving it off to a separate folder – allowing your end users to review the messages at their leisure and keeping the inbox optimized.
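To make the idea concrete, the sketch below shows the kind of header-based signal a graymail classifier can use. It is a generic illustration only, not a description of how Mimecast’s Graymail Control works internally; the sample message and folder names are made up.

```python
# Generic illustration of a header-based graymail heuristic.
# Not Mimecast's actual Graymail Control logic; the message and folder names are made up.
from email import message_from_string
from email.message import Message

# Headers that typically mark newsletters, notifications and marketing mail.
GRAYMAIL_HEADERS = ("List-Unsubscribe", "List-Id")
BULK_PRECEDENCE = {"bulk", "list", "junk"}

def looks_like_graymail(msg: Message) -> bool:
    """Return True if the message carries typical bulk/newsletter markers."""
    if any(msg.get(header) for header in GRAYMAIL_HEADERS):
        return True
    precedence = (msg.get("Precedence") or "").strip().lower()
    return precedence in BULK_PRECEDENCE

raw = """\
From: offers@shop.example.com
To: user@example.com
Subject: This week's deals
List-Unsubscribe: <mailto:unsubscribe@shop.example.com>

Big savings inside!
"""

msg = message_from_string(raw)
folder = "Graymail" if looks_like_graymail(msg) else "Inbox"
print(f"Deliver to: {folder}")   # -> Deliver to: Graymail
```

In practice a production service combines many more signals (sender reputation, user feedback, content analysis), but unsubscribe and list headers are the most reliable first-pass indicators of bulk mail.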

If you’d like to find out more technical detail about how to configure Mimecast’s Graymail Control please visit our Knowledge Base article here.