Why managing ‘shadow IT’ is good for data security
By Ben Hatton

Image: BYOD is exploding, and with it the prevalence of shadow IT (courtesy of ReadWrite)

Shadow IT has become prevalent in many organizations, driven by the growth of corporate BYOD policies as well as employees’ evolving data and technology needs. These needs typically revolve around the ability to do one’s job more efficiently, and they are often met by the use of 3rd party applications or personal mobile devices. However, these tools are often used without the knowledge or consent of the company’s IT department. This leaves your IT department “in the dark,” hence the phrase “shadow IT.”

As you can imagine, this presents a daunting security challenge, especially in light of the growing risk of data breaches that all businesses face today. As a business, you need to safeguard your data (and who has access to it) as well as possible, and part of this involves reining in and managing any shadow IT that may be occurring within your organization.

Here are some tips for managing shadow IT within your own company:

  • Establish specific policies on outside applications and devices. It is very important to account for shadow IT in your company’s written policies and procedures. These should include the procedure employees must follow when considering outside software options, what your company’s internal application review process looks like (if applicable), and a listing of any “approved” software your company chooses to accept (more on that below).
  • Monitor your network for new software/devices in use. Reliable monitoring gives you visibility into which devices and applications are on your network, allowing you to detect when employees start using something new (a simple sketch of this follows the list). You can’t manage what you can’t see, so network monitoring is an absolute requirement when trying to manage shadow IT.
  • Evaluate and measure any new applications against your internal compliance requirements. Like any company that takes data protection and security seriously, you probably already have specific standards for the types of software you allow employees to use on your network. If you are in healthcare or financial services, a wide range of strict compliance requirements apply to software usage as well. If you find that many of your employees are using outside applications to improve their efficiency and productivity (like Slack, for instance), it can be beneficial to create a listing of outside applications that are “approved” for employees to use if they desire. Before doing this, conduct a thorough review of the application itself to ensure that it is secure, supported, and compliant with your internal standards.
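
To make the monitoring point a bit more concrete, here is a minimal sketch of how a device discovery export could be checked against an approved inventory to flag potential shadow IT. The file names and CSV columns (mac_address, hostname) are hypothetical examples for illustration, not a real Data Cave tool:

    import csv

    def load_devices(path, key="mac_address"):
        """Load a CSV export into a dict keyed by MAC address (assumed column name)."""
        with open(path, newline="") as f:
            return {row[key].lower(): row for row in csv.DictReader(f)}

    # Hypothetical exports: one from your network scanner, one from your approved inventory.
    discovered = load_devices("network_scan.csv")
    approved = load_devices("approved_inventory.csv")

    # Anything on the network but missing from the inventory is potential shadow IT.
    unknown = {mac: row for mac, row in discovered.items() if mac not in approved}

    for mac, row in sorted(unknown.items()):
        print(f"Unapproved device: {mac} ({row.get('hostname', 'unknown host')})")

In practice the discovery data would come from whatever network monitoring platform you already run; the point is simply that a current, approved inventory gives you something to compare against.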

Shadow IT is poised to become more and more prevalent, and in light of continually evolving security demands, it’s vital that you stay on top of any shadow IT in your own organization. With the right planning and policies, it can be both monitored and managed.

If your company has dealt with, or is currently dealing with, managing its own shadow IT, we’d be curious to learn how you are working through it. Leave us a comment below!


Evaluating data center software: 3 things to consider
By Ben Hatton

Recently I read a series of articles about the ‘failed’ business model of open source software companies, and what these companies should do to adapt and better compete against larger companies that provide proprietary software (here’s one of them). I won’t dig too deeply into those specifics, but I thought this would be a good opportunity to look at some ways you can evaluate software options yourself, if you are like any of our colocation clients who have multiple servers and applications to manage. Software is a requirement for infrastructure management, so the ability to evaluate and make decisions about the software you will use is an important one to have.

3 factors you should consider

It’s no secret that we utilize several pieces of open source software here at Data Cave*. We use a combination of open source and proprietary software to run a very wide range of equipment, as well as to monitor it. When it comes to deciding to move forward with any piece of software, there are 3 primary factors that go into that decision (a rough scoring sketch follows the list):

  • Scalability: Does the software allow for new devices to be added over time? And if so, does its performance level continue to keep up? If you are a growth-focused organization like many of our clients, any software you utilize should be able to grow with your business, accommodating for higher workloads as your infrastructure needs become more demanding. This makes scalability a very big deal for any piece of software, especially software that manages some aspect of your server infrastructure.
  • Flexibility: What does the software offer in terms of overall functionality, in the short and long term? Is it reliable enough to support your business today, but full-featured enough to support needs your IT team may have in the future? This goes somewhat hand in hand with the point on scalability mentioned above; as your IT infrastructure evolves and grows, your specific needs from a software standpoint will likely evolve as well. Having a software tool in place that is flexible enough to support your current as well as future needs should play a major part in your decision-making process.
  • Support: Software support should include documentation on the product itself, as well as customer support in the form of a contract or warranty. Any reliable piece of software should be well documented and supported by its developers, making long-term support a must-have for any new software you may be evaluating.
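
One lightweight way to weigh these three factors side by side is a simple scorecard. The sketch below is only an illustration; the weights and the scores for “Tool A” and “Tool B” are made-up numbers, not Data Cave’s actual evaluation criteria:

    # Hypothetical weights reflecting how much each factor matters to your team.
    WEIGHTS = {"scalability": 0.4, "flexibility": 0.35, "support": 0.25}

    # Example candidates scored 1-5 on each factor (made-up numbers for illustration).
    candidates = {
        "Tool A (open source)": {"scalability": 4, "flexibility": 5, "support": 3},
        "Tool B (proprietary)": {"scalability": 3, "flexibility": 3, "support": 5},
    }

    def weighted_score(scores):
        """Combine per-factor scores into a single weighted total."""
        return sum(WEIGHTS[factor] * value for factor, value in scores.items())

    # Rank the candidates from highest to lowest weighted score.
    for name, scores in sorted(candidates.items(), key=lambda kv: weighted_score(kv[1]), reverse=True):
        print(f"{name}: {weighted_score(scores):.2f}")

A scorecard like this won’t make the decision for you, but it does force you to be explicit about how much each factor actually matters to your business.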

While the article I mentioned above focuses on the shifting business models of some open source software companies, at the end of the day the only thing that should really matter to you is having reliable, flexible, and supported software that can help sustain and grow your IT infrastructure along with your business. This applies whether you prefer open source or proprietary software, and whether you own your own data center or lease colocation space in a 3rd party facility. When it comes to server management, there are many different needs for software, and there are many different options out there to fill those needs.

These are the key things that we look for when we evaluate our own software needs at Data Cave, and I hope that these pointers can help with your own software evaluation needs as well!



Data center insider threats (and how to prevent them)
By Ben Hatton

The threat of data breaches and data theft has been in the news a ton lately, and we’ve written a lot about it as well. The common assumption is that these threats primarily come from outside an organization, and while that is often true, a sizable and growing percentage are what can be considered ‘insider threats’ from a company’s own employees or the contractors they are working with.

Quick definition

An insider threat can most often be defined as an intentional act by an employee to steal or destroy sensitive company data. Whether it is a disgruntled employee who has been terminated, or an employee or contractor wanting to make money off of sensitive data, this type of threat is very real, and a recent study shows just how big of a deal it can be.

Insider attacks are on the rise

A recent study of over 500 security professionals by Vectra and LinkedIn offers some insights into just how prevalent this type of threat is, and a lot of this information is worth digesting if your company relies on a large amount of sensitive data on a daily basis. I’ll start off with a few of what I found to be the most relevant findings from the study (PS: you can view the study in its entirety here).

Here are my 3 biggest takeaways:

  1. 62% of the security pros who participated in the study said they have seen insider attacks increase in frequency over the past year.
  2. Less than 50% of the organizations surveyed have policies or monitoring in place to help protect against insider attacks.
  3. The specific types of data that are the most vulnerable are customer data, intellectual property, and corporate financial data.

Your biggest takeaway: managing who has data center access

These stats, as well as others from the report, clearly indicate that your business data may be susceptible to threats from inside your organization, and that lessening the risk from these threats should be a priority. There are a number of ways this can be done from a policy standpoint, which I’ll look at in a future blog post (stay tuned!), but a big thing we advocate at Data Cave is monitoring and managing who has access to your colocated equipment at any given time.

For many of our colocation clients, this involves receiving reports from Data Cave at regular intervals (once a month, for example) showing which employees have access to their server equipment. This way, they always have accurate information on who should be accessing their data and equipment, and they can make adjustments as people leave the company or new people come on board.

Another big thing we often recommend to our clients is to reach out to Data Cave as soon as possible in the event that an employee leaves the company, so that their data center access can be revoked. A disgruntled former employee who has access to IT equipment and sensitive data can easily become a security risk, and ensuring that their data center access is immediately revoked will help to prevent this from becoming a serious threat.
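
As a rough illustration of that kind of review, the sketch below reconciles a monthly access report against a current employee roster and flags anyone whose access should be revoked. The file names and column names here are assumptions for the example, not an actual Data Cave report format:

    import csv

    def load_names(path, column):
        """Read one column from a CSV into a set of lowercase names."""
        with open(path, newline="") as f:
            return {row[column].strip().lower() for row in csv.DictReader(f)}

    # Hypothetical files: the monthly access report and your current HR roster.
    has_access = load_names("datacenter_access_report.csv", column="employee")
    current_staff = load_names("employee_roster.csv", column="name")

    # Anyone on the access list who is no longer on the roster should have access revoked.
    to_revoke = sorted(has_access - current_staff)

    for name in to_revoke:
        print(f"Revoke data center access for: {name}")

However you run the comparison, the key is doing it on a regular schedule rather than waiting for a departure to remind you.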

Insider threats aren’t inevitable for every company, but they are always a possibility, as this report shows. Taking steps now will help ensure that your data is better protected in the event of such a threat.



The benefits of data center consolidation
By Ben Hatton

I’ve written a lot about how data center colocation can help your business grow, especially if it is growing faster than your current data center can keep up with. However, we don’t often think about the other side of the spectrum, where an organization may be so geographically spread out that it becomes more efficient and cost-effective to consolidate multiple data center locations into one. While only some businesses will ever need to go through it, consolidation virtually always brings major benefits that I want to highlight in this post.

Quick disclaimer

I’ll make the fairly obvious point that consolidating down from multiple data center locations isn’t something every company will need to go through. Consolidations are undertaken by organizations that already have multiple data center locations, usually as a result of having offices spread across a region or the country. If a business has multiple offices throughout the country, for example, and each office has its own server room or data center that backs up data and runs applications, then it would be a prime candidate for a data center consolidation.

Why consolidate?

Organizations like this that choose to reduce their data center footprint see several benefits, including:

  • Cost savings from having fewer data center facilities and less space to maintain.
  • Stronger security as their infrastructure becomes more centralized and less geographically spread out.
  • Easier compliance, with fewer locations that must be audited on a regular basis.

The results speak for themselves

Easily the biggest real-world example of a data center consolidation project can be seen in the US government’s consolidation initiative that began back in 2010. Over the past 5 years, government agencies have reduced their number of data centers by the hundreds, and the process is still ongoing. To date, these agencies have seen an estimated cost savings of $2 billion as a result of this consolidation, with an expected additional savings of over $4 billion over the next 3 years*. They have been able to see significant savings while also leveraging new technologies to operate more efficiently and dynamically than ever before.

These are just a few of the high level reasons why it’s a smart business decision to consolidate and reduce your overall data center footprint when possible. In a future post I’ll look at some ways that you can begin planning for a data center consolidation as well…stay tuned!



Why data center humidity may be going down
By Ben Hatton


Image courtesy of Flickr user mag3737

When it comes to data center environmental monitoring, temperature and humidity have always been the key metrics that provide insight into a data center’s operating environment. Like temperature, humidity is continually monitored to ensure it is kept at a consistent and acceptable level (traditionally in the range of 40-55% relative humidity). At Data Cave we monitor and maintain this humidity level in all of our data suites.
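
As a simple illustration of that kind of monitoring, the sketch below checks a set of humidity readings against the traditional 40-55% band and flags any suite that drifts outside it. The suite names and readings are made up, and this is not our actual monitoring system:

    # Traditional acceptable relative humidity band for data center suites.
    RH_MIN, RH_MAX = 40.0, 55.0

    # Hypothetical latest readings per suite, in percent relative humidity.
    readings = {"Suite A": 47.2, "Suite B": 38.5, "Suite C": 56.1}

    for suite, rh in sorted(readings.items()):
        if rh < RH_MIN:
            print(f"{suite}: {rh}% RH is below the {RH_MIN}% floor (too dry, higher ESD risk)")
        elif rh > RH_MAX:
            print(f"{suite}: {rh}% RH is above the {RH_MAX}% ceiling (too humid, condensation risk)")
        else:
            print(f"{suite}: {rh}% RH is within range")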

With many of the advancements in server technology and in modern data centers, this traditional level has been gradually shifting, just like the recommended temperature has been (check out our Why Data Center Temperatures are going up post to learn more about that). ASHRAE (the American Society of Heating, Refrigerating and Air-Conditioning Engineers), the leading authority on heating and air-conditioning standards across many industries, has adjusted its recommended humidity levels for data centers from time to time over the years, and is poised to announce more changes to its recommendations later this year.

Specifically, they are expected to publish recommendations that the humidity level (expressed as a dew point) in data centers can be set lower than it has been in the past, without substantially increasing the risk of electrostatic discharge (ESD) damage to server equipment. This is based on a recent study that they undertook together with the University of Missouri.

Some background

It’s a well-known fact that when a room is less humid, the drier air makes it easier for static electricity to build up around constantly running server equipment. If this charge goes unchecked, it can lead to a discharge that damages or destroys the equipment.

The reason this study (and the anticipated recommendations coming out of it) is so relevant is that it could have implications for how data centers manage their humidity levels, as well as how frequently they may utilize free cooling methods. If it can be documented and proven that a lower relative humidity can be implemented without noticeably raising the risk of ESD, that could lead to a change throughout many data centers.

Many classes of recommendations

An important thing to remember about the existing and future recommendations on humidity levels is that the recommendations are specific to particular types of equipment. ASHRAE uses several different classes of equipment for its temperature and humidity recommendations, with the overall recommendations varying from class to class. Many of these classes have been added just in recent years, as newer, more sophisticated IT equipment has come on the market that can handle higher temperatures and lower humidity.

Moving in a good direction

What this indicates to me is that any shift towards lower humidity levels in the data center is really the result of advancements in server technology, cooling methods, and data center layouts. If ASHRAE does indeed lower its recommendations for data center humidity levels, I believe it will be a strong indicator of how far the technology has come, as well as where the industry is heading with regard to humidity.

