One word that you often see associated with any data center is its “tier,” or its level of service. Virtually every data center has a tier ranking of 1, 2, 3, or 4, and this ranking serves as a symbol for everything it has to offer: its physical infrastructure, cooling and power systems, redundancy levels, and promised uptime.
We consider Data Cave a Tier 4 data center because of our best practices and maximum uptime levels. But apart from looking good on paper, what does that actually mean to you? I wanted to use this post to answer that question! I will cover what each of the 4 data center tiers encompasses, what types of factors go into each tier ranking, and what it ultimately means at the end of the day.
The data center tier system first came into existence back in 2005, as a way to bring quantifiable standards to the industry that each individual data center can be measured against. These standards were developed and are maintained by the Telecommunications Industry Association (TIA), an entity responsible for a wide range of standards across many segments of the IT industry. In addition, a separate set of 4 tier levels has been developed by the Uptime Institute, a third-party data center research organization. While these sets of standards are maintained separately by the two organizations, they are very similar to one another in terms of the specific criteria that make up each tier.
Each data center tier ranking consists of several criteria and requirements that primarily focus on a data center’s infrastructure, levels of redundancy, and promised level of uptime. Here are some specifics on the factors that go into each of the 4 tiers:
A Tier 1 data center is the simplest of the 4 tiers, offering little (if any) redundancy, and not really aiming to promise a maximum level of uptime:
- Single path for power and cooling to the server equipment, with no redundant components.
- Typically lacks features seen in larger data centers, such as a backup cooling system or generator.
- Expected uptime levels of 99.671% (1,729 minutes of annual downtime)
The next level up, a Tier 2 data center has more measures and infrastructure in place to ensure it is not as susceptible to unplanned downtime as a Tier 1 data center:
- Will typically have a single path for both power and cooling, but will utilize some redundant components.
- These data centers will have some backup elements, such as a backup cooling system and/or a generator.
- Expected uptime levels of 99.741% (1,361 minutes of annual downtime)
In addition to meeting the requirements for both Tier 1 and Tier 2, a Tier 3 data center is required to have a more sophisticated infrastructure that allows for greater redundancy and higher uptime:
- Multiple power and cooling distribution paths to the server equipment. The equipment is served by one distribution path, but in the event that path fails, another takes over as a failover.
- Multiple power sources for all IT equipment.
- Specific procedures in place that allow for maintenance/updates to be done in the data center, without causing downtime.
- Expected uptime levels of 99.982% (95 minutes of annual downtime)
At the top level, a Tier 4 ranking represents a data center that has the infrastructure, capacity, and processes in place to provide a truly maximum level of uptime:
- Fully meets all requirements for Tiers 1, 2, and 3.
- Infrastructure that is fully fault tolerant, meaning it can function as normal, even in the event of one or more equipment failures.
- Redundancy in everything: Multiple cooling units, backup generators, power sources, chillers, etc. If one piece of equipment fails, another can start up and replace its output instantaneously.
- Expected uptime levels of 99.995% (26 minutes of annual downtime)
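The downtime figures above follow directly from the uptime percentages: take the fraction of the year *not* covered by the uptime promise and multiply it by the number of minutes in a (non-leap) year. A quick sketch of the arithmetic in Python:

```python
# Convert a promised uptime percentage into expected annual downtime.
MINUTES_PER_YEAR = 365 * 24 * 60  # 525,600 minutes in a non-leap year

def annual_downtime_minutes(uptime_percent):
    """Expected minutes of downtime per year for a given uptime percentage."""
    return round((100 - uptime_percent) / 100 * MINUTES_PER_YEAR)

tiers = {1: 99.671, 2: 99.741, 3: 99.982, 4: 99.995}
for tier, uptime in tiers.items():
    minutes = annual_downtime_minutes(uptime)
    print(f"Tier {tier}: {uptime}% uptime = about {minutes} minutes of downtime/year")
# Tier 1: 1729, Tier 2: 1361, Tier 3: 95, Tier 4: 26
```

Running this reproduces the four figures quoted above, which is a handy sanity check when comparing data centers’ uptime promises.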
What Tier 4 Represents
As you can imagine, there are a lot of factors that go into the data center tiering system, and following the established best practices for a specific tier is well worth it. Being categorized under a data center tier provides the clearest, most recognizable symbol of that data center’s capabilities. Data Cave is categorized as a Tier 4 data center because we consistently follow all of its established guidelines, as well as maintain maximum uptime levels. It is a simple and clear indicator that fully describes our facility, infrastructure, processes, and how we compare to other data centers.
Now that you know a bit more about the 4 data center Tiers, and what goes into each ranking, would you like to see firsthand what a Tier 4 data center looks like? If so, contact us today to schedule a facility tour!
Data center temperatures throughout the industry are on the rise.
It has never been a secret that data centers use a tremendous amount of energy (an estimated 1.5% of all the world’s energy), and that a huge chunk of that energy goes into their cooling systems. Keeping colocated equipment cool and functioning is serious business, and it will always be a priority for data centers. However, over the past few years there has been a gradual shift in the industry regarding the actual temperature that IT equipment should be cooled down to and maintained at within data centers; on the whole, these temperatures are going up.
In fact, the leading group representing the heating and cooling industry (ASHRAE) has increased its recommended temperature for data centers on more than one occasion over the past few years. Currently, its maximum recommended temperature is 80.6 degrees Fahrenheit (27°C). While this is considerably higher than the temperatures data centers have traditionally maintained, there is solid evidence that it can work.
My goal for this post is to cover the shift towards warmer temperatures in the data center. I’ll look at some of the different drivers for this change, how it is being done now, and how it is already having a serious impact.
The Drivers: Why is it being done?
When it comes to cooling, the mindset of data center operators has traditionally been “the cooler, the better.” Since IT equipment generates a lot of heat at all times, there should always be a high level of cooling in place to keep that equipment as cool as possible. While this has always been the gist of it, this high level of cooling has contributed to the immense power usage of data centers. With many new and existing data centers seeking to reduce their overall carbon footprint, it is completely understandable that they would look for ways to be more efficient with their cooling to meet this goal. Here are the primary motivating factors for shifting to warmer temperatures in the data center:
Environmental Friendliness. When a data center is able to reduce its total energy usage, it can have a tremendous impact on the environment.
Cost Savings. This really goes without saying, but using less energy for cooling also means huge cost savings for the data centers, savings that can potentially be passed on to their customers.
The Methods: How is it being done?
There are several ways that data centers are working to effectively maintain their equipment at higher temperatures in order to see these benefits. Many of the world’s largest tech companies like Google, Microsoft, and Facebook are already employing several methods in some of their new data centers to accomplish this, but here are some ways that data centers can achieve and maintain warmer temperatures:
Location. A data center’s location plays a huge role in both the power and cooling options that are available to it. Many data centers, including Data Cave, are able to take advantage of their location during the winter months and use “free cooling” to either cool the IT equipment directly, or chill the circulating water that is used for cooling (as opposed to running the water through a chiller). This results in less overall energy usage, and is made possible by the physical location of the data center.
Monitoring Technology. The methods and technology used by data centers to monitor their temperature, humidity, air flow, and energy usage in real time have become much more advanced over the past several years, allowing for more flexibility in the temperatures that server rooms are maintained at. More sophisticated monitoring tools allow data center personnel to be alerted more quickly to a piece of equipment that may be overheating, and enable faster response times in the event that one does indeed have an issue.
Server Equipment. More than ever before, servers themselves are being built to tolerate much warmer temperatures than they previously could. In fact, Dell now guarantees many of its servers to function fully at temperatures up to 115 degrees Fahrenheit, and studies conducted by Intel and Microsoft have shown that higher temperatures can be achieved with many modern servers without harming the actual equipment.
The Industry Impact
The shift towards warmer temperatures has been slowly but surely gaining traction within the data center industry, and it is already starting to have an impact. This impact can be seen in many of the new data centers around the world that have been built by Facebook, Google, and many of today’s largest tech companies (check out some examples of data centers built by Facebook and Google). This is definitely a trend that is here to stay, and I think in the long run it will have tremendous benefits for everyone involved: data center operators, customers who depend on the data center and of course, the environment.
How we do it at Data Cave
Maintaining more environmentally friendly temperatures isn’t just something for the larger companies though. There are several things we do at Data Cave as well that are in line with this industry shift. As I mentioned earlier, we take advantage of our facility’s physical location during the winter months, and use “free cooling” from the outside air to provide chilled water, which reduces our overall energy consumption. Also, we have a wide range of real-time monitoring in place that allows us to consistently improve our efficiency and cool our data suites as cost effectively as possible (I’ll cover our monitoring in more detail in a future post!). These practices have helped us to deliver colocation services that are both cost effective and environmentally friendly.
This week we had the honor to be featured on the front cover of this month’s “The Business Connection” magazine. This magazine is published each month by The Republic newspaper in Columbus, Indiana, and it is circulated to a wide range of local businesses throughout central Indiana.
They have put together an excellent article about us and what we do here at Data Cave, and included several great photos of the team and our facility as well! You can check out the magazine in its entirety below. We hope you enjoy it!
Like everyone at Data Cave (as well as most of our clients), I am regularly involved in a wide variety of work tasks each day. While variety is always a good thing, it has the potential to be overwhelming if you’re not prioritizing and effectively managing your workload. There are many tools that we use here at Data Cave that help make specific tasks easier, but when it comes to keeping tabs on the workload itself, there is one tool that I have grown fond of: KanbanFlow. I wanted to take some time in this post to cover what this application does, how I use it for my daily tasks, and how it may benefit you as well!
Kanban? What does that mean?
First off, for those of you who aren’t familiar with the term “Kanban,” here is a very brief description of what it means, courtesy of Wikipedia:
Kanban is a method for managing knowledge work with an emphasis on just-in-time delivery while not overloading the team members.
Generally, Kanban is used as a series of prioritized task lists that help in managing your day to day workflow. You can learn more about the methodology elsewhere, but I won’t go into more detail about that in this post. Now, on to the application itself:
Here is a screenshot of the KanbanFlow application.
At its core, KanbanFlow is a web-based tool that allows you to create and edit multiple task lists with ease. Below are some of its primary features:
- Multiple “Category” columns that allow you to easily categorize any tasks you create.
- You are able to log how much time you have spent working on each task.
- If a particular task is more complex in nature, you can create multiple subtasks for it. These are displayed as a simple list of checkboxes, and each subtask can be checked off as it is completed.
- The app has drag-and-drop functionality, allowing you to easily move tasks from one column to another as needed, or re-order them.
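To make the structure behind these features concrete, here is a minimal sketch in Python of how a board like this can be modeled: columns holding tasks, tasks holding subtasks and a time log. All of the class and method names here are hypothetical illustrations of my own, not KanbanFlow’s actual API.

```python
# An illustrative model of a Kanban-style board: columns contain tasks,
# and each task can carry subtasks and logged time.
# Names are hypothetical, not KanbanFlow's real API.
from dataclasses import dataclass, field

@dataclass
class Task:
    name: str
    subtasks: dict = field(default_factory=dict)  # subtask name -> done?
    minutes_logged: int = 0

    def log_time(self, minutes):
        # Manual time entry or the result of a built-in timer.
        self.minutes_logged += minutes

    def complete_subtask(self, subtask_name):
        self.subtasks[subtask_name] = True

@dataclass
class Board:
    columns: dict = field(default_factory=dict)  # column name -> list of Tasks

    def add_task(self, column, task):
        self.columns.setdefault(column, []).append(task)

    def move_task(self, task, src, dst):
        # The drag-and-drop equivalent: remove from one column, append to another.
        self.columns[src].remove(task)
        self.columns.setdefault(dst, []).append(task)

board = Board()
task = Task("Write blog post", subtasks={"Draft": False, "Edit": False})
board.add_task("To do", task)
task.log_time(45)
task.complete_subtask("Draft")
board.move_task(task, "To do", "In progress")
```

The point of the sketch is simply that everything the app offers, categorized columns, subtask checklists, time logs, and drag-and-drop, maps onto a very small data model.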
When it comes to actually tracking how much time you spend on a task, the application gives you a couple of options. You can manually enter a time value for each task (and edit it at any time), or you can use the built-in timer feature that does the tracking for you.
The application’s built-in timer feature.
There is a “Timer” button in the lower left corner of the application that allows you to do this; you click that button, then select the task that you are starting to work on, and click the “Start work” button. From there, the timer will start, and your total time will be saved for that task once you stop the timer. You can also view logs that detail each time entry you have made for all of your individual tasks.
This isn’t a feature I have used much myself, but I can definitely see it being useful for someone who has to be as accurate as possible with how much time they spend on a specific task.
Who should check this program out?
I have been using the KanbanFlow application for the past few months now, and it has really helped keep me on track with all of the projects that I am involved in. I use it to keep track of any new projects I create for myself (even if I don’t start on them right away), prioritize tasks I need to work on day to day, and keep track of roughly how much time it takes me to work on certain tasks. This tool allows me to stay on top of my workload, while helping ensure that nothing falls through the cracks.
I would recommend this application to anyone whose workload regularly consists of a wide variety of projects, or to those who may have difficulty managing their time effectively on their own. I think this program has a lot to offer, and I hope it helps you as much as it’s helped me!
When you are a business owner, you as well as your employees and customers are constantly creating and using data. Data is vital for all of your business functions, and it is what drives all business decisions. This makes it something absolutely worth protecting and making sure you back up regularly.
But like most businesses today, you likely have a wide range of “types” of data that your business creates and uses, and multiple locations where that data is located. With these challenges, how can you determine and prioritize which parts of your business data you should be backing up?
We have created a new whitepaper, Mastering Your Business Data, that will help you! This whitepaper establishes a process that will help you determine the following things:
- Exactly where all of your data is coming from, taking into account all of your business processes.
- Where your different types of data are actually located.
- Which types of your data are the most mission-critical for your business to operate.
Click the link below to download the whitepaper. We hope you find it useful in analyzing all of the data that your business creates and uses!
If you have any feedback at all, or would like to begin the conversation about how we can help with your data backup, contact us today!