Last month we attended the Indianapolis Business Journal’s annual Technology Breakfast in downtown Indianapolis. In addition to being a great opportunity to socialize and network with others in the local tech industry, the event featured an informative and insightful panel discussion that covered a wide range of topics relevant to our industry. These primarily focused on:
- The current state of the technology field in Central Indiana, and where it is projected to go over the next several years.
- Indiana’s core strengths as a state when it comes to fostering growth for tech companies and attracting talent.
I wanted to take some time in this post to cover a few highlights from the panel discussion, and some of the key takeaways for us at Data Cave.
IT in Indiana is growing, and in a serious way
There are many exciting statistics that speak to the fact that the technology sector in Indiana has been steadily growing over the past several years. Here are just a few of them:
- In the Indianapolis area, the number of computer-related jobs grew by 7.3% from 2009 to 2012. This is more than 3 times the national average of 2% for total job growth.
- In 2013, Forbes magazine ranked Indianapolis as one of the top 10 metro areas for technology job growth (you can see more information on that here).
- This job growth isn’t exclusive to the technology industry itself. Of the companies seeing the highest IT job growth, the majority aren’t specifically IT companies (examples include Indiana University, Best Buy, and Cummins).
These were just a few of the findings presented from a recent study conducted by Techpoint, an organization representing the Indiana technology industry. Their full Technology Workforce Report can be viewed here.
Indiana has a lot to offer as a state for technology
The IBJ panel discussion (image courtesy of TinderBox). Seated from left to right: Christopher Clapp (Bluelock), Greg Deason (Purdue Foundry), Marie Kerbeshian, Ph.D. (IU), Michael Langellier (Techpoint), Dustin Sapp (TinderBox), John Wechsler (Launch Fishers).
The above stats are certainly a good indicator that Indiana has been a great state for growth in the technology sector, but why Indiana? The panel devoted a significant portion of the discussion to answering this question, offering several reasons why Indiana is a great state for fostering IT growth:
- Indiana has a long and rich history of forward-thinking entrepreneurs, and the community and government have been supportive of that entrepreneurial spirit, especially in recent years.
- There is plenty of room for tech companies in Indiana to get started and grow, thanks in part to the state’s venture capital tax credit. This has given Indiana an extra competitive edge that some other states lack when it comes to attracting talent and investment.
- Many of Indiana’s top colleges, including Purdue and Indiana University, have close ties with alumni all over the country. Both schools had representatives on the discussion panel, and they have been working to leverage those alumni relationships to create new investment and mentoring opportunities in Indiana tech. This has also helped a great deal in attracting both interest and talent to Indiana.
All in all, this was a great panel discussion to attend. It was rewarding and encouraging to hear from other Indiana business leaders about the tech industry in our area and the positive direction it is heading. Based on what we have seen at Data Cave, we couldn’t agree more: the support of our great clients, friends, and local community has made Central Indiana a rewarding place to live and do business, and we are excited to be a part of the state’s growing technology industry!
Cloud Computing (Image courtesy of Forbes)
Our personal lives and our workplaces continue to be transformed by the Cloud as it advances into virtually every form of technology we use every day. Within our businesses, the Cloud is shifting from an option to a necessity for all of us.
But with that being the case, how do businesses actually view the Cloud? Do they see it as just a tool for achieving a very specific goal, or as a buildable platform that can be developed, honed, and molded to their business to help it grow? These contrasting attitudes toward the role the Cloud can play were the subject of a webinar I recently attended. Titled “Innovate or Die: Changing your Mindset about Cloud”, the webinar covered several great insights and takeaways that I want to share with you.
Survey findings show the benefits of using the Cloud
Led by a few Cloud specialists at Rackspace, the webinar covered a wide range of findings from a survey of hundreds of Cloud users around the country about how they use the Cloud and how it benefits them. There were many results, but these statistics stuck out to me above all the others:
- 88% of those surveyed said that using Cloud services had saved their companies money.
- 56% of the people surveyed said that the Cloud had actually helped them to boost profits for their company.
While one of the typical goals of using Cloud services is to save money (which is clearly happening), the fact that there are many companies out there that are actually seeing profits and growth as a result of using the Cloud is very exciting. I think this speaks to how beneficial it can be for companies to broaden their mindsets about how they view and use the Cloud, so they can maximize the impact it has on their business.
Narrow vs. Broad Mindsets
The webinar presenters articulated the differing mindsets toward the Cloud very well, and for the most part they can be generalized into two contrasting viewpoints:
- Narrow: Those who view the Cloud primarily as a means to cut costs, be it on IT equipment, resources, etc. While using Cloud services can have a tremendous impact on the bottom line for any business, there are many business owners who tend to put too much focus on this singular benefit, ignoring the greater potential that the Cloud has to offer.
- Broad: Those who see the Cloud instead as a series of “building blocks” that can be customized, built, and tailored to their unique business. If you look at the Cloud through this lens, you’ll see it as a tool that can be shaped and molded to your business, rather than the other way around. To borrow a quote from the webinar presenters, try to visualize the Cloud as Lego pieces you can build with!
Business owners and IT managers who adopt the latter mindset are the ones who have seen not only cost savings after moving to the Cloud, but actual business growth and increased profits as well. They see the Cloud not just as a simple solution for a very specific need, but as an actual part of their business as a whole.
Cloud services can have a strong and significant impact on your business, whether it is by cutting costs on your current IT expenses, or by contributing to your business’ growth; the level of impact it can have is ultimately up to you. I hope this post helped to distinguish between the contrasting mindsets that businesses typically have towards the Cloud, as well as the key benefits that each of these mindsets can bring!
We are excited to announce that Data Cave now fully supports the new IPv6 standard for Internet traffic routing. The need for IPv6 has been growing in importance over the past several years, as the world edges closer to running out of the old IPv4-based IP addresses for devices and networking equipment. IPv6 will inevitably become the primary protocol for all IP addresses, so having the infrastructure in place to support this standard is definitely a big deal for us as well as our customers.
Some history on the old IPv4 format
IP addresses as we have traditionally known them follow the 32-bit IPv4 format: four numbers from 0 to 255, separated by periods (for example, 192.168.1.1).
These IP addresses have served as the identifiers for all network-connected devices for the past several decades. However, due to the limited number of possible IP addresses that use this format (approximately 4.3 billion), a shift towards the more expandable IPv6 format has been occurring over the past several years. While 4.3 billion may seem like a big number (it is!), with the ever-increasing volume of servers, mobile devices, and other equipment requiring network connectivity coming into play, the supply of IPv4 addresses will soon be fully expended.
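To make the 32-bit structure concrete, here is a minimal Python sketch using the standard library’s `ipaddress` module (the address shown is a generic private-range example, not a real allocation):

```python
import ipaddress

# A dotted-quad IPv4 address: four numbers from 0-255 (example only)
addr = ipaddress.IPv4Address("192.168.1.10")

# Under the hood, it is a single 32-bit integer
print(int(addr))           # 3232235786
print(addr.max_prefixlen)  # 32 (bits)

# The entire IPv4 space: 2^32 addresses, roughly 4.3 billion
print(2 ** 32)             # 4294967296
```

That 2^32 ceiling is exactly why the supply of IPv4 addresses is running out.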
You can see just how real the IPv4 shortage is by looking at the ARIN “IPv4 Countdown Plan”, located here. ARIN, the American Registry for Internet Numbers, is one of 5 organizations around the world responsible for distributing IPv4 and IPv6 address space. For the past several years it has been monitoring the remaining inventory of IPv4 addresses and rationing them through a series of 4 structured “phases.” We are currently in Phase 3. Once this phase ends and ARIN proceeds to Phase 4, organizations wanting to obtain the remaining IPv4 addresses will face even stricter requirements than before, and the end of Phase 4 will mark the full depletion of IPv4 addresses. In other words, the depletion of IPv4 addresses will happen very soon.
This is where the new IPv6 standard comes in.
IPv6: An exponential increase
While IPv4 addresses are 32 bits in size, addresses under the IPv6 format are 128 bits. An IPv6 address consists of eight groups of four hexadecimal digits separated by colons, for example: 2001:0db8:85a3:0000:0000:8a2e:0370:7334.
As you can see, IPv6 addresses are much longer and more varied than the old IPv4 format. This is what allows IPv6 to be so much more expandable than IPv4 (keep reading to find out just how much more).
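As a quick sketch of this structure, Python’s standard `ipaddress` module can parse and expand IPv6 addresses too; the address below is the standard documentation-range example, not a real allocation:

```python
import ipaddress

# IPv6 addresses can be written in a shortened form, with "::" standing
# in for consecutive groups of zeros
addr = ipaddress.IPv6Address("2001:db8:85a3::8a2e:370:7334")

# The full form: eight groups of four hexadecimal digits, 128 bits total
print(addr.exploded)       # 2001:0db8:85a3:0000:0000:8a2e:0370:7334
print(addr.max_prefixlen)  # 128 (bits)
```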
I won’t cover all of the technical details of the IPv6 format here (you can check out this Wikipedia page if you’re interested!), but I do want to stress the benefits IPv6 brings. Namely, this standard allows for a huge increase in the number of possible IP addresses available for network devices. How huge? I’ll attempt to illustrate it below:
While 4.3 billion is itself a big number, it’s really just a drop in the ocean compared to 340 undecillion (about 3.4 × 10^38), the total number of possible IP addresses available for network devices through IPv6. This exponentially greater address space makes another IP address shortage effectively impossible for the foreseeable future.
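The arithmetic behind that comparison is easy to verify for yourself:

```python
# IPv4 uses 32-bit addresses; IPv6 uses 128-bit addresses
ipv4_total = 2 ** 32
ipv6_total = 2 ** 128

print(ipv4_total)  # 4294967296 -- roughly 4.3 billion
print(ipv6_total)  # 340282366920938463463374607431768211456 -- ~340 undecillion

# Every single IPv4 address could be replaced by 2^96 IPv6 addresses
print(ipv6_total // ipv4_total == 2 ** 96)  # True
```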
What it means for Data Cave customers
Ultimately, this means that companies who choose Data Cave for their point-to-point Internet connectivity will have a much easier transition from IPv4 to IPv6. As businesses grow and update their networks, they will always need new IP address allocations to keep all of their devices connected to the Internet, so having support for the IPv6 standard in place will be very beneficial in the years to come. As we saw earlier, the transition from IPv4 to IPv6 is definitely not a matter of “if”, but “when.”
If Data Cave already provides Internet connectivity for your business, then you can begin taking advantage of our IPv6 support today. If not, then I would definitely encourage you to learn more about our Connectivity options!
One term you often see associated with any data center is its “tier,” or level of service. Virtually every data center has a tier ranking of 1, 2, 3, or 4, and this ranking serves as a shorthand for everything it has to offer: its physical infrastructure, cooling and power systems, redundancy levels, and promised uptime.
We consider Data Cave to be a Tier 4 data center because of our best practices and maximum uptime levels. But apart from looking good on paper, what does that actually mean to you? I wanted to use this post to answer that question! I will cover what each of the 4 data center tiers encompasses, what factors go into each tier ranking, and what it ultimately means at the end of the day.
The data center tier system first came into existence back in 2005, as a way to bring quantifiable standards to the industry that each individual data center can be measured against. These standards were developed and are maintained by the Telecommunications Industry Association (TIA), an entity responsible for a wide range of standards across many segments of the IT industry. In addition, a separate set of 4 tier levels has been developed by the Uptime Institute, a third-party data center research organization. While the two sets of standards are maintained separately, they are very similar to one another in the specific criteria that make up each tier.
Each data center tier ranking consists of several criteria and requirements that primarily focus on a data center’s infrastructure, levels of redundancy, and promised level of uptime. Here are some specifics on the factors that go into each of the 4 tiers:
A Tier 1 data center is the simplest of the 4 tiers, offering little (if any) redundancy and making no real promise of maximum uptime:
- Single path for power and cooling to the server equipment, with no redundant components.
- Typically lacks features seen in larger data centers, such as a backup cooling system or generator.
- Expected uptime levels of 99.671% (1,729 minutes of annual downtime)
The next level up, a Tier 2 data center has more measures and infrastructure in place, making it less susceptible to unplanned downtime than a Tier 1 data center:
- Will typically have a single path for both power and cooling, but will utilize some redundant components.
- These data centers will have some backup elements, such as a backup cooling system and/or a generator.
- Expected uptime levels of 99.741% (1,361 minutes of annual downtime)
In addition to meeting the requirements for both Tier 1 and Tier 2, a Tier 3 data center is required to have a more sophisticated infrastructure that allows for greater redundancy and higher uptime:
- Multiple power and cooling distribution paths to the server equipment. The equipment is served by one distribution path, but in the event that path fails, another takes over as a failover.
- Multiple power sources for all IT equipment.
- Specific procedures in place that allow for maintenance/updates to be done in the data center, without causing downtime.
- Expected uptime levels of 99.982% (95 minutes of annual downtime)
At the top level, a Tier 4 ranking represents a data center that has the infrastructure, capacity, and processes in place to provide a truly maximum level of uptime:
- Fully meets all requirements for Tiers 1, 2, and 3.
- Infrastructure that is fully fault tolerant, meaning it can function as normal, even in the event of one or more equipment failures.
- Redundancy in everything: Multiple cooling units, backup generators, power sources, chillers, etc. If one piece of equipment fails, another can start up and replace its output instantaneously.
- Expected uptime levels of 99.995% (26 minutes of annual downtime)
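The annual downtime figures quoted for each tier follow directly from the uptime percentages; this quick sketch shows the arithmetic:

```python
# Minutes in a (non-leap) year: 365 days x 24 hours x 60 minutes
MINUTES_PER_YEAR = 365 * 24 * 60  # 525,600

# Expected uptime percentage for each tier, from the lists above
tiers = {"Tier 1": 99.671, "Tier 2": 99.741, "Tier 3": 99.982, "Tier 4": 99.995}

for tier, uptime in tiers.items():
    downtime = (1 - uptime / 100) * MINUTES_PER_YEAR
    print(f"{tier}: about {round(downtime)} minutes of downtime per year")
# Tier 1: about 1729 ... Tier 4: about 26
```

Even the jump from Tier 3 to Tier 4 (99.982% to 99.995%) cuts the expected annual downtime by more than two-thirds.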
What Tier 4 Represents
As you can imagine, a lot of factors go into the data center tiering system, and following the established best practices for a specific tier is very well worth it. A data center’s tier provides the clearest, most recognizable symbol of its capabilities. Data Cave is categorized as a Tier 4 data center because we consistently follow all of its established guidelines and maintain maximum uptime levels. It is a simple, clear indicator of our facility, infrastructure, and processes, and of how we compare to other data centers.
Now that you know a bit more about the 4 data center tiers and what goes into each ranking, would you like to see firsthand what a Tier 4 data center looks like? If so, contact us today to schedule a facility tour!
Data center temperatures throughout the industry are on the rise.
It has never been a secret that data centers use a tremendous amount of energy (an estimated 1.5% of the world’s electricity), and that a huge chunk of that energy goes into their cooling systems. Keeping colocated equipment cool and functioning is serious business, and it will always be a priority for data centers. However, over the past few years there has been a gradual shift in the industry regarding the actual temperature that IT equipment should be maintained at within data centers; on the whole, these temperatures are going up.
In fact, ASHRAE, the leading group representing the heating and cooling industry, has raised its recommended temperature range for data centers more than once over the past few years. Currently, the maximum recommended temperature is 80.6 degrees Fahrenheit (27 degrees Celsius). While this is considerably higher than the temperatures data centers have traditionally maintained, there is mounting evidence that it works.
My goal for this post is to cover the shift towards warmer temperatures in the data center. I’ll look at some of the different drivers for this change, how it is being done now, and how it is already having a serious impact.
The Drivers: Why is it being done?
When it comes to cooling, the traditional mindset of data center operators has been “the cooler, the better”: since IT equipment generates a lot of heat at all times, there should always be a high level of cooling in place to keep that equipment as cool as possible. That approach, however, has contributed to the immense power usage of data centers. With many new and existing data centers seeking to reduce their overall carbon footprint, it is completely understandable that they would look for ways to make their cooling more efficient. Here are the primary motivating factors for shifting to warmer temperatures in the data center:
Environmental Friendliness. By reducing its total energy usage, a data center can significantly shrink its carbon footprint.
Cost Savings. This almost goes without saying, but using less energy for cooling also means major cost savings for data centers, savings that can potentially be passed on to their customers.
The Methods: How is it being done?
There are several ways that data centers are working to effectively maintain their equipment at higher temperatures in order to see these benefits. Many of the world’s largest tech companies like Google, Microsoft, and Facebook are already employing several methods in some of their new data centers to accomplish this, but here are some ways that data centers can achieve and maintain warmer temperatures:
Location. A data center’s location plays a huge role in both the power and cooling options that are available to it. Many data centers, including Data Cave, are able to take advantage of their location during the winter months and use “free cooling” to either cool the IT equipment directly, or chill the circulating water that is used for cooling (as opposed to running the water through a chiller). This results in less overall energy usage, and is made possible by the physical location of the data center.
Monitoring Technology. The methods and technology that data centers use to monitor temperature, humidity, air flow, and energy usage in real time have become much more advanced over the past several years, allowing for more flexibility in the temperatures at which server rooms are maintained. More sophisticated monitoring tools allow data center personnel to be alerted more quickly when a piece of equipment begins to overheat, and to respond faster when one does have an issue.
Server Equipment. More than ever before, servers themselves are being built to tolerate much warmer temperatures than they previously could. In fact, many servers produced by Dell are now guaranteed to function fully at temperatures up to 115 degrees Fahrenheit, and studies conducted by Intel and Microsoft have shown that many modern servers can run at higher temperatures without harming the actual equipment (you can check out the full Intel study here).
The Industry Impact
The shift towards warmer temperatures has been slowly but surely gaining traction within the data center industry, and it is already starting to have an impact. This impact can be seen in many of the new data centers around the world that have been built by Facebook, Google, and many of today’s largest tech companies (check out some examples of data centers built by Facebook and Google). This is definitely a trend that is here to stay, and I think in the long run it will have tremendous benefits for everyone involved: data center operators, customers who depend on the data center and of course, the environment.
How we do it at Data Cave
Maintaining more environmentally friendly temperatures isn’t just something for the larger companies though. There are several things we do at Data Cave as well that are in line with this industry shift. As I mentioned earlier, we take advantage of our facility’s physical location during the winter months, and use “free cooling” from the outside air to provide chilled water, which reduces our overall energy consumption. Also, we have a wide range of real-time monitoring in place that allows us to consistently improve our efficiency and cool our data suites as cost effectively as possible (I’ll cover our monitoring in more detail in a future post!). These practices have helped us to deliver colocation services that are both cost effective and environmentally friendly.