Archive for the ‘Data Center’ Category
The Uptime Institute recently conducted a survey on data center outages reported in the last twelve months. The survey spanned across traditional enterprise data centers, colocation data centers and financial services companies with in-house data centers.
The survey asked respondents how many critical data center outages impacting business continuity they had experienced in the last year. Colocation data centers demonstrated high availability: only 3% of colocation respondents reported five or more outages, compared to 7% of enterprise data centers and 6% of financial services data centers.
Uptime should be a top concern and a determining factor when businesses select data centers. 21% of businesses reported facing at least one outage with their data center provider. On a positive note, however, most respondents reported being satisfied with the overall performance of their data center providers, with satisfaction ranging from very satisfied to somewhat satisfied.
Colocation contracts typically mention the downtime that is permissible. The survey reported the following trends for allowed downtime limits:
- 40% allowed for zero scheduled downtime.
- 30% allowed for one to twelve hours of yearly scheduled downtime.
- 6% allowed for thirteen to twenty-four hours of yearly scheduled downtime.
- 2% allowed for more than 96 hours of yearly scheduled downtime.
A similar study conducted by the Ponemon Institute found that an unscheduled data center outage costs, on average, $7,900 per minute, a 41% increase over the $5,600 per minute reported in 2010. With outages typically lasting 60 to 90 minutes, the average total cost of a single incident fell in the range of $350,000 to $900,000. The survey also found that sectors with critical operations, such as defense, finance and telecommunications, had higher downtime costs than other sectors.
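The arithmetic behind these figures is easy to check. A minimal sketch, using only the per-minute costs and durations quoted above (any other formatting choices are illustrative):

```python
# Outage-cost arithmetic from the Ponemon figures cited above.
COST_PER_MINUTE_2013 = 7_900  # USD per minute (Ponemon)
COST_PER_MINUTE_2010 = 5_600  # USD per minute (Ponemon, 2010)

def outage_cost(minutes: float, cost_per_minute: float = COST_PER_MINUTE_2013) -> float:
    """Estimated total cost of an unscheduled outage of the given length."""
    return minutes * cost_per_minute

# The quoted 41% rise checks out:
increase = (COST_PER_MINUTE_2013 - COST_PER_MINUTE_2010) / COST_PER_MINUTE_2010 * 100
print(f"Per-minute cost increase since 2010: {increase:.0f}%")

# Straight multiplication over the 60-90 minute range:
low, high = outage_cost(60), outage_cost(90)
print(f"Estimated cost of one incident: ${low:,.0f} to ${high:,.0f}")
```

Note that a straight minutes-times-rate multiplication gives the per-incident ballpark; the study's reported totals also reflect variation across individual incidents.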
Lifeline Data Centers is a colocation data center in the Midwest providing 99.995% uptime and reliable data center services. As a Tier 4 data center, we have maintained 99.995% uptime in order to provide our clients with the best service possible. To learn more, contact us today.
Data centers face the challenge of serving clients who depend on them entirely for reliable, consistent service 24×7. To deliver that, data centers consume huge amounts of energy, which adds to business and service costs. Data centers therefore need to take steps to save energy while still ensuring uptime.
This is quite achievable if data centers adopt new power-saving standards and innovations as they become available, while maintaining current functions optimally. Here are a few simple changes that can help data centers save energy:
- Turning off equipment that’s not in use: A simple solution to a major problem, and easy enough to do, yet it is often overlooked by data centers just as it is in homes and businesses. Equipment on standby still consumes energy, so turning it off completely is a power-saving solution worth remembering.
- The hot-aisle, cold-aisle trick: This is one of the simplest ways a data center can conserve energy, and most data centers adopt it as routine practice. Laying out servers in alternating hot and cold aisles not only optimizes power consumption but also extends server life.
- Optimum airflow management: Identifying the hot and cold spots in a data center helps in highly effective airflow management, which, in turn, results in lower power consumption to keep temperatures at required levels.
- Optimizing CPUs: A server’s CPUs are its most power-hungry components, and turning on their power management features helps lower the power required to keep them running.
- Begin measuring power usage: If a data center does not measure how much energy specific functions consume, then allocating, reallocating or optimizing energy use becomes difficult. Measuring the distribution of power is the first step toward putting it to optimal use, and Power Usage Effectiveness (PUE) is the vital metric here.
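The PUE metric named in the last point is simply total facility power divided by the power delivered to IT equipment; a value of 1.0 would mean zero overhead. A minimal sketch (the kW readings are hypothetical):

```python
# PUE (Power Usage Effectiveness): total facility power / IT equipment power.
# An ideal facility would score 1.0; everything above that is overhead
# (cooling, lighting, power distribution losses, and so on).

def pue(total_facility_kw: float, it_equipment_kw: float) -> float:
    """Compute PUE from two power readings in kilowatts."""
    if it_equipment_kw <= 0:
        raise ValueError("IT load must be positive")
    return total_facility_kw / it_equipment_kw

# Hypothetical readings: the whole facility draws 1,500 kW,
# of which 1,000 kW goes to IT gear.
print(f"PUE: {pue(1500, 1000):.2f}")  # 1.50, i.e. a third of the power is overhead
```

Tracking this ratio over time shows whether efficiency measures are actually working.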
At Lifeline Data Centers, we put the most innovative power-saving guidelines to work so that energy is used optimally. Conserving power even in our routine activities allows us to cut down on costs and offer competitive pricing. Contact us today for more information.
There has been a major push in reducing the energy footprint of data centers, especially with the amount of data that is accumulated each and every day. One approach is to go lean by cutting out unneeded servers and adopting other energy reducing measures wherever possible, which seems simple enough, right?
While being green is attractive from a business standpoint, many data centers are not eager to take this route. Cooling and other energy requirements exist for a reason, and cutting them imposes restrictions elsewhere. And while identifying and eliminating unneeded servers and other appliances may indeed cut costs, the cost of actually doing so may outweigh the savings.
Data centers unable to go lean due to their business model or any other reason should consider the following approaches to go green if they are faced with these challenges:
- Reduce consumption of fossil-fuel energy and tap renewable sources, including solar photovoltaic and wind. The Pilot Hill Wind Project, which aims to supply Microsoft with 675,000 MWh of renewable energy per year from 2015 to power its data centers, is a sign of the changes that are coming. The European Commission’s recently announced RenewIT initiative, which aims to have 80% of the European data center industry powered by renewable and sustainable resources, should provide a further push in this direction.
- Avoid highly polluting diesel generators for backup power and use biodiesel-powered generators instead. Biodiesel significantly reduces the carbon footprint of diesel engines, though it comes with the trade-off that the generators require more frequent care and cleaning.
- Upgrade equipment periodically to replace obsolete, energy-guzzling gear. Direct Expansion (DX) cooling systems are a case in point: a decade ago the focus was on the most cost-efficient system, while today the primary focus is on the COP (Coefficient of Performance), the ratio of heat energy moved to the energy used to move it.
- Use recycled equipment as long as possible. This would greatly reduce the carbon footprint associated with the manufacturing of the equipment, even if there are no major operational savings.
- Adopt energy efficient architectural approaches, especially incorporating green innovations in ventilation and air conditioning, with the overall aim of minimizing the data center’s impact on environment.
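The COP figure mentioned in the equipment-upgrade point above is straightforward to compute; a minimal sketch, with hypothetical kWh values:

```python
# COP (Coefficient of Performance): heat energy moved divided by the
# electrical energy consumed to move it. Higher is better.

def cop(heat_moved_kwh: float, energy_used_kwh: float) -> float:
    """Ratio of heat removed to electricity consumed by a cooling system."""
    if energy_used_kwh <= 0:
        raise ValueError("energy consumed must be positive")
    return heat_moved_kwh / energy_used_kwh

# Hypothetical: a system removes 350 kWh of heat while consuming 100 kWh.
print(f"COP: {cop(350, 100):.1f}")  # 3.5
```

Comparing COP across candidate cooling systems makes the upgrade decision a simple numeric comparison rather than a guess.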
Green data centers are not just an environmentally friendly move; they are a sound business proposition. Most of the time, the investment to “go green” pays for itself and starts generating additional savings shortly after it is implemented.
Lifeline Data Centers prides itself on energy efficient practices. If you’re interested in learning more, schedule a tour of our facility today.
Data centers have always dealt with the challenge of high power consumption. An efficient data center requires large amounts of power, without which its services, and the businesses that depend on them, would come to a standstill. Over the years, thoughtful innovations have helped data centers drastically lower their power consumption, but it still isn’t where it needs to be. Here is the current state of power usage in data centers today.
Compared to everyday situations, the power consumption of a data center is colossal. Even at the end of 2013, data centers worldwide collectively drew around 30 billion watts of power, comparable to the output of 30 nuclear power plants. In fact, a single data center can use more than 100 times the power of a similarly sized building, or the equivalent of 25,000 homes in the US.
Power usage by data centers
Statistics show that in America alone, data centers used 9,900 MW of power in 2012, rising to 10,560 MW in 2013. No other country in the world has adopted technology the way the US has, and it remains the undisputed international leader. In fact, the figures for all of Europe add up to only a little more than those for the US, standing at 12,705 MW and 13,470 MW for 2012 and 2013 respectively.
Between 2011 and 2012, power consumption by data centers rose 19%, causing great concern. The rising cost of power and the detrimental effects on the environment led governments, as well as the public, to sit up and take notice. In 2013, however, growth dropped to just 7%, signaling a wave of energy efficiency measures. Against earlier forecasts through 2016, this translates into energy savings of at least 1,000 MW.
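The 7% figure follows directly from the US consumption numbers quoted above; a quick check:

```python
# Year-over-year growth from the US figures cited in this article:
# 9,900 MW in 2012 and 10,560 MW in 2013.

def growth_pct(old_mw: float, new_mw: float) -> float:
    """Percentage growth from one year's consumption to the next."""
    return (new_mw - old_mw) / old_mw * 100

us_2012, us_2013 = 9_900, 10_560
print(f"US data center power growth, 2012 to 2013: {growth_pct(us_2012, us_2013):.1f}%")
```

The result, about 6.7%, rounds to the "just 7%" the article cites.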
At Lifeline Data Centers, we understand the concerns related to power usage and reiterate our commitment to optimized energy use through every innovation we adopt. After all, power savings are great for the environment and our business, too! Schedule a tour to learn more about our energy efficient techniques.
Every data center security manager has nightmares about his or her network being attacked. But some have more peace of mind because they feel confident in their data center infrastructure and because they have every inch of their network covered with the right degree of protection in place.
Part of covering that network is using scanning tools that tell you about vulnerabilities in the data center. Here are a few free network scanning tools that could be valuable in your data center.
- OpenVAS: One of the more sophisticated network scanning tools, this scanner covers as many as 33,000 NVTs (Network Vulnerability Tests). It runs on Linux systems and also has a client for Windows-based systems. Although it is not the simplest of tools to use, it is one of the most comprehensive, offering great value in test coverage along with simultaneous and scheduled scans.
- Retina CS Community: The Retina tools cover mobile devices, private clouds, servers and applications, hosted primarily in Windows environments. They also cover third-party software such as Firefox and Adobe applications and can scan up to 256 IPs for free. The set offers both scanning (Retina Network Community) and patching (Retina CS Community). The one limitation some users may encounter is that these tools must be installed on Windows Server.
- Qualys FreeScan: A simple tool that administrators of small setups can use to scan up to 10 URLs or IP addresses, either local or Internet-facing. It can identify network-related vulnerabilities, hidden malware and SSL issues. It also checks your computer settings for compliance with the Security Content Automation Protocol (SCAP) formulated by the National Institute of Standards and Technology (NIST).
- Microsoft Baseline Security Analyzer (MBSA): A simple, Windows-only tool that scans for missing patches, weak passwords, SQL administration vulnerabilities and security misconfigurations. It provides a detailed post-scan report on the vulnerabilities discovered and potential fixes for each. It is an easy-to-use tool for general security risks, but it does not cover more complex areas such as Windows settings, network issues and non-Microsoft software.
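None of the scanners above is reproduced here, but the core idea behind any network scan, probing a host's ports to see what is reachable, can be sketched with Python's standard socket module. The host and port list are illustrative; only scan machines you are authorized to test.

```python
import socket

def scan_ports(host: str, ports: list[int], timeout: float = 0.5) -> list[int]:
    """Return the subset of ports on which a TCP connection succeeds."""
    open_ports = []
    for port in ports:
        with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as sock:
            sock.settimeout(timeout)
            # connect_ex returns 0 on success instead of raising an exception
            if sock.connect_ex((host, port)) == 0:
                open_ports.append(port)
    return open_ports

# Example (hypothetical host, documentation-only address):
# print(scan_ports("192.0.2.10", [22, 80, 443, 3306]))
```

Real vulnerability scanners go far beyond this, fingerprinting services and matching them against vulnerability databases, but an open-port inventory is the common first step.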
Addressing network vulnerabilities is an important aspect of a healthy data center. It is meticulous work and requires a great deal of effort, but the rewards are worth it. Recognizing this, Lifeline Data Centers offers top network security solutions in our colocation data centers. If you are looking for a secure and optimized data center for your business needs, get in touch for a free consultation.
Data centers are an important part of the business structure for organizations in every sector, and they are essential for storing mission-critical applications and data. Since a data center plays such a crucial role, isn’t its location an important factor, too?
Ideally, a data center’s location should not be susceptible to threats like natural calamities, technical snags or acts of terrorism. However, guarding against these threats alone is not enough.
Here are five key factors that determine the best location for a data center:
- Availability and cost of power: Data centers need a lot of power to function. Even in 2012, data centers across the United States consumed as much power as five million homes. It is therefore vital for a data center to be located in an area with easy access to reasonably priced power.
- Susceptibility to natural disasters: Some locations are more prone to earthquakes, tornadoes, hurricanes, wildfires and floods than others are. These and other natural calamities can interrupt operations and disrupt business activities, causing financial losses. A location that has a low occurrence rate of these calamities is an ideal place for setting up data centers.
- Economic costs: Every detail, from the cost of construction and taxes to the labor market and the availability of quality labor, is a critical factor influencing location selection. These factors vary from place to place, so a comparison and analysis of costs usually decides whether a location is feasible.
- Infrastructure for network and telecommunications: Since network and telecommunications infrastructure is essential for running operations, the selected location must have efficient connectivity, and the cost of installation and operation should be reasonable. Infrastructure redundancy and the presence of multiple carriers to reduce risk are important considerations, too.
- Employees and their quality of life: Operating a data center requires a skilled workforce, so access to civic amenities and a good lifestyle, with quality housing, education and recreation, is a priority.
In addition to these aspects, other crucial elements, such as investment rules and legalities as well as company-specific requirements, come into play, too.
When choosing a data center, a business should pick one that weighed every pro and con before selecting its location. At Lifeline Data Centers, we ensure that our location enhances our functionality and mitigates risks, thus helping your business thrive. Take a tour and check us out!
The lean approach to data centers has caught on in a big way as a means to reduce cost and improve efficiency. Its primary focus is on reducing cooling expenses and improving the energy efficiency of servers. While this is indeed valuable and generates savings, it may actually be the wrong priority: the losses from unoptimized cooling can be minuscule compared to the losses from underutilized servers. Once excess servers are eliminated, the need to cool them or optimize their energy requirements disappears by itself.
The problem of underutilized servers is underestimated. Gartner pegged industry-wide server utilization at just 12% in 2012. A 2008 McKinsey study estimated data center utilization at just 6%, and a 2010 Green Grid study backs up these numbers as well. Even with the advent of server virtualization, data center utilization is getting worse rather than better.
The reason for such abysmal numbers is not hard to find: with servers getting cheaper, it makes more sense for managers to simply buy a new one to accommodate a new process rather than waste scarce time identifying spare capacity or improving the efficiency on existing servers. Over time, these servers accumulate and run way below their capacity and even on empty if the process has been discontinued. This increases energy and real estate costs considerably.
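The first step in reversing this accumulation is simply measuring utilization per server and flagging the machines running near-empty. A minimal sketch, where the fleet data and the 10% threshold are hypothetical:

```python
# Flag servers whose average utilization falls below a cutoff; these are
# the candidates for consolidation or decommissioning discussed above.

def underutilized(servers: dict[str, float], threshold: float = 0.10) -> list[str]:
    """Return names of servers with average utilization below the threshold."""
    return sorted(name for name, util in servers.items() if util < threshold)

# Hypothetical fleet: name -> average CPU utilization (0.0 to 1.0).
fleet = {"web-01": 0.45, "db-01": 0.30, "legacy-01": 0.03, "batch-07": 0.06}
print(underutilized(fleet))
```

Even this crude report surfaces the servers that are quietly consuming power and floor space for little or no work.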
The solution is to examine the complete internal data center supply chain, rather than taking a component-by-component approach, to identify waste, and to use statistical quality management methods to improve efficiency and quality. The lean principle of 5S offers a structured approach to this end. Applied in conjunction with Six Sigma, 5S answers questions such as “How many now-virtualized processes are still running in tandem on in-house servers?” and “How many duplicate processes are running?” Applying Lean Six Sigma can significantly reduce data center floor space, server load, energy consumption and tons of CO2 emissions.
Lifeline Data Centers offers fully compliant and highly efficient colocation facilities with customized packages to suit all your requirements. The facility is noted for its high uptime, connectivity and room for growth, while also operating efficiently. Schedule a tour with us today.
Dashboards have made their way into some data center environments. The industry is always looking for ways to innovate, and highly visual, sophisticated dashboards now give some data centers a unified view of every critical aspect of performance. With this new generation of business intelligence and performance dashboards, you can see the status of your data center in real time. Let’s take a look.
Some of the leading global technology organizations have adopted dashboards in their data centers:
Facebook has become famous for broadcasting its dashboard on a huge screen in the reception area of its Prineville data center. Performance indicators such as PUE (power usage effectiveness), WUE (water usage effectiveness), humidity and temperature are displayed in near real time, alongside annualized averages for comparison. Even more remarkable, Facebook has gone a step further and open-sourced its dashboard code so that other data centers can benefit from similar features.
Intel has also been working hard to redefine its data center strategies, reporting $184 billion in new business value over the last four years. A key feature of the new strategy was an integrated, business intelligence-based dashboard that reports metrics at both the system and component levels. The dashboard consolidates data from all of Intel’s data centers, polling as many as 192 million data records. Metrics related to space, storage, power and cooling are displayed visually and can be compared across business functions and data center locations.
Whether simple or elaborately complex, data center dashboards can give a holistic view of operations and help identify opportunities for efficiency improvement. While we don’t have dashboards yet, we are continuing to innovate in our data center and always have current metrics on our performance. To learn more, contact us for a consultation today.
A data center faces many hazards, and fire is an ever-present threat.
Clients who store valuable data with a data center should check whether it is NFPA 70 compliant. National Fire Protection Association Regulation 70 (NFPA 70), also known as the National Electrical Code, is a benchmark for safe electrical design, installation and inspection. It sets stipulations for the installation of electrical conductors and equipment, signaling and communications conductors, optical fiber cables and raceways. Among other things, it provides guidelines for a comprehensive arc flash study/audit and for the labeling of equipment, aimed at preventing fire hazards.
According to NFPA 70 Article 100, an arc flash hazard is “a dangerous condition associated with the possible release of energy caused by an electric arc”. High arc-flash-energy areas are usually those where incident energy values exceed Category 4 (40 cal/cm²); in data centers, this is mostly the facility’s main switchgear.
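The Category 4 threshold mentioned above comes from a standard ladder of arc-flash PPE categories. The boundaries below follow the commonly cited NFPA 70E arc ratings (4, 8, 25 and 40 cal/cm²); this is a simplified illustration only, not a substitute for a proper arc flash study.

```python
# Simplified mapping from incident energy (cal/cm^2) to an arc-flash PPE
# category, using the commonly cited NFPA 70E thresholds. Illustrative only.

def ppe_category(incident_energy: float) -> str:
    """Map an incident-energy value in cal/cm^2 to a PPE category label."""
    if incident_energy <= 4:
        return "Category 1"
    if incident_energy <= 8:
        return "Category 2"
    if incident_energy <= 25:
        return "Category 3"
    if incident_energy <= 40:
        return "Category 4"
    return "Above Category 4 (beyond standard PPE ratings)"

# A main-switchgear reading above 40 cal/cm^2 falls outside every category:
print(ppe_category(42))
```

Readings above 40 cal/cm² are exactly why main switchgear gets special attention in a data center's arc flash audit.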
NFPA 70 specifies ways to conduct a risk assessment, recommends measures for flash and arc protection and mandates proper protective equipment for employees based on their tasks. It describes:
- Products to be used for general electrical construction.
- Minimum physical and operational characteristics for electrical products.
- Test standards that certain products must meet before they can be deployed.
Data centers are free to decide how to comply with these standards in whatever way they see fit, but they are expected to use the best available means of meeting the requirements.
NFPA 70, like most NFPA codes, is not mandatory; it aims to build consensus on what constitutes industry best practices. Data centers nevertheless gain by complying with these standards, as doing so helps assure clients of fully compliant and secure operations.
The NFPA reviews and updates NFPA 70 every three years, so the code stays largely current.
If you’re looking for a completely compliant data center, look no further than Lifeline. We have every protection in place and adhere to most, if not all, data center compliance regulations.
Robots are making their way into modern data centers at Google and Amazon. While they aren’t mainstream yet, rumors are already circulating about the number of jobs these sleek, silent machines could eliminate. On one hand, automating mechanical tasks could free people to lead better-balanced, more meaningful lives; on the other, there is the possibility of increased conflict, chaos and insecurity if the shift is not planned and executed properly.
Amazon has been the first to spearhead this game-changing phenomenon. Robots are already in use in Amazon’s warehouses and have now made their way into its data centers. Key advantages of having robots in data centers include:
- Vertical scalability: Robots do not have the same height limitations as humans do. With space constraints becoming a key factor in managing costs, robots can climb much higher and faster so server racks can be mounted vertically instead of horizontally.
- Lights Out: An automated data center can eliminate the infrastructure that exists specifically for humans, yielding significant cost savings on electricity.
- Robots in Security: A new trend has started where the physical security of data centers is now handled by robots. The robot is vigilant, adaptable and, best of all, cannot be bribed or corrupted.
- Energy Efficiency with Automated Data Collection: IBM has revealed that it has nine robots in its data centers globally, moving about the data center floor recording humidity and temperature measurements. This provides important information on areas that can be cooled or heated more efficiently.
The future of robots in the data center looks super exciting. But, at Lifeline Data Centers, we also care about our people. We always want to make our data centers more efficient without compromising the human touch our employees bring to our operations.
For more information, contact us today.