Let the compliance experts at Lifeline Data Centers help you solve your SSAE 16, TIA-942, NFPA, HIPAA, FISMA, FDA, PCI DSS and Sarbanes-Oxley audit problems. We stay on top of compliance issues so you don’t have to.
Our passion is helping companies optimize their IT strategies and decisions. We work with companies to help them determine the best solutions for their production data centers and disaster recovery centers. Using external data center (colocation) facilities is one of the easiest ways to manage IT costs while improving service reliability and uptime. We’ll show you how! Find us on Twitter: @lifelinedatactr
Every data center has to face the reality of software compliance audits. These can be a nightmare for setups with an array of enterprise and open source software running together. For smaller shops, an audit can be a relatively easy exercise, but it still requires time and effort.
Software Asset Management is the practice of using automated tools to track all aspects of the software in the enterprise. What appears simple can actually take a sophisticated data center nearly five years to implement seamlessly. True, the reports and dashboards look impressive at first, but the detailed effort and inputs needed to prepare them are sometimes hidden at the time of purchase. The cost-benefit factor needs to be weighed carefully when investing in a software asset management tool.
Even simple data centers can take as much as two years to implement a basic asset tracking system in Excel. The recommendation is to have a basic system in place first, which will reveal the groundwork needed before jumping in for an expensive automated tool. A step beyond an Excel system is a Configuration Management Database (CMDB), which brings more discipline to the change management process.
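At its core, even a spreadsheet-level tracker boils down to reconciling installed software against purchased entitlements. A minimal sketch of that reconciliation, using hypothetical titles and counts:

```python
from collections import Counter

# Hypothetical installation records: (hostname, software title)
installs = [
    ("srv-01", "DBMaxPro"), ("srv-02", "DBMaxPro"),
    ("srv-03", "DBMaxPro"), ("wks-10", "OfficeSuite"),
]

# Hypothetical entitlements: title -> number of licenses purchased
entitlements = {"DBMaxPro": 2, "OfficeSuite": 5}

def compliance_gaps(installs, entitlements):
    """Return titles installed more times than licensed, with the overage."""
    counts = Counter(title for _, title in installs)
    return {
        title: counts[title] - entitlements.get(title, 0)
        for title in counts
        if counts[title] > entitlements.get(title, 0)
    }

print(compliance_gaps(installs, entitlements))  # → {'DBMaxPro': 1}
```

The hard part in practice is not this arithmetic but keeping the installation records accurate, which is exactly the ground work a basic system exposes.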
An example of a highly automated software asset management system is the IBM® Endpoint Manager. This product was used recently by an enterprise to evaluate software licenses across 67 data centers, covering both laptop and desktop installations. The result led the company to close gaps in software compliance, bringing its total non-compliance costs down from six figures to a low of $7,000. As part of the next-generation wave, IBM also plans on using an app store, similar to Google and Apple. The concept involves a decentralized view of software asset management where users log in directly to the app store and request software downloads. The app store will hold licensing information and will allow a download only if a license is available.
At Lifeline, we have developed a seamless process and incorporated the tools we need to do our jobs effectively. To check out our setup, schedule a tour at Lifeline today.
In a recent survey of the best data backup software, there were a few clear winners. The survey was conducted by the leading publication TechTarget to determine the recipients of its ninth annual Quality Awards in this category. In the enterprise backup application category, Asigra ranked first with an impressive overall score of 6.38 points. Asigra’s software is used primarily for backup in enterprise data centers and as backup software for cloud services. Other winners in the enterprise backup software category were as follows:
- EMC NetWorker with a rating of 5.78
- CommVault Simpana with a rating of 5.75
- HP Data Protector with a rating of 5.72
- Symantec NetBackup with a rating of 5.68
One of the significant distinguishing factors behind the phenomenal ratings for Asigra and CommVault was the impeccable performance of their sales teams. The ratings were based on the knowledge of sales support teams and how well sales representatives understood their customers’ businesses. Ease of installation and configuration, along with good value for the money, also contributed to Asigra’s winning ratings. Product features, such as file system backup, database backup and management features, gave Asigra a rating of 6.82 for enterprise features.
The other category of awards was midrange backup and recovery products. The clear winner in this category was Veeam, with an average rating of 6.3 across the five rating categories. Like Asigra, Veeam stood out for the competency of its sales force and, in particular, for the value proposition behind its licensing model. Other winners among the midrange products were as follows:
- EMC Avamar with a rating of 5.77
- BarracudaWare Yosemite Server Backup with a rating of 5.68
As the user survey revealed, an excellent product always stands out from the crowd. Product reliability, value for money, range of features, sales support and technical support are the key factors the user community uses to evaluate products. Notably, Asigra earned a 93% buy-again score from its existing users, and Veeam had a 94% repeat purchase score.
At Lifeline, we only use the best products to support our data center infrastructure. Schedule a tour with us today to learn more.
There’s a new wave of technology set to invade our lives: the Internet of Things (IoT). Along with the expectations and opportunities innovation brings, there are plenty of challenges, too. The biggest challenge of the IoT wave will be storing the huge amounts of data it is guaranteed to generate. At the forefront of tackling this issue are the trusty lieutenants of the tech and business world: data centers.
The core IoT challenges for data centers
Data centers have promising projects linked to the deployment of the IoT, and, of course, there’s a long list of challenges, too. Here are the most prominent challenges facing data centers in the IoT revolution:
- Storage: Gartner predicts that IoT installations will reach 26 billion units by the year 2020. This, of course, translates into revenue in excess of $300 billion for those associated with the technology. It also means mammoth amounts of data and the need to store it. Data centers will be the primary resources responsible for storing this data, but they’ll need to figure out a good way to scale to that volume.
- Investments: Getting ready for the barrage of data requires data centers to invest in additional infrastructure, ranging from new server technologies to other necessary groundwork. This means more money.
- Network: IoT will transfer significant amounts of data to a data center, data that will need to be stored and processed. The bandwidth capabilities of a data center will therefore require a massive transformation compared to current standards.
- Capacity planning: Data centers today handle their capacity planning at fairly simple levels. The dynamic nature of IoT data will bring complexity with it, and automated tools will need to be put into action.
- Protecting consumers: Data, and everything associated with it, is in the eye of the storm today. Privacy is a worrying aspect of technology, and with the IoT wave this element is only going to come under more scrutiny. Data centers will have to comprehensively tackle the possibility of breaches.
At Lifeline Data Centers, we keep a keen eye on developing technologies and the opportunities they bring. This enables us to expand our own business and infrastructural boundaries, too. We have foreseen the IoT revolution and the challenges it will bring, and are equipped to handle them well, offering comprehensive solutions to our clientele. Contact us today to learn more about our capabilities.
Compliance with PCI Security Standards is crucial for any organization looking to conduct online credit card transactions. In today’s tech-centric world where customers increasingly prefer the convenience of online shopping to the hassles of visiting brick and mortar stores, organizations that take PCI DSS compliance lightly have a death wish.
Until recently, PCI’s compliance efforts targeted the merchant, with the relationship between the merchant and a third-party service provider not clearly defined. The burden was on the merchant to ensure that their third-party providers complied with PCI DSS requirements. However, the PCI Security Standards Council has now released a new information supplement geared toward third-party providers, including data centers. This supplement offers a framework for a security assurance program to help third-party service providers avoid data breaches, keep payment data secure and meet PCI DSS requirements.
This new supplement, titled “Third-Party Security Assurance Information Supplement,” does not introduce any new compliance requirements. It is rather an expansion of guidance, specifying the policies and measures already in force that third-party providers have to incorporate. The supplement specifically lists how companies can comply with Payment Card Industry Data Security Standard (PCI DSS) requirement 12.8, and it walks through various issues, such as how to determine scope, how to ensure due diligence in the relationship, how to establish a good relationship with service providers and more. The supplement also guides merchants to craft detailed written agreements when outsourcing, making sure that all parties are aware of their obligations.
This new supplement, with inputs from over 160 organizations that are a part of the council’s Special Interest Group (SIG), should go a long way in helping merchants and third parties handling cardholder data understand their security roles and responsibilities. It will also help merchants vet third-party providers better before establishing business relationships with them.
PCI DSS is a set of pragmatic best practices relating to data protection, network security, encryption, access control, monitoring, testing, policy development and more, all of which further the security of any organization. Organizations and data centers would do well to implement these practices even without a compliance requirement.
Going green is a consideration for every data center today. “Green” data centers are designed to have minimal impact on the earth’s extremely sensitive ecology, harnessing forces of nature, such as solar power or naturally cool air for chilling servers, to minimize their carbon footprints.
Generally, people think the need to go green stems from an increasing awareness of environmental issues, fueled by the critical state of the earth’s ecology. While this is indeed a good reason, the push to go green is driven by many other commercial reasons as well, including:
- Many organizations buying data center services have become more aware of green operations and demand that their data centers follow these practices as well. Large organizations usually have written environmental policies influenced by their corporate social responsibility or statutory obligations, which they like to see echoed in their buying decisions. These organizations only opt for data centers that allow them to fulfill their “green” commitment.
- Going green comes with incentives as well, which help not only the data center, but the local economy. The UK government, for instance, offers tax relief for purchases of equipment with a coefficient of performance (COP) better than 3.0, depending on configuration. Data centers that review and replace equipment to keep up these thresholds can reap rich tax benefits.
- Green data centers are becoming a sound business proposition. Efficiency concepts such as Lean and Six Sigma focus on eliminating waste, and applying them offers greater savings, which can improve the bottom line and be passed on to customers as a price advantage, important in an increasingly competitive business environment.
- Most green data centers incorporate design innovations to reduce energy usage, maximize the use of recycled materials to reduce costs and use state-of-the-art, energy-efficient equipment. A green data center consumes roughly 50% less energy than a conventional one.
A study by Stanford University, Northwestern University and Lawrence Berkeley National Laboratory researchers estimates that the use of state-of-the-art equipment can help data centers reduce greenhouse gas emissions by about 80%. Another study by IO estimates that modular data centers, a common approach to green data center design, offer 19% energy savings and 44% energy waste reduction compared to conventional data centers.
All factors considered, going green may no longer be a matter of choice. It is becoming a business necessity. At Lifeline Data Centers, we are dedicated to making our operations as energy efficient as possible. To learn more, schedule a tour of our facilities today.
The Uptime Institute recently conducted a survey on data center outages reported in the last twelve months. The survey spanned across traditional enterprise data centers, colocation data centers and financial services companies with in-house data centers.
The survey was based on the number of critical data center outages experienced in the last year that impacted business continuity. Colocation data centers demonstrated high availability, as only 3% of respondents reported five or more outages, compared with 7% of enterprise data centers; 6% of financial services data centers also reported outages.
Uptime should be the highest concern and the determining factor when businesses select data centers. 21% of businesses reported having faced at least one outage with their data center provider. On a positive note, most respondents reported being satisfied with the overall performance of their data center providers, with satisfaction ranging from very satisfied to somewhat satisfied.
Colocation contracts typically mention the downtime that is permissible. The survey reported the following trends for allowed downtime limits:
- 40% allowed for zero scheduled downtime.
- 30% allowed for one to twelve hours of yearly scheduled downtime.
- 6% allowed for thirteen to twenty-four hours of yearly scheduled downtime.
- 2% allowed for more than 96 hours of yearly scheduled downtime.
Another similar study, conducted by the Ponemon Institute, revealed that an unscheduled data center outage costs, on average, $7,900 per minute. This cost has risen 41% from $5,600 per minute in 2010. Outages typically lasted 60 to 90 minutes, putting the average total cost of a single incident in the range of $350,000 to $900,000. The study also reported that sectors with critical operations, such as defense, finance and telecommunications, had higher downtime costs than other sectors.
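As a quick sanity check on those figures, multiplying the per-minute cost by the reported outage durations lands inside that range; the wider $350,000 to $900,000 spread reflects the sector differences the study noted:

```python
# Figures quoted from the Ponemon Institute estimate above
cost_per_minute = 7_900          # USD, average cost of an unscheduled outage
outage_minutes = (60, 90)        # typical reported outage duration range

low = cost_per_minute * outage_minutes[0]
high = cost_per_minute * outage_minutes[1]
print(f"${low:,} to ${high:,} per incident")  # $474,000 to $711,000 per incident
```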
Lifeline Data Centers is a colocation provider in the Midwest offering reliable data center services. As a Tier 4 data center, we have maintained 99.995% uptime in order to provide our clients with the best service possible. To learn more, contact us today.
Data centers face the challenge of clients who depend on them entirely for reliable, consistent 24×7 service. To accomplish this, data centers consume huge amounts of energy that add to business and service costs. Therefore, data centers need to take the necessary steps to save energy while ensuring uptime.
This is quite achievable if data centers adopt new power saving standards and innovations whenever they are available, while optimally maintaining current functions. Here are a few power saving tips that can help data centers save energy by implementing simple changes:
- Turning off equipment that’s not in use: A simple solution to a major problem, this is easy enough to do, yet it is often overlooked by most data centers, homes and businesses alike. Keeping equipment on standby still consumes energy, so turning it off completely is a power-saving habit worth remembering.
- The hot-aisle, cold-aisle trick: This is the simplest way a data center can conserve energy, and most data centers adopt it as a routine practice. Laying out servers in this manner not only ensures optimum power consumption, but also results in longer server lives.
- Optimum airflow management: Identifying the hot and cold spots in a data center helps in highly effective airflow management, which, in turn, results in lower power consumption to keep temperatures at required levels.
- Optimizing CPUs: A server’s CPUs are its most power-hungry components, and turning on their power management features is sure to help lower the power levels required to keep them running.
- Begin measuring power usage: If data centers do not measure the amount of energy they consume in relation to specific functions, then allocating, reallocating or optimizing the use of energy can be a difficult task. Measuring the distribution of power is the first step in putting it to optimal use. Power Usage Effectiveness, or PUE, is the vital metric here.
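PUE itself is a simple ratio: total power entering the facility divided by the power delivered to IT equipment, with 1.0 as the theoretical ideal. A minimal sketch, using hypothetical meter readings:

```python
def pue(total_facility_kw, it_equipment_kw):
    """Power Usage Effectiveness: total facility power / IT equipment power.

    1.0 would mean every watt goes to IT load; real facilities run higher
    because cooling, lighting and power distribution all draw energy too.
    """
    return total_facility_kw / it_equipment_kw

# Hypothetical readings: 1,500 kW entering the facility, 1,000 kW to IT load
print(round(pue(1500, 1000), 2))  # → 1.5
```

Tracking this ratio over time is what turns the other tips above, from airflow management to CPU power management, into measurable improvements.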
At Lifeline Data Centers, we ensure that the most innovative power saving guidelines are put to work so that energy will be used optimally. Saving as much power as possible, even in our routine activities, helps us save energy, allowing us to cut down on costs and offer competitive pricing. Contact us today for more information.
There has been a major push in reducing the energy footprint of data centers, especially with the amount of data that is accumulated each and every day. One approach is to go lean by cutting out unneeded servers and adopting other energy reducing measures wherever possible, which seems simple enough, right?
While being green is attractive from a business standpoint, many data centers are not too eager to take this route. The need for cooling and other energy-hungry functions exists for a reason, and cutting it down comes at the cost of imposing restrictions elsewhere. And while identifying and eliminating unneeded servers and other appliances may indeed cut costs, the cost of actually doing so may outweigh the savings.
Data centers unable to go lean due to their business model or any other reason should consider the following approaches to go green if they are faced with these challenges:
- Reduce consumption of fossil fuel energy and tap renewable sources, including solar photovoltaic and wind. Microsoft’s proposed Pilot Hill Wind Project, which aims to provide 675,000MWh of renewable energy per year from 2015 to power its data centers, is a sign of the changes to come. The RenewIT initiative, recently announced by the European Commission, which aims for 80% of the European data center industry to be powered by renewable and sustainable resources, is sure to provide a push in this direction as well.
- Avoid highly polluting diesel generators for backup power and use biodiesel-powered generators instead. Biodiesel significantly reduces the carbon footprint of diesel engines, but it does come with the trade-off that generators require more frequent care and cleaning.
- Upgrade equipment periodically to remove obsolete, energy-guzzling equipment. A case in point is Direct Expansion (DX) cooling systems: a decade ago the focus was on the most cost-efficient system, while today the primary focus is on the COP (Coefficient of Performance), the ratio of energy moved to the energy used to move it.
- Use recycled equipment as long as possible. This greatly reduces the carbon footprint associated with manufacturing the equipment, even if there are no major operational savings.
- Adopt energy-efficient architectural approaches, especially green innovations in ventilation and air conditioning, with the overall aim of minimizing the data center’s impact on the environment.
Green data centers are not just an environmentally friendly move; they are a sound business proposition. Most of the time, the investment to “go green” earns back its cost and starts to generate additional savings very shortly after implementation.
Lifeline Data Centers prides itself on energy efficient practices. If you’re interested in learning more, schedule a tour of our facility today.
Data centers have always dealt with the challenge of high power usage and consumption. An efficient data center requires high amounts of power, without which its services and the businesses associated with it would all come to a standstill. Over the years, however, there have been plenty of thoughtful innovations that have helped data centers drastically lower their power consumption levels, but it still isn’t where it needs to be. Here is the current state of power usage in data centers today.
Compared to everyday situations, the power consumption of a data center seems colossal. At the end of 2013, data centers worldwide collectively drew about 30 billion watts of energy, comparable to the output of 30 nuclear power plants. In fact, a single data center uses more than 100 times the energy of a similarly sized building, or the equivalent of 25,000 homes in the US.
Power usage by data centers
Statistics show that in America alone, data centers used 9,900MW of power in 2012, rising to 10,560MW in 2013. No other country in the world has adopted technology the way the US has, and it therefore remains the undisputed international leader. In fact, the figures for all of Europe add up to only a little more than those for the US, standing at 12,705MW and 13,470MW for 2012 and 2013 respectively.
Between 2011 and 2012, the rise in data center power consumption clocked in at 19% and caused great concern. The rising cost of power and the detrimental effects on the environment were major factors that caused the government, as well as the public, to sit up and take notice. In 2013, however, this growth figure dropped to just 7%, signaling a wave of energy efficiency measures. Considering the comparative forecast through 2016, this translates into energy savings of at least 1,000MW.
At Lifeline Data Centers, we understand the concerns related to power usage and reiterate our commitment to optimized energy use through every innovation we adopt. After all, power savings are great for the environment and our business, too! Schedule a tour to learn more about our energy efficient techniques.
Every data center security manager has nightmares about their network being attacked. But some have more peace of mind because they feel confident in their data center infrastructure and because they have every inch of their network covered with the right degree of protection in place.
Part of covering that network requires scanning tools that tell you about vulnerabilities in the data center. Here are a few free network scanning tools that could be valuable in your data center.
- OpenVAS: One of the more sophisticated network scanning tools, this scanner can run as many as 33,000 Network Vulnerability Tests (NVTs). It runs on Linux systems and also has a client for Windows-based systems. Although it is not the simplest of tools to use, it is one of the most comprehensive and offers great value in terms of its test coverage, providing both simultaneous and scheduled scans.
- Retina CS Community: The Retina tools cover mobile devices, private clouds, servers and applications, hosted primarily in Windows environments. They also cover third-party software such as Firefox and Adobe applications and can scan up to 256 IPs for free. This set of tools offers both scanning (Retina Network Community) and patching (Retina CS Community). The only limitation some users may come across is that these tools need to be installed on Windows Server.
- Qualys FreeScan: This is a simple tool that administrators of small setups can use to scan a maximum of 10 URLs or IP addresses, either local or Internet-facing. It can identify network-related vulnerabilities, hidden malware and SSL issues. It also checks your computer settings for compliance with the Security Content Automation Protocol (SCAP) formulated by the National Institute of Standards and Technology (NIST).
- Microsoft Baseline Security Analyzer (MBSA): This is a simple, Windows-only tool that can scan for missing patches, weak passwords, SQL administration vulnerabilities and security misconfigurations. It provides a detailed post-scan report on the vulnerabilities discovered and potential fixes for each. This easy-to-use tool looks at general security risks, but it does not cover complex areas such as Windows settings, network issues and non-Microsoft software.
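None of these tools reduces to a snippet, but the core idea behind the network-scanning portion, probing hosts for ports that accept connections, can be sketched in a few lines of Python using only the standard library (the host and port list here are placeholders; scan only systems you are authorized to test):

```python
import socket

def open_ports(host, ports, timeout=0.5):
    """Return the subset of `ports` accepting TCP connections on `host`."""
    found = []
    for port in ports:
        with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as s:
            s.settimeout(timeout)
            if s.connect_ex((host, port)) == 0:  # 0 means the connect succeeded
                found.append(port)
    return found

# Probe a few common service ports on the local machine
print(open_ports("127.0.0.1", [22, 80, 443, 3306]))
```

Real scanners layer vulnerability signatures, service fingerprinting and reporting on top of this kind of probe, which is why the dedicated tools above are worth the setup effort.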
Addressing network vulnerabilities is an important aspect of a healthy data center. It is meticulous work requiring a great deal of effort, but the rewards are worth it. Recognizing this, Lifeline Data Centers offers the best network security solutions in our colocation data centers. If you are looking for a secure, optimized data center for your business needs, get in touch for a free consultation.
Since 2001, Lifeline Data Centers has helped companies improve uptime and control data center facilities operating expense. Lifeline is an innovator in wholesale colocation, continually finding ways to reduce downtime risks while driving down costs. Our approach is simple: delight customers with flexible, cost-effective data center floor space, office space, and services.
Lifeline Data Centers is a wholesale colocation facility; a high-tech landlord. We provide data center and office real estate to companies that require uptime, connectivity, and room for growth. Lifeline provides secure, hardened data center buildings, highly reliable power and cooling, and access to many telecommunications providers. Some clients choose to use Lifeline purely as a landlord, fully managing their own information technology infrastructure. Other clients use Lifeline’s colocation facilities and office space along with Lifeline’s managed services to augment their IT staff.
Lifeline Data Centers serves hundreds of companies in health care, software, utilities, pharma, cloud computing, and government. If you value uptime, consider Lifeline Data Centers' flexible wholesale colocation and office space solutions.