
3 ways organisations can reduce their data centre costs

By Mak Chin Wah, General Manager, Enterprise Solutions, Dell, Asia South

In today’s globalized, tech-centric economy, data centers play an important role in driving business outcomes. However, building, managing and sustaining a data center involves significant costs. Businesses need to keep up with capacity demand and rein in an ever-expanding carbon footprint, all while keeping close tabs on the total cost of ownership of their data centers.

As trends like big data and the Internet of Things continue to gain traction in Asia Pacific, the regional data center market is not expected to slow down any time soon. In the years ahead, digital content is only likely to increase – doubling every 18 months, according to market research firm IDC.

Power consumption will continue to be an area of concern as non-renewable energy resources diminish and electricity costs rise. Data center costs are rising rapidly as a result, and CIOs must balance the price of innovation against business requirements. The long-term implications of an inefficient, expensive data center can be significant: any malfunction can threaten the company’s operations and hurt its overall bottom line.

Adopting some simple strategies can help companies manage their data center costs.

1. REDUCING COOLING COSTS THROUGH ECONOMIZATION

Cooling accounts for a significant portion of a typical data center’s energy consumption because servers generate heat continuously, so IT managers should think long term and choose infrastructure that generates less heat. Modern data centers also make use of economization, which relies on outside climate conditions rather than mechanical cooling such as air conditioning, saving money on energy and cooling costs. There are two primary forms of economization: air-side and water-side. Air-side economization brings outside air directly into the data center as the primary source of cool air. Water-side economization uses an air-to-water heat exchanger to chill water, which is then brought into the data center for a second heat exchange that cools the data center air.
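To illustrate the idea, here is a minimal Python sketch of an economizer-mode decision: it prefers outside air when conditions allow, falls back to water-side economization, and only then to mechanical cooling. The temperature and humidity thresholds are hypothetical placeholders, not values from any particular vendor’s cooling system.

    # Illustrative sketch only: a simplified economizer-mode decision, not any
    # vendor's building-management API. The thresholds below are hypothetical.

    def choose_cooling_mode(outside_temp_c, outside_rh_pct, supply_setpoint_c=24.0):
        """Pick a cooling mode for the current outside-air conditions."""
        # Air-side economization: outside air is cool and dry enough to feed
        # directly into the cold aisle.
        if outside_temp_c <= supply_setpoint_c - 3 and outside_rh_pct <= 60:
            return "air-side economizer"
        # Water-side economization: too humid for direct outside air, but still
        # cool enough to reject heat through an air-to-water heat exchanger.
        if outside_temp_c <= supply_setpoint_c - 5:
            return "water-side economizer"
        # Otherwise fall back to mechanical (compressor-based) cooling.
        return "mechanical cooling"

    if __name__ == "__main__":
        for temp, rh in [(15.0, 45.0), (18.0, 80.0), (30.0, 70.0)]:
            print(f"{temp:.0f} C / {rh:.0f}% RH -> {choose_cooling_mode(temp, rh)}")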

2. SERVER VIRTUALIZATION

Data center operators typically install several physical servers to run the various applications in their organization. As a result, most servers run at a low “utilization rate”, the fraction of total computing resources engaged in useful work. Server virtualization gives organizations a way to consolidate servers by running multiple workloads on a single physical host.

A “virtual server” is a software implementation that executes programs like a real server. The advantage of server virtualization is that multiple virtual servers can run simultaneously on one physical host, increasing the utilization rate of the data center. This also reduces the number of servers required, which in turn decreases the heat generated and the electricity consumed for cooling.

Beyond consolidation, virtualization improves scalability, reduces downtime, speeds up disaster recovery and enables faster deployments. It can also move entire systems from one physical server to another for workload optimization or system maintenance without any downtime, and some virtualization solutions add load balancing, failover capabilities and built-in resiliency features. These benefits have made virtualization commonplace in large data centers.
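As a rough illustration of the utilization argument, the following Python sketch estimates how many physical hosts a group of lightly used servers could be consolidated onto as virtual machines. The per-server utilization figures and the 70 percent target ceiling are hypothetical, not measurements from any real environment.

    # Illustrative sketch only: back-of-the-envelope server consolidation.
    # The utilization numbers and the 70% target ceiling are hypothetical.

    import math

    def hosts_needed(avg_utilizations, target_utilization=0.70):
        """Number of equally sized hosts needed to carry the combined load."""
        total_load = sum(avg_utilizations)   # combined demand, in "whole hosts"
        return math.ceil(total_load / target_utilization)

    if __name__ == "__main__":
        # Ten physical servers, each averaging 10 to 20% utilization.
        current = [0.10, 0.12, 0.15, 0.20, 0.10, 0.18, 0.12, 0.15, 0.10, 0.14]
        print(f"Physical servers today: {len(current)}")
        print(f"Virtualization hosts needed: {hosts_needed(current)}")

In this example, ten servers averaging 10 to 20 percent utilization fit comfortably onto two virtualization hosts, which is where the savings in hardware, heat and cooling come from.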

3. BETTER MANAGEMENT OF DATA STORAGE

  • Software-defined storage: Software-defined storage (SDS) offers flexibility and reduces the overall cost of storage.

  • Automated tiering of storage: Data stored in the data center slowly loses its utility as it ages, yet it still has to be retained for the organization’s records. Because such data is not used on a daily basis, it can be shifted to lower-performance drives to save energy, and storage tiering can be automated to simplify the process further. The use of automated tiering grew from 13% in 2011 to 20% in 2014 as more IT administrators saw the value of relieving themselves of manual data tiering; a minimal sketch of an age-based tiering pass follows this list.
  • Flash at the price of disk: Enterprise flash offerings are now available at the price of traditional disk. Flash speeds up the retrieval of information and is less susceptible to damage, making it an effective way for organizations to increase the performance of their storage architecture.
  • Deduplication: Data deduplication, the process of finding and eliminating duplicate pieces of data stored across different data sets, is said to reduce data storage needs by as much as 90 percent. It shrinks the data footprint and cuts costs for hardware, software, power and data center space; a minimal deduplication sketch also follows this list.
  • Convergence: Converged infrastructure helps minimize compatibility issues and simplifies the management of servers, storage systems and network devices. Storage plays a critical role in reducing costs within a data center, and with advanced technologies such as these, organizations can change how they manage data storage at a lower cost.
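To make the automated tiering point concrete, here is a minimal Python sketch of an age-based tiering pass that demotes files not read in the last 90 days to a lower-performance tier. The directory paths and the 90-day cutoff are hypothetical, and real arrays typically tier at the block level automatically rather than per file.

    # Illustrative sketch only: file-level, age-based tiering. Paths and the
    # 90-day cutoff are hypothetical; production arrays tier blocks automatically.

    import os
    import shutil
    import time

    HOT_TIER = "/srv/tier0-flash"      # hypothetical high-performance tier
    COLD_TIER = "/srv/tier2-nearline"  # hypothetical low-power, high-capacity tier

    def tier_down(hot_root, cold_root, max_idle_days=90):
        """Move files not accessed within max_idle_days to the cold tier."""
        cutoff = time.time() - max_idle_days * 24 * 3600
        for dirpath, _, filenames in os.walk(hot_root):
            for name in filenames:
                src = os.path.join(dirpath, name)
                if os.stat(src).st_atime < cutoff:      # not read recently
                    dst = os.path.join(cold_root, os.path.relpath(src, hot_root))
                    os.makedirs(os.path.dirname(dst), exist_ok=True)
                    shutil.move(src, dst)               # demote to the cold tier

    if __name__ == "__main__":
        tier_down(HOT_TIER, COLD_TIER)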
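The deduplication point can be illustrated the same way: the sketch below splits data into fixed-size blocks, fingerprints each block with SHA-256 and stores every unique block only once. The 4 KiB chunk size and the sample “backups” are hypothetical; production deduplication engines typically use variable-size chunking and persistent indexes.

    # Illustrative sketch only: fixed-size block deduplication with SHA-256
    # fingerprints. The chunk size and the sample data are hypothetical.

    import hashlib

    CHUNK = 4096  # bytes per block

    def deduplicate(datasets):
        """Split each dataset into blocks, keeping each unique block only once."""
        store = {}      # block hash -> block contents (stored once)
        recipes = []    # per-dataset sequence of block hashes
        for data in datasets:
            recipe = []
            for offset in range(0, len(data), CHUNK):
                block = data[offset:offset + CHUNK]
                digest = hashlib.sha256(block).hexdigest()
                store.setdefault(digest, block)     # duplicates are dropped here
                recipe.append(digest)
            recipes.append(recipe)
        return store, recipes

    if __name__ == "__main__":
        monday = b"A" * 40960                      # 10 identical blocks
        tuesday = b"A" * 36864 + b"B" * 4096       # 9 shared blocks plus 1 new one
        store, recipes = deduplicate([monday, tuesday])
        logical = sum(len(r) for r in recipes) * CHUNK
        physical = len(store) * CHUNK
        print(f"{logical} bytes of backups stored as {physical} bytes of unique blocks")

In this toy example, two nearly identical backups shrink to roughly a tenth of their logical size, which is the kind of reduction the 90 percent figure above refers to.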

Taken together, these three approaches can keep costs optimized while meeting the performance demands of the modern data center. Cost-effectiveness in a data center does not have to mean reduced efficiency or a heavier impact on the environment. In fact, employed wisely, an efficient data center can mean higher business productivity and continuity.



