The pandemic fast-tracked the evolution of many businesses into digitally native organisations, forcing enterprises to adopt new ways of working and aggressively upgrade their IT foundations. Demand for data centers as essential infrastructure emerged as a landmark trend, standing shoulder-to-shoulder with the announcement of the 5G spectrum among other technological disruptions. While several reports claim that data centers will double or triple in capacity in the next two to three years, one thing is certain: they will continue to grow rapidly over the next decade. This begs the question: what are the key components of running a data center?
As per a recent report, energy accounts for up to 40% of a data center's running cost. Power is the common denominator of any infrastructure project, and in a data center it plays a vital role in ensuring sustainable and reliable operations.
Global IT companies today have realized the role of energy efficiency and PUE (power usage effectiveness) in running data centers efficiently, and have therefore incorporated energy-saving mechanisms, with some now boasting among the lowest PUE scores in the industry. PUE measures how efficiently a data center consumes power: the lower the PUE score, the higher the data center's energy efficiency.
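As a concrete illustration, PUE is simply the total energy drawn by the facility divided by the energy consumed by the IT equipment itself. A minimal sketch, using hypothetical figures rather than numbers from any cited report:

```python
def pue(total_facility_kwh: float, it_equipment_kwh: float) -> float:
    """Power usage effectiveness: total facility energy / IT equipment energy.

    A PUE of 1.0 would mean every watt reaches the IT equipment;
    real facilities run above that because of cooling, power
    conversion losses, lighting and so on.
    """
    if it_equipment_kwh <= 0:
        raise ValueError("IT equipment energy must be positive")
    return total_facility_kwh / it_equipment_kwh

# Hypothetical month: 1.5 GWh total draw, 1.0 GWh consumed by IT gear
print(round(pue(1_500_000, 1_000_000), 2))  # 1.5 -- the other 0.5 GWh is overhead
```

The closer this ratio gets to 1.0, the smaller the share of energy spent on overhead rather than computing.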
Consolidating or migrating old, redundant and legacy IT infrastructure and networks onto a new, unified network has been one of the ways data center companies can stay efficient and reduce high energy costs.
The way forward
I believe that while these tried and tested techniques have already helped companies deliver better energy efficiency, we need to leverage more renewable energy to see long-term results. Cooling accounts for a large share of energy consumption; if we can leverage the environment to reduce cooling dependencies, we can save millions in energy costs. Incorporating new-age technologies such as AI and ML to regulate a data center's cooling systems to match the outside weather and temperature has also helped check the high energy costs of data center cooling.
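The core idea of weather-aware cooling can be sketched very simply. Production systems use ML models trained on facility telemetry, but a minimal rule-of-thumb version with hypothetical thresholds looks like this:

```python
def cooling_mode(outside_temp_c: float, setpoint_c: float = 24.0,
                 free_cooling_margin_c: float = 4.0) -> str:
    """Pick a cooling mode from the outside air temperature.

    Hypothetical rule of thumb: when outside air is comfortably below
    the cold-aisle setpoint, use it directly ("free cooling") instead
    of running energy-hungry mechanical chillers.
    """
    if outside_temp_c <= setpoint_c - free_cooling_margin_c:
        return "free-cooling"      # economizer: outside air does the work
    elif outside_temp_c <= setpoint_c:
        return "partial-chiller"   # blend outside air with mechanical cooling
    return "full-chiller"          # too warm outside: mechanical cooling only

print(cooling_mode(12.0))  # free-cooling
print(cooling_mode(22.0))  # partial-chiller
print(cooling_mode(30.0))  # full-chiller
```

An ML-driven controller replaces the fixed thresholds with predictions learned from weather forecasts and sensor data, but the energy-saving principle is the same: run the chillers only when the environment cannot do the job.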
Another tool that has disrupted the data center industry and its energy efficiency is edge computing:
- One of the key deployments of edge computing is Autoscale technology. Using the edge, Autoscale reduces the number of servers enterprises need during lean or dormant phases of activity. Autoscale uses AI-powered tools to automate deployment decisions, deciding whether an AI workload should run on a device, in the cloud or in a private cloud. This delivers both cost savings and energy efficiency.
- Another significant area where edge computing helps is the amount of data traversing the network. High-bandwidth applications that consume large volumes of data can be monitored and curtailed using edge computing. By running applications closer to the user, data can be stored and processed at the edge instead of relying on a centralized data center. This saves on high energy costs and also leads to lower latency and a better bandwidth experience for the user.
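The autoscaling idea described above can be sketched as a sizing rule: match the active fleet to current demand, power down the rest during lean periods. This is a minimal illustration with hypothetical capacity and headroom figures, not any vendor's actual algorithm:

```python
import math

def servers_needed(current_load_rps: int, capacity_per_server_rps: int = 500,
                   min_servers: int = 2, headroom: float = 0.2) -> int:
    """Hypothetical autoscaling rule: size the fleet to current demand.

    Keeps a 20% headroom buffer for traffic spikes and a small floor of
    warm servers; everything beyond the target can be powered down
    during lean periods, saving energy.
    """
    target = math.ceil(current_load_rps * (1 + headroom) / capacity_per_server_rps)
    return max(target, min_servers)

print(servers_needed(10_000))  # peak traffic: 24 servers
print(servers_needed(400))     # lean period: only the floor of 2 stays warm
```

Real autoscalers add predictive models and cooldown timers to avoid thrashing, but the energy case is the same: idle servers still draw significant power, so shrinking the active fleet during dormant phases cuts consumption directly.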
However, one idea that has echoed over the past few years is the concept of a green data center: a facility that is completely sustainable and powered by 100% renewable energy. This would address the industry's carbon footprint and contribute to tackling climate change. Today, EPC/infrastructure development partners and IT companies are vocal and deliberate in their approach to sustainability while maintaining profitability, and this can only be driven by investing in clean and renewable energy.
To conclude, I would reiterate that energy plays a huge role in shaping a data center, and with the right sustainability practices and the deployment of technologies such as AI, ML and edge computing, enterprises will be able to manage their energy efficiency successfully.
Ankit Saraiya is the Director and Head of the Data Center Vertical at Techno Electric & Engineering