Everyone knows that data centers are essential for global connectivity. Digital content, whether it's videos, blog posts, or financial transactions, needs to be stored and shared. Data centers work around the clock to provide on-demand access to the world's information. But do you know how big these facilities have become? That their energy consumption can rival that of small cities? Here are fascinating data center facts that may sound unbelievable to you.
A data center is an infrastructure made up of a network of servers. It can be used internally or externally by companies to organize, process, store, and warehouse large amounts of data. Many types of businesses rely heavily on the applications, services, and data contained in a data center.
Data centers contain several components to serve different purposes:
Networking refers to the interconnections between the components of a data center and the outside world: routers, firewalls, application delivery controllers, switches, and so on.
An organization's data is stored in data centers, which requires physical space. Most storage devices today are solid-state drives (SSDs), although tape drives are still favored for their cost-effectiveness, particularly for large volumes of data that aren't accessed regularly.
Compute refers to the processing power and memory needed to run applications. It is provided by powerful computers that run vast numbers of operations simultaneously.
The term server refers to the role played by a piece of computer hardware intended to provide services to clients, whether over the Internet or over an intranet (a closed internal network).
Servers have their own operating systems, sized according to the computing power required of them. Some features are shared between the server and its operating system: identity and access controls, proxy or firewall functions, and protocols such as DHCP.
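The client/server role described above can be illustrated with a minimal sketch: a server listens on a port and answers whatever a client sends it. This is a toy echo server, not the software any real data center runs, but the request/response pattern is the same one data center servers perform at scale.

```python
import socket
import threading

def run_echo_server(host="127.0.0.1"):
    """Minimal TCP server illustrating the client/server role:
    it listens on a port and returns whatever a client sends."""
    srv = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
    srv.bind((host, 0))  # port 0: let the OS pick a free port
    srv.listen(1)

    def handle():
        conn, _ = srv.accept()
        with conn:
            data = conn.recv(1024)
            conn.sendall(data)  # echo the request back to the client
        srv.close()

    threading.Thread(target=handle, daemon=True).start()
    return srv.getsockname()[1]  # the port the server is listening on

# A client (e.g. an intranet user) connects and receives the service:
port = run_echo_server()
with socket.create_connection(("127.0.0.1", port)) as client:
    client.sendall(b"ping")
    reply = client.recv(1024)
print(reply)  # b'ping'
```

In a real data center, thousands of such listening processes, each behind firewalls and load balancers, serve requests concurrently.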
In the digital age and with the explosion of Big Data, data centers have become essential infrastructures and represent strategic issues for countries. Here's a list of various facts and data center statistics you might find interesting:
Although computers have existed since the 1940s, it was in the 1990s that the term "data center" became commonly used in technology. This coincided with the availability of inexpensive networking equipment and the expansion of IT operations, driven by the rapid growth of the internet.
About one-third of all data passes through the cloud.
A sizable proportion of data centers are clustered around large urban centers. More than 2,600 of them are located in the world's 20 largest metropolises.
The typical data center can be operated by a team of 30 employees or fewer.
Data centers are heavy consumers of power and therefore generate a lot of heat. Large operators prefer to establish their businesses in electricity-producing countries that have colder climates, which allows them to naturally cool their facilities.
California is home to the largest concentration of data centers in the United States, with over 300 sites.
The typical data center consumes 100 times more electricity than a large commercial building, while a large data center consumes the equivalent of a small North American town.
There are more than 7 million data centers in the world.
The countries with the most data centers in the world are the US, the UK, Germany, China, the Netherlands, Australia, Canada, France, Japan, and Russia.
Data centers use high-performance in-flight encryption, which fully protects your data from the time it leaves a data center until it arrives at the destination data center.
The 600,000 square meters data center located in Langfang, China is the largest in the world.
Nearly 40% of data centers' total operating costs comes from the energy needed to power and cool the enormous amount of equipment they require to function.
Data centers are responsible for 1% of global greenhouse gas emissions.
8% of new data centers use green energy to power their operations.
Obviously, the operating costs of a data center will vary according to its size. Generally, the annual cost of running a large data center in the early 2020s is between $10 and $25 million. But operational costs tend to break down in similar proportions across the industry, with a little less than half of the budget spent on software, hardware, disaster recovery, electricity, and networking. Another large portion of the budget is dedicated to maintaining the infrastructure and applications. It should be noted that cooling and air conditioning represent 30 to 40% of a data center's energy cost.
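The proportions above can be turned into a back-of-the-envelope budget sketch. All figures here are illustrative assumptions (a $20M budget within the cited $10-25M range, and an assumed split matching the "a little less than half" / "another large portion" description), not numbers from any real operator:

```python
# Hypothetical annual budget for a large data center (illustrative only).
annual_budget = 20_000_000  # USD, within the $10-25M range cited above

# Assumed split, consistent with the proportions described in the text:
shares = {
    "software, hardware, DR, electricity, networking": 0.45,
    "infrastructure and application maintenance": 0.35,
    "other (staffing, facilities, etc.)": 0.20,
}

for item, share in shares.items():
    print(f"{item}: ${annual_budget * share:,.0f}")

# Cooling is 30-40% of the *energy* cost. Assuming $5M/year on electricity:
electricity = 5_000_000
cooling_low, cooling_high = electricity * 0.30, electricity * 0.40
print(f"cooling: ${cooling_low:,.0f} to ${cooling_high:,.0f}")
```

Even with rough inputs, this kind of breakdown makes it clear why cooling efficiency is such a large lever on total operating cost.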
This data center, located in China, is the largest in Asia, with an area of 660,000 square meters. It operates under an alliance with IBM and runs on one of the most advanced cloud platforms in the world. The facility stores cloud data in addition to housing the physical hardware of the complex.
Data centers consume a huge amount of energy. In a global context marked by accelerating climate-related problems, data center operators must react quickly. It is important to find efficient solutions to reduce the ecological footprint of their digital infrastructures and activities.
An increase in energy demand implies an increase in fuel use, and an increase in the use of fossil fuels to generate electricity leads to a rise in carbon emissions. According to a recent energy efficiency report released by the US Senate, the digital sector is currently responsible for about 4% of global greenhouse gas emissions, a quarter of which is attributable to data centers. The rapid growth of digital uses and services, such as cloud computing, thus has significant consequences for the environment.
To reduce the energy consumption and greenhouse gas emissions generated by these data centers, some companies are adopting more environmentally friendly solutions:
Supplying data centers with renewable energy, in particular geothermal or hydraulic power. A 2018 report co-authored by Johan Falk, a researcher at the Stockholm Resilience Centre, notes that switching to renewable energy alone would allow the digital industry to halve its carbon footprint.
More commonly called “free cooling”, this method uses the temperature difference between the inside and outside of a building to assist the cooling system: cool outside air is drawn in to chill the servers, and the heat they produce is evacuated, reducing the need for mechanical refrigeration. Some companies choose more radical solutions, such as building data centers in cold countries.
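The decision logic behind free cooling can be sketched as a simple controller: when outside air is cold enough, use it directly; when it is close to the target temperature, mix it with chilled air; otherwise fall back to mechanical cooling. The thresholds below are illustrative assumptions, not values from any real building-management system:

```python
def cooling_mode(outside_temp_c, supply_setpoint_c=18.0, margin_c=2.0):
    """Simplified free-cooling decision, as a controller might make it.
    Setpoint and margin are illustrative, not from a real system."""
    if outside_temp_c <= supply_setpoint_c - margin_c:
        return "free cooling"          # outside air is cold enough on its own
    elif outside_temp_c <= supply_setpoint_c:
        return "partial free cooling"  # blend outside air with chilled air
    else:
        return "mechanical cooling"    # compressors/chillers required

print(cooling_mode(5.0))   # free cooling
print(cooling_mode(17.0))  # partial free cooling
print(cooling_mode(30.0))  # mechanical cooling
```

This is also why cold-climate locations are attractive: the more hours per year the outside temperature stays below the setpoint, the more often the cheapest mode applies.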
Data centers generate a large amount of heat. This heat can be recovered and used in nearby buildings or infrastructure, thus reducing the carbon footprint of a specific geographical area. A great example of this practice is the Butte-aux-Cailles swimming pool in Paris, which is equipped with a digital boiler. Its basement houses servers that transfer their heat to the indoor and outdoor pools, covering between 8 and 10% of the facility's consumption needs. A similar scenario could be imagined for a large commercial office building in a cold climate.
How long is a piece of string? After all, a small company’s intranet could run on a single server, while search engines require hundreds of thousands. Google, for example, uses more than 900,000 servers spread across dozens of data centers around the world to respond to an average of 3 billion daily requests from Internet users. Ultimately, the number of servers depends on how many users are likely to use the service simultaneously. That said, many data centers house several hundred servers, since they must handle huge quantities of data and ensure redundancy.
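The sizing logic above can be sketched as back-of-the-envelope capacity planning: divide peak concurrent users by the load one server can handle, then add headroom for redundancy. The per-server capacity and redundancy factor below are illustrative assumptions, not benchmarks from any real deployment:

```python
import math

def servers_needed(peak_concurrent_users, users_per_server, redundancy_factor=1.5):
    """Rough sizing: servers required at peak load, with redundancy headroom.
    All input figures are illustrative."""
    base = peak_concurrent_users / users_per_server
    return math.ceil(base * redundancy_factor)

# A small intranet: one server suffices even with headroom.
print(servers_needed(200, 500))          # 1
# A busy consumer service: hundreds of servers.
print(servers_needed(1_000_000, 2_000))  # 750
```

The redundancy factor is the key design choice: it wastes capacity in normal operation but lets the service survive hardware failures and maintenance windows without degrading.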
Faced with the ever-increasing number of devices connected to the Internet, the demand for storage and computing capacity has increased exponentially. How do tech giants manage to meet this demand? With Hyperscale Data Centers.
As the name suggests, the primary goal of hyperscale data centers is massive computing capacity, typically for big data, cloud services, and large data-producing companies such as Google, Facebook, or Amazon. Hyperscale IT infrastructure is designed for horizontal scalability and delivers high levels of performance, throughput, and redundancy, guaranteeing fault tolerance and high availability. Hyperscale computing often relies on highly scalable server architectures and virtual networking.
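Horizontal scalability, mentioned above, means adding more identical machines rather than buying a bigger one. A minimal sketch of the idea is hash-based routing: each request key maps deterministically to one of N interchangeable servers, so capacity grows simply by raising N. The server names and routing scheme here are hypothetical illustrations, not how any particular operator shards traffic:

```python
import hashlib

def shard_for(key, num_servers):
    """Route a key to one of N identical servers. Adding servers adds
    capacity without needing bigger machines (horizontal scaling)."""
    digest = int(hashlib.sha256(key.encode()).hexdigest(), 16)
    return digest % num_servers

servers = [f"server-{i}" for i in range(8)]  # hypothetical fleet
for user in ["alice", "bob", "carol"]:
    print(user, "->", servers[shard_for(user, len(servers))])
```

Real hyperscale systems use more sophisticated schemes (such as consistent hashing, so that adding a server remaps only a fraction of keys), but the principle is the same: scale out, not up.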
There are a number of reasons why a business might choose a hyperscale data center. This type of data center may offer the best, if not the only, way to achieve a specific goal, such as providing cloud services. But in general, it comes down to money: hyperscale solutions are the most cost-effective option for handling a demanding set of operations.