Three Directions of Data Center Technology Development in 2020

2019-12-26

Given how guarded the data center industry is (the confidentiality of critical infrastructure, non-disclosure agreements, and so on), making specific predictions carries considerable risk. But through conversations with vendors and analysts, one can identify some of the directions in which data center technology is heading.


The following sections focus on three directions in data center technology that research firms expect to take hold in 2020 and beyond, and that they consider highly significant. First, machine learning and operational data collection are opening new possibilities for intelligent data center management tools. Second, power and cooling technologies are refocusing on power density, driven by machine learning workloads and the need to deploy computing infrastructure at the edge. Third, the pace of data center technology development may one day make diesel generators as a backup power source for data centers a thing of the past.


1. Data-driven data center management


For years, large vendors have discussed adding predictive analytics to data center management tools (i.e., DCIM software), while smaller companies such as Nlyte and Vigilent have brought predictive tools to market.


Among them, Schneider Electric and Vertiv, two large suppliers, said in December last year that they had collected enough operational data from customer equipment to begin launching viable predictive features.


"We have a very large data pool with billions of rows of data, and we think this is very important. We can start to change the way we provide services and solutions and become more predictive," said Steve Lalla, executive vice president of services at Vertiv, who added that the company had begun rethinking its service-level agreements (SLAs) around these capabilities.


Vendors continuously collect data from customer systems through their monitoring software (on-premises and, increasingly, SaaS). Lalla said that over time the data has become more standardized and organized, making it more useful for analytics.


Kevin Brown, Schneider Electric's senior vice president of innovation and CTO, said the company's push to build predictive data center management capabilities and deliver them as software as a service (SaaS) began three years ago.


"Now we have enough data in the cloud to start rolling out predictive analytics; more sophisticated battery-aware models and machine learning algorithms are no longer theoretical. These products will be released this quarter," he said.


He said Schneider is currently collecting data from 250,000 to 300,000 devices deployed in customer data centers. The company hired a dedicated team of data scientists, and once it had data from about 200,000 devices, the team began to feel confident about the accuracy of some of its algorithms: confident enough, for example, to predict when a UPS battery might fail. Schneider wants to collect still more data for such tasks. "The more powerful the algorithm, the more data it needs," he explained. "The bar will keep rising, depending on how sophisticated the user wants the algorithm to be."
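The article does not describe Schneider's actual model, but the idea of scoring UPS battery health from fleet telemetry can be sketched very simply. The features, weights, and thresholds below are illustrative assumptions, not anything a vendor has published; a production system would learn them from the hundreds of thousands of monitored devices mentioned above.

```python
# Hypothetical sketch: scoring UPS battery failure risk from telemetry.
# All feature names, weights, and thresholds are illustrative assumptions.

def battery_failure_risk(resistance_ratio: float, temp_c: float, age_months: int) -> float:
    """Return a 0..1 risk score for a UPS battery string.

    resistance_ratio: measured internal resistance / nameplate resistance
                      (rising internal resistance is a classic end-of-life signal)
    temp_c:           average operating temperature in Celsius
    age_months:       months in service
    """
    score = 0.0
    # Internal resistance well above baseline is a common replacement trigger.
    score += max(0.0, resistance_ratio - 1.0) * 2.0
    # Rule of thumb: battery life roughly halves for every 10 C above 25 C,
    # so weight sustained high temperature into the score.
    score += max(0.0, temp_c - 25.0) / 10.0 * 0.3
    # Simple age-based wear term.
    score += (age_months / 60.0) * 0.3
    return min(score, 1.0)

healthy = battery_failure_risk(1.02, 24.0, 12)    # young, cool, low resistance
degraded = battery_failure_risk(1.40, 35.0, 48)   # hot, aged, high resistance
```

In practice a vendor with billions of rows of telemetry would replace these hand-set weights with a trained model, but the inputs (resistance trend, temperature, age) are the standard signals.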


Andy Lawrence, executive director of research at the Uptime Institute, the data center industry organization, said in a recent webinar that the advent of machine learning has revived data center management software. The DCIM software market once looked full of promise but never showed the rapid growth many expected; despite the slow progress, it has won acceptance among users.


According to Rhonda Ascierto, vice president of research at Uptime Institute, DCIM can now be considered mainstream technology. Virtually all data centers run some kind of DCIM, whether it goes by that name or another. Most importantly, enough data center management software has been deployed, and is collecting data, that it can now be used to build machine-learning-driven predictive analytics and automation capabilities.


Growing data availability and advances in machine learning are two drivers of data center management software. But there is a third: edge computing. When users plan to deploy many small compute nodes near where data is generated, they quickly run into the problem of operating a distributed infrastructure economically. Tools like DCIM, especially when delivered as a cloud service (SaaS), are a natural fit, since remote monitoring and management can be handled from a centralized console.
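The centralized-console idea described above can be sketched in a few lines: many small edge sites report telemetry to one place, and the management layer surfaces only the exceptions. The site names, metrics, and thresholds here are made up for illustration; they are not from any vendor's product.

```python
# Illustrative sketch of a centralized edge-monitoring console: distributed
# sites report telemetry upward, and one function flags threshold breaches.
# Site names, metric names, and limits are assumptions for the example.

def flag_alerts(site_reports, temp_limit_c=30.0, min_battery_pct=50.0):
    """Return (site, reason) pairs for sites whose latest report breaches a limit."""
    alerts = []
    for site, report in site_reports.items():
        if report["inlet_temp_c"] > temp_limit_c:
            alerts.append((site, "high inlet temperature"))
        if report["ups_battery_pct"] < min_battery_pct:
            alerts.append((site, "low UPS battery"))
    return alerts

# Latest report from each (hypothetical) edge site.
reports = {
    "store-014": {"inlet_temp_c": 24.5, "ups_battery_pct": 98.0},
    "store-022": {"inlet_temp_c": 33.1, "ups_battery_pct": 97.0},
    "cell-site-7": {"inlet_temp_c": 26.0, "ups_battery_pct": 41.0},
}
alerts = flag_alerts(reports)
```

The point of the sketch is the shape of the problem: without central aggregation, each unattended site would need its own operator attention, which is exactly the cost edge deployments cannot bear.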


Steven Carlini, vice president of innovation and data centers at Schneider Electric, said, "Edge computing has become the core of Schneider Electric's infrastructure management SaaS strategy. The objection to bringing a cloud-based management system into a data center is that in many cases data needs to stay on site, and we have solved that problem. The system is indeed more valuable when deployed at scale, and the real value will be at the edge."


2. Edge computing is smaller, faster, and ubiquitous


Edge computing is putting increasing pressure on the engineers who design data center technology, who need to make data centers smaller and denser.


For example, Schneider Electric recently released its smallest micro data center to date: a 6U enclosure that can house servers, network equipment, and a UPS, and can be wall-mounted. Brown said he expects this micro data center product to generate significant revenue for Schneider in 2020.


Vertiv updated its power portfolio in 2019, introducing a series of higher-power-density UPS units. Quirk said that among all the company's products, the rack-mounted GXT5 series UPS was designed with the needs of edge computing fully in mind; its power range runs from 500VA to 10kVA (some models support 208V, and some support both 208V and 120V).


Edge computing was also an important consideration behind the partnership that Schneider, the immersion cooling technology company Iceotope, and the electronics distributor and IT integrator Avnet announced this October.


Iceotope's approach is neither to immerse servers in tanks of liquid coolant nor to run cooling pipes across the motherboard that deliver chilled water directly to the chip; instead, it injects the coolant into a sealed server chassis. This means the solution can be deployed in standard data center racks, and standard servers can be liquid-cooled.


The first problem immersion cooling solves is high power density. The growth of machine learning is driving adoption of GPU servers for training deep learning models, and the power density of these power-hungry GPU chips far exceeds what a standard data center design can handle. Many users still get by with air cooling; liquid-cooled rear-door heat exchangers, which cool the air directly at the rack, are the most popular way to address the problem.


Proponents of immersion cooling, however, emphasize its efficiency advantages. These solutions need no fans, which saves power. "Using liquid cooling in many environments can reduce energy consumption by at least 15%," Brown said.
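A back-of-the-envelope calculation shows what a claim like "at least 15%" could be worth. The figures below (IT load, PUE, electricity tariff) are illustrative assumptions, and the 15% is read here as applying to the non-IT overhead (cooling, fans, power losses); the article does not specify the baseline.

```python
# Rough annual-savings estimate for a hypothetical 1 MW IT load, assuming
# the quoted 15% reduction applies to the non-IT (cooling and power) overhead.
# PUE, tariff, and load figures are assumptions for illustration only.

it_load_kw = 1000.0                       # IT equipment load
pue = 1.5                                 # total facility power / IT power
facility_kw = it_load_kw * pue            # 1500 kW total draw
overhead_kw = facility_kw - it_load_kw    # 500 kW of cooling, fans, losses

savings_kw = overhead_kw * 0.15           # the quoted 15% reduction
hours_per_year = 8760
savings_kwh = savings_kw * hours_per_year # annual energy saved
savings_usd = savings_kwh * 0.10          # at an assumed $0.10/kWh tariff
```

Under these assumptions the saving is 75 kW continuous, or 657,000 kWh per year, on the order of $65,700 annually for a single 1 MW facility, which helps explain vendor enthusiasm.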


Immersion cooling also solves several problems specific to the edge. Removing components such as fans means fewer parts that can fail. Packing higher power density into a smaller footprint makes it easier to deploy edge computing facilities where space is tight. And the sealed chassis keeps out dust, which can damage IT equipment.


Analyst Ascierto said that although vendors are excited about edge computing, an Uptime Institute survey shows there is still no significant demand for edge computing capacity. To date, most demand for micro data centers of 100 kW or less has come from server rooms or remote sites where computing capacity already exists.


Ascierto does not expect edge computing demand to surge in 2020; the big wave of demand is expected after 2020, once more IoT devices and 5G wireless infrastructure are deployed.


3. The promise of better backup power


Another major shift in data center design is only just beginning and may not arrive in 2020: the replacement of diesel generators by batteries or other technologies.


As Lawrence points out, diesel generators are a persistent headache for data centers: they are costly to deploy and maintain, and they produce noise and air pollution. So far, however, they have remained an integral part of data centers, which typically operate around the clock.


Data center operators have been exploring two alternatives to diesel generators: fuel cells and batteries, with lithium-ion batteries a particularly promising technology.


Bloom Energy has deployed fuel cells in multiple data centers. An eBay data center in Utah uses Bloom Energy's fuel cells instead of diesel generators as a backup power source.


Lawrence said several Bloom Energy pilot projects aimed at replacing diesel generators have been under way since 2019, and one or two major hosting providers have been studying the approach.


As the electric vehicle industry has made great strides in increasing the energy density and reducing the cost of lithium-ion batteries, they are quickly gaining a foothold in the data center industry. They are already replacing lead-acid batteries in UPS systems, and the runtime they provide keeps growing. Schneider's Brown said lithium-ion batteries could eventually replace diesel generators.


"I don't think this transition will happen in 2020, but we will closely track it," he said.


He said the key metrics Schneider Electric is watching are the runtime of lithium-ion battery systems and their deployment cost. Two and a half years ago, lithium-ion battery systems provided about 90 minutes of runtime; now they are close to three hours.
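The runtime figures quoted above translate directly into battery sizing. The sketch below works out the usable energy those runtimes would imply for a hypothetical 100 kW load; the load and inverter efficiency are assumptions for illustration, and real sizing would also account for depth-of-discharge limits and battery aging.

```python
# Sizing sketch: usable battery energy implied by the quoted UPS runtimes,
# for a hypothetical 100 kW load. Load and efficiency are assumed figures.

def required_energy_kwh(load_kw: float, runtime_minutes: float,
                        efficiency: float = 0.92) -> float:
    """Usable battery energy needed to carry load_kw for runtime_minutes,
    accounting for an assumed inverter/conversion efficiency."""
    return load_kw * (runtime_minutes / 60.0) / efficiency

then_kwh = required_energy_kwh(100.0, 90)    # ~90-minute systems of 2.5 years ago
now_kwh = required_energy_kwh(100.0, 180)    # ~3-hour systems today
```

Doubling runtime doubles the required energy, so the jump from 90 minutes to three hours reflects both falling cost per kWh and rising energy density; to fully displace a diesel generator, runtime would need to cover the longest plausible grid outage rather than just the ride-through to generator start.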


None of these trends began only in 2019, and none will reach a decisive inflection point in 2020. They are among the major developments of 2019, are expected to accelerate in 2020, and will drive the development of related data center technologies (such as chips, networking, virtualization, and containers) in the coming years.

