10 Predictions for the Development of the Data Center Industry in 2020


As the data center industry and its technologies evolve in 2020, enterprises will need to strike a better balance between on-premises data centers and cloud computing resources, adopt artificial intelligence on their servers, and work to manage data sprawl effectively.

Industry media routinely make forecasts for the coming year: the continued rise of cloud computing, the development of SSDs, and trends such as the repatriation of workloads from cloud platforms back to on-premises data centers. Experts' predictions for the data center industry can occasionally bring surprises. With that in mind, here are 10 predictions for the industry's development in the coming year.

1. IoT boosts data center growth in urban areas

Because this has already begun to happen, it is not a difficult prediction. For a long time, data centers have been built far from cities, close to renewable energy sources (usually hydroelectric power), but demand will prompt more data center construction in urban areas. The Internet of Things will be one driving factor, and a growing number of data center providers (such as Equinix and DRT) will act as network interconnection providers.

2. The rise of network accelerators

The use of big data and various forms of artificial intelligence means that enormous amounts of data will be generated and processed, and not all of it can be handled in one place. This calls for dedicated network flow controllers that free the CPU from shuttling data around so it can focus on its main task of processing it. Expect more network accelerators (such as Mellanox's ConnectX series) to enter the market, letting the CPU handle the computation while the accelerator moves large volumes of data faster.

3. NVMe over Fabrics will grow

Non-Volatile Memory Express (NVMe) is a storage interface, much like Serial Advanced Technology Attachment (SATA). The drawback of SATA is that it was designed for HDDs, so it cannot fully exploit the speed and parallelism of SSDs. Early enterprise SSDs had a problem of their own: they could only communicate with the physical server they were installed in. But servers need storage arrays, which means network hops and latency.

NVMe over Fabrics (NVMe-oF) is an important advance. It enables an SSD in one server to communicate over the network with another drive elsewhere on the network. This direct communication is critical to improving data movement in enterprise computing and digital transformation.
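As a concrete illustration, attaching a host to a remote NVMe-oF target with the standard nvme-cli tool looks roughly like the configuration sketch below. The IP address, port, and NVMe Qualified Name (NQN) are placeholders, and the transport could equally be RDMA or Fibre Channel instead of TCP.

```shell
# Discover NVMe-oF subsystems exported by a remote target
# (192.168.1.50 and the NQN below are placeholder values)
sudo nvme discover -t tcp -a 192.168.1.50 -s 4420

# Connect to a discovered subsystem; its namespaces then appear
# locally as /dev/nvmeXnY block devices
sudo nvme connect -t tcp -n nqn.2020-01.io.example:subsys1 \
     -a 192.168.1.50 -s 4420

# Verify the remote namespace is now visible as a local drive
sudo nvme list
```

Once connected, the remote SSD behaves like a local NVMe device, which is what makes the low-latency, array-free data movement described above possible.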

4. Cheaper storage-class memory

Storage-class memory is memory that plugs into a DRAM slot and can behave like DRAM, but can also behave like an SSD. It approaches DRAM speed while offering persistent storage, effectively making it a cache for SSDs.

Intel Corp. and Micron Technology Corp. co-developed storage-class memory (SCM) products, but the two companies are no longer cooperating. Intel launched its SCM product, Optane, in May of this year, and Micron brought QuantX to market in October. South Korean memory giant SK Hynix is also developing an SCM product based on a technology different from the 3D XPoint used by Micron and Intel.

All of this should advance storage technology and, with luck, reduce prices. A 512GB Optane memory module is currently priced at around $8,000, and the Xeon processors required to use it cost even more, so assembling a complete server becomes very expensive. Advances in technology and competition should bring the price of these storage products down, making this type of memory more attractive to businesses.
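The caching role described above can be sketched in a few lines of Python: a small, fast "SCM" tier fronts a larger, slower "SSD" tier, with the most recently used items kept in the fast tier. The class, tier names, and capacity here are illustrative only, not tied to any real product.

```python
from collections import OrderedDict

class TieredStore:
    """Toy model of storage-class memory acting as a cache for an SSD."""

    def __init__(self, scm_capacity):
        self.scm = OrderedDict()   # small, fast tier (kept in LRU order)
        self.ssd = {}              # large, slow backing tier
        self.scm_capacity = scm_capacity

    def write(self, key, value):
        self.ssd[key] = value      # data always lands on the SSD tier
        self._promote(key, value)  # and is cached in the SCM tier

    def read(self, key):
        if key in self.scm:        # fast path: served from SCM
            self.scm.move_to_end(key)
            return self.scm[key]
        value = self.ssd[key]      # slow path: fetch from SSD
        self._promote(key, value)
        return value

    def _promote(self, key, value):
        self.scm[key] = value
        self.scm.move_to_end(key)
        while len(self.scm) > self.scm_capacity:
            self.scm.popitem(last=False)  # evict least recently used

store = TieredStore(scm_capacity=2)
store.write("a", 1)
store.write("b", 2)
store.write("c", 3)        # evicts "a" from the SCM tier
print("a" in store.scm)    # False: "a" now lives only on the SSD tier
print(store.read("a"))     # 1, fetched from SSD and promoted back
```

The point of the sketch is the asymmetry: every object survives on the slow tier, while the fast tier holds only the hottest working set, which is exactly how SCM is expected to sit between DRAM and flash.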

5. Artificial intelligence automation for servers

All server vendors have added artificial intelligence to their server systems, but Oracle leads in autonomous operation, from hardware to the operating system, middleware, and application stacks. Hewlett-Packard, Dell, and Lenovo will continue to make progress, but hardware-focused vendors like Supermicro will fall behind, because they offer only the hardware stack and do nothing at the operating-system level. They will also lag in storage, an area where the three major server vendors excel.

Oracle may not be a top-five server vendor, but no one can ignore its contribution to automation. Expect other vendors to keep raising their level of automation as well.

6. Slow cloud migration

Remember when many companies wanted to close their data centers and move everything to the cloud? That idea once seemed compelling. IDC's latest CloudPulse survey shows that 85% of enterprises plan to shift workloads from public to private environments in the next year, and a recent Nutanix survey found that 73% of respondents are moving some applications from the public cloud back on-premises. Security is cited as the main reason.

And because security remains a genuine concern for some companies and some data, cloud migration may slow down a bit as people become more discerning about what they store in the cloud and what stays behind the firewall.

7. Data Expansion Part 1

An IDC survey indicates that most data is not where it should be. Only 10% of company data is "hot" (accessed and used repeatedly), 30% is "warm" (used semi-regularly), and the remaining 60% sits in cold storage, rarely accessed at all.

The problem is that data is scattered everywhere and often sits in the wrong tier. Many storage companies focus on deduplication rather than on storage tiering. Spectra Logic is one company addressing this issue, and if it succeeds, HP and Dell will hopefully follow suit.
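The tiering idea behind the hot/warm/cold split can be illustrated with a short Python sketch that places objects in a tier based on how recently they were accessed. The 7- and 90-day thresholds and the file names are arbitrary examples for illustration, not values from the article or any vendor's policy.

```python
from datetime import datetime, timedelta

def assign_tier(last_access, now, hot_days=7, warm_days=90):
    """Place a data object in a storage tier based on access recency."""
    age = now - last_access
    if age <= timedelta(days=hot_days):
        return "hot"    # frequently accessed: fast flash tier
    if age <= timedelta(days=warm_days):
        return "warm"   # occasionally accessed: capacity HDD tier
    return "cold"       # rarely accessed: tape or object archive

now = datetime(2020, 1, 1)
files = {
    "orders.db":     datetime(2019, 12, 30),
    "q3_report.pdf": datetime(2019, 11, 1),
    "logs_2017.tgz": datetime(2017, 5, 4),
}
for name, ts in files.items():
    print(name, "->", assign_tier(ts, now))
# orders.db -> hot, q3_report.pdf -> warm, logs_2017.tgz -> cold
```

A real tiering engine would also weigh access frequency, object size, and cost per gigabyte, but even this crude recency rule shows how the 60% of cold data could be swept off expensive primary storage automatically.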

8. Data Expansion Part 2

IDC predicts that by 2025 the total volume of data worldwide will reach 175 ZB; it has already reached 32 ZB, most of it unused. There was a time when data warehouses classified, processed, and stored the data deemed useful. Today, people are filling data lakes with endless data from an ever-growing number of sources, such as social media and the Internet of Things.

Expect people to tire of sifting through petabytes of data-lake garbage and to become pickier about what they store. They will question the sense of spending large sums on hard drives and storage arrays to hold masses of unused, worthless data. Many will return to a data warehouse model that keeps data curated and usable, or risk drowning in it.

9. More servers mix processors

Ten years ago, it didn't much matter whether a server was a Xeon tower or a four-socket rack unit in a cabinet; they were all based on x86 processors. Now, more and more server designs incorporate onboard GPUs, Arm processors, artificial intelligence accelerators, and network accelerators.

This requires some changes to server design. First, as more chips run faster and hotter in confined spaces, liquid cooling will become increasingly necessary. Second, the software stack needs to become more robust to handle all these chips, which means more work for Microsoft and the Linux vendors.

10. IT workloads will change

Don't assume automation means IT staff will be left playing games on their iPhones. As these systems evolve, IT professionals will face many new challenges, including:

• Fight against shadow IT.

• Address digital transformation.

• Develop an artificial intelligence strategy to keep up with competitors.

• Properly respond to the impact of new artificial intelligence strategies.

• Maintain business security governance.

• Handle ever-increasing data inflows and figure out what to do with the data.

• Respond to customers, and protect the company's reputation, on social media faster than ever.
