If Cooling Facilities Cannot Be Added, How Can Operators Use Cooling Tips and Techniques to Prevent IT Equipment Such as Servers From Overheating?

2019-12-31

If there is no way to add cooling facilities, operators can use some cooling tips and tricks to prevent IT equipment such as servers from overheating. When the ambient temperature around data center cabinets and servers rises rapidly, the cooling system is often overwhelmed, and operators come under tremendous pressure to bring the equipment room temperature down quickly and effectively.


Many times, when the actual thermal load of the equipment does not significantly exceed the capacity of the cooling system, optimizing the airflow can improve the situation until a new cooling system is installed. The following tips and tricks may not solve the data center's cooling problem for the long term, but they can help in the meantime.


1
Measure the temperature at the front of the servers. This is where a server draws in cold air, so it is really the only measurement location that matters. Take readings at the top, middle, and bottom of each rack (assuming the data center uses a hot-aisle/cold-aisle layout). The top of the rack is usually the hottest. If the bottom of the rack is cold and has open rack space, try relocating servers near the bottom (or the coldest area) of the rack.
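The survey above can be sketched in a few lines of code. The helper and rack names below are hypothetical; the idea is simply to record intake readings per rack position and pick the coldest spot for a heat-sensitive server:

```python
# Hypothetical helper: given front-of-rack (intake) temperature readings
# taken at the top, middle, and bottom of each rack, find the coldest
# position, which is the best candidate spot for relocating a hot server.

def coldest_position(readings):
    """readings: dict mapping (rack, position) -> intake temperature in °F.
    Returns the (rack, position) with the lowest intake temperature."""
    return min(readings, key=readings.get)

survey = {
    ("rack-01", "top"): 84.0,
    ("rack-01", "middle"): 78.5,
    ("rack-01", "bottom"): 71.0,   # coldest reading in this survey
    ("rack-02", "top"): 88.0,
    ("rack-02", "middle"): 80.0,
    ("rack-02", "bottom"): 73.5,
}

print(coldest_position(survey))  # -> ('rack-01', 'bottom')
```

Note that only front-of-rack readings go into the survey, matching the advice that the intake side is the measurement that matters.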

2

If the temperature rises, don't panic. It does not matter much if the temperature in the cold aisle reaches 80°F. Although this is higher than the typical 70°F to 72°F data center setpoint (and staff may not like working in such conditions), it may not be as bad for the servers as people think. If the maximum reading at the front of the rack is 80°F or lower, it is still within ASHRAE TC 9.9's recommended range. Even a slightly higher intake temperature (up to 90°F) still meets the A2 "allowable" criterion, which falls within the 50°F to 95°F operating range of most servers.
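The thresholds quoted above can be expressed as a simple classification check. The boundary values are the ones given in the text (in °F); the function name is illustrative:

```python
# Classify a front-of-rack intake temperature against the envelopes
# described above: <= 80°F falls in the ASHRAE TC 9.9 recommended range,
# and 50-95°F is the A2 "allowable" range matching most servers' ratings.

def classify_intake(temp_f):
    if not 50.0 <= temp_f <= 95.0:
        return "outside A2 allowable range"
    if temp_f <= 80.0:
        return "within recommended range"
    return "allowable (A2) but above recommended"

for t in (72, 80, 90, 100):
    print(t, "->", classify_intake(t))
```

A reading of 72°F or 80°F is comfortably recommended, 90°F is still allowable, and only at 100°F does the intake fall outside the A2 envelope.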


3
Don't worry about the rear temperature. Even if it reads 100°F or higher, do not point a fan at the rear of the rack; that only mixes more hot air into the cold aisle.

4
Be sure to use blanking plates to block all unused open space at the front of the rack. This prevents hot air from recirculating from the rear of the rack to the front.

5
If your data center has a raised floor, make sure the floor grilles or perforated tiles are correctly located where the hot racks are. If necessary, rearrange or swap different floor grilles to match the airflow to the heat load. Be careful not to place a floor grille too close to the room air conditioner, as the cold airflow will immediately short-circuit back into the unit and starve the rest of the room or row of cold air.
6
Check the floor for openings beneath the cabinets. Cable cutouts in the floor let conditioned air leak out of the raised-floor plenum, reducing its pressure and diverting available cold air away from the floor vents in the cold aisle. Air-sealing grommet kits can reduce this problem.
7
If possible, try to redistribute the heat load evenly across the racks to avoid or minimize "hot spots." Keep in mind that before moving servers, you need to check the temperatures at the top, middle, and bottom of each rack, and relocate only the hotter servers (again, measured at the front of the rack) to colder areas. Then block the resulting gaps with blanking plates. Re-check all rack temperatures to make sure no new hot spots have been created.
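The "distribute evenly" advice above amounts to a balancing problem. As a rough sketch (the wattage figures are made up), a greedy strategy that places the hottest servers first into the currently coolest rack gets close to an even split:

```python
# Greedy heat-load balancing sketch: place servers, largest load first,
# into whichever rack currently has the smallest total load. This
# approximates spreading the heat evenly to avoid hot spots.
import heapq

def balance(server_watts, n_racks):
    """Return a list of per-rack server lists with roughly equal total watts."""
    racks = [(0, i, []) for i in range(n_racks)]  # (total_watts, rack_id, servers)
    heapq.heapify(racks)
    for w in sorted(server_watts, reverse=True):
        total, i, servers = heapq.heappop(racks)  # coolest rack so far
        servers.append(w)
        heapq.heappush(racks, (total + w, i, servers))
    return [servers for _, _, servers in sorted(racks, key=lambda r: r[1])]

loads = [800, 750, 600, 500, 400, 350, 300, 200]  # watts per server (illustrative)
print(balance(loads, 2))  # two racks with near-equal total heat load
```

In practice the physical constraints (rack units free, cable reach) matter too, but the principle is the same: move heat toward the racks with headroom.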
8
Inspect the rear of the rack for cables that block airflow. Blocked exhaust creates excessive back pressure on the IT equipment's fans and can cause the equipment to overheat even when there is plenty of cold air at the front. This is especially true for racks full of 1U servers with many long power and network cables. Consider using shorter (1- to 2-foot) power cords in place of the longer cords that ship with most servers, and use the shortest network cables practical. Use cable management to tidy the rear of the rack so the airflow is not obstructed.
9
If the equipment room has an overhead ducted cooling system, make sure the cold air supply vents are directly over the front of the racks and the return ducts are above the hot aisles. Experts note that in some data center rooms the ceiling supply and return vents are poorly positioned, causing the rooms to overheat simply because the cold air is not flowing directly to the front of the racks or the hot air is not being properly drawn away. The key is to ensure that hot air from the rear of the cabinets can return directly to the room air conditioner without mixing with cold air. If you have a ceiling return plenum, consider using it to capture hot air, with a duct coupling from the ceiling to the room air conditioner's return. Modest ductwork projects like this can have a direct impact on equipment room temperature: in fact, the hotter the return air, the higher the efficiency and effective cooling capacity of the room air conditioner.
10
Consider adding temporary "roll-in" (portable) cooling units only if you can dissipate their heat outside the space. Venting a portable unit's exhaust into the room's ceiling void accomplishes little; the exhaust duct must discharge into an area outside the controlled space.
11
Turn off the lights when no one is working in the data center. This can remove 1% to 3% of the electrical and heat load, which in a marginal cooling situation can lower the room temperature by a degree or two.
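A quick back-of-envelope calculation shows why the lighting tip above is worth the trouble: every watt of lighting removed is a watt the cooling system no longer has to reject. All figures below are illustrative assumptions, not measurements:

```python
# Back-of-envelope estimate of the lighting savings described above.
# Turning off lights removes both the electrical load and the matching
# heat load on the cooling system. Numbers are hypothetical.

room_load_kw = 200.0         # assumed total electrical load of the room
lighting_fraction = 0.02     # 1%-3% per the text; assume 2% here
hours_dark_per_day = 16      # hours per day with no staff in the room

lighting_kw = room_load_kw * lighting_fraction
kwh_saved_per_day = lighting_kw * hours_dark_per_day

print(f"Lighting load removed: {lighting_kw:.1f} kW")
print(f"Energy saved per day:  {kwh_saved_per_day:.1f} kWh (plus the cooling energy to reject it)")
```

Even a small percentage adds up over a year, and in a room running at the edge of its cooling capacity, that margin can matter.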
12
Check whether any idle devices are still running. This is a fairly common situation and an easy one to fix: just turn them off.
Conclusion
When the heat load of the data center completely exceeds the capacity of the cooling system, there is no real substitute for more cooling, but simply improving the airflow can sometimes raise overall cooling efficiency by 5% to 20%. That may be enough to get the data center's servers through the hottest days, and in any case, reducing data center energy costs is always a good thing.
Data center operators also need to plan ahead. If all else fails, a backup plan should be in place to shut down unimportant loads so that the most important servers (email, finance, etc.) can keep running. Shedding noncritical loads deliberately to keep critical loads cool is far better than having critical loads shut down accidentally from overheating.
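The backup plan described above can be sketched as a priority-ordered shutdown list: shed the least critical loads first until the remaining heat load fits what the cooling system can handle. The load names, wattages, and priority values below are hypothetical:

```python
# Sketch of a priority-based load-shedding plan: shut down the least
# critical loads first until the total heat load fits within the
# remaining cooling capacity. All names and numbers are illustrative.

def shed_until_fits(loads, capacity_watts):
    """loads: list of (name, watts, priority); higher priority = more critical.
    Returns (kept, shut_down) such that the kept total <= capacity_watts."""
    kept = sorted(loads, key=lambda l: l[2], reverse=True)  # most critical first
    shut_down = []
    while kept and sum(w for _, w, _ in kept) > capacity_watts:
        shut_down.append(kept.pop())  # drop the least critical remaining load
    return kept, shut_down

loads = [("email", 3000, 9), ("finance", 2500, 9),
         ("test-cluster", 4000, 2), ("batch-jobs", 3500, 3)]
kept, shed = shed_until_fits(loads, 7000)
print("keep:", [name for name, _, _ in kept])   # critical loads stay up
print("shed:", [name for name, _, _ in shed])   # shut down deliberately
```

Deciding the priority order ahead of time, in calm conditions, is the whole point: during an overheating event there is no time to debate which server matters most.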

