The Negative Impacts of Cable Temperature Rise
In last week’s blog post, titled “LP Cable: When It’s Necessary (and When It’s Not),” we talked about the rise of cables utilizing Power over Ethernet (PoE) technology – spurred on by more people and devices connecting to networks, and ever-increasing speed and bandwidth requirements.
Devices designed to connect directly to networks, such as VoIP phones, surveillance cameras and flat screens, require increased power delivery through network cables. To grow the number of devices that can be powered by PoE, the power delivered over the cable – and therefore the current flowing through its conductors – must increase, and the amount of heat generated within the network cable increases as well. Cable temperature rise that is too high can ultimately push cables beyond their rated temperatures, reducing performance and reliability (and causing potential damage to the cable itself).
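The link between PoE current and heat is simple resistive (I²R) dissipation in the copper conductors. As a rough illustration only – the current and DC-resistance figures below are assumed example values, not Belden specifications – the heat generated per meter of a 4-pair cable can be sketched like this:

```python
# Illustrative sketch of resistive (I^2 * R) heating in a 4-pair network cable.
# The current and resistance values used below are assumptions for illustration,
# not figures from any specific cable datasheet or PoE standard table.

def heat_per_meter(current_per_conductor_a: float,
                   resistance_ohm_per_m: float,
                   conductors: int = 8) -> float:
    """Watts of heat dissipated per meter of cable (all conductors combined)."""
    return conductors * current_per_conductor_a ** 2 * resistance_ohm_per_m

# Example: roughly 0.3 A per conductor with 24 AWG copper at about
# 0.084 ohm per meter per conductor.
w_per_m = heat_per_meter(0.3, 0.084)
print(round(w_per_m, 4))  # about 0.06 W of heat per meter
```

Because heat scales with the square of the current, doubling the delivered current roughly quadruples the heat the cable must shed – which is why higher-power PoE makes temperature rise a first-order design concern.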
Why Cable Temperature Rise Matters
The performance of network cabling is often defined by channel bandwidth. What distinguishes one category cable from another? Available bandwidth – the frequency range over which the signal-to-noise ratio remains positive (the signal stays above the noise floor).
Cable temperature rise (heat) impacts the cable’s electrical characteristics. Insertion loss (the loss of signal power that results from inserting a device in a cable transmission line) increases as cable temperature rises.
Cable attenuation – otherwise known as a reduction in signal strength – is significantly affected by temperature as well. In fact, ANSI/TIA-568-B.2-1 recommends that the length of a channel be reduced (de-rated) if Category 6 cable is installed at a higher temperature. De-rating the cable allows it to maintain the same transmission performance, only across a shorter distance.
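The de-rating described above can be approximated as a percentage reduction in permissible channel length per degree above a reference temperature. The sketch below uses an assumed factor of 0.4% per °C above 20 °C purely for illustration – the actual de-rating tables depend on cable category and construction, so consult the ANSI/TIA-568 standard for your specific cable:

```python
# Hedged sketch of channel-length de-rating at elevated temperature.
# The 0.4%-per-degree factor and 20 C reference are illustrative assumptions;
# real de-rating values come from the ANSI/TIA-568 tables for the cable in use.

def derated_length(base_length_m: float, temp_c: float,
                   ref_temp_c: float = 20.0,
                   pct_per_deg: float = 0.4) -> float:
    """Reduce the permissible channel length by pct_per_deg percent
    for every degree C above the reference temperature."""
    excess_deg = max(0.0, temp_c - ref_temp_c)
    return base_length_m * (1.0 - excess_deg * pct_per_deg / 100.0)

# A 100 m channel installed at 45 C shrinks to a shorter permissible length.
print(round(derated_length(100.0, 45.0), 1))  # 90.0
```

The point of the exercise: even a modest ambient-temperature increase can cost several meters of reach, which matters when a channel was designed right at the 100 m limit.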
Getting Rid of the Heat
When heat is generated inside a cable, it can often be dissipated through conduction, convection and radiation via the cable insulation and jacket. High cable density (whether due to tightly packed cable trays, bundled cables, etc.) can lead to even more heat build-up within the cable; heat generated within the inner cables doesn’t have a chance to dissipate.
In some cases, excessive heat build-up can also cause faster aging of the cabling jacket.
Managing cable temperature rise is important in order to keep insertion loss low and reduce the likelihood of bit errors (as well as prevent cable damage). When you decrease cable temperature rise, you reap several benefits:
- Maintain excellent transmission performance (reduce insertion loss)
- Decrease the need for cooling in cable pathways
- Allow cable to function in high ambient temperatures without exceeding temperature ratings
- Permit larger cable bundles
To prevent cable temperature rise from impacting your infrastructure, look for cables that offer the highest operating temperature rating possible with low DC resistance (such as Category 6A cabling).
Where LP Cables Fit In
As we discussed in last week’s blog, Underwriters Laboratories (UL) has developed a Limited Power (LP) certification for cable, which verifies that a cable won’t exceed its temperature rating under certain conditions. LP cables are manufactured with insulating and jacketing material that is made to handle higher temperatures.
Although LP cable doesn’t exceed its cable temperature rating under certain conditions, introducing higher temperatures into the cable still negatively impacts cable reach: insertion loss rises at elevated temperatures. Even LP-certified cable may not reach its full 100 m distance and may require de-rating.
The only ways to truly counteract the effect of cable temperature rise are to de-rate the length of the cable below 90 m or to choose a cable with sufficient insertion loss margin.
Belden has studied the effects of cable temperature rise in detail. We can help you design and install a cabling infrastructure built to support whatever technology and applications you’re deploying now – and in the future. You can also learn more about our entire data center solution set here.