Since it first came on the scene in July 2003 with the ratification of the IEEE 802.3af standard, Power over Ethernet (PoE) has become a “must have” for LAN deployments.

The first iteration, Type 1, could deliver up to 15.4 W of power over two pairs of a telecommunications cable from the source, enough to support VoIP phones and other devices. The second iteration, the IEEE 802.3at-2009 standard (also known as PoE Plus), raised that to 30 W over two pairs from the source to the powered device. Even so, two-pair powering isn't the most efficient approach, and many IP-based devices now simply require more power.

That’s why the IEEE 802.3bt Task Group is currently developing the third-generation PoE standard to deliver more power, more efficiently, using all four pairs. With a goal of completing this work by February 2016, the IEEE 802.3bt standard will define the operation of 4-pair powering for Type 2 (up to 30 W), Type 3 (up to 60 W) and Type 4 (up to 90 W).

Let’s take a closer look at the benefits.

Dividing the Power

The first benefit of 4-pair powering is that less power is consumed in the cable. Because the power is divided over all four pairs instead of only two, there is less current flowing on each pair. This results in lower resistive heating in the cable. Why do we care?

Resistive heating is proportional to the square of the current multiplied by the conductor resistance (P = I² × R), so the higher the current, the higher the heating effect. Halving the current per pair therefore quarters the loss on each pair; even with twice as many pairs carrying current, the total loss in the cable is cut in half.

For example, delivering 60 W over all four pairs puts 600 mA on each pair: 9 W of power is consumed in the cable and 51 W is delivered to the device. Pushing the same 60 W over only two pairs doubles the current to 1.2 A per pair: 18 W is consumed in the cable and only 42 W is delivered.
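A quick sanity check of those figures, as a minimal sketch in Python. The 12.5 Ω loop resistance is an assumption inferred from the article's own numbers (roughly a 100 m channel), and the 50 V source voltage is likewise assumed; neither is quoted from a standard.

```python
# Back-of-the-envelope check of the 60 W example above.
# Assumptions (not from a standard): a 12.5-ohm DC loop resistance
# per out-and-back path, and a 50 V source voltage.

LOOP_RESISTANCE = 12.5  # ohms per current path (one pair out, one pair back)
SOURCE_POWER = 60.0     # watts injected at the source
VOLTAGE = 50.0          # volts at the source

total_current = SOURCE_POWER / VOLTAGE  # 1.2 A total

for paths, label in [(2, "4-pair"), (1, "2-pair")]:
    current_per_pair = total_current / paths                 # 0.6 A or 1.2 A
    loss = paths * current_per_pair**2 * LOOP_RESISTANCE     # P = I^2 * R per path
    print(f"{label}: {loss:.1f} W lost in cable, "
          f"{SOURCE_POWER - loss:.1f} W delivered")

# 4-pair: 9.0 W lost in cable, 51.0 W delivered
# 2-pair: 18.0 W lost in cable, 42.0 W delivered
```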

Higher Power Levels

The second benefit of 4-pair powering is that higher power levels can be delivered to remote devices while maintaining an acceptable temperature rise inside a cable bundle. Again, why do we care?

An acceptable temperature rise inside a cable bundle helps ensure that we don’t exceed the temperature rating of the cable. Because insertion loss increases as the temperature increases, an acceptable temperature rise also helps minimize any degradation in transmission performance at elevated temperatures.

The only ways to counteract this effect are either to de-rate the cable length below 90 meters for a permanent link or to choose a cable with enough insertion loss margin that no length de-rating is required, as the sketch below illustrates.
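To make that trade-off concrete, here is a rough length de-rating sketch. It assumes the de-rating factors commonly cited from ANSI/TIA-568 for unscreened cable (insertion loss growing about 0.4% per °C from 20 °C to 40 °C and 0.6% per °C from 40 °C to 60 °C); treat these as illustrative assumptions, not quotes from the standard.

```python
# Rough permanent-link length de-rating sketch.
# Assumed de-rating factors (not quoted from the standard):
# ~0.4 %/degC from 20-40 degC and ~0.6 %/degC from 40-60 degC.

def insertion_loss_factor(temp_c: float) -> float:
    """Multiplier on the 20 degC insertion loss at the given temperature."""
    factor = 1.0
    factor += 0.004 * (min(temp_c, 40.0) - 20.0)  # 0.4 %/degC up to 40 degC
    if temp_c > 40.0:
        factor += 0.006 * (temp_c - 40.0)         # 0.6 %/degC above 40 degC
    return factor

def derated_length(temp_c: float, max_len_m: float = 90.0) -> float:
    """Link length that keeps total insertion loss within the 20 degC budget."""
    return max_len_m / insertion_loss_factor(temp_c)

for temp in (20, 45, 60):
    print(f"{temp} degC: {derated_length(temp):.1f} m")

# 20 degC: 90.0 m
# 45 degC: 81.1 m
# 60 degC: 75.0 m   (45 degC ambient plus a 15 degC rise in the bundle)
```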

The IEEE 802.3 Ethernet Standard requires a 10 °C reduction in the maximum ambient operating temperature of the cable when all cable pairs are energized under worst-case conditions. A complementary document, TIA TSB-184, allows a temperature rise of 15 °C (based upon a maximum ambient temperature of 45 °C) for a bundle size of 100 cables.

[Figure: Temperature rise inside a 100-cable bundle versus delivered power, for different cable types]

This graph shows the temperature rise inside a 100-cable bundle for different cable types. If a curve is within the green shaded region, the temperature rise meets the 10 °C criterion specified by IEEE 802.3. The yellow region is within the TIA TSB-184 guideline of 15 °C. The red zone exceeds both criteria and indicates an excessive temperature rise for the power delivered over the cable.
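Expressed as a simple rule, the shaded zones map onto the two criteria like this (an illustrative sketch, not part of either document):

```python
# Classify a bundle temperature rise against the two criteria above.

def zone(delta_t_c: float) -> str:
    if delta_t_c <= 10.0:
        return "green: meets the IEEE 802.3 10 degC criterion"
    if delta_t_c <= 15.0:
        return "yellow: within the TIA TSB-184 15 degC guideline"
    return "red: exceeds both criteria"

print(zone(8.0))   # green: meets the IEEE 802.3 10 degC criterion
print(zone(12.5))  # yellow: within the TIA TSB-184 15 degC guideline
print(zone(17.0))  # red: exceeds both criteria
```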

There are even higher power requirements for ultra-high-definition distribution of digital media. The HDBaseT Alliance’s HDBaseT 2.0 specification, soon to become IEEE 1911.2, allows up to 100 W of power delivery over 4-pair Category cables for powering HDTVs and displays at distances up to 100 m.

At these new higher power levels, cables will be put to the test. Once IEEE 802.3bt and IEEE 1911.2 are adopted, only the fittest will survive.