With loss budgets more stringent than ever, and most manufacturers publishing both typical and maximum insertion loss values, knowing which values to base your channel design on has never been more important.
The idea of a loss budget is to ensure that the application will function over the installed channel. Rather than using the best possible connector loss values, designers should be conservative and give themselves some margin.
In other words, play it safe and base your testing loss limits on the manufacturer’s specified “maximum” insertion loss values.
To begin with, consider how dramatically loss budgets drop as we move from 10 to 40 and 100 Gig. The 2.6 dB (OM3) and 2.9 dB (OM4) channel loss for 10 Gig shrinks to just 1.9 dB (OM3) and 1.5 dB (OM4) for 40 and 100 Gig, including a maximum total connector loss of just 1.5 dB (OM3) and 1.0 dB (OM4).
When deploying multi-point architectures with cross-connects for flexibility, the loss budget becomes even more of a concern. For example, consider a 40 Gig OM3 channel with cross-connects at either end (i.e., a 4-point architecture) that uses MPO connectors with a loss of 0.35 dB each. Those four connection points consume 1.4 dB, leaving just 0.5 dB of the 1.9 dB budget for the remainder of the channel. That’s not a lot of wiggle room.
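The arithmetic above can be sketched in a few lines. This is only an illustration of the calculation described in the text; the channel budget values come from the 40/100 Gig figures cited earlier, and the function name is made up for this example.

```python
# Max channel insertion loss for 40/100 Gig (dB), per the figures in the text.
CHANNEL_BUDGET_DB = {
    "OM3": 1.9,
    "OM4": 1.5,
}

def remaining_budget(fiber: str, connections: int, loss_per_conn_db: float) -> float:
    """Budget left for the fiber itself after subtracting connector loss."""
    return round(CHANNEL_BUDGET_DB[fiber] - connections * loss_per_conn_db, 3)

# 40 Gig OM3, cross-connects at both ends (4 connection points),
# MPO connectors at 0.35 dB each:
print(remaining_budget("OM3", 4, 0.35))  # -> 0.5
```

Run the same numbers on OM4 and the picture is even tighter: the 1.5 dB budget leaves only 0.1 dB after those four connections.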
Without a lot of margin on the loss budget, designing the channel based on the manufacturer’s typical loss values can be a huge risk. First, there is no industry definition of “typical”—we have no idea if this is the value you’ll get 50% of the time or 90% of the time. Second, some manufacturers may have a significant variance between their typical and maximum loss values—perhaps even more than a quarter of a dB per connector. In a 4-point architecture, that adds up to a difference of 1 dB (0.25 dB × 4 connection points).
Imagine what happens when you design a channel based on the low typical values. While it might appear that you have plenty of margin to add another connector or extend the distance, there is no guarantee that the typical loss value is what you’re going to get 100% of the time.
And if there’s a big variance between typical and maximum values, you could be looking at designing a channel based on values that are far lower than what you end up with once the equipment is in place—final testing isn’t exactly when you want to find out that the channel is more than 1 dB over budget.
Designing a channel based on the maximum loss limits specified by the manufacturer is truly the only way to guarantee you won’t end up over budget, and it keeps you eligible for a system warranty. Without these hard upper limits, there is no way to know what your actual margin will be until you test the system.
Remember, when working with loss budgets that are as small as 1.5 dB to 2.6 dB, a difference of 0.5 dB in loss measurements can mean the difference between a pass and a fail—and the difference between a network that supports the application and one that experiences higher-than-normal bit error rates once the active equipment is up and running.
Designing with maximum insertion loss values will give you the wiggle room you need to avoid having the common stresses of installation, over-bending and frequent patch cord manipulation push the loss too high and impact performance.
However, while maximum insertion loss values are recommended for designing the channel, low maximum values mean that you’ll likely have plenty of headroom. Belden’s FiberExpress system offers the lowest maximum insertion loss: just 0.15 dB for OM4 LC connectors and 0.2 dB for OM4 MPO connectors.
Dwayne Crawford has more than 20 years of experience in the datacomm industry. He has served on several international standards committees to advance high-performance/low-latency protocols (such as IEEE-1394, GigE Vision and CameraLink) used in real-time image processing and utilizing high-performance computing platforms.