Power over Ethernet (PoE) is the enabling technology for many applications, most familiarly Voice over IP telephones. As the capability of the technology becomes better understood, it is now driving applications such as CCTV, Wireless Access Points, Asset Tracking and Access Control, amongst others.
As I have always understood it, nothing is free; there is always a trade-off, and this applies to PoE. Where there is power there is heat, and careful design consideration is needed to avoid the problems created by the build-up of heat within bundles of cables. A basic analysis carried out within an ISO Technical Report (TR) indicated that the heat build-up could be quite significant, but where that document fell down was that the bundles tested were in 'free space', with natural air flow around them, rather than the more realistic scenario of bundles upon bundles within closed containment and no opportunity for the heat to dissipate.
One thing that must be considered is that a rise in temperature is one of the causes of increased attenuation (the loss of signal over the length of the channel), a factor that is set out within the standards: for every degree above 20°C, the supportable length of the system should be de-rated by 0.2%. Whilst this does not appear to be very much, some of the temperature increases the TR discussed were in the region of 16-20°C above ambient.
Just put a thermometer in most offices and the ambient will be somewhere between 22°C and 24°C, so we are already operating above the 20°C norm. The 90m and 100m rules therefore go out of the window, as other factors come to bear that affect the performance of the system and the distance over which it can support the applications.
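To put some numbers on this, here is a minimal sketch of the de-rating arithmetic, assuming the 0.2%-per-degree figure quoted above. The 23°C office ambient is illustrative, and the bundle temperature rises are taken from the range the TR reported.

```python
# Sketch of the 0.2%-per-degree length de-rating described above.
# The de-rating factor is the one quoted in this article; the ambient
# and bundle-rise figures are illustrative, not measured values.

DERATE_PER_DEG_C = 0.002   # 0.2% per degree above the 20 C reference
REFERENCE_TEMP_C = 20.0

def derated_length(nominal_m: float, cable_temp_c: float) -> float:
    """Return the supportable channel length at a given cable temperature."""
    excess = max(0.0, cable_temp_c - REFERENCE_TEMP_C)
    return nominal_m * (1.0 - DERATE_PER_DEG_C * excess)

ambient_c = 23.0                      # a typical office, per the text above
for bundle_rise in (0, 10, 16, 20):   # rises in the range the TR reported
    temp = ambient_c + bundle_rise
    print(f"cable at {temp:4.1f} C -> 100 m channel de-rates to "
          f"{derated_length(100.0, temp):.1f} m")
```

At the top of the TR's range this already takes several metres off a 100m channel, before any other impairment is considered.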
Conductor size is probably the single biggest reason why 26AWG cable should not be used outside of the Data Centre. If it is deployed in the Commercial Office environment, someone is quite naturally going to want to power phones and other devices over it, if not now then certainly in the future. However, please note that both IEEE 802.3af and IEEE 802.3at call for Horizontal cable of Category 5 or better, and the component standard referenced for that category of cable calls for a 0.5mm conductor. 26AWG is 0.4mm.
Simple electrical principles tell us that the smaller the conductor, the hotter it will get for the same amount of power delivered. We are already reducing the supported distances due to higher attenuation; when the temperature increase is added on top, the resulting rise in attenuation will not be a straight line but something more like a hockey stick, and no one has done any research on this, for the simple reason that it sits outside what IEEE 802.3af/at calls for. So why bother? You may save some pennies/cents, but against an unknown risk.
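Those simple electrical principles are worth making concrete. The rough sketch below compares the I²R heat dissipated per metre by a 0.5mm conductor (the component standard's minimum, per the text above) and a 0.4mm (26AWG) conductor carrying the same current. The 300mA per-conductor current is an illustrative assumption only, and real cables vary with construction; the point is the ratio, which depends only on the diameters.

```python
import math

# Rough comparison of I^2 R heating in a 0.5 mm conductor versus a
# 0.4 mm (26AWG) conductor. The per-conductor current is an illustrative
# assumption; the heat ratio depends only on the conductor diameters.

RHO_COPPER = 1.72e-8    # resistivity of copper, ohm-metres
CURRENT_A = 0.3         # assumed current per conductor (illustrative)

def resistance_per_metre(diameter_mm: float) -> float:
    """DC resistance of a solid copper conductor, in ohms per metre."""
    area_m2 = math.pi * (diameter_mm / 2000.0) ** 2
    return RHO_COPPER / area_m2

for d in (0.5, 0.4):
    r = resistance_per_metre(d)
    heat_w = CURRENT_A ** 2 * r    # watts dissipated per metre
    print(f"{d} mm conductor: {r*1000:5.1f} mohm/m, {heat_w*1000:.2f} mW/m")

ratio = resistance_per_metre(0.4) / resistance_per_metre(0.5)
print(f"0.4 mm dissipates {ratio:.2f}x the heat of 0.5 mm at the same current")
```

Because resistance scales with the inverse square of diameter, the 0.4mm conductor runs at roughly 1.56 times the dissipation of the 0.5mm one for the same current, whatever that current happens to be.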
As a footnote, the one thing both the TR and IEEE 802.3at agree on is that it is better to use a screened cable, as its superior thermal characteristics allow it to dissipate heat far more effectively than an unscreened cable.