
In the previous edition of Technology Mythology we talked about bad ideas and misconceptions that become “common knowledge” and get repeated unchallenged. Over time they reach a near-permanent state, even though they may be out of context, misunderstood, or wholly or partially wrong. Here are a few (more) examples.
Maybe the Cable Went Bad
Nope, wires and cables do not just “go bad.” This is one of my favorite explanations, or questions, from clients when a problem occurs in their systems. Cables can fail due to damage or misuse. They can be cut, stripped, burned up, pulled apart, shorted out, or bitten through (cue Monty Python’s “Ballad of Sir Robin”), but they don’t just fail while sitting quietly doing their job. They are passive devices: no circuits, no power source, no heat.
Okay, yes, cables can heat up. Pushing too much current through too small a wire is a recipe for heating and, in an extreme scenario, failure. But I’d call that misuse, not normal conditions. Cables carrying power, like AC cords, DC adapters, and Ethernet with PoE, should be sized for the application. Ask any electrician about circuit loads and wire gauges in buildings; there’s a reason this is spelled out in the National Electrical Code.
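For a back-of-the-envelope sense of why gauge matters, here’s a quick calculation (a sketch in Python; the per-meter resistance figures are typical published values for solid copper, so treat them as rough approximations):

# Rough I^2 x R heating estimate for copper conductors.
# Resistance values are typical published figures (ohms per meter,
# solid copper at room temperature) -- approximations, not spec data.
RESISTANCE_PER_METER = {
    "AWG 12 (building wire)": 0.0052,
    "AWG 23 (typical Cat6)": 0.067,
    "AWG 26 (thin patch cable)": 0.134,
}

def heat_watts_per_meter(current_amps, ohms_per_meter):
    """Power dissipated as heat in one meter of conductor."""
    return current_amps ** 2 * ohms_per_meter

for label, r in RESISTANCE_PER_METER.items():
    # 0.6 A is roughly what a PoE pair might carry under load.
    mw = heat_watts_per_meter(0.6, r) * 1000
    print(f"{label}: {mw:.0f} mW of heat per meter at 0.6 A")

Same current, thinner wire, several times the heat: that’s the whole story behind wire-gauge tables.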
In AV, I would argue that the most likely place for a failure is where a connector attaches to a wire. Avoiding stress failures is a primary reason to maintain good installation practices like supporting long runs and having strain relief at connection points.
But things have become trickier with the use of fiber optics and very-high-speed Ethernet because of bend radius and length limitations. Fiber, CAT6, and CAT7 cable still don’t “go bad” on their own, but if they are handled or installed poorly they can cause problems without being obviously damaged. As frequencies and data rates climb, the infrastructure requirements become more demanding. The physics say that what worked okay at lower data rates may not hold up as rates increase.
Try to push 10Gb Ethernet through infrastructure that was built for 1Gb. It might work, maybe in some areas. Or the network might start to misbehave. Did the cables suddenly go bad? I don’t think so.
The Mystery of DMX Cabling
If you use DMX control for lighting systems, you may have encountered a pervasive attitude among lighting pros and manufacturers that makes it sound like DMX wiring is uniquely demanding. Cable manufacturers produce DMX-specific cables, some product warranties may be voided if the “wrong” cable is used, and there are stern warnings against using audio or Category cable for DMX! Is DMX really that special?
Nope, it’s just serial data. The DMX512 protocol, standardized by ANSI, is based on EIA/TIA-485 data transmission, commonly known as RS-485. This specifies asynchronous, differential data, in one direction, on a twisted pair of wires. In other words, it’s like balanced audio in the sense of rejecting noise picked up on the cable. DMX officially supports up to 32 devices passively daisy-chained, usually with each device having an input and output (looping) connector. The end of a chain must have a terminating resistance of 110-120 ohms, provided by the final device or a terminating plug.
The DMX specification calls for cable with 110-ohm impedance and the use of 5-pin XLR connectors. But since only pins 1, 2, and 3 are used in the standard, 3-pin XLRs also show up on some equipment. I’m not a fan of this “cheat” because those same connectors are already used for audio, and sometimes power. But there is nothing electrically wrong with it.
Some equipment uses RJ45 jacks for DMX, which saves space. It also invites the use of standard Ethernet cables (four twisted pairs with RJ45 on each end) to connect devices. It’s popular enough that manufacturers sell adapters to go between XLR-5 and RJ45 in various directions. Is this a disaster waiting to happen?
Well, no. For starters, the data rate of DMX512 is only 250 kbit/s (a quarter of a megabit per second). There was a time when this was quite speedy, but not anymore. Having said that, it’s still very fast compared to analog audio, so using audio cable for DMX requires some caution. By contrast, the old 10BaseT Ethernet on CAT5 cable ran at 10 Mbit/s, forty times faster, so the twisted pairs and impedance of CAT5 or CAT6 cable are more than adequate for this application.
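To put that speed in perspective, here’s the frame timing that falls straight out of the DMX512 numbers (a minimal sketch; the microsecond constants are the standard’s minimums):

# DMX512 frame timing (values in microseconds).
BIT_TIME_US = 4            # 250 kbit/s -> 4 us per bit
SLOT_BITS = 11             # 1 start bit + 8 data bits + 2 stop bits
BREAK_US = 88              # minimum break marking the start of a frame
MARK_AFTER_BREAK_US = 8    # minimum idle mark after the break
SLOTS = 1 + 512            # start code (0x00 for dimmer levels) + 512 channels

frame_us = BREAK_US + MARK_AFTER_BREAK_US + SLOTS * SLOT_BITS * BIT_TIME_US
print(f"Full universe refresh: {frame_us / 1000:.1f} ms "
      f"(about {1_000_000 // frame_us} updates per second)")

A full 512-channel universe refreshes about 44 times a second, which is plenty for lighting and laughably slow by modern network standards.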
Any data signal is essentially a stream of high-frequency square waves, and the combined effects of cable, connectors, and distance round off the square edge transitions and smear their timing (jitter). So the other factor that comes into play is how well the DMX data is recovered by the receiver circuits in the lights or dimmers, which is not easy to know.
I suspect that manufacturers are concerned that people might try to use crappy cables, too many devices, or too much total distance. And when problems occur the equipment will get blamed. So they produce stringent specifications and dire warnings. And users who don’t understand serial data communication (including many lighting pros) adopt and repeat these ideas until they become gospel, and too “risky” to challenge. Technology Mythology in action!
In the end, there is no magic here, just sensible application of good practices. For a small number of lights, without much distance between devices, I would expect any decent twisted-pair cable to be fine. As systems get larger, in device quantity and layout, using appropriate cable and minding the length of runs becomes important. Using active splitters often makes sense, both to shorten daisy chains and add flexibility for how cables are laid.
DMX also requires termination at the end of each daisy chain, which can be provided by a switch on the fixture or a plug-in terminator. The termination prevents signal reflections that can muddle the data and cause flaky behavior (the sketch below shows why). Some setups may work fine without termination, but it’s good practice to terminate! Also note that RDM, an additional protocol for two-way communication between controllers and fixtures, runs on the same three XLR pins already connected for DMX.
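Here’s that reflection math, using the classic transmission-line formula (a quick sketch; the coefficient is the fraction of the signal that bounces back from the end of the line):

# Fraction of an incident signal reflected at the end of a line:
# (Z_load - Z_cable) / (Z_load + Z_cable)
def reflection_fraction(z_load_ohms, z_cable_ohms=110.0):  # 110-ohm DMX cable
    return (z_load_ohms - z_cable_ohms) / (z_load_ohms + z_cable_ohms)

print(f"120-ohm terminator: {reflection_fraction(120):+.2f}")       # ~ +0.04
print(f"Unterminated (open end): {reflection_fraction(1e9):+.2f}")  # ~ +1.00

A matched terminator absorbs nearly all of the signal; an open end bounces nearly all of it back down the cable, right into the data that follows.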
So, for example, in a small TV studio, I might use an active splitter on the grid to make four or six chains from the lighting console output and use those for groups of lights that are functionally related or physically close. I’d probably buy “DMX” cables since they are readily available and no more expensive than good audio cables. If cables need to be made on site, I’d likely use CAT6, doubling up two twisted pairs so that one wire from each pair lands on pin 2 and its mate on pin 3 (see the sketch below). Single-pair audio cable could also work for a single short run, say from the lighting board to the splitter. Call me a rebel!
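For anyone who wants to try that Cat6 trick, here’s the doubling-up spelled out (a sketch; the pair colors are just the ones I’d grab first, not any official pinout):

# One way to wire a Cat6-to-XLR5 DMX lead, doubling up two twisted pairs
# so each data pin gets one conductor from each pair. Pair choices here
# are arbitrary -- an illustration, not a standard pinout.
XLR5_FROM_CAT6 = {
    "pin 1 (common)": ["brown", "white/brown"],        # a spare pair as common
    "pin 2 (data -)": ["blue", "orange"],              # one wire from each pair
    "pin 3 (data +)": ["white/blue", "white/orange"],  # the mates of pin 2's wires
    "pin 4, pin 5":   [],                              # unused in basic DMX
}
for pin, conductors in XLR5_FROM_CAT6.items():
    print(f"{pin}: {', '.join(conductors) or 'no connection'}")

The point of splitting the pairs this way is that the differential data still rides on twisted pairs, which is what gives the link its noise rejection.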
More Resolution is Better
This is an area where client perception may be out of sync with actual needs. Marketing and jargon have overtaken consumer technology to the point that many—perhaps most—people outside of particular tech industries don’t really know what things mean, or when something matters. So they come to us with requirements based on hearsay or misunderstanding.
A great example is the demand for UHD/4K video. If it’s not 4K it’s not good enough! There are reasons to use 4K cameras, such as needing to capture more detail, or the ability to pull HD-quality images from a larger scene. But what is the end purpose of the video? In many cases 4K capture offers no benefit over HD because nobody will see the full 4K resolution in any context. This is particularly true for video delivered to streaming platforms. And the price paid in increased data rates and file sizes is huge!
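Here’s roughly what that price looks like in uncompressed bandwidth (a quick calculation; I’m assuming 60 fps and 10-bit 4:2:2 sampling, which works out to 20 bits per pixel):

# Uncompressed video bandwidth: pixels x bits per pixel x frame rate.
# Assumes 60 fps and 10-bit 4:2:2 (20 bits/pixel); compression changes
# the absolute numbers but not the 4x ratio between HD and UHD.
def gigabits_per_second(width, height, bits_per_pixel=20, fps=60):
    return width * height * bits_per_pixel * fps / 1e9

print(f"HD  (1920x1080): {gigabits_per_second(1920, 1080):.1f} Gb/s")  # ~2.5
print(f"UHD (3840x2160): {gigabits_per_second(3840, 2160):.1f} Gb/s")  # ~10.0

Four times the pixels means four times the data, end to end: capture, storage, transport, and processing.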
On the display end, a great number of LCD and OLED panels now made are “natively” UHD, but I sometimes look for native HD monitors if I know that the source material will be HD. Granted, the scaling quality inside displays might make this unnecessary, but let’s say you’re building a quad-display video wall, using a single UHD input signal (3840×2160). Four HD displays match those pixel dimensions exactly. Why use 4K displays? Is there any visual improvement? Could it be seen from the viewing distance? Touting 4K displays may sound impressive to someone, but sometimes it’s just talk.
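Whether the extra pixels are even visible is easy to estimate with the common one-arcminute rule of thumb for 20/20 visual acuity (a rough model, not vision science):

import math

# Past this distance, adjacent pixels subtend less than ~1 arcminute
# and blend together for a viewer with 20/20 vision (rule of thumb).
def max_useful_distance_m(diagonal_inches, horizontal_pixels):
    width_m = diagonal_inches * 0.0254 * 16 / math.hypot(16, 9)  # 16:9 screen
    pixel_pitch_m = width_m / horizontal_pixels
    return pixel_pitch_m / math.tan(math.radians(1 / 60))

for label, px in [("HD", 1920), ("UHD", 3840)]:
    print(f"50-inch {label}: pixel grid resolvable within about "
          f"{max_useful_distance_m(50, px):.1f} m")

On a 50-inch screen, the UHD pixel grid stops being resolvable past roughly one meter; nobody watches from there.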
Another example is displays used on stages or studio sets. If the scene is being shot with HD cameras, at various distances and zooms, what is gained by using a 4K display (or actual 4K content) on a 50” monitor behind the host? Or even a 90” monitor? The cameras cannot see that detail. Even 4K cameras won’t see that detail unless they are zoomed right into the display (at which point you’ll be fighting other artifacts anyway).
Going above 4K is for cinema projects and science, IMHO. You need enough pixels to present viewable content from the expected distance. Using more pixels may not accomplish anything. Interestingly, research has shown that what does benefit the viewer is higher dynamic range (more contrast) and wider color gamut (more possible colors). Both of these technologies are available, to some extent, in many professional and consumer products, and they add almost nothing to the data payload. As some video engineers say, “Not more pixels, better pixels!”