Software developer Panic has been working on new AV software and investigating Apple’s new-ish Lightning Digital AV Adapter. What they found is that, unlike the previous 30-pin adapter, the Lightning adapter doesn’t carry a native 1080p signal. In fact, Apple says the optimum mirroring resolution is 1600 × 900, and when that signal is shown on a 1080p display it is likely up-converted, showing artifacts consistent with video that has been compressed, streamed, and decompressed.
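To put that up-conversion in perspective, here’s a quick back-of-the-envelope calculation using the 1600 × 900 mirroring figure Apple cites (a rough sketch, not anything from Panic’s write-up):

```python
# Mirroring resolution Apple cites for the adapter, versus a full-HD panel.
mirror_w, mirror_h = 1600, 900
display_w, display_h = 1920, 1080

scale = display_w / mirror_w          # 1.2x linear upscale (1080 / 900 gives the same)
mirror_px = mirror_w * mirror_h       # 1,440,000 real pixels
display_px = display_w * display_h    # 2,073,600 pixels on screen

print(f"{scale:.2f}x linear upscale; "
      f"{display_px / mirror_px:.2f}x the pixel count; "
      f"{display_px - mirror_px:,} pixels interpolated")
# -> 1.20x linear upscale; 1.44x the pixel count; 633,600 pixels interpolated
```

In other words, almost a third of what lands on a 1080p screen is invented by the scaler, before compression artifacts even enter the picture.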
So they split the adapter open, and inside there is a full ARM processor with 256MB of RAM to process the video signal. Panic thinks Apple is actually streaming something like an AirPlay network signal through the cable, which the ARM processor then decodes on the other end.
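If Panic’s theory is right, the signal path looks roughly like the sketch below. To be clear, this is an assumption built on their teardown: the function names are hypothetical, and the real adapter presumably uses a hardware video decoder rather than anything resembling this code.

```python
from dataclasses import dataclass

@dataclass
class Frame:
    width: int
    height: int

def encode_on_device(frame: Frame) -> bytes:
    # On the iPhone/iPad side: the mirrored screen is compressed into an
    # AirPlay-style stream instead of being sent over Lightning as raw pixels.
    return f"compressed {frame.width}x{frame.height} frame".encode()

def decode_in_adapter(stream: bytes) -> Frame:
    # Inside the adapter: a stand-in for the embedded ARM SoC (and its 256MB
    # of RAM) decompressing the stream, then up-converting it for the display.
    decoded = Frame(1600, 900)                  # mirroring resolution of the incoming stream
    return Frame(int(decoded.width * 1.2),      # 1.2x upscale to 1920 x 1080,
                 int(decoded.height * 1.2))     # which is where the softness comes from

if __name__ == "__main__":
    hdmi_out = decode_in_adapter(encode_on_device(Frame(1600, 900)))
    print(f"HDMI output: {hdmi_out.width} x {hdmi_out.height}")  # 1920 x 1080
```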
Why would Apple do this? Most likely Apple wants to move people to AirPlay wireless streaming to an Apple TV, so this adapter is just a stopgap. Rather than designing a larger Lightning connector that could carry native video out, it sacrificed wired video quality and drove up the cost of the HDMI (and VGA?) adapters.
Is this being sneaky or clever?
