Quote:
Originally Posted by theducks
Old USA TV was 30 frames per second. Any educated spy should know where to find the vertical sync pulse and use that as a trigger.
Digital TV may only 'paint' changes, so that sync trick won't work (besides, all the circuit points are now in a single chip).
Um, you just use a suitable shutter speed. I've photographed US & European video without banding.
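The rule of thumb is to keep the exposure an integer multiple of the display's refresh period, so every part of the screen is lit equally during the exposure. A minimal sketch (the function name and the 60 Hz NTSC field rate are just for illustration):

```python
# Hedged sketch: exposure times that avoid banding when photographing a
# scanned or refreshed display. Keep the exposure a whole multiple of the
# refresh period so the whole screen gets equal illumination.

def banding_free_shutters(refresh_hz, max_multiple=8):
    """Return exposure times (seconds) that are whole multiples of the
    refresh period of a display running at refresh_hz."""
    period = 1.0 / refresh_hz
    return [period * n for n in range(1, max_multiple + 1)]

# Old US (NTSC) CRT: 60 fields/s -> 1/60 s, 1/30 s, 1/20 s, 1/15 s all work.
for t in banding_free_shutters(60, 4):
    print(f"{t:.4f} s  (~1/{round(1 / t)} s)")
```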
Current displays, and how DTV is transmitted (change frames & key frames), are very different things. LCD & OLED have persistence, for different reasons, and it's quite different from CRT scanning.
I experimented with my SD camcorder and also with my 20 Mpixel DSLR, which I didn't realise did true 1920x1080 video recording (at various frame rates) till I got it. A 4K screen (3840 x 2160; there are various '4K' definitions) is about 8.3 Mpixels.
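For reference, the pixel arithmetic behind those figures (the mode labels are mine):

```python
# Pixel-count arithmetic from the post.
modes = {
    "1080p HD": (1920, 1080),
    "UHD 4K":   (3840, 2160),
}
for name, (w, h) in modes.items():
    print(f"{name}: {w} x {h} = {w * h:,} px (~{w * h / 1e6:.1f} Mpixel)")

# A 20 Mpixel stills sensor therefore has roughly 10x the pixels of a
# 1080p frame and about 2.4x the pixels of a UHD 4K frame.
```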
I pointed the DSLR in video mode at the 4K computer screen. It made no difference whether the GPU/GFX output was 24 fps or 60 fps, nor did it matter which HD mode the camera used. The resulting recording had no banding, flicker or artefacts.
I'd say most current TV screens and monitors have their own internal native refresh rate and convert whatever input they're given. As to how the 20 Mpixel camera sensor is downsampled to the roughly 2 Mpixels of 1920x1080 HD, I don't know. Alternate FW does enable 4K recording, but the electronics overheat, so you can only do prolonged HD recording at a reasonable ambient temperature. About 6 hours of it fits on an SD card.
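That "about 6 hours" is consistent with a typical HD camcorder bitrate. A back-of-envelope check, assuming a hypothetical ~24 Mbit/s stream and a 64 GB card (neither is stated above):

```python
# Back-of-envelope recording-time check. The bitrate and card size are
# assumptions for illustration; the post only says "about 6 hours".
card_bytes  = 64e9   # hypothetical 64 GB SD card
bitrate_bps = 24e6   # hypothetical ~24 Mbit/s HD stream (AVCHD-class)
seconds = card_bytes * 8 / bitrate_bps
print(f"~{seconds / 3600:.1f} hours")  # ~5.9 hours
```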