How come when I take a picture of a computer screen...
Movies/films run at 24fps. The only exceptions are films and projectors used in special-venue situations, such as amusement park rides.
TV, on the other hand, is different: its frame rate was originally tied to the AC power frequency due to technological limitations, though these days that isn't necessary.
I've heard that fighter pilots can recognize an aircraft in an image flashed for less than 1/200s (which was the limit of the test equipment at the time), so it's quite possible humans can perceive hundreds of Hz. However, screens with that high a refresh rate are quite expensive even today, and were technically impossible to make when TV was standardized.
Our brain will usually start interpreting a series of images as motion at around 15fps. That sets the absolute lower limit, and the standards opt for something higher. Also note that those 24/25/30fps figures are interlaced. It's not because we couldn't make progressive screens at the time; it's because while we do see 30fps as motion, it's still not smooth enough for most people. Interlacing effectively doubles the refresh rate with the same bandwidth by updating only half the screen at a time.
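The bandwidth argument above can be sketched with some back-of-the-envelope numbers. The line counts here are illustrative NTSC-like assumptions, not exact broadcast specs:

```python
# Sketch: why interlacing doubles the update rate for the same bandwidth.
# Illustrative numbers for NTSC-like video (assumptions, not exact specs).
lines_per_frame = 480          # visible lines in a full frame
updates_per_sec = 60           # screen updates per second

# Progressive: every update carries a full frame's worth of lines.
progressive_lines_per_sec = lines_per_frame * updates_per_sec        # 28800

# Interlaced: every update carries only half the lines (one field), so
# the same update rate needs only half the line bandwidth.
interlaced_lines_per_sec = (lines_per_frame // 2) * updates_per_sec  # 14400

print(progressive_lines_per_sec // interlaced_lines_per_sec)  # prints 2
```

In other words, for a fixed channel bandwidth, sending fields instead of frames lets you refresh the screen twice as often.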
You may want to look up fluorescent-light-induced headaches. For incandescent bulbs, the filament continues to produce light for a little while without power, so it's usually not a problem. But some people really can't stand fluorescent light.
While some people can have fluorescent light induced headaches, most people wouldn't say they flicker unless a ballast is going bad.
And if you think an incandescent light keeps working for a little while after the power goes off, I suggest you go to the nearest light switch with a stopwatch and flip the switch.
If you can still see the stopwatch, you'll find out rather rapidly that they go dark almost immediately.
You are pretty closed-minded. LOTS of us autists are sensitive to that. It's a pretty well-documented phenomenon.
Speed shooters, pilots, and certain types of athletes process much faster than 1/30th of a second.
And CloudWalker said that the discharge time of an incandescent filament smooths out the flicker. He wasn't saying to just turn it off. It's simple physics: the filament is white-hot and stays bright for milliseconds after the electricity ceases. Other types of bulbs (such as strobes) have a much faster charge/discharge time.
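Both sides here are consistent with a simple thermal model: the filament's afterglow is long compared to a mains half-cycle, but short compared to human reaction time. A toy sketch, assuming (hypothetically) an exponential decay with a ~30 ms thermal time constant:

```python
import math

# Toy model (assumption): light output decays roughly exponentially after
# power is cut, with an assumed thermal time constant of ~30 ms -- long
# enough to smooth out mains-frequency ripple, but far too short to keep
# lighting the room after the switch is flipped.
TAU_MS = 30.0  # assumed thermal time constant, in milliseconds

def relative_brightness(t_ms: float) -> float:
    """Fraction of full brightness t_ms after power-off (toy model)."""
    return math.exp(-t_ms / TAU_MS)

half_cycle_ms = 1000 / 120           # one dark half-cycle on 60 Hz mains
print(relative_brightness(half_cycle_ms))  # ~0.76: ripple is smoothed out
print(relative_brightness(1000))           # after 1 s: effectively dark
```

So both observations hold: milliseconds of afterglow hide the flicker, yet the bulb still looks like it goes dark "instantly" when you flip the switch.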
_________________
davidred wrote...
I installed Ubuntu once and it completely destroyed my paying relationship with Microsoft.
I don't usually perceive fluorescent light flicker or have difficulty with screen refresh rates, but I can easily detect flicker and tell which displays have slower refresh rates simply by turning my head or swivelling my eyes. I see the same thing from a train or car travelling through a tunnel of fluorescent lights. If you set a camera to a 1/10 sec exposure and pan across a fluorescent light, you will see stripes on the exposure.
Presumably some people perceive this more easily than I do, because I have to try to see it, and only really care when it either messes up or adds to my photographs.
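The stripe count in that camera experiment is easy to estimate: one bright/dark cycle per flicker period that fits inside the exposure. This sketch assumes a magnetic-ballast light flickering at twice the mains frequency (the example values are hypothetical):

```python
# Rough count of stripes when panning a camera across a fluorescent light:
# one bright/dark cycle per flicker period inside the exposure. Assumes a
# magnetic ballast whose light output peaks twice per AC cycle.
def stripe_count(exposure_s: float, line_freq_hz: float) -> float:
    flicker_hz = 2 * line_freq_hz    # output peaks on every half-cycle
    return exposure_s * flicker_hz

print(stripe_count(1 / 10, 60))  # 1/10 s exposure on 60 Hz mains -> ~12
```

Counting the stripes on such a photo is actually a handy way to work backwards to a light's flicker frequency.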
I've never met an Aspie who didn't think he was right about everything.
You may dispute that, but I'm pretty sure I'm right!
Right back atcha. You are doing a great job of proving your own point.
Film (35mm and 70mm) is not interlaced - however, there is a shutter between the light source (typically a xenon arc lamp) and the film. The primary purpose of the shutter is to block out the light momentarily while the film is pulled down one frame. The shutter in most projectors, however, also blocks out light at other times to reduce flicker. Typical projectors have a two- or three-bladed shutter. While three-bladed shutters produce less flicker, they also reduce the amount of light that reaches the screen.
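The blade count translates directly into the flicker rate at the screen: each frame is flashed once per blade, so the flicker frequency is simply fps times blades. A minimal sketch:

```python
# Flicker rate at the screen: film runs at 24 fps, and each frame is
# flashed once per shutter blade, so flicker = fps * blades.
def shutter_flicker_hz(fps: int, blades: int) -> int:
    return fps * blades

print(shutter_flicker_hz(24, 2))  # 48 Hz with a two-bladed shutter
print(shutter_flicker_hz(24, 3))  # 72 Hz with a three-bladed shutter
```

This is why a bare 24 Hz flash would be visibly strobing, while 48 or 72 Hz is above most people's flicker threshold.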
Interlaced TV standards of 25 and 30 refer to the total number of frames, but the field rates are 50 and 60 respectively. As I stated before, these were used due to technological limitations at the time. Newer equipment compatible with the various high-definition standards is capable of creating and using progressive (i.e. non-interlaced) images. The exact formats supported will depend on the equipment, but you'll typically get 24, 25, and 30fps, with some higher-end and 3D-compatible gear able to do 60fps.

Now the other thing with TVs is that even though an image is a full progressive frame and not interlaced, the TV still refreshes the screen to reduce flicker - similar to a shutter in a projector. So on a flat panel that only does 60Hz, each full-frame image is basically repeated twice; one that does 120Hz shows it four times, and so on.
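The repeat counts above follow from dividing the panel's refresh rate by the source frame rate. The sketch below also flags the one common case where this simple division fails (24 fps on a 60 Hz panel needs uneven 3:2 pulldown):

```python
# How many times a panel shows each full frame: refresh rate divided by
# the source frame rate. Only exact when the rates divide evenly --
# 24 fps on a 60 Hz panel needs uneven 3:2 pulldown instead.
def repeats(panel_hz: int, source_fps: int) -> int:
    if panel_hz % source_fps != 0:
        raise ValueError("needs pulldown, not simple frame repetition")
    return panel_hz // source_fps

print(repeats(60, 30))   # 2: each 30 fps frame shown twice on 60 Hz
print(repeats(120, 30))  # 4: shown four times on 120 Hz
print(repeats(120, 24))  # 5: 24 fps divides evenly into 120 Hz
```

Incidentally, that last case is one reason 120 Hz panels are attractive: 24, 30, and 60 fps sources all divide into 120 evenly.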
-------------------------------
Fluorescent lights may have a noticeable flicker, and they may not. Many will also flicker when first turned on, until warmed up. Older (magnetic) ballasts use transformers and operate at line frequency, so the light output peaks on every half-cycle - depending on the country, that's flicker at 100 or 120Hz. Newer ballasts use switching power supplies and can generate just about any frequency, reducing or eliminating flicker.
Right, almost forgot. There are also 48p systems like IMAX HD and Maxivision 48. My point is that just because NTSC is 60i, it doesn't mean that's the limit of human vision. The standard actually proves that human vision is >30Hz, because otherwise no one would have bothered with 60i.
In my post, I was thinking about LCDs, which don't have a flicker rate, so I approached it through motion instead of flicker. The need for double/triple-rate shutters again reinforces the fact that we can see higher than 24p.
Actually, LCDs using CCFLs for the backlight still have flicker. The CCFL usually operates at 4x+ the line frequency. I haven't seen any company advertise this spec for their FLBs; given the price, I think it's safe to assume a frequency much, much lower than 4x. I personally don't have a problem with FLBs. I just find it unfair to dismiss people who are adversely affected.
I was under the impression that by the time of color TV, progressive scanning wasn't the problem - bandwidth was. But then that's just my impression. Or was it a limitation of the B&W TV standard that the color standard strove to be backward-compatible with?
The HD standard supports 60p sources. It's still unofficial in ATSC, but it's already on Blu-ray. The reason it's not common is, again, bandwidth. And higher-end TVs are now 120Hz. Interlaced material is first deinterlaced to double the rate; it's then interpolated with motion estimation if the frequency is still under what the screen is capable of. 3D necessitated a 2x frame rate because there is one image for each eye.
Unless you are talking about projectors, there's no shutter needed for flat panels. The 60/120Hz TVs use the extra frame rate for interpolation to improve motion smoothness.
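The interpolation idea can be sketched in a few lines. A real 120Hz TV uses motion-estimated interpolation; a plain linear blend between consecutive frames is a much cruder stand-in, but it shows the principle of synthesizing in-between frames:

```python
# Minimal sketch of frame-rate upconversion by blending. Real TVs use
# motion estimation; a linear blend merely illustrates the idea of
# synthesizing an in-between frame from its two neighbors.
def interpolate(frame_a, frame_b, t):
    """Blend two frames (lists of pixel values); t=0 -> a, t=1 -> b."""
    return [a * (1 - t) + b * t for a, b in zip(frame_a, frame_b)]

f0 = [0, 0, 100]    # toy 3-pixel "frames"
f1 = [100, 0, 0]
mid = interpolate(f0, f1, 0.5)
print(mid)  # [50.0, 0.0, 50.0] -- the synthesized halfway frame
```

Linear blending produces ghosting on real footage, which is exactly why TVs estimate motion vectors instead of simply cross-fading.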
You're right, SOME people can. But would you say "most"?
This site is going to be great. A community of know it alls.
This site has always been great. Personally I'm only against disinformation like "those who can must live in a cave."