Author Topic: Frame Rate Perception  (Read 616 times)


AGelbert

Frame Rate Perception
« on: October 10, 2013, 01:57:07 am »
Bees can find the most efficient route between flowers faster than a supercomputer can.

Bees navigate by recognizing patterns and symmetry, and although they have very small brains, they are one of the most efficient species in terms of navigation, scientists have found. In fact, research shows that bees are better than even supercomputers at finding the shortest route between many flowers without visiting the same flower twice.
The problem-solving ability of bees is thought to stem from their need to conserve as much energy as possible while finding food and making their way home.

More about bees:
Honey bees might fly as far as 6 miles (9.7 km) away from their hives to find food.

Other than humans, bees are thought to be the only species that communicates with symbolic language or about things that are not present at the time. Bees use “dances” to communicate with one another.

Bees have three sets of eyes and are able to sense movements that are 1/300th of a second apart, or the equivalent of a single frame of film during a movie.

http://www.wisegeek.com/how-do-bees-navigate.htm



                   



Leges Sine Moribus Vanae
Faith, if it has not works, is dead, being alone.

RE

Re: Bees are Smart
« Reply #1 on: October 10, 2013, 02:03:44 am »
Bees have three sets of eyes and are able to sense movements that are 1/300th of a second apart, or the equivalent of a single frame of film during a movie.

A standard Movie goes at 30 FPS not 300.  So Bees are either an order of magnitude faster, or you dropped an extra Zero in by accident.

RE

Surly1

Re: Bees are Smart
« Reply #2 on: October 10, 2013, 05:44:04 am »
Film has traditionally played at 24 frames per second (fps); it is video which plays out at 30 fps.

When they started showing films on TV the difference in frame rates was a problem. The obvious solution is to simply speed up the film to match the television frame rates, but this, at least in the case of NTSC (the American TV standard), is rather obvious to the eye and ear. This problem is not difficult to fix, however; the solution being to periodically play a selected frame twice. For NTSC, the difference in frame rates can be corrected by showing every fourth frame of film twice, although this does require the sound to be handled separately to avoid "skipping" effects. A more convincing technique is to use "2:3 pulldown", discussed below, which turns every second frame of the film into three fields of video, which results in a much smoother display.

http://en.wikipedia.org/wiki/Telecine
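To make that cadence concrete, here is a minimal Python sketch of the 2:3 pulldown described in the quoted passage (my own illustration; the function and variable names are made up, not from any real library):

# 2:3 pulldown: every group of 4 film frames (24 fps) is spread across
# 10 interlaced video fields (~60 fields/s), i.e. 5 NTSC video frames.
FILM_FPS = 24
CADENCE = [2, 3, 2, 3]  # video fields emitted for each successive film frame

def pulldown_fields(film_frames):
    """Expand a list of film frames into the interlaced field sequence."""
    fields = []
    for i, frame in enumerate(film_frames):
        fields.extend([frame] * CADENCE[i % len(CADENCE)])
    return fields

one_second_of_film = list(range(FILM_FPS))    # film frames 0..23
fields = pulldown_fields(one_second_of_film)  # 60 fields
print(len(fields), "fields ->", len(fields) // 2, "interlaced NTSC frames")
# prints: 60 fields -> 30 interlaced NTSC frames

Because some film frames span three fields and others only two, every second film frame straddles two video frames, which is why the pulldown looks smoother than simply repeating every fourth frame.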




Surly1

Re: Bees are Smart
« Reply #3 on: October 10, 2013, 07:35:45 pm »
And you definitely would not want bees in your telecine.

I would kick me in the ass just for hijacking this thread. This is what you get when you're online at 5AM before coffee is brewed...

HAH!

RE

Re: Bees are Smart
« Reply #4 on: October 11, 2013, 12:19:08 am »
Film has traditionally played at 24 frames per second (fps); it is video which plays out at 30 fps.

I knew about this difference, I just always forget which was which.

RE

AGelbert

Re: Bees are Smart
« Reply #5 on: October 11, 2013, 06:19:14 pm »
RE,
Wisegeek seems to have dropped the ball on the bee response time.  :P

Surly,
Thanks for the info on frame rates in movies. I dimly remembered it had something to do with how many 'frames' human eyes see per second, but I'm certain projector speed and film manufacturing technology limitations affected the final decision on movie frame rates.

I'm going to look into human visual acuity to get the scoop on how well we see.

AGelbert

Re: Bees are Smart
« Reply #6 on: October 11, 2013, 11:30:13 pm »
Surly and RE,
Here's what I dug up on human frame rate acuity. It's kind of long but I find it quite interesting. 



Human Eye Frames Per Second

  02/21/2001 10:30:00 AM MST, Albuquerque, NM
  By Dustin D. Brand; Owner AMO


How many frames per second can our wonderful eyes see?
   

 
    This article is dedicated to a friend of mine, Mike.

   There is a common misconception in human thinking that our eyes can only interpret 30 frames per second. This misconception dates back to the first films, in which a running horse was filmed to prove that at certain points it was resting on a single leg. These early films evolved to run at 24 frames per second, which has been the standard for close to a century.

   There is an explanation for a movie theatre film running at 24 FPS (frames per second). A movie theatre uses a projector, and the film is projected on a large screen, so each frame is shown on the screen all at once. Because human eyes perceive motion blur, and because each frame of a movie is drawn all at once, the motion blur captured in those few frames results in a lifelike perceptual picture. I'll explain the human eye and how it works in detail later in this multi-page article.

   Now, since the first CRT TV was released, televisions have been running at 30 frames per second. TVs in homes today use the standard 60Hz (hertz) refresh rate, which works out to 60/2 = 30 frames per second. A TV works by drawing each horizontal line of resolution piece by piece, using an electron gun to excite the phosphors on the TV screen. Also, because the frame rate is half the refresh rate, transitions between frames are a lot smoother. Without going into detail and making this a 30-page article on advanced physics, I think you'll understand those points.
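As a rough sanity check of that line-by-line drawing (my own arithmetic using the nominal NTSC numbers, not figures from the article), here is a small Python calculation of how fast those scan lines go by:

# Nominal NTSC: 525 scan lines per frame, ~29.97 frames per second.
LINES_PER_FRAME = 525
FRAMES_PER_SECOND = 29.97

lines_per_second = LINES_PER_FRAME * FRAMES_PER_SECOND  # about 15,734 lines/s
microseconds_per_line = 1e6 / lines_per_second          # about 63.6 us per line

print(f"{lines_per_second:.0f} lines/s, {microseconds_per_line:.1f} us per scan line")

That roughly 63-microsecond figure is the period of each horizontal sweep of the electron gun.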

   Moving on now with the frame rate. Motion blur, again, is a very important part of making videos look seamless. With motion blur, those two refreshes per frame give the impression of two frames to our eyes. This makes a really well-encoded DVD look absolutely incredible. Another factor to consider is that neither movies nor videos dip in frame rate when it comes to complex scenes. With no frame rate drops, the action is again seamless.

   Computer Games and their industry driving use of Frames Per Second
   
  It's easy to understand TVs and movies and the technology behind them. Computers are much more complex, the most complex part being the actual physiology/neuro-ethology of the visual system. Computer monitors of a given size are much more expensive than a TV CRT (cathode ray tube). This is because the phosphors and the dot pitch of computer monitors are much smaller and much closer together, making much greater detail and much higher resolutions possible. Your computer monitor also refreshes much more rapidly, and if you look at your monitor through your peripheral vision you can actually watch the lines being drawn on your screen. You can also observe this technology difference by watching TV when a monitor is visible in the background of the shot.

   A frame or scene on a computer is first set up by your video card in a frame buffer. The frame/image is then sent to the RAMDAC (Random Access Memory Digital-to-Analog Converter) for final display on your display device. Liquid crystal displays and FPD plasma displays use a higher-quality, strictly digital representation, so the transfer of information, in this case a scene, is much quicker. After the scene has been sent to the monitor it is perfectly rendered and displayed. One thing is missing, however: the faster you do this, and the more frames you plan on sending to the screen per second, the better your hardware needs to be. Computer programmers and game developers who have been working strictly with computers can't reproduce motion blur in these scenes. Even though 30 frames are displayed per second, the scenes don't look as smooth as on a TV. Well, that is, until we get to more than 30 FPS.

   NVIDIA, a computer video card maker that recently purchased 3dfx, another video card maker, just finished a GPU (Graphics Processing Unit) for the XBOX from Microsoft. Increasing amounts of rendering capability and memory, as well as more transistors and instructions per second, equate to more frames per second in a computer video game or on computer displays in general. There is no motion blur, so the transition from frame to frame is not as smooth as in movies, at least not at 30 FPS. For example, NVIDIA/3dfx put out a demo that runs half the screen at 30 fps and the other half at 60 fps. The results? There is a definite difference between the two halves, with 60 fps looking much better and smoother than 30 fps.

   Even if you could put motion blur into games, it would be a waste. The human eye perceives information continuously; we do not perceive the world through frames. You could say we perceive the external visual world through streams, and only lose it when our eyes blink. In games, implementing motion blur would cause the game to behave erratically; the programming wouldn't be as precise. An example would be playing a game like Unreal Tournament: if motion blur were used, there would be problems calculating the exact position of an object (another player), so it would be really tough to hit anything with your weapon. With motion blur in a game, the object in question would not really exist in any of the places where the "blur" is positioned; that is, the object wouldn't exist at exactly coordinate XYZ. With exact frames, those without blur, each pixel and each object is exactly where it should be in the set space and time.

   The overwhelming solution to more realistic game play, or computer video, has been to push the human eye past the misconception of only being able to perceive 30 FPS. Pushing the human eye past 30 FPS to 60 FPS and even 120 FPS is possible; ask the video card manufacturers, an eye doctor, or a physiologist. We as humans CAN and DO see more than 60 frames a second.

    With computer video cards and computer programming, the actual frame rate can vary. Microsoft came up with a great way to handle this when building one of its games (Flight Simulator): the ability to lock the frame rate.
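The article doesn't say how Flight Simulator's lock actually worked; as a hedged sketch under the simplest assumption, a frame-rate lock is just a pause that pads every frame out to a fixed time budget (the names below are illustrative only):

import time

TARGET_FPS = 30
FRAME_BUDGET = 1.0 / TARGET_FPS  # seconds allotted to each frame

def run_locked(render_one_frame, num_frames=90):
    """Render frames, sleeping so none finishes faster than TARGET_FPS allows."""
    for _ in range(num_frames):
        start = time.perf_counter()
        render_one_frame()
        elapsed = time.perf_counter() - start
        if elapsed < FRAME_BUDGET:
            time.sleep(FRAME_BUDGET - elapsed)  # pad the frame out to the budget

# A stand-in "renderer" that does almost no work still ends up at ~30 fps.
start = time.perf_counter()
run_locked(lambda: None, num_frames=90)
print("effective fps:", round(90 / (time.perf_counter() - start), 1))

Locking this way caps the maximum rate; it cannot raise a frame rate that is already below the target.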

   The Human Eye and its real capabilities - tahDA!
   
    This is where the article gets even longer, but read on, please. I will explain how the human eye can perceive well past the misconceived limit of 30 FPS, well past 60 FPS, even surpassing 200 FPS.

   We humans see light when it is focused onto the retina of the eye by the lens. Light rays are perceived by our eyes as the light enters - well, at the speed of light. I must stress again that we live in an infinite world where information is continuously streamed to us.

Our retinas interpret light in several ways with two types of cells: the rods and the cones. Our rods and cones are responsible for all aspects of receiving the focused light rays on our retinas. In fact, rods and cones are the cells on the surface of the retina, and a lack of them is a leading cause of blindness.

   Calculations such as intensity, color, and position (relative to the cell on the retina) are all forms of information transmitted by our retinas to our optic nerves. The optic nerve in turn sends this data through its pipeline (at the nerve impulse speed), on to the Visual Cortex portion of our Brains where it is interpreted.

   Rods are the simpler of the two cell types, as they really only interpret "dim light." Since rods are light-intensity-specific cells, they respond very fast, and to this day they rival the quickest response time of the fastest computer.

Rods control the amount of neurotransmitter released which is basically the amount of light that is stimulating the rod at that precise moment. Scientific study has proven upon microscopic examination of the retina that there is a much greater concentration of rods along the outer edges. One simple experiment taught to students studying the eye is to go out at night and look at the stars (preferably the Orion constellation) out of your peripheral vision (side view). Pick out a faint star from your periphery and then look at it directly. The star should disappear, and when you again turn and look at it from the periphery, it will pop back into view.

AGelbert note: This is why pilots are trained to look at runway lights peripherally and not fixate when making night landings.

   Cones are the second specialized retinal cell type, and these are much more complex. Cones on our retinas are the counterpart of the RGB inputs that computer monitors and graphics use. Their three basic parts absorb different wavelengths of light and release differing amounts of different neurotransmitters depending on the wavelength and intensity of that light. Think of our cones as RGB computer equivalents: each cone has three receptors that receive red, green, or blue in the wavelength spectrum. Depending on the intensity of each wavelength, each receptor will release varying levels of neurotransmitter on through the optic nerve, and in the case of some colors, no neurotransmitter at all. Because of the cones' three-receptor nature (versus the rod's one), their response time is slower than a rod's.

   Our optic nerves are the visual information highway by which our lens and then our retina, with its specialized cells, transmit the visual data on to our brain's visual cortex for interpretation. This all begins with a nerve impulse in the optic nerve triggered by rhodopsin in the retina, which takes all of a picosecond to occur. A picosecond is one trillionth of a second, so theoretically we could calculate our eyes' "response time" and from it a theoretical frames per second (but I won't even go there now). Keep reading.

   The optic nerves average 2 to 3 centimeters in length, so it's a short trip to reach our visual cortex. OK, so like data on the internet, the data traveling in our optic nerves eventually reaches its destination, in this case the visual cortex - the processor/interpreter.

   Unfortunately, neuroscience only goes so far in understanding exactly how our visual cortex, in such a small place, can produce such amazing images unlike anything a computer can currently create. We only know so much, but scientists have theorized the visual cortex being a sort of filter, and blender, to stream the information into our consciousness. We're bound to learn, in many more years time, just how much we've underestimated our own abilities as humans once again. Ontogeny recapitulates phylogeny (history repeats itself).

   There are many examples that show how the human visual system operates differently from, say, an eagle's. One of these examples involves a snowflake, but let me create a new one.

   You're in an airplane, looking down at all the tiny cars and buildings. You are in a fast-moving object, but distance and speed place you above the objects below. Now, let's pretend that a plane going 100 times as fast quickly flies below you; it was a blur, wasn't it?

   Regardless of any object's speed, it maintains a fixed position in space-time. If the plane that just flew by were only going, say, one time faster than you, you probably would have been able to see it. Since your incredible auto-focus eye had been concentrating on the ground before it flew below, your visual cortex made the decision that it was there but, well, moving really fast, and not as important. A really fast camera with a really fast shutter speed would have been able to capture the plane in full detail. Not to limit our eyes' ability, since we did see the plane, but we didn't isolate the frame; we streamed it relative to the last object we were looking at, the ground, moving slowly below.

 Our eyes, technically, are the most advanced auto-focus system around; they even make cameras look weak. Using the same scenario with an eagle in the passenger seat: the eagle, because its eyes use only rods and the distance to its visual cortex is 1/16 of ours, wouldn't have seen as much blur in the plane. However, from what we understand of the visual cortex and of rods and cones, even eagles can see dizzying, blurry objects at times.

   What is often called motion blur is really how our unique vision handles motion: in a stream, not frame by frame. If our eyes only saw frames (i.e., 30 images a second), like a single-lens reflex camera, we'd see images pop in and out of existence, which would be really annoying and not nearly as advantageous to us in our three-dimensional space and bodies.

   So how can you test how many Frames Per Second we as Humans can see?

   My favorite test to mention to people is simply to look around their environment, then back at their TV or monitor. How much more detail do you see versus your monitor? You see depth, shading, a wider array of colors, and it's all streamed to you. Sure, we're smart enough to take a 24-frame movie and piece it together, and sure we can make sense of video footage filmed in NTSC or PAL, but can you imagine the devices of the future?

   You can also do the more technical and less imaginative tests described above, including the star gazing, plus this TV/monitor test: when a TV running at only 30 FPS picks up a computer monitor in the background of its shot, within that 30 FPS TV output you can see the screen refreshes of the computer monitor running at 60 FPS. This flicker also leads to eyestrain with computer monitors, but that has everything to do with lower refresh rates, not higher ones.

   Don't underestimate your own eyes Buddy...
   We as humans have a very advanced visual system; please understand that a computer, with all its processor strength, still doesn't match our own brain, or the complexity of a single deoxyribonucleic acid strand.

While some animals out there have sharper vision than us humans, something is usually given up for it: for eagles it is color, and for owls it is the inability to move the eye in its socket. With our outstanding human visual system, we can see billions of colors (it has been tested that women see as many as 30% more colors than men do). Our eyes can indeed perceive well over 200 frames per second from a simple little display device (the number is mainly that low because of current hardware, not our own limits).

Our eyes are also highly movable, able to focus as close as an inch or as far as infinity, and able to change focus faster than the most complex and expensive high-speed auto-focus cameras. Our human visual system receives data constantly and is able to decode it nearly instantaneously. With our field of view being 170 degrees and our fine focus nearly 30 degrees, our eyes are still more advanced than even the most advanced visual technology in existence today.

   So what is the answer to how many frames per second we should be looking for? If current science is a clue, it's somewhere in sync with full saturation of our visual cortex, just like in real life. That number, my friend, is well up there with what we know about our eyes and brains.

   It used to be, well, anything over 30 FPS is too much. (Is that why you're here, by chance?) :) Then, for a while, it was anything over 60 is sufficient. After even more new video cards, it became 72 FPS. Now, new monitors and new display types like organic LEDs and FPDs offer to raise the bar even higher. Current LCD monitors' response times are nearing the microsecond barrier, much better than milliseconds, and equating to even more FPS.

   If this old United States Air Force study is any clue, we've only scratched the surface, not only in knowing our FPS limits but in coming up with hardware that can match, or even approach, them.

   The USAF, in testing its pilots for visual response time, used a simple test to see if the pilots could distinguish small changes in light. In the experiment, a picture of an aircraft was flashed on a screen in a dark room for 1/220th of a second. Pilots were consistently able to "see" the afterimage as well as identify the aircraft. This simple and specific result not only demonstrates the ability to perceive one image within 1/220th of a second, but also the ability to interpret higher FPS.
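Taking the study's number at face value (the arithmetic below is mine, not the author's), a 1/220-second flash corresponds to a frame time of about 4.5 milliseconds, i.e. a 220 Hz presentation rate:

flash_duration = 1 / 220           # seconds the aircraft image was on screen
frame_time_ms = flash_duration * 1000
implied_rate = 1 / flash_duration  # frames per second at that frame time

print(f"{frame_time_ms:.2f} ms per image -> {implied_rate:.0f} fps")
# prints: 4.55 ms per image -> 220 fps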

   This article was updated: 7/27/2002 due to its popularity and to reflect in more detail the science involved with our eyes and their ability to interpret more than 60 FPS.

   To Mike (and everyone else), from Dustin D. Brand...
 
Second Part on next reply
 

AGelbert

Re: Bees are Smart
« Reply #7 on: October 11, 2013, 11:31:32 pm »

Human Eye Frames Per Second 2

  05/24/2001 5:00:05 AM MDT, Albuquerque, NM
  By Dustin D. Brand; Owner AMO

So, just how many frames per second can our human eye see past 100?
 
    In my previous article (Human Eye Frames Per Second), I mentioned I'd write another to settle once and for all just how many frames per second our human eye is capable of seeing, so here we are.

   Motion Blur is so important in movies and TV programming

   In my first article, I mentioned how important motion blur is to frames per second. On computers, it is essentially non-existent. Motion blur in movies, which run at 24 frames per second, is designed for the big-screen projector, which blasts each frame to the screen in its entirety, in the widescreen format, one frame at a time. Because of the way each frame is filmed, motion blur is used, meaning the frames are not perfectly clear; they contain blur.

   The blur used in today's movies will eventually be replaced by completely digital movies (on very expensive screens; I should know, I worked with the technology at age 16), and with the advent of computer animation in movies, the process of replacing the blur on film is becoming more and more inevitable.

   Computers don't work this way (with blur, that is), and essentially neither does anything digital. With digital, you either have an exact, perfectly clear image, or an exact, perfectly blurred image as in movies. In the transition from movies to TV or digital DVD, an extra 4 frames are added each second in a method called frame mixing, just to match the device it's being displayed on, your TV. NTSC (American) and PAL (European) are different TV formats, each with different refresh rates and resolutions: 640x480 for NTSC and 800x600 lines for PAL. With HDTV, everything is digital, essentially 60 frames now, but most of these broadcasts use frame mixing, and until 2006 you won't need to trash your regular TV, though it may be a good idea now.

  As many of you know, if you pause a DVD of a filmed movie during movement (or pause a TV broadcast with your VCR, if you can), you'll see the blur (unless the image is static to begin with). Pause an animation DVD, or a cartoon on TV, and you won't see the blur. Why is this so? Filmed movies and filmed TV shows work by blurring their subjects: actors, actresses, whatever. Filmed movies and TV are not taking a PERFECT snapshot image of the subject; each image is a blur blending into the next, giving the impression that everything is moving seamlessly (if nothing is moving in the scene, you see a static image). In an animation or a cartoon, each frame or image of the 24/30 frames per second is perfect; there is no blur in the image - EVER.

   I touched very briefly on auto-focus cameras, and how even the best, most expensive cameras don't come close to matching the focusing capabilities of our human eye. The professional cameras you see reporters with are capable of taking pictures of EXTREMELY fast-moving objects in perfectly still quality at and above 1/4000 of a second. What does a camera being able to take 4000 pictures in a second prove?

   Our infinitely seamless world.

   Professional cameras can take perfectly still pictures without any blur, and, as in the case of video cameras, pictures with blur. So where is the limit? How quickly can we take a picture, and how slowly? Slow, long-exposure pictures have been taken; you've probably seen them at night, where all the cars' tail lights are drawn out into streaks. You've probably also seen the "photo finish" cameras capture the tell-tale winning moment of a close horse race. What all of this really means is that unless we slow time down, or speed it up, there isn't any blur in our world. That is, of course, unless you're drunk, the room is spinning, or you're on some LSD trip. OK, besides that.

   Images in our world are infinitely streamed to us, as I've said before. Living in this third dimension as we do, our eyes enable us to see depth and periphery; we can focus in very close, or as far as infinity. So is there really a limit to how many frames per second we can see with our eyes?

   Our limit, is there one?
   Until someone proves me, and all the scientists, optometrists, and the like, wrong, there is no limit to how many frames per second our human eye can see. A theoretical limit, yes; a proven limit, NO.

   Think for just a second how dumb it would be to push the limit on video displays and devices if our eyes couldn't tell the difference between an HDTV and a plain old TV, or a computer monitor and a plasma display. OK, in that second, how many times do you think your eye "framed" this screen? The number of times the screen refreshed? Nope: the number of times your eye streamed this page to you, a number that is potentially infinite, or at least unknown until we understand the complexity of our own mind. Just know that this number is much, much higher than what your monitor is currently capable of displaying to you, that is, of matching your own interpretation.

   Our brain is smart enough to "exact" 24 frames into motion; isn't it ignorant to say we can't distinguish 400, or even 4000, into motion? Heh, the sky's the limit - oh wait, then space... oh wait. Give us more: we notice the difference from 30 to 60, and the difference from 60 to 120. It is possible that the closer we get to our limit, if there is one, the harder it is to get there, and there is a theory about this. Someone is across the room. Take one full step towards them. Now one half step towards them, then half of a half step, on and on, halving each movement you take. Will you ever get there? That, my friend, is open to debate, but in the meantime, will you take one step towards me?

   The human eye perceiving 220 frames per second has been proven; game developers, video card manufacturers, and monitor manufacturers all admit they've only scratched the surface of frames per second. With a high-quality non-interlaced display (like a plasma or a large LCD FPD) and a nice video card capable of HDTV resolution, you can today see well above 120 FPS with a matching refresh rate. With refresh rates as high as 400Hz on some non-interlaced displays, such a display is capable of 400 FPS on its own. Without the refresh rate in the way, and with hardware (and a frame buffer) capable of such fast rendering, it would be possible to display frames as fast as cameras can record them, up to 44,000 frames per second. Imagine for a moment if your display device were strictly governed by the input it was receiving. This is already the case, in a way, with computer video cards and displays, with their adjustable resolutions, color depths, and refresh rates.

   Test your limit, you tell me...
   Look at your TV, or ANY image device, then look at the device itself rather than the image it is displaying, for example the TV itself or the monitor itself. Tell me the image on the screen is more clear and more precise than your view of the TV or the monitor itself. You can't; that's why the more frames per second, the better, and the closer to reality it appears to us. With 3D holograms right around the corner, the FPS subject (or maybe 3DFPS) will become even more important.

  The real limit is in the viewing device, not our eyes.

   The real limits here are evidenced by the viewing device, not our eyes; we can consistently pick up the flicker to prove that point. In movies, the screen is larger than life and each frame is drawn instantaneously by the projector, but that doesn't mean you can't see the dust or scratches on each frame. With NTSC and PAL/SECAM TVs, each line is drawn piece by piece (odd lines, then even lines) for each frame, refreshing at the stated hertz. The resulting frame rate is exactly the hertz divided by 2 (odd lines on one refresh, then even lines on the next). Do a search for high-speed video cameras and you'll find some capable of 44,000+ frames per second; that should give you a clue.

   CRTs, be they PC monitors or TVs, have to refresh at a rate known as the hertz. Eye fatigue can happen because of the probe or line effect that appears after prolonged viewing; yes, your eye sees this. Switch to your peripheral vision, as in the example I gave in my first article, and you can see the refresh rate. 60Hz and 50Hz also happen to be the mains power frequencies of the countries that use those TV refresh rates. Because of the way the technology works, drawing each line individually, your frame rate is tied to your refresh rate.

If something is running at 60 FPS but your monitor is at 60 hertz and is interlaced, which TVs are locked to, you're seeing 30 frames per second. However, if you have a nice computer monitor (NON-INTERLACED) set to 120 hertz (72+ is considered "flicker free"), and your video is running at 120 frames per second, you're seeing exactly 120 frames per second.

You may have heard that LCDs, or Liquid Crystal Displays, are "flicker free." LCD displays are capable of showing their FPS at their refresh rate, much as non-interlaced monitors are; for example, 75 hertz is capable of 75 frames per second. Technically, because an LCD pixel/transistor is either on or off, the technology is not only better but also faster than an electron gun exciting phosphor as in a CRT, thus virtually eliminating flicker.

   Technically speaking: NTSC has 525 scan lines repeated 29.97 times per second = 33.37 msec/frame or roughly 30 Frames Per Second at 60Hz BECAUSE it's INTERLACED.
   Technically speaking: PAL has 625 scan lines repeated 25 times per second = 40 msec/frame or exactly 25 Frames Per Second at 50Hz BECAUSE it's INTERLACED.

   So how does 60 hertz relate to HDTVs? Well, with progressive scanning (the XBOX supports this with its NVIDIA GPU), each full frame is drawn on each pass, meaning 60Hz supports 60 frames per second. But as you've learned, although the hertz and the FPS are related, the hertz of the display does not necessarily equal the frames per second. Frames per second are determined by the display device and how it draws each frame. Normal TVs don't support progressive scan and thus redraw half the screen on each pass, first the odd lines (interlaced), then the even, for a maximum of 30 frames per second.

   As you've seen, it's not our human eyes, it's the display. More on this lies in the difference between interlaced and non-interlaced monitors. All computer CRT monitors are now made non-interlaced (and have been for quite some time), meaning the entire frame is refreshed at the refresh rate, or hertz. The frame is scanned all at once, so the refresh rate can equal the frames per second, but the frames per second isn't going to exceed the refresh rate, because that isn't possible on the display. Even if a video card is pushing 200 frames per second, your display may be at 100Hz, meaning it's only refreshing 100 times per second.
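Pulling the interlacing and refresh-rate points together, here is a small summary function (my own restatement, not a formula from the article) for the frame rate a viewer actually sees:

def displayed_fps(render_fps, refresh_hz, interlaced):
    """Frames per second that actually reach the viewer, per the rules above."""
    # An interlaced display spends two refreshes (odd + even fields) per frame.
    display_limit = refresh_hz / 2 if interlaced else refresh_hz
    return min(render_fps, display_limit)

print(displayed_fps(render_fps=60, refresh_hz=60, interlaced=True))     # 30.0 (NTSC TV)
print(displayed_fps(render_fps=120, refresh_hz=120, interlaced=False))  # 120 (non-interlaced monitor)
print(displayed_fps(render_fps=200, refresh_hz=100, interlaced=False))  # 100 (card outruns display)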

   Thus, the big misconception that our eyes can only see 30 or 60 frames per second is purely due to the fact that mainstream displays can only show that much, not that our eyes can't see more. For the time being, the frames per second that any display device is capable of isn't even close to the phrase "more than meets the eye."

   Definitions of relevance:


   CRT (Cathode Ray Tube) - the tube (or flat tube) making up a TV, which uses an electron gun to manipulate the phosphors at the front of the tube for varying color.

   NTSC - originally developed in the United States by a committee called the National Television System Committee (525 lines).

   PAL - standing for Phase Alternating Line (625 lines).

   FPS (Frames Per Second) - a frame is an image completely drawn to a viewing device, for example a monitor.

 
 
http://amo.net/NT/02-21-01FPS.html

                         

Surly1

Re: Bees are Smart
« Reply #8 on: October 12, 2013, 06:29:04 am »
FWIW, the author uses a lot of words to repeat himself a lot and say very little.

What I got out of this was that the human eye can "see" up to 1/220 sec. based on USAF testing, yes? (My education taught that military testing was pretty much the gold standard for psych and human endurance testing).

"Motion blur" is a photoshop effect, a category of blur adjustable under the "blur" tool. What the author consistently confuses this with is "continuity of vision" which any child can experiment with using one of those little flip cartoon books. Haven't looked it up but I think Edison actually worked out the frame rate of 24fps. Muybridge's morion studies may also have contributed.

None of this has anything to do with bees, of course.

BTW, LOTS of content in here now, AG. Love what you've done with the place. And RE's additional little tweaks.

Good stuff.

AGelbert

Re: Bees are Smart
« Reply #9 on: October 12, 2013, 11:08:45 pm »
Thanks Surly. I'll keep adding bit by bit.

Agreed about the author being somewhat long-winded. He claims that film (not digital, but the exposure-type film used for movies) actually has blurred frames. Am I correct to assume that he is wrong? His assertion is that the blurring is actually necessary for us to see motion as smoother than a child's flip-page, frame-by-frame collection of still photos.

I saw some of those early stop image animations with puppet soldiers made in the 1940s (out there on u-tube someplace). They take a photo, then move all the puppets and animal figures a tiny bit and then take another picture and so on and so forth. It looks jerky no matter how small the movement.

Doesn't this mean that blur is needed or does it mean we just have to jack up the frame rate to 220/sec? I have been piloting an aircraft when a blur goes by of another aircraft I wasn't focused on. Maybe the blur is just a function of focusing more than speed but it's interesting to think about it.

If they ever figure out how to decode the signals to the brain from the eye, we will get a spectacular camera technology.

I do have a tendency to believe we "stream" rather than shoot a series of still photographs we translate into motion in our brains because: 

1. I remember those strobe lights in the discos several decades ago where each flash shows you a picture of reality but NOBODY, even though they are dancing and jumping around, looks like they are moving!

2. When I look at a physical object versus what is on the screen of a computer or television, the resolution simply does not compare. Reality seems to be a lot more nuanced, pixelated or whatever than a frame by frame series of pictures.

3. Looking through a window is still far better in detail than looking into a digital screen at a movie of looking out a window. Something is still missing (besides 3D).

The only bearing all this has on bees is, well, uh... Give me time, I'll think of something.  ;D

   

Surly1

Re: Bees are Smart
« Reply #10 on: October 13, 2013, 06:45:29 am »
Agreed about the author being somewhat long-winded. He claims that film (not digital, but the exposure-type film used for movies) actually has blurred frames. Am I correct to assume that he is wrong? His assertion is that the blurring is actually necessary for us to see motion as smoother than a child's flip-page, frame-by-frame collection of still photos.
In a sequence of 24 (or 30) frames, it is possible, even likely, that some of the frames will be blurred due to motion. This would be a function of the shutter speed of the camera. While the film camera is recording 24 fps and the video camera 30 fps, the shutter speed (which, along with lens aperture, controls the amount of light reaching the focal plane) can be faster or slower. You wouldn't stop much action at 1/30 second. Sometimes when you ramp up shutter speeds you get some odd staccato-like strobing effects; think of race video, typically shot at high shutter speeds.
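To put rough numbers on that (a sketch of my own, with made-up example values): the smear in any single frame depends on how far the subject moves during the exposure, which is set by the shutter speed rather than by the 24 or 30 fps frame rate:

def blur_pixels(speed_px_per_s, shutter_s):
    """How far, in pixels, a subject smears during one frame's exposure."""
    return speed_px_per_s * shutter_s

# Example: a subject crossing a 1920-pixel-wide frame in 2 seconds moves at 960 px/s.
speed = 1920 / 2
print(blur_pixels(speed, 1 / 30))    # ~32 px of smear: clearly visible motion blur
print(blur_pixels(speed, 1 / 1000))  # ~1 px of smear: the action looks frozen

That is also why race footage shot at very fast shutter speeds can look staccato: each frame is sharp, so the eye gets no blur to bridge the gaps between frames.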

I saw some of those early stop image animations with puppet soldiers made in the 1940s (out there on u-tube someplace). They take a photo, then move all the puppets and animal figures a tiny bit and then take another picture and so on and so forth. It looks jerky no matter how small the movement.

You're talking about stop-action animation there, where a series of stills is played together, flip-book style, to create the illusion of movement.

Doesn't this mean that blur is needed or does it mean we just have to jack up the frame rate to 220/sec? I have been piloting an aircraft when a blur goes by of another aircraft I wasn't focused on. Maybe the blur is just a function of focusing more than speed but it's interesting to think about it.

I do have a tendency to believe we "stream" rather than shoot a series of still photographs we translate into motion in our brains because: 

1. I remember those strobe lights in the discos several decades ago where each flash shows you a picture of reality but NOBODY, even though they are dancing and jumping around, looks like they are moving!

2. When I look at a physical object versus what is on the screen of a computer or television, the resolution simply does not compare. Reality seems to be a lot more nuanced, pixelated or whatever than a frame by frame series of pictures.

3. Looking through a window is still far better in detail than looking into a digital screen at a movie of looking out a window. Something is still missing (besides 3D).   

Part of what you are talking about is resolution. HD image sizes are much larger than SD (standard definition) sizes. At the standard data rate for HD (19.4MB/sec. according to memory) you can fit five SD channels into the space occupied by one HD stream. And no, I am not a video engineer, but my job has required I pick up a certain amount of this arcana.
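For a rough sense of scale (my own figures for standard broadcast frame sizes, and treating the 19.4 number above as the poster's from-memory estimate, presumably megabits per second):

SD_PIXELS = 720 * 480    # NTSC-derived standard-definition frame
HD_PIXELS = 1920 * 1080  # full HD frame

print(HD_PIXELS / SD_PIXELS)  # 6.0: each HD frame carries about six times the pixels
print(19.4 / 5)               # ~3.9: per-channel share if five SD channels fit in one HD stream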

Now the equipment manufacturers are pushing new data-rich lines of image acquisition, 4K and 8K, which besides requiring escalating amounts of memory and computer processing power to handle and manipulate, seem to be nothing so much as a solution shopping for a problem. I hope to be well retired before having to transition another production facility to one of these data monsters. 4K appears to be the developing standard for feature filmmakers using digital acquisition.

For comparison, a chart--


AGelbert


How Does Digital Camera Resolution Compare to the Human Eye?

The resolution of the human eye is estimated to be about 575 megapixels, while the highest digital camera resolution is only about 20 megapixels. Image resolution is generally measured in pixels, or the number of dots it contains, with one million pixels to a megapixel. Therefore, the human eye can theoretically see images with more than 28 times more clarity than the highest-quality digital camera. Scientists believe that human eyes are able to see more pixels simply because there are two eyes to intake more pixels for the brain to process. Also, eyes can quickly move around to view more pixels and send the image signals to the brain.
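Checking the quoted figures (the arithmetic is mine; the 575-megapixel estimate is the article's):

eye_megapixels = 575      # wiseGEEK's estimate for the human eye
camera_megapixels = 20    # "highest digital camera resolution" cited above

print(eye_megapixels / camera_megapixels)  # 28.75, i.e. "more than 28 times"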

More about the human eye:

The human eye is made up of more than two million parts that work together. The eyes also are the most active muscle in the body.

On average, the human eye takes in more than 36,000 images every hour.

When there is a lack of light, the eye starts to see images as monochromatic and less detailed in the center. To overcome this, astronomy photographers often look at stars just off-center, to view them more clearly.




Rods see monochromatically and at an angle better than directly. These are used at night or in low-light conditions.


http://www.wisegeek.com/how-does-digital-camera-resolution-compare-to-the-human-eye.htm

 
