|
Seems like a perfectly logical way to save an image...
="1.0"
<image>
<pixel offset="0">
<byte offset="0">
<bit offset="0">0</bit>
<bit offset="1">1</bit>
<bit offset="2">1</bit>
<bit offset="3">0</bit>
<bit offset="4">1</bit>
<bit offset="5">1</bit>
<bit offset="6">1</bit>
<bit offset="7">0</bit>
</byte>
<byte offset="1">
<bit offset="0">0</bit>
<bit offset="1">1</bit>
<bit offset="2">1</bit>
<bit offset="3">0</bit>
<bit offset="4">1</bit>
<bit offset="5">1</bit>
<bit offset="6">1</bit>
<bit offset="7">0</bit>
</byte>
<byte offset="2">
<bit offset="0">0</bit>
<bit offset="1">1</bit>
<bit offset="2">1</bit>
<bit offset="3">0</bit>
<bit offset="4">1</bit>
<bit offset="5">1</bit>
<bit offset="6">1</bit>
<bit offset="7">0</bit>
</byte>
<byte offset="3">
<bit offset="0">0</bit>
<bit offset="1">1</bit>
<bit offset="2">1</bit>
<bit offset="3">0</bit>
<bit offset="4">1</bit>
<bit offset="5">1</bit>
<bit offset="6">1</bit>
<bit offset="7">0</bit>
</byte>
</pixel>
...
</image>
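For scale, a back-of-the-envelope sketch (Python; the encoder is hypothetical, just mirroring the markup above) of what this format costs for a single 32-bit pixel versus the four raw bytes it encodes:

```python
# Build the <pixel> markup above for one 32-bit pixel (four copies of
# the byte 0x6E = 01101110, as in the post) and compare its size to
# the 4 raw bytes it encodes.

def pixel_to_xml(raw: bytes) -> str:
    lines = ['<pixel offset="0">']
    for b_off, byte in enumerate(raw):
        lines.append(f'<byte offset="{b_off}">')
        for bit_off in range(8):               # MSB first, as in the post
            bit = (byte >> (7 - bit_off)) & 1
            lines.append(f'<bit offset="{bit_off}">{bit}</bit>')
        lines.append('</byte>')
    lines.append('</pixel>')
    return '\n'.join(lines)

xml = pixel_to_xml(bytes([0x6E] * 4))
print(len(xml), "bytes of XML for 4 bytes of pixel data")
```

Several hundred bytes of markup per pixel, before you even add a `<?xml?>` declaration or the `<image>` wrapper.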
Jeremy Falcon
|
Use power of BSON, Luke!
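A minimal sketch of what that buys you, hand-rolling a one-element BSON document per the BSON spec rather than using a library (`bson_binary_doc` is a hypothetical helper): the same four pixel bytes cost about 17 bytes of framing instead of the XML above.

```python
import struct

def bson_binary_doc(name: str, payload: bytes) -> bytes:
    """Encode {name: payload} as a one-element BSON document
    (element type 0x05 = binary, subtype 0x00 = generic)."""
    elem = (b"\x05" + name.encode() + b"\x00"         # type tag + cstring key
            + struct.pack("<i", len(payload))          # binary length
            + b"\x00" + payload)                       # subtype + raw bytes
    body = elem + b"\x00"                              # document terminator
    return struct.pack("<i", len(body) + 4) + body     # total length prefix

pixels = bytes([0x6E, 0x6E, 0x6E, 0x6E])
doc = bson_binary_doc("image", pixels)
print(len(doc), "bytes of BSON for", len(pixels), "bytes of pixel data")
```

Binary data stays binary; the overhead is a fixed-size header per field, not per bit.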
|
I've been researching refresh rate and other display characteristics, thinking another monitor might make a difference.
My take on higher refresh rates (> 60 Hz) is that they make a difference for gamers and people watching videos, but I'm not sure they would make text any more readable, or be less migraine-inducing during long sessions.
Just curious if anyone else with older/diminishing eyesight here has considered this issue.
thanks, Bill
«I want to stay as close to the edge as I can without going over. Out on the edge you see all kinds of things you can't see from the center» Kurt Vonnegut.
|
I think that anything over 100,000 Hz is going a bit far!
veni bibi saltavi
|
It's soooooooooooooooooo refreshing.
|
I'm not going to mention what organ this post suggests may be diminishing in capacity.
«I want to stay as close to the edge as I can without going over. Out on the edge you see all kinds of things you can't see from the center» Kurt Vonnegut.
|
Probably defaulting back to the question: what are you gonna use it for?
For me, I've got pretty bad eyesight buuuut never really had too much of an issue with lower refresh rate screens. They're nice to have to be sure but when I'm working on user input apps which are mostly lots of entry forms or a general intranet which has lots of relatively static content web parts, it's fine.
As you've alluded to, though, when I've been working on graphically based bits of work, the difference is really noticeable. And certainly when I was working on input based on visuals (the last one was a point-on-a-map thing), it needed to be quite high for my purposes, because I was running through so many weird scenarios in a short space of time compared to a regular end user.
Budgetary reasons, though: would that be the thing restricting you? I would personally go with higher, just because then you're covered in both scenarios, whereas a lower one wouldn't.
|
Thanks! Given the state of my eyes (surgery upcoming), and the fact that I think there'll be bargains galore on this year's state-of-the-art monitors after the holidays, I will probably pop for a new monitor with an IPS screen.
«I want to stay as close to the edge as I can without going over. Out on the edge you see all kinds of things you can't see from the center» Kurt Vonnegut.
|
Not so sure about that: the initial move to 70+ Hz refresh rates was (I thought) to get away from beat frequencies with the mains supply causing flicker between the screen refresh and ambient lighting.
Video doesn't need a high refresh rate, as the source is only captured at 24, 48, or occasionally 60 FPS anyway (Avatar was filmed at 24 fps, The Hobbit at 48, for example).
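The beat-frequency arithmetic behind that flicker is simple to sketch. This is a deliberately simplified model: fluorescent lamps flicker at twice the mains frequency, and the visible beat is taken here as just the difference of the two rates.

```python
def beat_hz(rate_a: float, rate_b: float) -> float:
    """Difference frequency between two periodic sources: when two
    rates are close, their interference is visible at |a - b| Hz."""
    return abs(rate_a - rate_b)

# Fluorescent lighting flickers at twice the mains frequency.
lamp = 2 * 50          # 100 Hz under 50 Hz mains
for refresh in (60, 75, 100):
    print(refresh, "Hz refresh vs", lamp, "Hz lamp ->",
          beat_hz(lamp, refresh), "Hz beat")
```

Pushing the refresh rate up moves the beat away from the low frequencies the eye is most sensitive to (and a refresh matching the lamp flicker eliminates it entirely).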
Bad command or file name. Bad, bad command! Sit! Stay! Staaaay...
|
Games achieve higher frame rates, I suppose.
|
Thanks, Griff, the mains here are 220 V, 50 Hz; I hadn't thought about that issue. My nine-year-old 32-inch beast is an LCD at around 720p-whatever; I'm running HDMI out from the video card to HDMI in on the monitor.
«I want to stay as close to the edge as I can without going over. Out on the edge you see all kinds of things you can't see from the center» Kurt Vonnegut.
|
OriginalGriff wrote: Not so sure about that: the initial move to 70+ framerates was (I thought) to get away from beat frequencies with the mains supply causing flicker between the screen refresh and ambient lighting.
I thought the move to 120+ Hz refresh rates was to support 3D rendering? At least with active-shutter technology, the effective frame rate is halved as the display alternates the images for each eye.
|
That makes sense.
Bad command or file name. Bad, bad command! Sit! Stay! Staaaay...
|
AIUI that flicker sync issue only mattered with CRTs. LCDs don't normally go dark between refreshes (some high-frame-rate TV gimmickry notwithstanding).
For a TV, 120 Hz eliminates the judder (alternating 2 or 3 refreshes of display time per frame) from 24 FPS playback, so it has some merit there. 240 lets you do the alternating-blank-frame gimmickry again, but meh. 480 is just one-number-bigger failsauce.
For twitchy gaming higher FPSes are nice (assuming you can afford the GPU to feed them). I've also seen a number of people with 120 or 144 screens claim that mouse cursor movement, window dragging, and document scrolling are noticeably smoother at higher refresh rates. I've never had a high FPS screen to corroborate those claims though. (I've always spent my bandwidth on higher resolutions instead.)
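That judder falls out of simple arithmetic: 60 is not an integer multiple of 24, so source frames must alternate between 2 and 3 refreshes, while 120 divides evenly. A quick sketch (`refreshes_per_frame` is a hypothetical helper):

```python
def refreshes_per_frame(fps: int, refresh_hz: int, frames: int = 8):
    """How many display refreshes each source frame occupies.
    Uneven counts (e.g. 2,3,2,3,...) are visible as judder."""
    return [(i + 1) * refresh_hz // fps - i * refresh_hz // fps
            for i in range(frames)]

print(refreshes_per_frame(24, 60))    # [2, 3, 2, 3, ...] -- the 3:2 cadence
print(refreshes_per_frame(24, 120))   # every frame gets exactly 5 refreshes
print(refreshes_per_frame(30, 60))    # every frame gets exactly 2 refreshes
```

At 120 Hz every 24 fps frame is held for the same duration, so motion cadence is perfectly even.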
Did you ever see history portrayed as an old man with a wise brow and pulseless heart, weighing all things in the balance of reason?
Is not rather the genius of history like an eternal, imploring maiden, full of fire, with a burning heart and flaming soul, humanly warm and humanly beautiful?
--Zachris Topelius
Training a telescope on one’s own belly button will only reveal lint. You like that? You go right on staring at it. I prefer looking at galaxies.
-- Sarah Hoyt
|
Do you have an LCD or a CRT monitor?
Old CRT monitors flicker because of how the picture is drawn on the screen, and therefore give a better picture at higher rates.
If you have an LCD monitor and you're watching a static picture, there's no flickering, because the pixels don't change state.
The refresh rate of an LCD monitor only tells you how many times per second it might change state, which in theory can matter for gamers.
|
Thanks, Jorgen; oh, indeed, it's been years since I used a CRT monitor, so I is in LCD land, with a 32-inch TV about nine years old, with roughly 720p capability and HDMI inputs (it was high-end back then).
cheers, Bill
«I want to stay as close to the edge as I can without going over. Out on the edge you see all kinds of things you can't see from the center» Kurt Vonnegut.
|
I'm pretty sure the refresh rate doesn't have anything to do with your problem in this case.
So the next question is: what type of backlight does your monitor have?
Cold cathode fluorescent (CCFL) tube backlights produce a fair amount of UV radiation, which can indeed be nasty to the eyes.
They are also driven by high-frequency AC, which can indeed cause flicker.
Consider a display with an LED backlight.
|
You're absolutely right.
I would care more about a fast and silent graphics card; I'm treating myself to a GTX 950.
And a high-quality monitor, like the expensive EIZOs or the also-very-good Samsungs. They last a long time, too.
Press F1 for help or google it.
Greetings from Germany
|
Considering persistence of vision, rates of 16 fps are normally OK for most people, and movies run at 24 fps. That, however, all refers to actual flashing images, as in movies. From Wikipedia:
Aside from some configurations used until the early 1990s, computer monitors do not use interlacing. They may sometimes be seen to flicker, often in a brightly lit room and at close viewing distances. The greater flickering in close-up viewing is due to more of the screen being in the viewer's peripheral vision, which has more sensitivity to flickering. Generally, a refresh rate of 85 Hz or above (as found in most modern CRT monitors) is sufficient to minimize flicker in close viewing, and all recent computer monitors are capable of at least that rate.
Flat-panel liquid crystal display (LCD) monitors do not suffer from flicker even if their refresh rate is 60 Hz or lower. This is because an LCD pixel generates a continuous stream of light as long as that part of the image is supposed to be lit (see also ghosting). With each scan, the monitor determines whether a pixel should be light or dark and changes the state of the pixel accordingly. In a CRT, by comparison, each pixel generates a temporary burst of light, then darkens, in each periodic scan. The monitor activates a phosphor on the screen during each scan if the pixel is supposed to be light, but the phosphor fades before the next scan.
I'd hazard a guess that, all else being equal, it would be coincidental frequency beating that causes the headaches. Perhaps that's not even the cause: my Mrs. had a roaring headache after watching Avatar 3D version - didn't bother me - not a frequency problem at all.
"The difference between genius and stupidity is that genius has its limits." - Albert Einstein | "As far as we know, our computer has never had an undetected error." - Weisert | "If you are searching for perfection in others, then you seek disappointment. If you are seek perfection in yourself, then you will find failure." - Balboos HaGadol Mar 2010 |
|
I was recently in the same position (well not the aging eyes part).
I decided to go for the EIZO FORIS FG2421 display (http://gaming.eizo.com/products/foris_fg2421/).
For games (some of them) it makes some difference; for movies I find it makes no difference that I can see (although I haven't watched many on it, since I have a bigger TV connected). Sometimes it even causes a movie to play in short bursts or to drop frames. That could also be down to my graphics card (although it should handle playback with ease), or drivers, or settings, or the quality of the movie; I still need to dig into all the settings on my graphics card.
As I understand it, the human eye can distinguish between 60 and 120 Hz easily, but above that it becomes harder. So the gain from 60 to 120 Hz is a lot bigger than from 120 to 240 Hz.
I do find I prefer looking at that screen (I have two more 60 Hz screens connected, plus the TV), but that might just be because the others are a lot older.
If you want me to test something for you (wouldn't know what) shoot me an email and I'll see what I can do.
Tom
|
BillWoodruff wrote: Just curious if anyone else with older/diminishing eyesight here has considered this issue.
Have you tried glasses?
Marc
|
Thanks, Marc, well, yes, I have glasses, but what's happening to my eyes cannot be helped by those: it takes a laser to do that.
«I want to stay as close to the edge as I can without going over. Out on the edge you see all kinds of things you can't see from the center» Kurt Vonnegut.
|
The move to 120 was for a couple of reasons. First, as someone already mentioned, when viewing 3D content using active shutter glasses it allows the use of 60 fps source material. The other reason is that with Blu-ray players now having flexible frame rates, they can present content at the original frame rate instead of using the abomination that was 3:2 pulldown.
Original frame rates for most US-based content is 24, 30, or 60. What's the least common multiple of those numbers? 120.
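That least-common-multiple claim checks out; a one-liner with Python 3.9+'s `math.lcm`:

```python
import math

# 120 Hz is the lowest refresh rate that divides evenly into
# all three common US source frame rates.
print(math.lcm(24, 30, 60))   # 120
for fps in (24, 30, 60):
    print(fps, "fps ->", 120 // fps, "refreshes per frame")
```

Each source frame rate gets a whole number of refreshes (5, 4, and 2 respectively), so no pulldown cadence is needed.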
|
"60Hz ought to be enough for everyone". Anything more is a gimmick, IMO. What I mean is, if it's "only" 60Hz, it's absolutely fine and wouldn't be a deal breaker to me. The bad CRT flicker days are gone - back then, I could immediately tell when a CRT was running at 60Hz, and I'd get nauseous within a minute.
That said, while 60Hz is fine for an LCD/LED monitor, don't settle for anything less either, especially if it's gonna be used as a computer monitor.
My 4K TV, like a lot of 4K displays hooked up to a computer, is limited to 30Hz. The mouse pointer is visibly more jittery, especially when moving it around the screen quickly. When I grab a window on my 1920x1200 monitor and drag it over to the 4K one, you can see as it crosses over that it renders significantly slower - for example, if I drag a window from the bottom of the screen to the top (and it runs across both monitors), you can immediately see that the 4K one needs a fraction of a second to catch up. Fullscreen video can get noticeably out of sync, but I don't use it to watch video.
Given the price I paid for it however, the extra desktop real-estate is still worth it to me, and I'm not going back. Caveat emptor.
[tl;dr version]: 60Hz for an LCD is fine, even if you were sensitive to it back in the CRT days.
[Edit]: Somebody else's point about 3D is probably valid. I can't say I've experienced 3D on a 60Hz-only TV; not sure if this is a factor for you. And FWIW, I still have better than 20/20 vision after having laser eye surgery back in 2003.
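That cursor jitter is easy to quantify with a rough sketch (the speed and the helper name are illustrative): at a given refresh rate, a fast-moving cursor jumps a fixed number of pixels per frame.

```python
def cursor_step_px(speed_px_per_s: float, refresh_hz: float) -> float:
    """Pixels the cursor jumps between consecutive refreshes."""
    return speed_px_per_s / refresh_hz

# A quick 2000 px/s mouse flick:
for hz in (30, 60, 120):
    print(f"{hz:>3} Hz: {1000 / hz:5.1f} ms/frame, "
          f"{cursor_step_px(2000, hz):5.1f} px between frames")
```

At 30 Hz the cursor leaps roughly 67 px per frame during that flick, versus about 17 px at 120 Hz, which is why the motion reads as jittery.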
|
☭☢☣⚡⛓ [5]
Just say what you see
veni bibi saltavi
|