
"4K" - Why?!

Wrend

Member
I've decided to officially rename the newer "4K" (4000 ≠ 3840) resolution standard to "4x" (4× the pixels of 1080p).
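The arithmetic behind the proposed name is easy to check; a quick sketch (the "4x" label is this poster's coinage, not an industry term):

```python
# Consumer "4K" UHD is 3840x2160 -- not 4000 pixels wide -- but its total
# pixel count is exactly four times that of 1080p, hence "4x".
uhd = 3840 * 2160   # 8,294,400 pixels
fhd = 1920 * 1080   # 2,073,600 pixels

print(uhd / fhd)    # exactly 4.0: one UHD frame holds four Full HD frames
```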
 
I don't see the need to waste computing power on rendering such high pixel count frames on most devices in most circumstances.

I agree, especially with phones; you don't really need more than 1080p.
 
Agreed. I think even 1080p on a phone is overkill. That's based on my use cases, though. I don't watch a lot of media on my phone, so I'd be fine with sub-1080p if that were the case.
 
So that your device can cost more, run slower and have shorter battery life, in return for something you can't tell you have?

If so, I've got this bridge I'd like to sell to you... ;)
 
Things like higher refresh/frame rates, lower latency, color and brightness variation and ranges, and such are more important to me, and seem more visually relevant.

Other users, however, may have different priorities. ;)

Just because a device offers 4K capability doesn't mean that you have to use it, and there's always the choice to simply ignore devices that offer features you have no need for.
 
Sorry, but you're all missing the point: 4K's real reason "why?" is to get you to buy a replacement for your current, perfectly good, HD TV :rolleyes:

Same reason they pushed 3D - hopefully, this'll be equally successful ;)

I guess 4K will have the additional 'advantage' of allowing a few geeks to buy yet another re-mastered set of Star Wars videos :D
 
> Sorry, but you're all missing the point: 4K's real reason "why?" is to get you to buy a replacement for your current, perfectly good, HD TV :rolleyes:
>
> Same reason they pushed 3D - hopefully, this'll be equally successful ;)

Some of us might argue that HD was invented mainly because people weren't replacing their SD TVs as often as the manufacturers wanted ;)

> I guess 4K will have the additional 'advantage' of allowing a few geeks to buy yet another re-mastered set of Star Wars videos :D

In which Han will still let Greedo shoot first...
 
If there's a noticeable difference on, say, a TV or a monitor, then 4K isn't that bad, although I do agree that when you can't see the pixels on a 5" 1080p screen, anything higher is kind of a waste of resources.

I also find it kind of strange that I have as many pixels on my 40" TV as on my 22" monitor, and the same again on my 5" phone.
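That mismatch between panel size and pixel count is really a pixel-density point; a quick sketch, using the diagonal sizes from the post and assuming 16:9 panels:

```python
import math

def ppi(h_px, v_px, diagonal_in):
    """Pixels per inch: diagonal pixel count over diagonal size in inches."""
    return math.hypot(h_px, v_px) / diagonal_in

# The same 1920x1080 grid at three very different physical sizes:
for name, diag in [("40-in TV", 40), ("22-in monitor", 22), ("5-in phone", 5)]:
    print(f'{name}: {ppi(1920, 1080, diag):.0f} PPI')
```

Roughly 55, 100, and 441 PPI respectively: the phone already packs about eight times the density of the TV, which is why extra pixels are hardest to justify there.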
 
These go to 11

;)


Displaying 4K may be unnecessary. Heck, for most people 1080p is overkill. OTOH, shooting 4K video will be a necessity in a year or two, if not already.
 
Ahem... 4x, if you please. ;)

On a large enough screen where there is more of a visually perceivable difference, and the point in getting such a device is to take advantage of the experience the higher resolution offers, then sure. It might not be a priority for me, but in those circumstances, I can of course acknowledge that there is some benefit in getting these kinds of higher resolution devices.


80-inch TVs aside, it's a waste of resources, namely computing potential, electricity/battery charge, and cost, all of which could be put toward other, more relevant features, capabilities, and data crunching.

I have a decent-sized (46-inch viewable) TV/monitor. As mentioned, I use it as the screen for my main computer at home, which I sit about 8 feet away from, and I have to lower the resolution to 1366×768.
 
Early, I agree with you 100%. One other point to consider is that the broadcasters only send out 1080i. That is why many of the home theater receivers advertise up-converters for your cable box signal.
 
> Early, I agree with you 100%. One other point to consider is that the broadcasters only send out 1080i. That is why many of the home theater receivers advertise up-converters for your cable box signal.

Actually, quite a few are in 720p.

http://en.wikipedia.org/wiki/High-definition_television_in_the_United_States

My DirecTV box would either do the conversion or pass the native signal to my TV and let it do it. My TV did a much better job.

But whether in the set top box or the TV, you're absolutely right, the key component here will still be the scaler (the resolution converter). Even today they range from excellent to Mickey Mouse and you have to know what you're getting when you move into the high end.

As for resolution importance -

http://carltonbale.com/1080p-does-matter/

So while resolution matters, you actually get more perceived detail from color and contrast; the number of dots comes in last. If you got the chance to see a perfect-color SD program without compression on a modern, properly adjusted set, you'd swear it had more native dots than the same program in HD with bad color and compression - even though it didn't.

Anyway, if I ever go with large projection, sure, I'll want more dots. I just don't see it happening for me.
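The distance side of that chart can be sketched with the usual 1-arcminute visual-acuity rule of thumb (an assumption, not a hard limit; 16:9 geometry assumed):

```python
import math

def acuity_distance_ft(h_px, diagonal_in, aspect=16 / 9):
    """Viewing distance (feet) beyond which one pixel subtends less than
    one arcminute, i.e. where 20/20 vision stops resolving the pixel grid."""
    width_in = diagonal_in * aspect / math.hypot(aspect, 1)
    pixel_pitch_in = width_in / h_px
    one_arcmin_rad = math.radians(1 / 60)
    return pixel_pitch_in / one_arcmin_rad / 12

# The 46-inch set mentioned above, viewed from about 8 feet:
print(f'{acuity_distance_ft(1920, 46):.1f} ft')  # 1080p limit, ~6 ft
print(f'{acuity_distance_ft(3840, 46):.1f} ft')  # 4K limit, ~3 ft
```

Past roughly 6 feet, a 46-inch 1080p grid is already below the acuity limit, consistent with the posts above: at 8 feet, 4K on that set buys nothing visible.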
 
Continuing on with Early's point on the names, why the hell is 1440p now being called 2K?

Using the same approach as for naming 4K (i.e. roughly 4,000 pixels on the horizontal), technically 1080p is 2K and 1440p is 2.5K.

What's worse is things like 'Super HD', QHD/qHD, 'UHD' etc.
 
The factor that has been missed is that 4x allows you to sit half the distance from the screen that would be required for 1080p. This allows a larger screen in the same setting, giving a larger field of view.
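That trade-off is just small-angle geometry; a sketch, with the screen width an assumed value for a 46-inch 16:9 set:

```python
import math

def horizontal_fov_deg(width_in, distance_in):
    """Horizontal field of view, in degrees, of a screen of the given
    width viewed from the given distance."""
    return math.degrees(2 * math.atan(width_in / (2 * distance_in)))

WIDTH = 40.1  # assumed: width in inches of a 46-inch 16:9 screen

print(f'{horizontal_fov_deg(WIDTH, 96):.0f} deg')  # 1080p seat at 8 ft
print(f'{horizontal_fov_deg(WIDTH, 48):.0f} deg')  # 4K seat at 4 ft
```

Halving the distance takes the picture from about 24 to about 45 degrees of view, nearly doubling it, while 4K's halved pixel pitch keeps each pixel at the same angular size.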
 