
Opinion: Has 4K's time arrived?


February 6, 2014

A wall of 4K TVs at CES 2014



Ultra HD 4K displays were everywhere at CES 2014, with super high resolution displays measuring up to 110 inches in size. But the question remains, do we need displays with such high resolution, given that the human eye isn't likely to be able to tell the difference between 4K and 2K in most viewing environments? And who is making content in 4K anyway?

It's now a month since CES, and after making jokes about the ubiquity of 4K screens at the Las Vegas Convention Center last month, it now seems to me that 4K is an inevitable part of our future. While the hype around 3D TVs from a few years ago didn't really strike a chord with consumers, it seems as though 4K could be different. Here's why:

More is more

First off, there's the truism that specs matter, even when they shouldn't. Consumers who care about video quality will soon be boasting about their 4K displays, and it will become a key part of marketing televisions in the months and years to come. Furthermore, I think that shoppers will fall in line.

Don't believe me? Do you pay attention to how many megapixels your digital camera or smartphone camera can capture? Many people do and it continues to be the key shorthand spec for comparing cameras, even though it's just a small part of determining image quality.

4K is much the same. Surely it must be better than not-4K, right?

Research firm NPD DisplaySearch predicts that two million 4K desktop monitors will ship in 2014, even as the overall desktop monitor market is expected to contract. Keep in mind most desktop monitors are relatively small screens compared to the wall-filling 4K displays seen at CES. On a screen that size, whether you're looking at 4,000 lines of pixels or 2,000 will be practically impossible to determine with the naked eye, and yet they're expected to sell.

If you build it, so will everyone else

Perhaps most important, 4K seems to have achieved the momentum and critical mass necessary for it to gain mainstream adoption. Fearing being left behind, all the major manufacturers have boarded the 4K train as it leaves the station.

As I mentioned, every big tech company at CES made sure to have their take on Ultra HD. A few, like Samsung, even went a step further, offering an 8K display concept.

Samsung has its own 8K display

But these companies aren't just out to show what they can do, they're also looking to offer a real consumer product at a competitive price. Both Polaroid and Vizio showed 4K 50-inch displays that sell for less than US$1,000.

Along the same consumer-friendly lines, Sony has sought to make 4K tech more portable and practical, showing off a 4K short throw projector.

Just this month, Google also announced that its Chromebox mini-PC, meant to capture the lower end of the market with a price tag of only $179, will offer support for high-end 4K displays. Hardware makers see the 4K tide building and don't want to miss catching the Ultra HD wave as it crashes on shores around the world.

If you build it, they will create the content

For many months, the big punchline about 4K was that no one was creating content at such high resolution, making the need for an Ultra HD display moot. The processing power alone needed to edit and render video of such high resolution seemed like a significant enough barrier to torpedo 4K.

That all seems to be changing very quickly this year.

Google has announced a new video codec, VP9, to reduce the bandwidth needed to stream 4K from YouTube or elsewhere. And Netflix CEO Reed Hastings popped up at multiple CES press conferences to tout upcoming 4K content on the streaming service, starting with the second season of House of Cards.

There were also 4K cameras spotted on the sidelines of this year's Super Bowl, even though the big game couldn't yet be broadcast in Ultra HD.

Refusing to be left behind, Amazon and cable TV king Comcast have both announced partnerships with Samsung to bring 4K television content to all the Ultra HD TVs Samsung plans to sell in the US this year.

While many may have thought all those glitzy screens at CES last month were all specs and no substance, it seems as though 4K is more about progress than punchlines. What remains to be seen, however, is if enough consumers agree.

With the obvious exception of early adopters, who may already be reading this on the latest 4K monitor, many people will have purchased a HDTV in the not too distant past. It's unlikely the move to 4K will be enough to convince many of these to upgrade again so soon, but anyone in the market for a new TV will have a tougher choice.

About the Author
Eric Mack has been covering technology and the world since the late 1990s. As well as being a Gizmag regular, he currently contributes to CNET, NPR and other outlets.

4K monitor? Sure. I'd love one. 4K TV? No thanks.

Anne Ominous

I feel the same way as Anne; I can't wait for a higher-res 27" display. 1080p on a monitor larger than 24" just doesn't look good. When sitting this close to a display it needs to be higher resolution.

I will pick up a 4K TV eventually, when my 1080p TV dies and 4K is all I can get. Not before that.


Saw a 4K TV at the local Mart. Expensive but absolutely amazing. Can you see the difference from a 1920? Sure can.

Over-the-air content has to catch up, that's for sure, but these days people also use their TVs to plug into computers and consoles. And DisplayPort 1.2 (which most laptops and video cards have) can output it just fine.

Plus, what most people don't know is that video post-processing power has also scaled with new TVs. I saw a 1920 movie scaled to 4K on a Sony, and unless you paused the video you would think it was native. Sony of course pulled the same trick for scaling 720p up to 1920. Something to keep in mind.

Glad they are taking 4K seriously, because the jump will really be like VHS to Blu-ray.


Saw a 4K at the local shop and it looked amazing for panoramic shots, but the movement of even small boats in the background was totally pixellated and looked truly horrendous. It reminded me of the first HD screens that were rubbish for sports as they couldn't keep up, only 100 times worse. The problem is probably that the difference in quality is now more pronounced, rather than necessarily worse. I am guessing that for sports, or anything with significant movement, waiting is definitely the best policy, though I have only seen one so far, so I could be wrong.


DisplayPort 1.2 isn't a good solution for 4K monitors: to get 60Hz at 4K you have to split the monitor into two virtual displays. I'd rather have a monitor with a resolution between FHD and UHD, at 16:10 and 144 or 120Hz.


So... suppose you could buy a $100,000 Mercedes AMG coupe for the price of a $25,000 Chevy Cruze. It is unlikely that you will ever be driving on professional race tracks and pushing the limits of the AMG, but just the idea of the power, the control and the option makes the deal a no-brainer.

It should also be obvious that the manufacturing costs of 4K are rapidly approaching those of HD. Even if you claim that you cannot see, or don't need, the fourfold increase in pixels, no manufacturer in the 21st century is going to keep making and marketing lower-quality products that cost the same to produce.

Consumer tech is surfing its own less extreme variation of Moore's Law. The advancements are slower, but the rush is no less exhilarating; and only the recalcitrant Luddites will want the better for less to go away.

Robert Walther

" given that the human eye isn't likely to be able to tell the difference between 4K and 2K in most viewing environments?"

1) Unless it's a placebo effect (and my eyesight is not even that good), I was amazed at how 4K resolution looked compared to Blu-ray. 2) I was at the Sony center in NYC and the rep said the current TVs they had on display did not upscale to 4K, only up to 2K, so forget about watching regular HD cable (1080i where I live) as 4K.

Mike T

Actually, the analogy with camera pixel counts is not totally valid. With cameras, higher pixel resolution allows cropping the image or zooming in, depending on what you are trying to do, and still ending up with a usable resolution (provided the lens is good enough, etc.). For viewing, the argument becomes less convincing, as the eye does have limitations. But then again, I remember the argument that you didn't need HD resolution on a 24" TV - you do!

Brian M

" given that the human eye isn't likely to be able to tell the difference between 4K and 2K in most viewing environments?"

I don't know why this rubbish seems to continue to be pushed around. It is absolutely possible to tell the difference between 4k and 2k in most viewing environments. The only people who say this are people who have bad eyesight or who have never seen a 4K tv. It is EASY to notice the difference between 4k and 2k and everyone I know that has seen one agrees that it is significantly noticeable.


I saw a couple of 4K 80 inch screens and I sure could tell the difference. In fact I was amazed because it was so much better than the other HD tvs on display. When you say that people can't tell the difference then I've got to think that they need to clean their glasses or get a new prescription.


Over the past 10 years, slowly increasing pixel density has driven the typical software UI to the point of borderline usability. Most were coded when 1280x1024 was considered monster-size. And the UI fonts are usually not scalable - if you've ever tried to use Windows "scale fonts to 125%" when using a 27" screen, you already know this. We work with flyspeck icons and 4 pt text now.

Vendors have been riding the upgrade gravy train for years, making as few improvements as possible for as long as possible. Savvy users keep old SW versions, knowing there's been little value added to the new ones. All this will change when 4K trickles down.

For me, it won't be too soon.


You don't need 4k in your home. Don't fall for it, it's just the industry trying to bump up their profits for another year by selling you more crap you don't need.

Before you say "I saw a 4K TV in the store, and it looked much better than 2K," ask a few other questions. What was the lighting like? Were they getting the same source? What other factors about the set were different? Retailers' and manufacturers' sales reps use a lot of tricks to point you to the product they want you to buy. Some of them are pretty "questionable".

Ask yourself, where am I going to get 4K source? Unless you're downloading to a server and displaying from there, or taking 1080/2K and scaling it up (which is NOT 4K) there's almost nothing available.

I've worked with hi-def (and beyond) video since HD started. With all other factors being equal, most people can't discern any significant difference between 2K and 4K on a TV-set-sized display. You may be that special snowflake, but...


Oh dear lord. Samsung showed an 8K monitor prototype. Which can only mean 8K is right around the corner. Why buy anything less than that?


I couldn't care less. The same old crap we get force fed every day does not get any better when it comes in a higher pixel count.


"On a screen that size, whether or not you're looking at 4,000 lines of pixels or 2,000 will be practically impossible to determine with the naked eye, and yet they're expected to sell." That's completely untrue. Apple (to give an example) brands its displays "Retina" when their pixel density passes the point at which individual pixels can no longer be distinguished. For larger displays, from laptops to desktop monitors, that point has been considered the 220ppi (pixels per inch) mark. A 31.5" 4K monitor has 140ppi; a 28" 4K monitor has 157ppi.

I wouldn't consider these monitors too big for desktop computing, quite the opposite, yet even smaller 4K monitors would not reach Apple's definition of "Retina". Which, I may as well mention, is 264ppi for tablets and 326ppi for smartphones and smaller tablets. By that last definition, a 4K display would need a diagonal under 13" for the pixels to become indiscernible, or under 20" by the laptop/monitor definition.

So, at some point, to get higher resolution in desktop computing one will have to go bigger in size, and I don't really see anything wrong with that. But considering the monitor sizes most used currently, it won't be 4K that dictates such a trend yet. What it will start by doing, though, is highlighting the need for scalable text and graphical features in operating systems and applications, as they will appear too small otherwise.

David Guerra
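The pixel-density figures in the comment above are easy to verify. A quick sketch (Python, purely illustrative, assuming standard 16:9 3840 × 2160 panels):

```python
import math

def ppi(h_px, v_px, diagonal_in):
    """Pixel density in pixels per inch: diagonal pixels over diagonal inches."""
    return math.hypot(h_px, v_px) / diagonal_in

# The 4K monitor sizes mentioned in the comment:
print(round(ppi(3840, 2160, 31.5)))  # → 140 ppi
print(round(ppi(3840, 2160, 28)))    # → 157 ppi

# Diagonal at which a 4K panel would hit the ~220 ppi
# laptop/monitor "Retina" threshold cited above:
print(round(math.hypot(3840, 2160) / 220, 1))  # → 20.0 inches
```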

As a practical matter, have you noticed that people's faces usually don't look so good in even HD, much less 4K?


4K? Would absolutely love love that. Problem: I have a 1080p TV and even now much, if not the majority, of what I watch is either not really 1080p, or is too highly compressed. Providers trying to compress to fit more stations on their bandwidth. Color banding, block artifacts, complete picture breakdown during rapid camera motion or complex dynamic scenes . . . Cable stations, live tv, DVDs, streaming content . . . about the only content that is up to snuff as far as the 1080p spec goes is BluRay and maybe some high profile broadcast that is not overly compressed.

The problem is not so much that 4K content will not be available, it's that 4K content as delivered will be degraded and not worth the bother. It will not look like the demo reel at the trade show. If they are going to squeeze a 4K picture over existing delivery methods, they are going to have to compress it even more than they do 1080p signals. It's four times the amount of picture data. Does not bode well. I mean, look at 1080p on YouTube. Pretty bad.
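The "four times the picture data" point is simple arithmetic. A back-of-the-envelope sketch (Python, illustrative only, assuming 24-bit color at 30 frames per second, before any compression):

```python
def raw_mbps(width, height, fps=30, bits_per_pixel=24):
    """Uncompressed video data rate in megabits per second."""
    return width * height * bits_per_pixel * fps / 1e6

rate_1080p = raw_mbps(1920, 1080)  # roughly 1493 Mbps uncompressed
rate_4k = raw_mbps(3840, 2160)     # roughly 5972 Mbps uncompressed
print(rate_4k / rate_1080p)        # → 4.0
```

Codecs shrink both numbers enormously, but the 4:1 ratio is why a 4K stream squeezed into a 1080p-sized bandwidth budget has to be compressed roughly four times harder.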

I think if you had a really high quality 1080p tv, and were watching a carefully made 1080p source and it were not overly compressed, it would be stunning and equal to what you are likely to see delivered as "4K" going by the industry's track record.


The statement/claim that "the human eye isn't likely to be able to tell the difference between 4K and 2K in most viewing environments" is misleading, and only accurate if you're viewing two 40" screens side by side (one 4K and one 2K) from a distance of 8 ft. If you are viewing two 80" screens side by side (one 4K and one 2K) from a distance of 5 ft, there is a HUGE difference. It depends on the size of the screen and how close you are to it. I used a 2K 24" monitor at 18" that worked just fine. I bought a 2K 27" monitor to replace it and a week later had to return it for a 4K monitor to find viewing comfort. There IS a point at which 4K makes sense. Ten years ago a 36" screen was considered HUGE. In six to eight years, as cost continues to fall, 4K programming expands and screen sizes continue to increase, when you replace your 50" screen with a 100" screen (at the same viewing distance), 4K will SHINE.
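The commenter's size-and-distance point can be made concrete with the common one-arcminute acuity rule of thumb. A sketch (Python, illustrative only; the function name and defaults are ours, not from the comment) of the farthest distance at which a 1080p panel's pixels are still resolvable; beyond it, a same-sized 4K set adds little visible detail:

```python
import math

def max_useful_distance_ft(diagonal_in, h_px=1920, v_px=1080, acuity_arcmin=1.0):
    """Distance (feet) beyond which one pixel subtends less than the acuity limit."""
    width_in = diagonal_in * h_px / math.hypot(h_px, v_px)  # panel width from diagonal
    pixel_pitch_in = width_in / h_px
    return pixel_pitch_in / math.tan(math.radians(acuity_arcmin / 60.0)) / 12.0

print(round(max_useful_distance_ft(40), 1))  # → 5.2 ft for a 40" 1080p set
print(round(max_useful_distance_ft(80), 1))  # → 10.4 ft for an 80" set
```

This lines up with the comment's scenarios: at 8 ft a 40" 1080p set is already past its resolvable distance, while an 80" set viewed from 5 ft is well inside it.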


Lack of content is an issue, as is the need for a fairly huge screen. Plus, I can't even get 1080p to download reliably over video on demand.

Still, it is happening much faster on the PC side of things than I would ever have guessed.

Jon A.

I have a 2.5K monitor. It was only 280 bucks for a 27"


You can read the same story from a few years back and substitute 3D for 4K. Look how that turned out. Content is virtually gone and few if any broadcast 3D any more. Then add in the net neutrality issues, and figure out how they are going to broadcast these huge data sets, and 4K will die. It's just another marketing ploy to sell TVs.


4K is at our doorstep, just waiting to come in. As for content, there is a lot, but consider that a broadcaster will need a whopping 168 hours per week of eye-popping 4K content to fill up those viewing hours. That doesn't even consider the new 4K smartphones that will be hitting the shelves soon. So who will fill those gaping holes? Independent filmmakers, the people who create content, need affordable cameras to deliver those kinds of numbers. As one of those filmmakers, I have to say that until recently 4K was not affordable. I finally jumped into 4K production when Sony came out with the new PXW-Z100. At $6k it has brought 4K content creation to a realistic number in these money-crunched days, and the quality of the camera rivals models costing ten times as much. YouTube is already delivering 4K content to your UHD TV. Here are a few examples of how good 4K can look; even on HD sets, you really can see a difference. Judge for yourself. There is even some news footage of the recent protests in Bangkok, Thailand. Be sure to change to 2160p in settings.

Treasures of Thailand http://youtu.be/m-pa2k_JncU

Muay Thai Madness http://youtu.be/gyNZFW1g180

Bangkok Protests- The Calm Before The Storm? http://youtu.be/CPUAQb4cIsM


I am much with Anne's opinion on this subject, namely because of the subject of content. 4K content is very rare, and even 1080p is still not the standard when it comes to TV broadcast. But that is completely beside the point. Why should video dictate display needs when there is so much digital content available at resolutions far higher than any current display supports? Why isn't there demand for screens capable of reproducing 20+ megapixel images? Even 4K can't display more than a 6.2MP image pixel for pixel. This is also because digital photographs are usually 4:3 and monitors 16:9 (once again because video formats have taken over monitor screen formats, killing every other aspect ratio for the sake of mass production). Watching 10 or 20 MP images zoomed to 100% on a screen of similar aspect ratio would be an incredible and completely new experience, and I don't understand why manufacturers haven't even tried to sell that idea to the general public.

David Guerra

I viewed a 4K display in the Sony store at a mall, and I was stunned. There is something about having all that visual information available that is interesting to the brain.

Average best corrected visual acuity is actually much better than 20/20 and peaks in the age range of 25-29 at approximately 20/13.5 (See Elliot et al., Optometry and Vision Science 72(3) 1995, pp. 186-191.). 20/13.5 acuity corresponds to 0.675 arc minute resolution. Some eyes are better and some are worse, but this is average. A full HD 1080p display 1920 pixels wide would fill 21.6 degrees horizontally at this resolution. A 4K display with 3840 pixels wide would fill 43.2 degrees wide at this average human visual acuity limit.

The eye comfortably rotates ±15 degrees (Field Guide to Visual and Ophthalmic Optics, SPIE 2004, p. 13). When the eye rotates more than about ±20 degrees, head motion automatically kicks in to reduce the eye rotation requirement (see Exp. Brain Res. (2008) 190:369-387). Therefore, to view a display without turning the head, the display should span no more than approximately 40 degrees. As shown above, a 4K display provides a 43-degree horizontal span at the typical young adult visual acuity limit. The pixels will not be "wasted" if the typical viewer is close enough that the 4K screen spans 43 degrees horizontally.
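The 21.6- and 43.2-degree figures above follow directly from the quoted 0.675-arcminute acuity value. A sketch (Python, illustrative only):

```python
# Horizontal field of view a display fills when each pixel subtends
# exactly the 0.675-arcminute acuity limit (~20/13.5 vision) quoted above.
ARCMIN_PER_PIXEL = 0.675

def span_degrees(h_pixels, arcmin_per_pixel=ARCMIN_PER_PIXEL):
    return h_pixels * arcmin_per_pixel / 60.0  # 60 arcminutes per degree

print(round(span_degrees(1920), 1))  # → 21.6 degrees for a 1080p display
print(round(span_degrees(3840), 1))  # → 43.2 degrees for a 4K display
```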


It will become cheaper; everyone will have it in their homes and on their smartphone cameras, end of story. Cheaper cameras to make the content. The future is complete custom display walls at home, even on the roof, providing continuous custom content and also light in those areas.

Dawar Saify

Sorry to disappoint you folks, but 4k is NOT, in fact, at our doorstep.

The displays might be capable of displaying it. But are they displaying it WELL? And the content, and the rest of the (consumer-grade) hardware, are not yet up to snuff.

First, the displays: are they IPS? What is the contrast ratio? Do they have the -- probably required for a decent moving picture -- absolute maximum response time of 5ms? I could be wrong, but my guess is that the early 4k TVs will have pretty lousy specs.

The monitor-buying crowd tends to be a bit more demanding when it comes to specification. It still takes a fairly expensive video card to run a 4k display WELL.

Fine for early adopters who want to spend $$. But I just don't think that at this time, 4K TV offers enough added value for the added money. Hell, people are still complaining about being able to see the pores in actresses' faces at 1080.

Anne Ominous

The manufacturers got plain greedy with 3D, wanting to charge a huge premium for a minor hardware upgrade. People didn't buy and the content dried up. It didn't help that reviewers were incapable of describing the shutter glasses without using the term 'geeky'. 'Outrageously priced' would have been more accurate and to the point, given that the average family of four would have had to fork out around €500 in addition to the set. (Nowadays the polarised specs sell for a few euros apiece)

This time around they seem to have taken a more consumer friendly approach, and priced accordingly, and sporting broadcasts and films will supply the content without the need for specially trained camera operators and editors.

Where this technology will come into its own is not with massive wall-mounted screens or computer monitors, but with projectors. I have just spent Saturday afternoon watching Ireland thrash Wales in the rugby Six Nations in HD on a projected 8' screen. It was outstanding. Better than being there. One thing more: as the image quality improves, it also takes on a 3D-like quality, so maybe we'll get our 3D by the back door.


The whole "you can't tell the difference anyway" and "don't fall for it" talk reminds me an awful lot of the debate between Full HD and HD Ready TVs.

There's no content for Full HD. You can't tell anyway, on a display that's smaller than 40". They're just trying to push sales.

And in the end, if you bought an HD Ready TV, you had the clearly inferior product. Because Full HD is very visible, even on smaller screens, and it's not just a marketing gimmick. And the content came once there was a user base for it.

As for my content, sure, I watch a lot of 720p things. But the 1080p movies I watch are a lot better-looking.

Frank van Schie

Sorry, I have to laugh at some of the "religious" responses to the potential of 4K. If your eyes and brain can't perceive the difference generated by a 4K stream - you need help, not the hardware and software. If you believe advances in communications are only a plot to delude the public - then you will never [and probably shouldn't] work on a competitive campaign to market new electronics. There are wall-to-wall reviewers out here - and many of them really do know what they're analyzing.

If you have seen a demo and were unimpressed it is likely some Big Box twerp did the setup - not that it's a great deal more complex than a decent setup for 1080p.

Surprised the worry warts didn't bring up the "impossibility" of either OTA or sat feeds of 4K, because the protocols are ready and that hardware/software will soon be in place. Read up on H.265, the replacement for H.264. I expect that capability will be in D14, launching this spring for DirecTV, though I doubt anyone at D* will confirm that. They are as selective as Apple about disclosing future hardware and software.

Most of all, folks - learn a bit about how creative artists and content producers think and work. Folks who create motion pictures, who document sport as live or stored entertainment always want to move on to formats which offer the greatest capacity to capture the real deal. And sell it to you.

If you read the article on a monochrome monitor and might save it to a floppy disc - you'll never get a 4K TV set. The rest of us? We'll leave HD behind as easily as we did SD.

Ed Campbell

You can tell the difference. I was walking past a set in Sam's Club and I stopped to marvel at what a great picture it was. I was thinking, "that's the best picture I've ever seen on a TV." When I looked at the brand and the high price, I saw it was a 4K.


Re: MG127 Tiled or not, 4K at 60Hz displayed on a modern video card is seamless, even in games. Though anything less than a 780Ti is crap for games at 4K these days.

Re: LHS Nice description. Well referenced. I could tell 4k at about 3m on a 60" 4k, so that's about right for what you describe.

Re: Anne Ominous An Nvidia 650 and up, for ~$200, can display 4K, though it's only good for desktop applications, as you can't play any games at decent frame rates.

Two things to keep in mind for the future of content as we know it. 1. 3D is not dead; it just evolved. 120Hz glasses-free is around the corner, and in the not-too-distant future, after OLED has also left its calling card, we will see it in 4K displays. 2. Most movie content to date, even dating back to the early days, was shot on analog film at resolutions equivalent to 4K. Movie theatres project content at almost 4K. The watered-down equivalent flows to Blu-ray for your 1080p viewing pleasure. When 4K monitors at an affordable price flood the market, they will simply "remaster" the original old footage and sample it digitally at the higher res. Content is really where the money is, not in selling people 4K monitors. Either way, happy days :)
