In the market for a new television? Technology moves fast, and it can be hard to keep up. And, with the industry’s fondness for new acronyms and marketing terms, picking out the essential features from the nice-to-have can be difficult.
Do LCD TVs still exist? What happened to plasma screens? And is LED better than OLED or vice versa? These are just some of the questions you may have about the current crop of TV sets. The short answers: yes, LCD TVs are still around; plasma screens became obsolete; and on pure picture quality LED does not beat OLED, though the right choice depends on what matters most to you.
SCREENS
The first thing to decide is what type of TV you want. It’s not just about the size of the screen, but the technology behind it. LED and OLED are common enough terms these days, but you also have QLED displays and Nanocell, not to mention the new technologies coming down the line such as micro LED. So what do they all mean, and which is the best to go for?
LCD vs LED
TVs using LCD, or liquid crystal display, are probably the most common on the market at the moment. The displays use liquid crystals, switching pixels on and off to create different colours and pictures. However, they require a backlight to do this. In the past, this was a fluorescent light that illuminated the entire back panel. That could cause issues with light bleeding from illuminated pixels into those around them, affecting image quality.
LED is a form of LCD display, but instead of fluorescent lamps it uses an array of tiny lights known as light-emitting diodes, or LEDs. These TVs can produce a better picture than standard LCD TVs, as the entire screen no longer has to be illuminated at once; instead, certain zones can be lit up as needed.
To confuse matters, there are technologies such as QLED and Nanocell popping up on new TVs. QLED is predominantly a Samsung technology: essentially an LED TV with a layer of quantum dots added to boost colour and brightness. Nanocell, found on LG TVs, uses nanoparticles to filter out unwanted colours, giving purer reds and greens and a better image.
OLED
Then there is OLED, widely considered the top standard currently available in television displays. Despite sounding similar to LED, it is a very different technology. OLED, or organic light-emitting diode, uses pixels that produce their own light. There is no need for a backlight, which means better colours and contrast, as individual pixels light up rather than a whole zone of the screen.
That image quality comes at a price, though, with OLED TVs commanding a significant premium over LED displays. Prices have fallen in recent years, however, and are likely to continue to do so.
There are already challengers to OLED's title, too. This year's CES exhibition added a few new technologies into the mix: mini LED and micro LED.
Mini LED TVs use smaller LEDs – and more of them – to light up a television’s pixels. Better light control means less light bleed, which means a better overall picture.
Micro LED, meanwhile, is aimed at the extreme premium end of the market – the sort of market that has room for a 100-inch screen in the living room. The technology is closer to OLED, with each pixel lighting itself thanks to millions of microscopic LEDs packed on to the screen. That means excellent light control and colour. But – and here’s the big but – these TVs are incredibly expensive, and they are only available in extremely large screen sizes.
RESOLUTION
Most of the new crop of TVs claim at least 4K resolution these days, offering better image quality and sharper pictures. But just as we were getting used to the jump to 4K and Ultra HD, some TV makers are throwing 8K into the mix. So what is the difference?
4K refers to how many pixels are on a screen, also known as the resolution. In general, the higher the resolution, the better your picture will be.
4K started out as a cinema standard, where it referred to 4,096 pixels horizontally across the screen and 2,160 vertically. However, these days the term is used interchangeably with Ultra HD.
Full HD TVs have 1,920 pixels horizontally across the screen and 1,080 pixels vertically. An Ultra HD or 4K TV doubles that to 3,840 horizontally and 2,160 pixels vertically.
And 8K pushes that again, doubling the pixel count in each direction to 7,680 by 4,320 – four times as many pixels as 4K in total. That adds up to a lot of dots on the screen.
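For the curious, a quick back-of-the-envelope calculation (sketched here in Python, purely for illustration) shows how quickly those dots add up:

```python
# Rough pixel-count arithmetic for common TV resolutions (illustration only).
resolutions = {
    "Full HD": (1920, 1080),
    "4K / Ultra HD": (3840, 2160),
    "8K": (7680, 4320),
}

for name, (width, height) in resolutions.items():
    total = width * height
    print(f"{name}: {width} x {height} = {total:,} pixels")

# Full HD: 1920 x 1080 = 2,073,600 pixels
# 4K / Ultra HD: 3840 x 2160 = 8,294,400 pixels
# 8K: 7680 x 4320 = 33,177,600 pixels
# Doubling the resolution in each direction quadruples the total pixel count.
```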
But it’s not enough just to have an ultra-high-resolution screen; you also need content that can show it off to its best potential. Just because your TV is 4K-enabled doesn’t mean that all the content you watch on it is. For that, you usually need a set-top box that can handle 4K, an Ultra HD Blu-ray player or subscriptions to 4K streaming services. They are fairly standard these days, though.
When it comes to 4K content, there is a decent amount out there right now, such as Ultra HD TV channels provided by Sky; streaming services such as Netflix, which offers 4K Ultra HD content on its Premium subscription; Amazon's Prime Video; and Disney+, which streams films such as Star Wars in 4K Ultra HD.
If you don’t have access to 4K content, the TVs can upscale full HD or standard definition – adding pixels using software to fill in the gaps in the picture – to make it look better on a 4K screen.
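To illustrate the basic idea of upscaling, the sketch below doubles a tiny frame using simple nearest-neighbour duplication. Real TVs use far more sophisticated, often AI-driven processing, so treat this as a minimal sketch of the principle rather than how any particular set works:

```python
# Minimal illustration of upscaling: each original pixel is repeated to fill
# the extra positions on a higher-resolution grid (nearest-neighbour scaling).
# The key point: the new pixels are generated by software, not present in the
# original picture.

def upscale_2x(frame):
    """Double the width and height of a frame (a list of rows of pixel values)."""
    upscaled = []
    for row in frame:
        stretched_row = [pixel for pixel in row for _ in range(2)]  # duplicate horizontally
        upscaled.append(stretched_row)
        upscaled.append(list(stretched_row))  # duplicate vertically
    return upscaled

# A tiny 2x2 "frame" becomes 4x4 after upscaling.
small_frame = [[10, 20],
               [30, 40]]
for row in upscale_2x(small_frame):
    print(row)
# [10, 10, 20, 20]
# [10, 10, 20, 20]
# [30, 30, 40, 40]
# [30, 30, 40, 40]
```

Full HD to 4K is exactly this kind of jump, with twice as many pixels in each direction, which is why upscaling has become a standard feature.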
But should we skip the 4K TV and go straight to 8K? For most people, it probably doesn’t make financial sense. The current crop of 8K TVs suffers from the same drawbacks that 4K screens did in the early days: they are expensive and, given the lack of native 8K content available right now, you will probably get better value from a good-quality 4K TV.
Not only do you need the proper content to take advantage of that super-high-resolution screen, you may also need to upgrade other equipment. The file sizes for 8K content are significantly larger than for 4K, so streaming would require serious bandwidth – when the services become available.
Cables and other home cinema equipment would also need to support HDMI 2.1 to carry the content. Suddenly it’s not just a TV that needs upgrading but your whole home entertainment set-up. Things can get expensive quickly.
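To give a rough sense of the numbers, here is a back-of-the-envelope sketch of uncompressed data rates, assuming 60 frames per second and 10 bits per colour channel. Real streams are heavily compressed, so the absolute figures are illustrative only, but the four-to-one ratio between 8K and 4K holds:

```python
# Back-of-the-envelope, uncompressed video data rates (illustration only).
def uncompressed_gbps(width, height, frames_per_second, bits_per_pixel):
    bits_per_second = width * height * frames_per_second * bits_per_pixel
    return bits_per_second / 1e9  # gigabits per second

# Assuming 60 frames per second and 30 bits per pixel (10 bits each for red, green, blue).
print(f"4K: {uncompressed_gbps(3840, 2160, 60, 30):.1f} Gbit/s")  # roughly 14.9 Gbit/s
print(f"8K: {uncompressed_gbps(7680, 4320, 60, 30):.1f} Gbit/s")  # roughly 59.7 Gbit/s
```

For comparison, HDMI 2.0 tops out at 18 gigabits per second while HDMI 2.1 carries up to 48, which is why 8K signals need both the newer connection and heavy compression.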
Having said that, if you have the budget and are trying to future-proof your TV purchase, there are some good 8K TVs to be found. Just be prepared to see their price fall significantly in the coming years.
HDR
So you’ve decided on the screen technology, but what about things like HDR? HDR stands for “high dynamic range”. While 4K and 8K define how many pixels are on a screen, HDR determines what those pixels show. That means distinct shades of grey where before there would have been murk, and detail picked out of shadows even when other areas of the screen are brightly lit. Colours can be more vivid, too.
Not all content is HDR compatible, but when you get the right content on your HDR screen, the picture quality will be higher than with standard dynamic range footage.
The problem is that, again, there are different standards. HDR10 is an open standard: it lays down a technical baseline that both content and displays must meet, and it is free for manufacturers and content producers to use. It is supported by all HDR TVs and HDR content.
Where things get murky is with standards such as HDR10+, which was developed by Samsung, and Dolby Vision. TV makers will usually support one format or the other, but rarely both.
HDR10+ offers higher brightness than HDR10 and includes colour and brightness information for individual scenes, unlike the basic open standard, which applies one setting across an entire film.
Dolby Vision does something similar, with frame-by-frame information for colour and brightness. It allows for higher peak brightness, better black levels and richer colours, and it is more widely supported than HDR10+.
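One practical difference behind those formats is bit depth: HDR10 and HDR10+ work with 10 bits per colour channel, Dolby Vision allows up to 12, and standard dynamic range uses 8. A quick sketch shows what that means in terms of gradations (the figures are simple powers of two, for illustration only):

```python
# How many gradations each bit depth allows per colour channel (illustration only).
for label, bits in [("8-bit (standard dynamic range)", 8),
                    ("10-bit (HDR10 / HDR10+)", 10),
                    ("12-bit (Dolby Vision, up to)", 12)]:
    levels = 2 ** bits
    print(f"{label}: {levels:,} shades per channel, "
          f"{levels ** 3:,} possible colours")

# 8-bit:  256 shades per channel,   16,777,216 possible colours
# 10-bit: 1,024 shades per channel, 1,073,741,824 possible colours
# 12-bit: 4,096 shades per channel, 68,719,476,736 possible colours
```

More gradations mean smoother transitions between shades, which is why HDR footage shows less of the visible banding you sometimes see in skies and shadows.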
WHAT ABOUT SMART TVS?
Most of the TVs sold these days have some sort of smart TV interface to deliver apps and streaming services to customers, whether it is the manufacturer’s own system – as LG, Samsung and others offer – or Google's Google TV (formerly Android TV).
It can be a handy way to cut down on the number of devices you need. If your TV isn’t due an upgrade just yet but you want to add a smart TV platform, there are plenty of choices.
Apple and Google both offer streaming devices, with Apple TV and Chromecast available for a lot less than a new TV. Chromecast comes in a few different versions: the cheaper streaming device uses your smartphone for content, casting Netflix, Disney+ and other services to the screen from your device, while the newer Chromecast with Google TV is a standalone device that you can add apps to, and it even comes with its own remote control.
Other options are Amazon’s Fire TV Stick, Now TV’s streaming stick and Roku.