Direct-view Micro LED displays are a relatively new display technology that has so far been publicly demonstrated only by Samsung and Sony, both of which tend to experiment with a variety of technologies. At IFA last week, TCL, a major maker of televisions, threw its hat into the ring by demonstrating its own ultra-large Micro LED-based Ultra-HD TV.

Dubbed the Cinema Wall 132-Inch 4K, TCL’s Micro LED television uses roughly 24,000,000 individually controlled LEDs as RGB subpixels, and features a peak brightness of 1,500 nits as well as a 2,500,000:1 contrast ratio (good enough to compete against OLEDs). The manufacturer claims that the TV can display a wide color gamut, but does not disclose whether that means DCI-P3 or BT.2020 coverage.
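To put those specs in perspective, here is a quick back-of-the-envelope sketch (ours, not TCL's) deriving the subpixel count and pixel pitch implied by a 3840 × 2160 panel at a 132-inch diagonal. The 16:9 aspect ratio and the one-LED-per-subpixel arrangement are assumptions inferred from the figures quoted above.

```python
import math

# Stated specs: 3840 x 2160 (Ultra HD) panel, 132-inch diagonal, one LED per RGB subpixel
h_px, v_px = 3840, 2160
diagonal_in = 132.0

pixels = h_px * v_px        # 8,294,400 pixels
subpixels = pixels * 3      # 24,883,200 LEDs -- the ~24,000,000 figure quoted above

# Geometry of a 16:9 panel with that diagonal (assumed aspect ratio)
width_in = diagonal_in * 16 / math.hypot(16, 9)   # ~115 inches wide
pitch_mm = width_in * 25.4 / h_px                 # ~0.76 mm per pixel
ppi = h_px / width_in                             # ~33 pixels per inch

print(f"{subpixels:,} LEDs, {pitch_mm:.2f} mm pixel pitch, {ppi:.0f} ppi")
```

A pitch of roughly 0.76 mm is closer to fine-pitch signage than to a desktop monitor, which is one reason Micro LED has shown up first at wall-filling sizes.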

As with other early-generation display products, TCL is not revealing if and when it plans to release its 132-inch 4K Micro LED TV commercially, but the fact that it has a device good enough to be shown in public (see the video from the Quantum OLED channel) is an important step. Like other makers of Micro LED televisions, TCL will likely want to increase the peak brightness of these devices, as many modern titles are post-produced on Dolby’s Pulsar reference monitor for Dolby Vision HDR, which has a peak brightness of 4,000 nits.
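For a sense of how far 1,500 nits falls short of a 4,000-nit grading monitor, the sketch below evaluates the SMPTE ST 2084 (PQ) inverse EOTF used by Dolby Vision and HDR10 at both luminance levels. The constants come from the published standard; treating the result as a "share of the signal range" is just an illustrative framing, not a statement about how TCL's set tone-maps.

```python
# SMPTE ST 2084 (PQ) inverse EOTF: absolute luminance -> normalized signal level
M1 = 2610 / 16384          # 0.1593017578125
M2 = 2523 / 4096 * 128     # 78.84375
C1 = 3424 / 4096           # 0.8359375
C2 = 2413 / 4096 * 32      # 18.8515625
C3 = 2392 / 4096 * 32      # 18.6875

def pq_signal(nits: float) -> float:
    """Normalized PQ signal level (0..1) for a given absolute luminance."""
    y = nits / 10000.0                 # PQ is defined up to 10,000 nits
    yp = y ** M1
    return ((C1 + C2 * yp) / (1 + C3 * yp)) ** M2

for peak in (1500, 4000, 10000):
    print(f"{peak:>5} nits -> PQ level {pq_signal(peak):.3f}")

# Roughly: 1,500 nits sits near 0.80 of the PQ range and 4,000 nits near 0.90,
# so highlights graded between those levels get tone-mapped down on a 1,500-nit set.
```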

Numerous TV makers are currently investigating Micro LED technology as a viable alternative to OLED-based screens. While OLEDs tend to offer a superior contrast ratio compared to LCDs, they come with a number of trade-offs, including off-axis color shifting, ghosting, and burn-in. WOLED has mitigated some of these issues, but it has also introduced others due to the inherent limitations of using color filters.

By contrast, Micro LED TVs are expected to be free of such drawbacks while retaining the advantages of self-emissive LEDs: high brightness, strong contrast, fast response times, and wide viewing angles. As an added bonus, Micro LED TVs will not need any bezels and can be made very thin.

Sources: Quantum OLED, MicroLED.info, LEDs Inside

74 Comments


  • eek2121 - Saturday, September 14, 2019 - link

    27" 4k Master Race here. IMO if I buy a 75" TV it better be 8K or I'm not buying it. I don't know why so many people are blind. I wear glasses and I can tell side by side 4k content and non 4k content, also 4k TVs/Monitors and non-4k TVs/Monitors apart pretty easily, even as low as 24".
  • rrinker - Monday, September 16, 2019 - link

    There is a HUGE difference between lower res content upscaled and the same lower res shown on a display with that native resolution. The upscaling is almost ALWAYS worse and noticeable, but just because you can see the difference between 4K native content and 1080P content upscaled to 4K doesn't mean the same size display with a native 1080P resolution would look anything like that upscaled content. And as small as 24"? I call BS, try putting yourself through a double blind study and see if you can really tell which 24" display is 4K and which is lower. (A worked viewing-distance sketch follows the comment thread below.)
  • mode_13h - Wednesday, September 18, 2019 - link

    Depends on what kind of content you're talking about. There's a good case for upscaling full-bandwidth content, because pixels do a poor job of reconstructing a properly band-limited signal.

    https://en.wikipedia.org/wiki/Nyquist%E2%80%93Shan...
  • twtech - Sunday, September 15, 2019 - link

    While I think 8K might find a use at that size, I also think it would be perfectly fine at 4K - just like 1080p is fine at 80".
  • mckirkus - Monday, September 16, 2019 - link

    "and most certainly at 132", 8K is required"

    Guys, you realize that most movies at the movie theater are 2K, and most CGI is rendered at 2K. The Super Bowl was broadcast at 720p last year (0.9 megapixels), and I watched it on a large screen. To suggest that you need 35X more resolution to enjoy watching it is simply ridiculous.

    4K is fine at this screen size. Stop comparing pixel density with phones.
  • 0ldman79 - Tuesday, September 17, 2019 - link

    Thank you.

    It just gets on my nerves, people talking about "4K @ 65 looks like shit" or whatever...

    For one, get better content. I watch 480P scaled up to 1080P, play @ 4K scaled down to 1080P and everything in between.

    Today I'm in glasses, my vision is only 20/20. Previously I wore contacts, my vision was 20/10 or better. Even with 20/10 the difference between 720P and 1080P on my 46 inch 1080P 6ns TV is minimal, depends on the content.
    Farcry 5 has jagged edges regardless of the resolution.

    Fallout 4 barely has any jagged edges regardless of the resolution.
  • 0ldman79 - Tuesday, September 17, 2019 - link

    Honestly, a lot of games look better @ 4K render, 1080P display or 720P with massive AA than they do at 1080P. People ignore that though because they have to stroke their ego and justify newer, faster hardware for points on the Internet. Stargate SG1 looks just as good @ 480P as it does at 720P and you can't see an improvement at 1080P until season 9 or so, and that's mostly in CGI scenes...
  • 0ldman79 - Tuesday, September 17, 2019 - link

    If I'm so disturbed by a less than perfect 4K picture then either my game or my show sucks and I'd much rather enjoy the show at 720P than count pixels at 8K.
  • 0ldman79 - Tuesday, September 17, 2019 - link

    broken into multiple messages because the damn forum censor decided my post was spam and wouldn't take it all in one comment.

    That was frustrating as hell...
  • mode_13h - Wednesday, September 18, 2019 - link

    > Guys, you realize that most movies at the movie theater are 2k. And most CGI is rendered at 2k.

    Perhaps a couple decades ago, but certainly not today. Related:

    https://4kmedia.org/real-or-fake-4k/

    Granted, not every disc that was mastered in 2K for its home video release was necessarily finished in 2K for digital cinema distribution.
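On the resolution debate running through the comments above: whether 4K can be told apart from 1080p at a given screen size is largely a question of angular resolution. Below is a minimal sketch assuming a 16:9 panel and the common rule of thumb that 20/20 vision resolves detail down to about one arcminute; both assumptions are illustrative, not a claim about any commenter's eyesight or setup.

```python
import math

ARCMIN = math.radians(1 / 60)   # ~1 arcminute: rough acuity limit for 20/20 vision

def blend_distance_ft(diagonal_in: float, horizontal_px: int) -> float:
    """Distance (feet) beyond which adjacent pixels subtend less than 1 arcminute."""
    width_in = diagonal_in * 16 / math.hypot(16, 9)   # assumes a 16:9 panel
    pitch_in = width_in / horizontal_px
    return pitch_in / math.tan(ARCMIN) / 12

for size, res, label in [(24, 3840, '24" 4K'), (24, 1920, '24" 1080p'),
                         (132, 3840, '132" 4K'), (132, 7680, '132" 8K')]:
    print(f'{label:>9}: pixels blend beyond ~{blend_distance_ft(size, res):.1f} ft')
```

By that rule of thumb, a 24-inch 4K monitor only out-resolves a 1080p one from closer than about three feet, while a 132-inch 4K set holds up to roughly eight or nine feet of viewing distance, which is broadly where both sides of the argument above land.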
