Choosing the right TV is never easy. With so many factors to consider and so many options to choose from, it’s hard to know where to begin and what to look for. To help make your TV-shopping experience a little less intimidating and (hopefully) a lot more enjoyable, we’ve compiled a list of all the important questions you need to ask yourself before making your next TV purchase.
What Size TV Should You Get?
When it comes to television size, it’s hard to argue with the “bigger is better” philosophy. After all, a bigger screen means a bigger picture, a greater sense of immersion, and so on. However, despite your relatable and charmingly capitalistic desire for more, a bigger TV might not be better.
So, how do you determine what size television to get? First, take a look around your living room (or wherever you plan on installing your new TV). How big is your space? More importantly, what’s the distance from where you’ll be watching to your new TV? That’s your viewing distance, and it’s a good indicator for how big a screen you should get.
According to the Society of Motion Picture and Television Engineers (SMPTE), the optimal viewing distance should result in the TV screen filling 30 to 40° of your field of view (FOV). In terms of a formula, that comes out to around 7" of screen size for every foot of viewing distance for a 30° FOV and 10" of screen size for every foot of viewing distance for a 40° FOV. Don’t worry, we’ll break it down in the chart below.
| Screen Size | Viewing Distance (30° FOV) | Viewing Distance (40° FOV) |
|---|---|---|
| 50" | 7.1' (2.16 m) | 5' (1.53 m) |
| 55" | 7.9' (2.41 m) | 5.5' (1.68 m) |
| 60" | 8.6' (2.62 m) | 6' (1.83 m) |
| 65" | 9.3' (2.83 m) | 6.5' (1.98 m) |
| 70" | 10' (3.04 m) | 7' (2.13 m) |
| 75" | 10.7' (3.26 m) | 7.5' (2.29 m) |
| 80" | 11.4' (3.47 m) | 8' (2.44 m) |
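If you'd rather compute the guideline than read it off the chart, the rule of thumb above boils down to simple multiplication. Here's a minimal sketch in Python; the 7-inches-per-foot and 10-inches-per-foot factors come straight from the SMPTE-based guideline described above, and everything else is illustrative:

```python
def recommended_screen_sizes(viewing_distance_ft: float) -> tuple[float, float]:
    """Return a (low, high) screen-size range in inches for a given
    viewing distance in feet, per the 30-40 degree FOV rule of thumb."""
    size_at_30_fov = viewing_distance_ft * 7   # ~7" of screen per foot (30 deg FOV)
    size_at_40_fov = viewing_distance_ft * 10  # ~10" of screen per foot (40 deg FOV)
    return size_at_30_fov, size_at_40_fov

# Example: a couch about 7 feet from the screen
low, high = recommended_screen_sizes(7.0)
print(f'Suggested size range: {low:.0f}" to {high:.0f}"')  # 49" to 70"
```

Run it with your own measurement to get a ballpark range, then adjust for budget and the other factors below.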
The 30 to 40° guideline is a good jumping-off point, but it's by no means written in stone. Treat it as a rule of thumb, one that should get you in the ballpark of the right size for your setup. To refine your search further, you'll also want to consider additional factors, including panel type and screen resolution, both of which we'll go over.
Finally, when shopping for a screen size, do keep in mind that bigger screens tend to cost bigger bucks. Not only should you make sure there's enough room in your viewing area for that glorious new 98" Sony BRAVIA, make sure there's space in your budget, too.
Which Display Technology Should You Get?
Of all the many factors to consider when shopping for a new TV, none is more confusing and jargon-laden than display technology. LED, OLED, Micro-LED? What are all these acronyms? What the heck do any of these letters even mean? Well, for one, they can mean the difference between spending hundreds of dollars and spending thousands, so there's that.
But there’s plenty more at play than just the impact on your wallet. Display technology affects every important aspect of your future screen, including color, contrast, brightness levels, and more. Let’s take a look at the display technologies you’re most likely to run into, so you can decide which one is right for you.
LED: When we say LED television, what we’re talking about is any TV that uses an LCD panel with an LED backlight. LED is the most common type of display technology, and LED TVs are the most prevalent. The catalog of available LED screens is extensive, covering a wide spectrum of size, cost, image quality, and features.
Often, what separates good LEDs from great LEDs is their backlighting technology (local dimming, edge lighting, etc.). However, in general, the pros of LED televisions include the vast selection of available sets, affordability, and superior brightness (LEDs are more visible in sunny rooms and direct sunlight than non-LED sets).
The downside to LEDs is that, despite the continued development of backlighting technology and improvements to overall image quality, they can't compete with the picture quality of QLEDs.
QLED: QLED (or quantum dot LED) TVs are LED TVs that use an additional quantum dot filter to enhance the screen. They’re still considered LEDs, meaning they still use an LCD panel and a separate LED backlight to illuminate the display. However, the addition of quantum dot technology means better color saturation, contrast, and brightness than conventional, non-quantum LEDs.
The advantages of QLED technology include the aforementioned boosts to color and contrast, which, in turn, can lead to better image quality and HDR (High Dynamic Range) performance. Also, because QLEDs are really just supercharged LEDs, they put out more light than non-LEDs, making them better suited for sunny or well-lit rooms.
The downsides to QLED display technology aren’t many. They still can’t match the overall picture quality of an OLED, but with better contrast and comparable color performance (depending on the set), QLEDs have narrowed the gap.
QLEDs do cost more than most conventional, non-quantum LEDs, but on the other hand, they’re still cheaper than most OLEDs, so you can think of them as a happy medium in terms of affordability.
Mini-LED: To better understand mini-LED display technology, we first need to talk about local dimming. Local dimming is a common backlighting method used in many conventional LED televisions. Unlike basic direct-lit LEDs, which illuminate the whole screen uniformly, local dimming sets divide the backlight into multiple LED zones that illuminate specific parts of the screen.
The benefits of local dimming over uniform backlighting include better contrast, reduced light bleed, and less haloing. Local dimming also enables meaningful HDR performance, something uniformly backlit displays struggle to deliver.
Mini-LED display technology is like local dimming, but much more precise. Mini-LED TVs use thousands of miniaturized LEDs that are much smaller than conventional LEDs. Because they’re so small, thousands more of these mini-LEDs can be packed onto the backlight panel, enabling much smaller, more finely tuned lighting zones.
Advantages of mini-LED display technology include higher brightness levels, better overall contrast, and improved HDR. Also, because they use smaller-sized LEDs, mini-LED televisions are often thinner and more aesthetically pleasing than their non-mini counterparts.
The biggest downside of mini-LED technology is that it’s still not OLED, meaning you won’t get the absolute best picture quality possible.
OLED: Unlike most LED televisions, OLED TVs don’t have a separate backlight layer. Instead, OLEDs use millions of self-emissive pixels that act as their own individual backlight. Because each pixel lights (or doesn’t light) itself, OLED TVs offer best-in-class contrast, color performance, and the deepest black levels you’ve ever seen. In terms of overall picture quality, OLED screens can’t be beat.
OLED TVs aren’t perfect, however. One notable drawback is that OLEDs don’t get very bright (compared to LED screens). You’ll want to pay careful attention to the brightness levels if you plan on viewing your new TV in a space that gets a lot of direct sunlight—an OLED screen might not be the most ideal choice.
The other downside of OLED display technology is that it's more expensive. In general, OLED TVs cost more than LED TVs. But, as the saying goes, you get what you pay for. OLED TVs offer premium screens, so you pay a premium price.
Micro-LED: The first thing you need to know about micro-LED technology is that it's prohibitively expensive. Currently, the average cost of a micro-LED television is around one hundred grand.
Why is it so expensive? In short, because the production process is so difficult. Like OLEDs, micro-LED screens use self-emitting pixels to produce images. But instead of organic material, micro-LED screens use millions of microscopic LEDs. Producing and placing all of those microscopic lights onto a display is not easy, nor is it cheap.
OK, so what do you get for a TV that costs more than cloning a horse? Well, in terms of color quality, contrast, HDR performance, and brightness, you basically get the best screen ever made. Micro-LED technology wins all of those categories, hands down.
As far as other important features like sharpness, resolution, and black levels go, it's pretty much neck-and-neck between micro-LEDs and OLEDs.
Which brings us to size. Not only is the average price of a micro-LED television more than the GDP per capita of Norway, but the average size is equally immense. Currently, the only micro-LED television available at B&H is the Samsung MS1A. It measures 110", which is about as small as micro-LED displays currently get.
Why so big? Again, because of production challenges. Packing millions of microscopic LEDs onto a 100" screen is hard enough. Getting them onto a screen that’s half the size isn’t feasible or commercially viable.
Right now, micro-LED technology isn’t really an option for most of us. It’s too expensive. The size options are limited and way too big. Even if those aren’t prohibitive factors, you should probably still wait to buy one.
Micro-LED display technology is still very new, but it’s also very promising, which means it’s going to develop rapidly. Prices and size restrictions will come down, features and screen quality will go up. So for now, just hold off.
What TV Resolution Should You Get?
Apart from screen size, no feature gets more attention (or hype) than screen resolution. Screen resolution indicates the total number of pixels that make up your display, expressed in terms of width and height (e.g., 3840 x 2160). More pixels means more details and a sharper picture, so, in theory, the higher the resolution, the better the picture quality.
One thing to keep in mind is that screen resolutions are often called by more than one name. For example, 1920 x 1080 is also known as 1080p, Full HD, or FHD. 3840 x 2160 is more commonly referred to as 4K, Ultra HD, or UHD.
When shopping for a television, the resolutions you’re most likely to run into are 1080p, 4K, and 8K. Most current TVs come in 4K, which is the current resolution standard. For years, the resolution standard was 1080p, which is far less common these days. Similarly, 8K televisions are pretty rare. They’re out there, but not in force.
Odds are, you’re going to wind up with a 4K TV, and that’s a good thing. 4K TVs offer four times the number of pixels as 1080p screens, so the overall picture quality is usually much better. There’s also a ton of 4K content available for you to enjoy, including on-demand content from streaming services like Netflix and Amazon Prime Video. Live TV hasn’t adopted 4K yet (nor will it anytime soon), but mainstream TV companies like Dish Network and DIRECTV do have 4K programming and movies available to stream.
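That "four times the pixels" claim is easy to verify with a little arithmetic on the resolutions listed earlier:

```python
# Pixel counts for the common TV resolutions (width x height).
resolutions = {
    "1080p / Full HD": (1920, 1080),
    "4K / Ultra HD": (3840, 2160),
    "8K": (7680, 4320),
}

for name, (w, h) in resolutions.items():
    print(f"{name}: {w * h:,} pixels")

# 4K doubles both dimensions of 1080p, so it has 2 x 2 = 4x the pixels.
assert 3840 * 2160 == 4 * (1920 * 1080)   # 8,294,400 vs. 2,073,600
# 8K, in turn, has 4x the pixels of 4K (16x those of 1080p).
assert 7680 * 4320 == 16 * (1920 * 1080)  # 33,177,600
```

That 16x figure for 8K sounds impressive on paper, which is exactly why the next question comes up so often.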
But what about 8K? Isn’t it better?
Well, despite what you might’ve heard, 8K television really isn’t a thing right now. Yes, 8K TVs exist. And yes, 8K screens should theoretically look better than their 4K counterparts. But there are problems.
The first problem is content. Right now, there really isn’t any 8K content available―kinda hard to enjoy your screen if there’s nothing to watch. Second, while the number of pixels in an 8K screen is far greater than 4K, depending on the size of your TV and the viewing distance, the difference might be subtle if not totally imperceptible.
Bottom line: Get a 4K TV. There’s no reason for you to get an 8K screen. Maybe in a year or two, if prices come down and 8K content starts to become available. Until then, 4K all the way.
Which Ports Do You Need?
Conventional wisdom says a quality big screen should have at least three HDMI inputs. Of those inputs, at least one should be HDMI 2.1 (although the more HDMI 2.1 ports you can get, the better).
In case you’re wondering: HDMI stands for High-Definition Multimedia Interface. It’s the current connection standard for modern TVs. Game console, media player, soundbar, you name it—if it connects to your television, odds are it does so via HDMI.
Now, you see those numbers that follow HDMI (2.0, 2.1, etc.)? Those numbers indicate the HDMI version. Currently, there are three primary versions of HDMI floating around: HDMI 1.4, HDMI 2.0, and HDMI 2.1. There are also two version updates: HDMI 2.0a and HDMI 2.0b.
HDMI version numbers are important because they tell you which features, including maximum resolution and refresh rate, that particular port supports. For example, an HDMI 1.4 port tops out at 4K video at just 30 Hz, while HDMI 2.1 supports video up to 4K120.
Checking to see which HDMI versions your prospective TV supports is a critical step in the evaluation process, as it reveals which devices are compatible and to what extent they can be fully utilized.
What do we mean by “fully utilized?” Well, consider the PlayStation 5. Sony’s next-gen gaming console features several titles that support refresh rates up to 120 Hz. However, if your TV doesn’t support HDMI 2.1, you won’t be able to enjoy the benefits of a faster refresh rate, because HDMI 2.1 is the only version that currently supports 120 Hz gameplay on the PS5.
You should also look for HDMI 2.1 support if you plan on connecting any high-end audio equipment, such as soundbars and receivers, or installing a surround sound system. HDMI 2.1 features eARC (Enhanced Audio Return Channel), the only version of the audio return channel with enough bandwidth to carry uncompressed, lossless sound formats, such as Dolby Atmos over Dolby TrueHD.
Bottom line: The best way to determine which ports you’ll need is to take stock of which devices you plan on connecting to your TV and what their requirements are. If you plan on connecting a next-gen game console, like the PS5, or a hi-res media player, such as Apple TV 4K, look for a TV with HDMI 2.1 support. The same applies if you plan on connecting a multi-channel soundbar or speaker system.
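To make that "take stock of your devices" advice concrete, here's a hypothetical checklist script. The device names and version requirements below are illustrative assumptions, not an authoritative compatibility list; always check each device's actual specs:

```python
# HDMI versions ranked from oldest to newest (higher is more capable).
HDMI_RANK = {"HDMI 1.4": 1, "HDMI 2.0": 2, "HDMI 2.1": 3}

# Illustrative device requirements (assumptions, not official specs):
# a PS5 wants HDMI 2.1 for 4K120 gaming, an eARC soundbar wants HDMI 2.1
# for lossless audio, while a 4K60 streaming stick is fine on HDMI 2.0.
DEVICE_NEEDS = {
    "PlayStation 5 (4K120 gaming)": "HDMI 2.1",
    "Dolby Atmos soundbar (eARC)": "HDMI 2.1",
    "4K60 streaming stick": "HDMI 2.0",
}

def fully_utilized(device: str, tv_best_port: str) -> bool:
    """True if the TV's best HDMI port meets the device's requirement."""
    return HDMI_RANK[tv_best_port] >= HDMI_RANK[DEVICE_NEEDS[device]]

# Shopping for a TV whose best port is HDMI 2.0? Check your gear:
for device in DEVICE_NEEDS:
    status = "fully utilized" if fully_utilized(device, "HDMI 2.0") else "needs HDMI 2.1"
    print(f"{device}: {status}")
```

Swap in your own devices and the port spec of the TV you're eyeing; if anything comes back short, that's your cue to look for more HDMI 2.1 ports.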
However, if you’re not a gamer or home theater enthusiast, and you prefer to consume content using your TV’s built-in streaming services (Netflix, Hulu, etc.), then having the latest HDMI version isn’t as critical (but we’d still recommend it, just in case).
Do You Need a Soundbar for Your TV?
Here’s the thing: Most television speakers are, as the French say, le bad. Some TV speakers are fine, some are better than others, but at the end of the day, not one of them is what you would call good.
Now, does that mean you absolutely have to have a soundbar? No, not at all. In fact, unless you’re watching Mission: Impossible – Rogue Nation every single night (it’s the best one, people), you probably won’t be using your soundbar as often as you think.
Why? Because not everything calls for a soundbar or surround sound. In fact, a lot of things don’t. Half of the time, I watch YouTube videos on my TV. Do I need a soundbar to fully enjoy a video montage of cats wearing ties? No, I do not. And neither do you.
However, a quality soundbar will absolutely enhance your TV’s audio quality and make your movie nights pop off. It’ll increase immersion for almost any listening or viewing experience, including gaming, streaming music, or blissing out to your favorite ASMR.
Does that mean you need to run out and buy a soundbar the minute you purchase your new TV? No, not at all. In fact, we recommend waiting until you’re more familiar with your new TV’s sound performance and its limitations. Once you’ve gauged the overall audio quality, you’ll have a better idea of what your audio needs are and whether you would benefit from a soundbar.
That about wraps it up for our TV buying guide. We hope that some of the information presented here will prove useful. If you have questions about anything we’ve discussed, or any topic we left out, or if you would like a specific recommendation based on your own specific needs, please drop us a line below. We’re happy to answer questions, recommend products, or argue about why no one will ever need 8K.