When you think of a display interface cable, chances are, the humble HDMI cable is the first thing that springs to mind. Manufacturers have been making use of the HDMI input for years now, allowing the high-speed transfer of both audio and video from a source device to a display. It’s everywhere – monitors, TVs, media streamers, Blu-ray players, A/V receivers, gaming consoles, and even digital cameras. If it’s a consumer electronics device, chances are it’ll have HDMI support.
That being said, HDMI isn’t the only fish in the digital interface pond. DisplayPort – often dubbed the gamer’s input – is another option that manufacturers can utilize for their input/output requirements. Funnily enough, DisplayPort actually outperforms HDMI (by some way) when comparing the two on a strict spec-by-spec basis. However, manufacturers of consumer goods still opt for HDMI over its more impressive cousin.
You’ll have more luck spotting the less-popular audio/video standard on the latest graphics card hardware – alongside some Macs and laptops marketed towards gamers and business owners.
That being said, both HDMI and DisplayPort deliver high-definition digital video and audio from a source device to a visual display. So, why is HDMI still the preferred choice by electronics manufacturers?
In the following article, we’ll be answering that very question – taking a closer look at the two major audio/video standards to see how they compare. We’ll be answering all the pressing questions surrounding the subject, concluding with which is better for your needs as both a gamer and a general user. So, with plenty to get through, let’s waste no further time and dive straight into it!
To get a better understanding of both HDMI and DisplayPort, we must first revisit their origins.
HDMI feels like it’s been around forever; however, it was first brought to life in 2002 by six consumer electronics powerhouses: Hitachi, Panasonic, Philips, Silicon Image, Sony, and Toshiba. It was designed as an AV connector that would be backward-compatible with DVI, enabling the seamless transfer of both audio and video through a much smaller connector. Soon after its arrival, it became the standard for HDTVs across the globe, being utilized in 90% of digital televisions by mid-2007.
Fast forward to the present day and over 80 vendors are now part of the HDMI Forum – all with equal rights and the ability to participate in the development of the HDMI standard.
DisplayPort, on the other hand, was developed by VESA (Video Electronics Standards Association) – a large group of manufacturers that includes the likes of AMD, Apple, Google, and a whole host of other major companies. The alternative display input made its debut in 2006 and was initially released to replace two aging standards that were becoming outdated – VGA and DVI. Historically, DisplayPort was royalty-free – allowing manufacturers to utilize the standard without adding any additional cost to an electronic device’s design. Since then, however, and like HDMI, DisplayPort has become subject to licensing fees – with rates varying depending on usage. Whilst the royalty-free status seemed like a huge positive ‘back in the day’, it still wasn’t enough to overturn HDMI’s popularity.
To better understand the pros and cons of all display inputs, we must get a greater understanding of bandwidth and how it affects your viewing experience.
Bandwidth is the amount of data that a particular standard can transfer from one source to another – usually measured in bits per second. When it comes to display bandwidth, three main factors contribute the majority of it: color depth, maximum resolution, and refresh rate.
In modern displays and GPUs, color is created using three components: red, green, and blue – sometimes represented as luma, blue chroma difference, and red chroma difference (YCbCr/YPbPr). Whenever your GPU processes color, that data gets converted into a signal that is readable by the display. For years, the standard color depth has been 8-bit color – equating to 24 bits total, or 8 bits per channel (red/green/blue). However, with HDR becoming more popular amongst consumer-grade products, we’re now seeing the arrival of 10-bit color (30 bits total). This is the first factor that makes up the bandwidth from a source to a display.
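To make the per-pixel arithmetic concrete, here’s a minimal sketch (the function names are ours, purely for illustration) showing how per-channel depth translates into data per pixel and per frame:

```python
# Bits per pixel is simply the per-channel depth times the
# three colour components (red, green, blue).
def bits_per_pixel(bits_per_channel: int) -> int:
    return bits_per_channel * 3

# Data carried by a single frame, in megabytes.
def frame_size_mb(width: int, height: int, bits_per_channel: int) -> float:
    total_bits = width * height * bits_per_pixel(bits_per_channel)
    return total_bits / 8 / 1_000_000

print(bits_per_pixel(8))                         # 24-bit colour
print(round(frame_size_mb(1920, 1080, 8), 2))    # ~6.22 MB per 1080p frame
print(round(frame_size_mb(1920, 1080, 10), 2))   # ~7.78 MB per 1080p frame
```

Note how the jump from 8-bit to 10-bit colour adds 25% more data before resolution or refresh rate even enter the picture.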
Next, we have maximum resolution. This is fairly straightforward – it’s the total number of pixels a monitor can actually display. Simply multiplying the horizontal and vertical figures of a resolution gives you the total number of pixels on the screen (1920 x 1080 = 2,073,600 total pixels).
Lastly, we have the monitor’s refresh rate. Refresh rate is the number of times a display refreshes itself every second – measured in hertz (Hz). So, a 60Hz refresh rate refreshes the display 60 times per second.
These are the three main contributing factors that make up the majority of the bandwidth. To work out the minimum amount of bandwidth required for a specific panel, simply multiply the total number of pixels by the refresh rate and color depth (1920 x 1080 x 60 x 24 = 2,985,984,000 bits per second, or roughly 2.99 Gbps).
Using this formula we’ve worked out the required bandwidth for some common display specifications:
| Screen Resolution | Color Depth | Refresh Rate | Required Data Bandwidth |
|---|---|---|---|
| 1920 x 1080 | 8-bit | 60Hz | 3.20 Gbps |
| 1920 x 1080 | 10-bit | 60Hz | 4.00 Gbps |
| 1920 x 1080 | 8-bit | 144Hz | 8.00 Gbps |
| 1920 x 1080 | 10-bit | 144Hz | 10.00 Gbps |
| 2560 x 1440 | 8-bit | 60Hz | 5.63 Gbps |
| 2560 x 1440 | 10-bit | 60Hz | 7.04 Gbps |
| 2560 x 1440 | 8-bit | 144Hz | 14.08 Gbps |
| 2560 x 1440 | 10-bit | 144Hz | 17.60 Gbps |
| 3840 x 2160 | 8-bit | 60Hz | 12.54 Gbps |
| 3840 x 2160 | 10-bit | 60Hz | 15.68 Gbps |
| 3840 x 2160 | 8-bit | 144Hz | 31.35 Gbps |
| 3840 x 2160 | 10-bit | 144Hz | 39.19 Gbps |
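The table’s figures can be approximated with the formula from the previous section. The sketch below (our own helper, not part of any standard) computes the raw pixel-data rate only – the table’s figures run slightly higher because real display timings add blanking intervals on top of the visible pixels:

```python
def required_bandwidth_gbps(width: int, height: int,
                            refresh_hz: int, bits_per_channel: int) -> float:
    """Minimum raw data rate for a panel, in gigabits per second.

    Multiplies total pixels by refresh rate and total colour depth
    (three channels). Blanking-interval overhead is NOT included,
    so real-world requirements are somewhat higher.
    """
    bits_per_pixel = bits_per_channel * 3
    bits_per_second = width * height * refresh_hz * bits_per_pixel
    return bits_per_second / 1_000_000_000

print(required_bandwidth_gbps(1920, 1080, 60, 8))    # ~2.99 Gbps raw
print(required_bandwidth_gbps(3840, 2160, 144, 10))  # ~35.83 Gbps raw
```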
With a better knowledge of bandwidth, we start to get a greater understanding of why more advanced audio/video interface standards are required. Monitor and GPU technology is advancing at an exponential rate, with higher screen resolutions and faster refresh rates becoming ever-popular in today’s market.
With that in mind, let’s take a closer look at the bandwidth capabilities of each revision of both HDMI and DisplayPort. Both standards have widened their transfer limits hugely since arriving on the market – HDMI climbing from 4.95Gbps (1.0) through 18Gbps (2.0) to an impressive 48Gbps (2.1), and DisplayPort from 10.8Gbps (1.1) through 32.4Gbps (1.4) to 80Gbps (2.0) – the latter equating to 4K at 240Hz.
Please note, though, that the quoted 48Gbps and 80Gbps limits refer to raw transfer rates only. A portion of that bandwidth is consumed by link encoding used to maintain a stable connection – roughly 20% on older revisions, considerably less on the newest ones. That being said, it’s still a huge leap forward compared to older revisions of both standards – helping to future-proof mainstream TV and GPU products.
Actually being able to saturate those kinds of bandwidth limits, however, is still some time away. That being said, there are a number of high-performance gaming monitors available right now that do utilize a lot of bandwidth – 4K at 144Hz with 10-bit color, for example. A panel like that would require either DisplayPort 2.0 or HDMI 2.1. Furthermore, with the arrival of 8K TVs (and capable GPUs), the latest interface standards are required to push that resolution at even a 30Hz refresh rate.
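As a rough rule-of-thumb check, you can compare a panel’s raw requirement against the effective data rates commonly quoted for each revision. The figures below are ballpark numbers after encoding overhead – treat them as assumptions for illustration, not official specifications:

```python
# Approximate *effective* data rates (Gbps) after encoding overhead,
# as commonly quoted for each revision -- ballpark figures only.
EFFECTIVE_GBPS = {
    "HDMI 2.0": 14.4,          # 18.0 raw, 8b/10b encoding
    "HDMI 2.1": 42.6,          # 48.0 raw, 16b/18b encoding
    "DisplayPort 1.4": 25.92,  # 32.4 raw, 8b/10b encoding
    "DisplayPort 2.0": 77.37,  # 80.0 raw, 128b/132b encoding
}

def standards_that_fit(width, height, refresh_hz, bits_per_channel):
    """Return the revisions whose effective bandwidth covers the panel."""
    needed_gbps = width * height * refresh_hz * bits_per_channel * 3 / 1e9
    return [name for name, gbps in EFFECTIVE_GBPS.items() if gbps >= needed_gbps]

# 4K / 144Hz / 10-bit needs ~35.8 Gbps raw -- HDMI 2.1 / DP 2.0 territory.
print(standards_that_fit(3840, 2160, 144, 10))
```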
It’s also worth mentioning that both HDMI and DisplayPort can connect to older display types via adapters: HDMI to VGA and DVI, and DisplayPort to VGA, DVI, and even HDMI.
Ultimately, HDMI and DisplayPort do very similar things. That being said, both have unique features that tailor them to specific scenarios. One of the major benefits of DisplayPort is its ability to drive up to four daisy-chained displays from a single output – something HDMI only supports in extremely rare cases, and only for two displays. HDMI, on the other hand, explicitly supports CEC (Consumer Electronics Control) for large A/V configurations – alongside the transfer of Ethernet data.
Taking a quick look at the physical connectors of both DisplayPort and HDMI, there are some obvious differences to be found.
The HDMI connector has 19 pins and comes in three different sizes – each utilized by a variety of devices. Type A (Standard) is what you’re likely to see on most TVs, soundbars, consoles, and other larger A/V systems. Type C (Mini) and Type D (Micro) are usually found on smaller devices like mobile phones, dash cams, and tablets.
DisplayPort has 20 pins and is available in just two sizes: full-size DisplayPort and Mini DisplayPort. The first is what you’ll see on modern GPUs; the second was more commonly seen on Microsoft’s Surface Pro and Apple Macs before the arrival of USB Type-C/Thunderbolt 3.
Both HDMI and DisplayPort cables come in a variety of different forms. Interestingly, some manufacturers have adopted proprietary locking mechanisms to help keep each cable in place – especially useful when a cable hangs vertically (from a monitor, for example).
Ultimately, DisplayPort is the gamer’s choice when it comes to input options. At present, DisplayPort 1.4 is the most commonly used and readily available version of the standard. Despite DisplayPort 2.0 being ‘released’ in the summer of 2019, there still aren’t any GPUs or displays that actually utilize the new iteration – something I didn’t expect to still be the case after Nvidia’s 30-series and AMD’s Big Navi launches. Both, however, opted to stick with the DisplayPort 1.4a variant, allowing for the transfer of 8K at 60Hz with DSC (Display Stream Compression) – more than enough for today’s standards.
That being said, there are still some fairly obvious advantages to using DisplayPort over the current HDMI standard. Alongside its longstanding relationship with VRR (FreeSync and G-Sync), DisplayPort also offers a much more robust connection than HDMI. At the base of the connector, DisplayPort has two prongs that lock it into place – requiring a button to be pressed upon release. The same can’t be said for HDMI cables, which are often pulled out accidentally when moving or swapping monitors. More impressive, however, is DisplayPort’s ability to drive up to four displays simultaneously via Multi-Stream Transport (MST).
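Since MST packs several independent video streams into a single link, the number of daisy-chained panels you can actually drive is bounded by both DisplayPort’s four-display limit and the link’s total bandwidth. A minimal sketch, assuming DP 1.4’s commonly quoted ~25.92Gbps effective rate (the helper name is our own):

```python
DP14_EFFECTIVE_GBPS = 25.92  # commonly quoted effective rate for DP 1.4
MST_MAX_DISPLAYS = 4         # DisplayPort's daisy-chain limit

def max_mst_displays(width, height, refresh_hz, bits_per_channel):
    """How many identical panels a single DP 1.4 link could feed via MST."""
    per_display_gbps = width * height * refresh_hz * bits_per_channel * 3 / 1e9
    return min(MST_MAX_DISPLAYS, int(DP14_EFFECTIVE_GBPS // per_display_gbps))

print(max_mst_displays(1920, 1080, 60, 8))   # 1080p60 panels -> 4
print(max_mst_displays(2560, 1440, 144, 8))  # 1440p144 panels -> 2
```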
One downside to DisplayPort is the maximum cable length available – around 3m for full-bandwidth cables at the time of writing. This doesn’t bode well for the general consumer and is definitely something DisplayPort cable manufacturers should address.
For all other usage scenarios, the HDMI standard still reigns supreme. With constant updates keeping the interface relevant, it has been the go-to audio/video standard for well over 16 years now.
Historically, HDMI’s specifications have lagged behind DisplayPort’s on paper. That being said, real-world usage puts them on a level playing field – with 24-bit color at 4K 60Hz supported by HDMI since the 2.0 revision arrived in 2013. Until new display technology demands extreme resolutions at very high refresh rates, there’s little practical difference between the two standards.
AMD GPU owners will also be pleased to hear that HDMI has supported VRR (via an AMD extension) since 2.0b. Only now, with HDMI 2.1, is VRR becoming part of the official standard.
The main benefit of utilizing an HDMI connection comes down to its versatility. The HDMI connection is truly ubiquitous – found in millions of devices since its arrival back in 2002. TVs, Blu-ray players, and all other kinds of consumer electronics now offer multiple HDMI inputs – meaning you can hook several devices up to one display. Furthermore, TV manufacturers are already rolling out HDMI 2.1 products – with LG having an HDMI 2.1 OLED TV on the market since 2019.
Unlike DisplayPort, HDMI doesn’t suffer the same length limitations. There are 15m HDMI cables available for purchase right now – five times the maximum length of DisplayPort. That being said, on paper, HDMI still offers less bandwidth than the equivalent DisplayPort standard.
As I sit here writing this article, both DisplayPort and HDMI have come a long way. Fortunately, both have evolved at a similar rate, providing consumers (and the developers behind consumer electronics) with the bandwidth needed for the latest display technology. However, several questions still remain around the two standards – including which is better for gamers in the modern day.
Well, at the time of writing, there isn’t much difference between the two. Both offer a very good visual experience, supporting high resolutions, color depths, and refresh rates. That being said, DisplayPort 1.4 is still considered the better standard right now. It offers multi-display support over a single cable – alongside better bandwidth support for high resolutions and refresh rates.
We’ll have to wait for HDMI 2.1 hardware to become widespread before the people’s standard can beat DisplayPort – and let’s not forget, once DisplayPort 2.0 hardware arrives, HDMI will quickly be knocked off the top spot again, at least for PC enthusiasts.
HDMI 2.1 will have the backing of the latest consoles and the majority of newly released consumer electronics – continuing the same trend we’ve seen over the past 15 years or so. That leads us to conclude that, yes, whilst DisplayPort is the better standard, it is, unfortunately, the lesser used of the two. The bottom line is this: as long as the interface you plan to use can accommodate the graphical settings your display and hardware offer, it really doesn’t matter which you choose. They’re both excellent options.