Over the past couple of decades, gaming monitors have seen huge technological leaps forward – leaving the bulk of CRT (cathode-ray tube) monitors behind and embarking on a much more exciting LCD (liquid-crystal display) future.
Gone are the days when monitors were bigger than a PC tower and more beige than your grandad’s loafers. Today’s monitors boast a whole host of impressive design features, performance specifications, and immersive attributes that, to most people, make them far superior to CRTs of the past.
Furthermore, thanks to modern manufacturing methods, manufacturers can now tailor a monitor’s performance to suit your individual needs. Whether you’re a gamer, a creative type, or someone who just wants to be immersed in film and TV, today’s market will have the exact monitor you’re looking for.
With all that being said, however, one large debate still remains: CRT vs LCD monitors – which display technology is best for gaming? Many would assume that LCDs are better because they are newer, but that isn’t exactly true. As we look back over the history of monitor technology, we’ll be comparing CRTs to LCDs to see which is actually better for gaming.
So, with all that in mind, let’s waste no further time and dive straight into it!
The Early Years
The first computers were a mind-boggling mess of electronics-filled cabinets that would interact with their users via flashing lights – if they were lucky. Many had no visual output at all, with users needing paper printouts to decipher the information being calculated.
Whilst CRT technology had been around since the late 1890s (and had more recently been used within mainstream television sets), it wasn’t until the 1970s that cathode-ray tube technology was first introduced into consumer-grade computer monitors.
The first of its kind was the Xerox Alto computer. It came equipped with its own CRT monitor that utilized a monochrome, text-only display. It might have been basic, but it was the start of a new era.
Throughout the 70s and 80s, computer scientists were hard at work trying to replicate the success that had been achieved with TV sets – developing hardware and code that would allow PCs to output an image on portable consumer televisions. Whilst they managed to do this, the resolution was low and the color was extremely limited. It wasn’t until the mid-to-late 1980s that CRT monitors became available to the public – albeit limited and not very versatile.
After the development of multisync technology, however, the manufacturing of CRT monitors changed dramatically. Manufacturers now had the ability to create and design CRT monitors that weren’t specific to any one brand or model. Multisync not only enabled compatibility with virtually any PC, but it also supported multiple resolutions, refresh rates, and scan frequencies. It was at this moment that CRT monitors really took off, providing an enjoyable viewing experience that was both versatile and functional.
That being said, CRTs still had their flaws. Not only were they heavy, large, and ugly, but they also took up the majority of your desk real estate. As these characteristics were undesirable to many, and thanks to the speed at which technology was moving forward, it wasn’t long before a new favorite was in town.
The First LCDs
Whilst LCD technology stretches back to the 1880s, it wasn’t until the late 1970s that liquid-crystal displays finally became available in computer monitors. At that stage, the bulk of LCD technology was being used in calculators and watches. It wasn’t until later that desktop monitors started to see the impact of LCD technology.
LCD technology was revolutionary when it came to the way monitors were designed. Its very existence meant that display dimensions could be drastically reduced, opening up a whole host of consumer-tailored benefits. Not only were LCD monitors thinner and less bulky, but they were also lighter and consumed much less energy. They could also be larger (in terms of screen size) and could be curved – bringing a whole new level of immersion to the table. Not to mention heavily reduced eye strain too.
At this time, however, the manufacturing process for LCD monitors was still extremely expensive, meaning liquid-crystal displays would still be most people’s second choice – for now.
By the late 1990s and early 2000s, manufacturing methods for LCD monitors had been refined substantially – allowing manufacturers to bring much cheaper displays to the CRT-flooded marketplace. This, for all intents and purposes, was the start of the decline for cathode-ray tube monitors.
As more affordable LCD offerings were introduced into the market, consumer demand for CRT-based monitors started to drop. By the late 2000s, most high-end CRT production had ceased, with manufacturers focusing on LCD alternatives instead. The UK’s biggest retailer of domestic electronic goods saw a complete shift in CRT model sales – reporting a 75% reduction in sales from 2004 to 2005. Similar scenarios were occurring in America too, with stores like Best Buy rapidly reducing the amount of shelf space allocated to CRTs.
By the end of 2010, the previous mainstay of display technology was all but dead. And whilst many thought the new slimline display options were superior, there were still individuals who weren’t completely sold on the idea.
The Rise Of Competitive Gaming
Competitive gaming has been around since the early 70s, with Stanford University hosting one of the first competitions with multiple player entries. However, it wasn’t until the 90s that competitive gaming really started to take off.
Street Fighter II (one of the most beloved fighting games of the last 30 years), was one of the first to popularize the idea of head-to-head competition. It paved the way for many multiplayer online action games – some of which are still being played today. Fast forward a decade or so and competitive gaming is bigger than ever, encompassing a global audience of players and viewers.
At this stage, PC esports also started to grow in popularity. As it did, competitive players would look for any advantage they could get over their opponents. One aspect that was taken particularly seriously during this time was their monitor of choice.
This was right around the time when LCDs started to take hold of the monitor marketplace. Whilst many individuals were more than happy to get rid of their bulky CRT monitor, the same couldn’t be said for gamers.
As many will know, monitors used for competitive gaming must meet a number of criteria to be classed as “high-performance”. Of those criteria, the most important are: refresh rate, response time, resolution, and input lag. These are the factors that make gameplay smooth, ensure visuals are optimized, and reduce the lag between peripheral and display – all essential at the highest levels of competitive gaming.
Unfortunately, this is where the LCD fell short. Early LCD monitors were worse in almost every department, leaving competitive gamers no choice but to carry on with their tried-and-tested CRTs. It wasn’t until years later that LCDs could finally boast CRT-worthy performance levels.
Monitors Of Today: A CRT Vs LCD Comparison
Fast-forward to the present day: it’s 2020 and CRT monitors are all but forgotten. Competitive gaming is at its most popular, and titles such as League of Legends, CS:GO, Dota 2, FIFA, and Fortnite make mainstream news. Professional esports franchises are vast, and players are earning huge sums of money thanks to contract and sponsorship deals.
With esports’ newfound lucrativeness, monitor manufacturers have started to jump on the gaming bandwagon – marketing what seems like everything as “the best for gaming”. Whilst that might be the case when compared against other LCDs of today, can they really boast the same credentials when compared to older CRT monitors?
CRT Vs LCD
Well, let’s take a look:
Above, we have compiled a short side-by-side comparison between one of the best CRT monitors ever made (the Sony FW900) and one of the latest high-end gaming LCD panels (the ROG XG279Q). Whilst the initial readings may suggest that the ROG is the superior monitor, there are certain characteristics of a CRT display that might make you think differently.
Firstly, we have the pixel grid. Unlike LCDs, CRTs don’t adhere to a fixed pixel grid. Instead, CRTs use three electron “guns” that draw the image directly onto the phosphor-coated tube, meaning there is no upscaling blur and no specific native resolution as such. Because of this, CRTs can run at lower resolutions – decreasing the amount of stress on your GPU – whilst still looking crystal clear.
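To put a rough number on those GPU savings, here is a quick back-of-envelope sketch comparing the raw pixel counts a graphics card must render per frame at a typical CRT mode versus modern fixed-grid resolutions. The resolutions chosen are illustrative examples, not specs quoted from any particular monitor.

```python
# Simple arithmetic: pixels rendered per frame at each resolution.
# These figures illustrate relative GPU workload, not real benchmarks.
resolutions = {
    "1024x768 (a typical CRT mode)": (1024, 768),
    "1920x1080 (Full HD LCD)": (1920, 1080),
    "3840x2160 (4K UHD LCD)": (3840, 2160),
}

for name, (w, h) in resolutions.items():
    print(f"{name}: {w * h:,} pixels per frame")

# How many times more pixels a 4K panel pushes than a 768p CRT mode
ratio = (3840 * 2160) / (1024 * 768)
print(f"4K renders roughly {ratio:.1f}x the pixels of 1024x768")
```

Since a CRT has no native grid, dropping to a lower mode like this costs no upscaling blur – whereas an LCD would have to interpolate, softening the image.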
The second biggest advantage of CRT monitors is motion resolution. Modern LCD monitors use a technique known as “sample and hold”, in which each frame is held static on screen for the entire refresh interval. When your eyes track a moving object – say, during a left-to-right pan – that held frame smears across your retina, so motion is perceived at a significantly lower effective resolution than static images, leading to a much blurrier visual experience. CRTs, on the other hand, handle motion in a completely different way. They are impulse displays: each frame is flashed briefly by the scanning beam rather than held, so every frame stays just as sharp in motion as it is standing still. What this means for gaming is that a CRT sporting a 768p resolution can actually look as good in motion as, if not better than, a 4K display of today.
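The sample-and-hold smear described above can be estimated with simple arithmetic: the perceived blur width is roughly the tracking speed multiplied by how long each frame stays lit. The sketch below assumes a pan of one screen-width (1920 px) per second, and models a CRT’s brief phosphor flash as roughly 10% persistence – both illustrative assumptions, not measured values.

```python
# Back-of-envelope estimate of perceived motion blur on a tracked,
# moving object. On a sample-and-hold (LCD-style) display each frame
# stays lit for the whole refresh interval; on an impulse (CRT-style)
# display each frame is flashed only briefly, so the smear is smaller.
def hold_blur_px(speed_px_per_s, refresh_hz, persistence=1.0):
    """Blur width in pixels. `persistence` is the fraction of the
    refresh interval the frame remains lit (1.0 = full sample-and-hold)."""
    return speed_px_per_s * persistence / refresh_hz

speed = 1920  # assumed pan speed: one screen-width per second
print(f"60 Hz LCD : {hold_blur_px(speed, 60):.1f} px of smear")
print(f"144 Hz LCD: {hold_blur_px(speed, 144):.1f} px of smear")
# CRT phosphor glow modeled (loosely) as ~10% of the frame time
print(f"60 Hz CRT : {hold_blur_px(speed, 60, persistence=0.1):.1f} px of smear")
```

Even this rough model shows why a low-resolution CRT can out-resolve a far sharper LCD the moment the image starts moving.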
Next up we have input lag. Whilst input lag isn’t a huge issue in high-end LCD monitors, it still plays a role in the purchasing decisions of many individuals today. When it comes to CRTs, however, processing lag is practically nonexistent. A CRT draws the incoming analog signal straight onto the screen, with no scaling, buffering, or image processing in between. This means effectively zero input lag or delay – one of the main reasons why some individuals still use CRTs to this day.
It isn’t all victories for CRT monitor technology, though. Some of the most glaringly obvious downsides fall in the CRT corner, including size, weight, and health hazards. They’re also extremely fragile, offer little in the way of stand adjustability, and can put a real strain on your eyes.
So, ultimately, it comes down to what you prioritize most in a gaming monitor. If you prioritize raw gaming performance and nothing else, CRTs are still the better choice – even today. However, if you’re mostly interested in immersion, ease of use, versatility, and features, a modern LCD will serve you well.
Monitors Of The Future
With all that being said, the only thing left to discuss is the future of display technology and what it has in store for gamers. With OLED already being used within modern TVs, it’s only a matter of time before we see the technology in mainstream desktop monitors. It’ll offer a whole host of design features and performance-tailored specifications that will likely render the LCD panels of today obsolete.
Gamers will be treated to faster response times, deeper blacks, higher contrast ratios, wider viewing angles, and greater power efficiency. Not forgetting OLED’s ability to flex, bend, roll up, and curve.
But one question still remains: whilst they offer clear performance advantages over LCDs of today, can the same be said when comparing them to CRTs of 20 years ago?
Well, whilst the answer is almost certainly no, it’s a debate that will last long into the future.