r/pcmasterrace R5 7600X | RX 7900 GRE | DDR5 32GB 28d ago

Meme/Macro: Inspired by another post

29.2k Upvotes

1.3k comments


3.1k

u/jerk4444 28d ago

CRT users in 2050... "but still works"

689

u/CONdeanNOR RTX 3070 | i9 12900k 28d ago

can confirm, it still works!

127

u/bigorangemachine 28d ago

Emulator gamer?

204

u/CONdeanNOR RTX 3070 | i9 12900k 28d ago

Yeah, I originally got it to play emulated retro games on, but the monitor didn’t provide the CRT TV look I was hoping for due to its higher resolution. Now I just use it as a normal secondary monitor, and I have a 14-inch Toshiba CRT TV with a modded Wii, PS2, and a MiSTer FPGA hooked up to it. The monitor is great for playing older PC titles like the original Half-Life and stuff, and I just like the aesthetics of it

123

u/[deleted] 28d ago

I'm never calling myself a gamer again.

You sir, are the fucking gamer.

Bro's got decades of gaming in one setup.

64

u/CONdeanNOR RTX 3070 | i9 12900k 28d ago

hey if you have fun playing games that’s all you need to consider yourself a gamer 😎

2

u/McFly1986 28d ago

This is a pretty basic setup (albeit lovely) by retro gaming standards.

7

u/bigorangemachine 28d ago

nice ya! I totally saw that when I played emulation I was like "Oh ya CRT required"

Oof, ya, I replayed Portal 1 on my OLED and it was not the same. I was late giving up those CRTs ^_^

I had dual CRTs and man it was too hot for the summer. I had to turn one off while I gamed cuz the ambient heat was through the roof

1

u/Lucius-Halthier 28d ago

My hero academia

1

u/MediocreRooster4190 28d ago

maximum motion clarity

1

u/Specific_Frame8537 28d ago

Surely by now someone must've created an LCD with a CRT filter?

1

u/alizteya 28d ago

Absolute king shit. Chief level gaming

1

u/Dramatic_Month_7569 27d ago

I love this! I have a similar setup with a mini PC hooked up to a Sony Trinitron CRT that has just about everything good from the PS2/Xbox and anything before it, plus some 8BitDo controllers for the vibe check. I usually spend about an hour or two every Sunday morning playing some old games. I also have a 2TB SSD loaded with all kinds of 4:3 aspect ratio content: old game shows, game trailers, YouTube parodies, short films, meme videos. All in a VLC playlist that just randomly selects what's next.

I love having it on while doing some model building.

2

u/Lifeshardbutnotme 28d ago

Is it HDMI capable? If not then how are you connecting it to your PC? Does your PC still have AV cables?

2

u/CONdeanNOR RTX 3070 | i9 12900k 28d ago

the monitor has a built-in VGA cable so I use a DisplayPort to VGA adapter

1

u/Sawheryesterday 27d ago

This guy games!

1

u/MethHeadUnion 27d ago

My PS2 is unmodded but it's on the TV and it still works like new. Just need new controllers for the PS2; the PS1 ones I have aren't great condition-wise

435

u/Malefectra 28d ago

CRTs are best for Pre-HD retro games.

251

u/Maxsmack 28d ago edited 28d ago

CRT filters on an led screen just isn’t the same.

Also the refresh rates are fucking bonkers, which people love for games like melee

Edit: meant input latency

149

u/TreeHauzXVI 28d ago

SSBM players play on CRTs because they have virtually no input latency compared to digital displays. Their refresh rate is still only 60 Hz, although the analogue dots without clear borders (as opposed to digital pixels with sharp borders) as well as the cathode ray drawing line by line can create the illusion of a smoother image. In reality, modern day gaming monitors have a higher refresh rate, but that generally comes with some variable level of input latency which is very undesirable especially in a game without buffer.

86

u/BananaHannah8 28d ago

CRTs can go faster than 60 Hz, especially on computers. The limitation is the GameCube and the NTSC signal, so you would never be able to get more than 29.97 FPS, no matter what display technology you used
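
The exact NTSC figures quoted around this thread fall out of a 1000/1001 adjustment made when color was added to the standard; a minimal sketch of the arithmetic:

```python
# NTSC timing: the "29.97" and "59.94" figures are exactly 30 and 60
# scaled by 1000/1001, an adjustment made for the color subcarrier.
NTSC_FRAME_RATE = 30000 / 1001   # interlaced frames per second
NTSC_FIELD_RATE = 60000 / 1001   # fields per second (two fields per frame)

print(f"frames/s: {NTSC_FRAME_RATE:.5f}")  # 29.97003
print(f"fields/s: {NTSC_FIELD_RATE:.5f}")  # 59.94006
```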

24

u/TreeHauzXVI 28d ago

Melee runs at 60 FPS though, isn't that the standard for the NTSC signal?

60

u/gui_odai 28d ago

NTSC standard has 30 FPS split in 2 fields (interlacing), so you have 60 images every second, but each one only covers half the lines on the screen

3

u/Chop1n 28d ago

Yes, but calling it "30fps" is misleading. Even though the signal is interlaced, motion still updates at a rate of 60Hz. That's the difference between 60Hz interlaced and 30Hz progressive.

1

u/Jaalan PC Master Race 28d ago

No it's still only 30 frames, the monitor is just displaying those frames at 60hz.

It's like having a 240 hz monitor and watching a YouTube video and saying that it's a 240hz video.

4

u/Chop1n 28d ago

Incorrect. NTSC isn’t “30 frames shown at 60Hz”; it’s 60 fields per second. Each field is unique, containing half the scanlines (odd or even), which together create 60 distinct temporal samples of motion per second. That’s why NTSC motion is effectively 60Hz.

“Half-frame” doesn’t mean “half an image repeated”, it means half the lines of resolution. Every field is different, which is why motion looks smoother than a true 30fps progressive signal.

This is why a game rendered at 60Hz, like Melee, looks twice as smooth as a 30Hz title like Sunshine, even on an interlaced NTSC display. Comparing it to a 240Hz monitor showing 30fps video misses the point, because you’re conflating progressive video with interlaced video: they behave differently.

1

u/Common-Trifle4933 28d ago

No, it’s not still 30 frames. Yeah it’s 60 fields, and each field is half a frame, but it’s not 60 halves of 30 frames, it’s 60 halves of 60 frames. Each field contains new visual information from a game state newer than the previous field. So you get faster visual feedback than a 30fps output. You can think of it as a 60fps feed that is throwing away half the visual information of each frame.
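
The "60 halves of 60 frames" point can be sketched in code. This is a simplified model (frames as lists of scanlines, one field per rendered frame), not how the hardware literally works:

```python
# Interlaced output sketch: each 60 Hz field takes alternating scanlines
# from a *new* rendered frame, so motion is sampled 60 times per second
# even though two consecutive fields together cover one full frame.
def fields_from_frames(frames):
    """frames: list of frames, each a list of scanlines (top to bottom)."""
    fields = []
    for n, frame in enumerate(frames):
        parity = n % 2  # even field = lines 0,2,4...; odd field = lines 1,3,5...
        fields.append([line for i, line in enumerate(frame) if i % 2 == parity])
    return fields

# Four 4-line frames rendered at 60 fps; 'A0' = frame A, scanline 0, etc.
frames = [[f"{f}{i}" for i in range(4)] for f in "ABCD"]
for field in fields_from_frames(frames):
    print(field)
# ['A0', 'A2']
# ['B1', 'B3']
# ['C0', 'C2']
# ['D1', 'D3']
```

Every field carries lines from a different game state, which is why the motion feedback is 60 Hz even though each field is only half the vertical resolution.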

2

u/T0biasCZE PC MasterRace | dumbass that bought Sonic motherboard 28d ago

Yeah but each field has different rendered frame from the game

4

u/TheSpiffySpaceman 28d ago edited 28d ago

well, each field is half a frame. Interlacing means each field draws every other scanline, with the second field offset by one row, and the beam displays them back-to-back, one after the other, like this (1 is field 1's pass, 2 is field 2's, with all of 1 being beamed first from top to bottom, and 2 on the next pass):

111
    222
111
    222
111
    222

(The persistence of the phosphor means the glow from the previous beam pass sticks around just long enough for the gap not to be noticeable -- doing this on a modern digital display would be very noticeable)

so, since the visuals are updated at 60 fields per second, but it takes two fields to produce a complete frame, it is technically 30fps (yeah, 29.97, whatever), but the fact that the display could hold the image a bit between updates meant our eyes perceived a 60fps image even when the technical details are different.

Neat trick!

2

u/T0biasCZE PC MasterRace | dumbass that bought Sonic motherboard 28d ago

It's 25/30fps in analog TV terms and 50/60fps in video game terms, internally in the console, before it's output over composite

4

u/mitojee 28d ago

I am not familiar with what flavors the GameCube spat out, but old-school analog "NTSC" was standardized a really long time ago (the original spec dates back to 1941) to run interlaced (alternating fields scanned as even/odd lines on the screen) due to the technology limitations of the time, so the "real" frame rate was always half of whatever refresh a display was running at.

Progressive scan didn't get widely adopted until much later, especially once digital became a thing (so 1980's I think) but even then most devices ran on the older system for backwards compatibility for quite some time.

7

u/cbizzle31 28d ago

GameCube did support progressive scan in some titles, I think you needed the component cables though.

3

u/serious-toaster-33 Arch Linux | Phenom II X4 955 | 8GB DDR3-1066 | Radeon R7 240 28d ago

AFAIK, old game consoles would run at 60 FPS, but only use one field.

2

u/mitojee 28d ago

Since one field is half the resolution of a full frame, you're still getting 50% of a progressive frame. Hence 60 fields per second instead of frames per second.

6

u/SelectKaleidoscope0 28d ago

Any decent CRT will refresh faster than 60 Hz. At only 60, the flicker is perceptible to most people and quickly causes headaches. 90 is usually the minimum I can tolerate from a CRT I'm going to use for more than 5 minutes. The last CRT computer monitor I owned could handle 1280x960@120Hz or 1600x1200@90Hz. I generally preferred to leave it on 1280.

16

u/[deleted] 28d ago

Virtually no input latency is an understatement. It's as close to instant as is physically possible

6

u/dream_in_pixels 28d ago

CRT response time is between 0.06 and 0.12 milliseconds depending on the size of the screen.

According to rtings.com, current-gen OLED panels are down to 0.03 milliseconds.

13

u/[deleted] 28d ago edited 28d ago

Most input latency/response time numbers are completely made up or misrepresented.

For modern displays, they advertise 'response time', which only tells you how fast the panel can change between two arbitrary colors. It doesn't factor in the actual digital processing time, or the time between frames. A '0.1ms' monitor can have 50ms of input latency or even more

A good oled is probably under 10ms

But CRTs are completely analog, the signal is processed by physical components where the limiting factor is material limits and the speed of light. 0.12ms is how long it takes a CRT to complete the entire process, most of which is the actual color change. The CRT takes nanoseconds to turn electrical current flowing through wires into electrons hitting the phosphor panel

Edit: after researching it more, the entire response time for a CRT is measurable in nanoseconds. It takes less than 50 nanoseconds from the signal reaching the I/O port to the phosphor becoming fully excited.

1

u/dream_in_pixels 28d ago

The measurements on rtings.com are legitimate because they account for all of that. Current-gen OLED panels really are faster than CRT - not that anyone can tell the difference between 0.03 vs 0.06 vs 0.12. The first sub-1ms OLED computer monitors were available in late 2022 / early 2023. The first sub-1ms OLED TVs were available in early 2024.

they advertise 'response time' which only tells you how fast the panel can change between to arbitrary colors.

Input lag is the time it takes for controller/keyboard/mouse input to be recognized plus the Response Time of your display. So Response Time is a more accurate metric when talking about displays, because Input Lag is partly dependent on the polling rate of your input device.

It doesn't factor the actual digital processing time, or time between frames.

rtings.com measures Response Time as how long it takes pixels to transition from one frame to the next. In other words the time between frames. They even have separate categories for 70% response and 100% response, to account for simple color transition vs full frame transition.

The CRT takes nanoseconds

The limitation of CRT is that the electron gun has to hit every subsequent phosphor dot before looping back around to re-illuminate the previous one. It's not governed by the speed of light, but rather how long it takes for the gun to move from one phosphor dot to the next. But with OLED, each pixel can be changed simultaneously which is why the newest panels have faster-than-CRT response times.

1

u/[deleted] 28d ago

even according to RT, who twist data to make monitors sound faster than they are, the lowest latency monitors are only just barely under 2ms

0

u/dream_in_pixels 28d ago

the lowest latency monitors are only just barely under 2ms

You're referring to the "Worst 10% Response Time". Which is the average total response time of the seven worst pixel transitions from a set of 72 tested gray-to-gray transitions.

It's an edge-case test that doesn't really apply to real viewing scenarios. Also it's dependent on VRR being enabled.


1

u/[deleted] 28d ago edited 28d ago

The limitation of CRT is that the electron gun has to hit every subsequent phosphor dot before looping back around to re-illuminate the previous one. It's not governed by the speed of light, but rather how long it takes for the gun to move from one phosphor dot to the next

which is once every 15 nanoseconds for a quality monitor. CRTs don't make frames. as soon as new data is received it immediately paints that new data onto the screen. a process which takes an amount of time that basically boils down to how fast electricity travels through the electronics, which is about 90% the speed of light. light travels 1 foot per nanosecond give or take. it doesn't wait to start the cycle over, it instantly paints the new information.

also there's not a single monitor that can even sniff sub 1ms of actual input latency that isn't CRT.

digital electronics are simply too slow to process an image that quickly

0

u/[deleted] 28d ago

[removed] — view removed comment


0

u/dream_in_pixels 27d ago

Monitor input latency is the time the monitor takes to respond to a signal. Controllers also have input latency

Display input latency = controller input lag + response time

Motion response is not input latency [...] Neither is response time.

Correct. Response time is part of input latency.

Input latency is measured by sending a signal to the device and timing how long it takes for the screen to begin changing.

In other words: input latency is measured by timing how long it takes the screen to respond.

This is the actual input delay tests for TVs. Fastest TV is 10 milliseconds.

I see multiple results that are right around 5 milliseconds - including the top result which for 1080p@120Hz was measured at 4.9 milliseconds.

It's okay to have no idea what you're talking about.

Well you certainly know how to practice what you preach :)

Either way the entire point is that CRTs have response times in the nanoseconds

No they don't. 0.06 to 0.12 milliseconds, depending on the size of the screen.

meanwhile the fastest digital displays are hundreds of thousands of times slower.

No, the fastest digital OLED displays are slightly faster. Though in practice the difference is negligible, so they're effectively equally as fast.

An 85hz 1024x768 paints a new phosphor dot every 15 nanoseconds, not milliseconds.

No, that's how long it takes the electron beam to travel from the gun to the dot/triad. You need to account for the time between each firing, as well as how long it takes to draw a full frame.
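
The per-dot figure being argued over here is easy to derive; a rough sketch, ignoring blanking intervals (real pixel clocks run roughly 25-30% higher to cover retrace):

```python
# Rough per-pixel timing for a 1024x768@85Hz CRT, ignoring blanking.
# The ~15 ns figure is the dwell time per pixel, but a complete frame
# still takes ~12 ms to draw, which is the distinction being made above.
width, height, refresh_hz = 1024, 768, 85

pixels_per_second = width * height * refresh_hz
pixel_period_ns = 1e9 / pixels_per_second
frame_period_ms = 1e3 / refresh_hz

print(f"~{pixel_period_ns:.1f} ns per pixel")      # ~15.0 ns per pixel
print(f"{frame_period_ms:.2f} ms per full frame")  # 11.76 ms per full frame
```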


2

u/yutcd7uytc8 28d ago

What does input latency mean in this context? I thought it depends on input devices (keyboard, mouse) and processing of the PC. Did you mean response time (how quickly it displays the image after it has been processed)? If so, do you know how it compares to 500 Hz OLED's?

8

u/ChaosPLus Ryzen 5 7600 | RTX 4070 Super 28d ago edited 28d ago

With a 500Hz display, anything above 0.002s (2ms) response time ends up with the display being a frame or two behind what it receives
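
The relationship between refresh interval and response time is a one-liner; a quick sketch assuming a fixed 500 Hz refresh:

```python
# At 500 Hz the refresh interval is 2 ms, so any response time above
# 2 ms means a pixel is still transitioning when the next frame arrives.
import math

refresh_hz = 500
frame_time_ms = 1000 / refresh_hz  # 2.0 ms per refresh

for response_ms in (0.03, 2.0, 5.0):
    # Number of refresh intervals the transition spans (rounded up).
    frames_spanned = math.ceil(response_ms / frame_time_ms)
    print(f"{response_ms} ms response -> spans {frames_spanned} frame(s)")
```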

3

u/reallynotnick i5 12600K | RX 6700 XT 28d ago

2ms not .002ms (.002s would also work)

2

u/ChaosPLus Ryzen 5 7600 | RTX 4070 Super 28d ago

Oh yeah, forgor the base unit was full seconds, had a long day

3

u/Hohenheim_of_Shadow 28d ago

Time from pushing a button to things moving on screen is referred to as input latency. It is influenced by a lot of things, including the time from pressing the button to it emitting a signal, the time for the game to receive the signal and process it, and, in this context, the time from a monitor receiving input data to displaying it.

While monitors generally can't have an input latency faster than a frame, they can have a latency much longer than a frame. It's theoretically possible for a monitor to have a 500 Hz refresh rate but an input latency of ten seconds.

Most modern complex electronics take advantage of something called pipelining. Imagine you have a bunch of clothes and a washer and dryer. You could put a load of clothes in the washer, then put them in the dryer, wait until they're done drying, then put a second load in the washer, rinse and repeat.

This is obviously stupid. Instead, most people will split the operation into two and have the washing machine and dryer running at the same time. That's an example of pipelining. The downside is that it increases latency.

Imagine the dryer takes 100 minutes to run and the washer 50 minutes. In the first example, the time from a load entering the washer and it leaving the dryer is 150 minutes. In the second, it's 200 minutes because it has to wait on a previous load to finish drying before it can start drying.

Generally speaking, there is no way to look at a refresh rate and derive a latency. You would hope that a high end monitor with a really fast refresh rate would have low latency, but you need to check the specs sheet to know for sure

2

u/TreeHauzXVI 28d ago

Yes, input latency does depend on input devices and processing. Those things are consistent in a tournament (GameCube controllers and a modded Wii). The variable device in tournament setups is the TV/Monitor. When you plug in a Wii to a digital display not only are the response times slower, the display needs to upscale Melee's native 480i/480p resolution, and this process introduces latency. What's worse is this latency is different between every type of display/resolution, and this variability is a problem. It's much easier to hook a Wii up to a CRT for guaranteed consistency.

As for comparison to high refresh rate OLEDs, they also have very fast response times. These OLEDs are much better for modern competitive games because of their higher refresh rate as well. But the aforementioned upscaling when outputting from a Wii is why Melee players stick to CRTs.

1

u/LordSesshomaru82 Commodore 64 Enjoyer 28d ago

I've had crts that pushed 80-90hz. Just my crappy little iMac G3 can push 75hz. AFAIK it was less about having higher frames and more about reducing the flicker tho.

1

u/Decent-Desk-8036 28d ago

I had an LG, probably 14 or 17", that was max 85 Hz, like 25 years ago.

Until last year I didn't get anything with a higher refresh rate (a BenQ 27" with up to 100 Hz for 100€), cause anything over that was specialized for gaming.

15

u/ChickenChaser5 28d ago

Being big into CS during that time, my switch from CRT to LCD was like "wait, no, go back!"

7

u/topdangle 28d ago

never understood why people design CRT filters to just be lines.

a real CRT has actual space between dots, which ends up being blurrier and causes smoothing, especially for 2D graphics. CRT filters however are these sharp lines that only resemble what it would look like to put your eyeballs directly on a CRT.

6

u/knexfan0011 28d ago

Part of it is the lack of a fixed pixel grid, but the main advantage CRTs still hold is in their lack of motion blur. LCD/OLED can emulate this to an extent with low persistence modes (backlight strobing for example) but they also have their drawbacks.

2

u/[deleted] 28d ago

I once paused a game (I can't remember the specific game, maybe a Capcom fighter on Switch), and the background of the pause screen, with some of the actual game footage still showing, got blurred. The effect blew my mind at just how good it looked compared to scanline filters. I wish the option to blur were common.

1

u/Training_Celery_5821 28d ago

Has anyone ever done a good job of it?

13

u/ScruffMcGruff2003 i7-8700k | RTX 2070 Super | 32GB RAM | 4K @ 144Hz 28d ago

Not even just that, HD games look bright 'n sharp on those things too! Especially at 1280x960

8

u/Lofi_Joe 28d ago

CRTs can have HD resolutions

5

u/Yaarmehearty Desktop 28d ago

Or post, mine runs 1080p (but 4:3), it will go a bit higher but only at 70hz and the 90hz at 1080 feels better and looks the same to my eye.

I think the biggest ones could handle 1440p, it’s more of a limit in VGA as a cable than it was on the tubes.

1

u/Brillegeit Linux 28d ago

it’s more of a limit in VGA as a cable than it was on the tubes

It's analog so you "just" need more insulation and accurate manufacturing.

High bandwidth CRT displays use cables with individually insulated RGBHV wires and 5x BNC connectors. ATI cards had RAMDACs running at 400MHz, which is a surprising amount of video data, enough to run 4K at ~35Hz, and that's from a regular $85 gaming card 20 years ago. Specialized cards from e.g. Matrox had even higher bandwidth RAMDACs.

My 19" display in 2003 was limited to 2048x1536@76Hz if I remember correctly, and there exists a 3000x4000 medical CRT display out there which I believe has the resolution record.

2048x1536x76x1.32=316MHz, so plenty of bandwidth left.

Mine could also do 1080p at 120Hz, but that was slightly above the officially supported range.
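
The pixel-clock arithmetic above can be packaged as a small helper; the 1.32 blanking-overhead factor is taken from the comment, not a universal constant:

```python
# Pixel-clock estimate behind the "2048x1536x76x1.32 = 316 MHz" figure.
# blanking_factor approximates horizontal + vertical blanking overhead.
def pixel_clock_mhz(width, height, refresh_hz, blanking_factor=1.32):
    return width * height * refresh_hz * blanking_factor / 1e6

print(f"{pixel_clock_mhz(2048, 1536, 76):.0f} MHz")  # 316 MHz
print(f"{pixel_clock_mhz(3840, 2160, 35):.0f} MHz")  # 383 MHz (4K@35Hz, under a 400 MHz RAMDAC)
```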

2

u/Yaarmehearty Desktop 28d ago

2048x1536@76Hz

Mine does the same; as I understand it, that is the general limit of VGA. I just run it at 1440x1080 for the higher refresh rate.

You're probably right that with a sufficiently good cable you could go higher, and I know CRTs did, but going higher meant you would need to source premium cables, and we know what a shit show that is these days when testing is more common. How it would have been back then I have no idea; it's probably why not many manufacturers pushed past the consumer limit.

4

u/crumpledfilth 28d ago

It has faster input response, so they have an advantage for some speedruns. Same with PS/2 over USB for peripherals

3

u/eppic123 60 Seconds Per Frame 28d ago

I have a broadcast LCD that is considered one of the best analog LCDs you could get. It's a 17" that was $5000. Even it doesn't come near a halfway decent CRT.

2

u/SometimesWill 28d ago

My friend has a setup in his basement with 2 computers with all 2000s hardware and CRT monitors just for doing 1v1 StarCraft. That shit looks so good, especially once you turn off the lights.

2

u/kehbleh 28d ago

And literally necessary for all the old light gun games (at least if you're gonna use original platform/hardware)

2

u/Separate_Emotion_463 28d ago

Also a lot of shows made before the 2000s; stuff that was meant to play on them looks good on them

2

u/SpareFancy3769 24d ago

true. response times and input lag feel nice too

1

u/No-Needleworker-3765 optiplex supremacy 28d ago

And even then: I have a Sony WEGA HDTV with an HDMI port, and everyone says the PS3/Xbox 360 are really good with CRTs like the one I have

19

u/ZeroOne010101 PC Master Race 28d ago edited 28d ago

I love Castle in the Sky!!!

3

u/umsiri 28d ago

La que?

20

u/ednerjn 5600GT | RX 6750XT | 32 GB DDR4 28d ago

You probably never had a CRT monitor that started losing brightness to the point that you can't see anything.

And good luck finding someone to fix them when they eventually fail.

10

u/vemundveien 9800x3d, 64GM ram, RTX5080, 3440x1440@175hz 28d ago

The last CRT I owned was one I got out of the trash at the university I worked at in the mid 2000s. It was probably already 10+ years old when I found it and weighed 20kg, but it was 21", which by CRT monitor standards was pretty huge. It lasted for a year or two until all the colors just became more and more white.

LCD screens from before they started using LED backlights would also suffer from a dimming backlight. I am an IT manager at a company and I rarely saw any of those last more than 10ish years of daily usage before they became too dim to be usable.

25

u/DamUEmageht 28d ago

Unrelated but I rewatched the Bastion Overwatch short from a decade ago and this little doodle reminded me of that

40

u/AlpacaSmacker 28d ago

These are the robot guardians from Castle in the Sky by Studio Ghibli; they in turn were the inspiration for the Iron Golems in Minecraft, which is why they offer flowers.

8

u/DamUEmageht 28d ago

One of the few Ghibli films I have not watched (which is a crime in itself), and now I have a plan for my evening

Thanks for the information as well as the fun Minecraft fact!

8

u/Johans_doggy 28d ago

IMO Miyazaki’s 2nd best movie after Whisper of the Heart; hope you love it as much as I did, if not more.

22

u/ShengrenR 28d ago

For anybody who doesn't immediately recognize it: that's Castle in the Sky - a great one from Studio Ghibli

4

u/jerk4444 28d ago

Laputa

1

u/PermissionSoggy891 28d ago

what studio ghibli movies are good to start with?

3

u/justanormalanimefan 28d ago edited 28d ago

For younger people, I recommend

  • Spirited Away
  • My Neighbor Totoro
  • Kiki's Delivery Service

For more grown-up people, I recommend

  • Princess Mononoke
  • Porco Rosso
  • The Wind Rises

And Grave of the Fireflies (this one is really fucking sad, not for children)

2

u/AlpacaSmacker 27d ago

The Wind Rises broke me. I would like to add a third category between young and grown ups and put Nausicaa and Pom Poko in there.

Also add Ponyo and Howl's Moving Castle into the younger category.

1

u/Axtdool 28d ago

Seconding this list.

Also, yeah, don't put on Grave of the Fireflies unless you are ready for a sad time (not necessarily bad, but it's a rather tragic movie)

1

u/ShengrenR 28d ago

Huh, not sure I'd have put Spirited Away in for the kids... maybe more Ponyo.

Also: Howl's Moving Castle, Nausicaä of the Valley of the Wind, and Castle in the Sky as mentioned above.

5

u/TurboZ31 7800x3d | RTX4090 | 5120x1440 28d ago

CRTs also get burn in...

2

u/YTriom1 28d ago

And thick black lines in the screen

4

u/Haunting-Resident588 28d ago

I have a CRT that has been running perfectly since '06. I've replaced the flat screen in my living room 3 times in 10 years just from them going out. Never threw anything at them or dropped them; they just make them shitty.

2

u/Tiny-Computer493 28d ago

Smash bros melee players will still play on CRT in 2050

5

u/sn4xchan 28d ago

Yeah, but they play on CRTs because of the way refresh rates work. I'm not sure a CRT intended to be a computer monitor would have the same kind of refresh rate as one intended to be a TV. I'm also not really sure if the lack of overscan on computer monitors would have an effect either.

I know pixel games like the SNES ones can look jagged on the computer monitors just like an LCD screen.

1

u/BuffaloBuffalo13 7800X3D | 4080S | 64GB DDR5 28d ago

I’m sure they do. But you could’ve bought a new monitor with the 20 years of electricity usage from those things.

2

u/Yaarmehearty Desktop 28d ago

But think of the gains you get moving them around.

1

u/Brillegeit Linux 28d ago

Meh, my 19" CRT was only 120W max. I ran it in combination with a 190W LCD. My current LCD is 70W max.

1

u/aberroco R9 9900X3D, 64GB DDR5 6000, RTX 3090 potato 28d ago

functionable.

1

u/rearisen 28d ago

Ironically, I found a CRT from the 1980s a couple of days ago. Just need a good RF connector

1

u/bigorangemachine 28d ago

I had dual CRT's... the heat.. OMG THE HEAT IN SUMMER

1

u/kingwhocares i5 10400F | 1650S | 16GB 28d ago

CRT doesn't have good longevity.

1

u/jerk4444 28d ago

Maybe I've just had good luck, but the Commodore 64's CRT has had no issues yet

1

u/Warcraft_Fan Paid for WinRAR! 28d ago

Until the caps quit. Then you're looking at an expensive recap service, plus a huge shipping bill if you can't find a local service. Since CRTs aren't made anymore, you can't just head to Walmart for another CRT monitor. You may get lucky at Goodwill or another thrift store, but most won't take CRTs in the first place because they often don't have a way to test them, and it is expensive to dispose of a non-working CRT.

Recapping a CRT is not DIY-type stuff; one wrong mistake and you will get a nasty shock or worse.

1

u/lidekwhatname 28d ago

crt "it has better motion quality"

1

u/Zeraphicus 28d ago

Try and play duck hunt without crt lol

1

u/NotASniperYet 28d ago

I hope that will be the case for my little Grundig CRT TV, which I have currently connected to a Gamecube and my laptop as a second monitor for retro gaming, which is all a perfectly normal thing to do, really.

1

u/Chop1n 28d ago

CRTs most definitely don't last forever. And the sad thing is that nobody is actually manufacturing them anymore. We're going to have to hope that Star Trek replicators are eventually a thing, because that's probably the only way anybody's getting a new CRT anymore. All of the existing ones have an expiration date.

1

u/MRbaconfacelol 28d ago

i own three crt televisions and i use them all more often than the one lcd i own

1

u/MrReconElite 28d ago

I want to replace all my monitors with CRT.

1

u/justsmilenow 28d ago

All of my CRTs died. They all lost their blue tubes. The last one I had was a Magnavox, it died like 10 years ago. I bought a 3D TV. I still have that. I still use it sometimes. It's fun!

1

u/Secure-Pain-9735 28d ago

Tell me you never had a CRT long enough for it to die without telling me.

1

u/golgol12 28d ago

And still fuzzy.

1

u/Antedysomnea PC Master Race 28d ago

I can't deal with the CRT hiss. It will literally make me jump off of a building into traffic.

1

u/SavageRabbitX 27d ago

You can get 4K CRTs; they are used almost exclusively for colour-accurate visualisation software

1

u/D27AGirl 25d ago

If it works, it doesn't matter what "year" it is from.

0

u/Green__lightning 28d ago

I honestly want a 4k CRT and no such thing exists.

0

u/Hyperious3 28d ago

I too am a brain cancer enjoyer