• BmeBenji@lemm.ee
    137 upvotes · 10 months ago

    4K is overkill enough. 8K is a waste of energy. Let’s see optimization be the trend in the next generation of graphics hardware, not further waste.

    • Zink@programming.dev
      52 upvotes · 10 months ago

      Yeah. Once games are rendering 120fps at a native 6K downscaled to an amazing looking 4K picture, then maybe you could convince me it was time to get an 8K TV.

      Honestly most people sit far enough from the TV that 1080p is already good enough.

      • frezik@midwest.social
        12 upvotes · 10 months ago

        I find 4k is nice on computer monitors because you can shut off anti-aliasing entirely and still not leave jagged edges behind. 1440p isn’t quite enough to get there.

        Also, there are some interesting ideas among emulator writers about using those extra pixels to create more accurate CRT-like effects.

        • Zink@programming.dev
          5 upvotes · 10 months ago

          Oh yeah, I have read some very cool things about emulators being able to simulate the individual phosphors with 4K resolution. I have always been a sucker for clean, crisp pixels (that’s what I was trying to achieve on the shitty old CRT I had for my SNES), so I haven’t jumped into the latest on CRT shaders myself.
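
For the curious, a minimal sketch of the phosphor idea mentioned above, assuming NumPy and a generic aperture-grille-style stripe mask (not any particular emulator’s shader): each low-resolution source pixel becomes a 3×3 block, and each output column is dimmed except for one colour channel.

```python
import numpy as np

def crt_phosphor_upscale(frame: np.ndarray, scale: int = 3) -> np.ndarray:
    """Nearest-neighbour upscale an RGB frame, then apply a simple
    aperture-grille-style stripe mask so each output column mostly
    passes one of R, G, B. Illustrative only."""
    h, w, _ = frame.shape
    big = frame.repeat(scale, axis=0).repeat(scale, axis=1).astype(np.float32)

    # Per-column mask cycling R, G, B; off-channels are dimmed, not zeroed.
    mask = np.full((h * scale, w * scale, 3), 0.25, dtype=np.float32)
    cols = np.arange(w * scale)
    for channel in range(3):
        mask[:, cols % 3 == channel, channel] = 1.0

    return np.clip(big * mask, 0, 255).astype(np.uint8)

# A 320x240 retro frame becomes 960x720, so a 4K panel has real pixels
# to spend on the stripe pattern instead of plain upscaling.
retro = np.random.randint(0, 256, size=(240, 320, 3), dtype=np.uint8)
print(crt_phosphor_upscale(retro).shape)  # (720, 960, 3)
```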

        • Holzkohlen@feddit.de
          1 upvote · 10 months ago

          But anti-aliasing costs far less performance than rendering at 4K. And you have to mess about with scaling on a 4K monitor, which is always a pain. 1440p for life, IMHO

      • minibyte@sh.itjust.works
        3 upvotes · 10 months ago · edited

        I’m set up to THX spec, 10 feet from an 85 inch. I’m right in the middle of 1440p and 4K being optimal, but with my eyes I see little difference between the two.

        I’d settle for 4k @ 120 FPS locked.

        • Zink@programming.dev
          2 upvotes · 10 months ago

          I’m 6-8 feet from a 65, depending on seating position and posture. It seems to be a pretty sweet spot for 4K (I have used the viewing distance calculators in the past, but not recently enough to remember the numbers). I do wear my glasses while watching TV too, so I see things pretty clearly.

          With games that render at a native 4K at 60fps and an uncompressed signal, it is absolutely stunning. If I try to sit like 4 feet from the screen to get more immersion, then it starts to look more like a computer monitor rather than a razor sharp HDR picture just painted on the oled.

          There is a lot of quality yet to be packed into 4K. As long as “TV in the living room” is a similar format to now, I don’t think 8K will benefit people. It will be interesting to see if all nice TVs just become 8K one day like with 4K now though.
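
For reference, a minimal sketch of the arithmetic those viewing-distance calculators do, assuming a flat 16:9 panel viewed straight on and the common (approximate) rule of thumb of ~60 pixels per degree for 20/20 vision:

```python
import math

def pixels_per_degree(diagonal_in: float, horiz_res: int, distance_ft: float,
                      aspect: float = 16 / 9) -> float:
    """Rough angular pixel density of a flat 16:9 screen viewed straight on."""
    width_in = diagonal_in * aspect / math.hypot(aspect, 1)   # screen width
    fov_deg = 2 * math.degrees(math.atan(width_in / (2 * distance_ft * 12)))
    return horiz_res / fov_deg

# A 65" panel from ~7 ft, at 4K vs 1080p.
for name, res in [("4K", 3840), ("1080p", 1920)]:
    print(f"{name}: ~{pixels_per_degree(65, res, 7):.0f} pixels/degree")
# ~60 px/degree is a common rule of thumb for the limit of 20/20 vision;
# 1080p lands near that limit at couch distances, 4K is comfortably past it.
```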

    • Final Remix@lemmy.world
      28 upvotes · 10 months ago

      *monkey’s paw curls*

      Granted! Everything’s just rendered internally at 25% scale with massive amounts of TAA.

    • flintheart_glomgold@lemmy.world
      7 upvotes · 10 months ago

      For TV manufacturers the 1K/4K/8K nonsense is a marketing trap of their own making - but it also serves their interests.

      TV makers DON’T WANT consumers to easily compare models or understand what makes a good TV. Manufacturers profit mightily by selling crap to misinformed consumers.

    • bruhduh@lemmy.world
      4 upvotes · 10 months ago · edited

      Divide the resolution by three, though; current-gen upscaling tech can cover that much: 4K = upscaled 720p and 8K = upscaled 1440p

      • AngryMob@lemmy.one
        4 upvotes · 10 months ago

        can doesn’t mean should.

        720p to 4k using dlss is okay, but you start to see visual tradeoffs strictly for the extra performance

        to me it really shines at 1080p to 4k where it is basically indistinguishable from native for a still large performance increase.

        or even 1440p to 4k where it actually looks better than native with just a moderate performance increase.

        For 8K that same setup holds true: go for better-than-native or match native visuals. There is no real need to go below native just to get more performance; at that point the hardware is mismatched.
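
For context, a small sketch of what those upscaling ratios work out to. The per-axis scale factors below are the commonly cited approximate values for DLSS 2.x modes, not official numbers:

```python
# Approximate per-axis render-scale factors for common DLSS 2.x modes.
# (Exact values vary by title and version; these are the widely cited ones.)
DLSS_MODES = {
    "Quality": 2 / 3,
    "Balanced": 0.58,
    "Performance": 1 / 2,
    "Ultra Performance": 1 / 3,
}
OUTPUTS = {"4K": (3840, 2160), "8K": (7680, 4320)}

for out_name, (ow, oh) in OUTPUTS.items():
    print(f"{out_name} output ({ow}x{oh}):")
    for mode, s in DLSS_MODES.items():
        rw, rh = round(ow * s), round(oh * s)
        saving = (ow * oh) / (rw * rh)
        print(f"  {mode:>17}: renders ~{rw}x{rh} (~{saving:.1f}x fewer pixels shaded)")
# 720p -> 4K is Ultra Performance, 1080p -> 4K is Performance,
# 1440p -> 4K is Quality -- matching the tradeoffs described above.
```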

        • bruhduh@lemmy.world
          1 upvote · 10 months ago · edited

          Devs already use it instead of optimisation, so what makes you think the bosses won’t push it even further for the sake of deadlines and quarterly profits? Immortals of Aveum is an example, and it’s not even the end of the generation, only halfway through (I agree with you from a user standpoint though)

  • ivanafterall@kbin.social
    62 upvotes · 10 months ago

    A few years ago, I got a good deal on a 4K projector and set up a 135" screen on the wall. The lamp stopped working and I’ve put off replacing it. You know what didn’t stop working? The 10+ year old Haier 1080p TV with a ding in the screen and the two cinder blocks that currently keep it from sliding across the living room floor.

    • Domi@lemmy.secnd.me
      3 upvotes · 10 months ago

      The lamp stopped working and I’ve put off replacing it.

      If you still have it, do it. Replacing the lamp on a projector is incredibly easy and only takes like 15 minutes.

      If you only order the bulb without the casing, it’s also very cheap.

      • ivanafterall@kbin.social
        1 upvote · 10 months ago

        Yep! I bought a model with pretty cheap/easy replacement bulbs. I just need to actually pull the trigger and replace it.

    • Apollo2323@lemmy.dbzer0.com
      2 upvotes · 10 months ago

      Yep, I feel the same. I love how old stuff seems to last longer and longer while the new stuff breaks just out of the blue.

  • LaunchesKayaks@lemmy.world
    47 upvotes · 10 months ago

    Has anyone else here never actually bought a TV? I’ve been given 3 perfectly good TVs that relatives were gonna throw out when they upgraded to smart TVs. I love my dumb, free TVs. They do exactly what I need them to and nothing more. I’m going to be really sad when they kick the bucket.

    • woodenskewer@lemmy.world
      8 upvotes · 10 months ago · edited

      I was given a free, very decent dumb TV and upgraded it to a smart TV with a $5 Steam Link, and ran a Cat 6 cable to it from my router. Best $5 ever. Have no intention of buying a new one. If I ever do, I will try my hardest to make sure it’s a dumb one. I know they sell “commercial displays” that are basically a TV with no third-party apps or a way to install them.

    • ikidd@lemmy.world
      7 upvotes · 10 months ago

      Any TV is a dumb TV if you plug a Kodi box in the HDMI and never use the smart trash.

    • Leg@lemmy.world
      5 upvotes · 10 months ago

      Yes, people like me buy TVs. I’m the guy who keeps giving away perfectly good TVs to other people because I’ve bought a new one and don’t want to store the old one. I’ve given away 2 smart TVs so far, though I’m not sure what I’ll do with my current one when I inevitably upgrade.

    • lengau@midwest.social
      4 upvotes · 10 months ago · edited

      I’ve bought my TVs because all my relatives are the same as us. My mom finally tossed an old CRT TV a couple of years ago because it started having issues displaying colours correctly.

    • starman2112@sh.itjust.works
      3 upvotes · 10 months ago

      I used my family’s first HDTV from 2008 up until last year, when my family got me a 55" 4k TV for like $250. Not gonna lie, it’s pretty nice having so much screen, but I’m never getting rid of the ol’ Sanyo.

    • Flying Squid@lemmy.world
      2 upvotes · 1 downvote · 10 months ago

      One of my TVs was given to us by my mother-in-law, but we did buy the other one. Before the ‘smart’ TV era though.

  • teft@lemmy.world
    49 upvotes · 2 downvotes · 10 months ago

    That’s how I feel when people complain about 4k only being 30fps on PS5.

    I laugh because my 1080p tv lets the PS5 output at like 800fps.

      • teft@lemmy.world
        36 upvotes · 2 downvotes · 10 months ago

        The PS5 in front of me running 1080p at 120 fps says that your comment is mistaken.

        • dizzy@lemmy.ml
          21 upvotes · 1 downvote · 10 months ago

          The fact it can output a 120Hz signal doesn’t mean the processor is making every frame. Many AAA games will be performing at well under 120fps especially in scenes with lots of action.

          It’s not limited to 30fps like the other poster suggested though, I think most devs try to maintain at least 60fps.

          • GooseFinger@lemmy.world
            9 upvotes · 1 downvote · 10 months ago

            Unlike Bethesda, who lock their brand new AAA games with terrible graphics at 30 fps and insist that if you don’t feel the game is responsive and butter smooth, then you’re simply wrong.

            I’d almost bet money that Todd has never played a game at 60 fps or higher.

            • Poggervania@kbin.social
              3 upvotes · 10 months ago

              iirc that has more to do with lazy coding of their physics system in the Gamebryo/Creation engine. From what I understand, the physics only behaves “correctly” at 60fps or less, which is why in Skyrim stuff can flip out if you run it above 60fps, and you can even get stuck on random ledges and edges.

              There are use cases for tying things to framerate; every fighting game, for example, is basically made to run at exactly 60fps, no more and no less.

              • EldritchFeminity@lemmy.blahaj.zone
                4 upvotes · 10 months ago

                This used to be the way that game engines were coded because it was the easiest way to do things like tick rates well, but like with pretty much all things Bethesda, they never bothered to try to keep up with the times.

                There’s some hilarious footage out there of this in action with the first Dark Souls, which had its frame rate locked at I believe 30fps and its tick rate tied to that. A popular PC mod unlocked the frame rate, and at higher frame rates stuff like poison can tick so fast that it can kill you before you can react.
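
For anyone curious, the now-standard fix for this class of bug is to decouple the simulation tick from the render frame rate with a fixed-timestep accumulator. A minimal sketch, with illustrative function names rather than any engine’s actual API:

```python
import time

TICK_RATE = 60        # simulation always advances at 60 Hz...
DT = 1.0 / TICK_RATE  # ...so poison ticks, physics, etc. never speed up

def update(dt: float) -> None:
    """Advance the game state by exactly dt seconds (placeholder)."""

def render(alpha: float) -> None:
    """Draw, interpolating between the last two states by alpha (placeholder)."""

def game_loop(run_seconds: float = 0.1) -> None:
    previous = time.perf_counter()
    accumulator = 0.0
    deadline = previous + run_seconds   # real loops run until the player quits
    while time.perf_counter() < deadline:
        now = time.perf_counter()
        accumulator += now - previous
        previous = now
        # Consume elapsed real time in fixed-size simulation steps,
        # whether rendering happens to run at 30, 60 or 300 fps.
        while accumulator >= DT:
            update(DT)
            accumulator -= DT
        render(accumulator / DT)

game_loop()
```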

      • Poggervania@kbin.social
        8 upvotes · 10 months ago

        No, the PS5 can output higher FPS at 1080p.

        What you might be thinking of is refresh rate: even if the PS5 were doing 1080p/60fps, if you for some reason have a 1080p/30Hz TV, you won’t be able to see anything above 30fps.

  • ApexHunter@lemmy.ml
    36 upvotes · 10 months ago

    Joke’s on you – I’m still using the last TV I bought in 2005. It has 2 HDMI ports and supports 1080i!

    • NarrativeBear@lemmy.world
      20 upvotes · 1 downvote · 10 months ago

      I miss this the most; older TV models would have like over 30 ports to connect anything you wanted. All newer models just have like 1 HDMI connection, if even that.

      To add to that, these older screens last. New stuff just dies after a few years, or gets killed with a firmware upgrade.

      PSA: Don’t connect your “smart” appliances to the internet, folks.

      • xyguy@startrek.website
        6 upvotes · 10 months ago

        We had an older Hitachi tv with 4 HDMI plus component plus RCA input and 4 different options for audio input.

        New Samsung TV: 2 HDMI, that’s it. One is ARC, which is the only audio interface besides TOSLINK, so really there’s effectively 1 HDMI to use.

        But of course all the lovely spyware smart features more than make up for it.

      • Bipta@kbin.social
        3 upvotes · 10 months ago

        Is that a joke? My old TV has 3 and that’s the only reason I can still use it. 2 of them broke over the years.

        • comador @lemmy.world
          6 upvotes · 10 months ago

          Depends on the TV.

          Many older mid to high end models had 4+ ports and it sucks you can rarely find a new one with 4 anymore.

          My circa 2008 Sony Bravia has 6 HDMI ports that all still work.

      • poppy@lemm.ee
        10 upvotes · 10 months ago

        I was curious so I went and browsed some budget TVs on Walmart’s website. Even the no-name budget ones all had 3 HDMI. Maybe it’s different if it’s meant to be a monitor instead of a living room TV, but I just looked at living room style TVs.

      • Dudewitbow@lemmy.zip
        4 upvotes · 10 months ago

        I feel like the only way you’d get one with a single HDMI port is with models that were built specifically for Black Friday (to maximize profit by cutting out features)

    • BorgDrone@lemmy.one
      4 upvotes · 1 downvote · 10 months ago

      It’s a chicken/egg problem. We need 8K so we can use bigger TVs, but those bigger TVs need 8K content to be usable.

      • Holzkohlen@feddit.de
        3 upvotes · 10 months ago · edited

        What kind of TV do you need bro? A 60 inch with 4k is more than enough, especially when you think about how far you are gonna sit from a 60 inch TV. Only suckers buy into 8k. Same people who bought those rounded screen smartphones thinking it will be the new thing. Where are those phones now?

        • BorgDrone@lemmy.one
          1 upvote · 10 months ago · edited

          What kind of TV do you need bro? A 60 inch with 4k is more than enough, especially when you think about how far you are gonna sit from a 60 inch TV.

          You misunderstand the point of higher resolutions. The point is not to make the image sharper, the point is to make the TV bigger at the same sharpness. This also means the same viewing distance.

          At the end of the CRT era I had a 28” TV; at PAL resolutions that is ~540p visible. At the end of the HD era I had a 50” TV. Note that the ratios between resolution and size are close together. Now we’re partway through the 4k era and I currently have a 77” 4k TV. By the time we move to the 8k era I expect to have something around 100”. 8k would allow me to go up to a 200” TV.

          I sit just as far from my 77” TV as I sat from my 27”, my 50” or my 65”. The point of a larger TV is to have a larger field-of-view, to fill a larger part of your vision. The larger the FoV, the better the immersion. That’s why movie theaters have such large screens, and why IMAX theaters have screens that curve around you.

          Don’t think of an 8k TV as sharper; think of 4k as a cropped version of 8k. You don’t want to see the same things sharper, you want to see more things. Just like when we went from square to widescreen TVs. The wider aspect ratio got us extra content on the sides; the 4:3 version just cut off the sides of the picture. So when you go from a 50” 4k to a 100” 8k, you can see this as getting a huge additional border around the screen that would simply be cut off on a 4k screen.

          Of course, content makers need to adjust their content to take into account this larger field-of-view. But that’s again a chicken/egg problem.

          The endgame is to have a TV that fills your entire field-of-view, so that when you are watching a movie that is all you see. As long as you can see the walls from the corners of your eye, your TV is not big enough.
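
A minimal sketch of the geometry behind that argument, assuming a fixed couch distance and flat 16:9 panels: doubling the diagonal while doubling the horizontal pixel count keeps sharpness per degree roughly constant and only widens the field of view.

```python
import math

DISTANCE_IN = 9 * 12  # ~9 ft couch distance, kept the same for every screen

def horizontal_fov_deg(diagonal_in: float, aspect: float = 16 / 9) -> float:
    """Horizontal field of view a flat 16:9 screen covers from DISTANCE_IN away."""
    width_in = diagonal_in * aspect / math.hypot(aspect, 1)
    return 2 * math.degrees(math.atan(width_in / (2 * DISTANCE_IN)))

# Same distance, double the diagonal, double the horizontal pixels:
# sharpness per degree stays roughly the same, the picture just fills
# more of your view.
for label, diagonal, horiz_px in [('50" 4K', 50, 3840), ('100" 8K', 100, 7680)]:
    fov = horizontal_fov_deg(diagonal)
    print(f"{label}: ~{fov:.0f} degrees of view, ~{horiz_px / fov:.0f} px/degree")
```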

    • starman2112@sh.itjust.works
      1 upvote · 1 downvote · 10 months ago · edited

      I have a 4k TV, it legitimately is no better than 1080 lmao

      There’s a very noticeable difference, but it’s nothing like the difference between SD and HD. It’s pretty, but not that pretty. I prefer the performance (and proper scaling for my computer) of 1080, even on a 55" screen

      • NaoPb@eviltoast.org
        2 upvotes · 10 months ago

        Could this be a configuration issue? I can’t speak from experience, but I’d assume it would be quite a bit better.

        Thanks for the info anyway.

        P.s. I’m not the person who downvoted you. I don’t do that when disagreeing.

  • Cowbee [he/they]@lemmy.ml
    29 upvotes · 10 months ago

    4k is the reasonable limit, combined with 120 FPS or so. Beyond that, the returns are extremely diminished and aren’t truly worth considering.

    • kandoh@reddthat.com
      31 upvotes · 3 downvotes · 10 months ago

      8k is twice as big as 4k so it would be twice as good. Thanks for coming to my TED talk

      • Cowbee [he/they]@lemmy.ml
        4 upvotes · 10 months ago

        There are legitimately diminishing returns. Realistically I would say 1080p would be fine as the max, but 4k really is the sweet spot. Eventually, there is a physical limit.

        • starman2112@sh.itjust.works
          4 upvotes · 10 months ago

          I fully agree, but I also try to keep aware of when I’m repeating patterns. I thought the same thing about 1080p that I do about 4k, and I want to be aware that I could be wrong again

          • Cowbee [he/they]@lemmy.ml
            3 upvotes · 10 months ago

            Yep, I’m aware of it too. The biggest thing for me is that we know we are much closer to physical limitations now than we ever were before. I believe efficiency is going to be the focus, and perhaps energy consumption will be focused on more than raw performance gains, outside of sound computing practices.

            Once we hit that theoretical ceiling on the hardware level, performance will likely be gained at the software level, with more efficient and clean code.

    • N-E-N@lemmy.ca
      1 upvote · 1 downvote · 10 months ago

      4K I’d agree with, but going from 120 to 240fps is notable

  • Flying Squid@lemmy.world
    27 upvotes · 10 months ago

    One of my TVs is 720p. The other is 1080p. The quality is just fine for me. Neither is a ‘smart’ TV and neither connects to the internet.

    I will use them until they can no longer be used.

    • AngryCommieKender@lemmy.world
      7 upvotes · 10 months ago

      The last TV I owned was an old CRT that was built in the 70s. I repaired it, and connected the NES and eventually the SNES to it. Haven’t had a need for a TV ever since I went to university, joined IT, and gained a steady supply of second hand monitors.

  • CrowAirbrush@lemmy.world
    26 upvotes · 10 months ago

    We are at a point where 4K RTX is barely viable even if you have a money tree.

    Why the fuck would you wanna move to 8K?

    I’m contemplating getting 1440p for my setup, as it seems a decent, obtainable option.

      • CrowAirbrush@lemmy.world
        2 upvotes · 10 months ago

        And getting the newest GPU every year because they lock you out of the most recent DLSS update when you don’t upgrade to the newest lineup, right?

          • CrowAirbrush@lemmy.world
            2 upvotes · 10 months ago

            Make it piratable then; I ain’t getting no subscription for hardware functioning.

            Fuck that shit to the high heavens and back.

            • T00l_shed@lemmy.world
              3 upvotes · 10 months ago

              Oh I’m sure some folks will figure out how to pirate it for sure. But as long as big businesses pay the sub fee, Nvidia won’t give a shit about us.

    • michael_palmer@lemmy.sdf.org
      2 upvotes · 1 downvote · 10 months ago

      You don’t have to play only 2023-2024 games. I play GTA V at ultra settings and get 4K@60 FPS. My GPU is a $150 1080 Ti.

      • And009@lemmynsfw.com
        1 upvote · 10 months ago

        If we’re comparing the latest tech then I’d like to be playing the most recent gen games. GTA V feels as old as San Andreas; in a few years my phone should be running it fine.

  • HEXN3T@lemmy.blahaj.zone
    23 upvotes · 10 months ago

    I have a 4K 120Hz OLED TV. The difference is quite drastic compared to my old 1080p LED. It’s certainly sharper, and probably the practical limit. I’ve also seen 8K, and, meh. I don’t even care if it’s noticeable, it’s just too expensive to be worthwhile. We should just push more frames and lower latency for now, or, the Gods forbid, optimise games properly.

    • Blackmist@feddit.uk
      21 upvotes · 10 months ago

      I feel like resolution wasn’t much of an issue even at 1080p. It was plenty. Especially at normal viewing distances.

      The real advantages are things like HDR and higher framerates including VRR. I can actually see those.

      I feel like we’re going to have brighter HDR introduced at some point, and we’ll be forced to upgrade to 8K in order to see it.

      • HandBreadedTools@lemmy.world
        13 upvotes · 10 months ago

        Ehhhh, I think 1080p is definitely serviceable, it’s even good enough for most things. However, I think 1440p and 4k are both a pretty noticeable improvement for stuff like gaming. I can’t go back to 1080p after using my 3440x1440 monitor.

        • Blackmist@feddit.uk
          4 upvotes · 10 months ago

          I notice more on my PC. Being up close I can see individual pixels. And for productivity software, the higher resolution wins every time.

          On a 55" TV, sitting 3 metres away, no real difference for me. I’d rather have extra frames than extra pixels.

          And that’s for gaming. With good quality video, I can’t see any difference at all.

      • Honytawk@lemmy.zip
        3 upvotes · 10 months ago

        Depends entirely on the size of the screen.

        A normal monitor is fine on 1080p

        But once you go over 40", 4K is really nice

    • johannesvanderwhales@lemmy.world
      2 upvotes · 10 months ago

      Too expensive both in terms of the price and the massive amount of storage needed for 8K video. I don’t really think 8K is ever going to be the dominant format. There’s not really much point in just increasing resolution for minuscule gains that are almost certainly not noticeable on anything but a massive display. Streaming services are going to balk at 8K content.
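
A back-of-envelope sketch of the storage point, assuming bitrate scales roughly linearly with pixel count from a typical ~25 Mbps 4K stream (an assumption; real codecs and services differ):

```python
# Rough storage/bandwidth comparison, assuming bitrate scales with pixel count
# from a typical ~25 Mbps 4K stream. Real encoders do better, but the order
# of magnitude is why streaming services hesitate.
PIXELS = {"1080p": 1920 * 1080, "4K": 3840 * 2160, "8K": 7680 * 4320}
BASE_4K_MBPS = 25

for name, px in PIXELS.items():
    mbps = BASE_4K_MBPS * px / PIXELS["4K"]
    gb_per_movie = mbps / 8 * 2 * 3600 / 1000   # two-hour movie, in gigabytes
    print(f"{name}: ~{mbps:.0f} Mbps, ~{gb_per_movie:.0f} GB per 2-hour movie")
```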

    • Neil@lemmy.ml
      3 upvotes · 1 downvote · 10 months ago

      I’ve heard recently that there’s “cheap OLED” and “expensive OLED.” Which one did you go for? I’ve got a 75" 4k OLED for $400 and it’s definitely super dark. I can’t even watch some movies during the day if they’re too dark. The expensive ones are supposed to be a lot better.

      • Venat0r@lemmy.world
        1 upvote · 10 months ago · edited

        I’ve got an older Sony bravia A9G and I’ve seen reviews complaining that it’s too dim but I’ve had no issues. I think some people just have really poorly thought out tv placement, or overly bright rooms. Also just close the curtains if the movie is dark…

        If you want to watch tv outside in direct sunlight you’ll need to follow this guide to build a custom super bright tv: https://youtu.be/WlFVPnGEb8o

  • ConfusedPossum@kbin.social
    22 upvotes · 10 months ago

    The only time I replace electronics anymore is when something breaks or when I’m gifted someone else’s hand-me-downs

    • The_v@lemmy.world
      3 upvotes · 10 months ago

      I have everything on an upgrade list depending on how much we use it and how fast the technology is changing.

      Phones: 3 years. Thinking of moving this to 4 or 5 years with the industry’s stagnation. Starting to see some companies offering updates for longer times.

      Laptops/desktops: 5-6 years.

      Wifi/modem/router: 10 years.

      • Sprokes@jlai.lu
        4 upvotes · 10 months ago

        3 years for a phone is very low. Maybe change the battery and you can keep it for 3 more years. Though you need to buy phones with custom ROM support.

        • nossaquesapao@lemmy.eco.br
          2 upvotes · 10 months ago

          I’m still trying to understand what people do on their phones that they need to run the very latest model with the specs of a laptop. Mine is from 2018 and is doing the job nicely. What am I missing out there?

          • The_v@lemmy.world
            1 upvote · 10 months ago

            Most people never use all the specs. If all you are doing is browsing the Internet, watching videos, or playing some simple games, you don’t need much.

            I have purchased unlocked mid-range phones for a while now. Expensive enough to have decent specs, but not so cheap that the build quality suffers. When the teenager is dropping the phone 3-4 times per day, a good case and good build quality are required. 3 years with that type of abuse is about all you’ll get out of it.

            I run two phones. For work I get one of the flagship phones. I only pull out my laptop in my office. Most of the time I am using the phone in all weather conditions. I use those specs for thousands of hi-res pictures, data entry etc… all day long. At the end of 3 years it’s toast.

      • Flying Squid@lemmy.world
        1 upvote · 1 downvote · 10 months ago
        10 months ago

        My notebook is 9 years old. My desktop is 6 years old. I haven’t found a reasonable argument to replace them until they stop working. Why 5-6 years?

        • The_v@lemmy.world
          1 upvote · 10 months ago

          Why 5-6 years? That’s about when I start seeing the cascade of little things: weird transitory bugs when rebooting, speed issues, compatibility issues with bloated new software, etc. After that amount of time, I start to spend way too much time maintaining them.

          I could tinker with them and keep them going. It’s what I used to do when my kids were small: install a Linux distro on an old computer, load a bunch of educational games and set the browser homepage to PBSkids.

          However I have 5 computers to maintain now and my teenagers need compatible fast systems for college and school. My wife works from home at times and needs something that reliably works.

    • cm0002@lemmy.world
      3 upvotes · 10 months ago

      I mean, you can get 4K TVs for cheap and fix them (As long as the display is NOT damaged, once that’s gone the TV is nothing but scrap)

      Got a 60 inch 4K HDR TV for free off Facebook; the LED backlights had just gone out. $20 for a replacement set, 2 hours of my time, and a couple cuts on my hand, and it’s been a fantastic TV since lmao

  • doublejay1999@lemmy.world
    21 upvotes · 1 downvote · 10 months ago
    10 months ago

    My son is on his 3rd DualSense controller in about 18 months.

    Yesterday I plugged my Xbox 360 controller into my Steam Deck and played Halo 3 like an OG.

    • Domi@lemmy.secnd.me
      19 upvotes · 1 downvote · 10 months ago
      10 months ago

      Yesterday I plugged my Xbox 360 controller into my Steam Deck and played Halo 3 like an OG.

      If you had told someone 10 years ago that you could play Halo 3 on a handheld running Linux with an OG Xbox 360 controller through Steam, they would have called you crazy.

      • WhiskyTangoFoxtrot@lemmy.world
        3 upvotes · 10 months ago

        Halo 3 is seventeen years old. Ten years ago, a seventeen-year-old game would be something like Quake 2 or Castlevania: Symphony of the Night, both of which could easily be run on handhelds by that time.

        • Pika@sh.itjust.works
          4 upvotes · 10 months ago

          Have you looked into repairing them yourself? Had it happen with my PS4 controller and it was fairly simple to fix myself, and it cost significantly less than buying a whole new controller

        • 🇰 🌀 🇱 🇦 🇳 🇦 🇰 ℹ️@yiffit.net
          4 upvotes · 10 months ago · edited

          I had it pretty consistently with every Joy-Con I went through, but I’ve had my PS5 for a little over a year and I use it for PC gaming too without issue on the DualSense. Did they redesign them between when they first launched and more recently, maybe? I worry about it all the time because they’re $80. I can’t be replacing them all the time.

          • doublejay1999@lemmy.world
            2 upvotes · 10 months ago

            Yeah, we had the Joy-Con fiasco as well.

            He runs 2 controllers - one has been fine, the other 2 got the drift.

            Best breakdown here https://www.youtube.com/watch?v=7qPNyio3VDk&pp=ygUPRHVhbHNlbnNlIGRyaWZ0

            It’s wild to me that flagship consoles ship with weak controllers - I was reminded of it when I was using the X360 controller, which is battered and probably 15 years old, and it still works perfectly.

            Even more annoying is that I think they are selling a ‘pro’ controller now, the DualSense Edge or something, for 150-odd.

      • Kiosade@lemmy.ca
        4 upvotes · 1 downvote · 10 months ago
        10 months ago

        I had mine maybe 8 months before the left stick started drifting hard. Completely unusable. And Sony wanted me to go through all these hoops AND spend like 20 bucks to ship it to them.

        Ended up getting an 8bit Pro Ultimate instead, and so far it’s worked great! Has Hall-effect joysticks too, so no chance of drifting ever. The major console makers NEED to switch to HE for the next gen.

    • rimjob_rainer@discuss.tchncs.de
      3 upvotes · 10 months ago

      I’m still on my first DualSense, and my DualShocks from PS3 and PS4 still work without any issues. I don’t want to know what people do to their controllers.

    • starman2112@sh.itjust.works
      2 upvotes · 10 months ago · edited

      My Xbox Series S controller got stick drift like 3 months after I got it. My friend’s finally succumbed last week, after about a year of owning it. What is it with stick drift on new controllers? Seems like every modern system has the exact same problem

    • brbposting@sh.itjust.works
      5 upvotes · 10 months ago

      Cherish it (though maybe not its power requirements?) - based on the big ole chunky bois I’ve seen at the dump 📺 (looked like those rear projector models or something).

    • Ep1cFac3pa1m@lemmy.world
      4 upvotes · 1 downvote · 10 months ago

      Same here. My 40” Sharp Aquos Quattron is not only still working, but working flawlessly. It’s also got way more inputs than any TV that size today, and a swivel stand that I use all the time. I’m in no hurry to replace it.