• gerbler@lemmy.world · 1 day ago

    A buddy of mine worked in a theatre and told me the films were all 1080p. I called bullshit. Those screens were huge; they were clearly 4K. He showed me the reel, and yup, he was right.

    If theatres don’t even bother with 4K, your TV doesn’t need 8K.

    • JohnEdwa@sopuli.xyz · edited · 24 hours ago

      Actual film doesn’t work like that (35mm or 70mm IMAX, for example), but you are correct that most cinemas these days are digital and use “1080p” (more accurately DCI 2K, which is 2048×1080, a 1.90:1 aspect ratio). There are a few that do 4K, but overall not that many.

      The main reason that’s enough for cinema, though, is that those “1080p” films are something like 500GB with very little compression, displayed through a DLP projector, so they look a heck of a lot better than a Blu-ray shown on a massive TV with palm-sized pixels.
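
      A quick back-of-the-envelope makes that gap concrete. A minimal sketch, assuming a ~2 hour runtime, the ~500GB DCP figure above, and an assumed ~40GB Blu-ray video track:

      ```python
      RUNTIME_S = 2 * 60 * 60  # assumed ~2 hour feature, in seconds

      def avg_bitrate_mbps(size_gb: float, seconds: int) -> float:
          """Average bitrate in Mbps for a file of the given size and runtime."""
          return size_gb * 8 * 1000 / seconds

      dcp = avg_bitrate_mbps(500, RUNTIME_S)    # ~556 Mbps for the cinema DCP
      bluray = avg_bitrate_mbps(40, RUNTIME_S)  # ~44 Mbps for a decent Blu-ray
      print(f"DCP ~{dcp:.0f} Mbps vs Blu-ray ~{bluray:.0f} Mbps ({dcp / bluray:.1f}x)")
      ```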

      • Obi@sopuli.xyz · 21 hours ago

        Also, you’re quite far away from the screen, so even if it’s bigger you don’t need as much resolution.
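
        To put rough numbers on that: what matters is pixels per degree of your field of view, and ~60 px/deg is the usual 20/20 acuity ballpark. A minimal sketch with assumed, illustrative sizes and distances (a ~12 m wide cinema screen seen from ~20 m vs a ~1.4 m wide 65" TV seen from ~2.5 m):

        ```python
        import math

        def pixels_per_degree(h_res: int, screen_width_m: float, distance_m: float) -> float:
            """Horizontal pixels packed into one degree of the viewer's field of view."""
            fov_deg = 2 * math.degrees(math.atan(screen_width_m / (2 * distance_m)))
            return h_res / fov_deg

        print(pixels_per_degree(2048, 12.0, 20.0))  # cinema 2K from the seats: ~61 px/deg
        print(pixels_per_degree(3840, 1.4, 2.5))    # 65" 4K TV from the sofa: ~123 px/deg
        ```

        So under those assumptions, a 2K projection at cinema distance already sits right around the acuity limit.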

  • rottingleaf@lemmy.world · 1 day ago

    Somebody (probably those guys thinking about how they’ll sell more stuff) is operating in a reality where “you can, but you shouldn’t” can’t be said in human language.

    I’ve been reminiscing about parts of my childhood where I’d watch a lot of karate matches and try to repeat the moves (I know it sounds stupid). The fights, the lights, the room, my grandma, the summer evening outside: those matter in these memories. Not how many logical dots there were on that goddamn screen.

    One of the most depressing things about now is how, even compared to 10 years ago, people are thinking not about new things to do with tech, but about doing old things with more resources wasted, because that’s apparently better.

    1920×1080 seems like overkill sometimes. 8K? Why the hell? And with more expensive, shorter, less reliable cables, all other things being equal.

    I guess hoping for cheap thin LPD displays is useless.

    • Llewellyn@lemm.ee · 1 day ago

      Resolutions higher than full HD are useful for more than just the TV in front of your sofa. You need 8K (and even higher) for VR headsets. Also, there are already cinemas with giant displays instead of projectors. Don’t be a retrograde.

      • rottingleaf@lemmy.world · edited · 1 day ago

        Don’t be a retrograde.

        I already know that launching stuff into orbit this way is more expensive, but there are situations where a retrograde launch makes sense. Mainly of a military nature.

        I refuse to accept other meanings of the word.

        Also, there are already cinemas with giant displays instead of projectors.

        Tool for the job. Instinctively, a projector and a screen seem to me like a better solution than a humongous LCD display. But if that’s cheaper (how in the world, though?), then let them do as they want.

  • baatliwala@lemmy.world · 1 day ago

    I doubt the general public cares about, or can even tell, the difference between 4K and 8K. Not to mention the amount of bandwidth that will be required.

    • spyd3r@sh.itjust.works · 1 day ago

      I expect the content delivery companies to do something stupid with 8K again (like when they rolled out 4K) and totally nerf the bitrate and encoding quality, making it look worse than a properly encoded, high-bitrate 720p/1080p file.
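
      One way to see why that looks worse: compare bits per pixel rather than resolution. A sketch with assumed, illustrative bitrates (~16 Mbps for a streamed 4K layer, ~20 Mbps for a well-made 1080p encode):

      ```python
      def bits_per_pixel(mbps: float, width: int, height: int, fps: float) -> float:
          """Average bits available per pixel per frame at a given bitrate."""
          return mbps * 1_000_000 / (width * height * fps)

      print(bits_per_pixel(16, 3840, 2160, 24))  # nerfed streaming 4K: ~0.08 bpp
      print(bits_per_pixel(20, 1920, 1080, 24))  # high-bitrate 1080p:  ~0.40 bpp
      ```

      The 4K stream has to spread far fewer bits over each pixel, so fine detail gets smeared despite the higher resolution.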

    • Squizzy@lemmy.world · 1 day ago

      This exact comment could have been made about 1080p to 4K. That said, 4K has had a lot less fanfare than HD did.

      • rottingleaf@lemmy.world · 1 day ago

        I really can’t name a movie whose cinematographic virtues were better seen in 4K than in 1080p. Color quality, refresh rates: maybe. But the number of pixels? You see a rectangle 1080 pixels tall and 1920 pixels wide. How the hell are you going to discern a group of 2 or 3 pixels in that, let alone a single pixel (I’m aware each is in fact a group of 3 colored subpixels)? And even if you could, what would it change in your perception of the movie? Which detail?

        Apple users sometimes boast about how fonts look nicer on their screens, but IRL I haven’t seen much difference compared to a glossy screen with a lower resolution.

      • baatliwala@lemmy.world · edited · 1 day ago

        Not really.

        • Like I said, the bandwidth required is insanely high: 4K movies are around 30-60 GB (for download, not streaming). Extrapolate that to 8K, and then to a TV series? No one is going to reserve potentially 500 GB to 1 TB for an 8K TV series for local download (rough math sketched below).
        • Phones, especially mid-rangers, still somewhat struggle with playing 4K content locally, not to mention that even 4K is useless at the screen size of a phone. Everything is mobile-driven these days in terms of design.
        • 8K is not going to be worth it for monitors because of their size.
        • Top-of-the-line graphics cards still can’t max out the latest games at 4K60, so how are they going to achieve 8K? Not to mention the power draw will be higher.

        Sorry, but there’s no way native 8K is going mainstream any time soon. 4K genuinely looks better than 1080p, but 8K over 4K is not much of a difference for the relative increase in requirements.
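
        The storage math from the first point, as a rough sketch (assuming 8K has 4x the pixels of 4K, that compressed size scales roughly with pixel count, and an assumed 10-episode season at about half a movie’s size per episode):

        ```python
        movie_4k_gb = (30, 60)  # the 30-60 GB download range cited above
        episodes = 10           # assumed season length

        movie_8k_gb = tuple(4 * s for s in movie_4k_gb)              # ~120-240 GB per movie
        season_8k_gb = tuple(episodes * s / 2 for s in movie_8k_gb)  # ~600-1200 GB per season
        print(movie_8k_gb, season_8k_gb)
        ```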

  • Reygle@lemmy.world · 2 days ago

    Just for the record, the HDMI consortium can place their mouths on my genitals and consume my waste

  • flop_leash_973@lemmy.world · 2 days ago

    I couldn’t care less about 8K since I can’t even stream 4K content without using a platform infested with DRM.

  • circuitfarmer@lemmy.sdf.org · 2 days ago

    I haven’t even gotten on the 4K bandwagon yet. I fully expected to by now, but then again, my eyes aren’t getting any better and 1080p content still looks… fine.

    • Cratermaker@discuss.tchncs.de · 1 day ago

      A few weeks ago I watched Ladyhawke on a 13" TV with a built-in VHS player. I realized that my brain didn’t care about the quality as soon as I started paying attention to the content. I still like my 1080p, but there are definitely massively diminishing returns after that.

      • rottingleaf@lemmy.world · edited · 1 day ago

        I realized that my brain didn’t care about the quality as soon as I started paying attention to the content.

        You are a genius! At least compared to everyone seriously discussing how important it is to replace one barely (if at all) visible pixel with 4, or better yet 9, or 16 pixels. More of everything. If the movie you’re buying takes up 4 times more space, then it must be 4 times more content. There’s such a nice word, “content”, as if food for one’s brain and soul, which is art, could be factory-produced on a schedule.

    • AbidanYre@lemmy.world · 2 days ago

      I have a 4K TV. I don’t think I’ve ever actually watched something on it in 4K because finding the content isn’t worth the effort.

      • ikidd@lemmy.world · 2 days ago

        I have to filter out all the 4K feeds I get on Kodi because I can’t play them. I sure haven’t seen a shortage of them. Whether they play at actual 4K is another question, but they’ve been there for years.

        • AbidanYre@lemmy.world · 2 days ago

          That’s fair. Maybe effort is the wrong word. Between the extra money for Netflix and the hard drive space to “borrow” from my friends on the Internet, it never seemed worthwhile over 1080p.

    • Toribor@corndog.social · edited · 2 days ago

      1440p at 120Hz+ is superior to 4K at 60Hz and is much more achievable for most hardware anyway. That’s the sweet spot in my opinion.

      For media, though, 4K is a pretty big upgrade from 1080p.

      • barsoap@lemm.ee · edited · 2 days ago

        Higher refresh rates for movies are meh at best; VRR, OTOH, is a godsend, since 24Hz just won’t fit evenly into 60Hz. Gaming, too, is much nicer when you have VRR; it figures that delayed frames are quite a bit less noticeable than dropped frames.
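
        A minimal sketch of why 24 fps won’t fit a fixed 60Hz refresh: 60/24 = 2.5, so without VRR each film frame has to be held for alternating 3 and 2 refreshes (3:2 pulldown), which makes on-screen frame durations uneven (judder):

        ```python
        for frame in range(6):
            refreshes = 3 if frame % 2 == 0 else 2  # the 3:2 pulldown cadence
            print(f"frame {frame}: held for {refreshes} refreshes "
                  f"= {refreshes / 60 * 1000:.1f} ms on screen")
        # With VRR the panel just refreshes every 1/24 s (~41.7 ms), perfectly evenly.
        ```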

  • caseyweederman@lemmy.ca · 2 days ago

    My next TV purchase will be based on which models have DisplayPort.

    …And which don’t have smart features, but that’s a given.

    • Tikiporch@lemmy.world · 19 hours ago

      No TVs have DP, and the largest monitors you can find now are below 55". I wish you luck.

    • Kbobabob@lemmy.world · 2 days ago

      That’s just a commercial display. Most commercial displays don’t have an OS and require a separate device, like an Nvidia Shield or a PC, to show video.

    • pHr34kY@lemmy.world · 2 days ago

      I got a new Android TV for offline use. Most people say you get an OK experience if you don’t connect the TV to a network.

      The biggest remaining annoyance is that it takes 45 seconds to cold-start. Almost as if it’s booting an OS designed for a phone or something.

    • mac@lemm.ee · 2 days ago

      Mine would be as well, but tbh I don’t love the Kodi UI. At least I didn’t a few years ago when I tried it.

      Maybe Nvidia will drop a new Shield with DP support, but I’m not going to hold my breath on that.

      • caseyweederman@lemmy.ca · 2 days ago

        The transformation into a crotchety old man is complete. This AI being shoehorned into everything can get off my damn lawn too.

        • Buddahriffic@lemmy.world · 21 hours ago

          This is different from the “old man angry at change” meme. The change isn’t the problem; personally, I like change and seeing evolutionary and revolutionary improvements.

          The problem is that so many of these changes are for the benefit of the corporations involved in the product at the expense of anyone who ends up using it or is near enough to be affected collaterally.

          The idea of a smart TV is nice. Except they put in underpowered hardware that struggles to display a menu. Maybe because of all the data it’s gathering and sending home, or the time it spends making sure the latest ads are downloaded.

          Smart appliances are also a nice idea. Except most just want to connect to some proprietary web service so they can middleman every interaction to sell your data or a subscription.

          A smart car also sounds cool. Except they’re also designed to make more money, either via more expensive repairs (possibly even forced through a manufacturer-approved mechanic, because security features lock out the competition) or via the usual selling of your data and ads. Oh, and they can save money by sticking a bunch of controls into the software instead of making physical buttons. They save even more by using underpowered hardware and probably not bothering with UX design. Maybe even deliberately, since bad experiences can be upsells. They also want to sell subscriptions to whatever they can, including things that don’t even benefit from going through their services.

          It’s all just rent-seeking.

    • Is DVI completely out of the picture? I hate the connector, but I’ve had a lot of issues with DP, mainly around Linux support and multi-monitor setups.

      I was kinda hoping USB-C/4/Thunderbolt would step into this space and normalize chaining and small connectors, but all of those monitors are stupidly expensive.

      • caseyweederman@lemmy.ca · 2 days ago

        The only problems I’ve had with DP are when I have to put it through an adapter to turn it into HDMI for a display that didn’t have DP input.

        Video over USB-C just ends up being DisplayPort, doesn’t it? I guess it depends on the subtype of USB.

      • ramble81@lemm.ee · 2 days ago

        DVI isn’t capable of the bandwidth needed for higher resolutions. Even dual-link maxes out at about 8 Gbps and 2560×1600 @ 60Hz. This new HDMI spec is 96 Gbps, for reference.
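
        Rough math behind those limits, as a sketch (24-bit color plus an assumed ~20% blanking overhead; real figures depend on the blanking standard used):

        ```python
        def data_rate_gbps(w: int, h: int, hz: int, bpp: int = 24,
                           blanking: float = 1.2) -> float:
            """Approximate uncompressed video data rate in Gbps."""
            return w * h * hz * bpp * blanking / 1e9

        print(data_rate_gbps(2560, 1600, 60))   # ~7.1 Gbps: right at dual-link DVI's ceiling
        print(data_rate_gbps(3840, 2160, 120))  # ~28.7 Gbps: HDMI 2.1 territory
        print(data_rate_gbps(7680, 4320, 120))  # ~114.7 Gbps: beyond even 96 Gbps, uncompressed
        ```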

        Ironically though, HDMI is pin-compatible with DVI, and you could output HDMI to a DVI monitor with just a simple HDMI-to-DVI cable, or vice versa. I know a lot of people like DP, but in order to convert it you need active circuitry, and that can impact quality if you don’t have native DP on both ends.

          • ramble81@lemm.ee · 2 days ago

            USB4 uses something called DP Alt Mode, where it’s actually DisplayPort over the USB-C connector: exactly what you’re looking for. I have a portable USB-C monitor that runs power and video over a single connection.

      • Uli@sopuli.xyz · 2 days ago

        I also find USB to be limiting when it comes to range. I can go about 50 feet with a nice thick copper HDMI cable, but anything further than 20 feet or so on USB necessitates fiber optics. Not an issue for everyone, but something I have been running into.

        • That’s one I hadn’t encountered. At those distances I start contemplating wireless solutions.

          Got myself a nice outdoor PoE camera, a bunch of appropriate Cat cable and a power adapter… and then realized that since the previous owners had put in a sheetrock ceiling (not complaining; drop ceilings make me feel like I’m living in a warehouse), I had no easy way to run the Ethernet all the way across the house from where the switch was to where I wanted the camera. I’d been thinking that since the utility room wasn’t finished, I’d figure out a way to thread it with a flashlight and a “step 3: ???”. The moral is that running long wires is not my favorite thing.

  • dan@upvote.au · 2 days ago

    “Ultra96” sounds like it could have been a codename for the Nintendo 64.

    • otp@sh.itjust.works · 2 days ago

      Or the GameCube…or an add-on to the N64.

      The N64’s codename was the Ultra 64, after all!

  • Jestzer@lemmy.world · edited · 2 days ago

    Good thing the word Premium® is there to let me know it’s a high-quality product!