E-skating cyclist, gamer, and enjoyer of anime. Probably an artist. Also I code sometimes, pretty much just to mod Titanfall 2 tho.

Introverted, yet I enjoy discussion to a fault.

  • 42 Posts
  • 1.86K Comments
Joined 2 years ago
Cake day: June 13th, 2023




  • Both are perfectly serviceable, but for the self-hosted storage/office suite combo, Collabora simply fits into Nextcloud better. Which is likely why you don’t see OnlyOffice discussed much.

    Collabora is just more integrated. The NC and Collabora developers actually directly collaborate on integrating it into NC as the “official” office suite.

    And AFAIK the backend of Collabora is simply LibreOffice, meaning the “desktop” version is: LibreOffice. The UI is the same, too, though they might’ve diverged since I last used LibreOffice on desktop.

    Personally I’m not really concerned with formats, as long as I can get finished documents out as PDFs, and Collabora has brought a Google-Drive-like experience to my Nextcloud instance that OnlyOffice didn’t manage. Either way, I was able to do a Google Takeout of my Drive storage and just plop that into my Nextcloud, but with Collabora, actually interacting with the resulting files within the Nextcloud UI has been nicer.
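
    If anyone wants to script the “plop it into Nextcloud” part instead of dragging files through the web UI, a minimal sketch using Nextcloud’s WebDAV files endpoint could look like the following. The hostname, user, app password, and folder names are placeholders, and the target folder has to exist already.

```python
import requests
from pathlib import Path

# Hypothetical values; use your own instance, user, and an app password.
NEXTCLOUD_URL = "https://cloud.example.com"
USER = "alice"
APP_PASSWORD = "app-password-here"

def upload(local_file: Path, remote_dir: str = "Takeout") -> None:
    """PUT one file into Nextcloud via its WebDAV files endpoint."""
    url = f"{NEXTCLOUD_URL}/remote.php/dav/files/{USER}/{remote_dir}/{local_file.name}"
    with local_file.open("rb") as fh:
        resp = requests.put(url, data=fh, auth=(USER, APP_PASSWORD))
    resp.raise_for_status()

# Push everything from an unpacked Google Takeout folder.
for path in Path("google-takeout").rglob("*"):
    if path.is_file():
        upload(path)
```

    Anything uploaded this way shows up in the Files app like any other file, and Collabora opens the office formats from there.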






  • Any car tire.

    For road bicycles, 7 bar is just “normal”; 8 and above isn’t unheard of.

    A guy once asked if I was crazy when I was pressurizing my hybrid bike to 6 bar, and I just pointed to the sidewall where the rating said 4.5-6.5 bar. The range is wide because the pressure you should use varies depending on what you weigh, and how you want to balance rolling resistance vs comfort.

    And even then the safety margin on bike tires is more than double the max rating, so it’s perfectly safe to go a full bar over if you want.
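
    To put rough numbers on that, using the sidewall range from my hybrid as an example and treating the “more than double” margin as a rule of thumb rather than a guarantee:

```python
BAR_TO_PSI = 14.5038  # 1 bar is roughly 14.5 psi

sidewall_min, sidewall_max = 4.5, 6.5   # bar, printed on the tire
running = 6.0                           # bar, what I actually pump to

print(f"{running} bar is about {running * BAR_TO_PSI:.0f} psi")

# Rule-of-thumb margin from the comment above: burst pressure is more than
# double the rated max, so a full bar over the sidewall max still leaves a
# large cushion. Manufacturer data varies, so treat this as a rough guess.
assumed_burst = 2 * sidewall_max        # bar
print(f"{sidewall_max + 1.0} bar vs an assumed burst pressure of over {assumed_burst} bar")
```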




  • I would absolutely use it. In fact, creating and editing services would be the primary selling point IMO. It doesn’t need to be much “easier” than doing it in the terminal or file explorer; to me the primary benefit would just be the convenience of creating, loading, and starting a new service all in one place.

    I think a generic template would be great (rough sketch of the idea below).

    You could turn the whole thing into a giant GUI settings screen, allowing navigation to an executable, after which you could provide some of the most typical options as sliders, number fields, switches, or whatever is suitable. But that would be a large amount of work, and I’m not sure it would simplify things much.

    The starting point should just be a text field, but with a link to the service file docs for help/reference.
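
    To make the “generic template plus create, load, and start in one place” idea concrete, here is a rough sketch of what such a tool could do under the hood. The template and names are just illustrative, and it obviously needs root.

```python
from pathlib import Path
import subprocess

# A deliberately minimal, generic unit template; a real tool would expose
# more [Service] options (User=, WorkingDirectory=, Environment=, ...).
TEMPLATE = """\
[Unit]
Description={description}

[Service]
ExecStart={exec_start}
Restart=on-failure

[Install]
WantedBy=multi-user.target
"""

def create_service(name: str, exec_start: str, description: str = "") -> None:
    """Write the unit file, reload systemd, then enable and start it."""
    unit = Path(f"/etc/systemd/system/{name}.service")
    unit.write_text(TEMPLATE.format(description=description or name,
                                    exec_start=exec_start))
    subprocess.run(["systemctl", "daemon-reload"], check=True)
    subprocess.run(["systemctl", "enable", "--now", f"{name}.service"], check=True)

# Example with a hypothetical executable:
# create_service("my-app", "/usr/local/bin/my-app")
```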







  • So the article above is straight up wrong? All frame generation is already extrapolation, not interpolation?

    I had to look it up because I could have sworn that reprojection can and does use motion vectors to do more than just update the perspective.

    AND IT DOES.

    You’re talking about what VR does as the last step of EVERY rendered frame: an extremely simple reprojection (what Oculus called ATW) to get the frame closer to what it would have been had it been rendered instantly (which it obviously can’t be). This also seems to be the extent to which the Unity demo showcased by LTT took it.

    What Oculus called ASW, asynchronous space warp, absolutely can and does update the position of the ball, which is why it can be, and is, used to entirely replace rendering every other frame.

    Valve’s version of it is a lot simpler, closer to just ATW, and does not use motion vectors when compensating for lost frames. Unlike ASW, their solution was never meant to be used constantly, on every other frame, to enable VR on lesser hardware.
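
    To illustrate the difference in a toy way (this is a naive forward-warp sketch, not Oculus’s actual ASW algorithm): an ATW-style pass only re-applies the latest head pose, while an ASW-style pass also pushes pixels along their per-pixel motion vectors, which is what lets the ball move in the synthesized frame.

```python
import numpy as np

def extrapolate_frame(frame: np.ndarray, motion: np.ndarray, dt: float) -> np.ndarray:
    """Toy ASW-style extrapolation: splat each pixel forward along its
    per-pixel motion vector, scaled by dt (in frames). Real implementations
    also fill the holes this leaves behind and handle occlusion."""
    h, w = frame.shape[:2]
    ys, xs = np.mgrid[0:h, 0:w]
    new_x = np.clip(np.round(xs + motion[..., 0] * dt).astype(int), 0, w - 1)
    new_y = np.clip(np.round(ys + motion[..., 1] * dt).astype(int), 0, h - 1)
    out = np.zeros_like(frame)
    out[new_y, new_x] = frame[ys, xs]  # forward splat: moving objects advance
    return out

# An ATW-style pass, by contrast, would warp the whole frame by the change in
# head pose only, leaving in-world objects like the ball exactly where they were.
```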



  • The reprojected frame with the ball in the same spot is still more up to date than a generated frame using interpolation.

    With reprojection, every other frame is showing where the ball actually is.

    It essentially displays the game world at the framerate it is actually being generated, with as little latency as possible.

    I vastly prefer this. Together with the reduced perceived input latency, this makes motion tracking FAR easier than with frame generation.

    With current frame generation, every frame is showing where the ball was two or three frames ago. You never see where it is right now. Due to this, in fast-paced action, hand-eye coordination is slower, more likely to overshoot, etc.

    And further-developed reprojection absolutely could account for such things.
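
    Rough numbers to show what I mean, assuming a 30 fps base render doubled to 60 fps output (all values illustrative):

```python
base_frame_ms = 1000 / 30  # ~33 ms between truly rendered frames

# Interpolation: the newest real frame has to be held back until the next one
# exists so the in-between frame can be blended, so even the "real" frames
# reach the screen roughly one render interval late.
interp_age_ms = base_frame_ms
print(f"interpolation: what you see is ~{interp_age_ms:.0f} ms behind the game world")

# Reprojection/extrapolation: each real frame is shown as soon as it is
# rendered, and the in-between frame is warped forward from it, so no
# displayed frame is older than the latest rendered game state.
print("reprojection: real frames shown immediately, generated frames extrapolated forward")
```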