• CrawlMarks [he/him]@hexbear.net · 29 points · 7 months ago

    I think the biggest factor here is having infrastructure good enough to trust the internet connection with someone’s life. Like, one lag spike and we get to see whether lungs have rollback netcode.

  • Awoo [she/her]@hexbear.net · 6 points · 7 months ago

    When it says “negligible latency”, what does that actually translate to?

    30ms I can agree with, and even up to 80ms I could see not really affecting surgery. If we’re talking 300ms or higher, that would take incredible patience to deal with, because you would notice it on every single action.

    • miz [any, any]@hexbear.net · 12 points · 7 months ago (edited)

      As a lower bound, the speed of light over 5000 km is just under 17 ms one way, so about 34 ms for a round trip. As complete speculation, I’m guessing they got it down to around 50-60 ms.
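
      A quick back-of-envelope sketch of that math (assuming the ~5000 km figure above; the vacuum speed of light is the theoretical floor, and light in optical fiber travels roughly a third slower, so any real link sits above these numbers):

      ```python
      # Back-of-envelope propagation delay over an assumed ~5000 km path.
      # Vacuum speed of light is the absolute lower bound; signals in optical
      # fiber travel at roughly c / 1.47, and routing/switching adds more on top.

      C_VACUUM_KM_S = 299_792   # speed of light in vacuum, km/s
      FIBER_INDEX = 1.47        # rough refractive index of optical fiber (assumption)
      DISTANCE_KM = 5_000       # assumed one-way distance

      one_way_vacuum_ms = DISTANCE_KM / C_VACUUM_KM_S * 1000
      one_way_fiber_ms = one_way_vacuum_ms * FIBER_INDEX

      print(f"one-way (vacuum floor):  {one_way_vacuum_ms:.1f} ms")      # ~16.7 ms
      print(f"round trip (vacuum):     {2 * one_way_vacuum_ms:.1f} ms")  # ~33.4 ms
      print(f"round trip (fiber only): {2 * one_way_fiber_ms:.1f} ms")   # ~49 ms
      ```

      Which lines up with the 50-60 ms guess once fiber and a bit of routing overhead are factored in.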

  • Hohsia [any]@hexbear.net · 5 points · 7 months ago

    Literally the only type of use case for AI with tangible benefits

    I truly dread the day physical robots achieve ChatGPT-esque adoption in the West. That will be gg

      • vegeta1 [he/him]@hexbear.net · 5 points · 7 months ago

        Yeah, this I can get behind. Not the absolute hotdog water the Silicon Valley dark-enlightenment types try to burn our planet for.

        • FortifiedAttack [any]@hexbear.net · 3 points · 7 months ago

          Even the text generation models have uses. For programmers, there are local models (i.e. ones that don’t require huge computational power) that can provide code completion similar in style to what static analysis tools used to give, just much more general and not restricted to a particular language.

          Other models can give you quick answers to basic system-operation questions that search engines have gotten too shitty to query for. It’s just way faster than wading through irrelevant, ad-littered results to maybe find something tangentially relevant to your question.