• vvvvv@lemmy.world · 26 points · 6 months ago

      106 Gbps

      They reach this result on 0.6 MB of data (paper, page 5).

      They even say:

      Moreover, there is no need to evaluate our design with datasets larger than the ones we have used; we achieve steady state performance with our datasets

      This requires an explanation. I do see the need - if you promise 100 Gbps, you need to process at least a few Tb.
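
      Just to put that scale into numbers (a rough back-of-the-envelope sketch using the figures above, nothing from the paper itself):

      ```python
      # How long the whole 0.6 MB benchmark input lasts at the claimed rate
      dataset_bits = 0.6e6 * 8                   # 0.6 MB in bits (decimal megabytes assumed)
      claimed_rate = 106e9                       # the 106 Gbps headline figure, in bits per second
      print(dataset_bits / claimed_rate * 1e6)   # ~45 microseconds
      ```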

      • neatchee@lemmy.world · 13 points · 6 months ago

        Imagine you have a car powered by a nuclear reactor with enough fuel to last 100 years and a stable output of energy. Then you put it on a 5-mile road made up of the same 250 small segments in various configurations, but you know for a fact that it starts and ends at the same elevation. You also know that this car gains exactly as much performance going downhill as it loses going uphill.

        You set the car driving and determine that it takes 15 minutes to travel the 5 miles. You reconfigure the road, same rules, and do it again. Same result, 15 minutes. You do this again and again and again and always get 15 minutes.

        Do you need to test the car on a 20-mile road of the same configuration to know that it goes 20 mph?

        JSON is a text-based, uncompressed format. It has very strict rules and a limited number of data types and structures. Further, it cannot contain computational logic on its own. The contents can be interpreted after being read to extract logic, but the JSON itself cannot change its own computational complexity. As such, it’s simple to express every possible form and complexity a JSON object can take within just 0.6 MB of data. And once they know they can process that file in however-the-fuck-many microseconds, they can extrapolate to Gbps from there.
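
        A minimal sketch of that extrapolation (standard-library Python standing in for the parser under test; the file name, run count, and timings are made up, not the paper's setup):

        ```python
        import json
        import time

        def throughput_gbps(path: str, runs: int = 1000) -> float:
            """Parse one small JSON file many times and report the steady-state rate."""
            data = open(path, "rb").read()          # e.g. a ~0.6 MB sample covering every JSON construct
            start = time.perf_counter()
            for _ in range(runs):
                json.loads(data)                    # stand-in for the parser actually being benchmarked
            elapsed = time.perf_counter() - start
            return len(data) * 8 * runs / elapsed / 1e9   # gigabits per second

        # print(throughput_gbps("sample.json"))     # "sample.json" is a made-up file name
        ```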

        • vvvvv@lemmy.world · 2 points · 6 months ago

          Based on your analogy, they drive the car for 7.5 inches (614.4 Kb times 63,360 inches times 20, divided by 103,179,878.4 Kb) and, based on that, promise that the car travels 20 mph. That might be true, yes, but the scale disproportion is too considerable not to require tests. This is not maths, this is a real physical device - how it would behave on larger real data remains to be seen.
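
          Spelling that arithmetic out (same numbers as above; the 103,179,878.4 Kb figure is presumably about one second of data at the claimed rate):

          ```python
          sample_kb = 614.4                         # the 0.6 MB test file, in Kb
          one_second_at_claim_kb = 103_179_878.4    # ~1 s of data at the claimed rate, per the figure above
          road_miles = 20                           # the 20-mile road from the analogy
          inches_per_mile = 63_360

          print(sample_kb / one_second_at_claim_kb * road_miles * inches_per_mile)   # ~7.5 inches
          ```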

          • neatchee@lemmy.world · 1 point · edited · 6 months ago

            Except we know what the lifecycle of physical storage is, its rate of performance decay (virtually none for solid state until failure), and that the computers performing the operations have consistent performance for the same operations over time. And again, while for a car such a small amount can’t reasonably be extrapolated, for a computer processing an extremely simple format like JSON, when it is designed to handle FAR more difficult tasks on the GPU involving billions of floating-point operations, it is absolutely, without a doubt, enough.

            You don’t have to believe me if you don’t want to, but I’m very confident in my understanding of JSON’s complexity relative to typical GPU workloads, computational analysis, computer hardware durability lifecycles, and software testing principles and best practices. 🤷

      • trolololol@lemmy.world · 1 point · 6 months ago

        But to write such a file you’d need a few quantum computers map-reducing the data in alternative universes.