• 0 Posts
  • 27 Comments
Joined 2 months ago
Cake day: July 10th, 2024

  • Zacryon@feddit.org to Science Memes@mander.xyz · Horseshoe crabs be like · 7 upvotes, 20 downvotes · 1 day ago

    Fair point. Although one could say this is fine here for comic purposes.

    The same argument could be made about the statement “God’s perfect creation”. But I’d argue that suggesting a creationist god distances the meme even further from scientific contexts, whereas simple speech bubbles are fine because they carry less potential for ideological conflict.

    Admittedly, I am also rather allergic to religions, which is why I have a hard time with that part of the meme.








  • I feel this. I fell into a similar rabbit hole when I tried to get real-time feedback on the program’s own memory usage, distinguishing things like reserved versus actually used virtual memory. It felt like black magic and, I suppose, was ultimately not doable within the expected time constraints without touching the kernel. I spent too much time on that and had to move on with no better solution than measuring/computing the allocated memory of the largest payload data types.
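    For what it’s worth, on Linux one rough starting point for this kind of introspection is /proc/self/status, which separates reserved virtual memory (VmSize) from memory actually resident in RAM (VmRSS). A minimal sketch, assuming a Linux system; the polling loop and the dummy payload allocation are purely illustrative:

```python
# Minimal sketch (Linux only, illustrative): poll /proc/self/status and
# compare reserved virtual memory (VmSize) with memory actually resident
# in RAM (VmRSS). Field names are the standard /proc conventions.
import time

def read_memory_kb():
    """Return selected memory fields for this process, in kB."""
    fields = {}
    with open("/proc/self/status") as f:
        for line in f:
            key, _, value = line.partition(":")
            if key in ("VmSize", "VmRSS", "VmData"):
                fields[key] = int(value.split()[0])  # value looks like "  123456 kB"
    return fields

if __name__ == "__main__":
    payload = [0] * 10_000_000  # allocate something so the numbers visibly move
    for _ in range(3):
        mem = read_memory_kb()
        print(f"reserved (VmSize): {mem['VmSize']} kB, "
              f"resident (VmRSS): {mem['VmRSS']} kB")
        time.sleep(1.0)
```

    Peak figures (VmPeak, VmHWM) live in the same file; anything finer-grained than these counters is where it starts to feel like kernel territory.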








  • If we’re speaking of transformer models like ChatGPT, BERT or whatever: they don’t have memory at all.

    The closest thing that resembles memory is the accepted length of the input sequence combined with the attention mechanism. (Left unmodified, though, this leads to a quadratic increase in computation time as that sequence grows.) And since the attention weights are a learned property, it is in practice likely that earlier tokens of the input sequence get basically ignored the further they lie “in the past”, as they usually do not contribute much to the current context.

    “In the past”: transformers technically “see” the whole input sequence at once. But they are equipped with positional encoding, which incorporates spatial and/or temporal ordering into the input sequence (e.g., the position of words in a sentence). That way they can model sequential relationships such as those found in natural language (sentences), videos, movement trajectories and other kinds of contextually coherent sequences. A rough sketch of both mechanisms follows below.
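    To make those two pieces concrete, here is a minimal NumPy sketch of sinusoidal positional encoding plus single-head scaled dot-product self-attention. The learned Q/K/V projections are omitted for brevity, and the toy shapes are illustrative only:

```python
# Minimal sketch (NumPy, illustrative): sinusoidal positional encoding plus
# single-head scaled dot-product self-attention. The (seq_len, seq_len) score
# matrix is what makes the cost grow quadratically with sequence length.
import numpy as np

def positional_encoding(seq_len, d_model):
    """Sinusoidal positional encoding as in 'Attention Is All You Need'."""
    pos = np.arange(seq_len)[:, None]          # (seq_len, 1)
    i = np.arange(d_model)[None, :]            # (1, d_model)
    angles = pos / np.power(10000.0, (2 * (i // 2)) / d_model)
    return np.where(i % 2 == 0, np.sin(angles), np.cos(angles))

def self_attention(x):
    """Single-head scaled dot-product self-attention with Q = K = V = x."""
    d_k = x.shape[-1]
    scores = x @ x.T / np.sqrt(d_k)            # (seq_len, seq_len)
    scores -= scores.max(axis=-1, keepdims=True)   # numerically stable softmax
    weights = np.exp(scores)
    weights /= weights.sum(axis=-1, keepdims=True)
    return weights @ x, weights

seq_len, d_model = 8, 16
x = np.random.randn(seq_len, d_model) + positional_encoding(seq_len, d_model)
out, attn = self_attention(x)
print(attn.shape)  # (8, 8): every token attends to every other token
```

    Doubling the sequence length quadruples that attention matrix, which is the quadratic blow-up mentioned above; and without the positional-encoding term, the attention itself is permutation-equivariant, i.e. it has no notion of token order.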





  • Zacryon@feddit.org to Memes@lemmy.ml · Firefox + Ublock = 👑 · 6 upvotes, 3 downvotes · 1 month ago

    I’ve read the announcement. It sounds reasonable and sufficiently private to me, so saying “Mozilla wants your data” comes across as misleading and like an overreaction. It might also help to mitigate the arms race between privacy protection and tracking for ads and worse.

    > Mozilla is definitely going to try more scummy crap like this in the future.

    How do you know that?

    Even if it does, there will still be alternatives. But right now, Firefox is the best browser with regard to privacy and security. It even passed the minimum ratings of the German IT security authority, unlike other widely used browsers.