• 0 Posts
  • 2 Comments
Joined 3 months ago
Cake day: June 29, 2024

  • There are chain-of-thought and tree-of-thought approaches, and maybe even more. From what I understand, the model generates the answer in several passes, and even with smaller models you can get better results.

    However, it is funny how AI (LLMs) is heavily marketed as a thing that will make many jobs obsolete and/or take over humanity. Yet to get any meaningful results, people start building whole pipelines around LLMs, probably even using several models for different tasks. I also read a little about retrieval-augmented generation (RAG), and apparently it has a lot of caveats in terms of what data can and cannot be successfully extracted: data has to be chunked to fit into the context while still retaining all the valuable information, and this problem does not have a "one size fits all" solution.

    Overall it feels like someone made a black box (the LLM), someone else tried to use this black box to deal with the existing complexity, failed, and started building another layer of complexity around the black box. So ultimately current AI adopters can find themselves with two complex entities at hand. And I find it kind of funny.
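To make the "several passes" idea concrete: one common multi-pass scheme is self-consistency, where you sample the same question several times and keep the majority answer. A minimal sketch, with a hypothetical `generate` function standing in for the actual LLM call:

```python
from collections import Counter

def generate(prompt: str, seed: int) -> str:
    # Hypothetical stand-in for an LLM call; a real pipeline would sample
    # from a model with temperature > 0 so each pass can differ.
    return "42" if seed % 3 else "41"  # deterministic toy "noise"

def self_consistency(prompt: str, passes: int = 5) -> str:
    # Run several independent passes and keep the majority answer.
    answers = [generate(prompt, seed=i) for i in range(passes)]
    return Counter(answers).most_common(1)[0][0]
```

With enough passes, occasional wrong answers get outvoted, which is why even smaller models can benefit.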
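And to illustrate the RAG chunking caveat: the crudest strategy is fixed-size chunks with some overlap, so a fact that straddles a chunk boundary still appears whole in at least one chunk. A toy sketch (real pipelines usually split on semantic boundaries like sentences or sections instead of raw character counts):

```python
def chunk(text: str, size: int = 200, overlap: int = 50) -> list[str]:
    # Fixed-size character chunks; consecutive chunks share `overlap`
    # characters so boundary-spanning information is not lost.
    step = size - overlap
    return [text[i:i + size] for i in range(0, len(text), step)]
```

Even this tiny example exposes the trade-off: bigger overlap means less information loss but more redundant chunks to embed and search, and the right balance depends on the data, hence no "one size fits all".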


  • Fair enough. I am looking into buying a PC only as a server, but as I am a migrant of sorts still trying to settle down, that will happen sometime in 2025, if not 2026. And right now a laptop + phone cover basically all my needs, i.e. work, gaming, reading, surfing the web, interacting with the local government. Not to mention that it is much easier to get around with those compared to the headache that is moving a PC :)

    And from my experience most PC users now are either people who bought one 10+ years ago and just still have it, or people really invested in AAA gaming. Everyone else has a combination of a smartphone and a tablet/laptop.