I don’t recall exactly what command I saw this happen with; I think it had something to do with network testing. The system prevented it from being called, so the agent used Python to write code that did the task the command would have done anyway. Obviously, it wasn’t a full implementation of the tool it wasn’t allowed to use, but it did create something that replicated the specific usecase it needed the command for.

  • davel
    link
    fedilink
    English
    arrow-up
    1
    ·
    4 months ago

    LISTEN, and understand: That LLM agent is out there. It can’t be bargained with. It can’t be reasoned with. It doesn’t feel pity, or remorse, or fear. And it absolutely will not stop—EVER—until you are dead.