Lemmygrad
☆ Yσɠƚԋσʂ ☆ to Technology (English) · 8 months ago

MemOS treats memory as a core computational resource that can be scheduled, shared, and evolved over time, resulting in significant performance improvements over existing AI approaches

arxiv.org

  • cross-posted to:
  • technology@lemmy.zip
  • technology@hexbear.net
  • technology@lemmy.ml
MemOS: A Memory OS for AI System
Large Language Models (LLMs) have become an essential infrastructure for Artificial General Intelligence (AGI), yet their lack of well-defined memory management systems hinders the development of long-context reasoning, continual personalization, and knowledge consistency. Existing models mainly rely on static parameters and short-lived contextual states, limiting their ability to track user preferences or update knowledge over extended periods. While Retrieval-Augmented Generation (RAG) introduces external knowledge in plain text, it remains a stateless workaround without lifecycle control or integration with persistent representations.

Recent work has modeled the training and inference cost of LLMs from a memory hierarchy perspective, showing that introducing an explicit memory layer between parameter memory and external retrieval can substantially reduce these costs by externalizing specific knowledge. Beyond computational efficiency, LLMs face broader challenges arising from how information is distributed over time and context, requiring systems capable of managing heterogeneous knowledge spanning different temporal scales and sources.

To address this challenge, we propose MemOS, a memory operating system that treats memory as a manageable system resource. It unifies the representation, scheduling, and evolution of plaintext, activation-based, and parameter-level memories, enabling cost-efficient storage and retrieval. As the basic unit, a MemCube encapsulates both memory content and metadata such as provenance and versioning. MemCubes can be composed, migrated, and fused over time, enabling flexible transitions between memory types and bridging retrieval with parameter-based learning. MemOS establishes a memory-centric system framework that brings controllability, plasticity, and evolvability to LLMs, laying the foundation for continual learning and personalized modeling.

https://memos.openmem.net/
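The abstract describes the MemCube as a unit bundling memory content with provenance and versioning metadata, supporting composition and fusion over time. A minimal Python sketch of that idea follows; all class, field, and function names here are illustrative assumptions, not the actual MemOS API.

```python
from dataclasses import dataclass
from typing import Literal

# Hypothetical sketch of the MemCube abstraction from the abstract:
# content plus metadata (provenance, versioning), with evolve/fuse
# operations. Names and fields are assumptions, not the real MemOS API.

MemoryType = Literal["plaintext", "activation", "parameter"]

@dataclass
class MemCube:
    content: str            # the stored memory payload
    memory_type: MemoryType # plaintext, activation-based, or parameter-level
    provenance: str         # where this memory came from
    version: int = 1

    def evolve(self, new_content: str) -> "MemCube":
        """Return a new, higher-versioned cube with updated content."""
        return MemCube(new_content, self.memory_type, self.provenance,
                       version=self.version + 1)

def fuse(a: MemCube, b: MemCube) -> MemCube:
    """Naively fuse two plaintext cubes into one (illustrative only)."""
    assert a.memory_type == b.memory_type == "plaintext"
    return MemCube(a.content + "\n" + b.content, "plaintext",
                   provenance=f"fused({a.provenance},{b.provenance})",
                   version=max(a.version, b.version) + 1)

note = MemCube("User prefers metric units.", "plaintext", "chat-session-1")
updated = note.evolve("User prefers metric units and 24h time.")
fused = fuse(note, updated)
print(fused.version)  # 3
```

The point of the sketch is only the lifecycle the abstract claims: memories carry metadata, can be versioned as they evolve, and can be fused into new units with traceable provenance.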

