magnet

📖 docs | 💻 examples | 📓 substack

the small distributed language model toolkit

โšก๏ธ fine-tune state-of-the-art LLMs anywhere, rapidly โšก๏ธ


🧬 installation

pip install llm-magnet

or, from a clone of this repo:

python3 setup.py install

๐ŸŽ‰ usage

check out the example notebooks

a snippet to get you started:

from magnet.base import Magnet, EmbeddedMagnet

# EmbeddedMagnet spins up a local, in-process cluster for quick experiments
cluster = EmbeddedMagnet()
cluster.start()
magnet = cluster.create_magnet()
await magnet.align()  # top-level await: run inside a notebook or async context

config = {
    "host": "127.0.0.1",
    "credentials": None,
    "domain": None,
    "name": "my_stream",            # stream name
    "category": "my_category",      # subject/category for messages
    "kv_name": "my_kv",             # key-value bucket
    "session": "my_session",
    "os_name": "my_object_store",   # object store bucket
    "index": {
        "milvus_uri": "127.0.0.1",
        "milvus_port": 19530,
        "milvus_user": "test",
        "milvus_password": "test",
        "dimension": 1024,          # output dimension of the model below
        "model": "BAAI/bge-large-en-v1.5",
        "name": "test",
        "options": {
            "metric_type": "COSINE",
            "index_type": "HNSW",
            "params": {
                "efConstruction": 40,
                "M": 48
            }
        }
    }
}

magnet = Magnet(config)  # or connect with an explicit config instead
await magnet.align()
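
for reference, the values in config["index"]["options"] follow the standard Milvus index parameter format. a minimal standalone pymilvus sketch, independent of magnet and assuming the same local Milvus instance (the collection and field names here are illustrative), that builds an equivalent collection and HNSW index:

from pymilvus import connections, Collection, CollectionSchema, FieldSchema, DataType

# connect to the same Milvus instance the config above points at
connections.connect(host="127.0.0.1", port=19530, user="test", password="test")

fields = [
    FieldSchema(name="id", dtype=DataType.INT64, is_primary=True, auto_id=True),
    # 1024 is the output dimension of BAAI/bge-large-en-v1.5
    FieldSchema(name="embedding", dtype=DataType.FLOAT_VECTOR, dim=1024),
]
collection = Collection("test", CollectionSchema(fields))

# the same options as config["index"]["options"]
collection.create_index(
    field_name="embedding",
    index_params={
        "metric_type": "COSINE",
        "index_type": "HNSW",
        "params": {"efConstruction": 40, "M": 48},
    },
)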

🔮 features

  • โšก๏ธ It's Fast
    • fast on consumer hardware
    • very fast on Apple Silicon
    • extremely fast on ROCm/CUDA
  • ๐Ÿซต Automatic or your way
    • rely on established transformer patterns to let magnet do the work
    • keep your existing data processing functions, bring them to magnet!
  • ๐Ÿ›ฐ๏ธ 100% Distributed
    • processing, embedding, storage, retrieval, querying, or inference from anywhere
    • as much or as little compute as you need
  • ๐Ÿงฎ Choose Inference Method
    • HuggingFace
    • vLLM node
    • GPU
    • mlx
  • ๐ŸŒŽ Huge Volumes
    • handle gigantic amounts of data inexpensively
    • fault-tolerant by design
    • decentralized workloads
  • ๐Ÿ” Secure
    • JWT
    • Basic
  • ๐Ÿชต World-Class Comprehension
    • magnet optionally logs its own code as it's executed (yes, really)
    • build a self-aware system and allow it to learn from itself
    • emojis are the future
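
the distributed layer is built on NATS. a minimal nats-py sketch, independent of magnet and assuming a NATS server with JetStream enabled on the default port 4222, showing the primitives that the stream, key-value, and object store names in the config above correspond to:

import asyncio
import nats

async def main():
    nc = await nats.connect("nats://127.0.0.1:4222")
    js = nc.jetstream()

    # a stream, key-value bucket, and object store like the ones the
    # config names ("my_stream", "my_kv", "my_object_store")
    await js.add_stream(name="my_stream", subjects=["my_category"])
    kv = await js.create_key_value(bucket="my_kv")
    store = await js.create_object_store(bucket="my_object_store")

    # publish a payload onto the stream and stash a value in the bucket
    await js.publish("my_category", b"hello from a worker")
    await kv.put("my_session", b"session state")

    await nc.close()

asyncio.run(main())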

🧲 why

  • build a distributed LLM research node with any hardware, from a Raspberry Pi to the expensive cloud
  • Apple Silicon is a first-class citizen thanks to mlx (see the sketch below)
  • embed & index to a vector DB with Milvus
  • distributed processing with NATS
  • upload to S3
  • the ideal cyberpunk vision of LLM power users in vectorspace
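
the mlx point is easy to try in isolation. a tiny sketch, independent of magnet and assuming mlx is installed (pip install mlx, Apple Silicon only): mlx arrays live in unified memory and computation is lazy until mx.eval runs.

import mlx.core as mx

# a toy batch of four vectors at the config's embedding dimension (1024)
x = mx.random.normal((4, 1024))
w = mx.random.normal((1024, 1024))

y = x @ w      # lazy: builds a compute graph, nothing runs yet
mx.eval(y)     # evaluation happens here, on the GPU by default
print(y.shape)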