SharedLLM is open research and open code. The bets harness, the coordinator, the node daemon, and the wire-format work all live in one repository. You can run a node, train a specialist, propose a new bet, or submit a hardened replacement for an existing one. Below are the four most useful ways to contribute.
A node registers with a coordinator and either serves inference (primary) or accepts offloaded layers (worker). The minimum on a laptop:
```sh
# install
pip install -e .

# coordinator (one machine on the LAN)
sharedllm coordinator --host 0.0.0.0 --port 8420

# primary on the machine with the model
sharedllm node --role primary \
  --model <path-to-gguf> \
  --coordinator-url http://<coord-ip>:8420

# worker on any other machine
sharedllm node --role worker \
  --coordinator-url http://<coord-ip>:8420 \
  --rpc-port 50052 --lan-addr <your-ip>:50052
```
Models need 512-aligned hidden dimensions for RPC tensor offload — TinyLlama, Llama 3, Phi-3 work; SmolLM2-360M does not. The multi-endpoint transport (RFC-0001) lets nodes advertise LAN, WAN, and relay candidates separately so primaries pick the best path.
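To make RFC-0001's candidate model concrete, here is a minimal sketch of how a worker might advertise its transport candidates to the coordinator. The `/endpoints` route, the payload shape, and the `kind`/`addr`/`priority` fields are illustrative assumptions, not the actual wire format.

```python
from dataclasses import dataclass, asdict

import requests  # assumes the requests package is available

@dataclass
class EndpointCandidate:
    kind: str      # "lan", "wan", or "relay" (illustrative tags)
    addr: str      # host:port for the primary to try
    priority: int  # lower = preferred when reachable

def advertise(coordinator_url: str, node_id: str,
              candidates: list[EndpointCandidate]) -> None:
    # Route name and payload shape are assumptions; RFC-0001 only
    # says candidates are advertised per transport class.
    payload = {"node_id": node_id,
               "endpoints": [asdict(c) for c in candidates]}
    resp = requests.post(f"{coordinator_url}/endpoints", json=payload, timeout=5)
    resp.raise_for_status()

advertise("http://192.168.1.10:8420", "worker-01", [
    EndpointCandidate("lan", "192.168.1.23:50052", 0),
    EndpointCandidate("wan", "203.0.113.7:50052", 1),
    EndpointCandidate("relay", "relay.example.net:443", 2),
])
```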
A specialist is a centrally trained model that joins the federation by registering an RFC-0006 manifest. The minimum loop, in sketch form:
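Everything here is illustrative rather than normative: the manifest fields and the `sharedllm specialist register` subcommand are assumptions about the RFC-0006 interface, so check the RFC before relying on them.

```python
import json
from pathlib import Path

# 1. Train centrally with your usual stack and export a GGUF
#    checkpoint (training itself is out of scope here).
checkpoint = Path("checkpoints/contracts-qa.gguf")

# 2. Write an RFC-0006 manifest describing the specialist. Every
#    field below is an illustrative guess, not the normative schema.
manifest = {
    "name": "contracts-qa",
    "version": "0.1.0",
    "artifact": str(checkpoint),
    "domains": ["legal"],
}
Path("contracts-qa.manifest.json").write_text(json.dumps(manifest, indent=2))

# 3. Register it so the coordinator can route to it. The subcommand
#    is an assumed mirror of the node commands above:
#
#    sharedllm specialist register \
#        --manifest contracts-qa.manifest.json \
#        --coordinator-url http://<coord-ip>:8420
```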
New research questions are welcome. The format is:
- `experiments/bets/NN_short_name.py`, where `NN` is the next free number (a skeleton is sketched below).
- `experiments/bets/_common.py` for registry setup, specialist loading, and result writing.
- `experiments/bets/results/NN_*.json` for output; then run `00_rollup.py` to regenerate `SUMMARY.md`.

The most valuable contributions are the unglamorous ones: replacing a flimsy bet with a stricter version. If a bet relies on a single seed, runs without a negative control, or reads an overfit final-step number as a victory, write the disambiguating follow-up.
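As a starting point for a new bet, here is a skeleton that also bakes in the hardening habits above (multiple seeds, a shuffled-label negative control). The `_common` helper names and the registry accessors are guesses at the API the list describes; check `_common.py` for the real signatures.

```python
# experiments/bets/NN_short_name.py  (NN = next free number)
# The _common helper names here are assumptions; check _common.py
# for the real registry/loading/result-writing API.
import random

from _common import setup_registry, load_specialist, write_result

SEEDS = [0, 1, 2]  # several seeds, so one lucky draw cannot carry the bet

def score(model, examples) -> float:
    # model.answer() is a stand-in for however specialists are queried
    correct = sum(model.answer(ex.prompt) == ex.label for ex in examples)
    return correct / len(examples)

def run(seed: int, negative_control: bool) -> float:
    random.seed(seed)
    registry = setup_registry()
    model = load_specialist(registry, "contracts-qa")  # hypothetical name
    examples = list(registry.eval_set("legal"))        # assumed accessor
    if negative_control:
        labels = [ex.label for ex in examples]
        random.shuffle(labels)                         # destroy the signal
        for ex, label in zip(examples, labels):
            ex.label = label
    return score(model, examples)

# Writes experiments/bets/results/NN_short_name.json (assumed behavior)
write_result("NN_short_name", {
    "real":    [run(s, negative_control=False) for s in SEEDS],
    "control": [run(s, negative_control=True) for s in SEEDS],
})
```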
Submitting a falsification of an existing claim is treated as a success in this harness, not a defeat. The retraction itself is evidence the methodology is real.
See also:

- `experiments/bets/SUMMARY.md` for the current rolled-up bet results.
- `agent-relay` (Rust CLI) for contributor coordination across machines.