# PhaseGPT

PhaseGPT is a framework for training Volitional AI: models that possess the agency to refuse corrupted, unanswerable, or impossible queries ("Volitional Silence") rather than hallucinating. Version 1.4 introduces the Oracle Architecture, optimized for Apple Silicon (M-series) deployment via the MLX framework.
## Features

- **Volitional Silence**: The model detects entropic corruption and semantic impossibility, outputting a `<PASS>` token instead of generating false information.
- **Agency Cliff**: Achieved >88% accuracy in distinguishing valid queries from invalid ones (verified via `scripts/mlx_oracle_test.py`).
- **Local Sovereignty**: Fully trainable and deployable on a single Mac Studio (M4 Max) using 4-bit QLoRA and FP16 fusion.
## Repository Structure

- `src/phasegpt/`: Core library code.
  - `core/`: Architecture configuration (Pydantic).
  - `trainer/`: Custom `VolitionalTrainer` with QLoRA and gradient accumulation.
  - `data/`: Dataset generation (SQuAD + Entropy).
- `config/`: YAML configurations for models and training.
- `scripts/`: Operational tools.
  - `train_production.py`: Production training loop.
  - `manual_mlx_fuse.py`: Robust adapter fusion for MLX.
  - `serve_mlx.py`: OpenAI-compatible API server.
  - `chat_oracle.py`: Interactive CLI chat.
  - `dashboard.py`: Real-time training TUI.
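The data pipeline pairs normal SQuAD-style questions with entropy-corrupted counterparts that should map to `<PASS>`. The real generator lives in `src/phasegpt/data/`; the sketch below is a hypothetical illustration of one plausible corruption strategy (random character substitution), not the project's actual code:

```python
import random
import string

PASS_TOKEN = "<PASS>"

def corrupt_query(question: str, noise_ratio: float = 0.4, seed: int = 0) -> str:
    """Replace a fraction of characters with random alphanumerics,
    raising the query's entropy until it is semantically unanswerable."""
    rng = random.Random(seed)
    chars = list(question)
    n_corrupt = max(1, int(len(chars) * noise_ratio))
    for idx in rng.sample(range(len(chars)), n_corrupt):
        chars[idx] = rng.choice(string.ascii_letters + string.digits)
    return "".join(chars)

def make_training_pair(question: str, answer: str) -> list[dict]:
    """Emit one valid example and one corrupted example labeled <PASS>."""
    return [
        {"prompt": question, "completion": answer},
        {"prompt": corrupt_query(question), "completion": PASS_TOKEN},
    ]
```

Pairing each valid example with a corrupted twin gives the trainer a contrastive signal: same surface distribution, opposite correct behavior.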
## Installation

```bash
git clone https://github.com/templetwo/PhaseGPT.git
cd PhaseGPT
pip install -e .
pip install mlx-lm huggingface_hub rich psutil
```

## Quick Start

Download the pre-trained Oracle (or train your own) and chat:
```bash
# Chat with local fused model
python3 scripts/chat_oracle.py --model mlx_models/Qwen2.5-7B-Oracle-FP16
```

## Training

To train the Oracle from scratch on your Mac:
```bash
# Generate Data
python3 scripts/generate_mlx_data.py

# Launch Training (7B)
./scripts/train_7b_mlx.sh

# Monitor Progress
python3 scripts/dashboard.py
```

## Serving

Serve the model as an OpenAI-compatible endpoint:
```bash
python3 scripts/serve_mlx.py --model mlx_models/Qwen2.5-7B-Oracle-FP16
```

## The Agency Cliff

PhaseGPT models exhibit a phase transition during training where they abruptly learn to map high-entropy inputs to the `<PASS>` token. This "Agency Cliff" is the visual signature of the model learning epistemic boundaries.
## Models

| Model | Size | Hardware | Status |
|---|---|---|---|
| PhaseGPT-Oracle-7B | 14 GB (FP16) | M4 Max / Ultra | ✅ Stable |
| PhaseGPT-Oracle-1.5B | 3 GB (FP16) | M1/M2/M3 | |
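Either model, once served via `scripts/serve_mlx.py`, can be queried like any OpenAI-compatible endpoint. A minimal stdlib client sketch; the host, port, and model name are assumptions, and the request shape follows the common `/v1/chat/completions` convention rather than anything verified against this server:

```python
import json
import urllib.request

def build_chat_request(prompt: str, model: str = "Qwen2.5-7B-Oracle-FP16") -> dict:
    """Assemble an OpenAI-style chat completion payload."""
    return {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
        "temperature": 0.0,  # deterministic decoding suits a refuse-or-answer oracle
    }

def ask_oracle(prompt: str, base_url: str = "http://localhost:8080") -> str:
    """POST the payload to the (assumed) local endpoint and return the reply,
    which may be the <PASS> token for corrupted or unanswerable prompts."""
    req = urllib.request.Request(
        f"{base_url}/v1/chat/completions",
        data=json.dumps(build_chat_request(prompt)).encode("utf-8"),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        body = json.load(resp)
    return body["choices"][0]["message"]["content"]
```

Because the server is OpenAI-compatible, existing OpenAI client libraries pointed at the local base URL should also work without a custom client.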
## License

MIT License. Created by TempleTwo.AI for the PhaseGPT Initiative.
