# Configuration Options

LLMBoost Hub (`lbh`) can be configured through environment variables and a configuration file to customize your deployment.
## Environment Variables

LLMBoost Hub uses the following environment variables:

### Core Configuration
| Variable | Description | Default |
|---|---|---|
| `LBH_HOME` | Base directory for all LLMBoost Hub data | `~/.llmboost_hub` (host), `/llmboost_hub` (container) |
| `LBH_MODELS` | Directory for storing model assets | `$LBH_HOME/models` |
| `LBH_WORKSPACE` | Mounted user workspace for file transfers | `$LBH_HOME/workspace` (host), `/user_workspace` (container) |
| `LBH_LICENSE_PATH` | Path to your LLMBoost license file | Auto-prompted if not set |
| `HF_TOKEN` | HuggingFace access token | Injected automatically when set |
### Example Configuration

```shell
# Set a custom base directory
export LBH_HOME=/data/llmboost

# Store models in a custom directory
export LBH_MODELS=/lustre1/$USER/llm_models

# Set a HuggingFace token (or use `hf auth login`)
export HF_TOKEN=hf_your_token_here

# Verify the configuration
lbh -v list
```
## Configuration File

A configuration file containing all settings is stored at `$LBH_HOME/config.yaml`.
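As an illustration, such a file might look like the fragment below. The key names here are an assumption of this sketch, not a documented schema; inspect your own `config.yaml` for the authoritative contents.

```yaml
# Hypothetical $LBH_HOME/config.yaml — key names are illustrative only
models_dir: /data/llmboost/models
workspace_dir: /data/llmboost/workspace
license_path: /data/llmboost/license.lic
```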
### Precedence Order

Settings are resolved in the following order (highest to lowest):

- Environment variables (highest priority)
- Configuration file (`$LBH_HOME/config.yaml`)
- Default values (lowest priority)
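This resolution order can be sketched in plain shell with parameter-expansion fallbacks. The `config_models` variable below merely stands in for a value read from `config.yaml`; the paths are illustrative.

```shell
# Default value (lowest priority)
default_models="$HOME/.llmboost_hub/models"
# Value that would come from config.yaml (illustrative stand-in)
config_models="/data/llm_models"
# Environment variable (highest priority)
export LBH_MODELS="/lustre1/$USER/llm_models"

# Each fallback applies only when every higher-priority value is unset
resolved="${LBH_MODELS:-${config_models:-$default_models}}"
echo "$resolved"   # the environment variable wins
```

Unsetting `LBH_MODELS` in this sketch would make `resolved` fall back to the config value, and unsetting both would yield the built-in default.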
## Important Notes

### Changing LBH_HOME

- `LBH_HOME` can only be changed by setting the environment variable (e.g., in `~/.bashrc`)
- Changing `LBH_HOME` will create a new data directory and reset all configuration
- Existing models and containers will not be accessible until you switch back
### Environment Variables

Set environment variables in your shell profile (`~/.bashrc` or `~/.zshrc`) to persist them across sessions.
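For example, on a bash setup the settings shown earlier can be persisted like this (the paths are the same illustrative ones used above):

```shell
# Append LLMBoost Hub settings to the shell profile so they
# survive new sessions
echo 'export LBH_HOME=/data/llmboost' >> ~/.bashrc
echo 'export LBH_MODELS=/lustre1/$USER/llm_models' >> ~/.bashrc

# Reload the profile in the current session
source ~/.bashrc
```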
## Server Configuration

When using `lbh serve`, you can customize server behavior:

```shell
# Serve on a custom host and port
lbh serve meta-llama/Llama-3.1-8B-Instruct --host 0.0.0.0 --port 8080

# Run in detached mode (don't wait for the server to be ready)
lbh serve meta-llama/Llama-3.1-8B-Instruct --detached

# Force serve (skip GPU utilization checks)
lbh serve meta-llama/Llama-3.1-8B-Instruct --force
```
## Docker Configuration

Pass additional Docker flags after `--` when running containers:

```shell
# Custom memory limits
lbh run meta-llama/Llama-3.1-8B-Instruct -- --memory=32g --memory-swap=32g

# Custom network settings
lbh run meta-llama/Llama-3.1-8B-Instruct -- --network=my-network

# Mount additional volumes
lbh run meta-llama/Llama-3.1-8B-Instruct -- -v /my/data:/data
```
Next: Learn about all available commands in the Command Reference.