
Configuration Options

LLMBoost Hub (lbh) can be configured using environment variables and a configuration file to customize your deployment.

Environment Variables

LLMBoost Hub uses the following environment variables:

Core Configuration

| Variable | Description | Default |
| --- | --- | --- |
| `LBH_HOME` | Base directory for all LLMBoost Hub data | `~/.llmboost_hub` (host), `/llmboost_hub` (container) |
| `LBH_MODELS` | Directory for storing model assets | `$LBH_HOME/models` |
| `LBH_WORKSPACE` | Mounted user workspace for file transfers | `$LBH_HOME/workspace` (host), `/user_workspace` (container) |
| `LBH_LICENSE_PATH` | Path to your LLMBoost license file | Auto-prompted if not set |
| `HF_TOKEN` | HuggingFace access token | Injected automatically when set |

Example Configuration

# Set custom base directory
export LBH_HOME=/data/llmboost

# Store models in a custom directory
export LBH_MODELS=/lustre1/$USER/llm_models

# Set HuggingFace token (or use `hf auth login`)
export HF_TOKEN=hf_your_token_here

# Verify configuration
lbh -v list

Configuration File

LLMBoost Hub stores all of its settings in a configuration file at $LBH_HOME/config.yaml.
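The exact schema of config.yaml depends on your lbh version; the fragment below is only an illustrative sketch mapping the same settings the environment variables cover, and the key names are assumptions, not the documented schema:

```yaml
# $LBH_HOME/config.yaml -- illustrative sketch; key names are assumptions
models: /lustre1/alice/llm_models        # overridden by LBH_MODELS if set
workspace: /data/llmboost/workspace      # overridden by LBH_WORKSPACE if set
license_path: /home/alice/llmboost.lic   # overridden by LBH_LICENSE_PATH if set
```

Environment variables always take precedence over values in this file (see the precedence order below).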

Precedence Order

Settings are resolved in the following order (highest to lowest):

  1. Environment variables (highest priority)
  2. Configuration file ($LBH_HOME/config.yaml)
  3. Default values (lowest priority)
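The precedence order above can be sketched with plain shell parameter expansion. This is not lbh's actual implementation, just an illustration: the environment variable wins if set, otherwise a (hypothetical) config-file value, otherwise the built-in default.

```shell
#!/bin/sh
# Illustrative precedence resolution for LBH_MODELS (not lbh's real code).

config_file_models="/opt/config_models"        # as if read from config.yaml (hypothetical value)
default_models="$HOME/.llmboost_hub/models"    # built-in default

# ${VAR:-fallback} picks the first value that is set and non-empty:
# env var > config file > default.
resolved="${LBH_MODELS:-${config_file_models:-$default_models}}"
echo "resolved LBH_MODELS: $resolved"
```

Running this with `LBH_MODELS` exported prints the exported path; unsetting it falls through to the config-file value, and clearing both yields the default.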

Important Notes

Changing LBH_HOME
  • LBH_HOME can only be changed by setting the environment variable (e.g., in ~/.bashrc)
  • Changing LBH_HOME will create a new data directory and reset all configuration
  • Existing models and containers will not be accessible until you switch back
Environment Variables

Set environment variables in your shell profile (~/.bashrc or ~/.zshrc) to persist them across sessions.
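For example, appending the exports to your profile makes them take effect in every new shell (the paths below are placeholders; substitute your own):

```shell
# Persist LLMBoost Hub settings across sessions.
profile="$HOME/.bashrc"            # use ~/.zshrc if your shell is zsh
cat >> "$profile" <<'EOF'
# LLMBoost Hub configuration
export LBH_HOME=/data/llmboost
export LBH_MODELS=/lustre1/$USER/llm_models
EOF
tail -n 3 "$profile"               # confirm the lines were appended
```

Open a new terminal (or run `source ~/.bashrc`) for the settings to apply to your current session.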

Server Configuration

When using lbh serve, you can customize server behavior:

# Serve on custom host and port
lbh serve meta-llama/Llama-3.1-8B-Instruct --host 0.0.0.0 --port 8080

# Run in detached mode (don't wait for ready)
lbh serve meta-llama/Llama-3.1-8B-Instruct --detached

# Force serve (skip GPU utilization checks)
lbh serve meta-llama/Llama-3.1-8B-Instruct --force

Docker Configuration

Pass additional Docker flags when running containers:

# Custom memory limits
lbh run meta-llama/Llama-3.1-8B-Instruct -- --memory=32g --memory-swap=32g

# Custom network settings
lbh run meta-llama/Llama-3.1-8B-Instruct -- --network=my-network

# Mount additional volumes
lbh run meta-llama/Llama-3.1-8B-Instruct -- -v /my/data:/data

Next: Learn about all available commands in the Command Reference.