This guide covers best practices for managing API keys and secrets when working with the AI Knowledge API.
- Copy the example environment file:

```bash
cp .env.example .env
```

- Add your Hugging Face API key:

```bash
HF_API_KEY=hf_your_actual_key_here
MODEL_EMBEDDING=sentence-transformers/all-MiniLM-L6-v2
MODEL_LLM=microsoft/phi-2
```

- Verify `.gitignore` includes `.env`:

```gitignore
# Environment variables
.env
.env.local
.env.*.local
```

- Load environment variables in your application:

The API automatically loads environment variables using Python's `os.environ` or libraries like `python-dotenv`.
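As one illustration, a minimal loader in the spirit of `python-dotenv` might look like this (`load_env_file` is a hypothetical helper, not part of the API):

```python
import os
from pathlib import Path

def load_env_file(path=".env"):
    """Minimal .env loader: one KEY=value per line; '#' lines are comments.

    Existing environment variables are never overwritten.
    """
    env_path = Path(path)
    if not env_path.exists():
        return
    for line in env_path.read_text().splitlines():
        line = line.strip()
        if not line or line.startswith("#") or "=" not in line:
            continue
        key, _, value = line.partition("=")
        os.environ.setdefault(key.strip(), value.strip())
```

In practice, prefer the battle-tested `python-dotenv` package (`load_dotenv()`), which also handles quoting and multiline values.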
- ❌ Hardcode API keys in source code
- ❌ Commit `.env` files to version control
- ❌ Share your `.env` file with others
- ❌ Use production keys in development

- ✅ Use `.env` for local development
- ✅ Keep `.env` in `.gitignore`
- ✅ Use different keys for dev/staging/prod
- ✅ Rotate keys regularly
- ✅ Use environment-specific `.env` files
- Go to your Space settings
- Navigate to "Variables and secrets"
- Add your secrets:

```
Name: HF_API_KEY
Value: hf_your_production_key_here
Type: Secret
```
Secrets are:
- Encrypted at rest
- Not visible in logs
- Not accessible in forks
- Automatically injected as environment variables
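Because secrets are injected as ordinary environment variables, application code can fail fast at startup if one is missing — a sketch (`get_hf_key` is a hypothetical helper, not part of the API):

```python
import os

def get_hf_key(env=os.environ):
    """Read the injected secret and fail fast with a clear message if absent."""
    key = env.get("HF_API_KEY", "")
    if not key:
        raise RuntimeError(
            "HF_API_KEY is not set; add it under Variables and secrets"
        )
    return key
```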
Option 1: Pass environment variables at runtime

```bash
docker run -p 7860:7860 \
  -e HF_API_KEY=hf_your_key_here \
  -e MODEL_EMBEDDING=sentence-transformers/all-MiniLM-L6-v2 \
  -v $(pwd)/data:/app/data \
  ai-knowledge-api
```

Option 2: Use an `.env` file

```bash
docker run -p 7860:7860 \
  --env-file .env \
  -v $(pwd)/data:/app/data \
  ai-knowledge-api
```

Option 3: Docker Compose with secrets

```yaml
version: '3.8'
services:
  api:
    build: .
    ports:
      - "7860:7860"
    environment:
      - HF_API_KEY=${HF_API_KEY}
    env_file:
      - .env
    volumes:
      - ./data:/app/data
```

Using AWS Secrets Manager:
```python
import boto3
import json

def get_secret(secret_name):
    client = boto3.client('secretsmanager', region_name='us-east-1')
    response = client.get_secret_value(SecretId=secret_name)
    return json.loads(response['SecretString'])

# Usage
secrets = get_secret('ai-knowledge-api/prod')
HF_API_KEY = secrets['HF_API_KEY']
```

Using Environment Variables in ECS/Lambda:
Set environment variables in:
- ECS Task Definition
- Lambda Function Configuration
- Elastic Beanstalk Environment Properties
Using Secret Manager:
```python
from google.cloud import secretmanager

def get_secret(project_id, secret_id):
    client = secretmanager.SecretManagerServiceClient()
    name = f"projects/{project_id}/secrets/{secret_id}/versions/latest"
    response = client.access_secret_version(request={"name": name})
    return response.payload.data.decode("UTF-8")

# Usage
HF_API_KEY = get_secret("my-project", "hf-api-key")
```

Using Azure Key Vault:
```python
from azure.identity import DefaultAzureCredential
from azure.keyvault.secrets import SecretClient

def get_secret(vault_url, secret_name):
    credential = DefaultAzureCredential()
    client = SecretClient(vault_url=vault_url, credential=credential)
    return client.get_secret(secret_name).value

# Usage
HF_API_KEY = get_secret("https://myvault.vault.azure.net/", "hf-api-key")
```

Example GitHub Actions workflow using repository secrets:

```yaml
name: Deploy
on:
  push:
    branches: [main]
jobs:
  deploy:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v3
      - name: Deploy to production
        env:
          HF_API_KEY: ${{ secrets.HF_API_KEY }}
        run: |
          # Your deployment script
          docker build -t ai-knowledge-api .
          docker run -e HF_API_KEY=$HF_API_KEY ai-knowledge-api
```

Add secrets:

- Go to repository Settings → Secrets and variables → Actions
- Click "New repository secret"
- Add `HF_API_KEY` with your key
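To catch misconfigured CI secrets early, the deploy step could first run a small pre-flight check (a sketch; `missing_secrets` is a hypothetical helper, not part of the project):

```python
import os
import sys

REQUIRED = ("HF_API_KEY",)

def missing_secrets(required=REQUIRED, env=os.environ):
    """Return the names of required variables that are unset or empty."""
    return [name for name in required if not env.get(name)]

if __name__ == "__main__":
    missing = missing_secrets()
    if missing:
        sys.exit("Missing required secrets: " + ", ".join(missing))
```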
- `.env` file is in `.gitignore`
- No API keys in source code
- Using separate keys for dev/prod
- `.env.example` has placeholder values only
- All secrets stored in secure vault
- No hardcoded credentials in codebase
- Environment-specific configurations separated
- API keys have appropriate permissions
- Using encrypted secret management service
- API keys rotate regularly
- Monitoring for unauthorized access
- Rate limiting enabled
- HTTPS/TLS enabled
- Logs don't contain secrets
- Search codebase for common secret patterns:

```bash
# Check for potential secrets (note: git grep excludes paths via pathspec,
# not --exclude-dir)
git grep -E "API_KEY|SECRET|PASSWORD|TOKEN" -- . ':(exclude)node_modules'
```
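The grep above can be complemented with a small scanner that flags likely token patterns; the regexes here are illustrative examples, not an exhaustive ruleset:

```python
import re
from pathlib import Path

# Illustrative patterns only; extend for the providers you actually use.
SECRET_PATTERNS = [
    re.compile(r"hf_[A-Za-z0-9]{20,}"),  # Hugging Face-style tokens
    re.compile(r"(?i)(api_key|secret|password|token)\s*=\s*['\"][^'\"]+['\"]"),
]

def scan_file(path):
    """Return (line_number, line) pairs that match a secret pattern."""
    hits = []
    text = Path(path).read_text(errors="ignore")
    for lineno, line in enumerate(text.splitlines(), 1):
        if any(p.search(line) for p in SECRET_PATTERNS):
            hits.append((lineno, line.strip()))
    return hits
```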
If you accidentally commit secrets:

- Immediately rotate the compromised keys
- Remove from Git history:

```bash
# Using git-filter-repo (recommended)
git filter-repo --path .env --invert-paths

# Or using BFG Repo-Cleaner
bfg --delete-files .env
git reflog expire --expire=now --all
git gc --prune=now --aggressive
```

- Force push to remote:

```bash
git push origin --force --all
```

Never log sensitive data:
```python
# BAD
import logging
import os
logging.info(f"Using API key: {os.getenv('HF_API_KEY')}")

# GOOD
logging.info("API key loaded successfully")
```

```python
# BAD
raise Exception(f"Failed to connect with key: {api_key}")

# GOOD
raise Exception("Failed to connect to Hugging Face API")
```

Don't include secrets in Dockerfiles:

```dockerfile
# BAD
ENV HF_API_KEY=hf_xxxxxxxxxxxx

# GOOD
# Pass at runtime
ENV HF_API_KEY=""
```

Regular key rotation improves security. Here's how:
- Go to Hugging Face Settings → Access Tokens
- Create a new token with appropriate permissions
- Copy the new token
For local development:
```bash
# Update .env file
HF_API_KEY=hf_new_key_here
```

For production:

- Update secrets in your deployment platform
- Restart services to load new variables

```bash
# Verify the new key works
curl -X GET http://localhost:7860/health
```

- Go back to Hugging Face settings
- Delete the old token
- Verify old token no longer works
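The verification step can also be scripted as a post-rotation smoke test; the `/health` endpoint and port are taken from the curl example above:

```python
import urllib.request

def service_healthy(url="http://localhost:7860/health", timeout=5):
    """Return True if the service answers HTTP 200 after the key swap."""
    try:
        with urllib.request.urlopen(url, timeout=timeout) as resp:
            return resp.status == 200
    except OSError:
        return False
```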
- Development: Every 90 days
- Staging: Every 60 days
- Production: Every 30 days
- After team member departure: Immediately
- After suspected compromise: Immediately
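The schedule above can be encoded so tooling can flag overdue keys — a sketch, assuming key-creation dates are tracked elsewhere:

```python
from datetime import date, timedelta

# Rotation windows (in days) from the schedule above.
ROTATION_DAYS = {"development": 90, "staging": 60, "production": 30}

def rotation_due(created, environment, today=None):
    """True if a key created on `created` is past its rotation window."""
    today = today or date.today()
    return today - created > timedelta(days=ROTATION_DAYS[environment])
```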
| Variable | Required | Description | Example |
|---|---|---|---|
| `HF_API_KEY` | Yes | Hugging Face API key | `hf_xxxxxxxxxxxx` |
| `MODEL_EMBEDDING` | No | Embedding model name | `sentence-transformers/all-MiniLM-L6-v2` |
| `MODEL_LLM` | No | Language model name | `microsoft/phi-2` |
| `DB_DIR` | No | Vector database directory | `./data/vectors` |
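The table maps directly onto a small config loader; the defaults below mirror the Example column, and only `HF_API_KEY` is required (a sketch, not the API's actual loader):

```python
import os

# Defaults mirror the table above; None marks a required variable.
DEFAULTS = {
    "HF_API_KEY": None,
    "MODEL_EMBEDDING": "sentence-transformers/all-MiniLM-L6-v2",
    "MODEL_LLM": "microsoft/phi-2",
    "DB_DIR": "./data/vectors",
}

def load_config(env=os.environ):
    cfg = {name: env.get(name, default) for name, default in DEFAULTS.items()}
    if not cfg["HF_API_KEY"]:
        raise RuntimeError("HF_API_KEY is required")
    return cfg
```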
- Hugging Face Token Management
- OWASP Secrets Management Cheat Sheet
- 12-Factor App Config
- AWS Secrets Manager
- Google Cloud Secret Manager
- Azure Key Vault
If you discover a security vulnerability, please email [security@example.com] instead of using the issue tracker.
Remember: Treat your API keys like passwords. Never share them, commit them, or expose them publicly.