Initial vault-backup project

- R2 backup script (Raft snapshot + fallback)
- Path-based backup script
- Environment variable template
- README documentation

Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
kappa
2026-01-30 22:09:45 +09:00
commit 750f8ac241
5 changed files with 375 additions and 0 deletions

.env.example Normal file

@@ -0,0 +1,16 @@
# Vault Backup R2 Configuration
# Copy to vault-backup-r2.env and fill in values
# Vault Configuration
VAULT_ADDR=https://vault.anvil.it.com
VAULT_TOKEN=hvs.xxxxxxxxxxxxx
# Cloudflare R2 Configuration
R2_ACCOUNT_ID=your-cloudflare-account-id
R2_ACCESS_KEY=your-r2-access-key-id
R2_SECRET_KEY=your-r2-secret-access-key
R2_BUCKET=vault-backup
# Backup Settings
BACKUP_DIR=/tmp/vault-backups
RETENTION_DAYS=30

.gitignore vendored Normal file

@@ -0,0 +1,17 @@
# Environment
.env
*.env.local
# Backups
*.snap
*.snap.gz
*.json.gz
old-backups/
# Logs
logs/
*.log
# Temp
/tmp/
.DS_Store

README.md Normal file

@@ -0,0 +1,98 @@
# Vault Backup to Cloudflare R2
A tool that automatically backs up HashiCorp Vault secrets to Cloudflare R2
## Structure
```
vault-backup/
├── scripts/
│   ├── vault-backup-r2.sh    # full backup (Raft snapshot + fallback)
│   └── vault-backup-mcp.sh   # path-based backup
├── docs/
├── old-backups/              # legacy local backups
├── .env.example              # environment variable template
└── README.md
## Requirements
- `curl`, `jq`, `aws` CLI
- A Cloudflare R2 API token
- A Vault access token
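A quick way to confirm these are installed before the first run; a minimal sketch that mirrors the `check_dependencies` function in `vault-backup-r2.sh`:

```shell
#!/bin/bash
# Sketch: fail fast when a required CLI is missing.
check_deps() {
    local missing=0
    for cmd in "$@"; do
        if ! command -v "$cmd" >/dev/null 2>&1; then
            echo "missing: $cmd" >&2
            missing=1
        fi
    done
    return "$missing"
}

check_deps curl jq aws || echo "install the missing tools before running the backup scripts"
```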
## Setup
### 1. Configure environment variables
```bash
cp .env.example .env
vi .env
```
### 2. Create an R2 API token
1. [Cloudflare Dashboard](https://dash.cloudflare.com) → R2
2. **Manage R2 API Tokens** → **Create API Token**
3. Permissions: `Object Read & Write`
4. Bucket: `vault-backup`
### 3. Find your Account ID
Copy the **Account ID** from the bottom right of the Cloudflare Dashboard
## Usage
### Manual backup
```bash
source .env
./scripts/vault-backup-r2.sh
```
### Scheduled backup (cron)
```bash
# Daily backup at 3:00 AM
0 3 * * * cd ~/vault-backup && source .env && ./scripts/vault-backup-r2.sh >> logs/backup.log 2>&1
```
### Backup under a specific name
```bash
./scripts/vault-backup-r2.sh my-custom-backup-name
```
## Backup modes
| Script | Method | Purpose |
|--------|--------|---------|
| `vault-backup-r2.sh` | Raft snapshot first, falls back to a secrets export | Full backup |
| `vault-backup-mcp.sh` | Backs up only the listed paths | Selective backup |
## Environment variables
| Variable | Description | Default |
|----------|-------------|---------|
| `VAULT_ADDR` | Vault server address | - |
| `VAULT_TOKEN` | Vault auth token | - |
| `R2_ACCOUNT_ID` | Cloudflare account ID | - |
| `R2_ACCESS_KEY` | R2 access key ID | - |
| `R2_SECRET_KEY` | R2 secret access key | - |
| `R2_BUCKET` | R2 bucket name | `vault-backup` |
| `RETENTION_DAYS` | Backup retention period (days) | `30` |
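The required/default split in this table is enforced with plain bash parameter expansion; a minimal self-contained sketch (variable names as above):

```shell
#!/bin/bash
# Sketch of the required/default split the table describes:
# ${VAR:?message} aborts when a required variable is unset or empty,
# ${VAR:-default} substitutes a default for an optional one.
set -euo pipefail

R2_BUCKET="${R2_BUCKET:-vault-backup}"     # optional, has a default
RETENTION_DAYS="${RETENTION_DAYS:-30}"     # optional, has a default
echo "bucket=${R2_BUCKET} retention=${RETENTION_DAYS}"

# A required variable fails fast when missing (checked in a subshell here
# so the sketch itself keeps running either way):
if (: "${VAULT_TOKEN:?VAULT_TOKEN is required}") 2>/dev/null; then
    echo "VAULT_TOKEN is set"
else
    echo "VAULT_TOKEN is missing"
fi
```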
## Restore
```bash
# Download from R2
aws s3 cp s3://vault-backup/vault-snapshot-YYYYMMDD-HHMMSS.snap ./restore.snap \
  --endpoint-url https://<ACCOUNT_ID>.r2.cloudflarestorage.com
# Restore into Vault (Raft)
vault operator raft snapshot restore restore.snap
```
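The fallback export (used when Raft snapshots are unavailable) is a gzipped JSON file rather than a `.snap`. A hedged sketch for turning the path-based format written by `vault-backup-mcp.sh` back into `vault kv put` commands — it assumes flat string values under the `secret/` KV v2 mount, only prints the commands for review, and uses an inline sample in place of a real gunzipped backup:

```shell
#!/bin/bash
# Sketch: convert the JSON fallback backup format into `vault kv put`
# commands. Prints them; does not execute anything against Vault.
set -euo pipefail

# Stand-in for a real backup after `gunzip vault-backup-*.json.gz`:
cat > /tmp/sample-backup.json <<'EOF'
{"backup_time": "2024-01-01T03:00:00+09:00",
 "secrets": [{"path": "app/config", "data": {"user": "u", "pass": "p"}}]}
EOF

# One `vault kv put` line per backed-up path, key=value pairs from .data
jq -r '
  .secrets[]
  | "vault kv put secret/\(.path) "
    + (.data | to_entries | map("\(.key)=\(.value)") | join(" "))
' /tmp/sample-backup.json
```

Values containing spaces or nested objects would need quoting or `@payload.json` syntax instead; treat this as a starting point, not a complete restore tool.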
## License
MIT

scripts/vault-backup-mcp.sh Executable file

@@ -0,0 +1,69 @@
#!/bin/bash
# Simple Vault backup using known paths
# For use when you know the secret paths to back up
set -euo pipefail
# R2 Configuration
R2_ACCOUNT_ID="${R2_ACCOUNT_ID:?R2_ACCOUNT_ID is required}"
R2_ACCESS_KEY="${R2_ACCESS_KEY:?R2_ACCESS_KEY is required}"
R2_SECRET_KEY="${R2_SECRET_KEY:?R2_SECRET_KEY is required}"
R2_BUCKET="${R2_BUCKET:-vault-backup}"
R2_ENDPOINT="https://${R2_ACCOUNT_ID}.r2.cloudflarestorage.com"
VAULT_ADDR="${VAULT_ADDR:-https://vault.anvil.it.com}"
VAULT_TOKEN="${VAULT_TOKEN:?VAULT_TOKEN is required}"
TIMESTAMP=$(date +%Y%m%d-%H%M%S)
BACKUP_FILE="/tmp/vault-backup-${TIMESTAMP}.json"
echo "[INFO] Starting Vault backup at $(date)"
# Known secret paths to backup (add your paths here)
PATHS=(
    "app/config"
    "app/database"
    "shared/api-keys"
    # Add more paths as needed
)
echo '{"backup_time": "'$(date -Iseconds)'", "secrets": [' > "$BACKUP_FILE"
first=true
for path in "${PATHS[@]}"; do
    echo "[INFO] Backing up: $path"
    secret=$(curl -s -H "X-Vault-Token: ${VAULT_TOKEN}" \
        "${VAULT_ADDR}/v1/secret/data/${path}" 2>/dev/null | \
        jq '.data.data // empty' 2>/dev/null || echo "")
    if [[ -n "$secret" && "$secret" != "null" ]]; then
        if [[ "$first" == "true" ]]; then
            first=false
        else
            echo "," >> "$BACKUP_FILE"
        fi
        echo "{\"path\": \"${path}\", \"data\": ${secret}}" >> "$BACKUP_FILE"
    else
        echo "[WARN] Path not found or empty: $path"
    fi
done
echo ']}' >> "$BACKUP_FILE"
# Compress
gzip "$BACKUP_FILE"
BACKUP_FILE="${BACKUP_FILE}.gz"
# Upload to R2
echo "[INFO] Uploading to R2..."
export AWS_ACCESS_KEY_ID="$R2_ACCESS_KEY"
export AWS_SECRET_ACCESS_KEY="$R2_SECRET_KEY"
aws s3 cp "$BACKUP_FILE" "s3://${R2_BUCKET}/$(basename "$BACKUP_FILE")" \
    --endpoint-url "$R2_ENDPOINT"
echo "[INFO] Backup complete: s3://${R2_BUCKET}/$(basename "$BACKUP_FILE")"
# Cleanup
rm -f "$BACKUP_FILE"

scripts/vault-backup-r2.sh Executable file

@@ -0,0 +1,175 @@
#!/bin/bash
# Vault Snapshot Backup to Cloudflare R2
# Usage: ./vault-backup-r2.sh [snapshot-name]
set -euo pipefail
# ============================================
# Configuration
# ============================================
VAULT_ADDR="${VAULT_ADDR:-https://vault.anvil.it.com}"
VAULT_TOKEN="${VAULT_TOKEN:?VAULT_TOKEN is required}"
# R2 Configuration
R2_ACCOUNT_ID="${R2_ACCOUNT_ID:?R2_ACCOUNT_ID is required}"
R2_ACCESS_KEY="${R2_ACCESS_KEY:?R2_ACCESS_KEY is required}"
R2_SECRET_KEY="${R2_SECRET_KEY:?R2_SECRET_KEY is required}"
R2_BUCKET="${R2_BUCKET:-vault-backup}"
R2_ENDPOINT="https://${R2_ACCOUNT_ID}.r2.cloudflarestorage.com"
# Backup settings
BACKUP_DIR="${BACKUP_DIR:-/tmp/vault-backups}"
RETENTION_DAYS="${RETENTION_DAYS:-30}"
TIMESTAMP=$(date +%Y%m%d-%H%M%S)
SNAPSHOT_NAME="${1:-vault-snapshot-${TIMESTAMP}}"
# ============================================
# Functions
# ============================================
log() {
    # Log to stderr so command substitutions (e.g. create_snapshot) capture only real output
    echo "[$(date '+%Y-%m-%d %H:%M:%S')] $*" >&2
}
error() {
    log "ERROR: $*"
    exit 1
}
check_dependencies() {
    for cmd in curl aws jq; do
        if ! command -v "$cmd" &>/dev/null; then
            error "$cmd is required but not installed"
        fi
    done
}
create_snapshot() {
    log "Creating Vault snapshot..."
    mkdir -p "$BACKUP_DIR"
    local snapshot_file="${BACKUP_DIR}/${SNAPSHOT_NAME}.snap"
    # Vault Raft snapshot (requires Raft storage backend; taking a snapshot is a GET)
    curl -s \
        -H "X-Vault-Token: ${VAULT_TOKEN}" \
        "${VAULT_ADDR}/v1/sys/storage/raft/snapshot" \
        -o "$snapshot_file" \
        --fail || {
        # Fallback: export secrets as JSON (for non-Raft backends)
        log "Raft snapshot failed, trying secrets export..."
        export_secrets "$snapshot_file"
    }
    echo "$snapshot_file"
}
export_secrets() {
    local output_file="$1"
    local secrets_file="${output_file%.snap}.json"
    log "Exporting secrets to JSON..."
    # Get list of secret paths (requires list permission)
    local paths=$(curl -s -X LIST \
        -H "X-Vault-Token: ${VAULT_TOKEN}" \
        "${VAULT_ADDR}/v1/secret/metadata" | jq -r '.data.keys[]?' 2>/dev/null || echo "")
    if [[ -z "$paths" ]]; then
        log "No secrets found or no list permission"
        echo '{"secrets": [], "timestamp": "'$(date -Iseconds)'", "note": "empty or no permission"}' > "$secrets_file"
    else
        echo '{"secrets": [' > "$secrets_file"
        local first=true
        for path in $paths; do
            local secret=$(curl -s \
                -H "X-Vault-Token: ${VAULT_TOKEN}" \
                "${VAULT_ADDR}/v1/secret/data/${path}" | jq '.data' 2>/dev/null)
            if [[ -n "$secret" && "$secret" != "null" ]]; then
                if [[ "$first" == "true" ]]; then
                    first=false
                else
                    echo "," >> "$secrets_file"
                fi
                echo "{\"path\": \"${path}\", \"data\": ${secret}}" >> "$secrets_file"
            fi
        done
        echo '], "timestamp": "'$(date -Iseconds)'"}' >> "$secrets_file"
    fi
    # Compress and encrypt (optional)
    gzip -c "$secrets_file" > "$output_file"
    rm -f "$secrets_file"
}
upload_to_r2() {
    local file="$1"
    local filename=$(basename "$file")
    log "Uploading to R2: ${R2_BUCKET}/${filename}"
    # Configure AWS CLI for R2
    export AWS_ACCESS_KEY_ID="$R2_ACCESS_KEY"
    export AWS_SECRET_ACCESS_KEY="$R2_SECRET_KEY"
    aws s3 cp "$file" "s3://${R2_BUCKET}/${filename}" \
        --endpoint-url "$R2_ENDPOINT" \
        --quiet
    log "Upload complete: s3://${R2_BUCKET}/${filename}"
}
cleanup_old_backups() {
    log "Cleaning up backups older than ${RETENTION_DAYS} days..."
    export AWS_ACCESS_KEY_ID="$R2_ACCESS_KEY"
    export AWS_SECRET_ACCESS_KEY="$R2_SECRET_KEY"
    # BSD date (macOS) first, then GNU date
    local cutoff_date=$(date -v-${RETENTION_DAYS}d +%Y-%m-%d 2>/dev/null || date -d "${RETENTION_DAYS} days ago" +%Y-%m-%d)
    aws s3 ls "s3://${R2_BUCKET}/" --endpoint-url "$R2_ENDPOINT" 2>/dev/null | \
        while read -r line; do
            local file_date=$(echo "$line" | awk '{print $1}')
            local file_name=$(echo "$line" | awk '{print $4}')
            if [[ "$file_date" < "$cutoff_date" && -n "$file_name" ]]; then
                log "Deleting old backup: $file_name"
                aws s3 rm "s3://${R2_BUCKET}/${file_name}" --endpoint-url "$R2_ENDPOINT" --quiet
            fi
        done
}
cleanup_local() {
    log "Cleaning up local backup files..."
    find "$BACKUP_DIR" -name "vault-snapshot-*.snap" -mtime +1 -delete 2>/dev/null || true
    find "$BACKUP_DIR" -name "vault-snapshot-*.gz" -mtime +1 -delete 2>/dev/null || true
}
# ============================================
# Main
# ============================================
main() {
    log "=== Vault Backup to R2 Started ==="
    log "Vault: $VAULT_ADDR"
    log "R2 Bucket: $R2_BUCKET"
    check_dependencies
    local snapshot_file
    snapshot_file=$(create_snapshot)
    if [[ -f "$snapshot_file" ]]; then
        upload_to_r2 "$snapshot_file"
        cleanup_old_backups
        cleanup_local
        log "=== Backup Complete ==="
    else
        error "Snapshot file not created"
    fi
}
main "$@"