round two final dataset + one-shot
Current production dataset archive: final-training-dataset-for-round-two.tar.gz
- Final trainable pairs: 465 images + 465 captions
- Source gallery pool observed: 577 (dedupe + caption filtering reduced it to 465)
- Run name: anky_flux_lora_v2, rank/alpha 32/16, steps 6500
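Before kicking off a run it is worth sanity-checking that every image in the extracted archive has a matching caption and that the count clears the `ANKY_DATASET_MIN_IMAGES` floor. A minimal sketch (the directory layout and sidecar-`.txt` caption convention are assumptions; adjust to the actual archive):

```python
# Sketch: sanity-check an extracted training dataset before launching a run.
# Assumes one sidecar .txt caption per image file -- adjust to the real layout.
from pathlib import Path

def check_dataset(root: str, min_images: int = 460) -> int:
    """Count images, verify each has a caption .txt, enforce a minimum count."""
    base = Path(root)
    images = sorted(
        p for p in base.rglob("*")
        if p.suffix.lower() in {".png", ".jpg", ".jpeg", ".webp"}
    )
    missing = [p for p in images if not p.with_suffix(".txt").exists()]
    if missing:
        raise ValueError(f"{len(missing)} images are missing caption .txt files")
    if len(images) < min_images:
        raise ValueError(f"only {len(images)} images, expected at least {min_images}")
    return len(images)
```

For the round-two archive this should return 465 when pointed at the extracted dataset directory.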
runpod one-shot setup
Run on a fresh RunPod terminal (the script handles installing a Blackwell-safe torch build):
HF_TOKEN='hf_xxx' \
ANKY_TOKEN='xxx_optional' \
ANKY_DATASET_URL='https://huggingface.co/jpfraneto/anky-flux-lora-v2/resolve/main/training-data/final-training-dataset-for-round-two.tar.gz' \
ANKY_DATASET_MIN_IMAGES=460 \
TRAIN_NAME='anky_flux_lora_v2' \
LORA_RANK=32 \
LORA_ALPHA=16 \
TRAIN_STEPS=6500 \
SAVE_EVERY=500 \
SAMPLE_EVERY=500 \
bash <(curl -fsSL https://anky.app/static/train_anky_setup.sh)
If you omit the tokens, the script prompts for them interactively.
post-training one-shot upload (weights + samples)
source /workspace/venv/bin/activate && \
HF_TOKEN='hf_xxx' REPO_ID='jpfraneto/anky-flux-lora-v2' RUN_DIR='/workspace/output/anky_flux_lora_v2' \
bash -lc 'set -euo pipefail
STAGE=/tmp/hf_upload_anky_flux_lora_v2; export STAGE
rm -rf "$STAGE"; mkdir -p "$STAGE/weights" "$STAGE/samples"
cp -av "$RUN_DIR"/*.safetensors "$STAGE/weights/"
cp -av "$RUN_DIR/samples/." "$STAGE/samples/" 2>/dev/null || true
python -c "import os; from huggingface_hub import HfApi; HfApi(token=os.environ[\"HF_TOKEN\"]).upload_folder(folder_path=os.environ[\"STAGE\"], repo_id=os.environ[\"REPO_ID\"], repo_type=\"model\", path_in_repo=\"training-runs/anky_flux_lora_v2\", commit_message=\"Upload round-two final weights and samples\")"'
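The one-shot above does two things: stage weights and samples into a clean folder, then push that folder in a single commit with `HfApi.upload_folder`. A more readable Python sketch of the same steps (paths mirror the command; the upload call needs a valid write token and network access):

```python
# Sketch of the one-shot upload: stage run artifacts, then push the staged
# folder to the Hub. Paths mirror the shell command above.
import shutil
from pathlib import Path

def stage_run(run_dir: str, stage_dir: str) -> str:
    """Copy *.safetensors into stage/weights and samples/ into stage/samples."""
    run, stage = Path(run_dir), Path(stage_dir)
    shutil.rmtree(stage, ignore_errors=True)
    (stage / "weights").mkdir(parents=True)
    (stage / "samples").mkdir(parents=True)
    for f in run.glob("*.safetensors"):
        shutil.copy2(f, stage / "weights" / f.name)
    samples = run / "samples"
    if samples.is_dir():
        shutil.copytree(samples, stage / "samples", dirs_exist_ok=True)
    return str(stage)

def upload_stage(stage: str, repo_id: str, token: str) -> None:
    # Network call: needs a Hugging Face write token.
    from huggingface_hub import HfApi
    HfApi(token=token).upload_folder(
        folder_path=stage,
        repo_id=repo_id,
        repo_type="model",
        path_in_repo="training-runs/anky_flux_lora_v2",
        commit_message="Upload round-two final weights and samples",
    )
```

Staging into a throwaway folder first keeps the commit limited to exactly the weights and samples, rather than whatever else accumulated in the run directory.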
uploaded artifacts (run 002)
- training-runs/anky_flux_lora_v2/weights/anky_flux_lora_v2.safetensors
- training-runs/anky_flux_lora_v2/samples/...
- training-runs/anky_flux_lora_v2/meta/...
Canonical repo: jpfraneto/anky-flux-lora-v2
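For pulling the final weights back down with curl/wget, the hf.co `resolve` URL pattern (the same one the dataset URL above uses) can be built directly. A small sketch:

```python
# Sketch: build the hf.co resolve URL for an artifact in the canonical repo,
# following the same pattern as the dataset URL used in the setup one-shot.
def hf_resolve_url(repo_id: str, path: str, revision: str = "main") -> str:
    return f"https://huggingface.co/{repo_id}/resolve/{revision}/{path}"

weights_url = hf_resolve_url(
    "jpfraneto/anky-flux-lora-v2",
    "training-runs/anky_flux_lora_v2/weights/anky_flux_lora_v2.safetensors",
)
```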