DL Transformer Finetune
Build transformer fine-tuning run plans with task settings, hyperparameters, and model-card outputs. Use for repeatable Hugging Face or PyTorch fine-tuning workflows.
v0.1.0
Description
name: dl-transformer-finetune
description: Build transformer fine-tuning run plans with task settings, hyperparameters, and model-card outputs. Use for repeatable Hugging Face or PyTorch fine-tuning workflows.
DL Transformer Finetune
Overview
Generate reproducible fine-tuning run plans for transformer models and downstream tasks.
Workflow
- Define base model, task type, and dataset.
- Set training hyperparameters and evaluation cadence.
- Produce run plan plus model card skeleton.
- Export configuration-ready artifacts for training pipelines.
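The workflow above can be sketched as a small plan builder. All field names and default values here are illustrative assumptions, not the skill's actual schema:

```python
# Illustrative sketch of a fine-tuning run plan; field names and
# defaults are assumptions, not the output of scripts/build_finetune_plan.py.
import json


def build_run_plan(base_model: str, task: str, dataset: str,
                   seed: int = 42, output_dir: str = "runs/exp-001") -> dict:
    """Assemble a reproducible run-plan dictionary for a fine-tuning job."""
    return {
        "base_model": base_model,
        "task": task,
        "dataset": dataset,
        "seed": seed,                # explicit seed for reproducibility
        "output_dir": output_dir,    # explicit output location
        "hyperparameters": {
            "learning_rate": 2e-5,
            "batch_size": 16,
            "num_epochs": 3,
            "warmup_ratio": 0.1,
        },
        "evaluation": {"strategy": "epoch", "metric": "accuracy"},
    }


plan = build_run_plan("bert-base-uncased", "text-classification", "imdb")
print(json.dumps(plan, indent=2))
```

Serializing the plan to JSON makes it a configuration-ready artifact that a training pipeline can consume directly.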
Use Bundled Resources
- Run `scripts/build_finetune_plan.py` for deterministic plan output.
- Read `references/finetune-guide.md` for hyperparameter baseline guidance.
Guardrails
- Keep run plans reproducible with explicit seeds and output directories.
- Include evaluation and rollback criteria.
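Seeding every random source from the plan's explicit seed is what makes a run repeatable. A minimal stdlib-only sketch (a real pipeline would also call `numpy.random.seed` and `torch.manual_seed`):

```python
import os
import random


def set_seed(seed: int) -> None:
    """Seed Python's RNG; record the hash seed for child processes.
    Extend with numpy.random.seed and torch.manual_seed in a real run."""
    random.seed(seed)
    os.environ["PYTHONHASHSEED"] = str(seed)


set_seed(42)
first_draw = random.random()
set_seed(42)
second_draw = random.random()
assert first_draw == second_draw  # same seed, same draw
```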