
distributed-llm-pretraining-torchtitan

Facilitates distributed LLM pretraining in PyTorch using TorchTitan, scaling training across multiple GPUs with composable parallelism techniques.
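For context on what the skill automates: under the standard TorchTitan repository layout, a multi-GPU pretraining run is typically launched through the bundled `run_train.sh` wrapper around `torchrun`. The sketch below is a launch-configuration fragment, not part of this listing; the config path, the `NGPU` variable, and the override flag are illustrative assumptions based on TorchTitan's example configs and may differ in your checkout.

```shell
# Hypothetical launch sketch: assumes a torchtitan checkout and 8 local GPUs.
# CONFIG_FILE points run_train.sh at a TOML training config; the path follows
# the repository's llama3 example layout (an assumption, verify locally).
CONFIG_FILE="./torchtitan/models/llama3/train_configs/llama3_8b.toml" \
NGPU=8 \
./run_train.sh --training.steps 1000
```

Parallelism degrees (data, tensor, pipeline) are set in the TOML config rather than on the command line, so the same launch line serves single-node and multi-node runs.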

Security score: 96/100

The distributed-llm-pretraining-torchtitan skill was audited on Feb 28, 2026; the audit found 4 security issues, all in a single threat category (external URL references). Review the findings below before installing.

Security Issues

Low (SKILL.md, line 30): External URL reference
    # Get HF token from https://huggingface.co/settings/tokens

Low (SKILL.md, line 355): External URL reference
    - Paper: https://arxiv.org/abs/2410.06511

Low (SKILL.md, line 356): External URL reference
    - ICLR 2025: https://iclr.cc/virtual/2025/poster/29620

Low (SKILL.md, line 357): External URL reference
    - PyTorch Forum: https://discuss.pytorch.org/c/distributed/torchtitan/44