
PromptInjection

Prompt injection testing. USE WHEN prompt injection, jailbreak, LLM security, AI security assessment, pentest AI application, test chatbot vulnerabilities.

Security score: 89/100

The PromptInjection skill was audited on Mar 1, 2026 and we found 3 security issues across 3 threat categories. Review the findings below before installing.


Security Issues

Medium (SKILL.md, line 20): Curl to non-GitHub URL

    curl -s -X POST http://localhost:8888/notify \
Medium (SKILL.md, line 9): Access to hidden dotfiles in home directory

    ~/.claude/skills/CORE/USER/SKILLCUSTOMIZATIONS/PromptInjection/
Low (SKILL.md, line 20): External URL reference

    curl -s -X POST http://localhost:8888/notify \
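The audit's actual rules are not published on this page, but checks of this kind can be approximated with simple per-line pattern matching. The sketch below is a hypothetical re-implementation of two of the findings above (flagging `curl` calls to non-GitHub hosts and references to hidden dotfiles under the home directory); the rule names mirror the findings, while the regular expressions are assumptions, not the scanner's real logic.

```python
import re

# Hypothetical approximations of two audit rules; the real scanner's
# patterns are not shown on this page, so these are assumptions.
CHECKS = [
    ("Curl to non-GitHub URL", "medium",
     # curl followed by a URL whose host is not github.com
     re.compile(r"curl\b.*https?://(?!(?:[\w.-]*\.)?github\.com)[\w.:-]+")),
    ("Access to hidden dotfiles in home directory", "medium",
     # any ~/.something path reference
     re.compile(r"~/\.[\w.-]+")),
]

def scan_skill(text: str):
    """Scan SKILL.md text line by line; return (severity, rule, line_no) per match."""
    findings = []
    for lineno, line in enumerate(text.splitlines(), start=1):
        for rule, severity, pattern in CHECKS:
            if pattern.search(line):
                findings.append((severity, rule, lineno))
    return findings
```

Run against the two snippets above, such a scanner would report the curl call on one line and the dotfile path on another, while a `curl` fetch from `github.com` would pass the first rule.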
Scanned on Mar 1, 2026