⌘K
Change language Switch ThemeSign In
Narrow Mode
Hot on GitHub: AI/ML Penetration Testing and Security Learning Roadmap ======================================================================
Hot on GitHub: AI/ML Penetration Testing and Security Learning Roadmap ======================================================================  ### GitHubDaily
@GitHub_Daily
这两年,针对 AI 的安全测试成为热门行业,越来越多开发者开始学习 AI/ML 安全和渗透测试。
然而网上资源非常分散,想系统学习从哪开始都不知道,更别说找到靠谱的实战项目了。
偶然在 GitHub 上看到一份 AI/ML 渗透测试学习路线图,系统梳理了从零基础到实战的完整指南。
内容涵盖提示词注入、越狱技巧、数据外泄以及对抗性机器学习等核心攻击手法。
还贴心地整理了各类靶场、CTF 比赛和真实漏洞库,帮我们快速将理论转化为动手能力。
GitHub:github.com/anmolksachan/A…
整个学习路径按经验等级划分为新手、进阶和高级三个阶段,按部就班就能完成进阶。
除此之外,里面收录的教程、工具和文档基本都是开源免费的,直接点击就能阅读或使用。
适合想要涉足 AI 安全领域,或者想提升大模型应用健壮性的朋友,建议先收藏备用。Show More
Mar 13, 2026, 12:00 AM View on X
3 Replies
11 Retweets
57 Likes
4,864 Views  GitHubDaily @GitHub_Daily
One Sentence Summary
An open-source learning roadmap and resource compilation systematically covering AI/ML security, penetration testing, and prompt injection attacks.
Summary
This tweet shares an open-source GitHub project named 'AI-ML-Free-Resources-for-Security-and-Prompt-Injection'. This repository provides developers with a comprehensive AI security learning guide, from foundational concepts to hands-on practice, covering key attack techniques such as prompt injection, jailbreaking methods, data exfiltration, and adversarial machine learning. Additionally, it compiles various lab environments, CTF competitions, and real-world vulnerability databases, and structures the learning path into three experience levels: Novice, Intermediate, and Advanced, aiming to help developers systematically enhance the robustness of large AI model applications.
AI Score
82
Influence Score 25
Published At Today
Language
Chinese
Tags
AI Security
Penetration Testing
Prompt Injection
GitHub
Learning Roadmap HomeArticlesPodcastsVideosTweets
Hot on GitHub: AI/ML Penetration Testing and Security Lea... ===============