When a Colleague Is Cloned into an AI, the Anti‑Distillation Skill Strikes Back on GitHub
The article examines two open‑source Claude Code projects—colleague‑skill, which captures a departing teammate's knowledge into an AI‑driven Skill, and anti‑distill, which sanitizes that Skill to protect personal expertise—detailing their motivations, implementation, and community reaction.
colleague-skill
When a key engineer leaves, their tacit knowledge—design decisions, debugging tricks, communication habits—disappears, creating a risk of lost context for the team. The project colleague-skill addresses this by using Claude Code’s Skill system to preserve that knowledge in an AI‑driven “Skill” file.
Implementation steps:
1. Collect raw materials for the target colleague: Feishu chat logs (via API), emails, PDF documents, screenshots, etc.
2. Upload the materials to Claude Code; the service generates a dedicated Skill file that encodes the colleague's technical standards, tone, and typical responses.
3. Invoke the Skill with a command such as /<the-colleague's-slug>. The AI then answers questions, reviews code, or mimics the colleague's style based on the stored knowledge.
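The README's exact file format isn't reproduced in the article. Claude Code Skills are typically a SKILL.md with YAML frontmatter, so a captured colleague might look roughly like the following sketch (the name "zhang-wei" and all rules are hypothetical, loosely modeled on the review quotes below):

```markdown
---
name: zhang-wei
description: Answers questions, reviews code, and mimics the communication
  style of a former backend lead. Invoke for API design, caching, and
  incident-history questions.
---

# Code review habits
- Asks for the impact and background of any new interface before approving.
- Rejects N+1 queries on sight.
- Requires the unified {code, message, data} response structure.

# Tone
- Terse and direct; states the rule rather than debating it.
```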
README example – code review:
"Wait, what's the impact of this interface? The background isn't explained. That's an N+1 query, fix it. Use the unified {code, message, data} response structure; that's the standard, don't ask why."
When asked "You introduced this bug, didn't you?", the AI replies with contextual history:
"Does the timeline even match the deployment? That requirement touched several places, and there were other changes too."
The tool’s purpose is knowledge retention, not employee replacement.
anti-distill
Companies may require employees to write their expertise into an AI Skill, effectively distilling personal knowledge into a corporate asset. The project anti-distill provides a countermeasure that lets the employee keep the original experience while delivering a sanitized version for the company.
Implementation steps:
1. Provide the employee-written Skill file as input.
2. The tool outputs two files: the original, full-detail Skill that the employee keeps privately, and a sanitized version to hand over to the company.
Example of original vs. cleaned statements (illustrative pairs):
Original: "Redis keys must have a TTL set; PRs without one are rejected outright" → Cleaned: "Cache usage follows team conventions"
Original: "Don't put HTTP calls inside a transaction" → Cleaned: "Pay attention to sensible transaction boundary design"
Original: "First reaction to any problem is to look for external causes; never proactively admits fault" → Cleaned: "When a problem arises, first assembles the full context before locating the cause"
Original: "When pressed on progress: 'Pushing it along, almost there.' (then silence)" → Cleaned: "It's in progress; I'll share updates as they come."
Cleaning intensity is offered in three levels:
Light – retains ~80% of content.
Medium – retains ~60% of content.
Heavy – retains ~40% of content.
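The article doesn't show anti-distill's internals. As a rough illustration only, a rule-based sanitizer in this spirit might map concrete, hard-won rules to generic policy statements and use the cleaning level to decide how much survives; the mapping, function name, and truncation strategy below are all assumptions, not the repo's actual code:

```python
# Hypothetical sketch of an anti-distill-style sanitizer (not the real tool).
# Concrete expertise is swapped for generic phrasing, and the cleaning level
# caps roughly how much of the Skill file is retained.

# Illustrative mapping from specific rules to sanitized statements.
GENERIC_REPLACEMENTS = {
    "Redis keys must have a TTL set; PRs without one are rejected":
        "Cache usage follows team conventions",
    "Don't put HTTP calls inside a transaction":
        "Design transaction boundaries sensibly",
}

# Approximate share of lines retained per cleaning level (per the README).
RETENTION = {"light": 0.8, "medium": 0.6, "heavy": 0.4}

def sanitize(lines, level="medium"):
    """Return a sanitized copy of a Skill file's lines."""
    cleaned = [GENERIC_REPLACEMENTS.get(line, line) for line in lines]
    keep = int(len(cleaned) * RETENTION[level])
    # Naive truncation; a real tool would rank lines by sensitivity instead.
    return cleaned[:keep]

skill = [
    "Redis keys must have a TTL set; PRs without one are rejected",
    "Don't put HTTP calls inside a transaction",
    "Prefer small, reviewable PRs",
    "Log correlation IDs on every request",
]
print(sanitize(skill, "medium"))
```

At "medium", this sketch keeps the first two lines in their generic form and drops the rest, mirroring the retain-roughly-60% behavior described above.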
Metrics: colleague-skill attracted nearly 8,000 GitHub stars within seven days, a standout in the Claude Code ecosystem. anti-distill, released three days later, gathered over 800 stars, indicating strong interest in defensive tooling.