Reproducibility Index · 8fcbda9b

reddit:1tbhvlq

Created a free tool to check what PII your LLM prompts are leaking before they hit the provider

Independent reproduction of this project. Validated on 2026-05-13 via a lab build; it passed on the first attempt with no repair rounds.

Quality score
75%
Tests passed
36/36
Repair rounds
0
Status
reproduced

Checks

Syntax

syntax_parse blocking All 15 .py files parse cleanly
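A check like this can be sketched with the standard library's `ast` module — a minimal assumption about how the validator works, not its actual code: walk the repo, try to parse each `.py` file, and collect the ones that fail.

```python
# Hypothetical sketch of a syntax_parse-style check: compile every
# .py file under a repo root and report the ones that do not parse.
import ast
from pathlib import Path


def syntax_parse(repo_root: str) -> list[str]:
    """Return the paths of .py files under repo_root that fail to parse."""
    failures = []
    for path in Path(repo_root).rglob("*.py"):
        try:
            # ast.parse raises SyntaxError on invalid source.
            ast.parse(path.read_text(encoding="utf-8"), filename=str(path))
        except SyntaxError:
            failures.append(str(path))
    return failures
```

An empty `failures` list corresponds to the "all files parse cleanly" result above.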

Imports

import_contextual_pii_guard blocking import contextual_pii_guard -- OK
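An import smoke test of this kind is typically run in a fresh subprocess so a crashing import cannot take down the checker itself. The function name below is illustrative, not the validator's API:

```python
# Hypothetical sketch of an import check: attempt `import <module>`
# in a clean interpreter and report success via the exit code.
import subprocess
import sys


def import_check(module: str) -> bool:
    """True if `import module` succeeds in a fresh Python interpreter."""
    proc = subprocess.run(
        [sys.executable, "-c", f"import {module}"],
        capture_output=True,  # keep tracebacks out of the report stream
    )
    return proc.returncode == 0
```

Here `import_check("contextual_pii_guard")` returning `True` would match the "OK" result above.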

Dependencies

dep_metadata major Found pyproject.toml

Tests

pytest_run major pytest: 36 passed, 0 failed, 0 errors (exit 0)

Packaging

pip_installable major pip install -e . --dry-run succeeded

Git State

worktrees_merged major No .worktrees directory
on_main_branch minor Current branch: master

Cleanup

no_venv_in_tree major .venv/ present (477 MB) -- should be in .gitignore
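The cleanup check above can be approximated as follows — an assumed implementation that looks for a committed `.venv/` directory and totals its size so it can be reported (and gitignored):

```python
# Hypothetical sketch of a no_venv_in_tree check: detect a .venv/
# directory inside the repo and measure how much space it occupies.
from pathlib import Path


def venv_in_tree(repo_root: str) -> tuple[bool, int]:
    """Return (present, total_bytes) for a .venv/ directory in the repo."""
    venv = Path(repo_root) / ".venv"
    if not venv.is_dir():
        return False, 0
    total = sum(p.stat().st_size for p in venv.rglob("*") if p.is_file())
    return True, total
```

A `(True, ~477 MB)` result would trigger the "should be in .gitignore" finding shown above.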

Whole-Repo Review

claude_review info Claude review: 7/10 — The codebase implements a functional, context-aware PII detection and anonymization pipeline using Presidio and spaCy, …

Citation & embed

Badge

Embeddable SVG, no auth required.

Research Radar reproduction badge
<img src="https://api.research-radar.com/v1/validations/8fcbda9b/badge.svg" alt="Research Radar reproduction badge">

BibTeX

Drop into your .bib file.

Download .bib →