PII protection for every AI provider
Blindfold is a provider-agnostic PII SDK: tokenize sensitive data before sending it to any LLM, get the AI response, then detokenize to restore the original values. Works with Python, JavaScript, and Java.
How it works
1. Tokenize
Replace PII with safe tokens before sending to your AI provider
2. Process with AI
Send tokenized text to OpenAI, Anthropic, Gemini, or any LLM
3. Detokenize
Restore original PII in the AI response using the token mapping
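The three steps above can be sketched in plain Python. This toy uses a regex stand-in for Blindfold's detection, just to make the token mapping concrete; the real SDK detects many PII types for you.

```python
import re

# Toy tokenizer: replaces email addresses with placeholder tokens and
# records a token -> original-value mapping (the SDK does this for you).
def tokenize(text):
    mapping = {}

    def repl(match):
        token = f"<Email Address_{len(mapping) + 1}>"
        mapping[token] = match.group(0)
        return token

    safe = re.sub(r"[\w.+-]+@[\w-]+\.[\w.]+", repl, text)
    return safe, mapping

# Reverse step: swap tokens in the AI's reply back to the original values.
def detokenize(text, mapping):
    for token, original in mapping.items():
        text = text.replace(token, original)
    return text

safe, mapping = tokenize("Contact john@acme.com for access.")
# safe == "Contact <Email Address_1> for access."
reply = f"I will email {list(mapping)[0]} today."  # pretend AI response
print(detokenize(reply, mapping))  # -> "I will email john@acme.com today."
```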
Integrations
Install the SDK, add two lines of code, and PII never reaches your AI provider.
OpenAI
LLM Provider
Protect PII in prompts before sending to GPT-4o, GPT-4, and other OpenAI models. Tokenize on the way in, detokenize on the way out.
pip install blindfold-sdk openai
npm install @blindfold/sdk openai
dev.blindfold:blindfold-sdk
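A minimal sketch of the pattern with the OpenAI Python SDK. The Blindfold calls follow the tokenize/detokenize API shown in the quick-start example on this page; verify names against the SDK reference before use.

```python
# Sketch: tokenize the prompt, call OpenAI, detokenize the reply.
# Assumes blindfold-sdk exposes Blindfold().tokenize()/.detokenize()
# as shown in this page's quick-start example.
def protected_chat(prompt: str, model: str = "gpt-4o") -> str:
    from blindfold import Blindfold  # requires blindfold-sdk
    from openai import OpenAI        # requires openai

    bf = Blindfold()
    safe = bf.tokenize(prompt)  # PII replaced with placeholder tokens
    client = OpenAI()           # reads OPENAI_API_KEY from the environment
    resp = client.chat.completions.create(
        model=model,
        messages=[{"role": "user", "content": safe.text}],
    )
    # Swap tokens in the model's reply back to the original values
    return bf.detokenize(resp.choices[0].message.content, safe.mapping)
```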
Anthropic / Claude
LLM Provider
Strip sensitive data before Claude processes your prompts. Works with Claude Sonnet, Opus, and Haiku via the Anthropic SDK.
pip install blindfold-sdk anthropic
npm install @blindfold/sdk @anthropic-ai/sdk
dev.blindfold:blindfold-sdk
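The same flow with the Anthropic SDK; the model id below is a placeholder, so substitute any current Claude model. Blindfold's API names follow the quick-start example on this page.

```python
# Sketch: tokenize before Claude sees the prompt, detokenize the reply.
# Assumes blindfold-sdk's Blindfold().tokenize()/.detokenize() API.
def protected_claude(prompt: str, model: str = "claude-sonnet-4-20250514") -> str:
    import anthropic                 # requires the anthropic package
    from blindfold import Blindfold  # requires blindfold-sdk

    bf = Blindfold()
    safe = bf.tokenize(prompt)
    client = anthropic.Anthropic()   # reads ANTHROPIC_API_KEY
    msg = client.messages.create(
        model=model,
        max_tokens=1024,
        messages=[{"role": "user", "content": safe.text}],
    )
    return bf.detokenize(msg.content[0].text, safe.mapping)
```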
Google Gemini
LLM Provider
PII protection for Gemini 2.5 Flash, Pro, and other Google AI models. Same tokenize-detokenize pattern.
pip install blindfold-sdk google-genai
npm install @blindfold/sdk @google/genai
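With the google-genai SDK the wiring looks like this; Blindfold's API names follow the quick-start example on this page.

```python
# Sketch: tokenize -> Gemini -> detokenize.
# Assumes blindfold-sdk's Blindfold().tokenize()/.detokenize() API.
def protected_gemini(prompt: str, model: str = "gemini-2.5-flash") -> str:
    from google import genai         # requires google-genai
    from blindfold import Blindfold  # requires blindfold-sdk

    bf = Blindfold()
    safe = bf.tokenize(prompt)
    client = genai.Client()          # reads GEMINI_API_KEY
    resp = client.models.generate_content(model=model, contents=safe.text)
    return bf.detokenize(resp.text, safe.mapping)
```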
AWS Bedrock
Cloud AI
Protect PII before invoking models on Amazon Bedrock, including Claude, Llama, and Titan.
pip install blindfold-sdk boto3
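On Bedrock the same pattern wraps the Converse API; the model id below is only an example, and Blindfold's API names follow the quick-start example on this page.

```python
# Sketch: tokenize before invoking a Bedrock model, detokenize the reply.
# Assumes blindfold-sdk's Blindfold().tokenize()/.detokenize() API.
def protected_bedrock(
    prompt: str,
    model_id: str = "anthropic.claude-3-5-sonnet-20240620-v1:0",  # example id
) -> str:
    import boto3                     # requires boto3 and AWS credentials
    from blindfold import Blindfold  # requires blindfold-sdk

    bf = Blindfold()
    safe = bf.tokenize(prompt)
    client = boto3.client("bedrock-runtime")
    resp = client.converse(
        modelId=model_id,
        messages=[{"role": "user", "content": [{"text": safe.text}]}],
    )
    reply = resp["output"]["message"]["content"][0]["text"]
    return bf.detokenize(reply, safe.mapping)
```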
Azure OpenAI
Cloud AI
Same OpenAI SDK, just with Azure endpoints. Blindfold works identically — tokenize before, detokenize after.
pip install blindfold-sdk openai
npm install @blindfold/sdk openai
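Against Azure the only change is the client construction; the environment variable names and API version below are conventional placeholders, and Blindfold's API names follow the quick-start example on this page.

```python
# Sketch: same tokenize -> chat -> detokenize flow via AzureOpenAI.
# Assumes blindfold-sdk's Blindfold().tokenize()/.detokenize() API.
def protected_azure_chat(prompt: str, deployment: str) -> str:
    import os
    from openai import AzureOpenAI   # requires openai
    from blindfold import Blindfold  # requires blindfold-sdk

    bf = Blindfold()
    safe = bf.tokenize(prompt)
    client = AzureOpenAI(
        azure_endpoint=os.environ["AZURE_OPENAI_ENDPOINT"],
        api_key=os.environ["AZURE_OPENAI_API_KEY"],
        api_version="2024-06-01",
    )
    resp = client.chat.completions.create(
        model=deployment,  # your Azure deployment name, not a model id
        messages=[{"role": "user", "content": safe.text}],
    )
    return bf.detokenize(resp.choices[0].message.content, safe.mapping)
```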
LangChain
Framework
Official langchain-blindfold integration. BlindfoldPIITransformer for documents, RunnableLambda for chains.
pip install langchain-blindfold
npm install @blindfold/sdk langchain
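The RunnableLambda approach mentioned above can be sketched generically; this is not the official langchain-blindfold wiring, just the underlying pattern using Blindfold's API as shown in the quick-start example on this page.

```python
# Sketch: sandwich any LangChain model between tokenize/detokenize lambdas.
# Assumes blindfold-sdk's Blindfold().tokenize()/.detokenize() API.
def build_protected_chain(llm):
    from langchain_core.output_parsers import StrOutputParser
    from langchain_core.runnables import RunnableLambda
    from blindfold import Blindfold

    bf = Blindfold()
    state = {}  # carries the token mapping from the input side to the output side

    def tokenize(text: str) -> str:
        safe = bf.tokenize(text)
        state["mapping"] = safe.mapping
        return safe.text

    def detokenize(text: str) -> str:
        return bf.detokenize(text, state["mapping"])

    return RunnableLambda(tokenize) | llm | StrOutputParser() | RunnableLambda(detokenize)
```

Note the mapping is stashed per-chain here; for concurrent invocations you would thread it through the chain state instead.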
LlamaIndex
Framework
Custom BlindfoldNodePostprocessor tokenizes retrieved nodes before the LLM sees them. Works with any LlamaIndex pipeline.
pip install blindfold-sdk llama-index
npm install @blindfold/sdk llamaindex
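A sketch of plugging the postprocessor into a query engine. The import path below is an assumption; check the blindfold-sdk docs for where BlindfoldNodePostprocessor actually lives.

```python
# Sketch: tokenize retrieved nodes before the LLM sees them.
# The import path is hypothetical; only the class name comes from this page.
def build_protected_query_engine(index):
    from blindfold.integrations.llama_index import BlindfoldNodePostprocessor

    return index.as_query_engine(
        node_postprocessors=[BlindfoldNodePostprocessor()],
    )
```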
Vercel AI SDK
Framework
Wrap generateText and streamText with Blindfold tokenization for PII-safe AI features in Next.js apps.
npm install @blindfold/sdk ai
Guardrails AI
Framework
Official guardrails-blindfold validator. Add PII protection as a Guardrails validation step.
pip install guardrails-blindfold
CrewAI
Agent Framework
Protect PII in agent inputs and outputs. Tokenize before crew execution, detokenize results.
pip install blindfold-sdk crewai
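The tokenize-before-kickoff idea can be sketched as a wrapper around any crew. Blindfold's API names follow the quick-start example on this page; the kickoff/inputs shape is standard CrewAI.

```python
# Sketch: tokenize every crew input, run the crew, detokenize the result.
# Assumes blindfold-sdk's Blindfold().tokenize()/.detokenize() API.
def run_protected_crew(crew, inputs: dict) -> str:
    from blindfold import Blindfold  # requires blindfold-sdk

    bf = Blindfold()
    safe_inputs = {}
    mappings = []
    for key, value in inputs.items():
        safe = bf.tokenize(str(value))
        safe_inputs[key] = safe.text   # agents only ever see tokens
        mappings.append(safe.mapping)

    result = crew.kickoff(inputs=safe_inputs)

    # Restore original values in the crew's final output
    text = str(result)
    for mapping in mappings:
        text = bf.detokenize(text, mapping)
    return text
```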
MCP (Model Context Protocol)
Protocol
Official Blindfold MCP server exposes all 8 PII operations as MCP tools for Claude Desktop and any MCP client.
npm install @blindfold/mcp-server
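For Claude Desktop, the server is registered in claude_desktop_config.json using the standard mcpServers shape. The package name comes from this page; the npx invocation is an assumption, so check the @blindfold/mcp-server README.

```json
{
  "mcpServers": {
    "blindfold": {
      "command": "npx",
      "args": ["-y", "@blindfold/mcp-server"]
    }
  }
}
```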
Works with any provider
The pattern is always the same — tokenize, process, detokenize.
from blindfold import Blindfold

bf = Blindfold()  # Free local mode, or add api_key="..." for NLP

# 1. Tokenize PII before sending to any AI
safe = bf.tokenize("Hi, I'm John Smith. My email is john@acme.com")
# safe.text → "Hi, I'm <Person_1>. My email is <Email Address_1>"

# 2. Send to any AI provider
response = your_ai_provider.chat(safe.text)

# 3. Restore original data
result = bf.detokenize(response, safe.mapping)
Start protecting PII in 5 minutes
Install the SDK. No signup required for local mode.
pip install blindfold-sdk
npm install @blindfold/sdk
dev.blindfold:blindfold-sdk