PII protection for every AI provider

Blindfold is a provider-agnostic PII SDK. Tokenize sensitive data before sending to any LLM, get the AI response, then detokenize to restore original values. Works with Python, JavaScript, and Java.

How it works

1. Tokenize: Replace PII with safe tokens before sending to your AI provider.
2. Process with AI: Send tokenized text to OpenAI, Anthropic, Gemini, or any LLM.
3. Detokenize: Restore original PII in the AI response using the token mapping.

Integrations

Install the SDK, add two lines of code, and PII never reaches your AI provider.

OpenAI

LLM Provider

Protect PII in prompts before sending to GPT-4o, GPT-4, and other OpenAI models. Tokenize on the way in, detokenize on the way out.

pip install blindfold-sdk openai

npm install @blindfold/sdk openai

dev.blindfold:blindfold-sdk

View docs →
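A minimal sketch of the round trip with the OpenAI SDK. The helper name and model ID are illustrative; the Blindfold calls mirror the generic example on this page.

```python
def ask_gpt_safely(prompt: str) -> str:
    # Imports kept inside the helper so the sketch only needs the
    # dependencies when it is actually called.
    from blindfold import Blindfold
    from openai import OpenAI

    bf = Blindfold()
    client = OpenAI()  # reads OPENAI_API_KEY from the environment

    # 1. Tokenize PII before the prompt leaves your process
    safe = bf.tokenize(prompt)

    # 2. Send only the tokenized text to OpenAI
    response = client.chat.completions.create(
        model="gpt-4o",
        messages=[{"role": "user", "content": safe.text}],
    )

    # 3. Restore the original values in the reply
    return bf.detokenize(response.choices[0].message.content, safe.mapping)
```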

Anthropic / Claude

LLM Provider

Strip sensitive data before Claude processes your prompts. Works with Claude Sonnet, Opus, and Haiku via the Anthropic SDK.

pip install blindfold-sdk anthropic

npm install @blindfold/sdk @anthropic-ai/sdk

dev.blindfold:blindfold-sdk

View docs →
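The same pattern with the Anthropic SDK, as a hedged sketch (helper name and model ID are illustrative):

```python
def ask_claude_safely(prompt: str) -> str:
    from blindfold import Blindfold
    import anthropic

    bf = Blindfold()
    client = anthropic.Anthropic()  # reads ANTHROPIC_API_KEY

    safe = bf.tokenize(prompt)  # PII -> placeholder tokens

    message = client.messages.create(
        model="claude-sonnet-4-20250514",  # any Claude model works here
        max_tokens=1024,
        messages=[{"role": "user", "content": safe.text}],
    )

    # Claude's reply refers to the tokens; map them back to real values
    return bf.detokenize(message.content[0].text, safe.mapping)
```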

Google Gemini

LLM Provider

PII protection for Gemini 2.5 Flash, Pro, and other Google AI models. Same tokenize-detokenize pattern.

pip install blindfold-sdk google-genai

npm install @blindfold/sdk @google/genai

View docs →
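A sketch with the google-genai SDK (helper name is illustrative; the Blindfold calls follow the generic example on this page):

```python
def ask_gemini_safely(prompt: str) -> str:
    from blindfold import Blindfold
    from google import genai

    bf = Blindfold()
    client = genai.Client()  # reads GEMINI_API_KEY

    safe = bf.tokenize(prompt)  # strip PII first

    response = client.models.generate_content(
        model="gemini-2.5-flash",
        contents=safe.text,
    )

    # restore the original values in Gemini's reply
    return bf.detokenize(response.text, safe.mapping)
```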

AWS Bedrock

Cloud AI

Protect PII before invoking models on Amazon Bedrock including Claude, Llama, and Titan.

pip install blindfold-sdk boto3

View docs →
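A sketch using the Bedrock Converse API, which gives one call shape across Claude, Llama, Titan, and other hosted models (the helper name is illustrative):

```python
def ask_bedrock_safely(prompt: str, model_id: str) -> str:
    from blindfold import Blindfold
    import boto3

    bf = Blindfold()
    client = boto3.client("bedrock-runtime")

    safe = bf.tokenize(prompt)  # tokenize before invoking the model

    response = client.converse(
        modelId=model_id,
        messages=[{"role": "user", "content": [{"text": safe.text}]}],
    )

    reply = response["output"]["message"]["content"][0]["text"]
    return bf.detokenize(reply, safe.mapping)
```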

Azure OpenAI

Cloud AI

Same OpenAI SDK, just with Azure endpoints. Blindfold works identically — tokenize before, detokenize after.

pip install blindfold-sdk openai

npm install @blindfold/sdk openai

View docs →
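Because Azure OpenAI uses the same SDK, only the client construction changes. A sketch (helper name and API version are illustrative; endpoint and key come from the `AZURE_OPENAI_ENDPOINT` / `AZURE_OPENAI_API_KEY` environment variables):

```python
def ask_azure_safely(prompt: str, deployment: str) -> str:
    from blindfold import Blindfold
    from openai import AzureOpenAI

    bf = Blindfold()
    client = AzureOpenAI(api_version="2024-06-01")

    safe = bf.tokenize(prompt)  # tokenize before

    response = client.chat.completions.create(
        model=deployment,  # Azure uses your deployment name as the model
        messages=[{"role": "user", "content": safe.text}],
    )

    # detokenize after
    return bf.detokenize(response.choices[0].message.content, safe.mapping)
```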

LangChain

Framework

Official langchain-blindfold integration. BlindfoldPIITransformer for documents, RunnableLambda for chains.

pip install langchain-blindfold

npm install @blindfold/sdk langchain

View docs →
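The RunnableLambda pattern can be sketched with the core SDK alone (see the langchain-blindfold docs for the packaged helpers). The mutable `state` dict below is a simplification that is not safe for concurrent chain runs:

```python
def build_pii_safe_chain():
    from blindfold import Blindfold
    from langchain_core.runnables import RunnableLambda
    from langchain_openai import ChatOpenAI

    bf = Blindfold()
    state = {}  # carries the token mapping from tokenize to detokenize

    def tokenize(text: str) -> str:
        safe = bf.tokenize(text)
        state["mapping"] = safe.mapping
        return safe.text

    def detokenize(message) -> str:
        return bf.detokenize(message.content, state["mapping"])

    # tokenize -> LLM -> detokenize, composed as one runnable pipeline
    return (
        RunnableLambda(tokenize)
        | ChatOpenAI(model="gpt-4o")
        | RunnableLambda(detokenize)
    )
```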

LlamaIndex

Framework

Custom BlindfoldNodePostprocessor tokenizes retrieved nodes before the LLM sees them. Works with any LlamaIndex pipeline.

pip install blindfold-sdk llama-index

npm install @blindfold/sdk llamaindex

View docs →
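A sketch of what such a postprocessor can look like, built on LlamaIndex's `BaseNodePostprocessor` (the factory function is illustrative; consult the docs for the packaged class):

```python
def make_blindfold_postprocessor():
    from typing import List, Optional

    from blindfold import Blindfold
    from llama_index.core.postprocessor.types import BaseNodePostprocessor
    from llama_index.core.schema import NodeWithScore, QueryBundle

    bf = Blindfold()

    class BlindfoldNodePostprocessor(BaseNodePostprocessor):
        def _postprocess_nodes(
            self,
            nodes: List[NodeWithScore],
            query_bundle: Optional[QueryBundle] = None,
        ) -> List[NodeWithScore]:
            # Tokenize each retrieved chunk before the LLM sees it
            for n in nodes:
                safe = bf.tokenize(n.node.get_content())
                n.node.set_content(safe.text)
            return nodes

    return BlindfoldNodePostprocessor()
```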

Vercel AI SDK

Framework

Wrap generateText and streamText with Blindfold tokenization for PII-safe AI features in Next.js apps.

npm install @blindfold/sdk ai

View docs →

Guardrails AI

Framework

Official guardrails-blindfold validator. Add PII protection as a Guardrails validation step.

pip install guardrails-blindfold

View docs →
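Usage follows the standard Guardrails validator pattern. The validator class name `BlindfoldPII` and its import path below are hypothetical placeholders; check the guardrails-blindfold docs for the real export:

```python
def build_guard():
    from guardrails import Guard
    from guardrails_blindfold import BlindfoldPII  # hypothetical import path

    # Attach PII protection as a validation step on the guard
    return Guard().use(BlindfoldPII, on_fail="fix")
```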

CrewAI

Agent Framework

Protect PII in agent inputs and outputs. Tokenize before crew execution, detokenize results.

pip install blindfold-sdk crewai

View docs →
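A single-input sketch of the pattern around a crew (the helper name and the `topic` input key are illustrative; `crew` is an assembled `crewai.Crew`):

```python
def run_crew_safely(crew, topic: str) -> str:
    from blindfold import Blindfold

    bf = Blindfold()

    safe = bf.tokenize(topic)                       # tokenize before kickoff
    result = crew.kickoff(inputs={"topic": safe.text})
    return bf.detokenize(str(result), safe.mapping)  # detokenize the output
```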

MCP (Model Context Protocol)

Protocol

Official Blindfold MCP server exposes all 8 PII operations as MCP tools for Claude Desktop and any MCP client.

npm install @blindfold/mcp-server

View docs →
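To register the server with Claude Desktop, an entry along these lines goes in `claude_desktop_config.json` (the server key and args are illustrative; see the docs for the exact invocation):

```json
{
  "mcpServers": {
    "blindfold": {
      "command": "npx",
      "args": ["-y", "@blindfold/mcp-server"]
    }
  }
}
```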

Works with any provider

The pattern is always the same — tokenize, process, detokenize.

```python
from blindfold import Blindfold

bf = Blindfold()  # Free local mode, or add api_key="..." for NLP

# 1. Tokenize PII before sending to any AI
safe = bf.tokenize("Hi, I'm John Smith. My email is john@acme.com")
# safe.text → "Hi, I'm <Person_1>. My email is <Email Address_1>"

# 2. Send to any AI provider
response = your_ai_provider.chat(safe.text)

# 3. Restore original data
result = bf.detokenize(response, safe.mapping)
```

Start protecting PII in 5 minutes

Install the SDK. No signup required for local mode.

pip install blindfold-sdk

npm install @blindfold/sdk

dev.blindfold:blindfold-sdk

Start for free →