nip · about 3 hours ago


Hey HN!

I built SimplePDF Copilot: an AI assistant that can interact with the PDF editor. It fills fields, answers questions, focuses on a specific field, adds fields, deletes pages, and so on.

It's built on top of SimplePDF, which I started 7 years ago, pioneering privacy-respecting client-side PDF editing; it's now used monthly by 200k+ people.

As for the privacy model: the PDF itself never leaves the browser. Parsing, rendering, and field detection all run client-side.

The text the model needs (and your messages) goes to whatever LLM you point it at. By default that's our demo proxy (DeepSeek V4 Flash, rate-capped), but you can BYOK and point it at any cloud provider, or go fully local (I've been testing with LM Studio).
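For the local route, pointing the Vercel AI SDK at LM Studio's OpenAI-compatible endpoint looks roughly like this (a sketch, not Copilot's actual config; the port is LM Studio's default and the model id is a placeholder for whatever model you've loaded):

```typescript
import { createOpenAI } from "@ai-sdk/openai";

// LM Studio serves an OpenAI-compatible API on localhost:1234 by default.
const lmstudio = createOpenAI({
  baseURL: "http://localhost:1234/v1",
  apiKey: "lm-studio", // LM Studio ignores the key, but the client expects one
});

// The model id must match the model loaded in LM Studio (placeholder here).
export const localModel = lmstudio("qwen2.5-7b-instruct");
```

With this shape, swapping between a cloud provider and a local model is just a different `baseURL` and key.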

Unlike the existing "Chat with PDF" tools that only retrieve the text/OCR layer, Copilot can act on the PDF: filling fields, adding fields (detected client-side using CommonForms by Joe Barrow [1], jbarrow on HN with some post-processing heuristics I added on top), focusing on fields, deleting pages, and so on.

I built this because SimplePDF is mostly used by healthcare customers where document privacy is paramount, and I wanted an AI experience that didn't require shipping PII to a third party. Stack is pretty standard:

- Tanstack Start

- AI SDK from Vercel

- Tailwind (I personally prefer CSS modules, I'm old-school, but since I'm open-sourcing this, I figured Tailwind would be a better fit)

The more interesting part is the client-side tool calling: events are passed back and forth via iframe postMessage.
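To make that concrete, here's a minimal sketch of the dispatch side: the editor iframe receives a tool call, performs the action, and returns a result for the parent to forward back to the LLM. The message shapes and tool names are illustrative, not SimplePDF's actual protocol.

```typescript
// A tool call as the iframe might receive it via postMessage.
type ToolCall =
  | { tool: "fill_field"; args: { fieldId: string; value: string } }
  | { tool: "focus_field"; args: { fieldId: string } }
  | { tool: "delete_page"; args: { pageIndex: number } };

interface ToolResult {
  tool: string;
  ok: boolean;
}

// The subset of editor operations the tools map onto (hypothetical).
interface EditorActions {
  fillField(id: string, value: string): void;
  focusField(id: string): void;
  deletePage(index: number): void;
}

// Called for each incoming message; the result would be posted back
// to the parent window, which forwards it to the LLM as the tool output.
function dispatchToolCall(call: ToolCall, editor: EditorActions): ToolResult {
  switch (call.tool) {
    case "fill_field":
      editor.fillField(call.args.fieldId, call.args.value);
      return { tool: call.tool, ok: true };
    case "focus_field":
      editor.focusField(call.args.fieldId);
      return { tool: call.tool, ok: true };
    case "delete_page":
      editor.deletePage(call.args.pageIndex);
      return { tool: call.tool, ok: true };
  }
}
```

The discriminated union keeps the dispatch exhaustive: adding a new tool without handling it becomes a compile error rather than a silent no-op.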

If you're not familiar with "tool calling" and "client-side tool calling", a quick primer:

Tool calling is what LLMs use to take actions. When Claude runs grep or ls, or hits an MCP server, those are tool calls.

Client-side tool calling means the intent to call a tool comes from the LLM, but the execution happens in the browser.

That matters for two reasons: speed (you can't go faster than client-to-client operations) and control over what data you expose to the LLM. For the demo I do feed the content of the document to the LLM, but that connection could be severed as simply as removing the tool that exposes the content data.
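That "sever the connection" point is just a matter of which tools you register. A toy sketch (tool names are illustrative, not Copilot's actual tool set):

```typescript
// A tool the model can invoke; only its return value ever reaches the LLM.
type ToolDef = {
  description: string;
  execute: (args: any) => string;
};

// The model can only see data that some registered tool returns.
// Omitting the content tool means document text never reaches the LLM,
// while action tools (fill, focus, delete) still work.
function buildToolset(exposeContent: boolean): Record<string, ToolDef> {
  const tools: Record<string, ToolDef> = {
    fill_field: {
      description: "Fill a form field by id",
      execute: ({ fieldId }) => `filled ${fieldId}`,
    },
  };
  if (exposeContent) {
    tools.get_document_text = {
      description: "Return the PDF's text layer",
      execute: () => "<document text would go here>",
    };
  }
  return tools;
}
```

The privacy boundary lives in one place: the toolset you hand to the model, not scattered across the app.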

The demo is fully open source on GitHub [2], and it's the same demo linked from this post [3].

What's not open source is SimplePDF itself (loaded as the iframe).

I could go on and on about this. Let me know if you have any questions, anything goes!

[1] https://github.com/jbarrow/commonforms

[2] https://github.com/SimplePDF/simplepdf-embed/tree/main/copil...

[3] https://copilot.simplepdf.com/?share=a7d00ad073c75a75d493228...


Discussion (9 Comments)

iamflimflam1 · about 2 hours ago
Might be worth making it clearer that the chat messages are going to a remote server. So any PII data is leaving the local machine.
nip · about 2 hours ago
I tried to make it clear with the popup message that appears when you start chatting: "Public demo. Use sample data only. Messages are processed by the selected AI provider."

But you're right that it's not as evident as I wanted it to be; I'm making a small copy update to make it clearer: "Public demo. Your chat messages leave your device and are sent to the selected AI provider. Use sample data only."

(Since there's support for local models, the popup is only displayed when NOT using your own model)

Thanks!

EDIT: the copy update is live, thanks again!

kiney · 14 minutes ago
Does it support XFA forms?
nip · 8 minutes ago
Hey Kiney!

It supports AcroForms (like in the example) but not XFA.

Why are you asking? gov forms support?

grahammccain · about 2 hours ago
Keep going though. I'm definitely looking for something like this once we can get something secure we can use with proprietary and PII data.
FrasiertheLion · 34 minutes ago
This is the canonical use case for Tinfoil: https://tinfoil.sh/inference. It provides verifiably private AI inference with frontier open source models: https://docs.tinfoil.sh/models/overview

Disclaimer: I'm the cofounder; I'm only recommending it because it's legitimately the right shape for your problem.

nip · about 2 hours ago
Thanks!

Anything you see missing in Copilot to achieve that?

Not sure if you noticed, but there's an arch-diagram in the info popup [1].

[1] https://copilot.simplepdf.com/?share=a7d00ad073c75a75d493228...

simianwords · 11 minutes ago
It looks cool, but how is this different from me uploading to ChatGPT and asking it to fill it in?
nip · about 3 hours ago
Just to be clear, this is a technical demo showing what's possible with client-side tool calling + local models: LLM-assisted form filling where no document data has to leave the user's machine.

Use cases range from:

- Filling foreign-language forms

- Navigating a contract before signing: "can I trust ALL the clauses here?"

- Pre-filling repetitive forms from existing data sources (CRM, EHR, etc. via MCP/RAG)

Copilot is designed to be embedded; our customers ship it white-labeled inside their own products.