Reddit’s API is effectively dead for archival. Third-party apps are gone. Reddit has threatened to cut off access to the Pushshift dataset multiple times. But 3.28TB of Reddit history exists as a torrent right now, and I built a tool to turn it into something you can browse on your own hardware.
The key point: This doesn’t touch Reddit’s servers. Ever. Download the Pushshift dataset, run my tool locally, get a fully browsable archive. Works on an air-gapped machine. Works on a Raspberry Pi serving your LAN. Works on a USB drive you hand to someone.
What it does: Takes compressed data dumps from Reddit (.zst), Voat (SQL), and Ruqqus (.7z) and generates static HTML. No JavaScript, no external requests, no tracking. Open index.html and browse. Want search? Run the optional Docker stack with PostgreSQL – still entirely on your machine.
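The static-generation step is conceptually simple once the dump is decompressed. A minimal sketch, assuming the .zst file has already been expanded to newline-delimited JSON (e.g. `zstd -d RS_2015-01.zst`) and that fields follow the usual Pushshift schema (`id`, `title`, `author`, `selftext`); the real tool's templates and layout will differ:

```python
import json
import html
from pathlib import Path

# One self-contained HTML file per post, no JavaScript, no external requests.
PAGE = """<!doctype html>
<html><head><meta charset="utf-8"><title>{title}</title></head>
<body><h1>{title}</h1><p>by {author}</p><div>{body}</div></body></html>"""

def render_posts(ndjson_path, out_dir):
    """Read newline-delimited JSON posts and write one static page each."""
    out = Path(out_dir)
    out.mkdir(parents=True, exist_ok=True)
    count = 0
    with open(ndjson_path, encoding="utf-8") as fh:
        for line in fh:
            post = json.loads(line)
            page = PAGE.format(
                title=html.escape(post.get("title", "(untitled)")),
                author=html.escape(post.get("author", "[deleted]")),
                body=html.escape(post.get("selftext", "")),
            )
            (out / f"{post['id']}.html").write_text(page, encoding="utf-8")
            count += 1
    return count
```

Because each line is processed independently, memory stays flat no matter how large the dump is.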
API & AI Integration: Full REST API with 30+ endpoints – posts, comments, users, subreddits, full-text search, aggregations. Also ships with an MCP server (29 tools) so you can query your archive directly from AI tools.
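Querying the API is ordinary HTTP. A hypothetical client helper as a sketch; the `/api/search` path and the `q`/`subreddit`/`limit` parameter names are assumptions for illustration, so check the project's endpoint list for the real routes:

```python
from urllib.parse import urlencode, urljoin

def build_search_url(base, query, subreddit=None, limit=25):
    """Build a full-text-search URL against a local archive instance.

    The endpoint path and parameter names are illustrative, not taken
    from the project's actual route table.
    """
    params = {"q": query, "limit": limit}
    if subreddit:
        params["subreddit"] = subreddit
    return urljoin(base, "/api/search") + "?" + urlencode(params)
```

From there it is a plain GET, e.g. `requests.get(build_search_url("http://localhost:8080", "zfs"))` if you use `requests` — everything stays on localhost.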
Self-hosting options:
- USB drive / local folder (just open the HTML files)
- Home server on your LAN
- Tor hidden service (2 commands, no port forwarding needed)
- VPS with HTTPS
- GitHub Pages for small archives
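For the Tor option, the standard onion-service configuration is a two-line torrc fragment. A sketch assuming the archive is already being served on localhost port 8080 (e.g. `python3 -m http.server 8080` from the archive directory); the directory path is illustrative:

```
# /etc/tor/torrc — map a v3 onion service onto the local archive server
HiddenServiceDir /var/lib/tor/redd-archiver/
HiddenServicePort 80 127.0.0.1:8080
```

After restarting tor, the onion address appears in `HiddenServiceDir`/hostname. No port forwarding, no public IP needed.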
Why this matters: Once you have the data, you own it. No API keys, no rate limits, no ToS changes can take it away.
Scale: Tens of millions of posts per instance. PostgreSQL backend keeps memory constant regardless of dataset size. For the full 2.38B post dataset, run multiple instances by topic.
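The constant-memory claim comes down to streaming rows into the database in fixed-size batches instead of materializing the dataset. A generic sketch of that technique (the project's actual loader may differ):

```python
from itertools import islice

def batched(rows, size=5000):
    """Yield fixed-size chunks from an arbitrarily large iterable.

    Memory usage is proportional to `size`, not to the total number of
    rows, which is what keeps ingestion flat on multi-million-post dumps.
    """
    it = iter(rows)
    while True:
        chunk = list(islice(it, size))
        if not chunk:
            return
        yield chunk
```

Paired with a DB driver this becomes something like `for chunk in batched(read_dump(path)): cur.executemany(INSERT_SQL, chunk)` (names hypothetical).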
How I built it: Python, PostgreSQL, Jinja2 templates, Docker. Used Claude Code throughout as an experiment in AI-assisted development. Learned that the workflow is “trust but verify” – it accelerates the boring parts but you still own the architecture.
Live demo: https://online-archives.github.io/redd-archiver-example/ GitHub: https://github.com/19-84/redd-archiver (Public Domain)
Pushshift torrent: https://academictorrents.com/details/1614740ac8c94505e4ecb9d88be8bed7b6afddd4



Would love to see you learn an entire foreign language just so you can communicate with the world without being laughed at by people as hostile as yourself.
They said it wasn’t their “first” language, which leads me to believe that they do speak English. If that’s the case, then they are indeed kind of lazy. There have already been studies on the impact of AI-mediated communication, and the results are not positive.
This isn’t something I’d point out and criticize in others, just something I wouldn’t do myself. Take the time to express your own ideas in your own words. The long-term cost is higher than the short-term gains.
Hey I drove to the library, picked up all these things you needed, got dinner here ya go, free!
You drove? man that’s lazy…
He used AI to clean up translation and save time after he spent a fuck ton of time curating and delivering us a helpful product. Calling him out as lazy is an awful take.
there are the so-called activists who complain a lot, and then there are the activists who deliver projects and code… enough said
I have A1 and A2 level in a couple of non-first languages. Technically I can speak them; realistically I can’t and won’t be able to communicate anything more complex than “here, take a look.”
So I don’t agree with your absolutist stance.
I can’t even learn my own language!