How It Works

MyOnionSearch is powered by two main components: a Go backend that serves the API and this website, and a separate Go crawler that maintains the index. Keeping them separate means the user-facing service stays responsive while the crawler works in the background, and either component can be restarted or updated without taking the other down.

1. The Backend (Go API)

The main application you interact with is a web server written in Go. It has two jobs:

- Answer search queries through the public API.
- Serve the pages of this website.
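For a sense of how small such a service can be, here is a minimal sketch of a Go server doing both jobs. The /api/search route, the q parameter, the Result type, and the ./static directory are illustrative assumptions, not the actual names MyOnionSearch uses.

```go
package main

import (
	"encoding/json"
	"log"
	"net/http"
)

// Result is a hypothetical shape for one search hit.
type Result struct {
	URL   string `json:"url"`
	Title string `json:"title"`
}

func main() {
	mux := http.NewServeMux()

	// Job 1: answer search queries over the API.
	mux.HandleFunc("/api/search", func(w http.ResponseWriter, r *http.Request) {
		q := r.URL.Query().Get("q")
		if q == "" {
			http.Error(w, "missing q parameter", http.StatusBadRequest)
			return
		}
		// The real service would match q against the MySQL index;
		// this sketch returns an empty result set.
		w.Header().Set("Content-Type", "application/json")
		json.NewEncoder(w).Encode([]Result{})
	})

	// Job 2: serve the website's static HTML and CSS.
	mux.Handle("/", http.FileServer(http.Dir("./static")))

	log.Fatal(http.ListenAndServe(":8080", mux))
}
```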

2. The Crawler (Go Worker)

A separate, long-running Go application acts as our index maintainer. It runs in a continuous loop to perform three critical jobs:
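The exact task list isn't spelled out here, but a worker of this kind typically cycles through the known sites and records which ones are still reachable over Tor. The sketch below assumes the Tor SOCKS proxy at its default 127.0.0.1:9050, a one-hour interval, and a hard-coded seed list; the real crawler's configuration and jobs may differ.

```go
package main

import (
	"context"
	"log"
	"net/http"
	"net/url"
	"time"
)

// checkSite sends a HEAD request to an .onion address through the local
// Tor SOCKS proxy and reports whether the site answered.
func checkSite(client *http.Client, addr string) bool {
	ctx, cancel := context.WithTimeout(context.Background(), 30*time.Second)
	defer cancel()
	req, err := http.NewRequestWithContext(ctx, http.MethodHead, addr, nil)
	if err != nil {
		return false
	}
	resp, err := client.Do(req)
	if err != nil {
		return false
	}
	resp.Body.Close()
	return resp.StatusCode < 500
}

func main() {
	// 127.0.0.1:9050 is Tor's default SOCKS port; whether the real
	// crawler uses it is an assumption.
	proxyURL, _ := url.Parse("socks5://127.0.0.1:9050")
	client := &http.Client{Transport: &http.Transport{Proxy: http.ProxyURL(proxyURL)}}

	// Hypothetical seed list; the real crawler would read its sites
	// from the MySQL index and write results back to it.
	sites := []string{"http://exampleonionaddress.onion/"}

	ticker := time.NewTicker(time.Hour) // interval is an assumption
	defer ticker.Stop()
	for {
		for _, s := range sites {
			log.Printf("%s reachable: %v", s, checkSite(client, s))
		}
		<-ticker.C // wait for the next crawl cycle
	}
}
```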

3. The Database (MySQL)

We use a MySQL database to store the public information for our index. The sites table contains:

Crucially, this database does not contain any tables for user accounts, search histories, or IP logs.
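To give a feel for how the backend might read that table, here is a sketch using Go's database/sql with the go-sql-driver/mysql driver. The DSN and the column names (url, title) are assumptions for illustration; the actual schema is not published on this page.

```go
package main

import (
	"database/sql"
	"fmt"
	"log"

	_ "github.com/go-sql-driver/mysql" // MySQL driver
)

func main() {
	// Credentials, host, and column names are placeholders.
	db, err := sql.Open("mysql", "search:password@tcp(127.0.0.1:3306)/onionsearch")
	if err != nil {
		log.Fatal(err)
	}
	defer db.Close()

	// A simple title match against the sites table.
	rows, err := db.Query("SELECT url, title FROM sites WHERE title LIKE ? LIMIT 20", "%library%")
	if err != nil {
		log.Fatal(err)
	}
	defer rows.Close()

	for rows.Next() {
		var url, title string
		if err := rows.Scan(&url, &title); err != nil {
			log.Fatal(err)
		}
		fmt.Println(title, url)
	}
	if err := rows.Err(); err != nil {
		log.Fatal(err)
	}
}
```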

4. The Frontend (HTML)

This website is built with plain HTML and a pre-compiled CSS file. There is no large framework, no WebAssembly, and no complex rendering. This makes it:

- Small and quick to load, even over Tor.
- Simple to audit: you can read every page's source directly.
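As an illustration of that approach, results pages can be rendered entirely on the server with Go's html/template, so the browser only ever receives finished HTML. The template markup, route, and field names below are assumptions, not the site's actual source.

```go
package main

import (
	"html/template"
	"log"
	"net/http"
)

// page is a hypothetical view model for the results template.
type page struct {
	Query   string
	Results []struct{ URL, Title string }
}

// The markup is rendered on the server; the browser receives plain HTML.
var resultsTmpl = template.Must(template.New("results").Parse(`<!doctype html>
<title>MyOnionSearch</title>
<ul>{{range .Results}}<li><a href="{{.URL}}">{{.Title}}</a></li>{{end}}</ul>`))

func resultsHandler(w http.ResponseWriter, r *http.Request) {
	p := page{Query: r.URL.Query().Get("q")}
	// Results would come from the MySQL index; left empty in this sketch.
	if err := resultsTmpl.Execute(w, p); err != nil {
		log.Println(err)
	}
}

func main() {
	http.HandleFunc("/search", resultsHandler)
	log.Fatal(http.ListenAndServe(":8080", nil))
}
```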