A .onion website finder and verifier. You must have Tor configured correctly; you can use the commands below, and you should get the same output.
- Initial Seed URL Queue: The `queue_initial_urls` function queues the initial URLs that the program will begin crawling from. These URLs should ideally be directories or known sources of `.onion` URLs.
- URL Extraction and Queueing: `extract_and_queue_onion_urls` is responsible for extracting `.onion` URLs from the fetched page content and queuing them for further processing.
- Thread Function: `thread_func` constantly pulls URLs from the queue and processes them to find more `.onion` URLs. It keeps running until the queue is empty.
- Multi-threading: The program creates multiple threads (`MAX_THREADS` is set to 10) to process URLs concurrently. This significantly improves the speed of finding `.onion` URLs.
- Recursive Crawling: Each thread fetches a URL, extracts any `.onion` URLs from it, and queues them for further processing. This way, the program recursively discovers new `.onion` URLs.
- Install Necessary Tools: Make sure to have the required tools installed:

  ```shell
  sudo apt-get update
  sudo apt-get install gcc libcurl4-openssl-dev tor
  ```
- Start Tor: Ensure the Tor service is running:

  ```shell
  sudo service tor start
  ```
- Create the Program File: Create a file named `OFarm.c` and paste the provided code:

  ```shell
  nano OFarm.c
  ```

  Paste the code into this file, save with `Ctrl + O`, and exit with `Ctrl + X`.
- Compile the Program: Compile the program using `gcc`:

  ```shell
  gcc -o OFarm OFarm.c -lcurl -lpthread
  ```
- Run the Program: Execute the compiled program:

  ```shell
  ./OFarm
  ```
- Monitor the Output: The program will start by processing the seed URLs and will continue to find and queue new `.onion` URLs. The output will display any discovered `.onion` URLs along with their source.
- Adding Real Seed URLs: Replace the placeholder URLs in the `seed_urls[]` array with actual `.onion` directory URLs or other sources of `.onion` links.
- Handling Dead URLs: You could enhance the error handling to manage dead URLs, possibly removing them from the queue or logging them for later analysis.
This guide provides a detailed implementation and step-by-step instructions to create a self-sufficient .onion finder that starts with its own seed URLs. By continuously crawling and discovering new .onion sites, the program can autonomously gather and verify .onion URLs in the background.