A scalable job import system that fetches job listings from external XML APIs, queues them with Redis + BullMQ, imports them into MongoDB through worker processes, and provides a Next.js Admin UI for viewing import history on top of an Express.js backend.
Demo: Link
✅ Fetch job feeds from multiple XML APIs (Jobicy, HigherEdJobs)
✅ Parse XML → JSON and normalize data
✅ Queue-based background import using Redis & BullMQ
✅ Worker processes handle inserts/updates asynchronously
✅ Hourly Cron Job auto-fetches latest data
✅ Import logs with counts (new, updated, failed, total)
✅ Next.js admin dashboard to view import history
✅ Fully Dockerized for consistent development setup
| Layer | Technology |
|---|---|
| Frontend | Next.js (React + TypeScript + Tailwind CSS) |
| Backend | Node.js (Express + TypeScript) |
| Database | MongoDB + Mongoose |
| Queue | BullMQ (Redis-based) |
| Scheduler | node-cron |
| Containerization | Docker + Docker Compose |
External XML APIs
↓
[ Cron Job (Hourly) ]
↓
Job Service (fetch + parse XML)
↓
Redis Queue (BullMQ)
↓
Worker (process + MongoDB insert/update)
↓
Import Logs Collection
↓
Next.js UI (View import history)
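A minimal sketch of the fetch → parse → enqueue step in this flow, assuming axios for HTTP and xml2js for XML parsing (the actual jobService.ts may use different libraries and data shapes):

```ts
// jobService sketch: fetch one XML feed, normalize items, and enqueue a batch.
// Assumes axios + xml2js and a Redis host named "redis" (as in the dev compose setup).
import axios from "axios";
import { parseStringPromise } from "xml2js";
import { Queue } from "bullmq";

const importQueue = new Queue("job-import", {
  connection: { host: "redis", port: 6379 },
});

export async function fetchAndEnqueue(feedUrl: string): Promise<void> {
  // 1. Fetch the raw XML feed
  const { data: xml } = await axios.get<string>(feedUrl);

  // 2. Parse XML -> JSON and pull out the feed items
  const parsed = await parseStringPromise(xml, { explicitArray: false });
  const items = parsed?.rss?.channel?.item ?? [];
  const jobs = Array.isArray(items) ? items : [items];

  // 3. Enqueue one batch per feed; the worker handles MongoDB inserts/updates
  await importQueue.add("import-feed", { fileName: feedUrl, jobs });
}
```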
/client → Next.js Frontend (Admin UI)
/server → Node.js Backend API + Workers + Cron
/docs/architecture.md
/README.md
/docker-compose.dev.yml
1️⃣ Clone the repository
git clone https://github.com/jaya6400/queue-job-app.git
cd queue-job-app
2️⃣ Environment setup
Create /server/.env:
PORT=4000
MONGO_URI=mongodb://mongo:27017/job_import_db
REDIS_URL=redis://redis:6379
Create /client/.env.local:
NEXT_PUBLIC_API_URL=http://server:4000
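A rough sketch of how the backend can consume these variables (illustrative only; the variable names match the .env above, but the bootstrap code itself is an assumption):

```ts
// config sketch: load .env values and open the MongoDB / Redis connections.
import "dotenv/config";
import mongoose from "mongoose";
import IORedis from "ioredis";

export const port = Number(process.env.PORT ?? 4000);

// Shared Redis connection for BullMQ queues and workers
export const redisConnection = new IORedis(process.env.REDIS_URL ?? "redis://redis:6379", {
  maxRetriesPerRequest: null, // BullMQ workers require this setting
});

export async function connectMongo(): Promise<void> {
  await mongoose.connect(process.env.MONGO_URI ?? "mongodb://mongo:27017/job_import_db");
  console.log("MongoDB connected");
}
```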
3️⃣ Start all services
docker compose -f docker-compose.dev.yml up --build
This starts:
🖥️ client → Next.js (port 3000)
⚙️ server → Node + Express (port 4000)
🗃️ mongo → MongoDB (port 27017)
💾 redis → Redis (port 6379)
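Roughly what docker-compose.dev.yml wires together (an illustrative sketch; the file in the repo is the source of truth and may, for example, load the .env files instead of inlining values):

```yaml
# Sketch of the dev stack; see docker-compose.dev.yml for the real definitions.
services:
  client:
    build: ./client
    ports: ["3000:3000"]
    environment:
      - NEXT_PUBLIC_API_URL=http://server:4000
    depends_on: [server]
  server:
    build: ./server
    ports: ["4000:4000"]
    environment:
      - PORT=4000
      - MONGO_URI=mongodb://mongo:27017/job_import_db
      - REDIS_URL=redis://redis:6379
    depends_on: [mongo, redis]
  mongo:
    image: mongo:7
    ports: ["27017:27017"]
  redis:
    image: redis:7
    ports: ["6379:6379"]
```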
🌐 Access Service URLs:
- Frontend (Next.js): http://localhost:3000
- Backend API: http://localhost:4000/api/import-logs
- Enqueue test route: http://localhost:4000/api/enqueue-test-job
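Quick smoke test from the host once the stack is up:

```bash
# Enqueue a dummy import job, then confirm a log entry appears
curl http://localhost:4000/api/enqueue-test-job
curl http://localhost:4000/api/import-logs
```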
🛠 Development & Debugging Commands:
Enter a running container (example: server)
docker exec -it queue-job-app-server-1 bash
Stop and delete all containers + volumes
docker-compose -f docker-compose.dev.yml down -v
Rebuild everything cleanly (no cache)
docker-compose -f docker-compose.dev.yml build --no-cache
Recreate and start all containers in detached mode
docker-compose -f docker-compose.dev.yml up --force-recreate -d
🧩 Key Routes
GET /api/import-logs
- Returns all import history logs from MongoDB.
GET /api/enqueue-test-job
- Adds a dummy job import to the queue (for testing).
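A hedged sketch of what these handlers might look like on the Express side (the model and queue names here are assumptions, not the repo's exact code):

```ts
// routes sketch: illustrative only; the actual handlers live in the server codebase.
import { Router } from "express";
import { Queue } from "bullmq";
import { ImportLog } from "../models/importLog"; // assumed Mongoose model

const importQueue = new Queue("job-import", {
  connection: { host: "redis", port: 6379 },
});

export const router = Router();

// GET /api/import-logs: newest import runs first
router.get("/import-logs", async (_req, res) => {
  const logs = await ImportLog.find().sort({ timestamp: -1 }).lean();
  res.json(logs);
});

// GET /api/enqueue-test-job: push a dummy payload onto the queue
router.get("/enqueue-test-job", async (_req, res) => {
  await importQueue.add("import-feed", {
    fileName: "manual-test",
    jobs: [{ title: "Test Job", company: "Example Co" }],
  });
  res.json({ queued: true });
});
```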
⏰ Cron Jobs
- Runs every hour automatically (a minimal scheduling sketch follows this list).
- Fetches data from all feed URLs in jobService.ts.
- Parses XML → JSON → normalized structure.
- Queues each batch for the worker to process.
- Logs results in the importlogs collection.
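A minimal node-cron sketch of the hourly schedule (the Jobicy URL is taken from the example log below; the full feed list lives in jobService.ts):

```ts
// cron sketch: run the fetch + enqueue pipeline at the top of every hour.
import cron from "node-cron";
import { fetchAndEnqueue } from "./jobService"; // assumed helper, see the sketch above

const FEED_URLS = [
  "https://jobicy.com/?feed=job_feed",
  // ...remaining feeds are configured in jobService.ts
];

cron.schedule("0 * * * *", async () => {
  for (const url of FEED_URLS) {
    try {
      await fetchAndEnqueue(url);
    } catch (err) {
      console.error(`Failed to enqueue feed ${url}`, err);
    }
  }
});
```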
🧱 MongoDB Collections:
| Collection | Description |
|---|---|
| jobs | Stores all imported job data |
| importlogs | Tracks each import run (total, new, updated, failed) |
🧾 Import Log Schema Example
{
"fileName": "https://jobicy.com/?feed=job_feed",
"totalFetched": 125,
"totalImported": 120,
"newJobs": 100,
"updatedJobs": 20,
"failedJobs": 5,
"failedReasons": [],
"timestamp": "2025-11-05T12:40:00.000Z"
}
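A Mongoose schema sketch for this collection, with field names mirrored from the example above (the repo's actual model may differ):

```ts
// importLog model sketch: fields match the example log document shown above.
import { Schema, model } from "mongoose";

const importLogSchema = new Schema({
  fileName: { type: String, required: true }, // source feed URL
  totalFetched: { type: Number, default: 0 },
  totalImported: { type: Number, default: 0 },
  newJobs: { type: Number, default: 0 },
  updatedJobs: { type: Number, default: 0 },
  failedJobs: { type: Number, default: 0 },
  failedReasons: { type: [String], default: [] },
  timestamp: { type: Date, default: Date.now },
});

// Mongoose pluralizes the model name, so documents land in the importlogs collection
export const ImportLog = model("ImportLog", importLogSchema);
```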
🧑💻 Development Notes
- Use npm run dev inside client or server for local debugging outside Docker.
- Redis + BullMQ workers handle concurrency automatically (concurrency: 3); a worker sketch follows these notes.
- You can test enqueuing manually via /api/enqueue-test-job.
- Logs are visible in the Import History UI in Next.js.
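A sketch of that worker with concurrency: 3 (the upsert key and model import are assumptions; the real worker lives in the server codebase):

```ts
// worker sketch: consume queued batches and upsert jobs into MongoDB.
import { Worker } from "bullmq";
import { Job as JobModel } from "../models/job"; // assumed Mongoose model

const worker = new Worker(
  "job-import",
  async (job) => {
    const { jobs } = job.data as { jobs: Array<Record<string, unknown>> };
    for (const item of jobs) {
      // Upsert by a stable key (e.g. the feed item's guid); chosen here for illustration
      await JobModel.updateOne(
        { sourceId: item.guid },
        { $set: item },
        { upsert: true }
      );
    }
  },
  {
    connection: { host: "redis", port: 6379 },
    concurrency: 3, // process up to three batches in parallel
  }
);

worker.on("failed", (job, err) => {
  console.error(`Import batch ${job?.id} failed:`, err.message);
});
```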
🧑💼 Author
Developed by: Jaya Dubey
Stack: MERN + Redis + BullMQ + Docker
Date: November 2025