DMARC Analyzer is a tool for processing and analyzing DMARC (Domain-based Message Authentication, Reporting, and Conformance) reports. It helps organizations monitor email authentication results and protect their domains from email spoofing and phishing attacks.
- Overview
- Prerequisites
- Environment Variables
- Development Setup
- AWS Service Configuration
- SQS Message Consumer
- Frontend Setup
- API Documentation
- Deployment
DMARC Analyzer processes DMARC aggregate reports that are stored in an S3 bucket. It parses these reports, extracts relevant information, and stores the data in a PostgreSQL database for analysis and visualization. The system supports both manual processing and automated processing via SQS message queues.
- Go 1.24 or later
- PostgreSQL 14 or later
- AWS account with S3 and SQS access
- Docker and Docker Compose (for containerized deployment)
The application requires the following environment variables:
# Database Configuration
DATABASE_URL=postgresql://postgres:postgres@localhost:5432/dmarcdb
# AWS Configuration
S3_BUCKET_NAME=your-dmarc-reports-bucket-name
SQS_QUEUE_URL=https://sqs.your-aws-region.amazonaws.com/your-aws-account-id/your-dmarc-reports-queue-name
AWS_ACCESS_KEY_ID=your-aws-access-key
AWS_SECRET_ACCESS_KEY=your-aws-secret-key
AWS_REGION=your-aws-region
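For reference, a minimal sketch of how these variables might be read and validated at startup (illustrative only; the configuration loading in the actual codebase may differ):

package main

import (
	"fmt"
	"log"
	"os"
)

// requireEnv returns the value of an environment variable and
// fails fast with a clear message when it is missing.
func requireEnv(name string) string {
	value := os.Getenv(name)
	if value == "" {
		log.Fatalf("missing required environment variable %s", name)
	}
	return value
}

func main() {
	dbURL := requireEnv("DATABASE_URL")
	bucket := requireEnv("S3_BUCKET_NAME")
	queueURL := requireEnv("SQS_QUEUE_URL")

	fmt.Println("database:", dbURL)
	fmt.Println("bucket:", bucket)
	fmt.Println("queue:", queueURL)
}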
git clone https://github.com/dmarc-analyzer/dmarc-analyzer.git
cd dmarc-analyzer
# Create the PostgreSQL database
createdb dmarcdb
# Apply the existing schema to your database
psql -d dmarcdb -f backend/schema.sql
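If you want to sanity-check the connection string before starting the server, a quick standalone check like the following works (a sketch using the pgx driver; it is not part of the repository, and the project itself may use a different driver):

package main

import (
	"context"
	"fmt"
	"log"
	"os"

	"github.com/jackc/pgx/v5"
)

func main() {
	ctx := context.Background()

	// DATABASE_URL should point at the dmarcdb database created above.
	conn, err := pgx.Connect(ctx, os.Getenv("DATABASE_URL"))
	if err != nil {
		log.Fatalf("connect failed: %v", err)
	}
	defer conn.Close(ctx)

	if err := conn.Ping(ctx); err != nil {
		log.Fatalf("ping failed: %v", err)
	}
	fmt.Println("database is reachable")
}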
This step is only necessary when you've modified the model classes and need to regenerate the schema.sql file.
# Generate a new database schema
dropdb --if-exists gen_sql && createdb gen_sql
go run ./backend/cmd/generate_sql.go
echo '-- Code generated by dmarc-analyzer generate_sql. DO NOT EDIT.' > backend/schema.sql
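# Dump the generated schema, stripping pg_dump comments, SET/SELECT statements, the public. schema prefix, and extra blank lines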
pg_dump -d gen_sql --schema-only --no-owner | sed '/^--/d' | sed '/^SET /d' | sed '/^SELECT /d' | sed 's/public\.//g' | sed -e :a -e '/^\n*$/{$d;N;ba' -e '}' -e 's/\n\n*/\n/' >> backend/schema.sql
dropdb --if-exists gen_sql
# Apply the newly generated schema to your database
psql -d dmarcdb -f backend/schema.sql
Create a .env file in the project root with the required environment variables as listed above.
# Start the server
go run ./backend/cmd/server/server.go
The server will start on port 6767 by default.
- Create an S3 bucket to store DMARC reports:
  - Sign in to the AWS Management Console
  - Navigate to S3 service
  - Click "Create bucket"
  - Enter a unique bucket name (e.g., your-org-name-dmarc-reports)
  - Choose your preferred region
  - Configure bucket settings as needed
  - Click "Create bucket"
- Configure S3 Bucket Policy for SES: After creating the bucket, you need to configure a bucket policy to allow the AWS SES service to write emails to the bucket:
  - Go to your S3 bucket → Permissions tab
  - Click "Edit" in the Bucket policy section
  - Add the following policy (replace your-aws-account-id and your-dmarc-reports-bucket-name with your actual values):
{ "Version": "2012-10-17", "Statement": [ { "Sid": "AllowSESToWriteEmails", "Effect": "Allow", "Principal": { "Service": "ses.amazonaws.com" }, "Action": [ "s3:PutObject" ], "Resource": "arn:aws:s3:::your-dmarc-reports-bucket-name/*", "Condition": { "StringEquals": { "aws:SourceAccount": "your-aws-account-id" } } } ] }
Important: Replace the following placeholders with your actual values:
- your-aws-account-id: Your AWS account ID
- your-dmarc-reports-bucket-name: Your S3 bucket name for DMARC reports
- Create an SQS queue:
  - Navigate to SQS service in AWS Console
  - Click "Create queue"
  - Choose "Standard queue"
  - Enter a queue name (e.g., dmarc-reports)
  - Configure queue settings:
    - Visibility timeout: 30 seconds (recommended)
    - Message retention period: 4 days (default)
    - Receive message wait time: 20 seconds (for long polling)
  - Click "Create queue"
- Configure SQS Queue Access Policy: After creating the queue, you need to configure an access policy to allow the AWS S3 service to send messages to the queue:
  - Go to your SQS queue → Permissions tab
  - Click "Edit" in the Access policy section
  - Replace the default policy with the following (replace your-aws-account-id, your-aws-region, and your-dmarc-reports-queue-name with your actual values):
{ "Version": "2012-10-17", "Statement": [ { "Effect": "Allow", "Principal": { "AWS": "arn:aws:iam::your-aws-account-id:root" }, "Action": "SQS:*", "Resource": "arn:aws:sqs:your-aws-region:your-aws-account-id:your-dmarc-reports-queue-name" }, { "Sid": "AllowS3ToSendMessages", "Effect": "Allow", "Principal": { "Service": "s3.amazonaws.com" }, "Action": "sqs:SendMessage", "Resource": "arn:aws:sqs:your-aws-region:your-aws-account-id:your-dmarc-reports-queue-name", "Condition": { "StringEquals": { "aws:SourceAccount": "your-aws-account-id" } } } ] }
Important: Replace the following placeholders with your actual values:
- your-aws-account-id: Your AWS account ID
- your-aws-region: Your AWS region (e.g., us-east-1, eu-west-1)
- your-dmarc-reports-bucket-name: Your S3 bucket name for DMARC reports (recommended format: your-org-name-dmarc-reports)
- your-dmarc-reports-queue-name: Your SQS queue name (recommended format: dmarc-reports)
After setting up both S3 bucket and SQS queue, you need to create an IAM user or role with permissions to access both services:
- Create IAM User or Role:
  - Navigate to IAM service in AWS Console
  - Create a new IAM user or role for the DMARC Analyzer application
- Attach IAM Policy: Create and attach the following policy that allows access to both S3 and SQS:
{ "Version": "2012-10-17", "Statement": [ { "Effect": "Allow", "Action": [ "s3:GetObject", "s3:ListBucket" ], "Resource": [ "arn:aws:s3:::your-dmarc-reports-bucket-name", "arn:aws:s3:::your-dmarc-reports-bucket-name/*" ] }, { "Effect": "Allow", "Action": [ "sqs:ReceiveMessage", "sqs:DeleteMessage", "sqs:GetQueueAttributes" ], "Resource": "arn:aws:sqs:your-aws-region:your-aws-account-id:your-dmarc-reports-queue-name" } ] }
Important: Replace the following placeholders with your actual values:
- your-aws-account-id: Your AWS account ID
- your-aws-region: Your AWS region (e.g., us-east-1, eu-west-1)
- your-dmarc-reports-bucket-name: Your S3 bucket name for DMARC reports
- your-dmarc-reports-queue-name: Your SQS queue name
- Obtain AWS credentials (Access Key ID and Secret Access Key) for the IAM user.
Important: Only the S3 delivery method is supported for DMARC report processing. Lambda functions cannot access the email attachments and body content, which are essential for parsing DMARC reports.
- Configure SES to receive emails:
  - Navigate to SES service in AWS Console
  - Go to "Email receiving" → "Rule sets"
  - Create a new rule set or use the default
  - Create a new rule:
    - Recipient: dmarc-reports@yourdomain.com
    - Action: Store in S3 bucket
    - S3 bucket: Select your DMARC reports bucket
    - S3 key prefix: dmarc-reports/ (optional)
- Configure S3 Event Notifications:
  - Go to your S3 bucket → Properties → Event notifications
  - Click "Create event notification"
  - Configure the event:
    - Event name: dmarc-email-uploaded
    - Event types: Select "All object create events"
    - Destination: SQS queue
    - SQS queue: Select your created SQS queue
  - Click "Save changes"
Note: Lambda functions are not suitable for this use case because they cannot access the email attachments and body content that contain the DMARC report data. The S3 delivery method preserves the complete email structure, allowing the DMARC Analyzer to extract and parse the XML reports from email attachments.
To receive DMARC reports, configure your domain's DMARC record:
_dmarc.example.com. IN TXT "v=DMARC1; p=none; rua=mailto:dmarc-reports@example.com;"
Make sure the email address in the rua field matches the recipient configured in SES.
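To confirm the record is actually published, you can query DNS directly. The following is a small standalone check (not part of the repository) using Go's standard net package; replace example.com with your own domain:

package main

import (
	"fmt"
	"net"
	"strings"
)

func main() {
	// DMARC policies live in a TXT record at _dmarc.<domain>.
	records, err := net.LookupTXT("_dmarc.example.com")
	if err != nil {
		fmt.Println("lookup failed:", err)
		return
	}
	for _, r := range records {
		if strings.HasPrefix(r, "v=DMARC1") {
			fmt.Println("DMARC record:", r)
		}
	}
}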
The DMARC Analyzer includes an SQS message consumer that automatically processes new DMARC reports as they arrive.
# Build the consumer
make build-consumer
# Run the consumer
make run-consumer
# Or build and run in one command
make run-consumer
# Build
cd backend
go build -o ../bin/consumer ./cmd/consumer
# Run
./bin/consumer
- Automatic message processing: Continuously polls SQS queue for new messages
- Duplicate detection: Prevents processing the same email multiple times
- Error handling: Graceful error handling with retry logic
- Graceful shutdown: Responds to SIGINT/SIGTERM signals
- Detailed logging: Comprehensive logging for monitoring and debugging
# S3 configuration
export S3_BUCKET_NAME="your-dmarc-reports-bucket-name"
# SQS configuration
export SQS_QUEUE_URL="https://sqs.your-aws-region.amazonaws.com/your-aws-account-id/your-dmarc-reports-queue-name"
# Database configuration
export DB_HOST="localhost"
export DB_PORT="5432"
export DB_USER="your_db_user"
export DB_PASSWORD="your_db_password"
export DB_NAME="your_db_name"
export DB_SSLMODE="disable"
- Email Reception: DMARC reports are sent to your configured email address
- S3 Storage: SES stores the email in your S3 bucket
- Event Trigger: S3 event notification sends a message to SQS
- Message Processing: The consumer picks up the message and processes the email
- Data Extraction: DMARC report data is extracted and parsed
- Database Storage: Results are stored in PostgreSQL database
- Message Cleanup: Successfully processed messages are deleted from the queue
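At its core, steps 4-7 are a receive/process/delete loop against SQS. The following is a rough sketch of that loop using aws-sdk-go-v2 (an illustration only; the actual consumer's parsing, duplicate detection, and database code lives in the repository):

package main

import (
	"context"
	"log"
	"os"
	"os/signal"
	"syscall"

	"github.com/aws/aws-sdk-go-v2/aws"
	"github.com/aws/aws-sdk-go-v2/config"
	"github.com/aws/aws-sdk-go-v2/service/sqs"
)

func main() {
	// Stop polling cleanly on SIGINT/SIGTERM (graceful shutdown).
	ctx, stop := signal.NotifyContext(context.Background(), syscall.SIGINT, syscall.SIGTERM)
	defer stop()

	queueURL := os.Getenv("SQS_QUEUE_URL")

	cfg, err := config.LoadDefaultConfig(ctx)
	if err != nil {
		log.Fatal(err)
	}
	client := sqs.NewFromConfig(cfg)

	for ctx.Err() == nil {
		// Long-poll for up to 20 seconds, matching the recommended queue setting.
		out, err := client.ReceiveMessage(ctx, &sqs.ReceiveMessageInput{
			QueueUrl:            aws.String(queueURL),
			MaxNumberOfMessages: 10,
			WaitTimeSeconds:     20,
		})
		if err != nil {
			log.Printf("receive failed: %v", err)
			continue
		}
		for _, msg := range out.Messages {
			// A real consumer downloads the referenced email from S3, checks for
			// duplicates, parses the DMARC report, and writes it to PostgreSQL.
			log.Printf("processing message %s", aws.ToString(msg.MessageId))

			// Delete only after successful processing so failed messages are retried.
			if _, err := client.DeleteMessage(ctx, &sqs.DeleteMessageInput{
				QueueUrl:      aws.String(queueURL),
				ReceiptHandle: msg.ReceiptHandle,
			}); err != nil {
				log.Printf("delete failed: %v", err)
			}
		}
	}
}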
To process existing DMARC reports in your S3 bucket:
go run ./backend/cmd/backfill/backfill.go
This command will scan your S3 bucket for DMARC reports, parse them, and store the data in the PostgreSQL database.
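Conceptually, the backfill walks the bucket and feeds every stored email through the same parsing pipeline the consumer uses. A rough illustration of that traversal with aws-sdk-go-v2 (not the actual backfill code):

package main

import (
	"context"
	"log"
	"os"

	"github.com/aws/aws-sdk-go-v2/aws"
	"github.com/aws/aws-sdk-go-v2/config"
	"github.com/aws/aws-sdk-go-v2/service/s3"
)

func main() {
	ctx := context.Background()
	bucket := os.Getenv("S3_BUCKET_NAME")

	cfg, err := config.LoadDefaultConfig(ctx)
	if err != nil {
		log.Fatal(err)
	}
	client := s3.NewFromConfig(cfg)

	// Paginate through every object in the bucket.
	p := s3.NewListObjectsV2Paginator(client, &s3.ListObjectsV2Input{
		Bucket: aws.String(bucket),
	})
	for p.HasMorePages() {
		page, err := p.NextPage(ctx)
		if err != nil {
			log.Fatal(err)
		}
		for _, obj := range page.Contents {
			// Each object is a stored email containing a DMARC report;
			// the real backfill downloads and parses it here.
			log.Printf("would process %s", aws.ToString(obj.Key))
		}
	}
}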
The DMARC Analyzer frontend is built with Angular. Follow these steps to set up and run the frontend application.
- Node.js 16 or later
- Yarn package manager
Navigate to the frontend directory and install the required dependencies:
cd frontend
yarn install
Start the development server:
yarn start
By default, this will start the Angular development server on port 4200. You can access the application at http://localhost:4200/.
To build the application for production:
yarn build
The build artifacts will be stored in the dist/ directory.
Execute unit tests:
yarn test
Run end-to-end tests:
yarn e2e
The frontend application is configured to connect to the backend API running on port 6767. If you need to change this configuration, update the environment files in src/environments/.
The DMARC Analyzer provides the following API endpoints:
curl http://127.0.0.1:6767/api/domains
Returns a list of all domains with DMARC reports.
curl "http://127.0.0.1:6767/api/domains/example.com/report?start=2024-10-10T00:00:00Z&end=2024-10-20T00:00:00Z"
Returns a summary of DMARC reports for the specified domain and date range.
curl "http://127.0.0.1:6767/api/domains/example.com/report/detail?start=2024-10-10T00:00:00Z&end=2024-10-20T00:00:00Z"
Returns detailed DMARC report information for the specified domain and date range.
curl "http://127.0.0.1:6767/api/domains/example.com/chart/dmarc?start=2024-10-10T00:00:00Z&end=2024-10-20T00:00:00Z"
Returns data for generating DMARC compliance charts for the specified domain and date range.
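The endpoints can be called from any HTTP client. For example, a small Go program querying the report summary endpoint (the response body is printed verbatim rather than decoded, since its exact shape is not documented above):

package main

import (
	"fmt"
	"io"
	"net/http"
	"net/url"
)

func main() {
	// Build the date-range query string for the summary endpoint.
	q := url.Values{}
	q.Set("start", "2024-10-10T00:00:00Z")
	q.Set("end", "2024-10-20T00:00:00Z")

	resp, err := http.Get("http://127.0.0.1:6767/api/domains/example.com/report?" + q.Encode())
	if err != nil {
		fmt.Println("request failed:", err)
		return
	}
	defer resp.Body.Close()

	body, _ := io.ReadAll(resp.Body)
	fmt.Println(resp.Status)
	fmt.Println(string(body))
}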
- Pull the pre-built Docker image from GitHub Container Registry:
docker pull ghcr.io/dmarc-analyzer/dmarc-analyzer:latest
- Create a docker-compose.yml file with the following content:
version: '3.8'

services:
  dmarc-analyzer:
    image: ghcr.io/dmarc-analyzer/dmarc-analyzer:latest
    ports:
      - "6767:6767"
    environment:
      - DATABASE_URL=postgresql://postgres:postgres@postgres:5432/dmarcdb
      - S3_BUCKET_NAME=${S3_BUCKET_NAME}
      - SQS_QUEUE_URL=${SQS_QUEUE_URL}
      - AWS_ACCESS_KEY_ID=${AWS_ACCESS_KEY_ID}
      - AWS_SECRET_ACCESS_KEY=${AWS_SECRET_ACCESS_KEY}
      - AWS_REGION=${AWS_REGION}
    depends_on:
      - postgres

  dmarc-consumer:
    image: ghcr.io/dmarc-analyzer/dmarc-analyzer:latest
    command: ./consumer
    environment:
      - S3_BUCKET_NAME=${S3_BUCKET_NAME}
      - SQS_QUEUE_URL=${SQS_QUEUE_URL}
      - DB_HOST=postgres
      - DB_PORT=5432
      - DB_USER=postgres
      - DB_PASSWORD=postgres
      - DB_NAME=dmarcdb
      - DB_SSLMODE=disable
      - AWS_ACCESS_KEY_ID=${AWS_ACCESS_KEY_ID}
      - AWS_SECRET_ACCESS_KEY=${AWS_SECRET_ACCESS_KEY}
      - AWS_REGION=${AWS_REGION}
    depends_on:
      - postgres
    restart: unless-stopped

  postgres:
    image: postgres:14
    ports:
      - "5432:5432"
    environment:
      - POSTGRES_USER=postgres
      - POSTGRES_PASSWORD=postgres
      - POSTGRES_DB=dmarcdb
    volumes:
      - postgres-data:/var/lib/postgresql/data
      - ./backend/schema.sql:/docker-entrypoint-initdb.d/schema.sql

volumes:
  postgres-data:
- Configure environment variables in a .env file in the project root:
# AWS Configuration
S3_BUCKET_NAME=your-dmarc-reports-bucket-name
SQS_QUEUE_URL=https://sqs.your-aws-region.amazonaws.com/your-aws-account-id/your-dmarc-reports-queue-name
AWS_ACCESS_KEY_ID=your-aws-access-key
AWS_SECRET_ACCESS_KEY=your-aws-secret-key
AWS_REGION=your-aws-region
- Start the containers:
docker-compose up -d
This will start the DMARC Analyzer server, SQS consumer, and PostgreSQL database in containers.
- Make sure Docker and Docker Compose are installed on your system.
- Create a docker-compose.yml file with the following content:
version: '3.8'

services:
  dmarc-analyzer:
    build:
      context: .
      dockerfile: Dockerfile
    ports:
      - "6767:6767"
    environment:
      - DATABASE_URL=postgresql://postgres:postgres@postgres:5432/dmarcdb
      - S3_BUCKET_NAME=${S3_BUCKET_NAME}
      - SQS_QUEUE_URL=${SQS_QUEUE_URL}
      - AWS_ACCESS_KEY_ID=${AWS_ACCESS_KEY_ID}
      - AWS_SECRET_ACCESS_KEY=${AWS_SECRET_ACCESS_KEY}
      - AWS_REGION=${AWS_REGION}
    depends_on:
      - postgres

  dmarc-consumer:
    build:
      context: .
      dockerfile: Dockerfile
    command: ./consumer
    environment:
      - S3_BUCKET_NAME=${S3_BUCKET_NAME}
      - SQS_QUEUE_URL=${SQS_QUEUE_URL}
      - DB_HOST=postgres
      - DB_PORT=5432
      - DB_USER=postgres
      - DB_PASSWORD=postgres
      - DB_NAME=dmarcdb
      - DB_SSLMODE=disable
      - AWS_ACCESS_KEY_ID=${AWS_ACCESS_KEY_ID}
      - AWS_SECRET_ACCESS_KEY=${AWS_SECRET_ACCESS_KEY}
      - AWS_REGION=${AWS_REGION}
    depends_on:
      - postgres
    restart: unless-stopped

  postgres:
    image: postgres:14
    ports:
      - "5432:5432"
    environment:
      - POSTGRES_USER=postgres
      - POSTGRES_PASSWORD=postgres
      - POSTGRES_DB=dmarcdb
    volumes:
      - postgres-data:/var/lib/postgresql/data
      - ./backend/schema.sql:/docker-entrypoint-initdb.d/schema.sql

volumes:
  postgres-data:
- Configure environment variables in a .env file in the project root.
- Build and start the containers:
docker-compose up -d --build
- Build the application and consumer:
# Build the main server
go build -o dmarc-server ./backend/cmd/server/server.go
# Build the SQS consumer
go build -o dmarc-consumer ./backend/cmd/consumer/consumer.go
- Set up the PostgreSQL database and apply the schema as described in the Development Setup section.
- Configure environment variables.
- Run the server:
./dmarc-server
- Run the consumer in a separate terminal:
./dmarc-consumer
See the LICENSE file for details.