This guide will help you get started with the FGResQ inference code.
First, clone the repository and install the required dependencies.
git clone https://github.com/sxfly99/FGRestore.git
cd FGRestore
pip install -r requirements.txt

You can download the pre-trained model weights from the following links: Download Weights (Google Drive) or (Baidu Netdisk).
Place the downloaded files in the weights directory.
FGResQ.pth: The main model for quality scoring and ranking.
Degradation.pth: The weights for the degradation-aware task branch.
Create the weights directory if it doesn't exist and place the files inside, so the repository layout looks like this:
FGRestore/
|-- weights/
| |-- FGResQ.pth
| |-- Degradation.pth
|-- model/
| |-- FGResQ.py
|-- requirements.txt
|-- README.md
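Before running inference, you can quickly confirm the weights are where the code expects them. A minimal sketch (this check is not part of the released code; the file names follow the layout above):

from pathlib import Path

# Expected weight files, matching the layout shown above
expected = [Path("weights/FGResQ.pth"), Path("weights/Degradation.pth")]
missing = [p for p in expected if not p.exists()]
if missing:
    raise FileNotFoundError(f"Missing weight files: {missing}")
print("All weight files found.")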
FGResQ provides two main functionalities: scoring a single image and comparing a pair of images.

First, import and initialize the FGResQ model.
from model.FGResQ import FGResQ
# Path to the main model weights
model_path = "weights/FGResQ.pth"
# Initialize the inference engine
model = FGResQ(model_path=model_path)

You can get a quality score for a single image. The score typically ranges from 0 to 1, where a higher score indicates better quality.
image_path = "path/to/your/image.jpg"
quality_score = model.predict_single(image_path)
print(f"The quality score for the image is: {quality_score:.4f}")You can also compare two images to determine which one has better quality.
You can also compare two images to determine which one has better quality.

image_path1 = "path/to/image1.jpg"
image_path2 = "path/to/image2.jpg"
comparison_result = model.predict_pair(image_path1, image_path2)
# The result includes a human-readable comparison and raw probabilities
print(f"Comparison: {comparison_result['comparison']}")
# Example output: "Comparison: Image 1 is better"
print(f"Raw output probabilities: {comparison_result['comparison_raw']}")
# Example output: "[0.8, 0.1, 0.1]" (Probabilities for Image1 > Image2, Image2 > Image1, Image1 ≈ Image2)If you find this work is useful, pleaes cite our paper!
If you find this work useful, please cite our paper:

@article{sheng2025fg,
  title={Fine-grained Image Quality Assessment for Perceptual Image Restoration},
  author={Sheng, Xiangfei and Pan, Xiaofeng and Yang, Zhichao and Chen, Pengfei and Li, Leida},
  journal={arXiv preprint arXiv:2508.14475},
  year={2025}
}