Work in progress...
Important
The package must be installed before running any of the steps below.
```bash
git clone https://github.com/NJUNLP/EAX.git
cd EAX
pip install -e ".[infer]" --no-build-isolation
```
Extra dependencies:
- `infer`: installs vllm for sampling.
- `eval`: installs comet, sacrebleu, and bleurt for evaluation; bleurt is also required for Reward Modeling.
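If you need both sampling and evaluation, the extras can be combined in a single install (a minimal example using the extras listed above):

```bash
pip install -e ".[infer,eval]" --no-build-isolation
```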
The pipeline includes the following steps:
- Supervised Fine-tuning: set up the translation model with supervised data.
- Reward Modeling: equip the SFT model with translation evaluation capabilities.
- x2x Optimization: optimize x2x translation with English-Anchored Generation and Evaluation.
Set up the FLORES-200 dataset:
```bash
wget https://tinyurl.com/flores200dataset -O flores200dataset.tar.gz
tar -xzvf flores200dataset.tar.gz
ls flores200_dataset
```
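As a quick sanity check, the extracted directory should contain one file per language and split. This assumes the standard FLORES-200 layout, with `dev/` and `devtest/` subdirectories and files named like `eng_Latn.devtest`:

```bash
ls flores200_dataset/devtest | head -n 3           # e.g. ace_Arab.devtest, ace_Latn.devtest, ...
wc -l flores200_dataset/devtest/eng_Latn.devtest   # devtest contains 1012 sentences per language
```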
Evaluate the model on the FLORES-200 dataset and log the results to wandb:
```bash
python3 eval/run_eval_flores.py \
    --model_path path/to/model \
    --model_name model_name_for_logging \
    --test_data_path flores200_dataset \
    --split devtest \
    --metrics ["bleurt","sacrebleu","comet"] \
    --bleurt_path BLEURT-20 \
    --comet_path wmt22-comet-da/checkpoints/model.ckpt \
    --log_to_wandb True
```
Tip
Set `--log_to_wandb False` if wandb is not available; the results will then be logged to the console.
Important
Do not evaluate our models on the FLORES-200 dev split, as it is included in the TowerBlocks dataset used for training.
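The `--bleurt_path` and `--comet_path` flags point to locally downloaded metric checkpoints. Below is a minimal sketch for obtaining them, assuming the metrics' standard distribution channels (this is not part of this repository's own instructions):

```bash
# BLEURT-20 checkpoint, distributed by the google-research/bleurt project
wget https://storage.googleapis.com/bleurt-oss-21/BLEURT-20.zip
unzip BLEURT-20.zip   # unpacks to a BLEURT-20/ directory

# wmt22-comet-da checkpoint, fetched through the comet library; the printed
# path ends in checkpoints/model.ckpt and can be passed to --comet_path
python3 -c "from comet import download_model; print(download_model('Unbabel/wmt22-comet-da'))"
```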
You can evaluate the model on a custom dataset by preparing the inference data in the following format:
```json
[
    {
        "src_lang": "en",
        "trg_lang": "zh",
        "src_text": "\"We now have 4-month-old mice that are non-diabetic that used to be diabetic,\" he added.",
        "trg_text": "他补充道:“我们现在有 4 个月大没有糖尿病的老鼠,但它们曾经得过该病。”"
    },
    ...
]
```
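A minimal sketch for producing such a file from line-aligned parallel text; the input file paths here are hypothetical placeholders, while the field names follow the example above:

```python
import json

# One sentence per line; source and target files are line-aligned (hypothetical paths).
with open("my_data/en.txt", encoding="utf-8") as f:
    src_lines = f.read().splitlines()
with open("my_data/zh.txt", encoding="utf-8") as f:
    trg_lines = f.read().splitlines()

records = [
    {"src_lang": "en", "trg_lang": "zh", "src_text": src, "trg_text": trg}
    for src, trg in zip(src_lines, trg_lines)
]

with open("infer_data.json", "w", encoding="utf-8") as f:
    json.dump(records, f, ensure_ascii=False, indent=2)
```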
Run the evaluation:
```bash
python3 eval/run_eval.py \
    --model_path path/to/model \
    --infer_data_path path/to/infer_data.json \
    --metrics ["bleurt","sacrebleu","comet"] \
    --bleurt_path BLEURT-20 \
    --comet_path wmt22-comet-da/checkpoints/model.ckpt \
    --log_to_wandb True \
    --config '{"model_name": "qwen7b_eax"}' # any info that you want to log to wandb
```
```bibtex
@misc{yang2025enanchoredx2xenglishanchoredoptimizationmanytomany,
      title={EnAnchored-X2X: English-Anchored Optimization for Many-to-Many Translation},
      author={Sen Yang and Yu Bao and Yu Lu and Jiajun Chen and Shujian Huang and Shanbo Cheng},
      year={2025},
      eprint={2509.19770},
      archivePrefix={arXiv},
      primaryClass={cs.CL},
      url={https://arxiv.org/abs/2509.19770},
}
```