
Conversation

@samgelman
Collaborator

Added a new inference script and notebook. The notebook shows how to run inference with our Lightning framework or how to load the model as a plain PyTorch module and use a custom inference loop. I removed the dependencies on, and references to, the metl-pretrained package for running inference, and updated the READMEs.
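For readers curious about the second option, here is a minimal sketch of what a custom PyTorch inference loop can look like. The checkpoint-loading helper is a placeholder, not this repository's actual API; see the new notebook for the real loading code.

```python
import torch

def load_model_from_checkpoint(ckpt_path: str) -> torch.nn.Module:
    # Placeholder: replace with the repository's model-loading utility
    # (e.g., the Lightning module's load_from_checkpoint or an equivalent
    # plain-PyTorch constructor plus load_state_dict call).
    raise NotImplementedError("replace with the repo's model-loading code")

def run_inference(model: torch.nn.Module, loader) -> torch.Tensor:
    """Simple batched inference loop that collects predictions on the CPU."""
    device = "cuda" if torch.cuda.is_available() else "cpu"
    model = model.to(device).eval()
    outputs = []
    with torch.no_grad():
        for batch in loader:  # batch: encoded variant sequences as tensors
            outputs.append(model(batch.to(device)).cpu())
    return torch.cat(outputs, dim=0)
```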

@agitter
Member

agitter commented May 20, 2025

@brycejoh16 you will be a potential user of this. Can you please give the initial review?

@brycejoh16
Collaborator

@agitter, I can do an initial review and give feedback.

Collaborator

@brycejoh16 left a comment

Looks good to me; we may just want to use a different parameter for determining source vs. target model, such as model_name or a finetuning argument.
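As an illustration of that suggestion, a hypothetical CLI could make the source/target distinction explicit with a dedicated argument. The argument names below are illustrative only, not the script's actual interface.

```python
import argparse

parser = argparse.ArgumentParser(
    description="METL inference (argument names illustrative only)")
parser.add_argument("--checkpoint", required=True,
                    help="path to the model checkpoint")
parser.add_argument("--model-name", default=None,
                    help="identifier of the pretrained source model, if applicable")
parser.add_argument("--finetuned", action="store_true",
                    help="set when the checkpoint is a finetuned target model")
args = parser.parse_args()

# Downstream code would branch on args.finetuned (or args.model_name)
# rather than inferring source vs. target from an unrelated parameter.
```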

@samgelman merged commit e6bb116 into main May 28, 2025
4 checks passed
@samgelman deleted the inference branch May 28, 2025 02:32