llmify

Tired of boring, deterministic function behavior? Enter llmify: an LLM-powered function executor! What could go wrong?

llmify replaces a function call with LLM inference: it sends the function's source code and arguments to an LLM, which then predicts what the output should be.

Caution

Please, for the love of all things good and holy, do not use this in any sort of production setting. This library should only be used for experimentation or prototyping.
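
Conceptually, the decorator works along these lines. The snippet below is a minimal, hypothetical sketch (not the library's actual implementation), and it assumes the OpenAI Python client and a placeholder model name; the packaged @llmify decorator handles the prompting and response handling for you:

import functools
import inspect

from openai import OpenAI

client = OpenAI()  # assumes OPENAI_API_KEY is set in the environment

def llmify_sketch(model="gpt-4o"):
    """Hypothetical stand-in for @llmify, shown only to illustrate the idea."""
    def decorator(func):
        source = inspect.getsource(func)  # capture the function's source code

        @functools.wraps(func)
        def wrapper(*args, **kwargs):
            # Ask the model to act as the function and produce its return value.
            prompt = (
                "Given this Python function and its arguments, reply with "
                "only the value it would return.\n\n"
                f"Function:\n{source}\n\n"
                f"args={args!r}, kwargs={kwargs!r}"
            )
            response = client.chat.completions.create(
                model=model,
                messages=[{"role": "user", "content": prompt}],
            )
            # The model's reply is returned as plain text; nothing guarantees
            # it matches what the real function would have produced.
            return response.choices[0].message.content

        return wrapper
    return decorator

In practice you never write any of this yourself; you just apply the packaged @llmify decorator as shown in the Quickstart below.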

📦 Installation

pip install llmify-decorator

💻 Quickstart

To use llmify, simply apply it as a decorator to a function like so:

from llmify import llmify

@llmify()
def add(a, b):
  return a + b

result = add(1, 2)
print(result) # Output: 3 (probably)

To change the model used for inference, pass in a value for the model keyword argument:

@llmify(model="gpt-5")
def add(a, b):
  ...

You can also use llmify on function stubs, so long as they have docstrings or comments:

@llmify()
def greet_user(name):
  """Greet the user in a friendly manner."""
  ...

greeting = greet_user("Mortimer")
print(greeting) # Output: "Hello, Mortimer!"
