`llmify` replaces a function call with LLM inference: it sends the function's source code and arguments to an LLM, which then predicts what the output should be.
> **Caution**
> Please, for the love of all things good and holy, do not use this in any sort of production setting. This library should only be used for experimentation or prototyping.
```shell
pip install llmify-decorator
```
To use `llmify`, apply it as a decorator to a function:
```python
from llmify import llmify

@llmify()
def add(a, b):
    return a + b

result = add(1, 2)
print(result)  # Output: 3 (probably)
```
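The call-replacement mechanism described above can be sketched with the standard library alone: capture the function's source, build a prompt from the source plus the call's arguments, and hand it to a model. This is a minimal sketch, not `llmify`'s actual internals; `llmify_sketch`, `infer`, and `fake_llm` are hypothetical names, and `fake_llm` stands in for a real LLM API call.

```python
import functools
import inspect

def llmify_sketch(infer):
    """Build a decorator that routes calls through `infer`.

    `infer` is any callable taking a prompt string and returning the
    model's predicted output; a real version would call an LLM API.
    """
    def decorator(func):
        # Grab the code the model is asked to "run"; fall back
        # gracefully when the source file is unavailable.
        try:
            source = inspect.getsource(func)
        except OSError:
            source = f"<source unavailable for {func.__name__}>"

        @functools.wraps(func)
        def wrapper(*args, **kwargs):
            prompt = (
                "Predict the return value of calling this Python function.\n"
                f"Source:\n{source}\n"
                f"Positional args: {args!r}\nKeyword args: {kwargs!r}"
            )
            return infer(prompt)

        return wrapper
    return decorator

# Stand-in "model" that just reports the prompt size instead of
# performing inference.
def fake_llm(prompt):
    return f"(model output for a {len(prompt)}-char prompt)"

@llmify_sketch(fake_llm)
def add(a, b):
    return a + b

print(add(1, 2))
```

Injecting the inference callable this way also makes the decorator easy to test without network access, which is one reason the stand-in pattern is worth keeping even in a toy.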
To change the model used for inference, pass a value for the `model` keyword argument:
```python
@llmify(model="gpt-5")
def add(a, b):
    ...
```
You can also use `llmify` on function stubs, so long as they have docstrings or comments:
```python
@llmify()
def greet_user(name):
    """Greet the user in a friendly manner."""
    ...

greeting = greet_user("Mortimer")
print(greeting)  # Output: "Hello, Mortimer!"
```
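A bare stub can work because the docstring travels with the function's source, so it is the specification a source-reading decorator would pass to the model. A quick check with the standard library (this assumes nothing about `llmify` itself):

```python
import inspect

def greet_user(name):
    """Greet the user in a friendly manner."""
    ...

# The docstring is the only behavioral description the stub carries,
# and inspect can recover it directly.
print(inspect.getdoc(greet_user))  # → Greet the user in a friendly manner.
```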