OLLAMA-CLI - Chat Completion with context



Get your GROQ_API_KEY from GroqCloud.

  • Set the key as an environment variable
export GROQ_API_KEY=<your_key>
  • Verify your key
echo $GROQ_API_KEY
  • Check your PATH
echo $PATH
  • Move the binary to a directory on your PATH
mv ./ollama ~/.local/bin
# ~/.local/bin is on my PATH
  • Use the CLI
ollama ask "your prompt..."
  • Context (see the sketch after this list)
ollama ask -c "prompt which needs context of previous chats..."
  • History
ollama history        # all history
ollama history -d 2   # past 2 days
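
How the 10-minute context window works internally isn't documented in this README; as a rough sketch in Go, assuming each prompt/response pair is stored with a timestamp (all type and function names below are hypothetical), context mode could filter history like this:

```go
package main

import (
	"fmt"
	"time"
)

// HistoryEntry is a hypothetical record of one prompt/response pair;
// the real storage format used by ollama-cli is not shown in this README.
type HistoryEntry struct {
	Prompt    string
	Response  string
	Timestamp time.Time
}

// recentContext keeps only entries newer than the given window
// (10 minutes for -c/--ctx), which could then be prepended to the new prompt.
func recentContext(history []HistoryEntry, window time.Duration) []HistoryEntry {
	cutoff := time.Now().Add(-window)
	var recent []HistoryEntry
	for _, e := range history {
		if e.Timestamp.After(cutoff) {
			recent = append(recent, e)
		}
	}
	return recent
}

func main() {
	history := []HistoryEntry{
		{Prompt: "old question", Timestamp: time.Now().Add(-30 * time.Minute)},
		{Prompt: "recent question", Timestamp: time.Now().Add(-2 * time.Minute)},
	}
	for _, e := range recentContext(history, 10*time.Minute) {
		fmt.Println(e.Prompt) // prints only "recent question"
	}
}
```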

>> ollama --help
CLI Wrapper for llama.

Usage:
  ollama [command]

Available Commands:
  ask         prompt the LLM
  completion  Generate the autocompletion script for the specified shell
  help        Help about any command
  history     Show the history of prompts

Flags:
  -h, --help     help for ollama
  -t, --toggle   Help message for toggle

Use "ollama [command] --help" for more information about a command.
>> ollama ask --help
prompt the LLM

Usage:
  ollama ask <message> [flags]

Flags:
  -c, --ctx    Use Context Mode. [Context window: 10 mins]
  -h, --help   help for ask
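
Under the hood, `ollama ask` presumably forwards the prompt to Groq's OpenAI-compatible chat-completions endpoint using the GROQ_API_KEY set earlier. Below is a minimal standalone Go sketch of such a request; the model name and exact request shape are assumptions, not taken from this repository.

```go
package main

import (
	"bytes"
	"encoding/json"
	"fmt"
	"net/http"
	"os"
)

func main() {
	// Build an OpenAI-compatible chat-completions request body.
	// The model name is an assumption; the CLI's actual choice may differ.
	payload, err := json.Marshal(map[string]any{
		"model": "llama3-8b-8192",
		"messages": []map[string]string{
			{"role": "user", "content": "your prompt..."},
		},
	})
	if err != nil {
		panic(err)
	}

	req, err := http.NewRequest(http.MethodPost,
		"https://api.groq.com/openai/v1/chat/completions",
		bytes.NewReader(payload))
	if err != nil {
		panic(err)
	}
	req.Header.Set("Authorization", "Bearer "+os.Getenv("GROQ_API_KEY"))
	req.Header.Set("Content-Type", "application/json")

	resp, err := http.DefaultClient.Do(req)
	if err != nil {
		panic(err)
	}
	defer resp.Body.Close()

	// Pull the assistant's reply out of the first choice.
	var out struct {
		Choices []struct {
			Message struct {
				Content string `json:"content"`
			} `json:"message"`
		} `json:"choices"`
	}
	if err := json.NewDecoder(resp.Body).Decode(&out); err != nil {
		panic(err)
	}
	if len(out.Choices) > 0 {
		fmt.Println(out.Choices[0].Message.Content)
	}
}
```

With GROQ_API_KEY exported, running this file prints a single completion, which is roughly what `ollama ask` wraps with argument parsing and history storage.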

About

Yet another LLM wrapper to use in the CLI.
