Enable AI inferencing on z/OS
Use the zopen package manager (QuickStart Guide) to install:

    zopen install llamacpp

Alternatively, to build from source:

- Clone the repository:

      git clone https://github.com/zopencommunity/llamacppport.git
      cd llamacppport

- Build using zopen:

      zopen build -vv

See the zopen porting guide for more details.
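Once the build or install completes, a quick inference run is a simple way to confirm the port works. The sketch below is a minimal example using the standard llama.cpp CLI; the model path is a hypothetical placeholder, and the exact binary location may differ depending on how zopen lays out the package.

```bash
# Minimal sketch, assuming llama-cli is on PATH after the zopen install
# and a GGUF model has already been downloaded to the placeholder path below.
MODEL="$HOME/models/model.Q4_K_M.gguf"   # hypothetical model file

# -m: model to load, -p: prompt text, -n: maximum tokens to generate
llama-cli -m "$MODEL" -p "Hello from z/OS" -n 64
```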
If an error is encountered in the ggml-cpu.cpp file while building (possibly related to pthread), run `zopen upgrade zoslib -y` and try building again.
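As a concrete illustration, the recovery sequence looks roughly like the following sketch; it assumes the build is rerun from the llamacppport checkout directory created in the clone step above.

```bash
# Sketch of the recovery steps if the build fails in ggml-cpu.cpp:
zopen upgrade zoslib -y   # upgrade zoslib (the failure is often pthread-related)
cd llamacppport           # assumed checkout directory from the clone step
zopen build -vv           # retry the build with verbose output
```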
Contributions are welcome! Please follow the zopen contribution guidelines.