- Cerebral Valley
- artemtkachuk.me
- @artGPT
- in/artemtkachuk
Starred repositories
- 📚 Freely available programming books
- AutoGPT is the vision of accessible AI for everyone, to use and to build on. Our mission is to provide the tools so that you can focus on what matters.
- 🤗 Transformers: the model-definition framework for state-of-the-art text, vision, audio, and multimodal machine learning models, for both inference and training.
- Tensors and Dynamic neural networks in Python with strong GPU acceleration
- Robust Speech Recognition via Large-Scale Weak Supervision
- 🌐 Make websites accessible for AI agents. Automate tasks online with ease.
- Drop in a screenshot and convert it to clean code (HTML/Tailwind/React/Vue)
- scikit-learn: machine learning in Python
- 💻 A fully functional local AWS cloud stack. Develop and test your cloud & Serverless apps offline
- 🌟 The Multi-Agent Framework: First AI Software Company, Towards Natural Language Programming
- Interact with your documents using the power of GPT, 100% privately, no data leaks
- The simplest, fastest repository for training/finetuning medium-sized GPTs.
- LlamaIndex is the leading framework for building LLM-powered agents over your data.
- Universal memory layer for AI Agents; Announcing OpenMemory MCP - local and secure memory management.
- Developer-first error tracking and performance monitoring
- Ray is an AI compute engine. Ray consists of a core distributed runtime and a set of AI Libraries for accelerating ML workloads.
- Opinionated RAG for integrating GenAI in your apps 🧠 Focus on your product rather than the RAG. Easy integration in existing products with customisation! Any LLM: GPT4, Groq, Llama. Any Vectorstore: …
- Federated query engine for AI - The only MCP Server you'll ever need
- ⚡ A Fast, Extensible Progress Bar for Python and CLI
- You like pytorch? You like micrograd? You love tinygrad! ❤️
- We have made you a wrapper you can't refuse
- Generative Models by Stability AI
- Graph Neural Network Library for PyTorch
- A minimal PyTorch re-implementation of the OpenAI GPT (Generative Pretrained Transformer) training
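
Several of the starred entries above (PyTorch, tqdm, and the minimal GPT training re-implementation) revolve around the same basic training-loop pattern: build a model, compute a loss, backpropagate, update the parameters, and report progress. The sketch below is a generic illustration of that pattern with a toy MLP and random data; it is not code from any of the listed repositories, and all names in it (the synthetic tensors, the toy model) are made up for the example.

```python
# Generic PyTorch training-loop sketch, with tqdm for progress reporting.
# Illustrative only; not taken from any starred repository.
import torch
import torch.nn as nn
from tqdm import tqdm

device = "cuda" if torch.cuda.is_available() else "cpu"

# Toy regression data: 1024 samples, 16 features.
x = torch.randn(1024, 16, device=device)
y = torch.randn(1024, 1, device=device)

model = nn.Sequential(nn.Linear(16, 64), nn.ReLU(), nn.Linear(64, 1)).to(device)
optimizer = torch.optim.AdamW(model.parameters(), lr=1e-3)
loss_fn = nn.MSELoss()

# tqdm wraps the loop to show a live progress bar in the terminal.
for step in tqdm(range(100), desc="training"):
    optimizer.zero_grad()
    loss = loss_fn(model(x), y)
    loss.backward()   # autograd computes gradients
    optimizer.step()  # AdamW updates the parameters
```

Roughly the same skeleton underlies larger training scripts; what changes is the model definition and the data pipeline rather than the loop itself.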