Couchbase - Bangalore, Karnataka, India
Stars
Zap file format compatible with a future version of Bleve
A modern text/numeric/geo-spatial/vector indexing library for go
Truly flash implementation of the DeBERTa disentangled attention mechanism.
🤗 Transformers: the model-definition framework for state-of-the-art machine learning models in text, vision, audio, and multimodal models, for both inference and training.
A simple and working implementation of ELECTRA, the fastest way to pretrain language models from scratch, in PyTorch
Train and Infer Powerful Sentence Embeddings with AnglE | 🔥 SOTA on STS and MTEB Leaderboard
Cookbook of recipes for using Couchbase Vector Search with different embedding and large language models
RAGEN leverages reinforcement learning to train LLM reasoning agents in interactive, stochastic environments.
Simple replication of [ColBERT-v1](https://arxiv.org/abs/2004.12832).
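ColBERT's core idea is "late interaction" (MaxSim) scoring: each query token embedding is matched against its best-matching document token embedding, and the per-token maxima are summed. A minimal sketch, using toy 2-d embeddings and illustrative names (`maxsim_score` is not from the repository itself):

```python
def dot(u, v):
    """Dot product of two same-length vectors."""
    return sum(a * b for a, b in zip(u, v))

def maxsim_score(query_embs, doc_embs):
    """ColBERT-style MaxSim: for each query token, take the maximum
    similarity against all document tokens, then sum over query tokens."""
    return sum(max(dot(q, d) for d in doc_embs) for q in query_embs)

# Toy 2-d token embeddings (real ColBERT uses normalized BERT outputs)
query = [[1.0, 0.0], [0.0, 1.0]]
doc = [[0.9, 0.1], [0.2, 0.8]]
print(maxsim_score(query, doc))  # approximately 1.7 (0.9 + 0.8)
```

Because the document-side embeddings need no interaction with the query until this final cheap step, they can be precomputed and indexed offline, which is what makes ColBERT efficient at retrieval time.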
Convert pretrained RoBERTa models to various long-document transformer models
Fast inference engine for Transformer models
A collection of resources to study Transformers in depth.
List of Computer Science courses with video lectures.