- EURECOM
- Sophia Antipolis
- http://www.eurecom.fr/~troncy/
- https://orcid.org/0000-0003-0457-1436
Stars
🤗 Transformers: the model-definition framework for state-of-the-art machine learning models in text, vision, audio, and multimodal domains, for both inference and training.
🦜🔗 The platform for reliable agents.
Python tool for converting files and office documents to Markdown.
A curated list of awesome Machine Learning frameworks, libraries and software.
TensorFlow code and pre-trained models for BERT
Deep Learning papers reading roadmap for anyone who is eager to learn this amazing tech!
Fully open reproduction of DeepSeek-R1
An open-source RAG-based tool for chatting with your documents.
Repository to track the progress in Natural Language Processing (NLP), including the datasets and the current state-of-the-art for the most common NLP tasks.
🤗 The largest hub of ready-to-use datasets for AI models with fast, easy-to-use and efficient data manipulation tools
State-of-the-Art Text Embeddings
Annotate better with CVAT, the industry-leading data engine for machine learning. Used and trusted by teams at any scale, for data of any scale.
A very simple framework for state-of-the-art Natural Language Processing (NLP)
Scalable embedding, reasoning, ranking for images and sentences with CLIP
All-in-one open-source AI framework for semantic search, LLM orchestration and language model workflows
Supercharge Your LLM Application Evaluations
Implementation of DALL-E 2, OpenAI's updated text-to-image synthesis neural network, in PyTorch
A framework for few-shot evaluation of language models.
Open source annotation tool for machine learning practitioners.
TensorFlow Neural Machine Translation Tutorial
The open source platform for AI-native application development.
A collection of important graph embedding, classification and representation learning papers with implementations.
Simple and ready-to-use tutorials for TensorFlow
RAG that intelligently adapts to your use case, data, and queries
A library for Multilingual Unsupervised or Supervised word Embeddings
Pretrained language model and its related optimization techniques developed by Huawei Noah's Ark Lab.
Optimizing inference proxy for LLMs