Carnegie Mellon University, Neuroscience Institute
www.juliendirani.com
Stars
A Python Toolbox for Multimode Neural Data Representation Analysis
Representational Similarity Analysis on MEG and EEG data
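The two entries above both center on representational similarity analysis (RSA). As a rough, generic illustration of the core computation they build on (not using either toolbox), the sketch below forms representational dissimilarity matrices (RDMs) from simulated data and compares them with a Spearman correlation; the array shapes and random data are placeholders.

```python
# Minimal, generic RSA sketch: build RDMs from two sets of condition
# patterns (e.g. MEG sensor data vs. model features) and compare them.
# The data here are random placeholders, not real recordings.
import numpy as np
from scipy.spatial.distance import pdist, squareform
from scipy.stats import spearmanr

rng = np.random.default_rng(0)
n_conditions, n_sensors, n_features = 20, 64, 300

brain_patterns = rng.standard_normal((n_conditions, n_sensors))   # stand-in for MEG/EEG patterns
model_patterns = rng.standard_normal((n_conditions, n_features))  # stand-in for model embeddings

# RDM: pairwise correlation distance between condition patterns.
brain_rdm = squareform(pdist(brain_patterns, metric="correlation"))
model_rdm = squareform(pdist(model_patterns, metric="correlation"))

# Compare the upper triangles of the two RDMs.
triu = np.triu_indices(n_conditions, k=1)
rho, p = spearmanr(brain_rdm[triu], model_rdm[triu])
print(f"RDM similarity: Spearman rho = {rho:.3f} (p = {p:.3f})")
```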
♞ lichess.org: the forever free, adless and open source chess server ♞
🤗 Transformers: the model-definition framework for state-of-the-art machine learning models across text, vision, audio, and multimodal tasks, for both inference and training.
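As a minimal usage sketch of the Transformers pipeline API (the default checkpoint for the task is downloaded on first use, and exact scores will vary):

```python
# Minimal sketch of the 🤗 Transformers pipeline API for text classification.
from transformers import pipeline

classifier = pipeline("sentiment-analysis")
result = classifier("Representational similarity analysis is surprisingly fun.")
print(result)  # roughly: [{'label': 'POSITIVE', 'score': 0.99...}]
```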
Code associated with the paper "Inducing brain-relevant bias in natural language processing models" in the proceedings of the 33rd Conference on Neural Information Processing Systems (NeurIPS 2019)…
hULMonA (حلمنا, "our dream"): the first Universal Language Model in Arabic
AraVec is a pre-trained distributed word representation (word embedding) open-source project that aims to provide the Arabic NLP research community with free-to-use and powerful word embedding models.
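Since the released AraVec embeddings are gensim Word2Vec models, a minimal loading sketch might look like the following; the file path is hypothetical and should be replaced with the model actually downloaded.

```python
# Hedged sketch: load a pre-trained AraVec-style Word2Vec model with gensim
# and query nearest neighbours. The path below is a placeholder, not a real
# AraVec release name.
from gensim.models import Word2Vec

model = Word2Vec.load("path/to/aravec_model.mdl")  # hypothetical path
print(model.wv.most_similar("كتاب", topn=5))  # nearest neighbours of "book"
```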
Pre-trained Transformers for Arabic Language Understanding and Generation (Arabic BERT, Arabic GPT2, Arabic ELECTRA)
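A minimal sketch of pulling one of these checkpoints through the 🤗 Transformers auto classes follows; the Hub identifier is an assumption (swap in the checkpoint you actually want), and PyTorch is assumed to be installed.

```python
# Hedged sketch: load a pretrained Arabic BERT via the Transformers auto
# classes and embed a short sentence. The Hub id below is an assumption;
# replace it with the intended checkpoint. Requires PyTorch.
from transformers import AutoModel, AutoTokenizer

model_name = "aubmindlab/bert-base-arabertv2"  # assumed Hub identifier
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModel.from_pretrained(model_name)

inputs = tokenizer("اللغة العربية جميلة", return_tensors="pt")  # "Arabic is beautiful"
outputs = model(**inputs)
print(outputs.last_hidden_state.shape)  # (batch, sequence_length, hidden_size)
```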