A small language model trained on famous literary works in Japanese.
Lightweight BERT-like model with Rotary Position Embeddings (RoPE), Grouped-Query Attention (GQA) for faster inference, and configurable sliding-window + global-token attention for long-context modeling. Includes weight-tied embeddings and RMSNorm.
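As a rough illustration of how two of those pieces fit together (this is not the repository's actual code), here is a minimal PyTorch sketch of grouped-query attention with rotary position embeddings; all dimensions, names, and hyperparameters are assumptions, and the sliding-window/global-token masking is omitted for brevity:

```python
# Minimal sketch (illustrative, not the repo's code) of Grouped-Query
# Attention combined with Rotary Position Embeddings in PyTorch.
import torch
import torch.nn.functional as F
from torch import nn


def apply_rope(x: torch.Tensor, base: float = 10000.0) -> torch.Tensor:
    """Rotate channel pairs by position-dependent angles (RoPE)."""
    b, h, t, d = x.shape
    half = d // 2
    freqs = base ** (-torch.arange(0, half, dtype=x.dtype, device=x.device) / half)
    angles = torch.arange(t, dtype=x.dtype, device=x.device)[:, None] * freqs[None, :]
    cos, sin = angles.cos(), angles.sin()
    x1, x2 = x[..., :half], x[..., half:]
    return torch.cat([x1 * cos - x2 * sin, x1 * sin + x2 * cos], dim=-1)


class GQAttention(nn.Module):
    """Grouped-query attention: many query heads share fewer K/V heads,
    which shrinks the K/V projections and speeds up inference."""

    def __init__(self, dim: int = 512, n_heads: int = 8, n_kv_heads: int = 2):
        super().__init__()
        assert n_heads % n_kv_heads == 0
        self.n_heads, self.n_kv_heads = n_heads, n_kv_heads
        self.head_dim = dim // n_heads
        self.q_proj = nn.Linear(dim, n_heads * self.head_dim, bias=False)
        self.kv_proj = nn.Linear(dim, 2 * n_kv_heads * self.head_dim, bias=False)
        self.out_proj = nn.Linear(dim, dim, bias=False)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        b, t, _ = x.shape
        q = self.q_proj(x).view(b, t, self.n_heads, self.head_dim).transpose(1, 2)
        k, v = self.kv_proj(x).chunk(2, dim=-1)
        k = k.view(b, t, self.n_kv_heads, self.head_dim).transpose(1, 2)
        v = v.view(b, t, self.n_kv_heads, self.head_dim).transpose(1, 2)
        q, k = apply_rope(q), apply_rope(k)
        # Repeat the shared K/V heads so each group of query heads sees them.
        rep = self.n_heads // self.n_kv_heads
        k = k.repeat_interleave(rep, dim=1)
        v = v.repeat_interleave(rep, dim=1)
        out = F.scaled_dot_product_attention(q, k, v)  # bidirectional, BERT-style
        return self.out_proj(out.transpose(1, 2).reshape(b, t, -1))


x = torch.randn(1, 16, 512)
print(GQAttention()(x).shape)  # torch.Size([1, 16, 512])
```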
A web application that uses artificial intelligence to classify emails into predefined categories and to suggest automatic replies based on the resulting classification.
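The project's stack isn't specified here, so purely as an illustration of the classification step, a sketch using the Hugging Face `transformers` zero-shot pipeline (the model and category labels below are assumptions, not the app's):

```python
# Illustrative only: classify an email into predefined categories with a
# zero-shot pipeline; the repo's actual approach may differ.
from transformers import pipeline

classifier = pipeline("zero-shot-classification", model="facebook/bart-large-mnli")

email = "Hi, I was charged twice for my subscription this month."
labels = ["billing", "technical support", "sales", "spam"]  # hypothetical categories

result = classifier(email, candidate_labels=labels)
print(result["labels"][0])  # highest-scoring category, e.g. "billing"
```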
This project focuses on NLP, producing a summary of an entire conversation using the google/pegasus model.
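For context, summarizing a conversation with a PEGASUS checkpoint can look like the following sketch; the exact checkpoint the project uses isn't stated, so google/pegasus-xsum is an assumed example from the same model family:

```python
# Minimal sketch of conversation summarization with a PEGASUS checkpoint.
from transformers import pipeline

summarizer = pipeline("summarization", model="google/pegasus-xsum")

conversation = (
    "Alice: The deploy failed again last night. "
    "Bob: Looks like the config change broke the health check. "
    "Alice: Let's roll back and patch it tomorrow morning."
)
print(summarizer(conversation, max_length=40)[0]["summary_text"])
```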
An unofficial Hugging Face wrapper for MedicalNet, for 3D medical image classification.