Stars
🤗 Transformers: the model-definition framework for state-of-the-art machine learning models in text, vision, audio, and multimodal domains, for both inference and training.
Secure and fast microVMs for serverless computing.
Curated list of Go design patterns, recipes and idioms
Node.js extension host for Vim & Neovim; load extensions like VSCode and host language servers.
eBPF-based Networking, Security, and Observability
BCC - Tools for BPF-based Linux IO analysis, networking, monitoring, and more
Concurrent, cache-efficient, and Dockerfile-agnostic builder toolkit
Open Source Neural Machine Translation and (Large) Language Models in PyTorch
Kata Containers is an open source project and community working to build a standard implementation of lightweight Virtual Machines (VMs) that feel and perform like containers, but provide the workload isolation and security advantages of VMs.
Tutorials on implementing a few sequence-to-sequence (seq2seq) models with PyTorch and TorchText.
Unsupervised Word Segmentation for Neural Machine Translation and Text Generation
Kata Containers version 1.x runtime (for version 2.x see https://github.com/kata-containers/kata-containers).
A project built on VirtualBox and OpenWrt, aiming to provide a transparent proxy on macOS / Windows.
Code snippets from the O'Reilly book
PyTorch implementation of "Get To The Point: Summarization with Pointer-Generator Networks"
A PyTorch implementation of "Attention Is All You Need" and "Weighted Transformer Network for Machine Translation"
A Structured Self-attentive Sentence Embedding