@NL2G Universität Mannheim
Germany
Starred repositories
thunlp / Seq1F1B
Forked from NVIDIA/Megatron-LM
Sequence-level 1F1B schedule for LLMs.
deep-spin / Megatron-LM-pretrain
Forked from NVIDIA/Megatron-LM
Ongoing research training transformer models at scale.
RooCodeInc / Roo-Code
Forked from cline/cline
Roo Code gives you a whole dev team of AI agents in your code editor.
NL2G / effeval
Forked from jbgruenwald/efficient-nlg-metrics
Code for "EffEval: A Comprehensive Evaluation of Efficiency for MT Evaluation Metrics".
facebookexperimental / fb-vscode
Forked from microsoft/vscode
Visual Studio Code
shawwn / llama
Forked from meta-llama/llama
Inference code for LLaMA models.
cimeister / typical-sampling
Forked from huggingface/transformers
🤗 Transformers: State-of-the-art Machine Learning for PyTorch, TensorFlow, and JAX.
cgnorthcutt / cleanlab
Forked from cleanlab/cleanlab
The official cleanlab repo is at https://github.com/cleanlab/cleanlab.
Ding-Liu / NLRN
Forked from ychfan/tf_estimator_barebone
Code for "Non-Local Recurrent Network for Image Restoration" (NeurIPS 2018).
GeorgeFedoseev / DeepSpeech
Forked from mozilla/DeepSpeech
Russian speech recognition system based on Mozilla's DeepSpeech TensorFlow implementation.
tchewik / isanlp_qbic
Forked from DanAnastasyev/GramEval2020
An isanlp wrapper for the winning solution from the GramEval-2020 shared task (morphology, lemmatization, and UD syntax parsing for Russian).
Rapid research framework for PyTorch. The researcher's version of Keras.
shawwn / gpt-2
Forked from nshepperd/gpt-2
Code for the paper "Language Models are Unsupervised Multitask Learners".