University of Delaware
- Newark, DE
- https://www.linkedin.com/in/logan-hallee/
- https://orcid.org/0000-0002-0426-3508
- @Logan_Hallee
- https://www.synthyra.com/
- https://www.gleghornlab.com/
Stars
🤗 Transformers: the model-definition framework for state-of-the-art machine learning models for text, vision, audio, and multimodal tasks, covering both inference and training.
Unified Efficient Fine-Tuning of 100+ LLMs & VLMs (ACL 2024)
Jan is an open source alternative to ChatGPT that runs 100% offline on your computer.
CLIP (Contrastive Language-Image Pretraining): predict the most relevant text snippet given an image.
You like pytorch? You like micrograd? You love tinygrad! ❤️
A generative world for general-purpose robotics & embodied AI learning.
An LLM agent that conducts deep research (local and web) on any given topic and generates a long report with citations.
Large-scale Self-supervised Pre-training Across Tasks, Languages, and Modalities
Fast and memory-efficient exact attention
Luigi is a Python module that helps you build complex pipelines of batch jobs. It handles dependency resolution, workflow management, visualization, and more, and comes with built-in Hadoop support.
An awesome README template to jumpstart your projects!
Open source code for AlphaFold 2.
Minimal reproduction of DeepSeek R1-Zero
This repository contains demos I made with the Transformers library by HuggingFace.
Implementation of Denoising Diffusion Probabilistic Models in PyTorch
Flexible and powerful tensor operations for readable and reliable code (for PyTorch, JAX, TensorFlow, and others)
A minimal GPU design in Verilog to learn how GPUs work from the ground up
BertViz: Visualize Attention in NLP Models (BERT, GPT2, BART, etc.)
Friends don't let friends make certain types of data visualization: what are they, and why are they bad?
Google AI 2018 BERT PyTorch implementation
Code for the paper "Exploring the Limits of Transfer Learning with a Unified Text-to-Text Transformer"
Transformer-related optimizations, including BERT and GPT