Today
Production-ready platform for agentic workflow development. - langgenius/dify
Easy Data Preparation with latest LLMs-based Operators and Pipelines. - OpenDCAI/DataFlow
karpathy/hn-time-capsule: Analyzing Hacker News discussions from a decade ago in hindsight with LLMs
A Hacker News time capsule project that pulls the HN frontpage from exactly 10 years ago, analyzes articles and discussions using an LLM to evaluate prescience with the benefit of hindsight, and generates an HTML report.
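A minimal sketch of the "exactly 10 years ago" step, assuming HN's public `/front?day=` endpoint for historical front pages; the helper name and leap-day handling are illustrative, not the project's actual code.

```python
from datetime import date

def frontpage_url_ten_years_ago(today: date) -> str:
    """Build the URL for the HN front page from exactly ten years ago."""
    try:
        target = today.replace(year=today.year - 10)
    except ValueError:
        # Feb 29 has no counterpart in a non-leap year; fall back to Feb 28.
        target = today.replace(year=today.year - 10, day=28)
    # HN exposes historical front pages via the /front?day=YYYY-MM-DD endpoint.
    return f"https://news.ycombinator.com/front?day={target.isoformat()}"

print(frontpage_url_ten_years_ago(date(2025, 11, 29)))
```

From there, the fetched articles and comment threads would be fed to an LLM for the hindsight analysis and rendered into the HTML report.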
A Model Context Protocol (MCP) server that facilitates structured, progressive thinking through defined stages. This tool helps break down complex problems into sequential thoughts, track the progression of your thinking process, and generate summaries.
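The staged-thinking idea can be sketched without MCP plumbing: thoughts accumulate under named stages and a summary replays the progression. The stage names and class here are assumptions for illustration, not the server's actual schema.

```python
from dataclasses import dataclass, field
from enum import Enum

class Stage(Enum):
    # Hypothetical stage names; the actual server defines its own set.
    PROBLEM_DEFINITION = "problem definition"
    ANALYSIS = "analysis"
    SYNTHESIS = "synthesis"
    CONCLUSION = "conclusion"

@dataclass
class ThinkingSession:
    thoughts: list = field(default_factory=list)

    def add_thought(self, stage: Stage, text: str) -> None:
        # Track each sequential thought together with its stage.
        self.thoughts.append((stage, text))

    def summary(self) -> str:
        # Replay the progression of the thinking process, stage by stage.
        return "\n".join(f"[{s.value}] {t}" for s, t in self.thoughts)

session = ThinkingSession()
session.add_thought(Stage.PROBLEM_DEFINITION, "Clarify what the user is asking.")
session.add_thought(Stage.ANALYSIS, "Break the request into sub-problems.")
print(session.summary())
```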
Earnings Call Civilizational Score Prompt (GitHub Gist).
You are a financial historian and industry expert conducting a review of past earnings calls with the benefit of hindsight and wisdom.
29 Nov 25
Course to get into Large Language Models (LLMs) with roadmaps and Colab notebooks.
mlabonne.github.io/blog/
27 Nov 25
Verbalized Sampling, a training-free prompting strategy to mitigate mode collapse in LLMs by requesting responses with probabilities. Achieves 2-3x diversity improvement while maintaining quality. Model-agnostic framework with CLI/API for creative writing, synthetic data generation, and dialogue simulation. - CHATS-lab/verbalized-sampling
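The core pattern is a prompt change, not a training change: ask the model for several candidates each tagged with a self-reported probability, then sample across them. This sketch mocks the model reply; the function names and JSON shape are assumptions, not the CHATS-lab package's actual API.

```python
import json

def verbalized_sampling_prompt(task: str, k: int = 5) -> str:
    # Request k distinct candidates with verbalized probabilities,
    # instead of a single modal (mode-collapsed) answer.
    return (
        f"{task}\n\n"
        f"Generate {k} diverse responses. Return a JSON list of objects "
        f'with "text" and "probability" fields, where each probability '
        f"reflects how likely that response would be."
    )

def pick_diverse(raw_json: str, threshold: float = 0.05) -> list[str]:
    # Keep tail candidates too -- that is where the diversity gain
    # over greedy decoding comes from.
    candidates = json.loads(raw_json)
    return [c["text"] for c in candidates if c["probability"] >= threshold]

# Example with a mocked model reply:
reply = (
    '[{"text": "Once upon a time...", "probability": 0.4},'
    ' {"text": "The rain had not stopped...", "probability": 0.1}]'
)
print(pick_diverse(reply))
```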
Open-Source Memory Engine for LLMs, AI Agents
What is Memori
Memori enables any LLM to remember conversations, learn from interactions, and maintain context across sessions with a single line: memori.enable(). Memory is stored in standard SQL databases (SQLite, PostgreSQL, MySQL) that you fully own and control.
Why Memori?
One-line integration - Works with OpenAI, Anthropic, LiteLLM, LangChain, and any LLM framework
SQL-native storage - Portable, queryable, and auditable memory in databases you control
80-90% cost savings - No expensive vector databases required
Zero vendor lock-in - Export your memory as SQLite and move anywhere
Intelligent memory - Automatic entity extraction, relationship mapping, and context prioritization
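The SQL-native idea can be shown with nothing but sqlite3: memory is ordinary rows you can query and audit. The schema and function names below are illustrative, not Memori's actual ones.

```python
import sqlite3

# Memory lives in a plain SQL database you own (here, in-memory SQLite).
conn = sqlite3.connect(":memory:")
conn.execute(
    "CREATE TABLE memory (session_id TEXT, role TEXT, content TEXT, "
    "created_at TEXT DEFAULT CURRENT_TIMESTAMP)"
)

def remember(session_id: str, role: str, content: str) -> None:
    conn.execute(
        "INSERT INTO memory (session_id, role, content) VALUES (?, ?, ?)",
        (session_id, role, content),
    )

def recall(session_id: str) -> list[tuple[str, str]]:
    # Queryable and auditable with ordinary SQL -- no vector store needed.
    rows = conn.execute(
        "SELECT role, content FROM memory WHERE session_id = ?", (session_id,)
    )
    return rows.fetchall()

remember("s1", "user", "My name is Ada.")
remember("s1", "assistant", "Nice to meet you, Ada.")
print(recall("s1"))
```

Because the store is a standard SQLite file in real use, exporting it is just copying the database, which is the zero-lock-in claim in concrete terms.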
Fara-7B is Microsoft’s first agentic small language model (SLM) designed specifically for computer use.
With only 7 billion parameters, Fara-7B is an ultra-compact Computer Use Agent (CUA) that achieves state-of-the-art performance within its size class and is competitive with larger, more resource-intensive agentic systems.
AI-Powered Data Processing: Use LOTUS to process all of your datasets with LLMs and embeddings. Enjoy up to 1000x speedups with fast, accurate query processing that's as simple as writing Pandas code - lotus-data/lotus
LOTUS is an open-source query engine that makes programming as easy as writing Pandas and optimizes your programs for up to 400x speedups.
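The "Pandas-like semantic operator" idea can be sketched with a stubbed LLM predicate over plain dicts; LOTUS's actual API operates on DataFrames and calls a real model, so everything below is an assumption-laden illustration of the pattern, not the library.

```python
# A stubbed "LLM" judge stands in for a real model call.
def fake_llm_judge(claim: str) -> bool:
    # Assumption: pretend the model flags rows mentioning "GPU".
    return "GPU" in claim

def sem_filter(rows: list[dict], column: str, judge) -> list[dict]:
    # Semantic filter: keep rows where the model answers "yes"
    # for the given column's text -- the shape of a LOTUS-style operator.
    return [row for row in rows if judge(row[column])]

papers = [
    {"title": "Scaling laws on GPU clusters"},
    {"title": "A survey of sorting algorithms"},
]
print(sem_filter(papers, "title", fake_llm_judge))
```

The speedups the project advertises come from optimizing how such operators batch and order their model calls, which a sketch like this deliberately omits.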
28 Jan 24
There’s no question that, as AI has surged in popularity, we have entered an era where code lines are being added faster than ever before. The better question for 2024: who’s on the hook to clean up the mess afterward?