30 Sep 23
28 Sep 23
Donald Knuth's commentary on his experimentation with ChatGPT.
19 Sep 23
Ted Chiang on how artificial intelligence may strengthen capitalism by concentrating wealth and disempowering workers, and on possible alternatives.
18 Sep 23
On a dataset of programming challenges, GPT-4 solved 10 out of 10 problems that had been published before 2021 (GPT-4’s pre-training cutoff date) and zero out of 10 problems that had been published after 2021.
12 Sep 23
11 Sep 23
AI models were perplexed by a baby giraffe without spots. They’re perplexed by me, too.
07 Sep 23
Learn more about the current moment for language technology and how the UW's computational linguistics program prepares students to work in the field in this Q&A.
29 Aug 23
26 Aug 23
Dataherald is a natural language-to-SQL engine built for enterprise-level question answering over structured data. It allows you to set up an API from your database that can answer questions in plain English.
25 Aug 23
Get up and running with large language models, locally.
In this post, we’ll talk about why fine-tuning is probably not necessary for your app, and why applying two of the most common techniques to the base GPT models, few-shot prompting and retrieval-augmented generation (RAG), is sufficient for most use cases.
08 Jun 23
On YouTube you’ll find the most popular videos and tracks. You can also upload your own content and share it with friends or with the whole world.
07 Jun 23
13 May 23
ChatGPT and other AI applications such as Midjourney have pushed “Artificial Intelligence” high on the hype cycle. In this article, I want to focus specifically on the energy cost of training and using applications like ChatGPT, what their widespread adoption could mean for global CO₂ emissions, and what we could do to limit these emissions.
Key points
- Training of large AI models is not the problem
- Large-scale use of large AI models would be unsustainable
- Renewables are not making AI more sustainable
The enormous energy requirement of these brute force statistical models is due to the following attributes:
- Requires millions or billions of training examples
- Requires many training cycles
- Requires retraining when presented with new information
- Requires many weights and lots of multiplication
25 Apr 23
The author shares how he leveraged ChatGPT to solve some of the CTF hacking challenges at BSides 2023.
19 Apr 23
No matter what, though, I argue that “information retrieval” is, despite how LLMs are marketed, one of the least interesting and reliable things you can do with them. They are mostly interesting, as described above, as a “common wisdom” sampler for examining the common narratives that show up repeatedly in the corpus.