# text-generation

Here are 27 public repositories matching this topic...

This repository provides Jupyter notebooks to interact with Mistral Large Language Models (LLMs) for tasks including chatbot development, retrieval-augmented generation, and text generation. These notebooks are designed to help users leverage Mistral models in a range of applications, from conversational AI to content generation.

  • Updated Nov 14, 2024
  • Jupyter Notebook
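As a hedged illustration of the kind of chat call such notebooks make, here is a minimal sketch assuming the v1 `mistralai` Python SDK. The helper names, model alias, and client usage are my assumptions for illustration, not taken from the repository itself:

```python
# Sketch of a Mistral chat completion (assumes the v1 `mistralai` SDK:
# pip install mistralai). Helper names here are illustrative.
import os

def build_chat_messages(system_prompt, user_prompt):
    """Assemble the role/content message list used by chat-style LLM APIs."""
    return [
        {"role": "system", "content": system_prompt},
        {"role": "user", "content": user_prompt},
    ]

def ask_mistral(prompt):
    from mistralai import Mistral  # imported lazily; requires an API key
    client = Mistral(api_key=os.environ["MISTRAL_API_KEY"])
    response = client.chat.complete(
        model="mistral-large-latest",  # assumed model alias
        messages=build_chat_messages("You are a helpful assistant.", prompt),
    )
    return response.choices[0].message.content

if __name__ == "__main__" and os.environ.get("MISTRAL_API_KEY"):
    print(ask_mistral("Explain retrieval-augmented generation in one sentence."))
```

The message-building step is the same shape across most chat APIs; only the client call differs.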

This repository contains Jupyter notebooks for working with Anthropic Large Language Models (LLMs), providing tools to explore chat-based interactions, retrieval-augmented generation, and text generation. These notebooks serve as a practical introduction to leveraging Anthropic models for various applications.

  • Updated Nov 14, 2024
  • Jupyter Notebook
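For the retrieval-augmented generation side of such notebooks, a common pattern is to stuff retrieved passages into the prompt before calling the model. A minimal sketch, assuming the `anthropic` Python SDK (the prompt template and function names are mine, not the repository's):

```python
# Sketch of a RAG-style prompt plus an Anthropic Messages API call
# (assumes the `anthropic` SDK: pip install anthropic).
import os

def build_rag_prompt(question, passages):
    """Number the retrieved passages and prepend them to the question."""
    context = "\n\n".join(f"[{i + 1}] {p}" for i, p in enumerate(passages))
    return (
        "Answer using only the context below.\n\n"
        f"Context:\n{context}\n\n"
        f"Question: {question}"
    )

def ask_claude(prompt):
    import anthropic  # imported lazily; requires ANTHROPIC_API_KEY
    client = anthropic.Anthropic()
    message = client.messages.create(
        model="claude-3-5-sonnet-20241022",  # assumed model id
        max_tokens=512,
        messages=[{"role": "user", "content": prompt}],
    )
    return message.content[0].text

if __name__ == "__main__" and os.environ.get("ANTHROPIC_API_KEY"):
    passages = ["RAG retrieves documents before generation."]
    print(ask_claude(build_rag_prompt("What is RAG?", passages)))
```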

This repository contains Jupyter notebooks to explore and utilize OpenAI's Large Language Models (LLMs) for various applications, including chatbots, retrieval-augmented generation, text generation, prompt engineering, and vector embedding. These notebooks provide a comprehensive toolkit for working with OpenAI models in diverse contexts.

  • Updated Nov 14, 2024
  • Jupyter Notebook
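The vector-embedding workflow such notebooks cover usually boils down to embedding texts and comparing them by cosine similarity. A hedged sketch, assuming the `openai` Python SDK; the similarity helper is plain Python and the model name is an assumption:

```python
# Sketch of embedding-based similarity (assumes the `openai` SDK:
# pip install openai). cosine_similarity is pure Python.
import math
import os

def cosine_similarity(a, b):
    """Cosine of the angle between two equal-length vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(y * y for y in b))
    return dot / (norm_a * norm_b)

def embed(texts):
    from openai import OpenAI  # imported lazily; requires OPENAI_API_KEY
    client = OpenAI()
    resp = client.embeddings.create(
        model="text-embedding-3-small",  # assumed embedding model
        input=texts,
    )
    return [d.embedding for d in resp.data]

if __name__ == "__main__" and os.environ.get("OPENAI_API_KEY"):
    v1, v2 = embed(["a cat sat", "a feline rested"])
    print(cosine_similarity(v1, v2))
```

Nearest-neighbour retrieval over a corpus is then just ranking stored embeddings by this similarity score.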

In this notebook, I'll construct a character-level LSTM with PyTorch. The network will train character by character on some text, then generate new text character by character. As an example, I will train on Anna Karenina. This model will be able to generate new text based on the text from the book!

  • Updated Apr 20, 2020
  • Jupyter Notebook
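Before the LSTM sees any text, a notebook like this has to map characters to integer indices and (for a character-level model) one-hot encode them. A minimal sketch of that preprocessing in plain Python; the function names are mine, not the notebook's:

```python
# Character-level preprocessing for an LSTM: vocabulary, integer
# encoding, and one-hot vectors. Pure Python, no framework required.

def build_vocab(text):
    """Map each distinct character to an integer index, in sorted order."""
    chars = sorted(set(text))
    char2idx = {c: i for i, c in enumerate(chars)}
    return char2idx, chars

def encode(text, char2idx):
    """Turn a string into a list of integer character indices."""
    return [char2idx[c] for c in text]

def one_hot(indices, vocab_size):
    """One-hot encode a list of indices as vocab_size-length 0/1 vectors."""
    vectors = []
    for i in indices:
        v = [0] * vocab_size
        v[i] = 1
        vectors.append(v)
    return vectors
```

In the actual notebook these one-hot rows would become the input tensor fed to the LSTM at each time step, with the next character as the training target.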

Explore advanced neural networks for crafting captivating headlines! Compare LSTM πŸ”„ and Transformer πŸ”€ models through interactive notebooks πŸ““ and easy-to-use wrapper classes πŸ› οΈ. Ideal for content creators and data enthusiasts aiming to automate and enhance headline generation ✨.

  • Updated Dec 28, 2025
  • Jupyter Notebook

Text generation using a character-based RNN with LSTM cells. We will work with a dataset of Shakespeare's writing from Andrej Karpathy's The Unreasonable Effectiveness of Recurrent Neural Networks. Given a sequence of characters from this data ("Shakespear"), train a model to predict the next character in the sequence ("e"). Longer sequences of …

  • Updated Sep 28, 2020
  • Python
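The "given 'Shakespear', predict 'e'" setup above is a sliding window over the corpus: every length-N slice becomes an input and the character after it becomes the target. A minimal sketch of that pair construction (the helper name is mine):

```python
# Build (input_sequence, next_character) training pairs by sliding a
# fixed-length window over the text, as in next-character prediction.

def make_training_pairs(text, seq_len):
    """Return (window, next_char) pairs for every position in the text."""
    pairs = []
    for i in range(len(text) - seq_len):
        pairs.append((text[i:i + seq_len], text[i + seq_len]))
    return pairs
```

For example, `make_training_pairs("Shakespeare", 10)` yields the single pair `("Shakespear", "e")` from the description; over a full corpus this produces one training example per character position.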
