LitLum is a scientific publication monitoring and analysis application that tracks new publications from scientific journals, analyzes them with a local Ollama LLM, and generates relevance reports.
- Journal Monitoring using CrossRef: Track multiple scientific journal feeds
- LLM Integration: Use Ollama to analyze publication relevance and generate summaries
- SQLite Database: Store publications, analysis results, and daily reports
- Rich CLI Interface: User-friendly terminal interface with formatted output
- Static Web Interface: Browse reports and publications through a clean, modern web UI
- Customizable: Configure journals, LLM prompts, and more via YAML configuration
- Docker Support: Easy deployment using Docker containers
- Configuration Management: Default configuration with user overrides
- Python 3.8+
- Micromamba (for environment management)
- Ollama (for LLM integration)
- Create a new environment:

  ```bash
  micromamba create -n litlum python=3.10
  ```

- Install the package and dependencies:

  ```bash
  # Activate the environment (or use micromamba run -n litlum)
  micromamba run -n litlum pip install -e .
  ```

Make sure Ollama is installed and running on your system. By default, the application will try to connect to Ollama at http://localhost:11434.
To install Ollama, follow the instructions at: https://ollama.com/download
Pull the LLM model specified in your config (default is llama3.2):

```bash
ollama pull llama3.2
```
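If you want to confirm that Ollama is reachable and the model has been pulled, you can query its HTTP API from Python. A quick check, assuming the default host:

```python
import json
import urllib.request

# Ask the local Ollama server which models are available (default host).
with urllib.request.urlopen("http://localhost:11434/api/tags") as resp:
    models = [m["name"] for m in json.load(resp)["models"]]

print("Available Ollama models:", models)  # e.g. ['llama3.2:latest']
```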
First, activate the micromamba environment:

```bash
# Activate the litlum environment
micromamba activate litlum

# Or, if you prefer to use the environment just for a single command:
micromamba run -n litlum <command>
```

The application provides a command-line interface with several commands. You can run the application in multiple ways:
- Recommended: Using the `litlum` command (after installation with `pip install -e .`):

  ```bash
  # First activate the environment
  micromamba activate litlum

  # Then use the litlum command
  litlum --help  # Show help and available commands
  ```

  Note: The `litlum` command is only available when the environment is activated.
- Using Python's `-m` flag (works without activating the environment):

  ```bash
  python -m litlum --help
  ```
- Using Micromamba (works without activating the environment):

  ```bash
  micromamba run -n litlum litlum --help
  ```
```bash
# Fetch new publications from configured journal feeds
litlum fetch

# Analyze unprocessed publications
litlum analyze

# Generate and display a report for today
litlum report --generate

# Run the full pipeline (fetch, analyze, report)
litlum run

# Run the full pipeline (fetch, analyze, report) and serve the web interface
litlum run --serve
```

```bash
# List available reports
litlum list --reports
# List recent publications with minimum relevance score
litlum list --publications --days 7 --min-relevance 5
# Show details for a specific publication
litlum show <publication_id>
# Reanalyze publications from a specific date
litlum analyze --date 2025-05-30 --reanalyze
# Reset the database (useful for debugging)
litlum reset
# Reset without confirmation prompt
micromamba run -n litlum python -m litlum reset --force
```

This section demonstrates more complex workflows and advanced options for working with the LitLum application.
To view detailed information about a specific publication, including its abstract, relevance score, and LLM-generated summary:
```bash
# Show publication with ID 5
litlum show 5
```

Filter publications by date range and minimum relevance score:
```bash
# List publications from the last 14 days with relevance score of at least 6
python -m litlum list --publications --days 14 --min-relevance 6
```
Reanalyze publications if you've changed your interests or LLM configuration:
```bash
# Reanalyze all publications
python -m litlum analyze --reanalyze

# Reanalyze publications from a specific date
python -m litlum analyze --date 2025-05-30 --reanalyze
```
Generate and view reports:
```bash
# Generate a report for today
python -m litlum report --generate

# Generate a report for a specific date
python -m litlum report --generate --date 2025-05-30

# View a report for a specific date
python -m litlum report --date 2025-05-30
```

Note: The minimum relevance for reports is configured in the `config.yaml` file under the `reports` section. To customize this, update your configuration file.
The application includes debug output in the analyzer to show LLM prompts and responses. This output is always enabled when running the analyze command.
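For orientation, the analysis step boils down to sending a prompt to Ollama's generate endpoint and parsing the reply. A minimal sketch of such a call (illustrative only; the actual prompt template, model, and response handling are defined by the package and your configuration):

```python
import json
import urllib.request

def ask_ollama(prompt: str, model: str = "llama3.2", host: str = "http://localhost:11434") -> str:
    """Send a single non-streaming prompt to Ollama and return the response text."""
    payload = json.dumps({"model": model, "prompt": prompt, "stream": False}).encode()
    request = urllib.request.Request(
        f"{host}/api/generate",
        data=payload,
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(request) as resp:
        return json.load(resp)["response"]

# Hypothetical relevance prompt in the spirit of the analyzer.
interests = ["Arctic ocean", "ocean eddies", "machine learning"]
abstract = "We study mesoscale ocean eddies in the Arctic Ocean using a machine learning emulator."
prompt = (
    f"Given these research interests: {', '.join(interests)}, rate the relevance of the "
    f"following abstract on a 0-10 scale and give a one-sentence summary.\n\n{abstract}"
)
print(ask_ollama(prompt))
```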
For troubleshooting database issues, you can reset the database:
```bash
# Reset the database (will prompt for confirmation)
python -m litlum reset

# Force reset without confirmation
python -m litlum reset --force
```

LitLum uses a layered configuration system that loads settings in the following order (each layer overrides the previous one):
- Default Configuration - Built-in default settings from `litlum/config/default-config.yaml`
- User Configuration - Custom settings from `~/.config/litlum/config.yaml` (if it exists)
- Environment Variables - Specific path overrides (see below)
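Conceptually, the merge behaves like a recursive dictionary update in which later layers win. A simplified sketch of that idea (not the package's actual loader):

```python
from pathlib import Path

import yaml  # PyYAML

def deep_merge(base: dict, override: dict) -> dict:
    """Recursively merge override into base; values from override take precedence."""
    merged = dict(base)
    for key, value in override.items():
        if isinstance(value, dict) and isinstance(merged.get(key), dict):
            merged[key] = deep_merge(merged[key], value)
        else:
            merged[key] = value
    return merged

defaults = yaml.safe_load(Path("litlum/config/default-config.yaml").read_text())
user_file = Path.home() / ".config" / "litlum" / "config.yaml"
config = deep_merge(defaults, yaml.safe_load(user_file.read_text())) if user_file.exists() else defaults
```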
When you run any LitLum command, it will show which configuration files are being loaded:
```text
[INFO] Loading default configuration from: /path/to/package/litlum/config/default-config.yaml
[INFO] Successfully loaded default configuration
[INFO] Loading user configuration from: /home/username/.config/litlum/config.yaml
[INFO] Successfully loaded user configuration
```
If no user configuration is found, you'll see:
```text
[INFO] Loading default configuration from: /path/to/package/litlum/config/default-config.yaml
[INFO] Successfully loaded default configuration
[INFO] No user configuration found at /home/username/.config/litlum/config.yaml. Using default configuration.
```
You can also check the configuration files directly:
- Default configuration: `litlum/config/default-config.yaml` in the package directory
- User configuration: `~/.config/litlum/config.yaml` (if it exists)
The default configuration includes sensible defaults for all settings. You can find the complete default configuration in the package's `litlum/config/default-config.yaml`.
To customize settings, create a `config.yaml` file in `~/.config/litlum/`. This file will be merged with the default configuration, with your settings taking precedence.
Example user configuration (~/.config/litlum/config.yaml):
# Override database location
database:
path: "/custom/path/publications.db"
# Customize your interests
interests:
- "Arctic ocean"
- "climate modelling"
- "machine learning"
- "ocean eddies"
# Configure Ollama LLM settings
ollama:
model: "llama3.2" # or "gemma3:27b" for better results
host: "http://localhost:11434"
# Add or modify journal feeds
feeds:
- name: "Nature Climate Change"
type: "crossref"
issn: "1758-6798"
active: true
- name: "Science"
type: "crossref"
issn: "1095-9203"
active: true
# Reports configuration
reports:
min_relevance: 5.0 # Minimum relevance score (0-10) for including publications in reports
# Web interface settings
web:
title: "My Custom LitLum"The following environment variables can be used to override specific paths:
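The `crossref` feed type identifies a journal by its ISSN. For context, here is a sketch of the kind of query such a feed corresponds to against the public CrossRef API (illustrative only, not the application's actual fetch code):

```python
import json
import urllib.request

# Recent works for Nature Climate Change (ISSN 1758-6798) from the public CrossRef API.
issn = "1758-6798"
url = (
    f"https://api.crossref.org/journals/{issn}/works"
    "?filter=from-pub-date:2025-05-01&rows=5&sort=published&order=desc"
)
with urllib.request.urlopen(url) as resp:
    items = json.load(resp)["message"]["items"]

for item in items:
    title = item.get("title", ["(no title)"])[0]
    print(item.get("DOI"), "-", title)
```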
The following environment variables can be used to override specific paths:

- `LITLUM_REPORTS_DIR`: Override the directory for storing reports
- `LITLUM_WEB_DIR`: Override the web output directory
Note: Database path can only be configured through the config file, not via environment variables.
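As an illustration of how such overrides are typically resolved, the environment variable is consulted first and the configured path is used as the fallback (a sketch with hypothetical defaults, not the application's actual code):

```python
import os
from pathlib import Path

# Hypothetical fallbacks; in practice these come from the loaded YAML configuration.
default_reports_dir = Path.home() / ".local" / "share" / "litlum" / "reports"
default_web_dir = Path.home() / ".local" / "share" / "litlum" / "web"

# Environment variables, when set, take precedence over the configured paths.
reports_dir = Path(os.environ.get("LITLUM_REPORTS_DIR", default_reports_dir))
web_dir = Path(os.environ.get("LITLUM_WEB_DIR", default_web_dir))
```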
To run the application automatically, you can set up a cron job:
- Create a script to run the application (e.g., `~/scripts/run_litlum.sh`):
  ```bash
  #!/bin/bash
  export PATH="/path/to/micromamba/bin:$PATH"

  # Initialize micromamba for this non-interactive shell, then activate the environment
  eval "$(micromamba shell hook --shell bash)"
  micromamba activate litlum

  # Run the full pipeline
  python -m litlum run
  ```

- Make the script executable:

  ```bash
  chmod +x ~/scripts/run_litlum.sh
  ```

- Add a cron job (edit with `crontab -e`):

  ```
  # Run daily at 8 AM
  0 8 * * * ~/scripts/run_litlum.sh
  ```
The LitLum application now includes a static web interface that makes it easy to browse reports and publications:
- Index Page: Lists all available reports by date
- Report Pages: Displays publications above the relevance threshold
- Interactive Tables: Each publication includes:
  - Title and journal information
  - Relevance score
  - Link to the original paper (via DOI/URL)
  - Expandable sections for AI assessment and abstract
- Responsive Design: Works on desktop and mobile devices
```bash
# Generate the static website
python -m litlum web

# Generate and serve the website locally
python -m litlum web --serve
```

The web interface will be generated in the configured web path (default: `~/.local/share/litlum/web`).
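Because the output is a plain static site, you can also serve the generated directory with any static file server, for example with Python's built-in `http.server` (assuming the default web path):

```python
import functools
import http.server
from pathlib import Path

# Serve the generated site at http://localhost:8000 (alternative to `litlum web --serve`).
web_dir = Path.home() / ".local" / "share" / "litlum" / "web"
handler = functools.partial(http.server.SimpleHTTPRequestHandler, directory=str(web_dir))
http.server.HTTPServer(("localhost", 8000), handler).serve_forever()
```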
This application is designed with future extensibility in mind:
- More sophisticated relevance analysis
- Additional publication sources
- Email notifications
- Custom user interests configuration
- Enhanced web interface with search and filtering
This project is licensed under the MIT License - see the LICENSE file for details.