CoSenseus is an AI-enabled dialogue tool for building consensus through generative conversation. It transforms natural language input from participants into structured insights, helping groups and communities move beyond debate and foster shared understanding.
CoSenseus operates as a local-first application, ensuring privacy and user control. It uses local language models (via Ollama) to perform real-time analysis of civic and organizational dialogue. The current version is focused on providing a powerful, single-event experience with a clear path toward a scalable, cloud-native architecture.
- Local-First AI: All analysis is performed on your local machine using Ollama
- Event Creation & Management: Create and manage single dialogue events
- Multi-Round Conversations: Support for iterative rounds of input and AI-powered synthesis
- Real-time Visualization: Interactive visualizations for sentiment, word clouds, and consensus
- Admin-in-the-Loop: Administrators review and approve AI-generated prompts before they are sent to participants
- Dynamic Reporting: Flexible options to export raw data and generate formatted reports
- Network Access: Share your local instance with others on your network for collaborative sessions
- Backend Services: Python FastAPI microservices
- Frontend Application: React.js with TypeScript
- AI/ML Pipeline: PyTorch, Transformers, scikit-learn
- Databases: PostgreSQL, Neo4j, Redis, Vector databases
- Infrastructure: Kubernetes, Docker, AWS/GCP
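For local experimentation, the services above could be wired together with a Compose file roughly like the following. The service names, images, and port mappings here are illustrative assumptions, not the project's actual deployment configuration:

```yaml
version: "3.9"
services:
  api-gateway:
    build: ./backend/api-gateway   # hypothetical build context
    ports: ["8000:8000"]
    depends_on: [postgres, redis, neo4j]
  frontend:
    build: ./frontend
    ports: ["3000:3000"]
  postgres:
    image: postgres:16
    environment:
      POSTGRES_PASSWORD: example   # placeholder credential
  neo4j:
    image: neo4j:5
  redis:
    image: redis:7
```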
```
cosenseus/
├── backend/            # Backend microservices
│   ├── api-gateway/      # Main API gateway service
│   ├── event-service/    # Event management service
│   ├── nlp-service/      # Natural language processing
│   ├── auth-service/     # Authentication and authorization
│   ├── profile-service/  # User profile management
│   └── shared/           # Shared utilities and models
├── frontend/           # React frontend application
├── mobile/             # React Native mobile apps
├── infrastructure/     # Kubernetes, Docker, Terraform
├── docs/               # Technical documentation
├── scripts/            # Development and deployment scripts
└── tests/              # Integration and E2E tests
```
Before running CoSenseus, ensure you have the following installed:
- Node.js 18+ - Download here
- Python 3.11+ - Download here
- Ollama - Download here
- Git - Download here
```bash
git clone https://github.com/omniharmonic/cosenseus.git
cd cosenseus

# Install Ollama (if not already installed)
# macOS: brew install ollama
# Linux: curl -fsSL https://ollama.ai/install.sh | sh

# Start Ollama service
ollama serve

# In a new terminal, pull the required AI model
ollama pull llama3.2:3b

# Run the development setup script
./scripts/dev-setup.sh

# Start all services (frontend + backend)
./start.sh

# Or start backend only (for development)
./start.sh --backend-only
```

Once started, you can access:
- Frontend Application: http://localhost:3000
- Backend API: http://localhost:8000/api/v1/
- API Documentation: http://localhost:8000/docs
- Ollama Health Check: http://localhost:11434
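A quick way to confirm that the services above are listening is a small TCP port probe. This sketch assumes the default ports from this README; adjust them if you changed your configuration:

```python
import socket

def is_port_open(host: str, port: int, timeout: float = 1.0) -> bool:
    """Return True if a TCP connection to host:port succeeds."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False

# Default CoSenseus ports from this README
for name, port in [("frontend", 3000), ("backend", 8000), ("ollama", 11434)]:
    status = "up" if is_port_open("localhost", port) else "down"
    print(f"{name}: {status}")
```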
Press Ctrl+C in the terminal where you ran ./start.sh to gracefully stop all services.
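Graceful shutdown of this kind typically means trapping the interrupt signal and terminating child processes. Here is a hypothetical sketch of that pattern in Python; it is an illustration of the idea, not the actual logic in start.sh:

```python
import signal
import sys

procs = []  # child service processes would be appended here after launch

def shutdown(signum, frame):
    """Terminate all tracked child processes, then exit cleanly."""
    print("Stopping services...")
    for p in procs:
        p.terminate()        # ask each child to exit
    for p in procs:
        p.wait(timeout=10)   # give each child time to clean up
    sys.exit(0)

# Route Ctrl+C (SIGINT) to the shutdown handler
signal.signal(signal.SIGINT, shutdown)
```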
After completing the installation steps above:
- Open your browser and navigate to http://localhost:3000
- Create your first session:
- Enter your name (e.g., "Event Organizer")
- Click "Create Session"
- Save your session code for future access
- You'll be redirected to the dashboard where you can create events
- Click "Create Event" in the Admin tab
- Fill in event details:
- Title: "Community Planning Session"
- Description: "Discuss local development priorities"
- Event Type: "Discussion"
- Add Questions: Include 2-3 open-ended questions
- Click "Create Event"
- Publish the event when ready to invite participants
- Share the event link with participants
- Participants can access via:
- Local network: http://[YOUR_IP]:3000 (e.g., http://192.168.1.154:3000)
- Same device: http://localhost:3000
- Participants create sessions and join your event
- Monitor responses in the event dashboard
- When ready to advance:
- Click "End Round & Start Analysis"
- Wait for AI analysis to complete
- Review and approve AI-generated prompts for the next round
- Continue for multiple rounds as needed
- Generate final reports when the dialogue is complete
CoSenseus supports local network access, allowing multiple participants to join from different devices:
- Find your local IP address:
```bash
# macOS/Linux
ifconfig | grep "inet " | grep -v 127.0.0.1

# Windows
ipconfig | findstr "IPv4"
```
- Share the network URL with participants:
Example: http://[YOUR_IP]:3000 (e.g., http://192.168.1.154:3000)
- Open the shared URL in their browser
- Create a session with their name
- Join the event and start participating
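If you would rather not parse the ifconfig/ipconfig output by hand, the LAN IP can also be discovered programmatically. In this sketch, the 8.8.8.8 address is only used to select the outbound network interface; no packets are actually sent:

```python
import socket

def local_ip() -> str:
    """Return this machine's LAN IP, or 127.0.0.1 if none is available."""
    s = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    try:
        # Connecting a UDP socket picks the outbound interface
        # without sending any traffic.
        s.connect(("8.8.8.8", 80))
        return s.getsockname()[0]
    except OSError:
        return "127.0.0.1"
    finally:
        s.close()

print(f"Share this with participants: http://{local_ip()}:3000")
```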
Port Already in Use:

```bash
# Check what's using the port
lsof -i :8000  # Backend port
lsof -i :3000  # Frontend port

# Kill the process if needed
kill -9 [PID]
```

Ollama Not Running:

```bash
# Start Ollama service
ollama serve

# Check if it's running
curl http://localhost:11434/api/tags
```

Database Issues:

```bash
# Reset the database (WARNING: This will delete all data)
rm -rf ~/.cosenseus/cosenseus_local.db
./start.sh
```

Network Access Not Working:
- Check firewall settings - ensure ports 3000 and 8000 are open
- Verify IP address - use `ifconfig` or `ipconfig` to get the correct IP
- Test connectivity - try `curl http://[YOUR_IP]:8000/api/v1/` from another device
View service logs for debugging:
```bash
# Backend logs
tail -f logs/backend.log

# NLP service logs
tail -f logs/nlp_service.log

# Frontend logs (if running in terminal)
# Check the terminal where you ran npm start
```

```bash
# Start all services
./start.sh

# Start backend only (for development)
./start.sh --backend-only

# Start with custom configuration
./scripts/start_local_dev.sh
```

```bash
# Install dependencies
cd backend && pip install -r requirements.txt
cd frontend && npm install

# Run tests
cd backend && python -m pytest
cd frontend && npm test

# Build for production
cd frontend && npm run build
```

```bash
# Start Ollama service
ollama serve

# Pull AI model
ollama pull llama3.2:3b

# List available models
ollama list

# Test model
ollama run llama3.2:3b "Hello, how are you?"
```

- Network Access Fix: Resolved API URL generation for local network access
- Database Optimization: Moved from iCloud Drive to local system directory for better reliability
- Service Stability: Enhanced error handling and timeout management
- Multi-Device Support: Participants can now access from different devices on local network
- Service Stability: Resolved backend process crashes and port conflicts
- Ollama Integration: Enhanced with improved error handling and timeout management
- Process Management: Robust service orchestration with auto-kill functionality
- Development Environment: Comprehensive startup scripts with backend-only mode
- Testing & Validation: All services (Backend API Gateway, NLP Service, Frontend, Ollama) operational
- Root Cause: Missing analysis fields in the `SynthesisResponse` Pydantic model
- Solution: Enhanced the response model and endpoint to return complete analysis data
- Result: Complete data flow from AI analysis through dialogue manager to frontend approval system
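The pattern behind this fix is that any field absent from the response model never reaches the client, no matter what the analysis produced. As an illustration only, here is that idea with a stdlib dataclass standing in for the project's actual Pydantic `SynthesisResponse`; the field names are hypothetical:

```python
from dataclasses import asdict, dataclass, field

@dataclass
class SynthesisResponse:
    # Hypothetical analysis fields: if any are omitted from the model,
    # they are silently dropped from the serialized API response.
    synthesis_text: str
    sentiment_summary: dict = field(default_factory=dict)
    key_themes: list = field(default_factory=list)
    proposed_prompts: list = field(default_factory=list)

resp = SynthesisResponse(
    synthesis_text="Participants broadly agree on priorities.",
    key_themes=["housing", "transit"],
)
payload = asdict(resp)  # every declared field is serialized
print(sorted(payload))
```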
- Phase 1 (Months 1-6): ✅ Core Platform - COMPLETED
- Phase 2 (Months 7-12): ✅ Advanced AI & Visualization - COMPLETED
- Phase 3 (Months 13-18): 🚧 Integration & Scale - IN PROGRESS
- Phase 4 (Months 19-24): ⏳ Civic Companion Evolution - PLANNED
CoSenseus is designed for various civic and organizational dialogue scenarios:
- Town Hall Meetings: Facilitate community discussions on local issues
- Policy Feedback: Collect structured input on proposed policies
- Candidate Forums: Enable meaningful dialogue between candidates and constituents
- Neighborhood Planning: Gather community input on development projects
- Strategic Planning: Align teams around organizational goals
- Change Management: Build consensus around organizational changes
- Stakeholder Engagement: Gather diverse perspectives on key decisions
- Team Building: Foster understanding and collaboration within teams
- Classroom Discussions: Facilitate structured dialogue on complex topics
- Research Collaboration: Synthesize diverse academic perspectives
- Student Government: Enable democratic decision-making processes
- Faculty Meetings: Build consensus on institutional decisions
We welcome contributions to CoSenseus! Please see CONTRIBUTING.md for development guidelines and our code of conduct.
```bash
# Fork and clone the repository
git clone https://github.com/[your-username]/cosenseus.git
cd cosenseus

# Create a feature branch
git checkout -b feature/your-feature-name

# Make your changes and test
./start.sh --backend-only  # For backend development
cd frontend && npm start   # For frontend development

# Commit and push
git add .
git commit -m "feat: Add your feature description"
git push origin feature/your-feature-name
```

- Use the GitHub Issues page
- Include detailed steps to reproduce the issue
- Attach relevant logs and screenshots
[License details to be added]
- Ollama for providing local AI capabilities
- FastAPI for the robust backend framework
- React for the responsive frontend
- The civic technology community for inspiration and guidance
Made with ❤️ for better civic dialogue and consensus building