This repository contains the code and resources for a counselor chatbot developed using Python. The chatbot is designed to interact with users and provide responses based on predefined conversational intents. This project is implemented in a Jupyter Notebook and leverages various machine learning and natural language processing techniques.
The Counselor Chatbot project includes the following key components:
- Data Processing: Loading and processing conversational data stored in JSON format.
- Intent Classification: Using patterns in the data to classify user inputs into predefined intents.
- Response Generation: Generating appropriate responses based on the identified intent.
- Model Training: Training a machine learning model to improve the chatbot's ability to understand and respond to user queries (a rough sketch of how these pieces fit together follows this list).
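As a rough illustration only: the sketch below builds bag-of-words features and trains a small Keras classifier over intent tags. It is not the notebook's exact code; the variable names (`patterns`, `tags`), the lemmatization step, and the layer sizes are assumptions.

```python
# Illustrative sketch, not the notebook's exact implementation.
import numpy as np
import nltk
from nltk.stem import WordNetLemmatizer
from tensorflow import keras

lemmatizer = WordNetLemmatizer()

def bag_of_words(sentence, vocabulary):
    """Turn a sentence into a binary bag-of-words vector over the training vocabulary."""
    tokens = [lemmatizer.lemmatize(w.lower()) for w in nltk.word_tokenize(sentence)]
    return np.array([1.0 if word in tokens else 0.0 for word in vocabulary], dtype=np.float32)

def train_intent_model(patterns, tags):
    """Train a small dense network that maps bag-of-words vectors to intent tags.

    `patterns` and `tags` are assumed to be parallel lists extracted from
    intents.json (see the loading example further down).
    """
    classes = sorted(set(tags))
    vocabulary = sorted({lemmatizer.lemmatize(w.lower())
                         for p in patterns for w in nltk.word_tokenize(p)})

    X = np.array([bag_of_words(p, vocabulary) for p in patterns])
    y = keras.utils.to_categorical([classes.index(t) for t in tags], num_classes=len(classes))

    model = keras.Sequential([
        keras.layers.Input(shape=(len(vocabulary),)),
        keras.layers.Dense(128, activation="relu"),
        keras.layers.Dropout(0.5),
        keras.layers.Dense(64, activation="relu"),
        keras.layers.Dense(len(classes), activation="softmax"),
    ])
    model.compile(optimizer="adam", loss="categorical_crossentropy", metrics=["accuracy"])
    model.fit(X, y, epochs=200, batch_size=8, verbose=0)
    return model, vocabulary, classes
```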
To run this project locally, you will need to have Python 3 installed along with the following libraries:

- pandas
- numpy
- tensorflow (which ships with Keras)
- nltk

You can install these dependencies using pip:

```bash
pip install pandas numpy tensorflow nltk
```
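Depending on how the notebook tokenizes and lemmatizes text, you may also need a one-time download of NLTK data such as the punkt tokenizer and WordNet; treat this as an assumption rather than a confirmed requirement:

```python
import nltk

# One-time downloads; only needed if the notebook relies on NLTK tokenization/lemmatization.
nltk.download("punkt")
nltk.download("wordnet")
```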
- Clone the Repository:

  ```bash
  git clone https://github.com/precioux/counselor-chatbot.git
  cd counselor-chatbot
  ```

- Open the Jupyter Notebook:

  Launch Jupyter Notebook and open the `counseler-chatbot.ipynb` file:

  ```bash
  jupyter notebook counseler-chatbot.ipynb
  ```

- Run the Cells:

  Execute the cells in the notebook sequentially to load the data, process it, and train the model.

- Interact with the Chatbot:

  Once the model is trained, you can interact with the chatbot directly in the notebook (a minimal example loop is sketched below).
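For a feel of what that interaction can look like, here is a minimal chat loop built on the sketch above. It assumes a trained `model`, the `vocabulary` and `classes` returned by `train_intent_model`, and a `responses_by_tag` dictionary built from intents.json; all of these names are illustrative, not the notebook's actual API.

```python
import random
import numpy as np

def chat(model, vocabulary, classes, responses_by_tag):
    """Classify each user message into an intent tag and reply with a stored response."""
    print("Type 'quit' to stop.")
    while True:
        message = input("You: ")
        if message.lower() == "quit":
            break
        features = bag_of_words(message, vocabulary)  # helper defined in the earlier sketch
        probabilities = model.predict(np.array([features]), verbose=0)[0]
        tag = classes[int(np.argmax(probabilities))]
        print("Bot:", random.choice(responses_by_tag.get(tag, ["Sorry, I didn't understand that."])))
```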
The conversational data used for this project is stored in a JSON file named `intents.json`. This file includes various intents, each with its own set of patterns (user inputs) and corresponding responses.
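The exact schema is defined by the file itself, but a common layout for this kind of intents file, and a hedged example of loading it into the `patterns`, `tags`, and `responses_by_tag` structures used in the sketches above, looks roughly like this (the `tag`/`patterns`/`responses` key names are assumptions):

```python
import json

# Assumed layout, e.g.:
# {"intents": [{"tag": "greeting",
#               "patterns": ["Hi", "Hello"],
#               "responses": ["Hello! How are you feeling today?"]}]}
with open("intents.json", encoding="utf-8") as f:
    data = json.load(f)

patterns, tags = [], []
responses_by_tag = {}
for intent in data["intents"]:
    responses_by_tag[intent["tag"]] = intent["responses"]
    for pattern in intent["patterns"]:
        patterns.append(pattern)
        tags.append(intent["tag"])
```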
Contributions are welcome! If you have suggestions for improvements or new features, feel free to open an issue or submit a pull request.
This project is licensed under the MIT License. See the LICENSE file for more details.
- This project was inspired by the need to provide accessible mental health support through AI-driven chatbots.
- Special thanks to the creators of the datasets and the libraries used in this project.