This is a simple, browser-based chat interface for interacting with your LM Studio server. It allows you to connect to your locally hosted LM Studio model and chat with it from any device with a web browser, including mobile phones. This is a personal project - no pull requests are accepted at this time.
- Dark mode interface
- Connect to any LM Studio server
- Chat with your LM Studio model
- LaTeX Math Rendering and Markdown Rendering
- Mobile-friendly design
- Super Cool new Purple theme
- Chats
- Vision Model Support
- Choose your model
- Delete Chats
- Download the `index.html` file from this repository.
- Save it to a location on your computer that you can easily access.
This works out of the box on Android devices. On iOS, you need to open the file in Microsoft Edge or another alternative browser; Safari and Chrome do not work.
There are several ways to get the index.html file on your mobile device:
- Direct Download:
  - Open this repository on your mobile device's web browser.
  - Find the `index.html` file and download it directly to your device.
- Email to Yourself:
  - Download the `index.html` file on your computer.
  - Email it to yourself as an attachment.
  - Open the email on your mobile device and download the attachment.
- Cloud Storage:
  - Upload the `index.html` file to a cloud storage service like Google Drive, Dropbox, or iCloud.
  - Access the file from your mobile device using the respective cloud storage app.
- File Transfer Apps:
  - Use apps like AirDrop (for iOS devices) or nearby sharing (for Android devices) to transfer the file from your computer to your mobile device.
- Start LM Studio Server:
  - Open LM Studio on your computer.
  - Go to the "Server" tab (in 0.3.x: Developer -> Local Server).
  - Ensure that both CORS and Serve on Local Network are enabled.
  - Click "Start Server" and note down the server address.
- Open the Chat Interface:
  - On desktop: Double-click the `index.html` file to open it in your default web browser.
  - On mobile: Use a file manager app to locate the downloaded `index.html` file and open it with your web browser.
- Connect to LM Studio Server:
  - In the chat interface, enter the LM Studio server address in the input field at the top.
  - Click the "Connect" button.
- Start Chatting:
  - Once connected, you can start typing messages in the input field at the bottom of the screen.
  - Press Enter or tap Send to send your message.
  - The model's responses will appear in the chat window.
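Under the hood, the interface talks to LM Studio's OpenAI-compatible API. The sketch below is only an illustration of the kind of request involved, not the exact code in `index.html`; the server address, model handling, and temperature value are assumptions you should adapt to your setup.

```javascript
// Illustrative sketch of a chat request against LM Studio's OpenAI-compatible API.
// Assumptions: the server runs at the address you entered in the interface
// (LM Studio defaults to port 1234), and the model you want is already loaded.
async function sendMessage(serverAddress, userMessage) {
  const response = await fetch(`${serverAddress}/v1/chat/completions`, {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({
      // Some LM Studio versions expect a "model" field naming the loaded model;
      // others simply use whatever model is currently loaded.
      messages: [{ role: "user", content: userMessage }],
      temperature: 0.7,
    }),
  });
  const data = await response.json();
  return data.choices[0].message.content; // the model's reply text
}

// Example (run from a browser console on the same network):
// sendMessage("http://localhost:1234", "Hello!").then(console.log);
```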
- Can't connect to server:
  - Ensure LM Studio Server is running on your computer.
  - Check that you're using the correct server address.
  - If accessing from another device, make sure both devices are on the same network.
  - If it still won't connect, try the quick reachability check sketched after this list.
- Slow responses:
  - LM Studio processing speed depends on your computer's capabilities. Larger models may take longer to respond.
- Interface not loading:
  - Try opening the `index.html` file with a different web browser.
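If you suspect a connection problem, you can check whether the server is reachable at all, independent of the chat interface. The snippet below is a minimal sketch you can paste into a browser console on a device on the same network; `http://localhost:1234` is LM Studio's default address, so substitute the address shown in LM Studio's Server tab if yours differs.

```javascript
// Minimal reachability check for the LM Studio server.
// Replace the address with the one shown in LM Studio's Server tab,
// e.g. "http://192.168.1.50:1234" when testing from another device.
fetch("http://localhost:1234/v1/models")
  .then((response) => response.json())
  .then((data) => console.log("Server reachable, models:", data))
  .catch((error) => console.error("Server not reachable:", error));
```

If this fails from another device but works on the computer running LM Studio, the usual culprits are Serve on Local Network being disabled or a firewall blocking the port.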
This interface is designed for local use only. Do not expose your LM Studio server to the public internet without proper security measures in place.
This is a personal project. While the code is public for anyone to use and learn from, I am not accepting pull requests for new features or bug fixes; any pull requests will be closed automatically. If you find an issue or have a suggestion, please open an issue to discuss it.