# Localllm

Localllm is a web-based chat application built entirely in Rust, demonstrating the potential of frontend web development with Yew. The project pairs a locally running LLM (Large Language Model) with a seamless chat interface.

## Table of Contents

- [Features](#features)
- [Getting Started](#getting-started)
- [Usage](#usage)
- [Project Structure](#project-structure)
- [Contributing](#contributing)

## Features

- 🌟 **100% Rust**: Entirely built with Rust, showcasing the power and versatility of the language.
- 🖥️ **Frontend with Yew**: Uses Yew for the frontend, demonstrating Rust's capability in web development.
- 🤖 **Local LLM Integration**: Chat with a locally running language model.

## Getting Started

### Prerequisites

Ensure you have the following installed:

- [Rust and Cargo](https://www.rust-lang.org/tools/install) (stable toolchain)
- A locally running LLM server for the app to chat with

### Installation

Clone the repository, then build and run:

```sh
git clone https://github.com/yourusername/localllm.git
cd localllm
cargo run
```

## Usage

Open your browser and navigate to `http://localhost:8080`, then start chatting with the locally running LLM.

## Project Structure

### `app.rs`

Handles the complete frontend and the API-calling logic.
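
As a rough illustration, a Yew function component along these lines could drive the chat view; the state shape and component name here are illustrative assumptions, not necessarily the project's actual code:

```rust
use yew::prelude::*;

// Hypothetical sketch of the chat component; names are illustrative.
#[function_component(App)]
pub fn app() -> Html {
    // Each entry is one chat message already exchanged.
    let messages = use_state(Vec::<String>::new);

    html! {
        <div class="chat">
            { for messages.iter().map(|m| html! { <p>{ m.clone() }</p> }) }
            // An input field and submit handler would call into `api.rs`
            // here and push the model's reply onto `messages`.
        </div>
    }
}
```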

### `api.rs`

Contains the logic behind the API interactions.
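
For illustration only, the request logic might resemble the following `gloo-net` call; the `send_prompt` name, the endpoint URL (an Ollama-style local server), and the JSON shape are assumptions, not taken from this repository:

```rust
use gloo_net::http::Request;

use crate::types::{ChatRequest, ChatResponse};

// Hypothetical sketch: POST the prompt to a local LLM server and
// deserialize the reply. Endpoint and payload shape are assumptions.
pub async fn send_prompt(prompt: String) -> Result<ChatResponse, gloo_net::Error> {
    let body = ChatRequest { prompt };
    Request::post("http://localhost:11434/api/generate") // assumed Ollama-style endpoint
        .json(&body)?
        .send()
        .await?
        .json::<ChatResponse>()
        .await
}
```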

### `types.rs`

Defines the necessary structs used across the application.
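
A minimal sketch of what those structs could look like with `serde`; the struct and field names are illustrative assumptions:

```rust
use serde::{Deserialize, Serialize};

// Hypothetical request/response shapes; field names are assumptions.
#[derive(Serialize)]
pub struct ChatRequest {
    pub prompt: String,
}

#[derive(Deserialize)]
pub struct ChatResponse {
    pub response: String,
}
```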

### `main.rs`

The entry point of the application, where the root app component is mounted.
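
With current Yew (0.20+), the entry point typically looks like the following; a minimal sketch assuming the root component is named `App`:

```rust
mod api;
mod app;
mod types;

use app::App;

fn main() {
    // Mount the root Yew component onto the document body.
    yew::Renderer::<App>::new().render();
}
```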

## Contributing

Contributions are welcome! Please fork this repository and submit pull requests.

1. Fork the repository.
2. Create your feature branch: `git checkout -b feature/AmazingFeature`
3. Commit your changes: `git commit -m 'Add some AmazingFeature'`
4. Push to the branch: `git push origin feature/AmazingFeature`
5. Open a pull request.

Made with ❤️ using Rust and Yew