Want to quickly prototype your Large Language Model (LLM) application without a ton of code? Unleash your creativity with PromptFlow, a low-code tool that empowers you to craft executable flowcharts. Seamlessly integrate LLMs, prompts, Python functions, and conditional logic to create intricate workflows. With PromptFlow, you can visualize your ideas and bring them to life, all without getting tangled in code or complex logic.
🎮 Join the conversation on our Discord Server
The core of PromptFlow is a visual flowchart editor that lets you design nodes and establish connections between them. Each node can represent a prompt, a Python function, or an LLM. The connections between nodes embody conditional logic, dictating the flow of your program.
When you run your flowchart, PromptFlow executes each node in the sequence defined by the connections, passing text data between nodes as required. If a node returns a value, that value is forwarded to the next node in the flow as a string. More information on the inner workings of PromptFlow can be found in our documentation.
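To make the execution model concrete, here is a minimal sketch of nodes passing strings down a chain. This is illustrative only, not PromptFlow's actual API: the node functions and the `run_flow` helper are hypothetical stand-ins.

```python
# Illustrative sketch of the node-and-connection model -- not PromptFlow's API.

def summarize_prompt(text: str) -> str:
    # A "prompt" node: wraps the incoming string in an instruction.
    return f"Summarize the following text:\n{text}"

def fake_llm(prompt: str) -> str:
    # An "LLM" node: a stub standing in for a real model call.
    return f"[summary of a {len(prompt)}-character prompt]"

def run_flow(nodes, data: str) -> str:
    # Each node receives the previous node's output as a string.
    for node in nodes:
        data = node(data)
    return data

print(run_flow([summarize_prompt, fake_llm], "PromptFlow is a low-code tool..."))
```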
Before starting, make sure to populate the `.env` file with the appropriate values. The `.env` file should be located in the root directory of the project.
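The exact keys depend on your deployment; the snippet below only shows the general shape of such a file. The variable names here are hypothetical placeholders, not the project's required keys:

```
# Hypothetical example only -- substitute the keys your deployment needs.
OPENAI_API_KEY=sk-...
POSTGRES_USER=promptflow
POSTGRES_PASSWORD=change-me
REDIS_URL=redis://redis:6379/0
```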
The easiest way to run PromptFlow is with Docker Compose. To do so, run the following command:
```
docker compose up --build
```
This will run the DB, Redis, API (backend), Celery worker, and frontend containers. The API will run on port `8069` by default, with the frontend on port `4200`.
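Once the containers are up, you can sanity-check that both services are listening. This sketch assumes only that the API and frontend answer HTTP on their default ports; the root paths are illustrative, not documented health endpoints.

```python
import urllib.request
import urllib.error

# Quick reachability check for the local stack.
for name, url in [("API", "http://localhost:8069"), ("frontend", "http://localhost:4200")]:
    try:
        urllib.request.urlopen(url, timeout=5)
        print(f"{name} is up at {url}")
    except urllib.error.HTTPError as err:
        # Any HTTP response (even a 404) means the server is listening.
        print(f"{name} responded at {url} (HTTP {err.code})")
    except (urllib.error.URLError, OSError):
        print(f"{name} is not reachable at {url}")
```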
Check out our official docs:
To compile the Sphinx documentation, execute:
```
cd docs
make html
```
Then, navigate to `docs/build/html/index.html` in your browser.
Want to contribute to PromptFlow? Get started by building a node.
Stumbled upon a bug? Don't hesitate to create an issue, open a PR, or let us know on Discord.
We're interested in your feedback! If you've used PromptFlow, please fill out this questionnaire.