LLMFlows: Framework for Building LLM Applications
Project Overview
| GitHub Stats | Value |
|---|---|
| Stars | 661 |
| Forks | 34 |
| Language | Python |
| Created | 2023-06-26 |
| License | MIT License |
Introduction
LLMFlows is a framework designed to simplify the development of applications powered by Large Language Models (LLMs), such as chatbots, question-answering systems, and agents. It provides a minimalistic set of abstractions that enable the use of LLMs and vector stores in a transparent and well-structured manner. Unlike other frameworks, LLMFlows ensures that all components are explicit and free from hidden prompts or LLM calls, making it easier to monitor, maintain, and debug your applications. With its focus on simplicity and transparency, LLMFlows is an invaluable tool for anyone looking to build robust and manageable LLM-powered apps.
Key Features
LLMs
- Utilize LLMs like OpenAI’s ChatGPT for natural language text generation.
- Configure LLM classes with specific models, parameters, and settings.
- Automatic retries for failed model calls ensure reliable interactions.
Prompt Templates
- Create dynamic prompts with variables for flexible text generation.
- Define prompt strings tailored to specific inputs.
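Conceptually, a prompt template binds named variables into a prompt string at call time. The sketch below is a simplified stand-in written in plain Python to illustrate the idea; the class name and methods here are illustrative, not the actual LLMFlows `PromptTemplate` API.

```python
import re

class SimplePromptTemplate:
    """Minimal stand-in for a prompt template with {variable} slots."""

    def __init__(self, template: str):
        self.template = template
        # Collect the variable names that must be supplied at call time
        self.variables = set(re.findall(r"{(\w+)}", template))

    def get_prompt(self, **kwargs) -> str:
        missing = self.variables - kwargs.keys()
        if missing:
            raise ValueError(f"Missing variables: {missing}")
        return self.template.format(**kwargs)

template = SimplePromptTemplate("What is a good title of a movie about {topic}?")
prompt = template.get_prompt(topic="friendly aliens")
```

Validating the variable set up front is what makes templates safer than raw string formatting: a missing input fails loudly before any LLM call is made.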
Flows and FlowSteps
- Structure applications using Flows and FlowSteps for clear and organized LLM interactions.
- Connect flow steps to pass outputs as inputs, maintaining a transparent pipeline.
- Use Async Flows to run LLMs in parallel when all inputs are available.
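The wiring between flow steps can be pictured as each step writing its result under an `output_key` that later steps read as an input variable. A toy, framework-free sketch of that idea (the function and step names here are invented for illustration, not the LLMFlows API):

```python
def run_pipeline(steps, inputs):
    """Run steps in order; each step's output is stored under its
    output_key and becomes available as an input to every later step."""
    context = dict(inputs)
    for fn, output_key in steps:
        context[output_key] = fn(context)
    return context

steps = [
    (lambda ctx: f"Title about {ctx['topic']}", "movie_title"),
    (lambda ctx: f"Song for {ctx['movie_title']}", "song_title"),
]
result = run_pipeline(steps, {"topic": "friendly aliens"})
```

Because every intermediate value lives in one visible `context` dictionary, nothing is hidden: you can inspect exactly which prompt inputs each step received.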
VectorStore Integrations
- Integrate with vector databases like Pinecone for efficient storage and retrieval of vector embeddings.
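At its core, a vector store answers nearest-neighbor queries over embeddings. A minimal in-memory version using cosine similarity, purely to illustrate what services like Pinecone do at scale (this is not the LLMFlows vector-store API):

```python
import math

def cosine(a, b):
    """Cosine similarity between two equal-length vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(y * y for y in b))
    return dot / (norm_a * norm_b)

def search(store, query, top_k=1):
    """Return the top_k (doc, score) pairs most similar to the query vector."""
    scored = [(doc, cosine(vec, query)) for doc, vec in store.items()]
    return sorted(scored, key=lambda pair: pair[1], reverse=True)[:top_k]

store = {"cat facts": [1.0, 0.1], "tax law": [0.0, 1.0]}
top_match = search(store, [0.9, 0.2])[0][0]
```

A real vector database replaces the linear scan with an approximate nearest-neighbor index, but the query contract is the same: embed the question, retrieve the closest stored documents.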
Explicit API and Full Transparency
- Create explicit applications without hidden prompts or predefined behaviors.
- Monitor and debug easily with full transparency into each component.
Callbacks
- Execute callback functions at different stages within flow steps for customization, logging, and monitoring.
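The callback idea is that hook functions fire at stage boundaries of a flow step, for example before and after the LLM call, which is where logging and monitoring plug in. A schematic version (the class and hook names below are invented for illustration and do not match the LLMFlows callback signatures):

```python
class LoggedStep:
    """A step that fires optional hooks before and after its work."""

    def __init__(self, name, fn, on_start=None, on_end=None):
        self.name, self.fn = name, fn
        self.on_start, self.on_end = on_start, on_end

    def run(self, data):
        if self.on_start:
            self.on_start(self.name, data)   # e.g. log the incoming prompt
        result = self.fn(data)
        if self.on_end:
            self.on_end(self.name, result)   # e.g. record the LLM response
        return result

events = []
step = LoggedStep(
    "demo",
    str.upper,
    on_start=lambda name, data: events.append(("start", name)),
    on_end=lambda name, result: events.append(("end", result)),
)
output = step.run("hi")
```

Keeping the hooks outside the step's core logic means monitoring can be added or removed without touching the flow itself.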
Main Capabilities
- Building Chatbots and Question-Answering Systems: Manage conversation history and generate responses based on it.
- Complex Flow Management: Use Flow and FlowStep classes to handle dependencies between prompts and LLM calls.
- Async Execution: Run flow steps in parallel to optimize performance.
- Vector Database Integration: Store and query vector embeddings efficiently.
- Customizable and Transparent: Full control over the application with explicit APIs and detailed monitoring capabilities.
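The async-execution capability amounts to launching independent LLM calls concurrently and awaiting them together. A sketch with the standard library's `asyncio` (illustrative only; LLMFlows ships its own async flow classes, and the fake call below stands in for a real awaited API request):

```python
import asyncio

async def fake_llm_call(prompt: str) -> str:
    """Stand-in for an async LLM call; a real one would await an HTTP request."""
    await asyncio.sleep(0.01)
    return f"response to: {prompt}"

async def run_parallel(prompts):
    # Steps whose inputs are all available can run concurrently
    return await asyncio.gather(*(fake_llm_call(p) for p in prompts))

results = asyncio.run(run_parallel(["prompt A", "prompt B"]))
```

With n independent prompts, wall-clock time approaches the duration of the slowest single call rather than the sum of all of them.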
Installation
```shell
pip install llmflows
```
For more detailed examples and user guides, refer to the documentation.
Real-World Applications
Building Chatbots
You can use LLMFlows to create simple and transparent chatbots. Here’s an example using the OpenAIChat and MessageHistory classes:
```python
from llmflows.llms import OpenAIChat, MessageHistory

llm = OpenAIChat(api_key="<your-api-key>")
message_history = MessageHistory()

while True:
    user_message = input("You: ")
    message_history.add_user_message(user_message)

    llm_response, call_data, model_config = llm.generate(message_history)
    message_history.add_ai_message(llm_response)

    print(f"LLM: {llm_response}")
```
This setup allows for a clear conversation history and easy management of user and AI messages.
Creating Complex Flows
For applications with multiple dependencies, you can use the Flow and FlowStep classes to structure your LLM interactions. Here’s an example of generating a movie title, song title, characters, and song lyrics in a structured flow:
```python
from llmflows.flows import Flow, FlowStep
from llmflows.llms import OpenAI
from llmflows.prompts import PromptTemplate

openai_llm = OpenAI(api_key="<your-api-key>")

# Create prompt templates
title_template = PromptTemplate("What is a good title of a movie about {topic}?")
song_template = PromptTemplate("What is a good song title of a soundtrack for a movie called {movie_title}?")
characters_template = PromptTemplate("What are two main characters for a movie called {movie_title}?")
lyrics_template = PromptTemplate("Write lyrics of a movie song called {song_title}. The main characters are {main_characters}")

# Create flow steps
movie_title_flowstep = FlowStep(name="Movie Title Flowstep", llm=openai_llm, prompt_template=title_template, output_key="movie_title")
song_title_flowstep = FlowStep(name="Song Title Flowstep", llm=openai_llm, prompt_template=song_template, output_key="song_title")
characters_flowstep = FlowStep(name="Characters Flowstep", llm=openai_llm, prompt_template=characters_template, output_key="main_characters")
song_lyrics_flowstep = FlowStep(name="Song Lyrics Flowstep", llm=openai_llm, prompt_template=lyrics_template, output_key="song_lyrics")

# Connect flow steps so each output_key feeds the steps that depend on it
movie_title_flowstep.connect(song_title_flowstep, characters_flowstep, song_lyrics_flowstep)
song_title_flowstep.connect(song_lyrics_flowstep)
characters_flowstep.connect(song_lyrics_flowstep)

# Create and run the flow
soundtrack_flow = Flow(movie_title_flowstep)
results = soundtrack_flow.start(topic="friendly aliens", verbose=True)
```
Conclusion
Key Points:
- Simplicity and Transparency: LLMFlows provides a simple, well-documented framework for building LLM applications with explicit and transparent components, making monitoring, maintenance, and debugging easier.
- Flexible Applications: Enables the creation of complex LLM-powered apps such as chatbots, question-answering systems, and agents with clear data flows and dependencies.
- Performance Optimization: Supports async flows to run LLMs in parallel, improving performance and efficiency.
- Integration Capabilities: Integrates with vector databases like Pinecone and supports custom callbacks for enhanced customization and control.
- User Control: Offers full control over LLM interactions with no hidden prompts or predefined behaviors, ensuring transparency in every component.
Future Potential:
- Scalability: Expected to facilitate the development of more sophisticated and scalable LLM applications.
- Community Engagement: Encourages contributions and feedback, potentially leading to a robust community-driven project.
- Expanded Use Cases: Likely to be used in various domains such as customer service, content generation, and educational tools due to its flexibility and transparency.
To explore the project further, check out the original stoyan-stoyanov/llmflows repository.
Attributions
Content derived from the stoyan-stoyanov/llmflows repository on GitHub. Original materials are licensed under their respective terms.