Getting Started with Python Generative AI: Six Practical Projects Using Llama 2, LangChain, Streamlit, Gradio, FastAPI and SQL
This article presents six hands‑on Python generative‑AI projects—ranging from a Llama 2 chatbot built with Streamlit and Replicate to natural‑language‑to‑SQL conversion using LlamaIndex and SQLAlchemy—complete with environment setup, required code snippets, deployment tips and resource links for further exploration.
Using Llama 2, Streamlit and Replicate to Build a Chatbot
Clone the repository, create a virtual environment, install the requirements, and store your Replicate API token in secrets.toml. Run the app with streamlit run streamlit_app_v2.py and adjust model parameters via the UI.
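Under the hood, apps like this flatten the chat history into a single prompt string and stream the model's reply from Replicate. A minimal sketch of that pattern — the model slug, prompt format, and parameter names below are assumptions, not taken from the repo:

```python
# Sketch of the core generation step behind a Streamlit + Replicate
# Llama 2 chatbot. Prompt format and model slug are assumptions.

def build_prompt(history, system="You are a helpful assistant."):
    """Flatten a list of {'role', 'content'} messages into one prompt string."""
    lines = [system]
    for msg in history:
        speaker = "User" if msg["role"] == "user" else "Assistant"
        lines.append(f"{speaker}: {msg['content']}")
    lines.append("Assistant:")  # cue the model to answer next
    return "\n".join(lines)

def generate_reply(history, temperature=0.7):
    """Stream a completion from Replicate (needs REPLICATE_API_TOKEN set)."""
    import replicate  # deferred so build_prompt stays dependency-free
    chunks = replicate.run(
        "meta/llama-2-7b-chat",  # hypothetical slug; check Replicate's catalog
        input={"prompt": build_prompt(history), "temperature": temperature},
    )
    return "".join(chunks)
```

In the Streamlit app, `history` would come from `st.session_state` and the temperature value from the sidebar sliders.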
<code>git clone https://github.com/dataprofessor/llama2.git</code>
<code>python -m venv env
source env/bin/activate
pip install -r requirements.txt</code>
<code>REPLICATE_API_TOKEN = "your_replicate_token"</code>
<code>streamlit run streamlit_app_v2.py</code>
Using Matplotlib, Streamlit and OpenAI to Visualize Data
Clone the chat-with-your-data repo, set up a virtual environment, export your OpenAI key, install dependencies, and launch the Streamlit app.
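The chat-with-your-data pattern generally sends the user's question together with the dataset's schema to the model and asks it for plotting code. A rough sketch — the prompt wording, function names, and model choice are my own assumptions:

```python
# Sketch of asking an OpenAI model to write matplotlib code for a dataset.
# Prompt wording and helper names are illustrative assumptions.

def make_plot_prompt(question, columns):
    """Combine the user's question with the dataset schema."""
    return (
        "You are given a pandas DataFrame named df with columns: "
        + ", ".join(columns)
        + f".\nWrite matplotlib code that answers: {question}\n"
        "Return only Python code."
    )

def ask_model(question, columns, model="gpt-3.5-turbo"):
    """Send the prompt to OpenAI (reads OPENAI_API_KEY from the environment)."""
    from openai import OpenAI  # deferred import; requires the openai package
    client = OpenAI()
    resp = client.chat.completions.create(
        model=model,
        messages=[{"role": "user", "content": make_plot_prompt(question, columns)}],
    )
    return resp.choices[0].message.content
```

The returned code would then be executed against the loaded DataFrame and the resulting figure rendered with Streamlit's pyplot support.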
<code>git clone https://github.com/michaelweiss/chat-with-your-data.git</code>
<code>export OPENAI_API_KEY="your_open_ai_key"
pip install openai pandas streamlit matplotlib
streamlit run chat_data.py</code>
Using OpenAI, LangChain and Chainlit to Query Text Documents
Install the required packages, set your OpenAI key, and run the Chainlit app. The framework provides decorators such as @cl.on_chat_start for initializing a session and @cl.on_message for handling user input.
<code>pip install python-dotenv langchain chromadb tiktoken chainlit openai
chainlit run -w qa.py</code>
Using LangChain, OpenAI and Gradio to Query Saved Document Collections
Clone the repository, create a virtual environment, install the dependencies, and start the Gradio interface with python app.py. You can also run a CLI version via python cli_app.py. Adjust the data loader to handle PDFs, DOCX or TXT files as needed.
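Extending the data loader to more file types usually comes down to dispatching on the file extension. A sketch of that dispatch; the class names listed are the commonly used LangChain community loaders, but verify them against the version pinned in the repo:

```python
from pathlib import Path

# Map file extensions to LangChain document-loader class names.
# Treat the exact class names as assumptions to check against your
# installed langchain version.
LOADERS = {
    ".pdf": "PyPDFLoader",
    ".docx": "Docx2txtLoader",
    ".txt": "TextLoader",
}

def pick_loader(path):
    """Return the loader class name for a file, or raise for unknown types."""
    suffix = Path(path).suffix.lower()
    try:
        return LOADERS[suffix]
    except KeyError:
        raise ValueError(f"No loader registered for {suffix!r} files")
```

In the app itself, the returned name would correspond to an actual loader class imported from langchain, instantiated with the file path and asked to `.load()` the documents.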
<code>pip install openai pandas streamlit matplotlib
python app.py</code>
Deploying Gradio Applications
Deploy to Hugging Face Spaces or Streamlit Community Cloud. For password‑protected deployments, wrap the launch call as gr.ChatInterface(predict).queue().launch(auth=("user","pass")).
Using LangChain, OpenAI and FastAPI for LLM‑Powered Web Research
Install the project dependencies, create a .env file with your OpenAI key, and start the server with uvicorn main:app --reload. The app uses the Tavily search engine and provides a web UI for query‑driven research.
<code>pip install -r requirements.txt
uvicorn main:app --reload</code>
Using LlamaIndex, SQLAlchemy and OpenAI to Convert Natural Language to SQL
Set up an in‑memory SQLite database, populate it with sample data, and create an NLSQLTableQueryEngine to translate NL queries into SQL. Run the script with python app.py to see the generated SQL and answer.
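A sketch of the two halves of that setup: the in-memory database (shown here with the stdlib sqlite3 module) and the LlamaIndex hookup (deferred imports; the table name and sample rows are invented for illustration, and import paths vary across llama-index versions):

```python
import sqlite3

def make_demo_db():
    """Create an in-memory SQLite database with invented sample data."""
    conn = sqlite3.connect(":memory:")
    conn.execute("CREATE TABLE city_stats (city TEXT, population INTEGER)")
    conn.executemany(
        "INSERT INTO city_stats VALUES (?, ?)",
        [("Tokyo", 13_960_000), ("Toronto", 2_930_000)],
    )
    return conn

def build_query_engine(db_url="sqlite:///:memory:"):
    """Wire a database into LlamaIndex's NL-to-SQL engine (deferred imports).

    Import paths below match llama-index >= 0.10; older releases expose the
    same classes from the top-level llama_index package.
    """
    from sqlalchemy import create_engine
    from llama_index.core import SQLDatabase
    from llama_index.core.query_engine import NLSQLTableQueryEngine

    sql_db = SQLDatabase(create_engine(db_url), include_tables=["city_stats"])
    return NLSQLTableQueryEngine(sql_database=sql_db, tables=["city_stats"])
```

Calling `.query("Which city has the largest population?")` on the engine would then emit the generated SQL and a natural-language answer.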
<code>pip install openai sqlalchemy llama-index
python app.py</code>
Additional Generative‑AI Project Resources
The article lists further resources such as Shiny for Python chat streams, more Streamlit examples, WASM‑based chatbots, and various LangChain and Chainlit cookbooks, providing links to GitHub repositories and documentation for deeper exploration.
Python Programming Learning Circle
A global community of Chinese Python developers offering technical articles, columns, original video tutorials, and problem sets. Topics include web full‑stack development, web scraping, data analysis, natural language processing, image processing, machine learning, automated testing, DevOps automation, and big data.