Build a Flask ChatGPT API: Step-by-Step Guide with Free Hosting
If you’re looking to create a simple yet powerful backend service to interact with ChatGPT using Flask, you’re in the right place. In this guide, I’ll show you how to build a lightweight Flask ChatGPT API, and we’ll go through how to host it online—for free.
Step 1: Project Setup
Start by creating a new folder for your project and a virtual environment:
```shell
mkdir flask-chat-api
cd flask-chat-api
python -m venv venv
source venv/bin/activate  # On Windows use venv\Scripts\activate
```
Install the required packages:
```shell
pip install flask openai python-dotenv
```
Create the following file structure:
```
flask-chat-api/
│
├── app.py
├── .env
└── requirements.txt
```
Add your dependencies to requirements.txt:
```
flask
openai
python-dotenv
```
Step 2: Configure Environment Variables
In your .env file, store your OpenAI API key securely:
```
OPENAI_API_KEY=your_openai_key_here
```
We’ll use this key inside the app to authenticate requests to OpenAI.
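Because os.getenv simply returns None when a variable is missing, it's worth failing fast at startup rather than on the first request. A small sketch (the require_api_key helper is our own, not part of any library):

```python
import os

def require_api_key(env=os.environ):
    """Return the OpenAI key, or raise immediately if it is missing."""
    key = env.get("OPENAI_API_KEY")
    if not key:
        raise RuntimeError("OPENAI_API_KEY is not set; check your .env file")
    return key
```

Calling this once right after load_dotenv() turns a missing key into a clear startup error instead of a confusing failure on the first API call.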
Step 3: Create the Flask App
Now open app.py and add the following code:
```python
from flask import Flask, request, jsonify
from dotenv import load_dotenv
from openai import OpenAI
import os

load_dotenv()

app = Flask(__name__)
client = OpenAI(api_key=os.getenv("OPENAI_API_KEY"))

@app.route('/chat', methods=['POST'])
def chat():
    data = request.get_json()
    user_message = data.get("message")

    if not user_message:
        return jsonify({"error": "No message provided"}), 400

    try:
        response = client.chat.completions.create(
            model="gpt-3.5-turbo",
            messages=[
                {"role": "system", "content": "You are a helpful assistant."},
                {"role": "user", "content": user_message}
            ]
        )
        reply = response.choices[0].message.content
        return jsonify({"response": reply})
    except Exception as e:
        return jsonify({"error": str(e)}), 500

if __name__ == '__main__':
    app.run(debug=True)
```
Step 4: Run Your API Locally
To test your API, run:
```shell
flask run
```
Your API will be available at http://127.0.0.1:5000/chat. You can test it with a tool like Postman or curl:
```shell
curl -X POST http://127.0.0.1:5000/chat \
  -H "Content-Type: application/json" \
  -d '{"message": "Hello, who are you?"}'
```
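If you prefer testing from Python without installing anything extra, the same request can be built with the standard library (the build_chat_request helper below is just for illustration):

```python
import json
import urllib.request

def build_chat_request(message, url="http://127.0.0.1:5000/chat"):
    """Build the same POST request the curl command above sends."""
    payload = json.dumps({"message": message}).encode("utf-8")
    return urllib.request.Request(
        url,
        data=payload,
        headers={"Content-Type": "application/json"},
        method="POST",
    )

req = build_chat_request("Hello, who are you?")
# With the server running, urllib.request.urlopen(req) sends it
# and returns the JSON reply.
```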
Hosting Your Flask API
Once your Flask API is ready, you might want to deploy it online so others (or other services) can access it. Here are some great hosting platforms that support Flask and offer generous free tiers:
1. PythonAnywhere
- A simple platform to host Python apps.
- Easy setup with a browser-based environment.
- Great for small-scale APIs and testing.
- Free plan available.
2. Koyeb
- Modern serverless platform.
- Supports Docker and Git-based deployments.
- Fast global infrastructure.
- Free tier with generous limits.
3. Render
- Very easy to use, supports Git deployments.
- Great documentation and developer-friendly setup.
- Free plan with limited resources.
4. Google App Engine
- Part of Google Cloud Platform.
- Offers a free tier that works well for low-traffic apps.
- Supports Python natively.
5. Fly.io
- Deploy applications close to your users.
- Good performance and flexibility.
- Free tier includes small instances, perfect for testing APIs.
These platforms are great for deploying small Flask APIs like the one we just built. Choose the one that best fits your needs based on simplicity, performance, and scalability.
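Whichever platform you pick, keep in mind that Flask's built-in server is meant for development only; in production these platforms typically expect a WSGI server such as gunicorn as the start command. A typical setup might look like this (treat it as a sketch, not platform-specific instructions):

```shell
pip install gunicorn   # add gunicorn to requirements.txt as well
gunicorn app:app       # "app:app" means module name : Flask instance
```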
What’s Next?
You now have a fully working Flask ChatGPT API that you can run locally or host online. If you want to expand this, consider:
- Adding logging or analytics
- Rate-limiting or API keys
- Using GPT-4 instead of GPT-3.5
- Securing your endpoints with authentication
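As a starting point for the last two items, a simple shared-secret check can gate the /chat endpoint. The helper below is a generic sketch using only the standard library; the header name and how you wire it into Flask are up to you:

```python
import hmac

def is_authorized(provided_key, expected_key):
    """Compare keys in constant time to avoid leaking information via timing."""
    if not provided_key or not expected_key:
        return False
    return hmac.compare_digest(provided_key, expected_key)

# Inside the Flask route you could then check something like:
#   if not is_authorized(request.headers.get("X-API-Key", ""),
#                        os.getenv("API_KEY", "")):
#       return jsonify({"error": "Unauthorized"}), 401
```

hmac.compare_digest is preferable to == here because a plain string comparison can exit early, which in principle lets an attacker probe the key one character at a time.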