An end-to-end AI pipeline for managing garment data and generating outfit images using generative AI.
```
┌─────────────────┐     ┌─────────────────┐     ┌─────────────────┐
│     Airflow     │────▶│   PostgreSQL    │◀────│     FastAPI     │
│ (ETL Pipeline)  │     │   (Database)    │     │   (REST API)    │
└─────────────────┘     └─────────────────┘     └────────┬────────┘
                                                         │
                                                         ▼
                                                ┌─────────────────┐
                                                │ Gemini / GenAI  │
                                                │(Image Generation)│
                                                └─────────────────┘
```
- Python – core language
- FastAPI – REST API service
- Airflow – data pipeline orchestration
- PostgreSQL – relational database
- Docker / Docker Compose – containerization
- Google Gemini + Pollinations.ai – generative AI for outfit image generation
- Linux: Docker Engine 24+ and Docker Compose v2+
- macOS / Windows: Docker Desktop (includes Docker Compose)
- A Google Gemini API key (free at https://aistudio.google.com/apikey)
```bash
git clone https://github.com/CosminDanielSolomon/makeitreal.git
cd makeitreal
```

Create your `.env` file from the example:

```bash
cp .env.example .env
```

Edit `.env` and replace `your_gemini_api_key_here` with your actual Gemini API key:

```
GEMINI_API_KEY=your_actual_key_here
```

Start the stack:

```bash
docker compose up
```

This will start all services:

- PostgreSQL on port 5432
- FastAPI on port 8000
- Airflow Webserver on port 8080
- Airflow Scheduler (background)
Wait about 1-2 minutes on first run for all services to stabilize.
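Rather than waiting a fixed interval, you can poll until the stack answers. A minimal readiness probe, assuming only that an HTTP endpoint (such as the API health check) returns 200 once its service is up; the function name and timeout values are illustrative:

```python
import time
import urllib.error
import urllib.request


def wait_for_service(url: str, timeout: float = 120.0, interval: float = 2.0) -> bool:
    """Poll `url` until it answers HTTP 200 or `timeout` seconds elapse."""
    deadline = time.monotonic() + timeout
    while time.monotonic() < deadline:
        try:
            with urllib.request.urlopen(url, timeout=5) as resp:
                if resp.status == 200:
                    return True
        except (urllib.error.URLError, OSError):
            pass  # service not up yet; retry after a short pause
        time.sleep(interval)
    return False
```

For example, `wait_for_service("http://localhost:8000/health")` blocks until FastAPI is reachable.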
The pipeline must run first to populate the database before using the API.
- Open the Airflow UI at http://localhost:8080
- Log in with `admin` / `admin`
- Find the `garment_pipeline` DAG
- Toggle it ON using the switch on the left
- Click the ▶ Play button to trigger it manually
- Wait for all 4 tasks to turn green:
```
validate_csv → load_raw_data → build_outfit_profiles → verify_profiles
```
This populates the user_outfit_profile table in PostgreSQL.
Once the pipeline has run, the API is available at http://localhost:8000
```bash
curl http://localhost:8000/health
```

Response:

```json
{"status": "ok", "service": "MakeItReal API"}
```

```bash
curl http://localhost:8000/user/1
```

Response:

```json
{
  "user_id": 1,
  "preferred_upper_garment": "jacket",
  "preferred_lower_garment": "jeans",
  "garment_style": "casual",
  "garment_color": "black"
}
```

Available user IDs: 1 (Alice), 2 (Bob), 3 (Carol), 4 (David), 5 (Emma)
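A client can consume this payload directly. A minimal sketch, assuming only the response schema shown above; the helper names and error handling are illustrative, not part of the API:

```python
import json
import urllib.request

# Fields documented in the /user/{id} response above.
REQUIRED_FIELDS = {
    "user_id",
    "preferred_upper_garment",
    "preferred_lower_garment",
    "garment_style",
    "garment_color",
}


def parse_profile(payload: str) -> dict:
    """Decode a /user/{id} response body and check the documented fields exist."""
    profile = json.loads(payload)
    missing = REQUIRED_FIELDS - profile.keys()
    if missing:
        raise ValueError(f"profile missing fields: {sorted(missing)}")
    return profile


def fetch_profile(user_id: int, base_url: str = "http://localhost:8000") -> dict:
    """GET /user/{user_id} and validate the body (requires the stack running)."""
    with urllib.request.urlopen(f"{base_url}/user/{user_id}") as resp:
        return parse_profile(resp.read().decode())
```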
```bash
curl -X POST http://localhost:8000/generate-image \
  -H "Content-Type: application/json" \
  -d '{"user_id": 1}'
```

Response:

```json
{
  "user_id": 1,
  "prompt": "Fashion model wearing a black jacket and jeans, casual style, studio lighting, editorial fashion photography, high quality, professional fashion shoot.",
  "image_url": "https://image.pollinations.ai/prompt/...",
  "image_base64": null,
  "message": "Image generated successfully via pollinations-url."
}
```

The `image_url` can be opened directly in any browser to view the generated outfit image.
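The `image_url` pattern hints at how a profile becomes a link: the prompt is URL-encoded into the Pollinations path. A sketch, assuming the template wording from the sample response above — the actual prompt template lives in `api/services/genai.py` and may differ:

```python
from urllib.parse import quote


def build_prompt(profile: dict) -> str:
    """Assemble a fashion prompt from an outfit profile (template is illustrative)."""
    return (
        f"Fashion model wearing a {profile['garment_color']} "
        f"{profile['preferred_upper_garment']} and "
        f"{profile['preferred_lower_garment']}, "
        f"{profile['garment_style']} style, studio lighting, "
        "editorial fashion photography, high quality, professional fashion shoot."
    )


def pollinations_url(prompt: str) -> str:
    """Pollinations.ai serves an image straight from a URL-encoded prompt path."""
    return "https://image.pollinations.ai/prompt/" + quote(prompt)
```

Because the image is addressed purely by URL, the API can hand back `image_url` and let the browser do the fetch — which is what makes the URL-only fallback response possible.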
FastAPI provides automatic interactive documentation:
- Swagger UI: http://localhost:8000/docs
- ReDoc: http://localhost:8000/redoc
The system uses a two-provider strategy for image generation:
| Provider | Model | Plan Required |
|---|---|---|
| Google Imagen 4 | `imagen-4.0-generate-001` | Paid Gemini API plan |
| Pollinations.ai | Stable Diffusion | Free, no API key |
The service attempts providers in order and falls back automatically. The fashion prompt is always returned in the response regardless of which provider succeeds, so the image can also be generated manually by opening image_url in a browser.
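The fallback behavior can be sketched as an ordered chain of provider callables. The provider names and error handling below are illustrative, not the actual `genai.py` code:

```python
from typing import Callable, Optional


def generate_with_fallback(
    prompt: str,
    providers: list[tuple[str, Callable[[str], str]]],
) -> tuple[Optional[str], Optional[str]]:
    """Try each (name, provider) in order; return (provider_name, image_url)
    from the first one that succeeds, or (None, None) if all fail.
    The prompt itself stays available to the caller either way."""
    for name, provider in providers:
        try:
            return name, provider(prompt)
        except Exception:
            continue  # provider unavailable (e.g. no paid plan); try the next one
    return None, None
```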
The integration uses the official google-genai SDK (the new replacement for the deprecated google-generativeai package).
```
users                     -- registered users
garments                  -- garment catalog (upper/lower types)
user_garment_preferences  -- user ↔ garment many-to-many relationships
user_outfit_profile       -- aggregated profile, populated by Airflow pipeline
```

The `garment_pipeline` DAG runs `@daily` and consists of 4 tasks:

```
validate_csv → load_raw_data → build_outfit_profiles → verify_profiles
```
- `validate_csv` — reads and validates `garments.csv`, shares data via XCom
- `load_raw_data` — idempotently loads users, garments, and preferences into PostgreSQL
- `build_outfit_profiles` — aggregates data into the `user_outfit_profile` table
- `verify_profiles` — sanity check that the output table has been populated
All tasks use ON CONFLICT DO NOTHING or ON CONFLICT DO UPDATE to ensure full idempotency.
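The idempotency guarantee is easy to demonstrate with the same `ON CONFLICT` syntax. This sketch runs against SQLite, which supports the PostgreSQL-style upsert clause; the table and column names are simplified stand-ins for the real schema in `db/init.sql`:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE users (user_id INTEGER PRIMARY KEY, name TEXT)")


def load_users(rows):
    """Re-runnable load: rows with duplicate keys are silently skipped."""
    conn.executemany(
        "INSERT INTO users (user_id, name) VALUES (?, ?) "
        "ON CONFLICT (user_id) DO NOTHING",
        rows,
    )
    conn.commit()


rows = [(1, "Alice"), (2, "Bob")]
load_users(rows)
load_users(rows)  # second run is a no-op: still exactly 2 rows
count = conn.execute("SELECT COUNT(*) FROM users").fetchone()[0]
```

Running the load twice leaves `count` at 2, which is why the DAG can be re-triggered safely.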
```
makeitreal/
├── docker-compose.yml
├── .env                          # secrets (never committed)
├── .env.example                  # template for environment variables
├── README.md
├── airflow/
│   ├── Dockerfile
│   ├── dags/
│   │   └── garment_pipeline.py   # Airflow DAG (4 tasks)
│   └── data/
│       └── garments.csv          # source data (5 users, 10 rows)
├── api/
│   ├── Dockerfile
│   ├── requirements.txt
│   ├── main.py                   # FastAPI entrypoint
│   ├── models.py                 # SQLAlchemy ORM + Pydantic schemas
│   ├── database.py               # DB connection and session management
│   ├── routers/
│   │   ├── users.py              # GET /user/{user_id}
│   │   └── images.py             # POST /generate-image
│   └── services/
│       └── genai.py              # Gemini + Pollinations.ai integration
└── db/
    └── init.sql                  # PostgreSQL schema (4 tables)
```
To stop the services:

```bash
docker compose down
```

To also delete all data volumes (full reset):

```bash
docker compose down -v
```