
MakeItReal – AI Engineering Challenge

An end-to-end AI pipeline for managing garment data and generating outfit images using generative AI.

Architecture

┌─────────────────┐     ┌─────────────────┐     ┌─────────────────┐
│    Airflow      │────▶│   PostgreSQL    │◀────│    FastAPI      │
│  (ETL Pipeline) │     │   (Database)    │     │  (REST API)     │
└─────────────────┘     └─────────────────┘     └────────┬────────┘
                                                          │
                                                          ▼
                                                 ┌──────────────────┐
                                                 │  Gemini / GenAI  │
                                                 │(Image Generation)│
                                                 └──────────────────┘

Stack

  • Python – core language
  • FastAPI – REST API service
  • Airflow – data pipeline orchestration
  • PostgreSQL – relational database
  • Docker / Docker Compose – containerization
  • Google Gemini + Pollinations.ai – generative AI for outfit image generation

Prerequisites

  • Linux: Docker Engine 24+ and Docker Compose v2+
  • macOS / Windows: Docker Desktop (includes Docker Compose)
  • A Google Gemini API key (free at https://aistudio.google.com/apikey)

Setup

1. Clone the repository

git clone https://github.com/CosminDanielSolomon/makeitreal.git
cd makeitreal

2. Configure environment variables

Create your .env file from the example:

cp .env.example .env

Edit .env and replace your_gemini_api_key_here with your actual Gemini API key:

GEMINI_API_KEY=your_actual_key_here

3. Start the system

docker compose up

This will start all services:

  • PostgreSQL on port 5432
  • FastAPI on port 8000
  • Airflow Webserver on port 8080
  • Airflow Scheduler (background)

Wait about 1-2 minutes on first run for all services to stabilize.
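On first boot the containers take a while to become healthy, so scripts should not hit the API immediately. A minimal readiness check, using only the standard library (the 60 × 2 s budget is an arbitrary choice, not something the repository prescribes):

```python
import time
import urllib.request

def wait_for_service(url: str, attempts: int = 60, delay: float = 2.0) -> bool:
    """Poll `url` until it answers HTTP 200, or give up after `attempts` tries."""
    for _ in range(attempts):
        try:
            with urllib.request.urlopen(url, timeout=2) as resp:
                if resp.status == 200:
                    return True
        except OSError:
            pass  # service not up yet; retry after a short delay
        time.sleep(delay)
    return False

# Example (with the stack running):
#   wait_for_service("http://localhost:8000/health")  # True once FastAPI is up
```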

Usage

Step 1 — Run the Airflow Pipeline

The pipeline must run first to populate the database before using the API.

  1. Open the Airflow UI at http://localhost:8080
  2. Login with admin / admin
  3. Find the garment_pipeline DAG
  4. Toggle it ON using the switch on the left
  5. Click the ▶ Play button to trigger it manually
  6. Wait for all 4 tasks to turn green:
   validate_csv → load_raw_data → build_outfit_profiles → verify_profiles

This populates the user_outfit_profile table in PostgreSQL.
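The same trigger can be issued without the UI through Airflow's stable REST API. A sketch, assuming the webserver's basic-auth backend and the admin / admin credentials from the setup above:

```python
import base64
import json
import urllib.request

def build_trigger_request(dag_id: str,
                          base: str = "http://localhost:8080") -> urllib.request.Request:
    """Build a POST request for Airflow's stable REST API to start a DAG run."""
    req = urllib.request.Request(
        f"{base}/api/v1/dags/{dag_id}/dagRuns",
        data=json.dumps({"conf": {}}).encode(),
        method="POST",
    )
    req.add_header("Content-Type", "application/json")
    req.add_header("Authorization",
                   "Basic " + base64.b64encode(b"admin:admin").decode())
    return req

# Example (with Airflow running):
#   urllib.request.urlopen(build_trigger_request("garment_pipeline"))
```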

Step 2 — Use the API

Once the pipeline has run, the API is available at http://localhost:8000

Health Check

curl http://localhost:8000/health

Response:

{"status": "ok", "service": "MakeItReal API"}

Get User Outfit Profile

curl http://localhost:8000/user/1

Response:

{
  "user_id": 1,
  "preferred_upper_garment": "jacket",
  "preferred_lower_garment": "jeans",
  "garment_style": "casual",
  "garment_color": "black"
}

Available user IDs: 1 (Alice), 2 (Bob), 3 (Carol), 4 (David), 5 (Emma)
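A small client can walk all five profiles in one go. This is an illustrative helper, not code from the repository; `describe` just summarizes the JSON payload shown above:

```python
import json
import urllib.request

def get_profile(user_id: int, base: str = "http://localhost:8000") -> dict:
    """Fetch one user's outfit profile from the running API."""
    with urllib.request.urlopen(f"{base}/user/{user_id}") as resp:
        return json.load(resp)

def describe(profile: dict) -> str:
    """One-line summary of a profile payload like the response above."""
    return (f"user {profile['user_id']}: {profile['garment_color']} "
            f"{profile['preferred_upper_garment']} + "
            f"{profile['preferred_lower_garment']} ({profile['garment_style']})")

# Example (with the stack running and the pipeline already executed):
#   for uid in range(1, 6):          # users 1-5
#       print(describe(get_profile(uid)))
```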

Generate Outfit Image

curl -X POST http://localhost:8000/generate-image \
  -H "Content-Type: application/json" \
  -d '{"user_id": 1}'

Response:

{
  "user_id": 1,
  "prompt": "Fashion model wearing a black jacket and jeans, casual style, studio lighting, editorial fashion photography, high quality, professional fashion shoot.",
  "image_url": "https://image.pollinations.ai/prompt/...",
  "image_base64": null,
  "message": "Image generated successfully via pollinations-url."
}

The image_url can be opened directly in any browser to view the generated outfit image.
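The image_url field follows Pollinations.ai's prompt-in-the-URL scheme, so an equivalent link can be built locally from any prompt. A sketch (the real service may accept extra query parameters not shown here):

```python
from urllib.parse import quote

def pollinations_url(prompt: str) -> str:
    """Percent-encode a prompt into a Pollinations image URL,
    the same shape as the image_url field above."""
    return "https://image.pollinations.ai/prompt/" + quote(prompt)

# pollinations_url("black jacket and jeans, casual style")
# -> "https://image.pollinations.ai/prompt/black%20jacket%20and%20jeans%2C%20casual%20style"
```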

Interactive API Docs

FastAPI provides automatic interactive documentation:

  • Swagger UI: http://localhost:8000/docs
  • ReDoc: http://localhost:8000/redoc

Image Generation

The system uses a two-provider strategy for image generation:

Provider          Model                      Plan Required
Google Imagen 4   imagen-4.0-generate-001    Paid Gemini API plan
Pollinations.ai   Stable Diffusion           Free, no API key

The service attempts providers in order and falls back automatically. The fashion prompt is always returned in the response regardless of which provider succeeds, so the image can also be generated manually by opening image_url in a browser.

The integration uses the official google-genai SDK (the new replacement for the deprecated google-generativeai package).
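The fallback behaviour can be sketched provider-agnostically. Names and signatures here are illustrative, not the repository's actual genai.py code:

```python
from typing import Callable, Iterable, List, Tuple

def generate_with_fallback(
    prompt: str,
    providers: Iterable[Tuple[str, Callable[[str], str]]],
) -> Tuple[str, str]:
    """Try each (name, generate_fn) pair in order; return the first success.

    Each generate_fn takes the prompt and returns an image URL, or raises
    on failure (quota exhausted, network error, ...)."""
    errors: List[str] = []
    for name, generate in providers:
        try:
            return name, generate(prompt)
        except Exception as exc:  # provider failed; fall through to the next
            errors.append(f"{name}: {exc}")
    raise RuntimeError("all providers failed: " + "; ".join(errors))

# Example with stand-in providers:
#   generate_with_fallback(p, [("imagen", call_imagen),
#                              ("pollinations", call_pollinations)])
```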

Database Schema

users                    -- registered users
garments                 -- garment catalog (upper/lower types)
user_garment_preferences -- user ↔ garment many-to-many relationships
user_outfit_profile      -- aggregated profile, populated by Airflow pipeline

Airflow Pipeline

The garment_pipeline DAG runs @daily and consists of 4 tasks:

validate_csv → load_raw_data → build_outfit_profiles → verify_profiles
  • validate_csv — reads and validates garments.csv, shares data via XCom
  • load_raw_data — idempotently loads users, garments and preferences into PostgreSQL
  • build_outfit_profiles — aggregates data into user_outfit_profile table
  • verify_profiles — sanity check that the output table has been populated

All tasks use ON CONFLICT DO NOTHING or ON CONFLICT DO UPDATE to ensure full idempotency.
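The upsert pattern behind that idempotency looks roughly like this, demonstrated with in-memory SQLite, which accepts the same ON CONFLICT clause as PostgreSQL (the column list is trimmed for brevity; the real table has more fields):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("""
    CREATE TABLE user_outfit_profile (
        user_id INTEGER PRIMARY KEY,
        preferred_upper_garment TEXT,
        preferred_lower_garment TEXT
    )
""")

upsert = """
    INSERT INTO user_outfit_profile
        (user_id, preferred_upper_garment, preferred_lower_garment)
    VALUES (?, ?, ?)
    ON CONFLICT (user_id) DO UPDATE SET
        preferred_upper_garment = excluded.preferred_upper_garment,
        preferred_lower_garment = excluded.preferred_lower_garment
"""

# Re-running the load changes nothing but the values: still one row per user.
for _ in range(2):
    conn.execute(upsert, (1, "jacket", "jeans"))

count = conn.execute("SELECT COUNT(*) FROM user_outfit_profile").fetchone()[0]
```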

Project Structure

makeitreal/
├── docker-compose.yml
├── .env                          # secrets (never committed)
├── .env.example                  # template for environment variables
├── README.md
├── airflow/
│   ├── Dockerfile
│   ├── dags/
│   │   └── garment_pipeline.py  # Airflow DAG (4 tasks)
│   └── data/
│       └── garments.csv         # source data (5 users, 10 rows)
├── api/
│   ├── Dockerfile
│   ├── requirements.txt
│   ├── main.py                  # FastAPI entrypoint
│   ├── models.py                # SQLAlchemy ORM + Pydantic schemas
│   ├── database.py              # DB connection and session management
│   ├── routers/
│   │   ├── users.py             # GET /user/{user_id}
│   │   └── images.py            # POST /generate-image
│   └── services/
│       └── genai.py             # Gemini + Pollinations.ai integration
└── db/
    └── init.sql                 # PostgreSQL schema (4 tables)

Stopping the System

docker compose down

To also delete all data volumes (full reset):

docker compose down -v
