pantho0/visionary-ai
A tool for generating creative images from text prompts.
🤖 Visionary AI: Text-to-Image Generation Platform
A state-of-the-art text-to-image generation platform built to transform text prompts into stunning visual art using two specialized third-party AI models. The system is designed for high performance and reliability, using Node.js, Express.js, TypeScript, MongoDB, Mongoose, and Next.js.
🏗️ System Architecture & Workflow
The platform employs a robust microservice-like workflow to handle image generation requests efficiently.
- Client Request (Next.js): The user submits a text prompt.
- API Gateway (Express/TS): The backend receives the prompt.
- API Call: The backend forwards the prompt to one of the two AI model endpoints: the DeAPI third-party API or the Cloudflare Worker.
- AI Processing: The AI model generates the image.
- Image Return & Storage: The image is sent back to the backend, which then uploads and saves it to ImageBB for permanent hosting.
- Response: The backend returns the hosted image URL and relevant metadata to the Next.js frontend.
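The workflow above can be sketched in TypeScript. This is a hedged illustration only: the function names, endpoint URL, and response shapes are assumptions, not the repository's actual code, and the network and upload calls are injected so the flow can be shown without any real service.

```typescript
// Illustrative sketch of the prompt -> AI model -> ImageBB -> URL flow.
// All names and shapes here are assumptions, not the real backend code.

type Provider = "deapi" | "cloudflare";

// Pick a provider endpoint; a real setup might load-balance or fail over.
function providerUrl(provider: Provider): string {
  return provider === "deapi"
    ? "https://api.deapi.example/generate"          // hypothetical URL
    : process.env.CLOUDFLARE_WORKER_ENDPOINT ?? ""; // from .env
}

// Core flow: send the prompt, upload the returned image, hand back the
// hosted URL plus metadata. `http` and `upload` are injected so the flow
// is testable without touching DeAPI or ImageBB.
async function generateImage(
  prompt: string,
  provider: Provider,
  http: (url: string, body: unknown) => Promise<{ imageBase64: string }>,
  upload: (imageBase64: string) => Promise<{ url: string }>
): Promise<{ prompt: string; imageUrl: string }> {
  const { imageBase64 } = await http(providerUrl(provider), { prompt });
  const { url } = await upload(imageBase64);
  return { prompt, imageUrl: url };
}
```

In the real backend the `http` call would hit DeAPI or the Worker and `upload` would POST to the ImageBB API; injecting them keeps the flow itself easy to unit-test.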
💻 Tech Stack ✨
This project leverages a full-stack JavaScript ecosystem, focusing on type safety, asynchronous processing, and performance.
Core Technologies
| Component | Technologies Used |
|---|---|
| Frontend | Next.js (Server-Side Rendering/Full-stack) |
| Backend | Node.js with Express.js |
| Language | TypeScript (For both frontend and backend) |
| Database | MongoDB with Mongoose (ODM) |
| AI Model API | Cloudflare Workers (API calls) |
| AI Model API | DeAPI (API calls) |
| Image Hosting | ImageBB (External image storage) |
🚀 Key Features
✅ High-Quality Generation
- Intuitive Prompt Input: Simple interface for users to enter their creative text prompts.
- AI Model Integration: Seamless connection to the specialized DeAPI third-party API for state-of-the-art image synthesis.
💾 Data & Asset Management
- Persistent Storage: Generated images are securely uploaded and stored on ImageBB for reliable long-term access.
- Image History: Saves user prompts and the resulting image URLs in MongoDB, allowing users to revisit their creations.
🌐 Performance & Scalability
- Cloudflare Workers: We utilize this serverless compute environment as a dedicated, low-latency API endpoint.
- DeAPI (Third-Party AI Model): This is the high-performance engine responsible for the actual text-to-image synthesis.
- Type Safety: TypeScript throughout the stack minimizes runtime errors and improves code maintainability.
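To illustrate the Worker's role as a low-latency endpoint, here is a minimal hedged sketch of a Cloudflare Worker that validates the incoming prompt before forwarding it to the model. The request payload shape is an assumption, and the actual forwarding logic of the deployed Worker is not shown.

```typescript
// Hypothetical sketch of the Cloudflare Worker endpoint; the payload
// shape and the forwarding step are assumptions, not the deployed code.
const worker = {
  async fetch(request: Request): Promise<Response> {
    if (request.method !== "POST") {
      return new Response("Method Not Allowed", { status: 405 });
    }
    const { prompt } = (await request.json()) as { prompt?: string };
    if (!prompt) {
      return new Response(JSON.stringify({ error: "prompt required" }), {
        status: 400,
        headers: { "Content-Type": "application/json" },
      });
    }
    // In the real Worker, the prompt would be forwarded to the AI model
    // here and the generated image returned; omitted in this sketch.
    return new Response(JSON.stringify({ ok: true, prompt }), {
      headers: { "Content-Type": "application/json" },
    });
  },
};

export default worker;
```

The `export default { fetch }` shape matches the standard Cloudflare Workers module syntax, so a sketch like this could be deployed with `wrangler` once the forwarding step is filled in.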
<p align="center">
<img src="https://img.shields.io/badge/ACTION%20REQUIRED-Create%20a%20.env%20file%20and%20follow%20.env.example-red?style=for-the-badge&labelColor=black"/>
</p>
🔑 Required Environment Variables
To run the project, you must set up your environment file (.env) with the necessary API keys and database connection string.
| Variable Name | Description |
|---|---|
| MONGO_URI | Connection string for your MongoDB database. |
| DEAPI_KEY | API key for the third-party text-to-image service. |
| IMAGEBB_API_KEY | API key for uploading images to ImageBB. |
| CLOUDFLARE_WORKER_ENDPOINT | Endpoint URL for the deployed Cloudflare Worker instance. |
| JWT_SECRET (Optional) | Secret key for generating JSON Web Tokens. |
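A matching `.env.example` could look like the fragment below. All values are placeholders; the real keys come from your own MongoDB, DeAPI, ImageBB, and Cloudflare accounts.

```env
MONGO_URI=mongodb+srv://<user>:<password>@<cluster>/visionary-ai
DEAPI_KEY=your_deapi_key_here
IMAGEBB_API_KEY=your_imagebb_key_here
CLOUDFLARE_WORKER_ENDPOINT=https://your-worker.your-subdomain.workers.dev
JWT_SECRET=your_jwt_secret_here
```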
📥 Clone and Run
To get the project running locally, you'll need to set up both the backend (Node/Express/Mongo) and the frontend (Next.js).
1. Clone the Repositories

```bash
git clone https://github.com/pantho0/visionary-ai-client.git
git clone https://github.com/pantho0/visionary-ai.git
```

2. Backend Setup

Navigate to the backend directory, install dependencies, and start the server:

```bash
cd visionary-ai
npm install
npm run dev # Or 'npm run start' depending on your setup
```

3. Frontend Setup (Visionary AI)

Navigate back to the frontend directory, install dependencies, and run the development server:

```bash
cd ../visionary-ai-client
npm install
npm run dev
```
The frontend application will be accessible at http://localhost:3000 (or the port specified by Next.js).