asitdave/Aircraft-Defect-Detection-and-Automated-Image-Captioning
Classifies aircraft surface damage (dent vs crack) using VGG16 and generates descriptive damage captions with BLIP for intelligent maintenance analysis.
# Aircraft Damage Classification & Captioning
## Overview
This project automates aircraft damage assessment from images using deep learning. It first classifies the damage type (dent vs. crack) with a pre-trained VGG16 model, then generates descriptive captions and summaries with the BLIP transformer. The workflow simulates the intelligent maintenance-support systems used in aviation safety and MRO (Maintenance, Repair, and Overhaul) operations.
## Key Features
- Classifies aircraft surface damage as dent or crack
- Uses VGG16 for feature extraction and transfer learning
- Generates human-readable damage captions and summaries via BLIP
- Includes training curves, evaluation metrics, and visual predictions
- Reproducible pipeline with dataset loading, preprocessing, and labeling
## Tech Stack
- TensorFlow / Keras
- VGG16 Pre-trained CNN
- BLIP Transformer (Hugging Face)
- ImageDataGenerator
- Matplotlib / NumPy
## Model Performance
- ~81% accuracy on the test set (5 epochs, frozen VGG16 base)
- Correct label predictions demonstrated on sample test images
- BLIP generates contextual captions and damage summaries
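The captioning stage might look like the sketch below. The checkpoint name `Salesforce/blip-image-captioning-base` and the optional text prompt are assumptions; the notebook may use a different BLIP variant or prompt.

```python
from PIL import Image
from transformers import BlipProcessor, BlipForConditionalGeneration

CHECKPOINT = "Salesforce/blip-image-captioning-base"  # assumed checkpoint
processor = BlipProcessor.from_pretrained(CHECKPOINT)
model = BlipForConditionalGeneration.from_pretrained(CHECKPOINT)

def caption_image(image, prompt=None):
    """Return a short caption for a PIL image; an optional prompt
    (e.g. 'a photo of aircraft damage') steers the generation."""
    inputs = processor(images=image, text=prompt, return_tensors="pt")
    output_ids = model.generate(**inputs, max_new_tokens=30)
    return processor.decode(output_ids[0], skip_special_tokens=True)
```

Passing a damage-oriented prompt is what turns generic captions into maintenance-style summaries.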
## Applications
- Automated inspection support
- Faster maintenance documentation
- Visual defect triaging systems
- Aviation safety research and prototypes
## Next Steps
- Fine-tune VGG16 layers for higher accuracy
- Expand dataset (different aircraft types, lighting, angles)
- Deploy as a web dashboard with file upload
- Integrate OCR for serial number tagging
## Acknowledgements
- VGG16: ImageNet pre-trained model
- BLIP: Salesforce / Hugging Face
- Aircraft dataset: CC BY 4.0
Created November 2, 2025
Updated November 2, 2025