Hackathon Submission: Automated detection of deforestation and land-use changes in the Amazon rainforest using OpenAI GPT-4.1 vision models and satellite imagery.
Note: the MODIS dataset is coarse, with roughly 500 m resolution, so a much sharper dataset is needed for detailed change detection.
This system automatically detects environmental changes in the Amazon rainforest by:
- Fetching current satellite imagery from Google Earth Engine (Sentinel-2)
- Comparing with historical data (MODIS 2020 baseline)
- Using AI vision models to identify deforestation, construction, and land-use changes
- Generating detailed reports with coordinates and analysis
- AI-Powered Analysis: GPT-4.1 vision model analyzes satellite imagery
- Real Satellite Data: Live Sentinel-2 imagery via Google Earth Engine
- Automated Detection: Identifies deforestation, roads, and urban expansion
- Geographic Validation: Analysis constrained to Amazon rainforest boundaries (see the sketch after this list)
- Batch Processing: Analyzes multiple locations automatically
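
The geographic validation relies on the Amazon boundary shapefile bundled in `amazonia_boundary_proposal/`. Below is a minimal sketch of how valid analysis points could be rejection-sampled inside that boundary with geopandas and shapely; the helper name and sampling strategy are illustrative and may differ from the notebook's `getAmazoniaAnalysisPoints()`.

```python
import random
import geopandas as gpd
from shapely.geometry import Point

# Boundary shapefile shipped with the repository.
AMAZON_SHP = "amazonia_boundary_proposal/amazonia_polygons.shp"

def random_amazon_points(n, shp_path=AMAZON_SHP, seed=None):
    """Rejection-sample n random (lon, lat) points inside the Amazon boundary."""
    rng = random.Random(seed)
    boundary = gpd.read_file(shp_path).to_crs(epsg=4326)
    geom = boundary.geometry.unary_union          # merge polygons into one geometry
    minx, miny, maxx, maxy = geom.bounds
    points = []
    while len(points) < n:
        candidate = Point(rng.uniform(minx, maxx), rng.uniform(miny, maxy))
        if geom.contains(candidate):              # keep only points inside the biome
            points.append((candidate.x, candidate.y))
    return points

print(random_amazon_points(3, seed=42))
```

Rejection sampling keeps the logic simple: draw a point inside the bounding box and keep it only if it falls within the biome polygon.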
- Python 3.12+
- GitHub Token (free)
- Google Earth Engine account (free)
```bash
git clone https://github.com/ShaafPlayz/OpenAI-to-Z-challenge.git
cd OpenAI-to-Z-challenge
pip install -r requirements-minimal.txt
```

Create a `.env` file:

```bash
GITHUB_TOKEN=your_github_token_here
```

Authenticate with Google Earth Engine:

```bash
earthengine authenticate
# Follow the browser authentication
```

Launch the main notebook:

```bash
jupyter notebook "GEE and OpenAI Research.ipynb"
# Open the notebook and run all cells
```

That's it! The system will automatically:
- Generate random points in the Amazon rainforest
- Fetch current satellite imagery (see the sketch after this list)
- Compare with historical land cover data
- Use AI to detect and report anomalies
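
The satellite-imagery step uses the Earth Engine Python API. Here is a minimal sketch of fetching a cloud-filtered Sentinel-2 true-colour thumbnail for a point; the project ID, date range, buffer size, and helper name (`fetch_sentinel_rgb_url`) are placeholder assumptions rather than the notebook's exact code.

```python
import ee

# Assumes `earthengine authenticate` has already been run (see Setup below).
ee.Initialize(project="your-project-id")  # placeholder GEE project ID

def fetch_sentinel_rgb_url(lon, lat, start="2024-01-01", end="2024-12-31", buffer_m=1000):
    """Return a thumbnail URL for the least-cloudy Sentinel-2 scene over a point."""
    region = ee.Geometry.Point([lon, lat]).buffer(buffer_m).bounds()
    image = (
        ee.ImageCollection("COPERNICUS/S2_SR_HARMONIZED")
        .filterBounds(region)
        .filterDate(start, end)
        .sort("CLOUDY_PIXEL_PERCENTAGE")   # least-cloudy scene first
        .first()
    )
    # True-colour composite: B4 (red), B3 (green), B2 (blue).
    return image.getThumbURL({
        "region": region,
        "dimensions": 512,
        "bands": ["B4", "B3", "B2"],
        "min": 0,
        "max": 3000,
    })

print(fetch_sentinel_rgb_url(-60.1234, -2.8234))
```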
```text
OpenAI-to-Z-challenge/
├── .env                                  # Environment variables (local)
├── .gitignore                            # Git ignore rules
├── README.md                             # Project documentation
├── environment.yml                       # Conda environment configuration
├── package-lock.json                     # Node.js dependencies lock file
├── requirements.txt                      # Full Python dependencies
├── requirements-minimal.txt              # Essential Python dependencies
├── GEE and OpenAI Research.ipynb         # Main research notebook
├── .git/                                 # Git repository data
├── .ipynb_checkpoints/                   # Jupyter notebook checkpoints
├── amazonia_boundary_proposal/           # Amazon boundary shapefiles
│   ├── amazonia_polygons.dbf             # Shapefile attribute data
│   ├── amazonia_polygons.prj             # Projection information
│   ├── amazonia_polygons.sbn             # Spatial index binary
│   ├── amazonia_polygons.sbx             # Spatial index
│   ├── amazonia_polygons.shp             # Main shapefile geometry
│   ├── amazonia_polygons.shp.xml         # Metadata
│   └── amazonia_polygons.shx             # Shapefile index
├── AppEEARS/                             # NASA AppEEARS MODIS datasets
│   ├── MCD12Q1.061_LC_Prop2_doy2020001_aid0001.tif  # MODIS 2020 land cover
│   ├── MCD12Q1.061_LC_Prop2_doy2021001_aid0001.tif  # MODIS 2021 land cover
│   ├── MCD12Q1.061_LC_Prop2_doy2022001_aid0001.tif  # MODIS 2022 land cover
│   ├── MCD12Q1.061_LC_Prop2_doy2023001_aid0001.tif  # MODIS 2023 land cover
│   ├── MCD12Q1.061_QC_doy2020001_aid0001.tif        # MODIS 2020 quality control
│   ├── MCD12Q1.061_QC_doy2021001_aid0001.tif        # MODIS 2021 quality control
│   ├── MCD12Q1.061_QC_doy2022001_aid0001.tif        # MODIS 2022 quality control
│   └── MCD12Q1.061_QC_doy2023001_aid0001.tif        # MODIS 2023 quality control
├── AppEEARSjpg/                          # Processed MODIS imagery (empty)
├── Resources/                            # Additional resources
│   ├── amazon_biome_border.zip           # Original biome shapefile data
│   └── amazonia_boundary_proposal_Eva_2005 (1).zip  # Boundary proposal data
├── satimagery/                           # Satellite imagery data
│   ├── sentinel_rgb.jpg                  # Processed RGB satellite image
│   └── sentinel_rgb.tif                  # Raw satellite data (GeoTIFF)
├── Technology Testing (Scripts)/         # Python scripts for testing
│   ├── .env                              # Local environment variables
│   ├── GoogleEarthEngine.py              # Google Earth Engine integration
│   ├── Offical_OpenAI_Key.py             # API key management
│   ├── OpenAI-o3.py                      # OpenAI o3 model experiments
│   └── OpenAI.py                         # Main OpenAI API implementation
└── WorkFlow Testing (Jupyter Notebooks)/ # Jupyter notebooks
    └── openai-research.ipynb             # Additional research notebook
```
- Boundary Validation: Ensures coordinates are within the Amazon rainforest
- Satellite Data Fetch: Downloads current Sentinel-2 imagery (10m resolution)
- Historical Lookup: Retrieves the MODIS 2020 land cover classification (see the sketch after this list)
- AI Analysis: GPT-4.1 compares current vs. historical imagery
- Anomaly Detection: Reports deforestation, construction, or land-use changes
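
For the Historical Lookup step, the MODIS 2020 baseline is one of the MCD12Q1 GeoTIFFs under `AppEEARS/`. A minimal sketch of sampling the land-cover class at a coordinate with rasterio follows; it assumes the raster uses a geographic (lon/lat) CRS, which is typical of AppEEARS exports, and leaves the class-code-to-name mapping to the MCD12Q1 user guide.

```python
import rasterio

# MODIS MCD12Q1 (LC_Prop2) 2020 baseline shipped with the repository.
MODIS_2020 = "AppEEARS/MCD12Q1.061_LC_Prop2_doy2020001_aid0001.tif"

def modis_class_at(lon, lat, tif_path=MODIS_2020):
    """Return the raw MCD12Q1 land-cover class value at a lon/lat point.

    Translating the integer code to a class name (forest, cropland, urban, ...)
    follows the MCD12Q1 user-guide legend and is omitted here.
    """
    with rasterio.open(tif_path) as src:
        # Assumes the GeoTIFF is in a geographic lon/lat CRS; otherwise
        # reproject the point into the raster's CRS before sampling.
        value = next(src.sample([(lon, lat)]))[0]   # band-1 value at the point
    return int(value)

print(modis_class_at(-60.1234, -2.8234))
```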
Key functions:

- `find_and_log_anomalies()`: Main detection pipeline
- `getAmazoniaAnalysisPoints()`: Generates valid Amazon coordinates
- `get_modis_class_for_point()`: Gets historical land cover data
- `promptGPT()`: Sends imagery to the AI model for analysis (see the sketch below)
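
A hedged sketch of what the `promptGPT()` step could look like with the OpenAI Python client: the current Sentinel-2 image (as a URL) is sent together with the historical MODIS class, and GPT-4.1 is asked to flag changes. The prompt wording, response format, and routing are assumptions; the notebook may instead call GPT-4.1 through GitHub Models with the `GITHUB_TOKEN`.

```python
import os
from openai import OpenAI

# Uses the optional OPENAI_API_KEY from .env; the notebook may route the same
# request through GitHub Models instead.
client = OpenAI(api_key=os.environ["OPENAI_API_KEY"])

def prompt_gpt(image_url, historical_class):
    """Ask GPT-4.1 to compare current imagery against the MODIS baseline class."""
    response = client.chat.completions.create(
        model="gpt-4.1",
        messages=[{
            "role": "user",
            "content": [
                {"type": "text", "text": (
                    f"The MODIS 2020 land-cover class at this location was "
                    f"'{historical_class}'. Does the current satellite image show "
                    f"deforestation, construction, or other land-use change? "
                    f"Answer 'No Anomaly' or 'Anomaly Found - <description>'."
                )},
                {"type": "image_url", "image_url": {"url": image_url}},
            ],
        }],
    )
    return response.choices[0].message.content

# Example usage with a Sentinel-2 thumbnail URL from the fetch sketch above:
# print(prompt_gpt(fetch_sentinel_rgb_url(-60.1234, -2.8234), "Evergreen Needleleaf Trees"))
```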
Example report entry:

```text
Footprint ID: 1 at (-2.8234, -60.1234)
Historical (MODIS) Class: Evergreen Needleleaf Trees
Status: Anomaly Found - Construction activity detected
```
- Python 3.12+
- Conda (Anaconda/Miniconda)
- GitHub Token (for GitHub Models access)
- Google Earth Engine Account
- OpenAI API Key (optional)
- Clone the Repository

  ```bash
  git clone https://github.com/yourusername/OpenAI-to-Z-challenge.git
  cd OpenAI-to-Z-challenge
  ```

- Create and Activate Conda Environment

  ```bash
  # Create the environment from YAML
  conda env create -f environment.yml
  conda activate OpenAI-GoogleEngine

  # Or create manually
  conda create -n OpenAI-GoogleEngine python=3.12
  conda activate OpenAI-GoogleEngine
  ```

- Install Dependencies

  ```bash
  # Option 1: Install minimal requirements (recommended)
  pip install -r requirements-minimal.txt

  # Option 2: Install full environment
  pip install -r requirements.txt
  ```

- Environment Variables: create a `.env` file in the project root:

  ```bash
  GITHUB_TOKEN=your_github_token_here
  OPENAI_API_KEY=your_openai_api_key_here  # Optional
  ```

- Google Earth Engine Authentication (see the Python sketch below):

  ```bash
  # First time setup
  earthengine authenticate

  # Initialize with your project
  # Replace 'your-project-id' with your actual GEE project ID
  ```
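
After the CLI authentication, the Earth Engine client is initialized from Python before any imagery request. A minimal sketch, assuming a placeholder Google Cloud project ID registered for Earth Engine:

```python
import ee

# One-time interactive authentication (equivalent to `earthengine authenticate`).
# ee.Authenticate()

# Initialize the client against your Earth Engine-enabled Cloud project.
ee.Initialize(project="your-project-id")  # placeholder project ID

# Quick sanity check: the public Sentinel-2 collection should be reachable.
print(ee.ImageCollection("COPERNICUS/S2_SR_HARMONIZED").limit(1).size().getInfo())
```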
```bash
# Launch Jupyter and open the main research notebook
jupyter notebook "GEE and OpenAI Research.ipynb"

# Run OpenAI experiments
python "Technology Testing (Scripts)/OpenAI.py"

# Test GEE functionality
python "Technology Testing (Scripts)/GoogleEarthEngine.py"
```

Ready to try it? Just run the Quick Start above!
Hackathon Submission - June 2025
