Drone Integration

BlindSPOTS (Satellite Pairing Orbital Telemetric Systems) addresses one of the most persistent limitations in satellite wildfire detection: incomplete or uncertain data caused by cloud interference. While modern orbital sensors such as VIIRS and MODIS provide near-global coverage, their ability to detect fires beneath cloud layers is restricted by atmospheric absorption and limited spatial resolution. To overcome this, our design introduces a collaborative drone–satellite network in which autonomous drones operate as low-altitude extensions of the satellites, flying beneath cloud cover to gather high-resolution thermal, optical, and LiDAR data wherever satellite visibility is obstructed. The drone observations are then fused with satellite imagery into a unified dataset that compensates for cloud obstruction and delivers complete, accurate, and temporally consistent fire intelligence.

Figure 1: Infrared camera.

Function and Capabilities of Drones

Unmanned Aerial Vehicles (UAVs), commonly known as drones, are autonomous or semi-autonomous aircraft capable of collecting environmental data at centimeter- to meter-scale resolution. Operating between 100 and 3,000 meters above ground level, drones can carry thermal infrared, multispectral, and LiDAR sensors to map temperature gradients, vegetation structure, and canopy fuel density. Because they fly beneath the cloud layer, they avoid much of the atmospheric interference that limits orbital sensors and can gather data those sensors cannot. Most drones are powered by high-density lithium-polymer batteries or hybrid solar-electric systems and use radio or satellite-based telemetry to transmit real-time data to control stations or processing networks.
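To make the telemetry link concrete, the sketch below shows one possible downlink record a drone in this network might transmit. The class and field names (`TelemetryFrame`, `thermal_max_k`, and so on) are illustrative assumptions, not a defined protocol.

```python
from dataclasses import dataclass

@dataclass
class TelemetryFrame:
    """One downlinked observation record; all field names are illustrative."""
    drone_id: str
    timestamp_utc: float    # Unix epoch seconds
    lat_deg: float
    lon_deg: float
    altitude_m_agl: float   # within the 100-3,000 m operating band described above
    thermal_max_k: float    # hottest pixel in the current thermal IR frame
    battery_pct: float      # lithium-polymer state of charge

# Example frame as it might be serialized over a radio or Ka-band link
frame = TelemetryFrame("uav-07", 1_730_000_000.0, 38.58, -121.49,
                       450.0, 612.3, 82.5)
print(frame)
```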

Drone–Satellite Coordination and Communication

Within the integrated system, satellites perform wide-field scanning to identify regions of uncertainty in their thermal imagery—particularly areas masked by clouds or haze. Using cloud-masking algorithms similar to those applied by MODIS and GOES, the satellites flag zones where infrared detection is unreliable. These flagged regions are transmitted to an AI-driven coordination hub that manages a fleet of drones positioned across high-risk wildfire regions.
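The sketch below illustrates only the flagging step in simplified form, assuming a precomputed cloud-probability raster and a hypothetical 0.6 threshold; the operational MODIS and GOES cloud masks are considerably more sophisticated.

```python
import numpy as np

def flag_uncertain_zones(cloud_prob: np.ndarray, threshold: float = 0.6) -> np.ndarray:
    """Mark pixels where cloud probability is too high for reliable IR detection."""
    return cloud_prob > threshold

rng = np.random.default_rng(0)
cloud_prob = rng.random((8, 8))        # stand-in for a cloud-probability raster
mask = flag_uncertain_zones(cloud_prob)
print(f"{mask.sum()} of {mask.size} pixels flagged for drone tasking")
```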

The hub’s machine learning model analyzes real-time satellite inputs, historical weather data, and cloud formation patterns to predict where coverage gaps are most significant, and assigns flight paths to available drones accordingly. Each drone receives a geospatial task packet containing target coordinates, estimated cloud altitude, and data-acquisition parameters. Communication occurs through a two-layer relay: a cloud-hosted ground station that interprets satellite signals, and a drone uplink using high-frequency radio or Ka-band satellite transmission. The satellites effectively function as command-and-control centers in orbit, while the drones act as mobile, intelligent data collectors that respond to their directives.
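As a rough illustration of the tasking logic, the following sketch pairs a hypothetical task packet with a greedy nearest-drone assignment. The `TaskPacket` fields and the flat-earth distance metric are simplifying assumptions; a production hub would optimize assignments across the whole fleet.

```python
import math
from dataclasses import dataclass

@dataclass
class TaskPacket:
    """Geospatial task packet; field names are assumptions for illustration."""
    lat_deg: float
    lon_deg: float
    cloud_base_m: float   # estimated cloud altitude the drone must stay under
    sensor_modes: tuple   # e.g. ("thermal", "swir", "lidar")

def nearest_drone(task, drones):
    """Greedy assignment: choose the closest available drone."""
    def dist(d):
        return math.hypot(d["lat"] - task.lat_deg, d["lon"] - task.lon_deg)
    return min(drones, key=dist)

task = TaskPacket(38.9, -120.1, 1800.0, ("thermal", "swir"))
fleet = [{"id": "uav-01", "lat": 38.5, "lon": -120.4},
         {"id": "uav-02", "lat": 39.2, "lon": -119.8}]
print("assigned to", nearest_drone(task, fleet)["id"])
```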

Scope of Drone Operations and Data Amalgamation

Deployed drones operate within a defined operational radius of 100–300 km, depending on endurance and environmental conditions. Their mission is to perform detailed scanning beneath or around cloud-obscured regions, capturing high-resolution thermal and spectral imagery that compensates for missing or uncertain satellite data. Drones execute adaptive flight patterns—grid sweeps or contour-following passes—to optimize coverage of complex terrain.
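A grid sweep can be expressed as a boustrophedon ("lawnmower") waypoint pattern. The sketch below generates such a pattern for a rectangular cell; the rectangle-and-spacing parameterization is an assumption, and real missions would adapt to terrain and cloud geometry.

```python
def grid_sweep(x0, y0, x1, y1, spacing):
    """Boustrophedon waypoints covering a rectangle, expressed as km offsets."""
    waypoints, y, left_to_right = [], y0, True
    while y <= y1:
        # Alternate sweep direction on each row to minimize turnaround distance
        row = [(x0, y), (x1, y)] if left_to_right else [(x1, y), (x0, y)]
        waypoints.extend(row)
        left_to_right = not left_to_right
        y += spacing
    return waypoints

# 20 km x 10 km cloud-obscured cell with 2 km track spacing
for wp in grid_sweep(0, 0, 20, 10, 2):
    print(wp)
```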

As data are collected, both satellite and drone observations are sent to the coordination hub for data fusion and synchronization. Sensor fusion algorithms such as Kalman filters and Bayesian data assimilation models reconcile differences in temporal frequency, spectral range, and spatial scale. This integration process aligns drone data (centimeter to meter-scale resolution) with satellite imagery (hundreds of meters to kilometer-scale), creating a composite radiometric dataset with improved precision and completeness. The result is a multi-layered thermal and structural model that combines the global reach of satellites with the localized accuracy of UAVs.
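The core of this reconciliation can be illustrated with a one-dimensional Bayesian update, which is also the measurement step of a Kalman filter: the fused estimate weights each source by the inverse of its variance. The full system would assimilate entire grids, but the sketch below shows the principle for a single co-located temperature estimate.

```python
def fuse(sat_mean, sat_var, drone_mean, drone_var):
    """One-step Bayesian (Kalman) update fusing two noisy estimates of the
    same surface temperature; the lower-variance source gets more weight."""
    gain = sat_var / (sat_var + drone_var)
    mean = sat_mean + gain * (drone_mean - sat_mean)
    var = (1.0 - gain) * sat_var
    return mean, var

# Coarse satellite pixel (large variance) refined by a precise drone overpass
mean, var = fuse(sat_mean=330.0, sat_var=25.0, drone_mean=341.5, drone_var=1.0)
print(f"fused estimate: {mean:.1f} K (variance {var:.2f})")
```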

Optimal Drone Platforms

The optimal platforms for this network are long-endurance fixed-wing VTOL (Vertical Takeoff and Landing) UAVs equipped with thermal infrared cameras, shortwave infrared imagers, and LiDAR payloads. Fixed-wing designs such as the Quantum Systems Trinity F90+ or AeroVironment Puma LE provide the range, endurance, and stability needed for extended missions, while VTOL capability allows flexible deployment in remote and uneven terrain. Each drone would carry an uncooled microbolometer for thermal imaging, a shortwave infrared sensor capable of penetrating thin clouds, and a LiDAR unit to quantify canopy height, canopy base height, and canopy fuel weight. Data processing can occur locally on embedded systems or be streamed to the coordination hub depending on bandwidth availability.
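As an example of how LiDAR returns translate into the canopy metrics named above, the sketch below derives canopy height and canopy base height from height-normalized return data. The percentile and density thresholds are illustrative assumptions, not calibrated forestry standards.

```python
import numpy as np

def canopy_metrics(return_heights_m: np.ndarray, base_density=0.05):
    """Illustrative canopy metrics from height-normalized LiDAR returns.
    Canopy height: 99th percentile of vegetation returns. Canopy base height:
    lowest 1 m bin holding more than `base_density` of vegetation returns."""
    veg = return_heights_m[return_heights_m > 0.5]   # drop ground/litter returns
    canopy_height = np.percentile(veg, 99)
    hist, edges = np.histogram(veg, bins=np.arange(0, veg.max() + 1))
    frac = hist / hist.sum()
    base_bins = np.nonzero(frac > base_density)[0]
    canopy_base = edges[base_bins[0]] if base_bins.size else 0.0
    return canopy_height, canopy_base

rng = np.random.default_rng(1)
heights = np.concatenate([rng.uniform(0, 0.5, 2000),            # ground/litter
                          rng.normal(14, 3, 3000).clip(2, 25)])  # crown returns
print(canopy_metrics(heights))
```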

Resulting Data Integration and Impact

BlindSPOTS produces a continuously updated, cloud-penetrating wildfire monitoring system. Satellites identify thermal anomalies and large-scale atmospheric patterns; drones fill in the spatial and spectral gaps with fine-resolution data. Together, they generate a complete picture of ground and canopy conditions—including temperature, humidity, fuel load, and ignition probability—delivered to firefighting agencies and early-warning networks in near real time.
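How these layers might combine into an ignition probability can be sketched with a simple logistic model. The inputs and coefficients below are placeholders for illustration only; calibrating such a model against fire-history data is beyond the scope of this overview.

```python
import math

def ignition_probability(temp_c, rel_humidity, fuel_load):
    """Hypothetical logistic model over the layered inputs; the coefficients
    are uncalibrated placeholders, not fitted values."""
    z = 0.08 * (temp_c - 25) - 0.05 * rel_humidity + 0.6 * fuel_load
    return 1.0 / (1.0 + math.exp(-z))

# Hot, dry cell with heavy canopy fuel vs. a cool, humid one
print(f"{ignition_probability(38, 12, 1.8):.2f}")
print(f"{ignition_probability(18, 70, 0.4):.2f}")
```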

By closing the most critical data gap in wildfire monitoring—the inability to detect and confirm ignitions beneath clouds—this system directly advances the goals of the XPRIZE Wildfire Challenge. It enhances early detection accuracy, reduces false alarms, and enables actionable predictions of fire spread within minutes of ignition. Through a fully integrated, adaptive sensing network, our approach transforms fragmented orbital observations into a unified system capable of anticipating, validating, and mitigating extreme wildfires before they escalate into uncontrollable events.
