Taranis utilizes satellites, drones, and aircraft (currently only in Brazil) to acquire field imagery data. During the onboarding phase, each customer can assign fields to matching subscription plans to ensure they are serviced accordingly.
Taranis acquires satellite imagery from Planet Labs and Sentinel. Images are acquired for a minimum period of two years, and customers can choose between a January or September timeframe to meet the different needs across the serviced regions.
The imagery from satellites is gathered according to the acquisition frequency and image availability of the satellite imagery provider, ranging from daily to weekly, at resolutions ranging from 10m per pixel to 1.5m per pixel. The images are then processed pixel by pixel on the Taranis cloud servers to clear problematic areas of the field caused by obstructions such as clouds.
Satellite imagery layers screens
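The per-pixel cleanup step can be illustrated with a minimal sketch. This assumes a simple brightness-threshold heuristic for flagging cloud pixels; the actual Taranis processing pipeline is not described in detail here, so the function names and threshold are illustrative only.

```python
import numpy as np

def mask_clouds(image, threshold=0.85):
    """Flag pixels whose reflectance exceeds a brightness threshold.

    `image` is an (H, W) array of normalized reflectance values in [0, 1].
    Returns a boolean mask where True marks likely cloud pixels.
    This is a simple heuristic stand-in, not the Taranis algorithm.
    """
    return image > threshold

def clear_problem_areas(image, threshold=0.85, fill_value=np.nan):
    """Replace likely-cloud pixels so downstream analysis can skip them."""
    cleaned = image.astype(float).copy()
    cleaned[mask_clouds(image, threshold)] = fill_value
    return cleaned
```

Masking rather than deleting pixels keeps the image grid intact, so later layers can still be compared location by location across acquisition dates.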
To initiate the submillimeter, hi-resolution data acquisition process, the Taranis solution first aims to conduct its flight missions according to the crop's phenological growth stages. Conducting the missions at the right time ensures the provided insights are meaningful and actionable throughout the growing season.
Taranis developed a comprehensive technology, named Intelligent Flight Scheduler (or IFS), geared to improve serviceability by arriving at the field at the right time from an agronomic perspective. The IFS implementation considers many factors and use cases and provides a unified model that reliably schedules Taranis flights.
The technology provides accurate flight scheduling for several crop types, including soybean, corn, and cotton, as well as all other serviced crops.
To make the most accurate scheduling decisions, IFS utilizes a proven growth stage model, provided by ClearAG by DTN: https://www.dtn.com/agriculture/agribusiness/clearag/
IFS uses the DTN model in combination with user-provided information, such as crop and planting date, to best schedule missions.
More information can be found here: https://knowledge.taranis.ag/portal/en/kb/articles/flight-scheduling-during-the-season
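The scheduling idea can be sketched as follows. The days-after-planting targets and window width below are invented for illustration; in reality IFS derives the timing from the DTN/ClearAG growth-stage model rather than fixed offsets.

```python
from datetime import date, timedelta

# Hypothetical days-after-planting targets per (crop, mission type).
# Real IFS decisions come from the DTN/ClearAG growth-stage model,
# not a static lookup table like this.
STAGE_TARGETS = {
    ("corn", "stand_count"): 14,
    ("corn", "threats"): 45,
    ("soybean", "stand_count"): 12,
    ("soybean", "threats"): 40,
}

def schedule_mission(crop, mission_type, planting_date, window_days=5):
    """Return a (start, end) flight window around the target growth stage."""
    offset = STAGE_TARGETS[(crop, mission_type)]
    target = planting_date + timedelta(days=offset)
    half = timedelta(days=window_days // 2)
    return (target - half, target + half)

# Example: corn planted on May 1 yields a stand-count window around May 15.
window = schedule_mission("corn", "stand_count", date(2023, 5, 1))
```

Returning a window rather than a single date mirrors the flight-window allocation described below, giving drone service providers scheduling flexibility.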
Based on IFS, missions will be created in the Taranis mission operations system, called Atlas. Atlas uses Atlassian Jira (https://www.atlassian.com/software/jira) as its core platform to create, process and complete its mission operations. In Atlas, the Taranis operations team can configure mission and flight parameters (preferred laser-measured height from the ground, camera zoom level, etc.) for each crop type and each mission type (early stand count missions and later threats missions).
Additionally, Atlas is used to allocate missions to outsourced Drone Service Providers (DSP), giving them a flight window and field information.
Configuring flight parameters in Atlas screen
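A per-crop, per-mission-type parameter set like the one configured in Atlas might be modeled as a small record. The field names and values here are illustrative assumptions; the actual Atlas schema is not public.

```python
from dataclasses import dataclass

@dataclass
class FlightParams:
    """Per-crop, per-mission flight configuration, as managed in Atlas.

    Field names and values are illustrative; the real Atlas schema
    is not described in the source material.
    """
    crop: str
    mission_type: str        # e.g. "stand_count" or "threats"
    height_m: float          # preferred laser-measured height above ground
    zoom_level: int          # camera zoom level
    row_spacing_cm: float    # planted row spacing

corn_stand_count = FlightParams(
    crop="corn",
    mission_type="stand_count",
    height_m=3.0,            # assumed value, for illustration only
    zoom_level=2,
    row_spacing_cm=76.0,
)
```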
Taranis operations can also use a map-based interface called Mapatlas to view upcoming mission allocations and assign unallocated missions to Drone Service Providers.
Marking unallocated missions to be assigned in Mapatlas
Taranis developed a specific system called Capitan to be used by pilots (drones and aircraft) for mission execution, in-field configuration adjustments (such as new obstacles in the field), and image uploading.
Capitan is desktop software (running on a desktop/laptop) that is updated by the pilots on a regular basis. When logging in, pilots can see their allocated missions, and for each mission Capitan will include:
Field sampling rate: Taranis samples 1-2 images per acre. This means that a 90-acre field will have between 90 and 180 leaf-level hi-resolution images (each image represents a different location in the field). The sampling rate is determined in Atlas and is configured for all missions per crop type and region. Pilots can increase the provided sampling rate in case of outstanding field issues (such as field shape and obstacles).
Predefined flight route: the route of the drone/aircraft is calculated prior to the flight on the Taranis SaaS platform, optimizing the route to include the requested field sampling rate while maximizing coverage of the field (in terms of spread) and minimizing the overall flight time.
Flight parameters: including flight height, zoom level, planted row spacing, etc. These parameters determine the image footprints, which range between 100 sqft (10 sqm) for densely seeded fields and 400 sqft (37 sqm) for sparsely seeded fields.
Capitan screen showing the predefined flight route with its waypoints that indicate the sampled locations in the field
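The sampling-rate arithmetic and a simplified route calculation can be sketched together. The nearest-neighbor ordering below is a basic stand-in for route optimization; the actual Taranis route planner is not described in the source.

```python
def sample_count(field_acres, rate_per_acre=(1, 2)):
    """Expected image-count range at 1-2 samples per acre."""
    lo, hi = rate_per_acre
    return field_acres * lo, field_acres * hi

def nearest_neighbor_route(points, start=(0.0, 0.0)):
    """Greedy nearest-neighbor ordering of sample points.

    A simple stand-in for route optimization: always fly to the
    closest unvisited waypoint. The real planner balances coverage
    spread against total flight time.
    """
    remaining = list(points)
    route, current = [], start
    while remaining:
        nxt = min(remaining,
                  key=lambda p: (p[0] - current[0]) ** 2
                              + (p[1] - current[1]) ** 2)
        remaining.remove(nxt)
        route.append(nxt)
        current = nxt
    return route
```

For the 90-acre example from the text, `sample_count(90)` gives the (90, 180) image range; each of those sampled locations becomes one waypoint on the precomputed route.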
Once the flight has been completed (see below), the pilots once again use the Capitan software in the field to validate the sampled images for quality issues only, such as blurriness, brightness, and flight height requirements. The pilots then use Capitan to upload the acquired images to the Taranis cloud servers and storage for processing, as no analysis or data extraction is performed during the drone/aircraft flight or on the pilot operational systems.
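A field-side quality check of this kind might look like the sketch below, which uses Laplacian variance as a blur score and the mean intensity as an exposure check. The thresholds and function names are assumptions; Capitan's actual checks are not published.

```python
import numpy as np

def laplacian_variance(gray):
    """Variance of a 4-neighbor Laplacian; low values suggest a blurry image."""
    lap = (-4 * gray[1:-1, 1:-1]
           + gray[:-2, 1:-1] + gray[2:, 1:-1]
           + gray[1:-1, :-2] + gray[1:-1, 2:])
    return float(lap.var())

def passes_quality_check(gray, blur_threshold=0.01,
                         brightness_range=(0.15, 0.85)):
    """Illustrative field-side QC: reject blurry or badly exposed images.

    `gray` is an (H, W) array of intensities in [0, 1]. Thresholds are
    assumed values, not Taranis specifications.
    """
    mean = float(gray.mean())
    if not (brightness_range[0] <= mean <= brightness_range[1]):
        return False
    return laplacian_variance(gray) >= blur_threshold
```

Running only lightweight checks like these in the field keeps the heavy analysis on the cloud side, matching the split of responsibilities described above.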
Taranis uses off-the-shelf DJI Matrice 300 (https://www.dji.com/matrice-300) drones for the drone flights. To further optimize the flights, Taranis used the DJI software development kit to build a custom flight app called Cockpit. Cockpit uses the information from the pilot operations system, Capitan, and executes the flight accordingly.
The flights are then executed based on the predefined route, allowing the drone to capture separate images along the flight route based on the provided parameters (zoom, focal length, etc.). During the flight, no analysis, validation, or processing is executed by Cockpit or the drone itself; those processes are performed post-flight for image quality and on the Taranis cloud storage for further annotation and analysis.
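How the flight parameters control the image footprint can be shown with a simple pinhole-camera relation: each ground-footprint side equals flight height times the sensor side divided by the focal length. The parameter values below are illustrative, not DJI or Taranis specifications.

```python
def ground_footprint_m2(height_m, focal_length_mm, sensor_w_mm, sensor_h_mm):
    """Ground area covered by one image under a pinhole-camera model.

    Each footprint side = flight height * sensor side / focal length.
    All parameter values used with this function are assumptions for
    illustration, not actual camera specifications.
    """
    width = height_m * sensor_w_mm / focal_length_mm
    height = height_m * sensor_h_mm / focal_length_mm
    return width * height
```

The relation makes the trade-off explicit: flying lower or zooming in (a longer focal length) shrinks the footprint and increases detail per pixel, which is why height and zoom are configured per crop and mission type.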
The M300 drones fly across the field and capture sampled images at their predetermined locations in two formats:
Leaf-level, submillimeter hi-resolution images to be used for field insights
MacroLevel, wide images for providing context only (no processing is performed on these wide images)
Leaf-level images of early-stage corn taken with the M300 drone (with annotations)
The same images as above with their MacroView for user context (no analysis or annotations)