The Taranis solution offers a set of insights:
Stand count - the insight helps advisors and growers detect uneven plant emergence and early-season crop health risk to drive replant decisions and long-term fertility plans
Weeds - the insight helps advisors and growers detect weeds (by species) and overall weed pressure against thresholds to drive herbicide decisions and long-term plans
Diseases - the insight helps advisors and growers detect disease symptoms and overall crop health risk to drive in-season and long-term fungicide plans
Insects - the insight helps advisors and growers detect insect thresholds and overall crop health risk to drive insecticide decisions and long-term plans
Nutrient Deficiencies - the insight helps advisors and growers detect nutrient deficiency symptoms and overall crop health risk to drive in-season and long-term fertility plans
Defoliation in Soybean - the insight helps advisors and growers detect defoliation and the overall insect pressure threshold for crop health risk to drive insecticide decisions and long-term plans
Field Health Index (FHI) - the improved analysis model monitors the field health throughout the season with PlanetLabs NDVI layers and helps detect anomalies within the field
To provide the insights, the analysis process begins once the images and their metadata have been validated in Capitan and uploaded to the Taranis cloud storage. The first stage in the process is annotating the leaf-level, hi-resolution images using machine learning and artificial intelligence (AI) models.
Specific artificial intelligence models are available for each of the insights (excluding the Field Health Index) and for each supported crop. Each model is trained on previously captured hi-resolution images to support a specific need. For example: a dedicated stand count model exists for corn, annotating and marking each of the emerging plants during the early stages of the growing cycle.
The input for each of the individual AI models is a set of tiles cropped from the original image. Importantly, these cropped image tiles are not rescaled before running inference, in order to preserve the highly granular resolution of the source imagery. While processing the source images into sets of tiles does increase the complexity of training and deploying the models, it is necessary in order to support detection of features that appear over small regions of pixels, where each pixel is captured at sub-millimeter precision.
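The tiling step described above can be sketched as follows. This is an illustrative example only; the actual tile size and edge handling used by Taranis are not specified in the source, so a 512-pixel tile with dropped edge remainders is assumed here. Note that the tiles are slices of the original array, never resized, which preserves the native pixel resolution.

```python
import numpy as np

def crop_tiles(image: np.ndarray, tile_size: int = 512):
    """Crop a source image into fixed-size tiles without rescaling.

    Keeping the native resolution (rather than downscaling the whole
    image to the model's input size) preserves the fine detail needed
    to detect features that span only a small region of pixels. Edge
    tiles smaller than tile_size are dropped here for simplicity; a
    production pipeline might pad them instead.
    """
    h, w = image.shape[:2]
    tiles = []
    for y in range(0, h - tile_size + 1, tile_size):
        for x in range(0, w - tile_size + 1, tile_size):
            tiles.append(image[y:y + tile_size, x:x + tile_size])
    return tiles

# Example: a 1024x1536 image yields a 2x3 grid of 512x512 tiles
img = np.zeros((1024, 1536, 3), dtype=np.uint8)
tiles = crop_tiles(img, tile_size=512)
```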
Once the AI models have completed, manual/human annotation is performed where necessary. For example: when the AI model identifies weeds, it may request manual annotation to identify the weed species.
Once an insight has been fully annotated and calculated (see below), the Taranis Quality Control team reviews a sample of the insights, makes adjustments when needed, and approves them for customer delivery.
Fully annotated image of emerging corn plants
Because Taranis uses a field sampling method for data acquisition, the analysis for each insight is done on an image-by-image basis (the images do not geographically overlap), and the additional field-level calculations and analysis are derived using extrapolation methods.
Assessing the plant population of a field, specifically right after the emergence stage, provides an early estimate of yield.
Taranis uses two common methodologies for plant population assessment in early stages:
“1 per 1000” method
“Hoop” method
The “1 per 1000” method is commonly used when sowing is done in rows, while the “hoop” method is commonly used when there is no applicable uniform row structure in the field. Both methods are used during early scouting, where plant counts are taken at several randomly selected locations in the field. The plant population estimates calculated at each of the spots are then averaged to give the general estimate of the population in the field.
For the field level average plant population, we exclude extreme values based on common statistical outlier detection methods.
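The averaging and outlier-exclusion steps above can be sketched in a few lines. This is a hedged illustration: the "1 per 1000" conversion (each sampled row segment covers 1/1000 of an acre, so counts are multiplied by 1000) is the standard agronomic convention, but the specific outlier rule Taranis applies is not stated in the source, so a common 1.5× IQR fence is assumed here.

```python
import statistics

ACRE_SQFT = 43_560  # square feet in one acre

def row_segment_length_ft(row_spacing_in: float) -> float:
    """Row length covering 1/1000 of an acre for a given row spacing.

    For 30-inch rows: 43,560 / 1000 / 2.5 ft = 17.424 ft of row.
    """
    return ACRE_SQFT / 1000 / (row_spacing_in / 12)

def plants_per_acre(counts_per_segment):
    """Each segment count covers 1/1000 acre, so scale by 1000."""
    return [c * 1000 for c in counts_per_segment]

def robust_field_average(estimates):
    """Average per-spot estimates after dropping extreme values.

    Assumption: a 1.5x IQR fence is used as the outlier rule; the
    exact statistical method is not specified in the source.
    """
    q1, _, q3 = statistics.quantiles(estimates, n=4, method="inclusive")
    iqr = q3 - q1
    lo, hi = q1 - 1.5 * iqr, q3 + 1.5 * iqr
    kept = [e for e in estimates if lo <= e <= hi]
    return sum(kept) / len(kept)

# Five sampled spots; the last count is clearly anomalous
estimates = plants_per_acre([31, 33, 30, 32, 5])
field_avg = robust_field_average(estimates)  # excludes the 5,000 outlier
```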
Detailed explanation of the planting population calculation can be found here https://knowledge.taranis.ag/portal/en/kb/articles/plant-population-counting-with-smartscout-20-10-2021-1
The calculation allows Taranis to provide an estimated population of plants per acre for each individual image. The calculations are then shown to the user in the web application and the Connect mobile app:
“Bubbles view” of the calculated stand count for each acquired image as displayed in the Taranis web application
Additionally, in order to provide a global view of the geospatial distribution of the stand count throughout the field, Taranis uses constrained KNN (k-nearest neighbors; https://en.wikipedia.org/wiki/K-nearest_neighbors_algorithm) algorithms to extrapolate from image samples to a field-level heatmap. Using these algorithms, the value of each pixel in the field's map is calculated from the closest, most relevant image samples taken in the field.
The heatmap uses constrained KNN to extrapolate from individual stand count samples to a full field visual display
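A minimal sketch of this extrapolation follows. The exact constraint Taranis applies is not published; here the "constraint" is assumed to be a cap on both the number of neighbors (k) and the search radius (max_dist), so that distant samples cannot influence a cell, and cells with no sample in range are left empty.

```python
import math

def knn_heatmap(sample_xy, sample_vals, grid_w, grid_h, k=3, max_dist=None):
    """Fill a grid by averaging the k nearest sample values at each cell.

    sample_xy   - (x, y) grid coordinates of each image sample
    sample_vals - the insight value measured at each sample (e.g. stand count)
    max_dist    - optional radius constraint; cells with no sample
                  within range remain None
    """
    heat = [[None] * grid_w for _ in range(grid_h)]
    for y in range(grid_h):
        for x in range(grid_w):
            dists = sorted(
                (math.hypot(sx - x, sy - y), v)
                for (sx, sy), v in zip(sample_xy, sample_vals)
            )
            nearest = [v for d, v in dists[:k]
                       if max_dist is None or d <= max_dist]
            if nearest:
                heat[y][x] = sum(nearest) / len(nearest)
    return heat

# Two samples at opposite corners of a 10x10 grid
heat = knn_heatmap([(0, 0), (9, 9)], [100, 200], 10, 10, k=1)
```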
A simple average calculation of the individual stand count populations is used to provide a field-level “Avg. Number Of Plants” score:
Stand count score for the field based on the individual stand count calculations
The insight histogram shows the distribution of acres in each planting population range, based on the heatmap coloring. Histogram bins are defined according to the overall population range, divided into 5 equally spaced bins. For example: if the lowest count calculated for a specific image in the field was 86K stands and the highest 147K stands, the width of each histogram bin will be (147K - 86K) / 5 = 12.2K.
Histogram with 5 equally spaced bins of the stand count population
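The binning rule above reduces to a one-line formula; the sketch below reproduces the worked example from the text (lowest 86K, highest 147K).

```python
def stand_count_bins(min_count, max_count, n_bins=5):
    """Equally spaced histogram bin edges over the observed range.

    Bin width = (max_count - min_count) / n_bins, matching the
    binning rule described for the insight histogram.
    """
    width = (max_count - min_count) / n_bins
    return [min_count + i * width for i in range(n_bins + 1)]

# Example from the text: range 86K-147K gives a bin width of 12.2K
edges = stand_count_bins(86_000, 147_000)
```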
The Taranis weeds insight analysis is done by counting the broadleaf and grass weeds in the leaf-level, hi-resolution image relative to the image footprint, and extrapolating the count to an area of 100 square feet.
Weeds detected in the leaf-level hi-resolution image
Using the image footprint and the number of detected weeds, Taranis calculates the estimated weeds per 100 ft², as shown in the screenshot
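The footprint normalization is a simple proportional scaling. The footprint value below is an assumed illustrative number, not a real Taranis image footprint.

```python
def weeds_per_100_sqft(weed_count: int, footprint_sqft: float) -> float:
    """Extrapolate a per-image weed count to a 100 sq ft reference area.

    footprint_sqft is the on-ground area covered by the image, so the
    density scales by the ratio 100 / footprint_sqft.
    """
    return weed_count * 100 / footprint_sqft

# e.g. 7 weeds in an image covering 40 sq ft -> 17.5 weeds per 100 sq ft
density = weeds_per_100_sqft(7, 40)
```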
As with the stand count insight, additional visual elements are generated to allow a better understanding of the data, including:
Heatmap: Using constrained KNN models to extrapolate from the weed pressure in the sampled images to a field-level representation
Histogram: creating 5 equally spaced bins to show the weeds density across the field
Additional insights are also processed by the artificial intelligence models to identify the threats in the field. Specific identified threats are marked within the image and counted to provide an overall assessment of their impact on the field.
Chemical damage to the leaf as part of the diseases insight shown in the leaf-level image
Disease insight showing the number of detected images with diseases and their location in the field. Clicking on the red dots will open the annotated leaf-level image