Our Technology

Technologies Used

Our technology is covered by two pending patent applications:

 

1) System for Exchanging Image Information Between Remote Locations and Computers over LoRa and LoRaWAN Wireless Connections. Filed in January 2021.

2) System and Method for Detecting and Monitoring Species of Wildlife with a Light Detection and Ranging (LiDAR) Multi-Zone Rangefinder and Smart Camera. Filed in January 2022.

Maintaining healthy populations of insect pollinators, salmon, birds, and other wildlife has significant biological, cultural, and economic benefits. Significant data gaps exist for species that are critical to a sustainable economy and an ecologically healthy and stable planet. These gaps create conditions for controversy and poor land-management decisions. FDT and FDS are developing novel camera trap technology that pairs the latest AI-ready microprocessors with the latest low-power sensors and wireless technologies to maximize data acquisition from the field while drawing the least battery power.

Traditional camera traps have proved invaluable for field research. However, commercial camera traps were not designed for land managers or scientists; they are narrowly focused on producing visual images. Memory cards filled with images may interest curious naturalists or sport hunters, but they do not provide useful data for land managers or scientists, who are left trying to extract usable data from voluminous disks of images. FDT and FDS have turned the process of interpreting images upside down by instead focusing on creating spreadsheets of tabular environmental and animal-behavior data. Our goal is a device that compiles the raw data into formats ready for quantitative analysis. Toward this goal we have integrated the following three technologies, which take our devices far beyond the capabilities of commercially available trail cameras.

 

  1. LiDAR as the key motion-detection technology
  2. LoRaWAN to create networked data
  3. A cloud dashboard for data access

The functionality of traditional trail cameras depends entirely on a passive infrared (PIR) sensor. PIR sensors have been around for decades and are a time-tested way to detect animals moving against a background. PIRs detect motion by sensing the temperature differential between a moving object and the background, making them ideal for detecting a warm-blooded animal moving against a cooler background. However, this approach has drawbacks. PIRs are unreliable when an animal's body temperature matches that of the background. As such, they are a poor way to detect fish or amphibians in water, and equally unreliable for detecting insects, or larger animals in tropical climates where there is little temperature difference between animal and background. PIRs are also notoriously susceptible to false triggers from infrared sunlight reflecting off windblown vegetation.

Our technology instead incorporates a substantially different and newly available LiDAR sensor to detect motion. Rather than responding to any and all temperature anomalies in the field of view, our LiDAR sensor defines a precise airspace with millimeter accuracy. Any object that enters this airspace triggers the sensor and activates our device, regardless of temperature or ambient-light anomalies. When the device is activated, it brings additional sensors, software, and AI to bear on interpreting the event. The onboard 5-megapixel camera takes visual images of the event. But whereas a traditional trail camera's functionality ends when the images are stored, our devices use the images to verify events in a detailed time-series spreadsheet rich in behavioral and environmental data.
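The zone-based triggering described above can be sketched in a few lines. This is an illustrative sketch only, not the actual device firmware: the zone bounds, sample rate, and function names are assumptions, and the real device defines its airspace from LiDAR range readings in the same spirit.

```python
# Illustrative sketch of zone-based LiDAR triggering (not actual firmware).
# Zone bounds and the consecutive-hit threshold are assumed values.

def in_watch_zone(distance_mm, zone_min_mm=200, zone_max_mm=1200):
    """Return True if a LiDAR range reading falls inside the defined airspace."""
    return zone_min_mm <= distance_mm <= zone_max_mm

def should_trigger(readings, min_hits=3):
    """Activate the device only after several consecutive in-zone readings,
    filtering single-sample noise regardless of temperature or light."""
    hits = 0
    for d in readings:
        hits = hits + 1 if in_watch_zone(d) else 0
        if hits >= min_hits:
            return True
    return False

# Background sits at ~2 m; an animal briefly passes through the zone.
print(should_trigger([2000, 2000, 950, 940, 930, 2000]))  # True
```

Because the trigger depends only on geometry, a cold-blooded fish and a warm-blooded mammal are detected identically, which is the key advantage over PIR.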

 

This LiDAR chip, introduced in October 2020, uses an infrared laser to measure distances to objects with millimeter accuracy. This breakthrough technology could enable low-cost detection of adult salmonids as they move over shallow riffles, or of juveniles underwater. We have incorporated the LiDAR chip into an existing field-proven digital platform that includes an AI-enabled smart camera and a long-distance wireless link. Our devices are small (2 lbs), portable, and can be installed without stream-channel alterations. This enables deployments across numerous remote, unmonitored streams, or alongside existing survey methods. Following extensive Phase I laboratory testing, Phase II will add field deployment and data collection. Onboard AI will interpret detections and produce tabular data ready for analysis of population health, or for validating restoration of barriers and culverts.

In addition to compiling an onboard spreadsheet of data, the device can send live updates to a cloud dashboard, providing assurance of field operation and additional graphical analyses of incoming data. In the Internet of Things (IoT) technology world, LoRaWAN sends data the furthest with the least power. LoRaWAN is a radio protocol that creates a Long Range Wide Area Network, over which the smart cameras report back to their base station. Currently, once a minute the cameras can send a 51-byte or 11-byte payload, depending on network quality. Our compression allows tabular data, movement data, machine-learning model outcomes, and image thumbnails to be sent over the network.
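To make the 51- and 11-byte budgets concrete, here is a hedged sketch of packing fixed-width detection records into an uplink payload. The record layout (timestamp, zone, distance, class) and every field width are assumptions for illustration; the actual on-air compression scheme is the company's own.

```python
import struct

# Illustrative sketch only: the real devices use their own compression.
# Payload budgets reflect the 51-byte vs 11-byte limits mentioned above.
MAX_PAYLOAD = {"good": 51, "poor": 11}  # bytes per uplink, per network quality

def pack_detection(timestamp_s, zone_id, distance_mm, species_class):
    # "<IBHB" = little-endian uint32 time, uint8 zone, uint16 mm, uint8 class
    return struct.pack("<IBHB", timestamp_s, zone_id, distance_mm, species_class)

def build_uplink(records, network_quality="good"):
    """Concatenate as many fixed-width records as the payload budget allows."""
    budget = MAX_PAYLOAD[network_quality]
    per_record = struct.calcsize("<IBHB")  # 8 bytes per record
    payload = b"".join(records[: budget // per_record])
    assert len(payload) <= budget
    return payload

recs = [pack_detection(1700000000 + i, 1, 900 + i, 2) for i in range(10)]
print(len(build_uplink(recs, "good")))   # 48 -> six 8-byte records fit in 51
print(len(build_uplink(recs, "poor")))   # 8  -> one record fits in 11
```

A layout like this shows why tabular records travel so much more cheaply than images: a thumbnail must be split across many uplinks, while dozens of detection rows fit in a single day's budget.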

Our users can currently choose to load data tables directly from the SD cards of the smart cameras, or have the data load to a dashboard from their devices. For long deployments of multiple devices, the dashboard lets the camera owner see which devices are powered and running and which have gone silent (such as when a bear tears down a nest box!). This real-time monitoring saves trips to change batteries or check on units, since field researchers or technicians carry out only the actions that are actually needed. When logging data to the dashboard, the camera owner can view data outputs in real time and see what is happening at their site from a cloud-based or local laptop. For sensitive data, everything can be kept local, so there is no risk of accidentally sharing data on endangered species or private land-management practices.

We have multiple patents submitted regarding using LiDAR to identify animal behavior and transmitting visual data across LoRaWAN. Phase I SBIR work successfully integrated the LiDAR chip (LoC) onto the existing open-source board. Insect specimens on the end of a thin wire were attached to a programmable robotic arm, and robotic fish specimens were attached to a special underwater track. The robotics moved the model animals in front of the LoC along natural movement paths. This test system was used to develop software that captures a brief time series of LoC position measurements and compresses the detected flight paths into an 8×8 matrix of values representing those paths. We refer to these captured flight paths as "LiDAR signatures." We demonstrated the ability to train compact-on-disk, low-power, low-memory classifiers that distinguish between LiDAR signatures, proving the feasibility of software classifiers that run in near-real time on low-power devices.

What this means is that the devices can use machine learning to distinguish what is going on without turning on a camera, and turn the camera on only selectively. Biologists get only the data they want to see, and the battery lasts much longer in the field. Here we see a dried mason bee and artificial flowers, as seen through a visual camera. Then we "fly" the bee on a robot arm and have it follow a pollination pattern. The result is that we learn how the LiDAR detects small-animal interactions so that we can intelligently turn on the camera systems. We have similar data for artificial juvenile fish.
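The signature pipeline above can be illustrated with a short sketch: bin a time series of (time, distance) samples onto an 8×8 grid, then match the grid against per-class templates. Only the 8×8 output shape comes from the text; the binning scheme and the nearest-template classifier are our own assumptions, standing in for the actual trained classifiers.

```python
# Illustrative sketch of an 8x8 "LiDAR signature" (the 8x8 shape is from the
# text; the binning and the nearest-template classifier are assumptions).

def lidar_signature(path, grid=8, max_range_mm=1500.0):
    """path: list of (t_seconds, distance_mm) -> normalized 8x8 occupancy grid.
    Rows bin elapsed time, columns bin distance from the sensor."""
    sig = [[0.0] * grid for _ in range(grid)]
    t0, t1 = path[0][0], path[-1][0]
    span = max(t1 - t0, 1e-9)
    for t, d in path:
        row = min(int((t - t0) / span * grid), grid - 1)
        col = min(int(d / max_range_mm * grid), grid - 1)
        sig[row][col] += 1
    return [[v / len(path) for v in row] for row in sig]

def classify(sig, templates):
    """Nearest-template match: one 8x8 grid per class, L1 distance.
    Tiny on disk and cheap enough to run on a low-power microcontroller."""
    def dist(name):
        return sum(abs(a - b)
                   for row_s, row_t in zip(sig, templates[name])
                   for a, b in zip(row_s, row_t))
    return min(templates, key=dist)
```

The appeal of this shape of classifier is exactly what the text claims: a handful of 8×8 templates costs a few hundred bytes, so classification can run continuously while the power-hungry camera stays off.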



Fine-detail LiDAR testing rig (first image); a small-animal simulator with robot arm and customizable landscape (second image); an artificial salmon stream with changeable substrate and up to 2,000 gallons per hour of flow (third image).

 

We have run over 300,000 individual LiDAR detection trials on our experimental systems, making us the leading experts in how low-resolution LiDAR can be used for wildlife identification and for activating smart systems.

 


Technology Labs

LiDAR - Pollinator Testing
