Our Technology
Technologies Used
Our technology is covered by two pending patent applications:
1) System for Exchanging Image Information Between Remote Locations and Computers over LoRa and LoRaWAN Wireless Connections. Filed in January 2021.
2) System and Method for Detecting and Monitoring Species of Wildlife with a Light Detection and Ranging (LiDAR) Multi-Zone Rangefinder and Smart Camera. Filed in January 2022.
Maintaining healthy populations of insect pollinators, salmon, birds, and other wildlife has significant biological, cultural, and economic benefits. Significant data gaps exist for species that are critical for a sustainable economy and an ecologically healthy and stable planet. These gaps create conditions for controversy and poor land management decisions. FDT and FDS are developing novel camera trap technology that combines the latest AI-ready microprocessors with low-power sensors and wireless technologies to maximize data acquisition from the field while using the least possible battery power.
Traditional camera traps have proved invaluable for field research. However, commercial camera traps were not designed for land managers or scientists; they are narrowly focused on producing visual images. Memory cards full of images may be interesting to curious naturalists or sport hunters, but they do not provide useful data for land managers or scientists, who are left trying to extract that data from voluminous disks of images. FDT and FDS have turned the process of interpreting images upside down by instead focusing on creating spreadsheets of tabular environmental and animal-behavior data. Our goal is a device that compiles raw data into formats that are ready for quantitative analysis. Toward this goal we have integrated the following three technologies, which take our devices far beyond the capabilities of commercially available trail cameras:
- LiDAR as the key motion-detection technology
- LoRaWAN to create networked data
- A cloud dashboard for data access
The functionality of traditional trail cameras depends entirely on a passive infrared (PIR) sensor. PIR sensors have been around for decades and are a time-tested way to detect animals moving against a background. PIRs detect motion by sensing the temperature differential between a moving object and the background, which makes them ideal for detecting a warm-blooded animal moving against a cooler background. However, this approach has drawbacks. PIRs are unreliable when an animal's body temperature matches the background, so they are a poor way to detect fish or amphibians in water. They are equally unreliable for detecting insects, or larger animals in tropical climates where there is little temperature difference between the animals and the background. PIRs are also notoriously susceptible to false triggers from infrared sunlight reflecting off windblown vegetation.

Our technology incorporates a substantially different and newly available LiDAR sensor to detect motion. Instead of responding to any temperature anomaly in the field of view, our LiDAR sensor defines a precise airspace with millimeter accuracy. Any object that enters this airspace triggers the sensor and activates our device, regardless of temperature or ambient-light anomalies. Once activated, the device brings additional sensors, software, and AI to bear on interpreting the event. The onboard 5-megapixel camera takes visual images of the event, but whereas a traditional trail camera's functionality ends when the images are stored, our devices use the images to verify events in a detailed time-series spreadsheet that is rich in behavioral and environmental data.
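The zone-based triggering described above can be illustrated with a short sketch. This is a minimal, hypothetical example rather than our production firmware: it assumes a multi-zone rangefinder that reports one distance per zone, and the zone count, distances, and thresholds are illustrative.

```python
# Hedged sketch (not production firmware): zone-based triggering with a
# multi-zone LiDAR rangefinder that reports one distance (mm) per zone.
# Zone count, distances, and thresholds below are illustrative assumptions.

TRIGGER_DELTA_MM = 50   # object must be at least 50 mm nearer than the background
MIN_ZONES = 2           # require at least 2 changed zones to reject single-zone noise

def object_in_airspace(zones_mm, baseline_mm):
    """Return True when enough zones report something closer than the calibrated background."""
    changed = sum(1 for d, b in zip(zones_mm, baseline_mm) if b - d > TRIGGER_DELTA_MM)
    return changed >= MIN_ZONES

# Example: a 64-zone frame in which three zones suddenly report an object
# roughly 300 mm closer than the empty-scene baseline recorded at setup.
baseline = [1200] * 64
frame = list(baseline)
frame[20] = frame[21] = frame[28] = 900
if object_in_airspace(frame, baseline):
    print("trigger: wake camera, log event")   # device wakes and begins interpreting the event
```

Because the decision depends only on distance, not temperature, the same logic applies equally to a salmon in cold water or an insect against warm foliage.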
This LiDAR chip, introduced in October 2020, uses an infrared laser to measure distances to objects with millimeter accuracy. This breakthrough technology could enable low-cost detection of adult salmonids as they move over shallow riffles, or of juveniles underwater. The LiDAR chips have been incorporated into an existing, field-proven digital platform that includes an AI-enabled smart camera and a long-distance wireless link. Our devices are small (2 lbs), portable, and can be installed without stream channel alterations. This enables deployments across numerous remote and unmonitored streams, or alongside existing survey methods. Following extensive Phase I laboratory testing, Phase II will add field deployment and data collection. Onboard AI will interpret detections and produce tabular data ready for analysis of population health, or to validate restoration of barriers and culverts.
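As a rough illustration of the tabular output described above, each triggered event can be reduced to a single time-stamped row that is ready for quantitative analysis. The column names and values below are hypothetical, not our actual schema.

```python
import csv
from datetime import datetime, timezone

# Hypothetical columns for an on-board detection spreadsheet; the real schema may differ.
FIELDS = ["timestamp_utc", "zone_count", "min_range_mm", "direction",
          "air_temp_c", "classifier_label", "classifier_confidence", "image_file"]

def append_detection(path, record):
    """Append one detection event as a row in the on-board CSV spreadsheet."""
    with open(path, "a", newline="") as f:
        writer = csv.DictWriter(f, fieldnames=FIELDS)
        if f.tell() == 0:          # write the header only when starting a new file
            writer.writeheader()
        writer.writerow(record)

append_detection("detections.csv", {
    "timestamp_utc": datetime.now(timezone.utc).isoformat(timespec="seconds"),
    "zone_count": 3,
    "min_range_mm": 842,
    "direction": "upstream",
    "air_temp_c": 14.5,
    "classifier_label": "adult_salmonid",
    "classifier_confidence": 0.91,
    "image_file": "img_000123.jpg",
})
```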
In addition to compiling an onboard spreadsheet of data, the device can send live updates to a cloud dashboard to provide assurance of field operation and additional graphical analyses of incoming data. In the Internet of Things (IoT) world, LoRaWAN sends data the furthest with the least power. LoRaWAN is a radio protocol that creates a Long Range Wide Area Network, and the smart cameras report back to their base station over this network. Currently, the cameras can send one packet per minute of either 51 or 11 bytes, depending on network quality. Our compression allows tabular data, movement data, machine learning model outcomes, and thumbnails of images to be sent over the network.
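To show how a detection record can fit within the small payloads mentioned above, here is a hedged sketch of fixed-width binary packing. The field layout and scaling are assumptions for illustration, not our actual compression scheme.

```python
import struct
import time

# Hypothetical 11-byte payload: epoch seconds (4), minimum range in mm (2),
# changed-zone count (1), class label id (1), confidence x100 (1), air temp x10 (2).
PAYLOAD_FMT = ">IHBBBh"   # big-endian; 4 + 2 + 1 + 1 + 1 + 2 = 11 bytes

def pack_detection(ts, min_range_mm, zones, label_id, confidence, temp_c):
    """Pack one detection event into a compact binary payload for the radio link."""
    return struct.pack(PAYLOAD_FMT, int(ts), min_range_mm, zones,
                       label_id, int(round(confidence * 100)),
                       int(round(temp_c * 10)))

payload = pack_detection(time.time(), 842, 3, 7, 0.91, 14.5)
assert len(payload) == 11   # fits the smaller 11-byte packet budget noted above
```

The larger 51-byte packets leave room for additional fields or compressed image thumbnails split across several transmissions.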
Fine-detail LiDAR testing rig (first image); small-animal simulator with robot arm and customizable landscape (second image); artificial salmon stream with changeable substrate and up to 2,000 gallons per hour of flow (third image).
We have run over 300,000 individual LiDAR detection trials on our experimental systems, making us the leading experts in how low-resolution LiDAR can be used for wildlife identification and for activating smart systems.
Technology Labs
LiDAR - Pollinator Testing