To identify salmon species in the Nisqually River, the Nisqually Indian Tribe installed a video camera and infrared sensors in a fish ladder at a dam on the river. The camera is triggered to capture 30 seconds of video whenever a fish swims past the infrared sensors. The recorded videos are then reviewed manually to identify the fish species in them.
Manually identifying species from captured video is resource intensive in terms of time, people, and cost. So when the Nisqually River Foundation, a Washington-based nature conservation organization, faced a similar challenge in monitoring salmon species identification, they approached us for an automated, technology-driven solution.
First, the collected video feeds were processed to extract the relevant frames. Deep learning models were then trained to draw bounding boxes around each fish passing the camera. The entire workflow, encapsulated in a web app, automated the process of video feed input, detection, and classification. The solution leveraged deep learning algorithms implemented on the Microsoft Azure and Cognitive Services platform stack.
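The frame-extraction step can be sketched as follows. This is a minimal illustration, not the project's actual code: the helper name and sampling interval are assumptions, and in practice the frames themselves would be decoded with a library such as OpenCV's `VideoCapture` before being sent to the detector.

```python
def frame_indices(duration_s: float, fps: float, sample_every_s: float) -> list[int]:
    """Pick which frame indices to extract from a clip for detection.

    duration_s: clip length in seconds (the camera records 30 s per trigger).
    fps: frames per second of the recording.
    sample_every_s: how often to keep a frame for the detector.
    """
    step = max(1, round(fps * sample_every_s))  # frames between kept samples
    total = int(duration_s * fps)               # total frames in the clip
    return list(range(0, total, step))

# A 30-second clip at 30 fps, sampled twice per second, yields 60 frames.
indices = frame_indices(30, 30, 0.5)
```

Sampling frames at a fixed interval, rather than running the detector on every frame, keeps the per-clip inference cost bounded while still covering the full 30-second window.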
Given the nature of the problem and the format of the video files, processing power was a key requirement for the training and validation phases. A GPU machine was the natural choice for running the object detection models, so we selected the NC6 GPU VM in the Azure portal.
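Provisioning such a VM can also be scripted rather than done through the portal. A sketch with the Azure CLI follows; the resource group, VM name, region, and image are illustrative placeholders, not the project's actual configuration:

```shell
# Create a resource group and a Standard_NC6 GPU VM (1 x NVIDIA K80).
# All names and the region below are placeholders for illustration.
az group create --name fish-id-rg --location westus2

az vm create \
  --resource-group fish-id-rg \
  --name fish-id-gpu \
  --size Standard_NC6 \
  --image Ubuntu2204 \
  --admin-username azureuser \
  --generate-ssh-keys
```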
The Microsoft AI for Earth team was a key enabler of the project's success, providing timely technical support and resolution of AI platform queries.
This web-based AI solution saves the client valuable hours of expert biologist time and the infrastructure costs of manually reviewing the videos. As part of a planned upgrade, an enhanced version of the solution has been delivered to the customer, which is projected to yield cost savings of around 80%.