Triton: Igiugig Fish Video Analysis


Publication Date: August 01, 2017
Document Number: PNNL-26576
Pages: 60


Matzner, S.; Trostle, C.; Staines, G.; Hull, R.; Avila, A.; Harker-Klimes, G. (2017). Triton: Igiugig Fish Video Analysis. Report by Pacific Northwest National Laboratory (PNNL). pp 60.

Tidal and instream turbine technologies are currently being investigated for power generation in a variety of locations in the US. Environmental permitting and consenting requirements parallel this exploration, generating the need to ensure that little or no harm, in the form of strike or collision, befalls marine animals from device deployments. Monitoring methods (e.g., underwater cameras, active acoustics, passive acoustics) deployed around turbines provide empirical data that allow regulators and other stakeholders to assess risk. At present, a high level of concern and limited data preclude robust conclusions, which creates a challenge for regulators who must make decisions based on perceived rather than actual risk. However, the data currently available to the scientific community indicate that the issue poses low risk to date and that strikes or collisions are rare events. One such dataset that provides insight into the rarity of strike and collision risk to fish came from an instream turbine deployment in Alaska that used underwater video as the monitoring method.


This document describes the analysis of video data collected around the Ocean Renewable Power Company’s RivGen® device deployed in the Kvichak River during July and August 2015 to gain an understanding of the implications of using underwater video cameras as a fish monitoring technique. The data were analyzed manually and used to develop automated algorithms for detecting fish in the video frames and describing their interaction behavior relative to the device. In addition, Pacific Northwest National Laboratory (PNNL) researchers developed a web application, EyeSea, to combine manual and automated processing, so that ultimately the automated algorithms could be used to identify where human analysis was needed (i.e., when fish are present in video frames).


The goal of the project was to develop software algorithms that could identify video frames with fish present to inform and accelerate manual analysis. To achieve this, independent manual analysis was completed for specific video clips (i.e., visual analysis and annotation by a human observer was the standard for assessing the algorithms). The analysis process indicated that some confounding aspects of the algorithm development could potentially be solved with recommended improvements in the initial camera data collection methods.


The manual analysis began with all data from the start of the RivGen® deployment, primarily using video from Camera 2, which looked directly at the upstream side of the turbine so that any interaction could be identified; this was to ensure rare events were seen, and the effort initially focused on nighttime data, when more fish were present. This process highlighted how long it takes to identify fish: because the analysis was so time-consuming, only 42.33 hours of video were reviewed. Detections were classified as “Fish” when the reviewer was confident the target was a fish and “Maybe” when it was difficult to distinguish; the two classes were separated based on movement, shape, and color characteristics. Fish Events were further classified by age as “adult”, “juvenile”, or “unidentifiable”. Behavioral attributes were noted and broadly divided into Passive and Avoidance activities. In the more than 42 hours of data reviewed, there were only 20 potential contact interactions, of which 3 were Maybe classifications, 12 were juveniles, and 5 were adults. Although only 11.5% of the Camera 2 video was analyzed, these results cover the period when most fish were present during the turbine deployment (based on Alaska Department of Fish and Game data) and provide preliminary evidence that strike or collision of fish with an instream turbine in the Kvichak River is rare.
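The classification scheme described above can be sketched as a simple annotation record. The field names, types, and example values below are illustrative assumptions, not the report's actual data schema:

```python
# Hypothetical annotation record mirroring the manual classification scheme:
# a confidence class ("Fish"/"Maybe"), an age class, and a broad behavior
# category, plus a flag for potential contact interactions.
from dataclasses import dataclass
from enum import Enum

class Confidence(Enum):
    FISH = "Fish"    # reviewer confident the target is a fish
    MAYBE = "Maybe"  # difficult to distinguish from debris or noise

class Age(Enum):
    ADULT = "adult"
    JUVENILE = "juvenile"
    UNIDENTIFIABLE = "unidentifiable"

class Behavior(Enum):
    PASSIVE = "Passive"      # e.g., drifting past the device
    AVOIDANCE = "Avoidance"  # e.g., active change of course

@dataclass
class FishEvent:
    camera: int              # e.g., 2 for the upstream-facing Camera 2
    clip_time: str           # position of the event within the video clip
    confidence: Confidence
    age: Age
    behavior: Behavior
    potential_contact: bool = False

# Example record (values are invented for illustration).
event = FishEvent(camera=2, clip_time="00:14:32",
                  confidence=Confidence.MAYBE, age=Age.JUVENILE,
                  behavior=Behavior.AVOIDANCE, potential_contact=True)
```

Structuring annotations this way makes tallies like “20 potential contact interactions, of which 3 were Maybe classifications” a simple filter over the event list.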


An actual contact was confirmed on only one occasion, and it involved an adult fish contacting the camera, not the turbine itself. This experience highlights the difficulty of confirming whether a strike or collision event occurred or was a near-miss. More interactions were detected at night; this was probably biased by the nighttime use of artificial light, which may have attracted fish but could also have increased detection probability, because light reflecting off a fish makes it more visible.


For the algorithm development, background subtraction, optical flow, and deep learning techniques were considered. The deep learning approach was determined to need too much training data for this application, so it was not pursued. Optical flow was considered promising but did not give immediate results, so it needs further investigation. Background subtraction was therefore the main focus of algorithm development. Three methods of background subtraction were tried: Robust Principal Components Analysis (RPCA), Gaussian Mixture Model (GMM), and Visual Background Extractor (ViBe). A classification technique was then applied to the foreground images to determine fish presence. Using this combination, fish could be identified accurately when they occupied a larger number of pixels (>200 pixels, 98.2% correct; 100–200 pixels, 99.6% correct; 5–100 pixels, 85.4% correct; 2–5 pixels, 66.3% correct).
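To illustrate the general idea, here is a minimal sketch of per-pixel background subtraction followed by a pixel-count presence test. This uses a single running Gaussian per pixel as a simplified stand-in for the GMM approach named above; the class, function names, thresholds, and learning rate are assumptions for illustration, not the parameters of the PNNL pipeline:

```python
import numpy as np

class RunningGaussianBG:
    """Per-pixel running Gaussian background model (simplified stand-in
    for a full GMM). Pixels more than `k` standard deviations from the
    background mean are flagged as foreground (candidate fish pixels)."""

    def __init__(self, alpha=0.05, k=2.5, min_std=2.0):
        self.alpha = alpha      # learning rate for the background update
        self.k = k              # foreground threshold, in std deviations
        self.min_std = min_std  # floor on std to avoid zero-variance pixels
        self.mean = None
        self.var = None

    def apply(self, frame):
        """Return a boolean foreground mask for one grayscale frame."""
        frame = frame.astype(np.float64)
        if self.mean is None:           # first frame initializes the model
            self.mean = frame.copy()
            self.var = np.full_like(frame, self.min_std ** 2)
            return np.zeros(frame.shape, dtype=bool)
        std = np.sqrt(self.var)
        fg = np.abs(frame - self.mean) > self.k * np.maximum(std, self.min_std)
        # Update the model only at background pixels, so that slow-moving
        # fish are not absorbed into the background estimate.
        a = np.where(fg, 0.0, self.alpha)
        diff = frame - self.mean
        self.mean += a * diff
        self.var = (1 - a) * self.var + a * diff ** 2
        return fg

def fish_candidate(fg_mask, min_pixels=5):
    """Presence test: enough foreground pixels to be a possible fish."""
    return int(fg_mask.sum()) >= min_pixels

# Toy demo: a noisy static background, with a bright 4x4 "fish"
# appearing in the last two frames.
bg = RunningGaussianBG()
rng = np.random.default_rng(0)
for t in range(12):
    frame = 50 + rng.normal(0, 1, size=(32, 32))
    if t >= 10:
        frame[10:14, 10:14] += 100.0   # synthetic moving object
    mask = bg.apply(frame)
print(fish_candidate(mask))            # the synthetic object is detected
```

The accuracy figures above suggest why a minimum-pixel threshold matters: detection was reliable for targets above ~5 pixels but degraded sharply for 2–5-pixel targets, where foreground blobs are hard to separate from sensor noise.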


In parallel, EyeSea was developed to convert the video data to a usable form and to enable manual and automated analysis of the data that would have a standardized output.


Recommendations for further research and for optimizing data collection and analysis methods include the following:

  • Research
    • Conduct more studies of the effect of lights on fish behavior.
    • Investigate the use of low light video applications as an alternative to using lights.
    • Further investigate optical flow techniques and their applicability for automated analysis.
    • Further refine the parameters for background subtraction in automated analysis.
  • Standardized techniques
    • Include markings on the turbines to determine relative range and size of fish within the field of view.
    • Use a standardized (non-proprietary) video format that has a consistent frame rate of at least 25 frames per second.
    • Use a scientific camera designed for underwater measurement in low light environments that has a field of view appropriate for the observations and a pixel resolution high enough to determine fish within the given range.
    • Carefully consider the use of lights and how they illuminate the areas of interest.
    • Keep standardized and detailed records, including metadata.
    • Use other monitoring technologies (e.g., strain sensors on turbine blades) to determine actual collision or strike events.