9. References
Aguzzi, J., Chatzievangelou, D., Marini, S., Fanelli, E., Danovaro, R., Flögel, S., . . . Company, J.B. (2019). New high-tech flexible networks for the monitoring of deep-sea ecosystems. Environmental Science & Technology, 53, 6616-6631. http://dx.doi.org/10.1021/acs.est.9b00409
Arvind, C.S., Prajwal, R., Bhat, P.N., Sreedevi, A. & Prabhudeva, K.N. (2019). Fish detection and tracking in pisciculture environment using deep instance segmentation. In Proceedings of the 2019 IEEE Region 10 Conference (pp. 778-783). New York: IEEE.
Beyan, C. & Browman, H.I. (2020). Setting the stage for the machine intelligence era in marine science. ICES Journal of Marine Science, http://dx.doi.org/10.1093/icesjms/fsaa084
Bicknell, A.W.J., Godley, B.J., Sheehan, E.V., Votier, S.C. & Witt, M.J. (2016). Camera technology for monitoring marine biodiversity and human impact. Frontiers in Ecology and the Environment, 14, 424-432. http://dx.doi.org/10.1002/fee.1322
Bingshan, N., Guangyao, L., Fang, P., Jing, W., Long, Z. & Li, Z. (2018). Survey of fish behavior analysis by computer vision. Journal of Aquaculture Research and Development, 9, 15. http://dx.doi.org/10.4172/2155-9546.1000534
Bolme, D.S., Beveridge, J.R., Draper, B.A. & Lui, Y.M. (2010). Visual object tracking using adaptive correlation filters. IEEE Conference on Computer Vision and Pattern Recognition, pp. 2544-2550.
Botella, C., Joly, A., Bonnet, P., Monestiez, P. & Munoz, F. (2018). Species distribution modeling based on the automated identification of citizen observations. Applications in Plant Sciences, 6, e1029. http://dx.doi.org/10.1002/aps3.1029
Bruneel, S., Gobeyn, S., Verhelst, P., Reubens, J., Moens, T. & Goethals, P. (2018). Implications of movement for species distribution models - rethinking environmental data tools. Science of the Total Environment, 628-629, 893-905. http://dx.doi.org/10.1016/j.scitotenv.2018.02.026
Cheng, J.C., Tsai, Y.H., Hung, W.C., Wang, S.J. & Yang, M.H. (2018). Fast and accurate online video object segmentation via tracking parts. IEEE Conference on Computer Vision and Pattern Recognition, pp. 7415-7424.
Christin, S., Hervet, E. & Lecomte, N. (2019). Applications for deep learning in ecology. Methods in Ecology and Evolution, 10, http://dx.doi.org/10.1111/2041-210x.13256
Chuang, M., Hwang, J., Ye, J., Huang, S. & Williams, K. (2017). Underwater fish tracking for moving cameras based on deformable multiple kernels. IEEE Transactions on Systems, Man, and Cybernetics: Systems, 47, 2467-2477. http://dx.doi.org/10.1109/TSMC.2016.2523943
Cui, S., Zhou, Y., Wang, Y. & Zhai, L. (2020). Fish detection using deep learning. Applied Computational Intelligence and Soft Computing, 2020, 3738108. http://dx.doi.org/10.1155/2020/3738108
Ditria, E., Sievers, M., Lopez-Marcano, S., Jinks, E.L. & Connolly, R.M. (2020a). Deep learning for automated analysis of fish abundance: The benefits of training across multiple habitats. bioRxiv, http://dx.doi.org/10.1101/2020.05.19.105056
Ditria, E.M., Lopez-Marcano, S., Sievers, M., Jinks, E.L., Brown, C.J. & Connolly, R.M. (2020b). Automating the analysis of fish abundance using object detection: Optimizing animal ecology with deep learning. Frontiers in Marine Science, 7, http://dx.doi.org/10.3389/fmars.2020.00429
Everingham, M., Van Gool, L., Williams, C.K.I., Winn, J. & Zisserman, A. (2010). The PASCAL Visual Object Classes (VOC) challenge. International Journal of Computer Vision, 88, 303-338. http://dx.doi.org/10.1007/s11263-009-0275-4
Francisco, F.A., Nührenberg, P. & Jordan, A. (2020). High-resolution, non-invasive animal tracking and reconstruction of local environment in aquatic ecosystems. Movement Ecology, 8, 27. http://dx.doi.org/10.1186/s40462-020-00214-w
Glover-Kapfer, P., Soto-Navarro, C.A. & Wearn, O.R. (2019). Camera-trapping version 3.0: Current constraints and future priorities for development. Remote Sensing in Ecology and Conservation, 5, 209-223. http://dx.doi.org/10.1002/rse2.106
González-Rivero, M., Beijbom, O., Rodriguez-Ramirez, A., Bryant, E.P.D., Ganase, A., Gonzalez-Marrero, Y., . . . Hoegh-Guldberg, O. (2020). Monitoring of coral reefs using artificial intelligence: A feasible and cost-effective approach. Remote Sensing, 12, http://dx.doi.org/10.3390/rs12030489
Grothendieck, G. (2017). sqldf: Manipulate R data frames using SQL. R package version 0.4-11.
Guirado, E., Tabik, S., Rivas, M.L., Alcaraz-Segura, D. & Herrera, F. (2019). Whale counting in satellite and aerial images with deep learning. Scientific Reports, 9, 14259. http://dx.doi.org/10.1038/s41598-019-50795-9
Guo, S., Xu, P., Miao, Q., Shao, G., Chapman, C.A., Chen, X., . . . Li, B. (2020). Automatic identification of individual primates with deep learning techniques. iScience, 23, 101412. http://dx.doi.org/10.1016/j.isci.2020.101412
Han, W., Khorrami, P., Le Paine, T., Ramachandran, P., Babaeizadeh, M., Shi, H., . . . Huang, T. (2016). Seq-NMS for video object detection. ArXiv, https://arxiv.org/abs/1602.08465
He, K.M., Gkioxari, G., Dollár, P. & Girshick, R. (2017). Mask R-CNN. IEEE International Conference on Computer Vision, 2980-2988. http://dx.doi.org/10.1109/ICCV.2017.322
Huo, G., Wu, Z., Li, J. & Li, S. (2018). Underwater target detection and 3d reconstruction system based on binocular vision. Sensors, 18, 3570. http://dx.doi.org/10.3390/s18103570
Jalal, A., Salman, A., Mian, A., Shortis, M. & Shafait, F. (2020). Fish detection and species classification in underwater environments using deep learning with temporal information. Ecological Informatics, 57, 101088. http://dx.doi.org/10.1016/j.ecoinf.2020.101088
Kälin, U., Lang, N., Hug, C., Gessler, A. & Wegner, J.D. (2019). Defoliation estimation of forest trees from ground-level images. Remote Sensing of Environment, 223, 143-153. http://dx.doi.org/10.1016/j.rse.2018.12.021
Kennedy, E.V., Vercelloni, J., Neal, B.P., Ambariyanto, Bryant, D.E.P., Ganase, A., . . . Hoegh-Guldberg, O. (2020). Coral reef community changes in Karimunjawa National Park, Indonesia: Assessing the efficacy of management in the face of local and global stressors. Journal of Marine Science and Engineering, 8, 760. http://dx.doi.org/10.3390/jmse8100760
Kezebou, L., Oludare, V., Panetta, K. & Agaian, S.S. (2019). Underwater object tracking benchmark and dataset. 2019 IEEE International Symposium on Technologies for Homeland Security (HST), pp. 1-6.
Langlois, T., Goetze, J., Bond, T., Monk, J., Abesamis, R.A., Asher, J., . . . Harvey, E.S. (2020). A field and video annotation guide for baited remote underwater stereo-video surveys of demersal fish assemblages. Methods in Ecology and Evolution, http://dx.doi.org/10.1111/2041-210X.13470
Lantsova, E., Voitiuk, T., Zudilova, T. & Kaarna, A. (2016). Using low-quality video sequences for fish detection and tracking. 2016 SAI Computing Conference (SAI), pp. 426-433.
Lecun, Y., Bengio, Y. & Hinton, G. (2015). Deep learning. Nature, 521, 436-444. http://dx.doi.org/10.1038/nature14539
Lopez-Marcano, S., Brown, C.J., Sievers, M. & Connolly, R.M. (2020). The slow rise of technology: Computer vision techniques in fish population connectivity. Aquatic Conservation: Marine and Freshwater Ecosystems, http://dx.doi.org/10.1002/aqc.3432
Marini, S., Fanelli, E., Sbragaglia, V., Azzurro, E., Fernandez, J.D. & Aguzzi, J. (2018). Tracking fish abundance by underwater image recognition. Scientific Reports, 8, 13748. http://dx.doi.org/10.1038/s41598-018-32089-8
Massa, F. & Girshick, R. (2018). maskrcnn-benchmark: Fast, modular reference implementation of instance segmentation and object detection algorithms in PyTorch. Retrieved 29 October 2020, from https://github.com/facebookresearch/maskrcnn-benchmark
Mohamed, H.E.-D., Fadl, A., Anas, O., Wageeh, Y., ElMasry, N., Nabil, A. & Atia, A. (2020). MSR-YOLO: Method to enhance fish detection and tracking in fish farms. Procedia Computer Science, 170, 539-546. http://dx.doi.org/10.1016/j.procs.2020.03.123
Olds, A.D., Nagelkerken, I., Huijbers, C.M., Gilby, B., Pittman, S. & Schlacher, T. (2018). Connectivity in coastal seascapes. In S.J. Pittman, Seascape ecology (pp. 261-291). Hoboken, NJ: John Wiley & Sons Ltd.
Pagès, J.F., Gera, A., Romero, J. & Alcoverro, T. (2014). Matrix composition and patch edges influence plant–herbivore interactions in marine landscapes. Functional Ecology, 28, 1440-1448. http://dx.doi.org/10.1111/1365-2435.12286
Papadakis, V.M., Glaropoulos, A. & Kentouri, M. (2014). Sub-second analysis of fish behavior using a novel computer-vision system. Aquacultural Engineering, 62, 36-41. http://dx.doi.org/10.1016/j.aquaeng.2014.06.003
Prechelt, L. (2012). Early stopping – but when? In G. Montavon, G.B. Orr & K.-R. Müller, Neural networks: Tricks of the trade: Second edition (pp. 53-67). Berlin, Heidelberg: Springer Berlin Heidelberg.
Qian, Z.-M., Wang, S.H., Cheng, X.E. & Chen, Y.Q. (2016). An effective and robust method for tracking multiple fish in video image based on fish head detection. BMC Bioinformatics, 17, 251. http://dx.doi.org/10.1186/s12859-016-1138-y
Rovero, F., Zimmermann, F., Berzi, D. & Meek, P. (2013). "Which camera trap type and how many do I need?" A review of camera features and study designs for a range of wildlife research applications. Hystrix, the Italian Journal of Mammalogy, 24, 148-156. http://dx.doi.org/10.4404/hystrix-24.2-8789
Rowcliffe, J.M., Jansen, P.A., Kays, R., Kranstauber, B. & Carbone, C. (2016). Wildlife speed cameras: Measuring animal travel speed and day range using camera traps. Remote Sensing in Ecology and Conservation, 2, 84-94. http://dx.doi.org/10.1002/rse2.17
Salberg, A. (2015). Detection of seals in remote sensing images using features extracted from deep convolutional neural networks. 2015 IEEE International Geoscience and Remote Sensing Symposium (IGARSS), pp. 1893-1896.
Salman, A., Jalal, A., Shafait, F., Mian, A., Shortis, M., Seager, J. & Harvey, E. (2016). Fish species classification in unconstrained underwater environments based on deep learning. Limnology and Oceanography: Methods, 14, 570-585. http://dx.doi.org/10.1002/lom3.10113
Schneider, S., Taylor, G.W. & Kremer, S.C. (2018). Deep learning object detection methods for ecological camera trap data. 2018 15th Conference on Computer and Robot Vision (CRV), 321-328. http://dx.doi.org/10.1109/CRV.2018.00052
Schneider, S., Taylor, G.W., Linquist, S. & Kremer, S.C. (2019). Past, present and future approaches using computer vision for animal re-identification from camera trap data. Methods in Ecology and Evolution, 10, 461-470. http://dx.doi.org/10.1111/2041-210X.13133
Sidhu, R. (2016). Tutorial on minimum output sum of squared error filter. Master of Science thesis, Colorado State University.
Spampinato, C., Chen-Burger, Y.H., Nadarajan, G. & Fisher, R.B. (2008). Detecting, tracking and counting fish in low quality unconstrained underwater videos. VISAPP Proceedings of the Third International Conference on Computer Vision Theory and Applications, pp. 514+.
Sridhar, V.H., Roche, D.G. & Gingins, S. (2019). Tracktor: Image-based automated tracking of animal movement and behaviour. Methods in Ecology and Evolution, 10, 815-820. http://dx.doi.org/10.1111/2041-210X.13166
van Gemert, J.C., Verschoor, C.R., Mettes, P., Epema, K., Koh, L.P. & Wich, S. (2015). Nature conservation drones for automatic localization and counting of animals. Computer Vision - ECCV 2014 Workshops (eds L. Agapito, M.M. Bronstein & C. Rother), pp. 255-270. Springer International Publishing, Cham.
Villon, S., Chaumont, M., Subsol, G., Villéger, S., Claverie, T. & Mouillot, D. (2016). Coral reef fish detection and recognition in underwater videos by supervised machine learning: Comparison between deep learning and HOG+SVM methods. In (pp. 160-171). Springer International Publishing.
Villon, S., Mouillot, D., Chaumont, M., Darling, E.S., Subsol, G., Claverie, T. & Villeger, S. (2018). A deep learning method for accurate and fast identification of coral reef fishes in underwater images. Ecological Informatics, 48, 238-244. http://dx.doi.org/10.1016/j.ecoinf.2018.09.007
Villon, S., Mouillot, D., Chaumont, M., Subsol, G., Claverie, T. & Villéger, S. (2020). A new method to control error rates in automated species identification with deep learning algorithms. Scientific Reports, 10, 10972. http://dx.doi.org/10.1038/s41598-020-67573-7
Wäldchen, J. & Mäder, P. (2018). Machine learning for image-based species identification. Methods in Ecology and Evolution, 9, 2216-2225. http://dx.doi.org/10.1111/2041-210x.13075
Wang, Q., Zhang, L., Bertinetto, L., Hu, W. & Torr, P. (2019). Fast online object tracking and segmentation: A unifying approach. ArXiv, https://arxiv.org/abs/1812.05050
Watanabe, J.-I., Shao, Y. & Miura, N. (2019). Underwater and airborne monitoring of marine ecosystems and debris. Journal of Applied Remote Sensing, 13, 044509. http://dx.doi.org/10.1117/1.JRS.13.044509
Wearn, O.R. & Glover-Kapfer, P. (2019). Snap happy: Camera traps are an effective sampling tool when compared with alternative methods. Royal Society Open Science, 6, 181748. http://dx.doi.org/10.1098/rsos.181748
Weinstein, B.G. (2018). A computer vision for animal ecology. Journal of Animal Ecology, 87, 533-545. http://dx.doi.org/10.1111/1365-2656.12780
Wickham, H. (2009). ggplot2: Elegant graphics for data analysis (pp. 1-212). New York: Springer.
Wickham, H. & Henry, L. (2019). tidyr: Tidy messy data. R package.
Xiu, L., Min, S., Qin, H. & Liansheng, C. (2015). Fast accurate fish detection and recognition of underwater images with Fast R-CNN. IEEE.
Xu, Z. & Cheng, X.E. (2017). Zebrafish tracking using convolutional neural networks. Scientific Reports, 7, 42815. http://dx.doi.org/10.1038/srep42815
Zhao, Z.-Q., Zheng, P., Xu, S.-T. & Wu, X. (2019). Object detection with deep learning: A review. ArXiv, https://arxiv.org/abs/1807.05511
Zurell, D., Pollock, L.J. & Thuiller, W. (2018). Do joint species distribution models reliably detect interspecific interactions from co-occurrence data in homogenous environments? Ecography, 41, 1812-1819. http://dx.doi.org/10.1111/ecog.03315