Underwater Plastic Detection
Unit(s) of assessment: Computer Science and Informatics
Research theme: Computational Intelligence and Applications Research Group (CIA)
School: School of Science and Technology
Plastics discarded by humans are typically transported to the oceans by rivers. It is not clear how much plastic travels from land to sea each year because of a lack of metrics and standardised monitoring techniques. One common approach is to monitor plastics floating on the water surface. However, only a small fraction of the plastic waste in rivers and oceans floats on the surface; the rest sinks into deeper waters or settles on the river or ocean floor, threatening the local flora and fauna. So far, there is no way of detecting plastic at the bottom of rivers or oceans on a large scale. Traditional monitoring methods, in which divers manually collect image data along lines or taut cords (so-called transects), only allow assertions about very limited areas. In addition, these methods are highly time-consuming and expensive, and yield very limited results. Moreover, they do not provide georeferenced data that can be used to find locations again, for example to recover plastic or check its condition.
In this research project, we propose a non-invasive approach in which an underwater drone, funded by the department in July 2021, is used to collect georeferenced visual data to assess how macro-plastics are transported underwater, how much of that plastic is deposited on the bed of the River Trent and how those plastics affect the ecosystem. The multi-disciplinary team involved in this project will analyse the data from different perspectives and use the findings to prepare and submit a research proposal to BBSRC/UKRI, to further our understanding of how much plastic is carried underwater, how much is deposited in the River Trent's sediment and how it affects the ecosystem. This research project aims to develop a proof-of-concept validator. The project, if funded, will allow us to prepare the existing underwater drone to systematically collect and label data for post-processing by academic staff. A research assistant will support the academic staff in collecting the initial datasets, which will be analysed to generate the initial results used to prepare a stronger proposal for BBSRC/UKRI thematic calls on underwater plastic detection.
To develop a programme to:
- Design and implement a systematic approach for collecting and labelling visual datasets using an underwater drone.
- Process the datasets using ML/AI/statistical-analysis algorithms to understand how plastic is transported underwater, how it is deposited in the sediment at the bottom of the River Trent and how it affects the ecosystem.
- Design and implement a procedure to systematically collect and label datasets.
- Collect the initial datasets to be processed using ML/AI/statistical-analysis algorithms.
- Demonstrate the dataset collection using an underwater drone.
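As a sketch of how the georeferencing step above might work, frames captured by the drone could be matched to logged GPS fixes by linear interpolation on timestamps. This is a minimal illustration only; the function name and the `(timestamp, lat, lon)` telemetry layout are assumptions, since the drone's actual log format is not specified here.

```python
from bisect import bisect_left

def interpolate_position(fixes, t):
    """Linearly interpolate a (lat, lon) position at time t from a
    time-sorted list of (timestamp, lat, lon) GPS fixes.
    Hypothetical sketch: the drone's real telemetry format may differ."""
    times = [f[0] for f in fixes]
    i = bisect_left(times, t)
    if i == 0:                      # before the first fix: clamp
        return fixes[0][1], fixes[0][2]
    if i == len(fixes):             # after the last fix: clamp
        return fixes[-1][1], fixes[-1][2]
    t0, lat0, lon0 = fixes[i - 1]
    t1, lat1, lon1 = fixes[i]
    w = (t - t0) / (t1 - t0)        # fraction of the way between fixes
    return lat0 + w * (lat1 - lat0), lon0 + w * (lon1 - lon0)

# Example: two fixes 10 s apart; a frame captured at t = 5 s lands midway.
fixes = [(0.0, 52.95, -1.15), (10.0, 52.96, -1.14)]
print(interpolate_position(fixes, 5.0))
```

Linear interpolation is adequate here because consecutive GPS fixes on a slow-moving drone are close together; over such short distances, treating latitude/longitude as planar coordinates introduces negligible error.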
Download the UPD dataset from https://zenodo.org/record/6907230#.Yt_INOzMJH4
This video shows plastic and crayfish being detected. YOLOv5 nano is being used to perform classification in real-time.
It can be seen that the model is robust and capable of detecting both plastic and crayfish in low-visibility conditions.
The dataset was annotated, augmented and generated using the Roboflow platform.
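Roboflow can export annotations in the plain-text YOLO label format that YOLOv5 trains on: one line per object, `class x_center y_center width height`, with all coordinates normalised to [0, 1]. The sketch below decodes one such line into a pixel-space bounding box; the function name is illustrative, not part of any library.

```python
def yolo_to_pixel_box(line, img_w, img_h):
    """Convert one line of a YOLO-format label file
    ('class x_center y_center width height', all normalised to [0, 1])
    into (class_id, x_min, y_min, x_max, y_max) in pixels."""
    cls, xc, yc, w, h = line.split()
    xc, yc, w, h = (float(v) for v in (xc, yc, w, h))
    x_min = (xc - w / 2) * img_w
    y_min = (yc - h / 2) * img_h
    x_max = (xc + w / 2) * img_w
    y_max = (yc + h / 2) * img_h
    return int(cls), x_min, y_min, x_max, y_max

# A 640x480 frame with a box centred mid-frame, half the frame wide and high:
print(yolo_to_pixel_box("0 0.5 0.5 0.5 0.5", 640, 480))
# → (0, 160.0, 120.0, 480.0, 360.0)
```

Keeping labels in this normalised form means the same annotation file remains valid if frames are resized during augmentation, which is why the YOLO family uses it.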
Mr Flemming Christensen – Sundance Multiprocessor Technology
Mr. Matt Easter - Trent River Trust
Dr Pedro Machado (PI) https://www.ntu.ac.uk/staff-profiles/science-technology/pedro-miguel-baptista-machado
Dr Doratha Vinkemeier https://www.ntu.ac.uk/staff-profiles/science-technology/doratha-vinkemeier
Dr Farhad Fassihi Tash https://www.ntu.ac.uk/staff-profiles/science-technology/farhad-fassihi-tash
Dr Isibor Kennedy Ihianle https://www.ntu.ac.uk/staff-profiles/science-technology/Isibor-Kennedy-Ihianle
Dr Omprakash Kaiwartya https://www.ntu.ac.uk/staff-profiles/science-technology/omprakash-kaiwartya
Francisco de Lemos
P. Machado, A. Oikonomou, J. F. Ferreira and T. M. Mcginnity, "HSMD: An Object Motion Detection Algorithm Using a Hybrid Spiking Neural Network Architecture," in IEEE Access, vol. 9, pp. 125258-125268, 2021, doi: 10.1109/ACCESS.2021.3111005.
Bottcher, W.; Machado, P.; Lama, N.; and McGinnity, T.M.; 2021. Object recognition for robotics from tactile time series data utilising different neural network architectures. In: Proceedings of 2021 International Joint Conference on Neural Networks (IJCNN 2021), Virtual Event, 18-22 July 2021.
Brandenburg, S.; Machado, P.; Shinde, P.; Ferreira, J.F.; and McGinnity, T.M.; 2019. Object classification for robotic platforms. In: ROBOT 2019: Fourth Iberian Robotics Conference, Porto, Portugal, 20-22 November 2019.