When Trackers Date Fish

A Benchmark and Framework for Underwater Multiple Fish Tracking

Weiran Li, Yeqiang Liu, Qiannan Guo, Yijie Wei, Hwa Liang Leo, Zhenbo Li*

Abstract

Multiple object tracking (MOT) technology has made significant progress in terrestrial applications, but underwater tracking scenarios remain underexplored despite their importance to marine ecology and aquaculture. We present Multiple Fish Tracking Dataset 2025 (MFT25), the first comprehensive dataset specifically designed for underwater multiple fish tracking, featuring 15 diverse video sequences with 408,578 meticulously annotated bounding boxes across 48,066 frames.

Key Results

  • 15 video sequences
  • 408K annotated bounding boxes
  • 34.1 HOTA (SU-T†)
  • 44.6 IDF1 (SU-T†)

Key Features

MFT25 Dataset

  • Multiple object tracking (MOT) benchmark covering multiple fish species
  • 15 diverse video sequences
  • 408,578 annotated bounding boxes
  • 48,066 frames
  • Footage collected in both real aquaculture farms and laboratory scenes

SU-T Framework

  • Unscented Kalman Filter (UKF) motion model
  • Fish-Intersection-over-Union (FishIoU) matching metric
  • Optimized for non-linear fish swimming patterns
  • State-of-the-art performance on MFT25 (34.1 HOTA, 44.6 IDF1)
  • Specialized for aquatic species (a minimal tracking sketch follows below)
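
Since SU-T is listed among the separate-detection-and-embedding (SDE) trackers and is built around a UKF motion model with a FishIoU matching cost, the sketch below shows how a per-track UKF with a constant-velocity state could be set up using the filterpy library. This is a minimal illustration under our own assumptions (state layout, noise values), not the authors' implementation; plain IoU stands in for FishIoU, whose exact formulation is given in the paper.

```python
import numpy as np
from filterpy.kalman import UnscentedKalmanFilter, MerweScaledSigmaPoints

# State: [cx, cy, w, h, vcx, vcy] -- box centre, size, and centre velocity (assumed layout).
# Measurement: [cx, cy, w, h] from the detector.

def fx(x, dt):
    """Constant-velocity transition for the box centre; size held constant."""
    cx, cy, w, h, vx, vy = x
    return np.array([cx + vx * dt, cy + vy * dt, w, h, vx, vy])

def hx(x):
    """Measurement function: the detector observes the box, not the velocity."""
    return x[:4]

def make_track_filter(box, dt=1.0):
    """Create a UKF for one fish track, initialised from its first detection."""
    points = MerweScaledSigmaPoints(n=6, alpha=0.1, beta=2.0, kappa=0.0)
    ukf = UnscentedKalmanFilter(dim_x=6, dim_z=4, dt=dt, fx=fx, hx=hx, points=points)
    ukf.x = np.array([*box, 0.0, 0.0])      # zero initial velocity
    ukf.P *= 10.0                           # broad initial uncertainty
    ukf.R = np.diag([1.0, 1.0, 4.0, 4.0])   # detector noise (assumed)
    ukf.Q = np.eye(6) * 0.05                # process noise (assumed)
    return ukf

def iou(a, b):
    """Plain IoU between two [cx, cy, w, h] boxes (FishIoU would replace this)."""
    ax1, ay1, ax2, ay2 = a[0]-a[2]/2, a[1]-a[3]/2, a[0]+a[2]/2, a[1]+a[3]/2
    bx1, by1, bx2, by2 = b[0]-b[2]/2, b[1]-b[3]/2, b[0]+b[2]/2, b[1]+b[3]/2
    iw = max(0.0, min(ax2, bx2) - max(ax1, bx1))
    ih = max(0.0, min(ay2, by2) - max(ay1, by1))
    inter = iw * ih
    union = a[2] * a[3] + b[2] * b[3] - inter
    return inter / union if union > 0 else 0.0

# Per frame, for every live track:
#   ukf.predict()                       -> predicted box in ukf.x[:4]
#   match predictions to detections (Hungarian assignment on 1 - IoU)
#   ukf.update(np.asarray(det_box))     -> correct the matched track
```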

Downloads

Dataset

Download the MFT25 dataset from BaiduYun:

Dataset includes:

  • 15 video sequences
  • Annotation files
  • README and documentation
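
The annotation format is not spelled out on this page; if the files follow the common MOT Challenge layout (one CSV row per box: frame, track_id, x, y, w, h, conf, class, visibility), a minimal loader could look like the sketch below. The column layout and the example path are assumptions, not taken from the release, so check the bundled README first.

```python
import csv
from collections import defaultdict

def load_mot_annotations(path):
    """Parse a MOT Challenge-style annotation file (assumed layout:
    frame, track_id, x, y, w, h, conf, class, visibility).
    Returns {frame: [(track_id, (x, y, w, h)), ...]}."""
    per_frame = defaultdict(list)
    with open(path, newline="") as f:
        for row in csv.reader(f):
            if not row:
                continue
            frame, track_id = int(row[0]), int(row[1])
            x, y, w, h = map(float, row[2:6])
            per_frame[frame].append((track_id, (x, y, w, h)))
    return per_frame

# Example (hypothetical path inside the downloaded dataset):
# boxes = load_mot_annotations("MFT25/train/seq01/gt/gt.txt")
```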

Pretrained Models

Download our pretrained models:

Models include:

  • SU-T base model
  • SU-T with ReID module
  • Model checkpoints
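
The checkpoint format is not stated here; assuming standard PyTorch .pth files, a quick way to inspect a downloaded checkpoint before wiring it into the tracker is sketched below (the filename is hypothetical).

```python
import torch

# Hypothetical filename -- substitute the actual checkpoint from the download.
ckpt = torch.load("SU-T_reid.pth", map_location="cpu")

# Releases often wrap weights in a dict (e.g. under "model" or "state_dict");
# inspect the top-level keys before loading them into a network.
if isinstance(ckpt, dict):
    print(list(ckpt.keys())[:10])
```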

Sample Results

Framework Overview

Performance Comparison

Method          Class  Year  HOTA↑   IDF1↑
FairMOT         JDE    2021  22.226  26.867
CMFTNet         JDE    2022  22.432  27.659
TransTrack      TF     2021  30.426  35.215
TransCenter     TF     2023  27.896  30.278
TrackFormer     TF     2022  30.361  35.285
TFMFT           TF     2024  25.440  33.950
SORT            SDE    2016  29.063  34.119
ByteTrack       SDE    2022  31.758  40.355
BoT-SORT        SDE    2022  26.848  36.847
OC-SORT         SDE    2023  25.017  34.620
Deep-OC-SORT    SDE    2023  24.848  34.176
HybridSORT      SDE    2024  32.258  38.421
HybridSORT†     SDE    2024  32.705  41.727
SU-T (Ours)     SDE    2025  33.351  41.717
SU-T† (Ours)    SDE    2025  34.067  44.643

Class abbreviations: JDE = joint detection and embedding; TF = transformer-based; SDE = separate detection and embedding (tracking-by-detection). † marks the variant with the ReID module.
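
HOTA and IDF1 are standard MOT metrics: HOTA is usually computed with the TrackEval toolkit, while IDF1 can be checked quickly with the py-motmetrics package. A minimal sketch, assuming per-frame ground-truth and tracker boxes in [x, y, w, h] pixel format (the helper name is ours):

```python
import motmetrics as mm

# One accumulator per sequence; frames are pushed in temporal order.
acc = mm.MOTAccumulator(auto_id=True)

def push_frame(gt, hyp):
    """gt / hyp: {track_id: [x, y, w, h]} for one frame."""
    dists = mm.distances.iou_matrix(list(gt.values()), list(hyp.values()), max_iou=0.5)
    acc.update(list(gt.keys()), list(hyp.keys()), dists)

# After all frames have been pushed:
mh = mm.metrics.create()
summary = mh.compute(acc, metrics=["idf1", "idp", "idr", "num_switches"], name="MFT25")
print(mm.io.render_summary(summary, formatters=mh.formatters,
                           namemap=mm.io.motchallenge_metric_names))
```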

Contact

For questions and discussions, please contact:

vranlee@cau.edu.cn or weiranli@u.nus.edu