Development of a live object-tracking system
The Quick Take
Zebrafish behaviour experiments are typically recorded with a stationary overhead camera running at 200 Hz. This works well for measuring most behaviour, but it lacks the resolution to observe finer details such as the pectoral fins. A new setup was devised in which the camera would be placed closer to the fish and move along with it. My part in this project was to develop the pipeline from image acquisition to motor correction. I streamlined the pipeline and redesigned the detection algorithm to run at sub-millisecond speed, resulting in a 16x speed-up.
Introduction
Behaviour experiments are conducted with a stationary camera. The amount of detail this camera can capture is constrained by two factors: (1) the distance to the fish, and (2) the computer's processing speed. Placing the camera closer would require a smaller arena to keep the fish from swimming out of view, and switching to a higher-resolution sensor would make it impossible to process frames at 200 Hz, the rate required to capture all details of the behaviour.
As a new project in the lab needed to capture more detail of the fish (the pectoral fins in particular), the following solution was proposed: the camera would be mounted on a linear-motor system and move with the fish, allowing it to be placed much closer. This also allowed for a bigger arena (constrained only by the reach of the motor system), and with an additional motor on the z-axis it became possible to track the fish's depth as well.
The new setup also introduced new challenges. Zebrafish can accelerate and swim very fast, especially during 'startle responses' - sudden bursts of movement triggered by external stimuli. Following these requires a fast response from the system, which is especially challenging because linear-motor systems are typically optimised for precision rather than speed. In addition, accelerating the camera too hard causes vibration - bad for the fish and for the image alike.
Execution
When I joined the project, an initial pipeline was already in place, built around an older motor system that happened to be available at the time. That motor system responded too slowly and was in the process of being replaced. In the meantime, I profiled the existing pipeline and found its response time to be around 16 milliseconds.
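As an illustration, latency of this kind can be broken down by timestamping each stage of the tracking loop. The sketch below is a minimal example of how such a breakdown might be collected; the stage functions are placeholders, not the lab's actual code.

```python
import time

def profile_pipeline(grab_frame, detect_fish, send_motor_command, n_frames=1000):
    """Measure per-stage latency of the tracking loop.

    grab_frame, detect_fish and send_motor_command stand in for the real
    pipeline stages; only the timing logic matters here.
    """
    totals = {"acquisition": 0.0, "detection": 0.0, "motor": 0.0}
    for _ in range(n_frames):
        t0 = time.perf_counter()
        frame = grab_frame()
        t1 = time.perf_counter()
        position = detect_fish(frame)
        t2 = time.perf_counter()
        send_motor_command(position)
        t3 = time.perf_counter()
        totals["acquisition"] += t1 - t0
        totals["detection"] += t2 - t1
        totals["motor"] += t3 - t2
    # Mean latency of each stage in milliseconds.
    return {stage: 1000 * t / n_frames for stage, t in totals.items()}
```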
A major bottleneck in the detection algorithm was that it tracked the whole body and the relative positions of body parts - useful for behaviour analysis, but unnecessary for fast detection. Instead, I exploited the fact that the eyes show up as two large dark spots next to each other: by detecting one eye and checking whether the other lies within the expected distance, the fish can be detected reliably.
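A minimal sketch of this idea is shown below. The threshold, blob-size limits and eye-distance range are illustrative assumptions, not the values used in the actual system.

```python
import cv2
import numpy as np

# Illustrative parameters - real values depend on magnification and lighting.
EYE_THRESHOLD = 50             # pixel intensities below this count as "dark"
MIN_AREA, MAX_AREA = 30, 400   # plausible eye size in pixels
MIN_DIST, MAX_DIST = 5, 40     # plausible eye-to-eye distance in pixels

def detect_fish(gray_frame):
    """Return the midpoint between the two eyes, or None if no fish is found."""
    # Dark pixels become foreground after inverse thresholding.
    _, dark = cv2.threshold(gray_frame, EYE_THRESHOLD, 255, cv2.THRESH_BINARY_INV)
    n, _, stats, centroids = cv2.connectedComponentsWithStats(dark)

    # Keep blobs whose area is consistent with an eye (label 0 is background).
    eyes = [centroids[i] for i in range(1, n)
            if MIN_AREA <= stats[i, cv2.CC_STAT_AREA] <= MAX_AREA]

    # Look for a pair of eye-sized blobs at a plausible distance from each other.
    for i in range(len(eyes)):
        for j in range(i + 1, len(eyes)):
            if MIN_DIST <= np.linalg.norm(eyes[i] - eyes[j]) <= MAX_DIST:
                return (eyes[i] + eyes[j]) / 2.0
    return None
```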
The experiment software was written in Python; within that constraint I achieved considerable speed gains by sticking to OpenCV functions for image operations and using Numba for just-in-time compilation of the remaining hot loops.
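As a hedged illustration of the Numba approach (not the actual pipeline code), a pure-Python per-pixel loop can be compiled to machine code with a single decorator:

```python
import numpy as np
from numba import njit

@njit(cache=True)
def dark_pixel_centroid(gray_frame, threshold):
    """Centroid of all pixels darker than `threshold` (illustrative hot loop).

    With @njit the per-pixel loop runs as compiled machine code rather than
    interpreted Python bytecode.
    """
    sum_y, sum_x, count = 0.0, 0.0, 0
    for y in range(gray_frame.shape[0]):
        for x in range(gray_frame.shape[1]):
            if gray_frame[y, x] < threshold:
                sum_y += y
                sum_x += x
                count += 1
    if count == 0:
        return -1.0, -1.0
    return sum_y / count, sum_x / count

# The first call triggers compilation; subsequent calls run at native speed.
frame = np.random.randint(0, 256, (512, 512), dtype=np.uint8)
print(dark_pixel_centroid(frame, 50))
```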
After I integrated the new motor system into the pipeline, it was possible to keep the fish in the field of view even during startle responses. What remained was an algorithm for tracking the relative depth of the fish. The change in depth had to be estimated from the sharpness of the fish in the image. This was particularly tricky: sharpness alone does not tell you whether the fish moved up or down - you only find out by moving the camera and measuring again. In addition, the sharpness varied between fish, was fairly noisy, and changed nonlinearly with distance.
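The write-up does not specify how sharpness was quantified; a common focus measure, shown here purely as an illustration, is the variance of the Laplacian of the image region around the fish:

```python
import cv2

def sharpness(gray_roi):
    """Focus measure for the image region around the fish.

    Variance of the Laplacian is one common sharpness proxy: an in-focus
    region has strong edges and therefore a high-variance Laplacian, while
    a defocused region is smooth and scores low.
    """
    return cv2.Laplacian(gray_roi, cv2.CV_64F).var()
```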
I kept the fish in focus using iterative optimisation: the z-motor takes random steps up and down in search of better sharpness, with the step size growing when sharpness is low and shrinking when it is high. This allowed the system to operate despite the noise and to adapt quickly when the fish changed depth.
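The sketch below captures this idea under several assumptions: `move_z`, `read_sharpness`, the sharpness normalisation, the step-size bounds, and the accept-or-revert rule are all placeholders for illustration, not the interfaces or values of the real system.

```python
import random

# Illustrative bounds - real values depend on the optics and the motor.
MIN_STEP, MAX_STEP = 0.01, 1.0   # step size in mm
SHARPNESS_TARGET = 1.0           # sharpness value considered "in focus"

def focus_step(move_z, read_sharpness, previous_sharpness):
    """One iteration of the random up/down search for better sharpness.

    move_z(dz) and read_sharpness() are hypothetical hardware interfaces:
    the first nudges the camera along the z-axis, the second returns the
    current sharpness of the fish in the image.
    """
    # Low sharpness -> large step (far from focus, direction unknown);
    # high sharpness -> small step (stay close to the optimum).
    quality = min(previous_sharpness / SHARPNESS_TARGET, 1.0)
    step = MAX_STEP - quality * (MAX_STEP - MIN_STEP)

    # Try a random direction and keep the move only if sharpness improved.
    direction = random.choice((-1, 1))
    move_z(direction * step)
    new_sharpness = read_sharpness()
    if new_sharpness < previous_sharpness:
        move_z(-direction * step)   # revert the move
        return previous_sharpness
    return new_sharpness
```

Called once per frame, this keeps nudging the camera towards the depth at which the fish appears sharpest while tolerating noisy sharpness readings.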
Results
The final pipeline achieved real-time tracking and kept the fish in view even during startle responses. This was made possible by optimising the detection algorithm to sub-millisecond speed and by upgrading the motor system. I also developed an optimisation algorithm that tracked the relative depth of the fish and kept it in focus based on its sharpness in the image.
Nathan