Fix latency feature bugs, add more latency tests #58
Closed
jacobbeierle wants to merge 1 commit into master from
Conversation
michberger reviewed Mar 31, 2026 and left a comment:
Unfortunately, this fix did not solve the problem with the latency calculations. I ran this branch on the Rett data and got the same incorrect latency values that we had before. We need to do a bit more digging, I'm afraid.
Author
I'm closing this pull request because the fundamental problem seems to be in the mouse-tracking-runtime repo.
The latency_to_first_prediction and latency_to_last_prediction columns in the binned behavior data were reporting incorrect frame numbers. These columns are supposed to report frame numbers relative to the experiment start, but they were instead reporting frame numbers relative to the individual video file. This meant:
The first time bin (0–5 min) appeared correct by coincidence, since the first video starts at the experiment start (offset = 0).
All later time bins produced nonsensical values, reporting small video-local frame numbers that neither fit the experiment timeline nor matched the merged bout tables.
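The mismatch can be shown with a minimal pandas sketch. The numbers, column names, and offsets here are illustrative only, not taken from the actual bout tables:

```python
import pandas as pd

# Illustrative numbers: two videos at 30 fps, the second starting
# 9000 frames (5 minutes) into the experiment.
bouts = pd.DataFrame({
    "video": [1, 2],
    "start": [100, 200],   # bout start frames, local to each video file
    "duration": [50, 40],
    "offset": [0, 9000],   # frames from experiment start to video start
})

second = bouts[bouts["video"] == 2]

buggy = second["start"].min()                       # 200 (video-local)
fixed = (second["offset"] + second["start"]).min()  # 9200 (experiment-relative)
print(buggy, fixed)
```

For video 1 the two values coincide because its offset is 0, which is why the first bin looked correct.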
What was changed
Source fix — project_utils.py:1083-1084
The two latency calculations in BoutTable.bouts_to_bins() were changed from using the video-local start column to using the experiment-relative adjusted_start and adjusted_end columns:
Before (buggy):

```python
results["latency_to_first_prediction"] = behavior_bins["start"].min()
results["latency_to_last_prediction"] = (
    behavior_bins["start"] + behavior_bins["duration"]
).max()
```

After (fixed):

```python
results["latency_to_first_prediction"] = behavior_bins["adjusted_start"].min()
results["latency_to_last_prediction"] = behavior_bins["adjusted_end"].max()
```
Column definitions:
- start — the frame number where a bout begins within its video file (resets to 0 for each video)
- adjusted_start — the frame number where a bout begins relative to the experiment start, computed as time_to_frame(video_timestamp, experiment_start, fps) + start
- adjusted_end — adjusted_start + duration, the experiment-relative frame where the bout ends
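The relationship between the video-local and experiment-relative columns can be sketched as follows. The time_to_frame here is reimplemented from its description above; the real helper in project_utils.py may differ in name or signature:

```python
from datetime import datetime

import pandas as pd

def time_to_frame(video_timestamp: str, experiment_start: str, fps: float) -> int:
    """Frames elapsed between the experiment start and this video's first frame."""
    delta = (datetime.fromisoformat(video_timestamp)
             - datetime.fromisoformat(experiment_start))
    return round(delta.total_seconds() * fps)

FPS = 30
EXPERIMENT_START = "1970-01-01 00:00:00"

# Two videos: the second starts 5 minutes (9000 frames at 30 fps) later.
bouts = pd.DataFrame({
    "time": ["1970-01-01 00:00:00", "1970-01-01 00:05:00"],
    "start": [100, 200],   # video-local bout start frames
    "duration": [50, 40],
})

offset = bouts["time"].map(lambda t: time_to_frame(t, EXPERIMENT_START, FPS))
bouts["adjusted_start"] = offset + bouts["start"]
bouts["adjusted_end"] = bouts["adjusted_start"] + bouts["duration"]
print(bouts[["adjusted_start", "adjusted_end"]])
```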
Regression tests — test_project_utils.py
Two new test methods were added:
- test_bouts_to_bins_latency_to_first_prediction_nonzero_offset — constructs a DataFrame with two videos at different timestamps (00:00:00 and 00:05:00), uses 5-minute bins, and asserts that the second bin's latency_to_first_prediction equals 9000 + 200 = 9200 (experiment-relative), not 200 (video-local).
- test_bouts_to_bins_latency_to_last_prediction_nonzero_offset — same multi-video setup, asserts that the second bin's latency_to_last_prediction equals 9000 + 800 + 40 = 9840, not 840.
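The expected values in the new tests can be sanity-checked with a standalone sketch. The latencies helper below is hypothetical: it only mirrors the min/max arithmetic and does not call the real BoutTable.bouts_to_bins():

```python
import unittest

FPS = 30
BIN_FRAMES = 5 * 60 * FPS  # 9000 frames per 5-minute bin

def latencies(bouts):
    """bouts: list of (adjusted_start, duration) pairs in experiment-relative frames."""
    first = min(s for s, _ in bouts)
    last = max(s + d for s, d in bouts)
    return first, last

class TestSecondBinLatencies(unittest.TestCase):
    # Second video starts at 00:05:00, i.e. an offset of 9000 frames.
    def test_first_prediction_nonzero_offset(self):
        first, _ = latencies([(9000 + 200, 40)])
        self.assertEqual(first, 9200)  # not the video-local 200

    def test_last_prediction_nonzero_offset(self):
        _, last = latencies([(9000 + 800, 40)])
        self.assertEqual(last, 9840)   # not the video-local 840

if __name__ == "__main__":
    unittest.main(exit=False, verbosity=2)
```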
The existing single-video tests (which use time="1970-01-01 00:00:00" where adjusted_start == start) were preserved and still pass. The new tests use multiple distinct timestamps specifically because bouts_to_bins() falls back to epoch when all timestamps are identical — multi-video data is required to exercise the bug.