placecell.behavior#
Behavior data loading and processing.
Functions

- build_event_place_dataframe(event_index, neural_timestamp_path, behavior_with_speed, behavior_fps, speed_threshold) – Match neural events to behavior positions for place-cell analysis.
- clip_to_arena(positions, arena_bounds) – Clip positions to arena boundaries.
- compute_behavior_speed(positions, timestamps, window_frames) – Compute speed from behavior positions and timestamps using a window.
- correct_perspective(positions, arena_bounds, camera_height_mm, tracking_height_mm) – Correct perspective distortion from overhead camera parallax.
- filter_by_speed(trajectory, speed_threshold) – Filter trajectory to frames above a speed threshold.
- recompute_speed(trajectory, window_frames) – Recompute speed on a trajectory that already has x, y, unix_time.
- remove_position_jumps(positions, threshold_px) – Replace implausible position jumps with linear interpolation.
- placecell.behavior.remove_position_jumps(positions: DataFrame, threshold_px: float) tuple[DataFrame, int]#
Replace implausible position jumps with linear interpolation.
A jump is detected when the frame-to-frame displacement exceeds threshold_px. Jumped frames have their x/y replaced by linearly interpolated values from the surrounding good frames.
- Parameters:
positions (DataFrame) – DataFrame with columns x, y (any other columns are preserved).
threshold_px (float) – Maximum plausible frame-to-frame displacement in pixels.
- Returns:
Tuple of (DataFrame with jumps interpolated, number of frames fixed).
- Return type:
tuple[DataFrame, int]
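A minimal usage sketch; the synthetic trajectory and the 50 px threshold are illustrative values, not defaults of the library:

```python
import pandas as pd

from placecell import behavior

# Synthetic trajectory with one implausible tracking glitch at frame 2.
positions = pd.DataFrame({
    "frame_index": [0, 1, 2, 3, 4],
    "x": [100.0, 102.0, 900.0, 106.0, 108.0],
    "y": [200.0, 201.0, 950.0, 203.0, 204.0],
})

# Displacements above 50 px between consecutive frames are treated as jumps;
# the affected x/y values are replaced by linear interpolation.
cleaned, n_fixed = behavior.remove_position_jumps(positions, threshold_px=50.0)
print(n_fixed)  # number of frames that were interpolated
```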
- placecell.behavior.correct_perspective(positions: DataFrame, arena_bounds: tuple[float, float, float, float], camera_height_mm: float, tracking_height_mm: float) DataFrame#
Correct perspective distortion from overhead camera parallax.
An LED at height h above the floor appears shifted radially outward from the optical axis. The corrected position is:
x_corrected = cx + (x - cx) * (H - h) / H
y_corrected = cy + (y - cy) * (H - h) / H
where (cx, cy) is the arena center (midpoint of arena_bounds), H is the camera height, and h is the tracking height.
- Parameters:
positions (DataFrame) – DataFrame with columns x, y.
arena_bounds (tuple[float, float, float, float]) – (x_min, x_max, y_min, y_max) in pixels.
camera_height_mm (float) – Camera height above floor in mm.
tracking_height_mm (float) – Tracked point height above floor in mm.
- Return type:
DataFrame with corrected x, y.
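For intuition, the formula applied to a single point; the arena geometry, heights, and position below are made-up numbers, not values from the library:

```python
# Arena bounds (x_min, x_max, y_min, y_max) in pixels; center at (300, 250).
x_min, x_max, y_min, y_max = 0.0, 600.0, 0.0, 500.0
cx, cy = (x_min + x_max) / 2, (y_min + y_max) / 2

H = 2000.0  # camera height above the floor, mm
h = 50.0    # tracked LED height above the floor, mm
x, y = 500.0, 400.0  # raw tracked position, pixels

# The radial offset from the arena center shrinks by (H - h) / H.
scale = (H - h) / H              # 0.975
x_corr = cx + (x - cx) * scale   # 300 + 200 * 0.975 = 495.0
y_corr = cy + (y - cy) * scale   # 250 + 150 * 0.975 = 396.25
```

correct_perspective applies this same per-point correction to every row of the positions DataFrame.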
- placecell.behavior.clip_to_arena(positions: DataFrame, arena_bounds: tuple[float, float, float, float]) DataFrame#
Clip positions to arena boundaries.
Points outside the arena (from detection errors) are clamped to the nearest boundary edge.
- Parameters:
positions (DataFrame) – DataFrame with columns x, y.
arena_bounds (tuple[float, float, float, float]) – (x_min, x_max, y_min, y_max) in pixels.
- Return type:
DataFrame with x, y clipped to arena bounds.
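Usage sketch with a few out-of-arena detections (values are illustrative):

```python
import pandas as pd

from placecell import behavior

positions = pd.DataFrame({"x": [-5.0, 310.0, 650.0], "y": [250.0, -12.0, 510.0]})
arena_bounds = (0.0, 600.0, 0.0, 500.0)  # (x_min, x_max, y_min, y_max) in pixels

# Points outside the arena are clamped to the nearest edge,
# e.g. x = -5.0 -> 0.0 and x = 650.0 -> 600.0.
clipped = behavior.clip_to_arena(positions, arena_bounds)
```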
- placecell.behavior.recompute_speed(trajectory: DataFrame, window_frames: int) DataFrame#
Recompute speed on a trajectory that already has x, y, unix_time.
Use this after spatial corrections (jump removal, perspective, clipping) to update the speed column from the corrected positions.
- Parameters:
trajectory (DataFrame) – DataFrame with columns x, y, unix_time.
window_frames (int) – Number of frames to look ahead for speed calculation.
- Return type:
DataFrame with the speed column overwritten (in position-units / s).
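A sketch of the intended correction pipeline ending in recompute_speed; the thresholds, heights, and window size are illustrative choices, not recommended values:

```python
import pandas as pd

from placecell import behavior

arena_bounds = (0.0, 600.0, 0.0, 500.0)

# Trajectory with positions, timestamps, and a speed column that becomes
# stale once the positions are corrected.
trajectory = pd.DataFrame({
    "x": [100.0, 103.0, 400.0, 109.0, 112.0],
    "y": [200.0, 202.0, 420.0, 206.0, 208.0],
    "unix_time": [0.00, 0.05, 0.10, 0.15, 0.20],
    "speed": [0.0, 0.0, 0.0, 0.0, 0.0],
})

trajectory, _ = behavior.remove_position_jumps(trajectory, threshold_px=50.0)
trajectory = behavior.correct_perspective(trajectory, arena_bounds,
                                          camera_height_mm=2000.0,
                                          tracking_height_mm=50.0)
trajectory = behavior.clip_to_arena(trajectory, arena_bounds)

# Overwrite the stale speed column from the corrected positions,
# looking ahead 2 frames per speed sample.
trajectory = behavior.recompute_speed(trajectory, window_frames=2)
```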
- placecell.behavior.filter_by_speed(trajectory: DataFrame, speed_threshold: float) DataFrame#
Filter trajectory to frames above a speed threshold.
- Parameters:
trajectory (DataFrame) – DataFrame with columns frame_index and speed.
speed_threshold (float) – Minimum speed to keep.
- Returns:
Filtered copy, sorted by frame index, with frame_index renamed to beh_frame_index.
- Return type:
DataFrame
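Usage sketch (the column values and the 50 px/s threshold are illustrative):

```python
import pandas as pd

from placecell import behavior

trajectory = pd.DataFrame({
    "frame_index": [0, 1, 2, 3],
    "speed": [10.0, 80.0, 55.0, 5.0],  # pixels/s
})

# Keep only frames where the animal moves faster than 50 px/s;
# the result renames frame_index to beh_frame_index.
running = behavior.filter_by_speed(trajectory, speed_threshold=50.0)
```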
- placecell.behavior.compute_behavior_speed(positions: DataFrame, timestamps: DataFrame, window_frames: int = 10) DataFrame#
Compute speed from behavior positions and timestamps using a window.
Speed is calculated over a window of frames for stability, especially useful at high frame rates where consecutive frame differences are noisy.
- Parameters:
positions (DataFrame) – DataFrame with columns frame_index, x, y in pixels.
timestamps (DataFrame) – DataFrame with columns frame_index, unix_time in seconds.
window_frames (int) – Number of frames to use for speed calculation. Speed is computed as distance traveled over this window divided by time elapsed.
- Return type:
DataFrame with speed in pixels/s.
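Usage sketch with a synthetic 30 fps recording (frame rate and window size are illustrative):

```python
import numpy as np
import pandas as pd

from placecell import behavior

n = 300
positions = pd.DataFrame({
    "frame_index": np.arange(n),
    "x": np.linspace(0.0, 300.0, n),  # pixels
    "y": np.full(n, 250.0),
})
timestamps = pd.DataFrame({
    "frame_index": np.arange(n),
    "unix_time": np.arange(n) / 30.0,  # seconds, 30 fps
})

# Speed is distance traveled over a 10-frame window divided by the elapsed time,
# which smooths out frame-to-frame tracking noise at high frame rates.
speed_df = behavior.compute_behavior_speed(positions, timestamps, window_frames=10)
```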
- placecell.behavior.build_event_place_dataframe(event_index: DataFrame, neural_timestamp_path: Path, behavior_with_speed: DataFrame, behavior_fps: float, speed_threshold: float = 50.0) DataFrame#
Match neural events to behavior positions for place-cell analysis.
For each event, finds the closest behavior frame in time, filters out matches where the timestamp difference exceeds 0.5 / behavior_fps, and drops events below the speed threshold.
- Parameters:
event_index (DataFrame) – DataFrame with columns: unit_id, frame, s.
neural_timestamp_path (Path) – Path to neural timestamp CSV (columns: frame, timestamp_first, timestamp_last).
behavior_with_speed (DataFrame) – Trajectory DataFrame with columns: frame_index, x, y, unix_time, speed.
behavior_fps (float) – Behavior sampling rate (Hz).
speed_threshold (float) – Minimum running speed to keep events (pixels/s).
- Returns:
DataFrame with columns unit_id, frame, s, neural_time, beh_frame_index, beh_time, x, y, speed.
- Return type:
DataFrame
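End-to-end usage sketch; the CSV path, the event table, and the behavior trajectory below are hypothetical placeholders:

```python
from pathlib import Path

import pandas as pd

from placecell import behavior

# Deconvolved neural events: one row per event with its neural frame and amplitude s.
event_index = pd.DataFrame({
    "unit_id": [0, 0, 1],
    "frame": [12, 340, 97],
    "s": [1.4, 0.8, 2.1],
})

# Corrected, speed-annotated behavior trajectory (e.g. output of the steps above).
behavior_with_speed = pd.DataFrame({
    "frame_index": [100, 101, 102],
    "x": [120.0, 122.0, 124.0],
    "y": [200.0, 201.0, 202.0],
    "unix_time": [3.33, 3.37, 3.40],
    "speed": [60.0, 62.0, 58.0],
})

# Matches each event to the nearest behavior frame in time, drops matches
# further apart than 0.5 / behavior_fps, and drops events below 50 px/s.
events_with_place = behavior.build_event_place_dataframe(
    event_index,
    neural_timestamp_path=Path("session1/neural_timestamps.csv"),  # hypothetical file
    behavior_with_speed=behavior_with_speed,
    behavior_fps=30.0,
    speed_threshold=50.0,
)
```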