# Quickstart
placecell supports two workflows:
- `arena` for 2D open-field analysis
- `maze` for 1D arm/graph-based analysis
Use the notebook or Python API after preparing the right data files for your workflow.
## Required Files
Neural data directory (e.g., `minian/` output):

- `{trace_name}.zarr`: calcium traces (e.g., `C.zarr` or `C_lp.zarr`)
- `A.zarr`: spatial footprints for cell overlay (optional)
- `max_proj.zarr`: max projection image for visualization (optional)
Always required:
- `neural_timestamp.csv`: neural frame timestamps
- `behavior_timestamp.csv`: behavior frame timestamps
Arena (2D) behavior input:
- `behavior_position.csv`: animal position with bodypart columns (DeepLabCut format)
Maze (1D) behavior input:
- `behavior_position.csv`: raw DLC tracking CSV (x, y, likelihood)
- `behavior_graph.yaml`: zone polygons + adjacency graph
- `arm_order` in `data_paths.yaml`
Configuration files:
- `config.yaml`: analysis parameters
- `data_paths.yaml`: paths to your data files
## Setup
### 1. Create data paths config
Create `data_paths.yaml` with paths relative to this file:
**arena `data_paths.yaml`**

```yaml
type: arena
neural_path: path/to/minian_output
neural_timestamp: path/to/neural_timestamp.csv
behavior_position: path/to/behavior_position.csv
behavior_timestamp: path/to/behavior_timestamp.csv
behavior_fps: 20.0
bodypart: LED
```
**maze `data_paths.yaml`**

```yaml
type: maze
neural_path: path/to/minian_output
neural_timestamp: path/to/neural_timestamp.csv
behavior_timestamp: path/to/behavior_timestamp.csv
behavior_position: path/to/behavior_position.csv  # raw DLC output
behavior_graph: path/to/behavior_graph.yaml  # zone polygons + adjacency
# zone_tracking: path/to/zone_tracking.csv  # optional; defaults to zone_tracking_{stem}.csv
behavior_fps: 20.0
bodypart: LED
arm_order:
  - Arm_1
  - Arm_2
  - Arm_3
  - Arm_4
```
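The relative-path convention above can be sketched in a few lines (the helper name is illustrative, not part of the placecell API): each entry is resolved against the directory containing `data_paths.yaml`, not the current working directory.

```python
from pathlib import Path

# Hypothetical helper illustrating the convention: entries in
# data_paths.yaml are resolved relative to the YAML file's own
# directory, not the directory the command is run from.
def resolve_entry(data_paths_file: str, entry: str) -> Path:
    return Path(data_paths_file).parent / entry
```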
placecell is scorer-agnostic for DLC-style CSVs: configure the correct `bodypart`, and the scorer name is read from the file header.
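DLC-style CSVs carry a three-row header (scorer / bodyparts / coords), which is why the scorer name never needs to be configured. A standard-library sketch (the sample data and helper name are made up for illustration) shows where it comes from:

```python
import csv
import io

# A minimal DLC-style CSV: three header rows (scorer / bodyparts /
# coords), then one data row. Scorer and bodypart names are made up.
sample = """scorer,DLC_resnet50_demo,DLC_resnet50_demo,DLC_resnet50_demo
bodyparts,LED,LED,LED
coords,x,y,likelihood
0,12.3,45.6,0.99
"""

def read_dlc_header(text):
    # Returns (scorer, bodyparts) parsed from the first two header rows.
    rows = list(csv.reader(io.StringIO(text)))
    scorer = rows[0][1]                   # scorer name from row 1
    bodyparts = sorted(set(rows[1][1:]))  # unique bodypart names from row 2
    return scorer, bodyparts
```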
### 1b. Maze: zone detection
`placecell analysis` runs zone detection automatically on first use. To run it explicitly (e.g., to inspect the validation video or to iterate on `zone_detection` parameters):
1. Create `behavior_graph.yaml` with `placecell define-zones -d data_paths.yaml --rooms <n> --arms <n>`.
2. Run `placecell detect-zones -d data_paths.yaml`.
3. Use `placecell analysis --force-redetect` to refresh the cached CSV after parameter changes.
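Conceptually, zone detection assigns each tracked position to one of the zone polygons defined in `behavior_graph.yaml`. A minimal point-in-polygon test (ray casting; purely illustrative, not placecell's implementation) looks like:

```python
def point_in_polygon(x, y, polygon):
    # Ray-casting test: a point is inside if a horizontal ray from it
    # crosses the polygon boundary an odd number of times.
    # `polygon` is a list of (x, y) vertices.
    inside = False
    n = len(polygon)
    for i in range(n):
        x1, y1 = polygon[i]
        x2, y2 = polygon[(i + 1) % n]
        if (y1 > y) != (y2 > y):  # edge spans the ray's y level
            x_cross = x1 + (y - y1) * (x2 - x1) / (y2 - y1)
            if x < x_cross:
                inside = not inside
    return inside
```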
### 2. Create analysis config
Create `config.yaml` with analysis parameters:
**arena `config.yaml`**

```yaml
neural:
  fps: 20.0
  trace_name: C
  oasis:
    g: [1.60, -0.63]
    baseline: p10
    penalty: 0
behavior:
  type: arena
  speed_threshold: 10.0
  speed_window_seconds: 0.25
spatial_map_2d:
  bins: 50
  min_occupancy: 0.05
  spatial_sigma: 3
  n_shuffles: 500
  p_value_threshold: 0.05
```
**maze `config.yaml`**

```yaml
neural:
  fps: 20.0
  trace_name: C_lp
  oasis:
    g: [1.60, -0.63]
    baseline: p10
    penalty: 0.8
behavior:
  type: maze
  speed_threshold: 25
  speed_window_seconds: 0.25
spatial_map_1d:
  bin_width_mm: 10
  min_occupancy: 0.025
  spatial_sigma: 2
  n_shuffles: 500
  p_value_threshold: 0.05
  split_by_direction: true
  require_complete_traversal: true
```
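To make two of these parameters concrete, here is a minimal sketch of how `speed_window_seconds` and `min_occupancy` might translate into frames and seconds. The function names, and the assumption that `min_occupancy` is a minimum occupancy time in seconds, are illustrative, not placecell internals:

```python
def speed_window_frames(speed_window_seconds: float, behavior_fps: float) -> int:
    # A 0.25 s smoothing window at 20 fps behavior video is 5 frames.
    return max(1, round(speed_window_seconds * behavior_fps))

def occupied_seconds(frame_counts: dict, behavior_fps: float, min_occupancy: float) -> dict:
    # Convert per-bin frame counts to seconds and drop bins visited for
    # less than min_occupancy seconds (treated as unvisited).
    seconds = {b: c / behavior_fps for b, c in frame_counts.items()}
    return {b: s for b, s in seconds.items() if s >= min_occupancy}
```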
### 3. Run the analysis
```bash
placecell analysis -c config.yaml -d data_paths.yaml
```
Or via Python:
```python
from placecell.dataset import BasePlaceCellDataset

ds = BasePlaceCellDataset.from_yaml("config.yaml", "data_paths.yaml")
ds.load()
ds.preprocess_behavior()
ds.deconvolve()
ds.match_events()
ds.compute_occupancy()
ds.analyze_units()
ds.save_bundle("output/session_name")
```
For batch processing, see `examples/batch_analysis.py`.
## Output
The pipeline saves a `.pcellbundle` directory containing all results and summary figures. Key outputs:
- `canonical.parquet` — per-neural-frame table with position, speed, and deconvolved activity per unit
- `figures/occupancy.pdf` — trajectory density and occupancy with split-half comparison
- `figures/behavior_preview.pdf` — trajectory and speed distribution
- `figures/diagnostics.pdf` — SI and stability distributions
- `figures/summary_scatter.pdf` — SI vs stability with place cell classification
- `figures/speed_traces.pdf` — speed and place cell traces over time
To browse results interactively, open `notebook/view_results_arena.ipynb` or `notebook/view_results_maze.ipynb` in Jupyter Lab.
See Pipeline Details for the full list of summary figures and how the analysis works.