Getting started

Using YASS pre-built pipelines

YASS configuration file

YASS is configured using a YAML file; below is an example of such a configuration:

#####################################################################
# YASS configuration example (only required values)                 #
# for a complete reference see examples/config_sample_complete.yaml #
#####################################################################

data:
  root_folder: data/
  recordings: neuropixel.bin
  geometry: neuropixel_channels.npy

resources:
  max_memory: 200MB

recordings:
  dtype: int16
  sampling_rate: 30000
  n_channels: 10
  spatial_radius: 70
  spike_size_ms: 1
  order: samples

preprocess:
  apply_filter: True
  dtype: float32

detect:
  method: threshold
  temporal_features: 3
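Because the configuration is plain YAML, it can be loaded and sanity-checked programmatically before running the pipeline. A sketch using PyYAML (the `yaml` package is an assumption here, not a YASS dependency; the section names are taken from the example above):

```python
import yaml

# the example configuration from above, as a string
config_text = """
data:
  root_folder: data/
  recordings: neuropixel.bin
  geometry: neuropixel_channels.npy

resources:
  max_memory: 200MB

recordings:
  dtype: int16
  sampling_rate: 30000
  n_channels: 10
  spatial_radius: 70
  spike_size_ms: 1
  order: samples

preprocess:
  apply_filter: True
  dtype: float32

detect:
  method: threshold
  temporal_features: 3
"""

config = yaml.safe_load(config_text)

# check that the sections shown in the example are present
for section in ("data", "resources", "recordings", "preprocess", "detect"):
    assert section in config, f"missing section: {section}"

print(config["recordings"]["sampling_rate"])  # 30000
print(config["detect"]["method"])             # threshold
```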

If you want to use a Neural Network as the detector, you need to provide your own trained model. YASS provides tools for easily training one; see this tutorial for details.

If you do not want to use a Neural Network, you can use the threshold detector instead.

For details regarding the configuration file see YASS configuration file.
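As a sanity check, the values in the recordings section determine how large the raw data file should be, which is useful for verifying that dtype and n_channels match your file. A sketch with the example values (int16 is 2 bytes per sample):

```python
# values from the example configuration's recordings section
n_channels = 10
sampling_rate = 30000  # Hz
itemsize = 2           # bytes per int16 sample

# raw data throughput implied by the configuration
bytes_per_second = sampling_rate * n_channels * itemsize
print(bytes_per_second)       # 600000 bytes per second of recording
print(bytes_per_second * 60)  # 36000000 bytes per minute
```

If the size of your .bin file is not a multiple of this per-second figure times the recording length, one of the recordings parameters is likely wrong.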

Running YASS from the command line

After installing yass, you can sort spikes from the command line:

yass sort path/to/config.yaml

Run the following command for more information:

yass sort --help
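The same command can also be launched from a Python script via the standard library's subprocess module. A sketch (it assumes `yass` is on your PATH and that the config file exists, so the actual call is left commented out):

```python
import subprocess  # needed for the commented-out call below

config_path = "path/to/config.yaml"
cmd = ["yass", "sort", config_path]

print(" ".join(cmd))  # yass sort path/to/config.yaml
# subprocess.run(cmd, check=True)  # uncomment to actually run the sorter
```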

Running YASS in a Python script

import logging

import yass
from yass import preprocess
from yass import detect
from yass import cluster
from yass import templates
from yass import deconvolute

# configure logging module to get useful information
logging.basicConfig(level=logging.INFO)

# set yass configuration parameters
yass.set_config('config_sample.yaml')

# preprocess: filter, standardize and whiten the recordings
(standardized_path, standardized_params, channel_index,
 whiten_filter) = preprocess.run()

# detect spikes in the standardized recordings
(score, spike_index_clear,
 spike_index_all) = detect.run(standardized_path,
                               standardized_params,
                               channel_index,
                               whiten_filter)

# cluster the clear spikes
spike_train_clear, tmp_loc, vbParam = cluster.run(
    score, spike_index_clear)

# compute templates from the clustered spike train
(templates_, spike_train,
 groups, idx_good_templates) = templates.run(
    spike_train_clear, tmp_loc)

# deconvolve to recover overlapping spikes
spike_train = deconvolute.run(spike_index_all, templates_)
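The final output of the pipeline is a spike train that you will typically want to persist for later analysis. A sketch of saving and reloading it with NumPy, using a dummy array in place of the real result (the (spike_time, unit_id) row layout is an assumption for illustration):

```python
import os
import tempfile

import numpy as np

# dummy stand-in for the spike_train returned by deconvolute.run;
# assumed layout: one row per spike, columns (spike_time, unit_id)
spike_train = np.array([[1200, 0],
                       [1534, 2],
                       [1610, 0]])

# save the result so the pipeline does not need to be re-run
out_path = os.path.join(tempfile.gettempdir(), "spike_train.npy")
np.save(out_path, spike_train)

loaded = np.load(out_path)
print(loaded.shape)  # (3, 2)
```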