dandi / example-notebooks

https://www.dandiarchive.org/example-notebooks
Apache License 2.0

Index dandisets with notebooks #93

Closed · bendichter closed this 3 months ago

yarikoptic commented 3 months ago

I have submitted #94 to fix codespell... do you have a personal instance to check it out on, or how do you know it all works as expected? ;)

bendichter commented 3 months ago
<!DOCTYPE html>
<html lang="en">
<head>
    <meta charset="UTF-8">
    <meta name="viewport" content="width=device-width, initial-scale=1.0">
    <title>DANDI Datasets</title>
    <style>
        body {
            font-family: 'Arial', sans-serif;
            line-height: 1.6;
            margin: 0;
            padding: 20px;
        }
        .container {
            max-width: 1200px;
            margin: 0 auto;
        }
        h1 {
            text-align: center;
            margin-bottom: 30px;
        }
        .dandiset {
            border-radius: 8px;
            box-shadow: 0 2px 4px rgba(0, 0, 0, 0.1);
            margin-bottom: 30px;
            padding: 20px;
            transition: box-shadow 0.3s ease;
        }
        .dandiset:hover {
            box-shadow: 0 4px 8px rgba(0, 0, 0, 0.15);
        }
        .dandiset h2 {
            margin-top: 0;
            margin-bottom: 15px;
        }
        .dandiset p {
            margin-bottom: 10px;
        }
        .dandiset a {
            text-decoration: none;
            font-weight: bold;
        }
        .dandiset a:hover {
            text-decoration: underline;
        }
        .notebooks {
            margin-top: 15px;
        }
        .notebooks h3 {
            font-size: 1.1em;
            margin-bottom: 10px;
        }
        .notebooks ul {
            list-style-type: disc;
            padding-left: 20px;
        }
        .notebooks li {
            margin-bottom: 5px;
        }
        @media (max-width: 768px) {
            body {
                padding: 10px;
            }
            .dandiset {
                padding: 15px;
            }
        }
    </style>
</head>
<body>
<div class="container">
    <h1>DANDI Datasets</h1>

    <div class="dandiset">
        <h2>A NWB-based dataset and processing pipeline of human single-neuron activity during a declarative memory task</h2>
        <p><strong>ID:</strong> 000004</p>
        <p><strong>Description:</strong> A challenge for data sharing in systems neuroscience is the multitude of different data formats used. Neurodata Without Borders: Neurophysiology 2.0 (NWB:N) has emerged as a standardized data format for the storage of cellular-level data together with meta-data, stimulus information, and behavior. A key next step to facilitate NWB:N adoption is to provide easy-to-use processing pipelines to import/export data from/to NWB:N. Here, we present an NWB-formatted dataset of 1863 single neurons recorded from the medial temporal lobes of 59 human subjects undergoing intracranial monitoring while they performed a recognition memory task. We provide code to analyze and export/import stimuli, behavior, and electrophysiological recordings to/from NWB in both MATLAB and Python. The data files are NWB:N compliant, which affords interoperability between programming languages and operating systems. This combined data and code release is a case study for how to utilize NWB:N for human single-neuron recordings and enables easy re-use of this hard-to-obtain data for both teaching and research on the mechanisms of human memory.</p>
        <p><a href="https://dandiarchive.org/dandiset/000004" target="_blank">View on DANDI Archive</a></p>

        <div class="notebooks">
            <h3>Notebooks:</h3>
            <ul>

                <li><a href="https://github.com/dandi/example-notebooks/blob/master/000004/RutishauserLab/000004_demo_analysis.ipynb" target="_blank">RutishauserLab/000004_demo_analysis.ipynb</a></li>

            </ul>
        </div>

    </div>

    <div class="dandiset">
        <h2>Electrophysiology data from thalamic and cortical neurons during somatosensation</h2>
        <p><strong>ID:</strong> 000005</p>
        <p><strong>Description:</strong> Intracellular and extracellular electrophysiology recordings performed on mouse barrel cortex and ventral posterolateral nucleus (VPM) in a whisker-based object-locating task.</p>
        <p><a href="https://dandiarchive.org/dandiset/000005" target="_blank">View on DANDI Archive</a></p>

        <div class="notebooks">
            <h3>Notebooks:</h3>
            <ul>

                <li><a href="https://github.com/dandi/example-notebooks/blob/master/000005/DataJoint/DJ-NWB-Yu-Gutnisky-2016/notebooks/Yu-Gutnisky-2016-examples.ipynb" target="_blank">DataJoint/DJ-NWB-Yu-Gutnisky-2016/notebooks/Yu-Gutnisky-2016-examples.ipynb</a></li>

            </ul>
        </div>

    </div>

    <div class="dandiset">
        <h2>Mouse anterior lateral motor cortex (ALM) in delay response task</h2>
        <p><strong>ID:</strong> 000006</p>
        <p><strong>Description:</strong> Extracellular electrophysiology recordings performed on mouse anterior lateral motor cortex (ALM) in a delay response task. Neural activity from two neuron populations, pyramidal tract upper and lower, was characterized in relation to movement execution. Some files, as originally (re)distributed from e.g. http://datasets.datalad.org/?dir=/labs/svoboda/Economo_2018, were found to be broken and are not available among the reorganized files under sub-* directories.</p>
        <p><a href="https://dandiarchive.org/dandiset/000006" target="_blank">View on DANDI Archive</a></p>

        <div class="notebooks">
            <h3>Notebooks:</h3>
            <ul>

                <li><a href="https://github.com/dandi/example-notebooks/blob/master/000006/DataJoint/DJ-NWB-Economo-2018/notebooks/Economo-2018-examples.ipynb" target="_blank">DataJoint/DJ-NWB-Economo-2018/notebooks/Economo-2018-examples.ipynb</a></li>

            </ul>
        </div>

    </div>

    <div class="dandiset">
        <h2>A cortico-cerebellar loop for motor planning</h2>
        <p><strong>ID:</strong> 000007</p>
        <p><strong>Description:</strong> Extracellular recording in ALM</p>
        <p><a href="https://dandiarchive.org/dandiset/000007" target="_blank">View on DANDI Archive</a></p>

        <div class="notebooks">
            <h3>Notebooks:</h3>
            <ul>

                <li><a href="https://github.com/dandi/example-notebooks/blob/master/000007/DataJoint/DJ-NWB-Gao-2018/notebooks/erd.ipynb" target="_blank">DataJoint/DJ-NWB-Gao-2018/notebooks/erd.ipynb</a></li>

                <li><a href="https://github.com/dandi/example-notebooks/blob/master/000007/DataJoint/DJ-NWB-Gao-2018/notebooks/Gao-2018-examples.ipynb" target="_blank">DataJoint/DJ-NWB-Gao-2018/notebooks/Gao-2018-examples.ipynb</a></li>

            </ul>
        </div>

    </div>

    <div class="dandiset">
        <h2>Maintenance of persistent activity in a frontal thalamocortical loop</h2>
        <p><strong>ID:</strong> 000009</p>
        <p><strong>Description:</strong> We recorded spikes from the ALM and thalamus during tactile discrimination with a delayed directional response. Here we show that, similar to ALM neurons, thalamic neurons exhibited selective persistent delay activity that predicted movement direction. Unilateral photoinhibition of delay activity in the ALM or thalamus produced contralesional neglect. Photoinhibition of the thalamus caused a short-latency and near-complete collapse of ALM activity. Similarly, photoinhibition of the ALM diminished thalamic activity. Our results show that the thalamus is a circuit hub in motor preparation and suggest that persistent activity requires reciprocal excitation across multiple brain areas.</p>
        <p><a href="https://dandiarchive.org/dandiset/000009" target="_blank">View on DANDI Archive</a></p>

        <div class="notebooks">
            <h3>Notebooks:</h3>
            <ul>

                <li><a href="https://github.com/dandi/example-notebooks/blob/master/000009/DataJoint/DJ-NWB-Guo-Inagaki-2017/notebooks/Guo-Inagaki-2017-examples.ipynb" target="_blank">DataJoint/DJ-NWB-Guo-Inagaki-2017/notebooks/Guo-Inagaki-2017-examples.ipynb</a></li>

                <li><a href="https://github.com/dandi/example-notebooks/blob/master/000009/DataJoint/DJ-NWB-Guo-Inagaki-2017/notebooks/Guo-Inagaki-2017-NWB-examples.ipynb" target="_blank">DataJoint/DJ-NWB-Guo-Inagaki-2017/notebooks/Guo-Inagaki-2017-NWB-examples.ipynb</a></li>

            </ul>
        </div>

    </div>

    <div class="dandiset">
        <h2>A motor cortex circuit for motor planning and movement</h2>
        <p><strong>ID:</strong> 000010</p>
        <p><strong>Description:</strong> Data from "A motor cortex circuit for motor planning and movement" Li et al. Nature 2015</p>
        <p><a href="https://dandiarchive.org/dandiset/000010" target="_blank">View on DANDI Archive</a></p>

        <div class="notebooks">
            <h3>Notebooks:</h3>
            <ul>

                <li><a href="https://github.com/dandi/example-notebooks/blob/master/000010/DataJoint/DJ-NWB-Li-2015b/notebooks/Li-2015b-examples.ipynb" target="_blank">DataJoint/DJ-NWB-Li-2015b/notebooks/Li-2015b-examples.ipynb</a></li>

                <li><a href="https://github.com/dandi/example-notebooks/blob/master/000010/DataJoint/DJ-NWB-Li-2015b/notebooks/Schemas.ipynb" target="_blank">DataJoint/DJ-NWB-Li-2015b/notebooks/Schemas.ipynb</a></li>

                <li><a href="https://github.com/dandi/example-notebooks/blob/master/000010/DataJoint/DJ-NWB-Li-Daie-2015-2016/notebooks/Li-Daie-2016-examples.ipynb" target="_blank">DataJoint/DJ-NWB-Li-Daie-2015-2016/notebooks/Li-Daie-2016-examples.ipynb</a></li>

                <li><a href="https://github.com/dandi/example-notebooks/blob/master/000010/DataJoint/DJ-NWB-Li-Daie-2015-2016/notebooks/Li-2015-examples.ipynb" target="_blank">DataJoint/DJ-NWB-Li-Daie-2015-2016/notebooks/Li-2015-examples.ipynb</a></li>

            </ul>
        </div>

    </div>

    <div class="dandiset">
        <h2>Low-noise encoding of active touch by layer 4 in the somatosensory cortex</h2>
        <p><strong>ID:</strong> 000013</p>
        <p><strong>Description:</strong> Data from "Low-noise encoding of active touch by layer 4 in the somatosensory cortex" Hires, Gutnisky et al. Elife 2015</p>
        <p><a href="https://dandiarchive.org/dandiset/000013" target="_blank">View on DANDI Archive</a></p>

        <div class="notebooks">
            <h3>Notebooks:</h3>
            <ul>

                <li><a href="https://github.com/dandi/example-notebooks/blob/master/000013/DataJoint/DJ-NWB-Hires-Gutnisky-2015/notebooks/Hires-Gutnisky-2015-examples.ipynb" target="_blank">DataJoint/DJ-NWB-Hires-Gutnisky-2015/notebooks/Hires-Gutnisky-2015-examples.ipynb</a></li>

            </ul>
        </div>

    </div>

    <div class="dandiset">
        <h2>A Map of Anticipatory Activity in Mouse Motor Cortex</h2>
        <p><strong>ID:</strong> 000015</p>
        <p><strong>Description:</strong> Data from "A Map of Anticipatory Activity in Mouse Motor Cortex" Chen et al. Neuron 2017</p>
        <p><a href="https://dandiarchive.org/dandiset/000015" target="_blank">View on DANDI Archive</a></p>

        <div class="notebooks">
            <h3>Notebooks:</h3>
            <ul>

                <li><a href="https://github.com/dandi/example-notebooks/blob/master/000015/DataJoint/DJ-NWB-Chen-2017/notebooks/Schemas.ipynb" target="_blank">DataJoint/DJ-NWB-Chen-2017/notebooks/Schemas.ipynb</a></li>

                <li><a href="https://github.com/dandi/example-notebooks/blob/master/000015/DataJoint/DJ-NWB-Chen-2017/notebooks/Chen-2017-examples.ipynb" target="_blank">DataJoint/DJ-NWB-Chen-2017/notebooks/Chen-2017-examples.ipynb</a></li>

            </ul>
        </div>

    </div>

    <div class="dandiset">
        <h2>Allen Institute – Contrast tuning in mouse visual cortex with calcium imaging</h2>
        <p><strong>ID:</strong> 000039</p>
        <p><strong>Description:</strong> A two-photon calcium imaging dataset from the Allen Institute measuring responses to full-field drifting gratings (approx. 120x90 degrees of visual space) of 8 directions and 6 contrasts (5%, 10%, 20%, 40%, 60%, 80%). Mouse Cre lines expressing GCaMP6f were imaged to record responses of pyramidal neurons across cortical layers (Cux2: layer 2/3; Rorb: layer 4; Rbp4: layer 5; Ntsr1: layer 6) as well as inhibitory interneurons (Vip and Sst). All experimental sessions took place on the same data collection pipeline as the Allen Brain Observatory (see http://observatory.brain-map.org/visualcoding) and have the same visual stimulus monitor calibration and positioning, two-photon imaging systems and image processing pipeline, and running wheel to track locomotion.

 Data are subject to Allen Institute Terms of Use policy, available at: http://www.alleninstitute.org/legal/terms-use/</p>
        <p><a href="https://dandiarchive.org/dandiset/000039" target="_blank">View on DANDI Archive</a></p>

        <div class="notebooks">
            <h3>Notebooks:</h3>
            <ul>

                <li><a href="https://github.com/dandi/example-notebooks/blob/master/000039/AllenInstitute/Create_manifest.ipynb" target="_blank">AllenInstitute/Create_manifest.ipynb</a></li>

                <li><a href="https://github.com/dandi/example-notebooks/blob/master/000039/AllenInstitute/Contrast_analysis.ipynb" target="_blank">AllenInstitute/Contrast_analysis.ipynb</a></li>

            </ul>
        </div>

    </div>

    <div class="dandiset">
        <h2>AJILE12: Long-term naturalistic human intracranial neural recordings and pose</h2>
        <p><strong>ID:</strong> 000055</p>
        <p><strong>Description:</strong> Understanding the neural basis of human movement in naturalistic scenarios is critical for expanding neuroscience research beyond constrained laboratory paradigms. The neural correlates of unstructured, spontaneous movements in completely naturalistic settings have rarely been studied, due in large part to a lack of available data. Here, we present our Annotated Joints in Long-term Electrocorticography for 12 human participants (AJILE12) dataset, the largest human neurobehavioral dataset that is publicly available; the dataset was recorded opportunistically during passive clinical epilepsy monitoring. AJILE12 includes synchronized intracranial neural recordings and upper body pose trajectories across 55 semi-continuous days of naturalistic movements, along with relevant metadata, including thousands of wrist movement events and annotated behavioral states. Neural recordings are available at 500 Hz from at least 64 electrodes per participant, for a total of 1280 hours. Pose trajectories at 9 upper-body keypoints, including wrist, elbow, and shoulder joints, were sampled at 30 frames per second and estimated from 118 million video frames. In adherence with the FAIR data principles, we have shared AJILE12 on The Dandi Archive in the Neurodata Without Borders (NWB) data standard and developed a browser-based dashboard to facilitate data exploration and reuse.</p>
        <p><a href="https://dandiarchive.org/dandiset/000055" target="_blank">View on DANDI Archive</a></p>

        <div class="notebooks">
            <h3>Notebooks:</h3>
            <ul>

                <li><a href="https://github.com/dandi/example-notebooks/blob/master/000055/BruntonLab/peterson21/Table_part_characteristics.ipynb" target="_blank">BruntonLab/peterson21/Table_part_characteristics.ipynb</a></li>

                <li><a href="https://github.com/dandi/example-notebooks/blob/master/000055/BruntonLab/peterson21/Table_coarse_labels.ipynb" target="_blank">BruntonLab/peterson21/Table_coarse_labels.ipynb</a></li>

                <li><a href="https://github.com/dandi/example-notebooks/blob/master/000055/BruntonLab/peterson21/Fig_coarse_labels.ipynb" target="_blank">BruntonLab/peterson21/Fig_coarse_labels.ipynb</a></li>

                <li><a href="https://github.com/dandi/example-notebooks/blob/master/000055/BruntonLab/peterson21/dashboard.ipynb" target="_blank">BruntonLab/peterson21/dashboard.ipynb</a></li>

                <li><a href="https://github.com/dandi/example-notebooks/blob/master/000055/BruntonLab/peterson21/Fig_pow_spectra.ipynb" target="_blank">BruntonLab/peterson21/Fig_pow_spectra.ipynb</a></li>

                <li><a href="https://github.com/dandi/example-notebooks/blob/master/000055/.ipynb_checkpoints/fig2-plot-coarse-labels-checkpoint.ipynb" target="_blank">.ipynb_checkpoints/fig2-plot-coarse-labels-checkpoint.ipynb</a></li>

            </ul>
        </div>

    </div>

    <div class="dandiset">
        <h2>Light sheet imaging of the human brain</h2>
        <p><strong>ID:</strong> 000108</p>
        <p><strong>Description:</strong> This dataset contains images of 1 mm sections of human brain tissue showing nuclei, NeuN+ cells and blood vessels. Each tissue section was first SHIELD-processed for protein protection and delipidated to clear the tissue. The tissues were stained with YOYO1 (nuclei), anti-NeuN antibody (with Rhodamine Red-X secondary antibody) and Lectin (blood vessel) for 8 days in total and then optically cleared using ExPROTOS (a refractive matching solution). The sample was imaged using light sheet microscopy at a resolution of ~2.5 um x 3.6 um x 2.5 um. Each slab was imaged using multiple stacks. Offset transforms are included with the dataset to enable reconstruction of each slab.</p>
        <p><a href="https://dandiarchive.org/dandiset/000108" target="_blank">View on DANDI Archive</a></p>

        <div class="notebooks">
            <h3>Notebooks:</h3>
            <ul>

                <li><a href="https://github.com/dandi/example-notebooks/blob/master/000108/chunglab/demo/2021-09-27_dandi-demo.ipynb" target="_blank">chunglab/demo/2021-09-27_dandi-demo.ipynb</a></li>

                <li><a href="https://github.com/dandi/example-notebooks/blob/master/000108/chunglab/demo/dashboard.ipynb" target="_blank">chunglab/demo/dashboard.ipynb</a></li>

                <li><a href="https://github.com/dandi/example-notebooks/blob/master/000108/chunglab/demo/validate_lev6.ipynb" target="_blank">chunglab/demo/validate_lev6.ipynb</a></li>

            </ul>
        </div>

    </div>

    <div class="dandiset">
        <h2>MC_RTT: macaque motor cortex spiking activity during self-paced reaching</h2>
        <p><strong>ID:</strong> 000129</p>
        <p><strong>Description:</strong> This dataset contains sorted unit spiking times and behavioral data from a macaque performing a self-paced reaching task. In the experimental task, the subject reached between targets randomly selected from an 8x8 grid without gaps or pre-movement delay intervals. Neural activity was recorded from an electrode array implanted in the primary motor cortex. Finger position, cursor position, and target position were also recorded during the experiment. Provided as part of the Neural Latents Benchmark: https://neurallatents.github.io.</p>
        <p><a href="https://dandiarchive.org/dandiset/000129" target="_blank">View on DANDI Archive</a></p>

    </div>

    <div class="dandiset">
        <h2>Mesoscale Activity Map Dataset</h2>
        <p><strong>ID:</strong> 000363</p>
        <p><strong>Description:</strong> Mesoscale Activity Map Project. Map behavior-related activity in a multi-regional network supporting memory-guided movement in mice. Anatomy-guided recordings from multiple connected brain regions, from anterior lateral motor cortex to the medulla.

Supported by Simons Collaboration on the Global Brain, Janelia Visitor Project, NIH U19NS123714-01, R01NS112312, R01EB028171, McKnight foundation</p>
        <p><a href="https://dandiarchive.org/dandiset/000363" target="_blank">View on DANDI Archive</a></p>

        <div class="notebooks">
            <h3>Notebooks:</h3>
            <ul>

                <li><a href="https://github.com/dandi/example-notebooks/blob/master/000363/MAP/demo/browse_map_ephys_data.ipynb" target="_blank">MAP/demo/browse_map_ephys_data.ipynb</a></li>

            </ul>
        </div>

    </div>

    <div class="dandiset">
        <h2>MICrONS Two Photon Functional Imaging</h2>
        <p><strong>ID:</strong> 000402</p>
        <p><strong>Description:</strong> The light microscopic images were acquired from a cubic millimeter volume that spanned portions of primary visual cortex and three higher visual cortical areas. The volume was imaged in vivo by two-photon random access mesoscope (2P-RAM) from postnatal days P75 to P81 in a male mouse expressing a genetically encoded calcium indicator in excitatory cells, while the mouse viewed natural movies and parametric stimuli. The calcium imaging data includes the single-cell responses of an estimated 75,000 pyramidal cells imaged over a volume of approximately 1200 x 1100 x 500 μm3 (anteroposterior x mediolateral x radial depth). The center of the volume was placed at the junction of primary visual cortex (VISp) and three higher visual areas, lateromedial area (VISlm), rostrolateral area (VISrl) and anterolateral area (VISal). During imaging, the animal was head-restrained, and the stimulus was presented to the left visual field. Treadmill rotation (single axis) and video of the animal's left eye were captured throughout the scan, yielding the locomotion velocity, eye movements, and pupil diameter data included here.

The functional data were co-registered with electron microscopy (EM) data. The structural identifiers of the matched cells are added as plane segmentation columns extracted from the CAVE database. To access the latest revision see the notebook that is linked to this dandiset. The structural ids might not be present for all plane segmentations.</p>
        <p><a href="https://dandiarchive.org/dandiset/000402" target="_blank">View on DANDI Archive</a></p>

        <div class="notebooks">
            <h3>Notebooks:</h3>
            <ul>

                <li><a href="https://github.com/dandi/example-notebooks/blob/master/000402/MICrONS/demo/000402_microns_demo.ipynb" target="_blank">MICrONS/demo/000402_microns_demo.ipynb</a></li>

                <li><a href="https://github.com/dandi/example-notebooks/blob/master/000402/MICrONS/demo/.ipynb_checkpoints/000402_microns_demo-checkpoint.ipynb" target="_blank">MICrONS/demo/.ipynb_checkpoints/000402_microns_demo-checkpoint.ipynb</a></li>

            </ul>
        </div>

    </div>

    <div class="dandiset">
        <h2>IBL - Brain Wide Map</h2>
        <p><strong>ID:</strong> 000409</p>
        <p><strong>Description:</strong> The International Brain lab (IBL) aims to understand the neural basis of decision-making in the mouse by gathering a whole-brain activity map composed of electrophysiological recordings pooled from multiple laboratories. We have systematically recorded from nearly all major brain areas with Neuropixels probes, using a grid system for unbiased sampling and replicating each recording site in at least two laboratories. These data have been used to construct a brain-wide map of activity at single-spike cellular resolution during a decision-making task. In addition to the map, this data set contains other information gathered during the task: sensory stimuli presented to the mouse; mouse decisions and response times; and mouse pose information from video recordings and DeepLabCut analysis.</p>
        <p><a href="https://dandiarchive.org/dandiset/000409" target="_blank">View on DANDI Archive</a></p>

        <div class="notebooks">
            <h3>Notebooks:</h3>
            <ul>

                <li><a href="https://github.com/dandi/example-notebooks/blob/master/000409/IBL/02_behaviour_psychometric_curve.ipynb" target="_blank">IBL/02_behaviour_psychometric_curve.ipynb</a></li>

                <li><a href="https://github.com/dandi/example-notebooks/blob/master/000409/IBL/03_analysis_Imbizo_2023.ipynb" target="_blank">IBL/03_analysis_Imbizo_2023.ipynb</a></li>

                <li><a href="https://github.com/dandi/example-notebooks/blob/master/000409/IBL/01_list_datasets.ipynb" target="_blank">IBL/01_list_datasets.ipynb</a></li>

            </ul>
        </div>

    </div>

    <div class="dandiset">
        <h2>Spontaneous behaviour is structured by reinforcement without explicit reward</h2>
        <p><strong>ID:</strong> 000559</p>
        <p><strong>Description:</strong> Spontaneous animal behaviour is built from action modules that are concatenated by the brain into sequences. However, the neural mechanisms that guide the composition of naturalistic, self-motivated behaviour remain unknown. Here we show that dopamine systematically fluctuates in the dorsolateral striatum (DLS) as mice spontaneously express sub-second behavioural modules, despite the absence of task structure, sensory cues or exogenous reward. Photometric recordings and calibrated closed-loop optogenetic manipulations during open field behaviour demonstrate that DLS dopamine fluctuations increase sequence variation over seconds, reinforce the use of associated behavioural modules over minutes, and modulate the vigour with which modules are expressed, without directly influencing movement initiation or moment-to-moment kinematics. Although the reinforcing effects of optogenetic DLS dopamine manipulations vary across behavioural modules and individual mice, these differences are well predicted by observed variation in the relationships between endogenous dopamine and module use. Consistent with the possibility that DLS dopamine fluctuations act as a teaching signal, mice build sequences during exploration as if to maximize dopamine. Together, these findings suggest a model in which the same circuits and computations that govern action choices in structured tasks have a key role in sculpting the content of unconstrained, high-dimensional, spontaneous behavior.</p>
        <p><a href="https://dandiarchive.org/dandiset/000559" target="_blank">View on DANDI Archive</a></p>

        <div class="notebooks">
            <h3>Notebooks:</h3>
            <ul>

                <li><a href="https://github.com/dandi/example-notebooks/blob/master/000559/dattalab/markowitz_gillis_nature_2023/reproduce_figure_S3.ipynb" target="_blank">dattalab/markowitz_gillis_nature_2023/reproduce_figure_S3.ipynb</a></li>

                <li><a href="https://github.com/dandi/example-notebooks/blob/master/000559/dattalab/markowitz_gillis_nature_2023/reproduce_figure_S1.ipynb" target="_blank">dattalab/markowitz_gillis_nature_2023/reproduce_figure_S1.ipynb</a></li>

                <li><a href="https://github.com/dandi/example-notebooks/blob/master/000559/dattalab/markowitz_gillis_nature_2023/reproduce_figure1d.ipynb" target="_blank">dattalab/markowitz_gillis_nature_2023/reproduce_figure1d.ipynb</a></li>

            </ul>
        </div>

    </div>

    <div class="dandiset">
        <h2>Conjunctive Representation of Position, Direction, and Velocity in Entorhinal Cortex</h2>
        <p><strong>ID:</strong> 000582</p>
        <p><strong>Description:</strong> The dataset includes spike times for recorded grid cells from the medial entorhinal cortex (MEC) in rats that explored two-dimensional environments. The behavioral data includes position from the tracking LEDs. 

This sample was published in Sargolini et al. (Science, 2006).

Grid cells in the medial entorhinal cortex (MEC) are part of an environment-independent spatial coordinate system. To determine how information about location, direction, and distance is integrated in the grid-cell network, we recorded from each principal cell layer of MEC in rats that explored two-dimensional environments. Whereas layer II was predominated by grid cells, grid cells colocalized with head-direction cells and conjunctive grid x head-direction cells in the deeper layers. All cell types were modulated by running speed. The conjunction of positional, directional, and translational information in a single MEC cell type may enable grid coordinates to be updated during self-motion-based navigation.
</p>
        <p><a href="https://dandiarchive.org/dandiset/000582" target="_blank">View on DANDI Archive</a></p>

        <div class="notebooks">
            <h3>Notebooks:</h3>
            <ul>

                <li><a href="https://github.com/dandi/example-notebooks/blob/master/000582/Sargolini2006/000582_Sargolini2006_demo.ipynb" target="_blank">Sargolini2006/000582_Sargolini2006_demo.ipynb</a></li>

            </ul>
        </div>

    </div>

    <div class="dandiset">
        <h2>Mapping the Neural Dynamics of Locomotion across the Drosophila Brain</h2>
        <p><strong>ID:</strong> 000727</p>
        <p><strong>Description:</strong> Walking is a fundamental mode of locomotion, yet its neural correlates are unknown at brain-wide scale in any animal. We use volumetric two-photon imaging to map neural activity associated with walking across the entire brain of Drosophila. We detect locomotor signals in approximately 40% of the brain, identify a global signal associated with the transition from rest to walking, and define clustered neural signals selectively associated with changes in forward or angular velocity. These networks span functionally diverse brain regions, and include regions that have not been previously linked to locomotion. We also identify time-varying trajectories of neural activity that anticipate future movements, and that represent sequential engagement of clusters of neurons with different behavioral selectivity. These motor maps suggest a dynamical systems framework for constructing walking maneuvers reminiscent of models of forelimb reaching in primates and set a foundation for understanding how local circuits interact across large-scale networks.</p>
        <p><a href="https://dandiarchive.org/dandiset/000727" target="_blank">View on DANDI Archive</a></p>

        <div class="notebooks">
            <h3>Notebooks:</h3>
            <ul>

                <li><a href="https://github.com/dandi/example-notebooks/blob/master/000727/clandinin/simple_data_access/reading_data.ipynb" target="_blank">clandinin/simple_data_access/reading_data.ipynb</a></li>

            </ul>
        </div>

    </div>

    <div class="dandiset">
        <h2>Allen Institute - Visual Coding - Optical Physiology</h2>
        <p><strong>ID:</strong> 000728</p>
        <p><strong>Description:</strong> From the Allen Institute Brain Observatory, the Visual Coding (optical physiology) dataset is a large-scale, standardized survey of physiological activity across the mouse visual cortex, hippocampus, and thalamus. This two-photon imaging dataset features visually evoked calcium responses from GCaMP6-expressing neurons in a range of cortical layers, visual areas, and Cre lines. We hope that experimentalists and modelers will use these comprehensive, open datasets as a testbed for theories of visual information processing.

Full documentation of the Visual Coding Ophys dataset and tutorials can be found here: https://observatory.brain-map.org/visualcoding/

The Visual Coding Ecephys (NeuroPixels) dataset can be found here: https://dandiarchive.org/dandiset/000021

</p>
        <p><a href="https://dandiarchive.org/dandiset/000728" target="_blank">View on DANDI Archive</a></p>

        <div class="notebooks">
            <h3>Notebooks:</h3>
            <ul>

                <li><a href="https://github.com/dandi/example-notebooks/blob/master/000728/AllenInstitute/visual_coding_ophys_tutorial.ipynb" target="_blank">AllenInstitute/visual_coding_ophys_tutorial.ipynb</a></li>

            </ul>
        </div>

    </div>

    <div class="dandiset">
        <h2>Reach-related Single Unit Activity in the Parkinsonian Macaque</h2>
        <p><strong>ID:</strong> 000947</p>
        <p><strong>Description:</strong> This dataset contains recordings of single-unit activity from multiple brain areas, including globus pallidus-internus (GPi), ventrolateral nucleus of the thalamus (VLa and VLp) and the arm-related regions of primary motor cortex, including sulcus (M1-S) and gyrus (M1-G) subregions, in monkeys performing a choice reaction time reaching task. Small numbers of recordings were also obtained from supplementary motor area (SMA), external globus pallidus (GPe), the thalamic reticular nucleus (RTN), striatum (STR) and the region between RTN and VL thalamus (R-V). It contains data from two monkeys before and after the administration of MPTP (1-methyl-4-phenyl-1,2,3,6-tetrahydropyridine), which induces Parkinsonism. The neuronal activity was recorded using 16-contact linear probes (0.5–1.0 MΩ, V-probe, Plexon) or glass-insulated tungsten microelectrodes (0.5–1.5 MΩ, Alpha Omega). The neuronal data were amplified (4×, 2 Hz–7.5 kHz) and digitized at approximately 24.414 kHz (16-bit resolution; Tucker Davis Technologies). The neuronal data were high-pass filtered (Fpass: 200 Hz, Matlab FIRPM) and thresholded, and candidate action potentials were sorted into clusters in principal components space (Off-line Sorter, Plexon or custom algorithm, TomSort, https://zenodo.org/doi/10.5281/zenodo.11176978).

The recordings for subject Isis before the administration of MPTP (pre-MPTP) were collected by Daisuke Kase.
The recordings for subject Isis after the administration of MPTP (post-MPTP) were collected by Yan Han (韩妍).
The recordings for subject Gaia (pre-MPTP, post-MPTP) from 2015 were collected by Andrew J. Zimnik.
The recordings for subject Gaia (post-MPTP) from 2016-2017 were also collected by Daisuke Kase.</p>
        <p><a href="https://dandiarchive.org/dandiset/000947" target="_blank">View on DANDI Archive</a></p>

        <div class="notebooks">
            <h3>Notebooks:</h3>
            <ul>

                <li><a href="https://github.com/dandi/example-notebooks/blob/master/000947/TurnerLab/public_demo/000947_demo.ipynb" target="_blank">TurnerLab/public_demo/000947_demo.ipynb</a></li>

            </ul>
        </div>

    </div>

</div>
</body>
</html>
bendichter commented 3 months ago

Or you can just run `python .github/scripts/metadata-collector-script.py` from the project directory locally.
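The collector script itself isn't shown in this thread, but for anyone checking the output locally, the per-dandiset cards in the generated page above could be rendered from collected metadata along these lines. This is only a sketch: the `dandisets` data structure and the function names are assumptions for illustration, not the actual script.

```python
import html

# Hypothetical shape of the metadata the collector might gather per dandiset
# (IDs and paths here mirror the first card in the generated page above).
dandisets = [
    {
        "id": "000004",
        "name": "A NWB-based dataset and processing pipeline of human "
                "single-neuron activity during a declarative memory task",
        "description": "Human single-neuron recordings during a recognition memory task.",
        "notebooks": ["RutishauserLab/000004_demo_analysis.ipynb"],
    },
]

CARD = """\
<div class="dandiset">
    <h2>{name}</h2>
    <p><strong>ID:</strong> {id}</p>
    <p><strong>Description:</strong> {description}</p>
    <p><a href="https://dandiarchive.org/dandiset/{id}" target="_blank">View on DANDI Archive</a></p>
{notebooks}
</div>"""


def render_notebooks(dandiset_id, paths):
    """Render the 'Notebooks' list, linking each path into the GitHub repo."""
    if not paths:
        return ""
    items = "\n".join(
        f'        <li><a href="https://github.com/dandi/example-notebooks/'
        f'blob/master/{dandiset_id}/{html.escape(p)}" target="_blank">'
        f"{html.escape(p)}</a></li>"
        for p in paths
    )
    return (
        '    <div class="notebooks">\n'
        "        <h3>Notebooks:</h3>\n"
        f"        <ul>\n{items}\n        </ul>\n"
        "    </div>"
    )


def render_card(d):
    """Render one dandiset card, escaping metadata before interpolation."""
    return CARD.format(
        id=html.escape(d["id"]),
        name=html.escape(d["name"]),
        description=html.escape(d["description"]),
        notebooks=render_notebooks(d["id"], d["notebooks"]),
    )
```

Running something like this over all indexed dandisets and concatenating the cards inside the page shell would reproduce the structure of the HTML posted above, which is also why regenerating the page locally is a straightforward way to confirm a change has no effect on the output.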

yarikoptic commented 3 months ago

I saw an HTML file, and now I have even practiced my scrolling skills here ;) my point is whether we know that all the actions etc. would work out as expected... anyway, it's easy to check and there should be no effect, so let's merge and see.

bendichter commented 3 months ago

@yarikoptic https://www.dandiarchive.org/example-notebooks/