

emodel-generalisation

Generalisation of neuronal electrical models on a morphological population with Markov chain Monte Carlo (MCMC).
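
As a purely illustrative sketch (not the emodel-generalisation API), the core idea can be summarised as a Metropolis-Hastings walk over model parameters that keeps configurations whose simulated electrical features stay close to experimental targets; the cost function, parameter bounds and proposal width below are hypothetical placeholders.

import numpy as np

def metropolis_hastings(cost, bounds, n_steps=1000, proposal_width=0.02, temperature=1.0, seed=0):
    """Minimal Metropolis-Hastings sampler over bounded parameters.

    Illustrative only: NOT the emodel-generalisation API. `cost` maps a
    parameter vector to a scalar distance from experimental targets.
    """
    rng = np.random.default_rng(seed)
    lower, upper = np.asarray(bounds, dtype=float).T
    current = rng.uniform(lower, upper)  # random start inside the bounds
    current_cost = cost(current)
    samples = []
    for _ in range(n_steps):
        # propose a small move, scaled by the width of each parameter range
        step = proposal_width * (upper - lower) * rng.normal(size=current.shape)
        proposal = np.clip(current + step, lower, upper)
        proposal_cost = cost(proposal)
        # always accept improvements; accept worse moves with Boltzmann probability
        log_accept = min(0.0, (current_cost - proposal_cost) / temperature)
        if rng.random() < np.exp(log_accept):
            current, current_cost = proposal, proposal_cost
        samples.append(current.copy())
    return np.array(samples)

# hypothetical usage: two conductance-like parameters and a toy quadratic cost
samples = metropolis_hastings(
    cost=lambda p: float(np.sum((p - np.array([0.3, 0.7])) ** 2)),
    bounds=[(0.0, 1.0), (0.0, 1.0)],
)
print(samples.mean(axis=0))  # concentrates around the low-cost region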

This code accompanies the pre-print:

Arnaudon, A., Reva, M., Zbili, M., Markram, H., Van Geit, W., & Kanari, L. (2023). Controlling morpho-electrophysiological variability of neurons with detailed biophysical models. bioRxiv, 2023-04.

Installation

This code can be installed with pip:

git clone git@github.com:BlueBrain/emodel-generalisation.git
cd emodel-generalisation
pip install .

Examples

We provide several examples of the main functionalities of the emodel-generalisation code.

Citation

If you use the emodel-generalisation code or method in your research, please cite:

Arnaudon, A., Reva, M., Zbili, M., Markram, H., Van Geit, W., & Kanari, L. (2023). Controlling morpho-electrophysiological variability of neurons with detailed biophysical models. bioRxiv, 2023-04.

To get this citation in another format, please use the "Cite this repository" button in the sidebar of the code's GitHub page.

Funding & Acknowledgment

The development of this code was supported by funding to the Blue Brain Project, a research center of the École polytechnique fédérale de Lausanne (EPFL), from the Swiss government’s ETH Board of the Swiss Federal Institutes of Technology.

For license and authors, see LICENSE.txt and AUTHORS.md respectively.

Copyright 2022-2023 Blue Brain Project/EPFL
