brainhack-boston / brainhack-boston.github.io

Brainhack Boston
https://brainhack-boston.github.io
Apache License 2.0

[PROJECT] Julia implementation of neural network estimators #52

Open Tinggong opened 4 months ago

Tinggong commented 4 months ago

Introduction Microstructure.jl is a Julia toolbox (development version) aiming at fast and probabilistic microstructure imaging. It features flexible biophysical modelling of MRI data. For estimating microstructure parameters from these models, it includes generic estimators such as Markov Chain Monte Carlo (MCMC) sampling and Monte Carlo dropout with neural networks.
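To give a feel for the Monte Carlo dropout idea mentioned above, here is a minimal Flux.jl sketch (the layer sizes and variable names are illustrative, not taken from Microstructure.jl): keeping dropout stochastic at inference time and repeating the forward pass yields both a mean parameter estimate and a per-parameter uncertainty.

```julia
using Flux, Statistics

# Hypothetical sizes: 60 MRI measurements in, 4 tissue parameters out
nmeas, nparams = 60, 4

mlp = Chain(
    Dense(nmeas => 64, relu),
    Dropout(0.1),
    Dense(64 => 64, relu),
    Dropout(0.1),
    Dense(64 => nparams),
)

# Monte Carlo dropout: keep dropout active at inference time
Flux.trainmode!(mlp)

x = randn(Float32, nmeas)               # one measurement vector
S = reduce(hcat, [mlp(x) for _ in 1:100])  # 100 stochastic forward passes
μ = mean(S, dims = 2)                   # parameter estimate
σ = std(S, dims = 2)                    # per-parameter uncertainty
```

Because `Flux.trainmode!` keeps the `Dropout` layers on, each forward pass samples a different sub-network, which is what makes the spread `σ` a usable uncertainty proxy.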

Goal Use Flux.jl and Microstructure.jl to implement different types of neural networks. The current neural network estimator in Microstructure.jl uses a multi-layer perceptron for supervised training, with training samples generated from the forward models in Microstructure.jl, e.g. MRI measurements as inputs and microstructure parameters as outputs. As an example of another type of method, we can try implementing a self-supervised approach that uses the forward models in Microstructure.jl as a decoder.
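The self-supervised setup above can be sketched as follows. Note that `toy_forward` is a stand-in decoder invented for this sketch; in the real project it would be replaced by a biophysical forward model from Microstructure.jl. The encoder maps signals to parameters, the fixed forward model maps parameters back to signals, and the loss compares the reconstruction to the input, so no ground-truth parameters are needed.

```julia
using Flux

nmeas, nparams = 10, 2
bvals = collect(Float32, 0.1:0.1:1.0)   # hypothetical acquisition settings

# Stand-in for a Microstructure.jl forward model: maps tissue parameters
# (decay rate, amplitude) to a synthetic signal. Purely illustrative.
toy_forward(p) = p[2:2, :] .* exp.(-bvals * p[1:1, :])

# Encoder: measurements -> parameters scaled to (0, 1)
encoder = Chain(Dense(nmeas => 32, relu), Dense(32 => nparams, sigmoid))

# Self-supervised loss: encode signals into parameters, decode them back
# through the forward model, and compare to the input signals.
loss(enc, x) = Flux.mse(toy_forward(enc(x)), x)

x = abs.(randn(Float32, nmeas, 16))     # a batch of 16 synthetic signals
opt = Flux.setup(Adam(1e-3), encoder)
for _ in 1:5                            # a few illustrative training steps
    g = Flux.gradient(m -> loss(m, x), encoder)
    Flux.update!(opt, encoder, g[1])
end
```

The design point is that only the encoder has trainable weights; the decoder is the physics, so the learned parameters stay physically interpretable.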

Resources

  1. Tutorials/demos about how to use Microstructure.jl will be available soon on the documentation website
  2. For neural network examples using Flux, there are various models that you can reference at the Flux model zoo
  3. Python implementation example for the network models

Julia is a programming language designed for high performance. If you are interested in Julia or have experience in related areas using other languages, join me in hacking towards the goal!