---
title: Title
keywords: fastai
sidebar: home_sidebar
nb_path: "nbs/index.ipynb"
---
The best of two worlds: combining state-of-the-art deep learning with a barrier-free environment for life science researchers.
Read the paper, watch the tutorials, or read the docs.
- No coding skills required (graphical user interface)
- Ground truth estimation from the annotations of multiple experts for model training and validation
- Quality assurance and out-of-distribution detection for reliable prediction on new data
- Best-in-class performance for semantic and instance segmentation
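As a sketch of how the GT Estimation task combines annotations from multiple experts, simple majority voting over binary masks can be written in plain Python. This is a toy illustration with hypothetical 1-D masks, not the deepflash2 implementation, which operates on image data and also offers STAPLE:

```python
def majority_vote(masks):
    """Pixel-wise majority vote over equal-length binary masks.

    masks: list of lists, one binary mask per expert.
    Returns 1 where more than half of the experts labeled the pixel.
    """
    n = len(masks)
    return [1 if sum(pixel) * 2 > n else 0 for pixel in zip(*masks)]

# three (hypothetical) expert annotations of the same four pixels
expert_masks = [
    [0, 1, 1, 0],
    [0, 1, 0, 0],
    [1, 1, 1, 0],
]
print(majority_vote(expert_masks))  # → [0, 1, 1, 0]
```

Pixels annotated by at least two of the three experts end up in the estimated ground truth; STAPLE refines this idea by additionally weighting each expert's reliability.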
Kaggle Gold Medal and Innovation Prize Winner: The deepflash2 Python API laid the foundation for winning the Innovation Award and a Kaggle Gold Medal in the HuBMAP - Hacking the Kidney challenge. Have a look at our solution.
Get started in less than a minute. Watch the tutorials for help. For a quick start, run deepflash2 in Google Colaboratory with free access to graphics processing units (GPUs).
To try the functionalities of deepflash2, open the deepflash2 GUI in Colab or follow the installation instructions below. The GUI provides a built-in option to load sample data. After starting the GUI, select the task (GT Estimation, Training, or Prediction) and click Load Sample Data. For further instructions, watch the tutorials.
We provide an overview of the tasks below:
| | Ground Truth (GT) Estimation | Training | Prediction |
|---|---|---|---|
| Main Task | STAPLE or Majority Voting | Ensemble training and validation | Semantic and instance segmentation |
| Sample Data | 5 masks from 5 experts each | 5 image/mask pairs | 5 images and 2 trained models |
| Expected Output | 5 GT segmentation masks | 5 models | 5 predicted segmentation masks (semantic and instance) and uncertainty maps |
| Estimated Time | ~ 1 min | ~ 150 min | ~ 4 min |
Times are estimated for Google Colab (with free NVIDIA Tesla K80 GPU). You can download the sample data here.
We provide a complete guide to reproduce our experiments using the deepflash2 Python API here. The data is currently available on Google Drive.
The preprint of our paper is available on arXiv. Please cite
@misc{griebel2021deepflash2,
title={Deep-learning in the bioimaging wild: Handling ambiguous data with deepflash2},
author={Matthias Griebel and Dennis Segebarth and Nikolai Stein and Nina Schukraft and Philip Tovote and Robert Blum and Christoph M. Flath},
year={2021},
eprint={2111.06693},
archivePrefix={arXiv}
}
Works in the browser and on your local PC/server
deepflash2 is designed to run on Windows, Linux, or Mac (x86-64) if pytorch is installable. We generally recommend using Google Colab as it only requires a Google Account and a device with a web browser. To run deepflash2 locally, we recommend using a system with a GPU (e.g., 2 CPUs, NVIDIA Tesla K80 GPU or better).
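To check whether a GPU is visible to PyTorch on your local system before running deepflash2, a quick sanity check can help (a minimal sketch that falls back gracefully if PyTorch is not installed):

```python
# Report which device PyTorch would use; degrade gracefully without PyTorch.
try:
    import torch
    device = "cuda" if torch.cuda.is_available() else "cpu"
except ImportError:
    device = "cpu (PyTorch not installed)"

print(f"deepflash2 would run on: {device}")
```

If this prints `cpu` on a machine with an NVIDIA GPU, the installed PyTorch build likely lacks CUDA support; reinstall PyTorch following the official instructions for your CUDA version.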
Software dependencies are defined in the settings.ini file. Additionally, the ground truth estimation functionalities are based on simpleITK>=2.0, and the instance segmentation capabilities are complemented using cellpose with commit hash 316927eff7ad2201391957909a2114c68baee309.
deepflash2 is tested on Google Colab (Ubuntu 18.04.5 LTS) and locally (Ubuntu 20.04 LTS, Windows 10 (tbd), macOS 12.0.1 (tbd)).
Typical install time is about 1-5 minutes, depending on your internet connection.
The GUI of deepflash2 runs as a web application inside a Jupyter Notebook, the de-facto standard of computational notebooks in the scientific community. The GUI is built on top of the deepflash2 Python API, which can be used independently (read the docs).
Execute the Set up environment cell or follow the pip instructions.
We recommend installation into a new, clean environment.
conda install -c conda-forge -c fastchan -c matjesg deepflash2
You should install PyTorch first by following the installation instructions of pytorch.
pip install deepflash2
If you want to use the GUI, make sure to download the GUI notebook and start a Jupyter server.
curl -o deepflash2_GUI.ipynb https://raw.githubusercontent.com/matjesg/deepflash2/master/deepflash2_GUI.ipynb
jupyter notebook
Then, open deepflash2_GUI.ipynb within the Notebook environment.
Docker images for deepflash2 are built on top of the latest pytorch image.
# CPU only
docker run -p 8888:8888 matjes/deepflash2 ./run_jupyter.sh
# with GPU support
docker run --gpus all --shm-size=256m -p 8888:8888 matjes/deepflash2 ./run_jupyter.sh
All docker containers are configured to start a jupyter server. To add data, we recommend using bind mounts with /workspace as target. To start the GUI, open deepflash2_GUI.ipynb within the Notebook environment.
For more information on how to run docker see docker orientation and setup.
If you don't have labelled training data available, you can use this instruction manual for creating segmentation maps. The ImageJ macro is available here.