deepMTJ
is a machine learning approach for the automatic tracking of muscle-tendon junctions (MTJ) in ultrasound images. Our method is based on a convolutional neural network trained to infer MTJ positions across various ultrasound systems from different vendors, with data collected in independent laboratories by diverse observers, on distinct muscles and movements. We built deepMTJ
to support clinical biomechanists and locomotion researchers with an open-source tool for gait analyses.
This repository contains the full Python source code of deepMTJ
including:
- a Google Colab notebook to make inferences online
- input/output utilities to load data and save predictions
- the model backbone and trained model weights to PREDICT, TRANSFER and LEARN (see the fine-tuning sketch after this list)
- a diverse test dataset annotated by 4 specialists (2-10 years of experience) and containing 1344 ultrasound images of muscle-tendon junctions to benchmark future models
- a labeling tool to annotate ultrasound images (discontinued after v1.2)
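As a rough illustration of the TRANSFER use case, the following is a generic Keras fine-tuning sketch. It assumes the downloaded weights can be loaded as a full Keras model; the file name, the frozen-layer cut-off and the loss are placeholders and not the deepMTJ training configuration (the actual training code lives in mtj_tracking/train).

```python
import tensorflow as tf

# Placeholder file name - point this at the model downloaded from Google Cloud Storage.
# Custom layers may additionally require the `custom_objects` argument of load_model.
model = tf.keras.models.load_model("deepmtj_model.h5", compile=False)

# Freeze the early (encoder) layers and fine-tune only the remaining ones on your own data.
for layer in model.layers[:-10]:      # cut-off chosen arbitrarily for illustration
    layer.trainable = False

# Optimizer and loss are assumptions, not the published deepMTJ training setup.
model.compile(optimizer=tf.keras.optimizers.Adam(1e-4), loss="binary_crossentropy")
# model.fit(your_images, your_labels, epochs=10, batch_size=8)
```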
- Cloud-based predictions are accessible via deepmtj.org. These services run in beta and have a data-size limitation of 200 MB (see the size-check sketch below).
- Or use the Python code to predict MTJs. Have a look at the nitty-gritty guide to the repository.
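The 200 MB limit can be checked locally before uploading. Below is a minimal sketch using only the Python standard library; the folder path is a placeholder.

```python
from pathlib import Path

MAX_UPLOAD_MB = 200  # current beta limit of deepmtj.org

def folder_size_mb(folder):
    """Sum the size of all files below `folder` in megabytes."""
    total_bytes = sum(f.stat().st_size for f in Path(folder).rglob("*") if f.is_file())
    return total_bytes / (1024 ** 2)

size = folder_size_mb("my_ultrasound_videos")  # placeholder path
if size > MAX_UPLOAD_MB:
    print(f"{size:.1f} MB exceeds the {MAX_UPLOAD_MB} MB upload limit - split the data into smaller batches.")
else:
    print(f"{size:.1f} MB - ready to upload to deepmtj.org")
```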
[1] @article{deepmtj2021a,
      title={A Human-Centered Machine-Learning Approach for Muscle-Tendon Junction Tracking in Ultrasound Images},
      author={Leitner, Christoph and Jarolim, Robert and Englmair, Bernhard and Kruse, Annika and Hernandez, Karen Andrea Lara and Konrad, Andreas and Su, Eric and Schröttner, Jörg and Kelly, Luke A. and Lichtwark, Glen A. and Tilp, Markus and Baumgartner, Christian},
      journal={IEEE Transactions on Biomedical Engineering},
      publisher={IEEE},
      year={2021},
      doi={10.1109/TBME.2021.3130548}
    }
[2] @misc{deepmtj2021b,
      title={{deepMTJ test-set data}},
      author={Leitner, Christoph and Jarolim, Robert and Englmair, Bernhard and Kruse, Annika and Hernandez, Karen Andrea Lara and Konrad, Andreas and Su, Eric and Schröttner, Jörg and Kelly, Luke A. and Lichtwark, Glen A. and Tilp, Markus and Baumgartner, Christian},
      year={2021},
      doi={10.6084/m9.figshare.16822978.v2}
    }
[3] @inproceedings{deepmtj2020,
      title={Automatic Tracking of the Muscle Tendon Junction in Healthy and Impaired Subjects using Deep Learning},
      author={Leitner, Christoph and Jarolim, Robert and Konrad, Andreas and Kruse, Annika and Tilp, Markus and Schröttner, Jörg and Baumgartner, Christian},
      booktitle={2020 42nd Annual International Conference of the IEEE Engineering in Medicine \& Biology Society (EMBC)},
      publisher={IEEE},
      pages={4770--4774},
      year={2020},
      doi={10.1109/EMBC44109.2020.9176145}
    }
mtj_tracking/label
folder contains the video annotation tool (discontinued after v1.2). For our recent publication (Leitner et al. 2021a [1]) we used the online labeling tool Labelbox.
mtj_tracking/train
folder contains the model backbone, the network training, and the evaluation code.
This repository comprises the deepMTJ TensorFlow model. This U-Net model with an attention mechanism was trained on a total of 66864 annotated ultrasound images of the muscle-tendon junction. The training dataset covers 3 functional movements (isometric maximum voluntary contractions, passive torque movements, running) and 2 muscles (Lateral Gastrocnemius, Medial Gastrocnemius), collected from 123 healthy and 38 impaired subjects with 3 different ultrasound systems (Aixplorer V6, Esaote MyLab60, Telemed ArtUs).
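For readers unfamiliar with attention-gated U-Nets, the sketch below illustrates the general mechanism in Keras: features from the encoder skip connection are re-weighted by a gating signal from the coarser decoder level before concatenation. This is an illustrative toy example only, not the deepMTJ implementation; layer choices, filter counts and input size are assumptions (the actual model is defined in mtj_tracking/train).

```python
import tensorflow as tf
from tensorflow.keras import layers

def attention_gate(skip, gating, inter_channels):
    """Re-weight encoder `skip` features with a gating signal from the decoder (toy example)."""
    theta = layers.Conv2D(inter_channels, 1)(skip)        # project skip-connection features
    phi = layers.Conv2D(inter_channels, 1)(gating)        # project gating signal
    phi = layers.UpSampling2D()(phi)                       # match the spatial size of the skip
    attn = layers.Activation("relu")(layers.Add()([theta, phi]))
    attn = layers.Conv2D(1, 1, activation="sigmoid")(attn)  # attention coefficients in [0, 1]
    return layers.Multiply()([skip, attn])                 # suppress irrelevant image regions

def decoder_block(x, skip, filters):
    """One decoder step: attend to the skip connection, upsample, concatenate, convolve."""
    gated = attention_gate(skip, x, filters // 2)
    x = layers.UpSampling2D()(x)
    x = layers.Concatenate()([x, gated])
    return layers.Conv2D(filters, 3, padding="same", activation="relu")(x)

# Toy model: 64x64 grayscale input, one encoder/decoder level, single-channel output map.
inputs = layers.Input((64, 64, 1))
e1 = layers.Conv2D(16, 3, padding="same", activation="relu")(inputs)
p1 = layers.MaxPooling2D()(e1)
bottleneck = layers.Conv2D(32, 3, padding="same", activation="relu")(p1)
d1 = decoder_block(bottleneck, e1, 16)
outputs = layers.Conv2D(1, 1, activation="sigmoid")(d1)
model = tf.keras.Model(inputs, outputs)
```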
You can download the test dataset [2] (464 MB) used in Leitner et al. 2021a, e.g. to benchmark your own model. This dataset comprises 1344 images of muscle-tendon junctions recorded with 3 ultrasound imaging systems (Aixplorer V6, Esaote MyLab60, Telemed ArtUs), on 2 muscles (Lateral Gastrocnemius, Medial Gastrocnemius), and during 2 movements (isometric maximum voluntary contractions, passive torque movements). We have included the ground-truth label for each image. These reference labels are the computed mean of 4 specialist labels. The specialist annotators had 2-10 years of experience in biomechanical and clinical research on muscles and tendons and had contributed to 2-9 ultrasound studies in the past 2 years.
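If you benchmark your own model against this test set, a straightforward error measure is the Euclidean distance (in pixels) between predicted and reference MTJ positions. The sketch below assumes both are stored as CSV files with one row per image (in the same order) and x/y coordinates in columns 1 and 2; file names and column indices are placeholders, adapt them to the downloaded dataset and your own output format.

```python
import numpy as np

# Assumed layout: image name in column 0, x/y coordinates in pixels in columns 1 and 2,
# rows in the same image order in both files. Adapt names and columns to your data.
reference = np.loadtxt("reference_labels.csv", delimiter=",", skiprows=1, usecols=(1, 2))
predicted = np.loadtxt("my_model_predictions.csv", delimiter=",", skiprows=1, usecols=(1, 2))

# Per-image Euclidean distance and summary statistics across the test images.
errors = np.linalg.norm(predicted - reference, axis=1)
print(f"mean error:   {errors.mean():.2f} px")
print(f"median error: {np.median(errors):.2f} px")
print(f"95th pct:     {np.percentile(errors, 95):.2f} px")
```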
The provided dataset and models are licensed under a Creative Commons Attribution 4.0 International License.
data
folder contains additional result plots and figures in high resolution.
This is one possible way to run the #deepMTJ source code on your computer.
- Install Anaconda for Python v3.7 (when prompted, choose to include Python in your path).
- Download the trained model from Google Cloud Storage.
- Clone this GitHub repository to your local machine using https://github.com/luuleitner/deepMTJ.
- Open the terminal and navigate to the downloaded repository (cd <<repository path>>).
- Create the deepMTJ virtual environment in your Anaconda terminal and install all necessary libraries (listed in the requirements.txt file) using the following code in the terminal window:
conda create --name deepMTJ python=3.7 --file requirements.txt
conda activate deepMTJ
- Run the model:
python -m mtj_tracking.predict.main <<downloaded model path>> <<input path>> <<output path>>
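If several recording sessions need to be processed, the command above can be wrapped in a short batch script. The sketch below only re-uses the documented command line; all paths are placeholders.

```python
import subprocess
import sys
from pathlib import Path

MODEL = "models/deepmtj_model.h5"    # placeholder: path of the downloaded model
SESSIONS = Path("data/sessions")     # placeholder: one sub-folder per recording session
OUTPUT = Path("results")

for session in sorted(SESSIONS.iterdir()):
    if not session.is_dir():
        continue
    out_dir = OUTPUT / session.name
    out_dir.mkdir(parents=True, exist_ok=True)
    # Same call as the step above: model path, input path, output path.
    subprocess.run(
        [sys.executable, "-m", "mtj_tracking.predict.main", MODEL, str(session), str(out_dir)],
        check=True,
    )
```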
The experimental work and cloud deployments of the present study were supported by Google Cloud infrastructure.
This program is free software and licensed under the GNU General Public License v3.0 - see the LICENSE file for details.