```
│   ├── mouse_split.json
│   ├── train_tfrecords.sh
│   ├── utils.py
│   ├── visualize_lstm_attention_weights.py
│   └── example_notebook.ipynb
├── Results
│   ├── fragmented_sleep.m
│   ├── overlay_heatmap_gradcam.m
│   ├── ...
│   └── visualize_attention_weights.m
```
#### The `Scripts` folder mainly contains the scripts for training and testing the bidirectional LSTM model.
- `main.py`: the main Python script to launch network training, validation, testing, Grad-CAM computation, and extraction of the temporal attention weights, selected by the `mode` parameter in the config file `train_tfrecords.sh`.
- `create_tfrecords*.py`: data preprocessing code to create tfrecords from continuous WFCI recordings. When the requested epoch length is larger than 10 seconds, the code takes additional frames from the adjacent epochs to compose the final epoch, e.g., the adjacent 5 seconds from epoch N-1 and epoch N+1 are taken to compose a 20-second epoch N (see the first sketch after this list).
- `dataloader_sleep.py`: the dataloader that creates the `tf.data.Dataset` object for network input; modify it to match your own data.
- `model_attention_bilstm.py`: code for building the hybrid attention-based bi-LSTM model.
- `AttentionLayer.py`: TensorFlow wrapper functions for building various types of attention modules, including the LSTM attention, the spatial [SimAM](http://proceedings.mlr.press/v139/yang21o.html) and [CBAM](https://doi.org/10.48550/arXiv.1807.06521) modules (see the second sketch after this list).
- `gradcam.py`: code to compute the [Grad-CAM](https://doi.org/10.48550/arXiv.1610.02391) heatmaps.
- `visualize_lstm_attention_weights.py`: code to visualize the learned attention score of each time step in a given 10-s input.
- `mouse_split.json`: config file defining the train/validation/test split.
- `example_notebook.ipynb`: a Jupyter notebook example for training the model interactively.
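
Below is a minimal, illustrative sketch of the epoch-composition step described for `create_tfrecords*.py` (the first sketch referenced above). The 10 Hz frame rate, array shapes, function name, and boundary handling are assumptions for the example, not the exact implementation in this repository.

```python
import numpy as np

def compose_epoch(frames, epoch_idx, fps=10.0, base_len_s=10.0, target_len_s=20.0):
    """Extend epoch N by borrowing frames symmetrically from epochs N-1 and N+1."""
    base = int(round(base_len_s * fps))                         # frames in a standard 10-s epoch
    half = int(round((target_len_s - base_len_s) * fps)) // 2   # frames borrowed per neighbor

    start = epoch_idx * base - half                              # reach back into epoch N-1
    stop = (epoch_idx + 1) * base + half                         # reach forward into epoch N+1

    # Clamp to the recording boundaries (edge epochs cannot borrow on one side).
    start, stop = max(start, 0), min(stop, frames.shape[0])
    return frames[start:stop]

# Example: a dummy 60-s recording at 10 fps, 128x128 pixels; epoch 3 becomes 20 s long.
recording = np.zeros((600, 128, 128), dtype=np.float32)
print(compose_epoch(recording, epoch_idx=3).shape)  # (200, 128, 128)
```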
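
And a minimal TensorFlow sketch of the parameter-free [SimAM](http://proceedings.mlr.press/v139/yang21o.html) spatial attention wrapped in `AttentionLayer.py` (the second sketch referenced above), following the published energy formulation; the class name, NHWC layout, and `e_lambda` value are assumptions, not this repository's exact code.

```python
import tensorflow as tf

class SimAM(tf.keras.layers.Layer):
    """Parameter-free SimAM attention (Yang et al., 2021) for NHWC feature maps."""

    def __init__(self, e_lambda=1e-4, **kwargs):
        super().__init__(**kwargs)
        self.e_lambda = e_lambda

    def call(self, x):
        # Number of spatial positions minus one, as in the SimAM energy function.
        h, w = tf.shape(x)[1], tf.shape(x)[2]
        n = tf.cast(h * w - 1, x.dtype)

        # Squared deviation of each activation from its channel-wise spatial mean.
        mu = tf.reduce_mean(x, axis=[1, 2], keepdims=True)
        d = tf.square(x - mu)

        # Inverse energy: distinctive (low-energy) neurons receive larger weights.
        v = tf.reduce_sum(d, axis=[1, 2], keepdims=True) / n
        e_inv = d / (4.0 * (v + self.e_lambda)) + 0.5
        return x * tf.sigmoid(e_inv)

# Example usage on a dummy batch of feature maps.
print(SimAM()(tf.random.normal((2, 64, 64, 16))).shape)  # (2, 64, 64, 16)
```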
#### The `Results` folder mainly contains the MATLAB code to analyze the sleep scoring results.
- `plot_hypnogram.m`: code to plot the color-coded hypnogram.
- `plot_gradcam.m`: overlay the Grad-CAM heatmap on a selected frame with the help of the function `overlay_heatmap_gradcam.m`.
- `fragmented_sleep.m`: code to calculate the number of sleep transitions and the average bout length in each of the sleep states (see the sketch after this list).
- `visualize_attention_weights.m`: visualize the color-coded attention weights.
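
For readers without MATLAB, here is a minimal Python sketch of the fragmentation statistics computed by `fragmented_sleep.m` (the sketch referenced above). The state labels, the 10-s epoch length, and the function name are assumptions for the example; the repository's analysis is the MATLAB script itself.

```python
import numpy as np

def fragmentation_stats(hypnogram, epoch_len_s=10.0):
    """Count state transitions and the average bout length (in seconds) per state."""
    hypnogram = np.asarray(hypnogram)
    # A transition occurs whenever an epoch's label differs from the previous one.
    change = np.flatnonzero(hypnogram[1:] != hypnogram[:-1]) + 1
    n_transitions = change.size

    # Split the hypnogram into bouts (runs of identical labels) and average per state.
    bouts = np.split(hypnogram, change)
    mean_bout_len = {
        state: float(np.mean([len(b) * epoch_len_s for b in bouts if b[0] == state]))
        for state in np.unique(hypnogram)
    }
    return n_transitions, mean_bout_len

# Example: 8 epochs -> 3 transitions; the two wake bouts average 15 s.
labels = ["wake", "wake", "NREM", "NREM", "NREM", "REM", "REM", "wake"]
print(fragmentation_stats(labels))
```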
## Running the Code
The main entry point is:
```
./train_tfrecords.sh
```
with the `mode` parameter set to `train` to train the network, `gradcam` to generate the Grad-CAM heatmaps for given inputs, and `attention_weights` to extract the temporal attention scores.
## Citations
If you use our code to classify brain states from WFCI data, and/or the example data, in your research, the authors of this software ask that you cite our paper and/or conference proceedings in your related publications.
```
@inproceedings{zhang2023attention,
  title={Attention-based CNN-BiLSTM for sleep state classification of spatiotemporal wide-field calcium imaging data},
  author={Zhang, Xiaohui and Landsness, Eric C and Culver, Joseph P and Lee, Jin-Moo and Anastasio, Mark A},
  booktitle={Neural Imaging and Sensing 2023},
  volume={12365},
  pages={39--42},
  year={2023},
  organization={SPIE}
}

@article{zhang2022automated,
  title={Automated sleep state classification of wide-field calcium imaging data via multiplex visibility graphs and deep learning},
  author={Zhang, Xiaohui and Landsness, Eric C and Chen, Wei and Miao, Hanyang and Tang, Michelle and Brier, Lindsey M and Culver, Joseph P and Lee, Jin-Moo and Anastasio, Mark A},
  journal={Journal of neuroscience methods},
  volume={366},
  pages={109421},
  year={2022},
  publisher={Elsevier}
}

@article{chen2022validation,
  title={Validation of Deep Learning-based Sleep State Classification},
  author={Chen, Wei and Zhang, Xiaohui and Miao, Hanyang and Tang, Michelle J and Anastasio, Mark and Culver, Joseph and Lee, Jin-Moo and Landsness, Eric C},