Capsule theory is a promising line of research proposed by Geoffrey E. Hinton et al., in which he describes the shortcomings of Convolutional Neural Networks, such as vulnerability to "pixel attacks", and how capsules could circumvent these problems to create more robust neural network architectures built from capsule layers.
We expect this theory to make a real contribution to the deep learning field, and we are excited about it. For that reason, we are proud to introduce CapsLayer, an advanced library for capsule theory that integrates capsule-relevant technologies, provides analysis tools, develops related application examples, and, perhaps most importantly, promotes the development of capsule theory.
This library is based on TensorFlow and has a similar API, but it is designed for capsule layers/models.
TensorFlow-like API for building neural net blocks; see the API docs for more details:
Datasets support:
[ ] small NORB
Capsule Nets Model examples:
Algorithm support:
If you want us to support more features, let us know by opening an issue or sending an e-mail to [email protected]
Feel free to send a pull request or open an issue.
If you find it useful, please cite our project with the following BibTeX entry:
@misc{HuadongLiao2017,
  title = {CapsLayer: An advanced library for capsule theory},
  author = {Huadong Liao and Jiawei He},
  year = {2017},
  publisher = {GitHub},
  journal = {GitHub Project},
  howpublished = {\url{http://naturomics.com/CapsLayer}},
}
Note: We are considering writing a paper about this project, but until then, please cite the BibTeX entry above if you find it helpful.
Apache 2.0 license.
Hello! I've found a performance issue in /capslayer/data/datasets: batch() should be called before map(), which could make your program more efficient. Here is the TensorFlow documentation to support it.
Detailed description is listed below:
dataset.batch(batch_size) (here) should be called before dataset.map(parse_fun) (here). The same change applies at the other three call sites of this pattern. Besides, you need to check whether the function called in map() (e.g., parse_fun in dataset.map(parse_fun)) is affected by the change, to make sure the modified code still works properly. For example, if parse_fun expected data with shape (x, y, z) as its input before the fix, it will receive data with shape (batch_size, x, y, z) afterwards.
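A minimal sketch of the suggested reordering. The parse_fun below is a hypothetical stand-in for the one in the repository, and the shapes are illustrative, not the project's actual data:

```python
import tensorflow as tf

# Hypothetical stand-in for the repository's parse_fun; written so that it
# also works on a whole batch, since after the change its input gains a
# leading batch dimension.
def parse_fun(image):
    return tf.cast(image, tf.float32) / 255.0

dataset = tf.data.Dataset.from_tensor_slices(tf.zeros([100, 28, 28], tf.uint8))

# Before: parse_fun is invoked once per element.
slow = dataset.map(parse_fun).batch(32)

# After: parse_fun is invoked once per batch of 32 elements.
fast = dataset.batch(32).map(parse_fun)
```

Note that this parse_fun happens to be batch-safe because tf.cast and division broadcast over the leading dimension; a parse function using per-image ops would need rewriting first.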
Looking forward to your reply. Btw, I am very glad to create a PR to fix it if you are too busy.
Hello, I found a performance issue in the definition of __call__(self, batch_size, mode) in capslayer/data/datasets/cifar10/reader.py:
dataset = dataset.map(parse_fun) is called without num_parallel_calls. I think adding it will increase the efficiency of your program.
The same issue also exists at the other call sites of dataset = dataset.map(parse_fun).
Here is the TensorFlow documentation to support this.
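A minimal sketch of the suggested change, again with a hypothetical parse_fun standing in for the repository's (in TF 1.x the constant lives at tf.data.experimental.AUTOTUNE instead of tf.data.AUTOTUNE):

```python
import tensorflow as tf

# Hypothetical stand-in for the repository's per-record parsing logic.
def parse_fun(image):
    return tf.cast(image, tf.float32) / 255.0

dataset = tf.data.Dataset.from_tensor_slices(tf.zeros([100, 32, 32, 3], tf.uint8))

# Parallelize the map transformation; AUTOTUNE lets tf.data choose the
# number of parallel calls based on available CPU.
dataset = dataset.map(parse_fun, num_parallel_calls=tf.data.AUTOTUNE)
```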
Looking forward to your reply. Btw, I am very glad to create a PR to fix it if you are too busy.
Thank you for this project. Is it possible to do a multi-label classification with CapsNet such that the softmax output can predict multiple classes for each input image?
Thanks so much, Abby
Hi,
first of all, thanks for this great project!
I noticed that in models/main.py (line 206) the Fashion-MNIST dataset is called 'fashion-mnist':
# Deciding which dataset to use
if cfg.dataset == 'mnist' or cfg.dataset == 'fashion-mnist':
But in capslayer/data/datasets/fashion_mnist it is called 'fashion_mnist', so main.py doesn't run with the Fashion-MNIST dataset.
Another thing:
in capslayer/data/datasets/fashion_mnist/writer.py
MNIST_FILES should be FASHION_MNIST_FILES:
def load_fashion_mnist(path, split):
split = split.lower()
image_file, label_file = [os.path.join(path, file_name) for file_name in MNIST_FILES[split]]
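To make the name mismatch concrete, here is a self-contained sketch; the two dataset-name strings come from the report above, while everything else is illustrative:

```python
# Directory name under capslayer/data/datasets/
dataset_name = 'fashion_mnist'

# The check in models/main.py (line 206) uses a hyphen:
accepted = ('mnist', 'fashion-mnist')
assert dataset_name not in accepted  # so Fashion-MNIST is never matched

# One possible fix: spell the name the same way in both places.
accepted_fixed = ('mnist', 'fashion_mnist')
assert dataset_name in accepted_fixed
```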
Hello naturomics! Thank you for this great code, which is helping me understand capsule networks. In the documentation I saw that all of the capsules show activation, but when I ran the code, 3 capsules were not activated. I ran the code multiple times and saw that around 2-4 capsules have no activation probability. I've attached pictures taken after 500 and 49500 steps, along with your provided example activation chart. Could you please help me solve this issue?
capsnet tensorflow capslayer capsule-network matrix-capsule em-routing routing-algorithm