Previously, we published a similar list, List of Python Libraries For Data Science & Machine Learning. To repeat the point made there: a working knowledge of Python is a prerequisite for data science development. We assume most readers are already familiar with installing and working with Jupyter. This article is not exclusively related to Jupyter Notebook, but it is important for developers. Our multi-part article Approaches of Deep Learning covers the required background theory. Note that this list is about deep learning, not machine learning in general.
Most of the libraries below can be installed with pip install and can be found on GitHub. Typically only one or two packages are needed for common tasks. Theano, TensorFlow, and Keras are the most commonly used; MXNet supports many languages.
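For example, the two most commonly used packages can be installed in one command (package names as published on PyPI; pinning versions is left to the reader):

```shell
# Install two widely used deep learning packages from PyPI
pip install tensorflow keras
```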
- TensorFlow : For numerical computation using data flow graphs with flexible architecture.
- Theano : define, optimize, and evaluate mathematical expressions
- Blocks : a Theano framework for building and training neural networks
- PyTorch : Tensor computation with GPU acceleration
- DeepLearning4J : a deep learning framework for the JVM
- Apache MXNet : framework designed for efficiency & flexibility.
- Caffe : a deep learning framework developed by Berkeley AI Research and community contributors
- Keras : a high-level neural networks API that runs on top of backends such as TensorFlow and Theano
- nolearn : utilities for models such as Deep Belief Networks, compatible with scikit-learn
- fast.ai : includes out of the box support for vision, text, tabular, and collaborative filtering models
- Microsoft Cognitive Toolkit (cntk.ai) : CNTK allows combining popular model types
- TFLearn : built on top of TensorFlow to provide a higher-level API that speeds up experimentation
- Elephas : an extension of Keras that runs deep learning models with Spark
- spark-deep-learning : high-level APIs for deep learning in Python with Apache Spark
- Distributed Keras : framework built on top of Apache Spark and Keras
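To give a sense of what these frameworks automate, here is a minimal sketch in plain Python (no framework, standard library only): the "define a model, evaluate a loss, optimize parameters" loop that Theano, TensorFlow, and PyTorch perform for you, with the gradient written out by hand.

```python
# Fit y = w * x to toy data generated with w = 3, using gradient descent.
# Frameworks like TensorFlow/PyTorch compute this gradient automatically;
# here the derivative of the mean squared error is written out by hand.
data = [(x, 3.0 * x) for x in range(1, 6)]

w = 0.0    # initial parameter
lr = 0.01  # learning rate

for _ in range(200):
    # d/dw of mean((w*x - y)^2) = mean(2 * (w*x - y) * x)
    grad = sum(2 * (w * x - y) * x for x, y in data) / len(data)
    w -= lr * grad

print(round(w, 3))  # converges toward 3.0
```

With a real framework, the gradient line disappears: you only define the expression for the loss, and automatic differentiation derives the update.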
That ends this article. We have not gone into much depth here, as the official documentation and examples are better than anything we could write.