Here I talk about Layers, the basic building blocks of Keras.
Layers are essentially little stateful functions: they generally have weights associated with them, and those weights can be trainable or non-trainable (when we fit a model, it is the trainable weights that get updated).
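Here's a minimal sketch (assuming the TensorFlow-bundled Keras API) of how a layer's state splits into trainable and non-trainable weights:

```python
import tensorflow as tf

layer = tf.keras.layers.Dense(4)
layer.build(input_shape=(None, 8))       # creates the layer's weights

print(len(layer.trainable_weights))      # 2: kernel and bias, updated during fit
print(len(layer.non_trainable_weights))  # 0 for a plain Dense layer

layer.trainable = False                  # freeze the layer
print(len(layer.trainable_weights))      # now 0: weights are excluded from training
```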
I explain what general layers are and then walk you through two special layers: Input layers and Lambda layers.
In general, Dense layers act like functions that you can call on inputs. Each Dense layer along the path from input to output holds a weight matrix and a bias vector: roughly speaking, your input is multiplied by the weights and the biases are added to the result. We can also overwrite a layer's weights using layer.set_weights.
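A short sketch (tf.keras assumed) of calling a Dense layer and swapping its weights:

```python
import numpy as np
import tensorflow as tf

layer = tf.keras.layers.Dense(2, activation=None)
x = tf.ones((1, 3))                # a dummy batch of one 3-dimensional input
y = layer(x)                       # calling the layer builds kernel (3x2) and bias (2,)

kernel, bias = layer.get_weights()
print(kernel.shape, bias.shape)    # (3, 2) (2,)

# Overwrite the weights: the output becomes x @ new_kernel + new_bias
new_kernel = np.zeros((3, 2), dtype=np.float32)
new_bias = np.ones(2, dtype=np.float32)
layer.set_weights([new_kernel, new_bias])
print(layer(x).numpy())            # [[1. 1.]]
```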
I then talk about saving and loading individual layer configs by getting the config (layer.get_config()) and reconstructing the layer from it.
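A minimal sketch (tf.keras assumed) of round-tripping a layer through its config; note the config stores hyperparameters, not the weight values:

```python
import tensorflow as tf

layer = tf.keras.layers.Dense(16, activation='relu', name='hidden')
config = layer.get_config()                   # plain dict of the layer's hyperparameters
print(config['units'], config['activation'])  # 16 relu

# Rebuild an identical (but freshly initialized) layer from the config.
rebuilt = tf.keras.layers.Dense.from_config(config)
print(rebuilt.name)                           # hidden
```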
I then move on to the two special layers:
Input Layers are special because they allow you to specify an input shape.
Lambda Layers are special because they cannot have any internal state; a quick sketch of both follows.
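Here's a quick sketch (tf.keras assumed) of both: Input fixes the input shape, and Lambda wraps a stateless function that carries no weights of its own.

```python
import tensorflow as tf

inputs = tf.keras.Input(shape=(10,))                       # batches of 10-dim vectors
doubled = tf.keras.layers.Lambda(lambda t: t * 2)(inputs)  # stateless transform
outputs = tf.keras.layers.Dense(1)(doubled)

model = tf.keras.Model(inputs, outputs)
print(model.layers[1].weights)   # [] -- the Lambda layer holds no state
```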
Finally, I show you how to write your own layer.
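A minimal custom-layer sketch (tf.keras assumed; the name MyDense is just illustrative): weights are created in build(), and the forward pass is defined in call().

```python
import tensorflow as tf

class MyDense(tf.keras.layers.Layer):
    def __init__(self, units, **kwargs):
        super().__init__(**kwargs)
        self.units = units

    def build(self, input_shape):
        # One trainable kernel and one trainable bias, shaped from the input.
        self.kernel = self.add_weight(
            shape=(input_shape[-1], self.units), initializer='glorot_uniform')
        self.bias = self.add_weight(shape=(self.units,), initializer='zeros')

    def call(self, inputs):
        return tf.matmul(inputs, self.kernel) + self.bias

    def get_config(self):
        # Lets the layer be saved and reconstructed like the built-in ones.
        return {**super().get_config(), 'units': self.units}

layer = MyDense(4)
print(layer(tf.ones((2, 3))).shape)   # (2, 4)
```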
Links:
1) Link to my Scikit Learn tutorial, A Bit of Data Science and Scikit Learn: Intro to Scikit Learn
2) The Hitchhiker's Guide to Python - one of the best handbooks to the installation, configuration, and usage of Python that I have come across: http://docs.python-guide.org/en/latest/
3) Link to Keras: https://keras.io
4) Link to TensorFlow: https://www.tensorflow.org
5) GitHub link to a-bit-of-deep-learning-and-keras notebooks: https://github.com/knathanieltucker/a...
6) Link to the History of Deep Learning video will be up soon!