Description: Example of a neural network's neural unit.png
English: The image shows an example of a neural unit in a neural network, illustrating its main components: the inputs, the weights, the bias, the summation of the weighted inputs and the bias, the activation function, and the output. The activation function chosen for this example is the ReLU (Rectified Linear Unit).
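The components described above can be sketched in a few lines of Python. This is a minimal illustration, not the figure's exact values: the input, weight, and bias numbers below are assumed for the example.

```python
def relu(z):
    # ReLU (Rectified Linear Unit): returns max(0, z)
    return max(0.0, z)

def neural_unit(inputs, weights, bias):
    # Weighted sum of the inputs plus the bias, passed through the activation.
    z = sum(w * x for w, x in zip(weights, inputs)) + bias
    return relu(z)

# Example values (assumed, not taken from the image):
# z = 0.5*1.0 + (-0.25)*2.0 + 0.1*3.0 + 0.2 = 0.5, and ReLU(0.5) = 0.5
output = neural_unit([1.0, 2.0, 3.0], [0.5, -0.25, 0.1], 0.2)
print(output)
```

A negative pre-activation value would instead yield 0, which is the defining behavior of ReLU.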
You are free:
to share – to copy, distribute and transmit the work
to remix – to adapt the work
Under the following conditions:
attribution – You must give appropriate credit, provide a link to the license, and indicate if changes were made. You may do so in any reasonable manner, but not in any way that suggests the licensor endorses you or your use.
share alike – If you remix, transform, or build upon the material, you must distribute your contributions under the same or compatible license as the original.
Captions
An example of a unit of a neural network illustrating its main components.