
tensorflow::ops::Selu

#include <nn_ops.h>

Computes the scaled exponential linear activation of features.

Summary

Computes scale * alpha * (exp(features) - 1) if features < 0, and scale * features otherwise.

See Self-Normalizing Neural Networks (Klambauer et al., 2017).
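
Written out elementwise, the activation looks like the sketch below; the alpha and scale constants are the values reported in the paper, not stated on this page, so treat them as an assumption:

  #include <cmath>

  // Elementwise SELU, scalar sketch. alpha and scale are the constants
  // reported in the Self-Normalizing Neural Networks paper (assumed here).
  double selu(double x) {
    const double alpha = 1.6732632423543772;
    const double scale = 1.0507009873554805;
    return x < 0.0 ? scale * alpha * (std::exp(x) - 1.0) : scale * x;
  }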

Arguments:

  • scope: A Scope object
  • features: The input tensor to activate.
Returns:

  • Output: The activations tensor.
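
A minimal end-to-end sketch, assuming the standard TensorFlow C++ client headers and session API; the input values are arbitrary:

  #include "tensorflow/cc/client/client_session.h"
  #include "tensorflow/cc/ops/standard_ops.h"
  #include "tensorflow/core/framework/tensor.h"

  int main() {
    tensorflow::Scope root = tensorflow::Scope::NewRootScope();
    // A small constant feature vector fed through Selu.
    auto features = tensorflow::ops::Const(root, {-1.0f, 0.0f, 2.0f});
    tensorflow::ops::Selu selu(root, features);

    // Run the graph and fetch the activations output.
    tensorflow::ClientSession session(root);
    std::vector<tensorflow::Tensor> outputs;
    TF_CHECK_OK(session.Run({selu.activations}, &outputs));
    return 0;
  }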
Constructors and Destructors
  Selu(const ::tensorflow::Scope & scope, ::tensorflow::Input features)

Public attributes
  ::tensorflow::Output activations

Public functions
  ::tensorflow::Node * node() const
  operator::tensorflow::Input() const
  operator::tensorflow::Output() const

Public attributes

activations

::tensorflow::Output activations

Public functions

Selu

 Selu(
  const ::tensorflow::Scope & scope,
  ::tensorflow::Input features
)
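
The scope argument carries op naming and error status for the emitted node. A sketch of constructing the op under a named sub-scope; the name "selu_act" is an arbitrary choice, and root / features are assumed to be defined as in the example above:

  // Assumes `root` is a tensorflow::Scope and `features` a tensorflow::Input.
  tensorflow::ops::Selu selu(root.WithOpName("selu_act"), features);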

node

::tensorflow::Node * node() const 

operator::tensorflow::Input

operator::tensorflow::Input() const 

operator::tensorflow::Output

operator::tensorflow::Output() const 
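
These conversion operators let a Selu object be passed directly wherever a ::tensorflow::Input or ::tensorflow::Output is expected. The consumer op below (ops::Identity) is just an illustrative choice, with root and selu assumed from the earlier example:

  // selu converts implicitly to ::tensorflow::Input, so it can feed
  // another op without spelling out selu.activations.
  auto copy = tensorflow::ops::Identity(root, selu);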

© 2017 The TensorFlow Authors. All rights reserved.
Licensed under the Creative Commons Attribution License 3.0.
Code samples licensed under the Apache 2.0 License.
https://www.tensorflow.org/api_docs/cc/class/tensorflow/ops/selu.html