# Function

class chainer.Function[source]

Function on variables with backpropagation ability.

All function implementations defined in chainer.functions inherit this class.

The main feature of this class is keeping track of function applications as a backward graph. When a function is applied to Variable objects, its forward() method is called on data fields of input variables, and at the same time it chains references from output variables to the function and from the function to its inputs.

As of v1.5, a function instance cannot be used twice in any computational graphs. In order to reuse a function object multiple times, use copy.copy() before the function applications to make a copy of the instance.

This restriction also means that a function can no longer be stateful. For example, a function is not allowed to hold parameters anymore. Define a function as a pure (stateless) procedure, and use Link to combine it with parameter variables.

Let x be an instance of Variable and f an instance of Function taking only one argument. Then the following lines compute a new variable y and create backward references:

>>> import numpy, chainer, chainer.functions as F
>>> x = chainer.Variable(numpy.zeros(10))
>>> f = F.Identity()
>>> y = f(x)


x <--- f <--- y


If an application of another function g occurs as

>>> g = F.Identity()
>>> z = g(x)


then the graph grows with a branch:

     |--- f <--- y
x <--+
     |--- g <--- z


Note that the branching is correctly managed on backward computation, i.e. the gradients from f and g are accumulated to the gradient of x.

Every function implementation should provide forward_cpu(), forward_gpu(), backward_cpu() and backward_gpu(). Alternatively, one can provide forward() and backward() instead of the separate methods. The backward methods have default implementations that just return None, which indicates that the function is non-differentiable.

Variables:
- inputs – A tuple or list of input variables.
- outputs – A tuple or list of output variables.
- type_check_enable – When it is True, the function checks the types of its input arguments. Set the CHAINER_TYPE_CHECK environment variable to 0 to disable type checking, or set the variable directly in your own program.
add_hook(hook, name=None)[source]

Registers a function hook.

Parameters:
- hook (FunctionHook) – Function hook to be registered.
- name (str) – Name of the function hook. The name must be unique among the function hooks registered to this function. If None, the default name of the function hook is used.
backward(inputs, grad_outputs)[source]

Applies backprop to output gradient arrays.

It delegates the procedure to backward_cpu() or backward_gpu() by default. Which one it selects is determined by the types of the input arrays and the output gradient arrays. Implementations of Function must provide either the CPU/GPU methods or this method if the function is intended to be backpropagated through.

Parameters:
- inputs – Tuple of input arrays.
- grad_outputs – Tuple of output gradient arrays.

Returns: Tuple of input gradient arrays. Some or all of them can be None, if the function is not differentiable on the inputs.

Return type: tuple

Implementations of Function must take care that the return value must be a tuple even if it returns only one array.

backward_cpu(inputs, grad_outputs)[source]

Applies backprop to output gradient arrays on CPU.

Parameters:
- inputs – Tuple of input numpy.ndarray object(s).
- grad_outputs – Tuple of output gradient numpy.ndarray object(s).

Returns: Tuple of input gradient numpy.ndarray object(s). Some or all of them can be None, if the function is not differentiable on the corresponding inputs.

Return type: tuple

Implementations of Function must take care that the return value must be a tuple even if it returns only one array.

backward_gpu(inputs, grad_outputs)[source]

Applies backprop to output gradient arrays on GPU.

Parameters:
- inputs – Tuple of input cupy.ndarray object(s).
- grad_outputs – Tuple of output gradient cupy.ndarray object(s).

Returns: Tuple of input gradient cupy.ndarray object(s). Some or all of them can be None, if the function is not differentiable on the corresponding inputs.

Return type: tuple

Implementations of Function must take care that the return value must be a tuple even if it returns only one array.

check_type_forward(in_types)[source]

Checks types of input data before forward propagation.

This function is called before forward() is called. You need to validate the types of the input data in this function using the type-checking utilities.

Parameters: in_types (TypeInfoTuple) – The type information of the input data for forward().
delete_hook(name)[source]

Unregisters the specified function hook.

Parameters: name (str) – The name of the function hook to be unregistered.
forward(inputs)[source]

Applies forward propagation to input arrays.

It delegates the procedure to forward_cpu() or forward_gpu() by default. Which one it selects is determined by the types of the input arrays. Implementations of Function must provide either the CPU/GPU methods or this method.

Parameters: inputs – Tuple of input array(s).

Returns: Tuple of output array(s).

Implementations of Function must take care that the return value must be a tuple even if it returns only one array.

forward_cpu(inputs)[source]

Applies forward propagation to input arrays on CPU.

Parameters: inputs – Tuple of input numpy.ndarray object(s).

Returns: Tuple of output numpy.ndarray object(s).

Return type: tuple

Implementations of Function must take care that the return value must be a tuple even if it returns only one array.

forward_gpu(inputs)[source]

Applies forward propagation to input arrays on GPU.

Parameters: inputs – Tuple of input cupy.ndarray object(s).

Returns: Tuple of output cupy.ndarray object(s).

Return type: tuple

Implementations of Function must take care that the return value must be a tuple even if it returns only one array.

label

Short text that represents the function.

The default implementation returns its type name. Each function should override it to give more information.

local_function_hooks

Ordered Dictionary of registered function hooks.

In contrast to chainer.thread_local.function_hooks, which registers its elements for all functions, the function hooks in this property are specific to this function.

unchain()[source]

Purges in/out variables and this function itself from the graph.

This method is called from Variable.unchain_backward() method.

chainer.force_backprop_mode()[source]

Enables back-propagation for Variables whose volatile attribute is 'auto'.

When you want to enable back-propagation inside a no_backprop_mode() context, call this method. In this context, a Variable object whose volatile attribute is 'auto' behaves like a non-volatile variable. That means you can disable no_backprop_mode() within this context.

If you call this method outside of a no_backprop_mode() context, it changes nothing: a Variable object with volatile='auto' behaves like a non-volatile variable by default.

In this example, the volatility of x and y is 'auto'. In no_backprop_mode() context, y does not have a computational graph but in force_backprop_mode() it has a graph.

>>> with chainer.no_backprop_mode():
...   # Variable with volatile='auto' behaves like volatile='on'
...   with chainer.force_backprop_mode():
...     # Variable with volatile='auto' behaves like volatile='off'
...     y = x + 1


See no_backprop_mode() for details of back-prop mode.

chainer.no_backprop_mode()[source]

Disables back-propagation for Variables whose volatile attribute is 'auto'.

In the default setting, a Variable object whose volatile attribute is 'auto' behaves like a non-volatile variable. That means such a Variable object builds a computational graph, consumes memory to store the graph, and lets you execute back-propagation on it. Within this context, such a Variable object instead behaves like a volatile variable, so you can easily switch between training and evaluation.

In this example, the volatility of x and y is 'auto'. So, y does not have a computational graph.

>>> x = chainer.Variable(numpy.array([1,], 'f'), volatile='auto')
>>> with chainer.no_backprop_mode():
...    y = x + 1