Helper functions
- lasagne.layers.get_output(layer_or_layers, inputs=None, **kwargs)
Computes the output of the network at one or more given layers. Optionally, you can define the input(s) to propagate through the network instead of using the input variable(s) associated with the network’s input layer(s).
- Parameters
layer_or_layers : Layer or list
inputs : None, Theano expression, numpy array, or dict
If None, uses the input variables associated with the InputLayer instances. If a Theano expression, this defines the input for a single InputLayer instance. Will throw a ValueError if there are multiple InputLayer instances. If a numpy array, it will be wrapped as a Theano constant and used just like a Theano expression. If a dictionary, any Layer instance (including the input layers) can be mapped to a Theano expression or numpy array to use instead of its regular output.
- Returns
output : Theano expression or list
the output of the given layer(s) for the given network input
Notes
Depending on your network architecture, get_output([l1, l2]) may be crucially different from [get_output(l1), get_output(l2)]. Only the former ensures that the output expressions depend on the same intermediate expressions. For example, when l1 and l2 depend on a common dropout layer, the former will use the same dropout mask for both, while the latter will use two different dropout masks.
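Examples
A minimal sketch (layer sizes chosen arbitrarily, assuming the standard DropoutLayer from lasagne.layers) of a basic call with an explicit input expression, and of the shared dropout mask behaviour described in the note above:
>>> import theano.tensor as T
>>> from lasagne.layers import InputLayer, DropoutLayer, DenseLayer, get_output
>>> l_in = InputLayer((None, 20))
>>> l_drop = DropoutLayer(l_in, p=0.5)
>>> l1 = DenseLayer(l_drop, num_units=50)
>>> l2 = DenseLayer(l_drop, num_units=10)
>>> x = T.matrix('x')
>>> out1 = get_output(l1, x)           # explicit input instead of l_in's variable
>>> out1, out2 = get_output([l1, l2])  # both outputs share one dropout mask
>>> out1 = get_output(l1)              # separate calls draw two independent
>>> out2 = get_output(l2)              # dropout masks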
- lasagne.layers.get_output_shape(layer_or_layers, input_shapes=None)
Computes the output shape of the network at one or more given layers.
- Parameters
layer_or_layers : Layer or list
input_shapes : None, tuple, or dict
If None, uses the input shapes associated with the InputLayer instances. If a tuple, this defines the input shape for a single InputLayer instance. Will throw a ValueError if there are multiple InputLayer instances. If a dictionary, any Layer instance (including the input layers) can be mapped to a shape tuple to use instead of its regular output shape.
- Returns
tuple or list
the output shape of the given layer(s) for the given network input
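Examples
A minimal sketch (shapes chosen arbitrarily) of computing output shapes without building or compiling any Theano expressions, including overriding the input shape via a dictionary:
>>> from lasagne.layers import InputLayer, DenseLayer, get_output_shape
>>> l_in = InputLayer((None, 20))
>>> l1 = DenseLayer(l_in, num_units=50)
>>> get_output_shape(l1)
(None, 50)
>>> get_output_shape(l1, {l_in: (64, 20)})  # override the input shape
(64, 50)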
- lasagne.layers.get_all_layers(layer, treat_as_input=None)
This function gathers all layers below one or more given Layer instances, including the given layer(s). Its main use is to collect all layers of a network just given the output layer(s). The layers are guaranteed to be returned in a topological order: a layer in the result list is always preceded by all layers its input depends on.
- Parameters
layer : Layer or list
treat_as_input : None or iterable
an iterable of Layer instances to treat as input layers with no layers feeding into them. They will show up in the result list, but their incoming layers will not be collected (unless they are required for other layers as well).
- Returns
list
a list of Layer instances feeding into the given instance(s) either directly or indirectly, and the given instance(s) themselves, in topological order.
Examples
>>> from lasagne.layers import InputLayer, DenseLayer
>>> l_in = InputLayer((100, 20))
>>> l1 = DenseLayer(l_in, num_units=50)
>>> get_all_layers(l1) == [l_in, l1]
True
>>> l2 = DenseLayer(l_in, num_units=10)
>>> get_all_layers([l2, l1]) == [l_in, l2, l1]
True
>>> get_all_layers([l1, l2]) == [l_in, l1, l2]
True
>>> l3 = DenseLayer(l2, num_units=20)
>>> get_all_layers(l3) == [l_in, l2, l3]
True
>>> get_all_layers(l3, treat_as_input=[l2]) == [l2, l3]
True
- lasagne.layers.get_all_params(layer, unwrap_shared=True, **tags)
Returns a list of Theano shared variables or expressions that parameterize the layer.
This function gathers all parameters of all layers below one or more given Layer instances, including the layer(s) itself. Its main use is to collect all parameters of a network just given the output layer(s).
By default, all shared variables that participate in the forward pass will be returned. The list can optionally be filtered by specifying tags as keyword arguments. For example, trainable=True will only return trainable parameters, and regularizable=True will only return parameters that can be regularized (e.g., by L2 decay).
- Parameters
layer : Layer or list
unwrap_shared : bool (default: True)
Affects only parameters that were set to a Theano expression. If True, the function returns the shared variables contained in the expression, otherwise the Theano expression itself.
**tags (optional)
tags can be specified to filter the list. Specifying tag1=True will limit the list to parameters that are tagged with tag1. Specifying tag1=False will limit the list to parameters that are not tagged with tag1. Commonly used tags are regularizable and trainable.
- Returns
params : list
A list of Theano shared variables or expressions representing the parameters.
Notes
If any of the layers’ parameters was set to a Theano expression instead of a shared variable, unwrap_shared controls whether to return the shared variables involved in that expression (unwrap_shared=True, the default), or the expression itself (unwrap_shared=False). In either case, tag filtering applies to the expressions, considering all variables within an expression to be tagged the same.
Examples
Collecting all parameters from a two-layer network:
>>> from lasagne.layers import InputLayer, DenseLayer
>>> l_in = InputLayer((100, 20))
>>> l1 = DenseLayer(l_in, num_units=50)
>>> l2 = DenseLayer(l1, num_units=30)
>>> all_params = get_all_params(l2)
>>> all_params == [l1.W, l1.b, l2.W, l2.b]
True
Parameters can be filtered by tags, and parameter expressions are unwrapped to return involved shared variables by default:
>>> import numpy as np
>>> import theano
>>> from lasagne.utils import floatX
>>> w1 = theano.shared(floatX(.01 * np.random.randn(50, 30)))
>>> w2 = theano.shared(floatX(1))
>>> l2 = DenseLayer(l1, num_units=30, W=theano.tensor.exp(w1) - w2, b=None)
>>> all_params = get_all_params(l2, regularizable=True)
>>> all_params == [l1.W, w1, w2]
True
When disabling unwrapping, the expression for l2.W is returned instead:
>>> all_params = get_all_params(l2, regularizable=True,
...                             unwrap_shared=False)
>>> all_params == [l1.W, l2.W]
True
- lasagne.layers.count_params(layer, **tags)
This function counts all parameters (i.e., the number of scalar values) of all layers below one or more given Layer instances, including the layer(s) itself.
This is useful to compare the capacity of various network architectures. All parameters returned by the layers' get_params methods are counted.
- Parameters
layer : Layer or list
**tags (optional)
tags can be specified to filter the list of parameter variables that will be included in the count. Specifying tag1=True will limit the list to parameters that are tagged with tag1. Specifying tag1=False will limit the list to parameters that are not tagged with tag1. Commonly used tags are regularizable and trainable.
- Returns
int
The total number of learnable parameters.
Examples
>>> from lasagne.layers import InputLayer, DenseLayer
>>> l_in = InputLayer((100, 20))
>>> l1 = DenseLayer(l_in, num_units=50)
>>> param_count = count_params(l1)
>>> param_count
1050
>>> param_count == 20 * 50 + 50  # 20 input * 50 units + 50 biases
True
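As a follow-up sketch (assuming the default parameter tags of DenseLayer, where W is regularizable and b is not), tag filtering restricts the count accordingly:
>>> count_params(l1, regularizable=True)  # counts only l1.W: 20 * 50
1000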
- lasagne.layers.get_all_param_values(layer, **tags)
This function returns the values of the parameters of all layers below one or more given Layer instances, including the layer(s) itself.
This function can be used in conjunction with set_all_param_values to save and restore model parameters.
- Parameters
layer : Layer or list
**tags (optional)
tags can be specified to filter the list. Specifying tag1=True will limit the list to parameters that are tagged with tag1. Specifying tag1=False will limit the list to parameters that are not tagged with tag1. Commonly used tags are regularizable and trainable.
- Returns
list of numpy.array
A list of numpy arrays representing the parameter values.
Examples
>>> from lasagne.layers import InputLayer, DenseLayer
>>> l_in = InputLayer((100, 20))
>>> l1 = DenseLayer(l_in, num_units=50)
>>> all_param_values = get_all_param_values(l1)
>>> (all_param_values[0] == l1.W.get_value()).all()
True
>>> (all_param_values[1] == l1.b.get_value()).all()
True
- lasagne.layers.set_all_param_values(layer, values, **tags)
Given a list of numpy arrays, this function sets the parameters of all layers below one or more given Layer instances (including the layer(s) itself) to the given values.
This function can be used in conjunction with get_all_param_values to save and restore model parameters.
- Parameters
layer : Layer or list
values : list of numpy.array
A list of numpy arrays representing the parameter values; it must match the number of parameters. Every parameter’s shape must match the shape of its new value.
**tags (optional)
tags can be specified to filter the list of parameters to be set. Specifying tag1=True will limit the list to parameters that are tagged with tag1. Specifying tag1=False will limit the list to parameters that are not tagged with tag1. Commonly used tags are regularizable and trainable.
- Raises
ValueError
If the number of values is not equal to the number of params, or if a parameter’s shape does not match the shape of its new value.
Examples
>>> from lasagne.layers import InputLayer, DenseLayer
>>> l_in = InputLayer((100, 20))
>>> l1 = DenseLayer(l_in, num_units=50)
>>> all_param_values = get_all_param_values(l1)
>>> # all_param_values is now [l1.W.get_value(), l1.b.get_value()]
>>> # ...
>>> set_all_param_values(l1, all_param_values)
>>> # the parameter values are restored.
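A sketch of the save-and-restore pattern mentioned above, persisting the collected values to disk with numpy (the file name model.npz is purely illustrative):
>>> import numpy as np
>>> np.savez('model.npz', *get_all_param_values(l1))   # save all values
>>> with np.load('model.npz') as f:                     # reload in the same order
...     values = [f['arr_%d' % i] for i in range(len(f.files))]
>>> set_all_param_values(l1, values)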