
lumin.nn.models.layers package

Submodules

lumin.nn.models.layers.activations module

lumin.nn.models.layers.activations.lookup_act(act)[source]

Map activation name to class

Parameters

act (str) – string representation of activation function

Return type

Any

Returns

Class implementing requested activation function
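Note that `lookup_act` returns the activation *class*, which must then be instantiated before use. A minimal, self-contained sketch of this name-to-class lookup pattern is shown below; the dictionary and class names here are illustrative only, not LUMIN's actual mapping:

```python
# Illustrative sketch of a string -> activation-class lookup
# (mirrors the pattern of lookup_act; not LUMIN's real table)

class ReLU:
    """Toy ReLU: max(x, 0)."""
    def __call__(self, x: float) -> float:
        return max(x, 0.0)

class Identity:
    """Toy linear/identity activation."""
    def __call__(self, x: float) -> float:
        return x

_ACTS = {'relu': ReLU, 'linear': Identity}

def lookup_act(act: str):
    """Map activation name to class, raising on unknown names."""
    try:
        return _ACTS[act]
    except KeyError:
        raise ValueError(f'Activation {act!r} not recognised')

act_cls = lookup_act('relu')  # returns the class itself
act = act_cls()               # instantiate before applying
print(act(-3.0), act(2.5))    # 0.0 2.5
```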

class lumin.nn.models.layers.activations.Swish(inplace=False)[source]

Bases: torch.nn.modules.module.Module

Non-trainable Swish activation function https://arxiv.org/abs/1710.05941

Parameters

inplace – whether to apply activation inplace

Examples::
>>> swish = Swish()
forward(x)[source]

Pass tensor through Swish function

Parameters

x (Tensor) – incoming tensor

Return type

Tensor

Returns

Resulting tensor
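Swish is defined in the linked paper as swish(x) = x · sigmoid(x). A self-contained numerical sketch of the element-wise computation (plain Python, no torch dependency, for illustration only):

```python
import math

def swish(x: float) -> float:
    """Swish activation: x * sigmoid(x) (Ramachandran et al., 2017)."""
    return x * (1.0 / (1.0 + math.exp(-x)))

# Swish is approximately linear for large positive x,
# approximately zero for large negative x, and exactly 0 at x = 0
print(swish(0.0))    # 0.0  (0 * sigmoid(0))
print(swish(10.0))   # ~9.9995
print(swish(-10.0))  # ~-0.00045
```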

lumin.nn.models.layers.mish module

This file contains code modified from https://github.com/digantamisra98/Mish, which is made available under the following MIT License:

Copyright (c) 2019 Diganta Misra

Permission is hereby granted, free of charge, to any person obtaining a copy of this software and associated documentation files (the “Software”), to deal in the Software without restriction, including without limitation the rights to use, copy, modify, merge, publish, distribute, sublicense, and/or sell copies of the Software, and to permit persons to whom the Software is furnished to do so, subject to the following conditions:

The above copyright notice and this permission notice shall be included in all copies or substantial portions of the Software.

THE SOFTWARE IS PROVIDED “AS IS”, WITHOUT WARRANTY OF ANY KIND, EXPRESS OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE SOFTWARE.

The Apache Licence 2.0 under which the majority of the rest of LUMIN is distributed does not apply to the code within this file.

class lumin.nn.models.layers.mish.Mish[source]

Bases: torch.nn.modules.module.Module

Applies the mish function element-wise: mish(x) = x * tanh(softplus(x)) = x * tanh(ln(1 + exp(x)))

Shape:

  • Input: (N, *) where * means any number of additional dimensions

  • Output: (N, *), same shape as the input

Examples

>>> m = Mish()
>>> input = torch.randn(2)
>>> output = m(input)
forward(input)[source]

Forward pass of the function: applies Mish element-wise to the incoming tensor.
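The two forms of the definition above are equivalent because softplus(x) = ln(1 + exp(x)). A plain-Python sketch of the element-wise computation, checking that equivalence numerically (illustrative only, no torch dependency):

```python
import math

def softplus(x: float) -> float:
    """Softplus: ln(1 + exp(x)), via log1p for numerical accuracy."""
    return math.log1p(math.exp(x))

def mish(x: float) -> float:
    """Mish activation: x * tanh(softplus(x)) (Misra, 2019)."""
    return x * math.tanh(softplus(x))

# the two written forms of mish agree at sample points
for x in (-2.0, 0.0, 3.0):
    assert math.isclose(mish(x), x * math.tanh(math.log(1.0 + math.exp(x))))

print(mish(0.0))  # 0.0
```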

Module contents
