
Commit d946745

Doc/fix activation functions for NBEATS and NHITS (#2759)
* doc add missing activation function
* doc shorten activation function text (now similar to nbeats, nhits)
* fix typo
* lint

Co-authored-by: Dennis Bader <[email protected]>
1 parent 5065f60 commit d946745

3 files changed: 7 additions and 4 deletions


darts/models/forecasting/nbeats.py

Lines changed: 2 additions & 1 deletion
@@ -609,7 +609,8 @@ def __init__(
             prediction time).
         activation
             The activation function of encoder/decoder intermediate layer (default='ReLU').
-            Supported activations: ['ReLU','RReLU', 'PReLU', 'Softplus', 'Tanh', 'SELU', 'LeakyReLU', 'Sigmoid']
+            Supported activations: ['ReLU', 'RReLU', 'PReLU', 'ELU', 'Softplus', 'Tanh', 'SELU', 'LeakyReLU', 'Sigmoid',
+            'GELU']
         **kwargs
             Optional arguments to initialize the pytorch_lightning.Module, pytorch_lightning.Trainer, and
             Darts' :class:`TorchForecastingModel`.

darts/models/forecasting/nhits.py

Lines changed: 2 additions & 1 deletion
@@ -543,7 +543,8 @@ def __init__(
             prediction time).
         activation
             The activation function of encoder/decoder intermediate layer (default='ReLU').
-            Supported activations: ['ReLU','RReLU', 'PReLU', 'Softplus', 'Tanh', 'SELU', 'LeakyReLU', 'Sigmoid']
+            Supported activations: ['ReLU', 'RReLU', 'PReLU', 'ELU', 'Softplus', 'Tanh', 'SELU', 'LeakyReLU', 'Sigmoid',
+            'GELU']
         MaxPool1d
             Use MaxPool1d pooling. False uses AvgPool1d
         **kwargs

darts/models/forecasting/tsmixer_model.py

Lines changed: 3 additions & 2 deletions
@@ -581,8 +581,9 @@ def __init__(
         num_blocks
             The number of mixer blocks in the model. The number includes the first block and all subsequent blocks.
         activation
-            The name of the activation function to use in the mixer layers. Default: `"ReLU"`. Must be one of
-            `"ReLU", "RReLU", "PReLU", "ELU", "Softplus", "Tanh", "SELU", "LeakyReLU", "Sigmoid", "GELU"`.
+            The activation function to use in the mixer layers (default='ReLU').
+            Supported activations: ['ReLU', 'RReLU', 'PReLU', 'ELU', 'Softplus', 'Tanh', 'SELU', 'LeakyReLU', 'Sigmoid',
+            'GELU']
         dropout
             Fraction of neurons affected by dropout. This is compatible with Monte Carlo dropout at inference time
             for model uncertainty estimation (enabled with ``mc_dropout=True`` at prediction time).
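
For context, the activation documented in these docstrings is selected by name when the model is constructed. A minimal sketch, assuming darts is installed; the toy series, chunk lengths, and epoch count are illustrative values, not part of the commit:

    from darts.models import NBEATSModel
    from darts.utils.timeseries_generation import sine_timeseries

    # Toy series purely for illustration.
    series = sine_timeseries(length=100)

    # Pick one of the supported activations by name; 'GELU' is among the
    # options added to the docstrings here. NHiTSModel and TSMixerModel
    # accept the same keyword.
    model = NBEATSModel(
        input_chunk_length=24,
        output_chunk_length=12,
        activation="GELU",
        n_epochs=1,
    )
    model.fit(series)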
