Beginner · 3 min read

How to use ReLU in PyTorch

Quick answer
Use the torch.nn.ReLU module or the functional API torch.nn.functional.relu to apply the ReLU activation in PyTorch. Instantiate torch.nn.ReLU() as a layer or call torch.nn.functional.relu(tensor) directly on tensors.

Prerequisites

  • Python 3.8+
  • pip install "torch>=2.0" (quote the requirement so the shell doesn't treat >= as a redirection)

Setup

Install PyTorch if you haven't already, using the official command from the PyTorch installation guide. For CPU-only:

bash
pip install torch torchvision torchaudio

Step by step

Here is a complete example showing how to use torch.nn.ReLU as a layer and torch.nn.functional.relu as a function on a tensor.

python
import torch
import torch.nn as nn
import torch.nn.functional as F

# Create a sample tensor with negative and positive values
x = torch.tensor([-1.0, 0.0, 1.0, 2.0, -0.5])

# Using torch.nn.ReLU as a layer
relu_layer = nn.ReLU()
y_layer = relu_layer(x)
print('Output using nn.ReLU layer:', y_layer)

# Using torch.nn.functional.relu as a function

y_function = F.relu(x)
print('Output using functional.relu:', y_function)
output
Output using nn.ReLU layer: tensor([0., 0., 1., 2., 0.])
Output using functional.relu: tensor([0., 0., 1., 2., 0.])
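Under the hood, ReLU is just the elementwise operation max(0, x). If it helps intuition, the same result can be reproduced with torch.clamp (a minimal sketch, not how PyTorch implements it internally):

```python
import torch
import torch.nn.functional as F

x = torch.tensor([-1.0, 0.0, 1.0, 2.0, -0.5])

# ReLU is elementwise max(0, x): clamping at zero gives an identical tensor
assert torch.equal(F.relu(x), x.clamp(min=0))
print(x.clamp(min=0))  # tensor([0., 0., 1., 2., 0.])
```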

Common variations

You can pass inplace=True (nn.ReLU(inplace=True)) to save memory: the activation overwrites the input tensor instead of allocating a new one. ReLU is also commonly used inside custom nn.Module models as an activation layer.

python
class SimpleNN(nn.Module):
    def __init__(self):
        super().__init__()
        self.linear = nn.Linear(3, 3)
        self.relu = nn.ReLU(inplace=True)

    def forward(self, x):
        x = self.linear(x)
        x = self.relu(x)
        return x

model = SimpleNN()
input_tensor = torch.tensor([[1.0, -1.0, 0.0]])
output = model(input_tensor)
print('Model output:', output)
output
Model output: tensor([[0.0000, 0.0000, 0.0000]], grad_fn=<ReluBackward0>)

Note that the exact values depend on the random weight initialization; in this run every pre-activation happened to be negative, so ReLU zeroed them all.
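ReLU also drops straight into nn.Sequential, which avoids writing a custom forward method. The sketch below builds the same Linear-then-ReLU stack as SimpleNN above:

```python
import torch
import torch.nn as nn

# Same architecture as SimpleNN, expressed with nn.Sequential
model = nn.Sequential(
    nn.Linear(3, 3),
    nn.ReLU(),
)

out = model(torch.tensor([[1.0, -1.0, 0.0]]))
print(out.shape)  # torch.Size([1, 3])

# Whatever the random weights are, ReLU guarantees non-negative outputs
assert (out >= 0).all()
```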

Troubleshooting

  • If your output tensor still contains negative values, ReLU was most likely never applied (or was applied to a different tensor); double-check the layer or function call and that you use its return value.
  • Using inplace=True can cause errors if the input is needed elsewhere; remove it if you see runtime errors.
  • ReLU itself is elementwise and works on tensors of any shape; shape mismatch errors usually come from the surrounding layers (e.g., an nn.Linear expecting a different input size), so check those first.

Key Takeaways

  • Use torch.nn.ReLU() as a layer or torch.nn.functional.relu() as a function to apply ReLU activation.
  • The inplace=True option modifies tensors in place to save memory but can cause errors if misused.
  • ReLU zeroes out negative values and passes positive values unchanged, making it simple and efficient.
  • Integrate ReLU inside custom nn.Module models for neural network activations.
  • Always verify tensor shapes and data types before applying activation functions to avoid runtime errors.