What the Underscore Suffix Means in PyTorch Function Names

It indicates in-place operations.

Introduction

When working with PyTorch, you'll often see functions with a trailing underscore _ in their names.

  • kaiming_normal_
  • add_
  • and so on

I didn’t know what it meant at first, so here’s a quick note for future reference.

Note: This article was translated from my original post.

Underscore "_" Suffix in PyTorch Function Names

Meaning

A trailing underscore _ in a PyTorch function name means it's an in-place operation.

In-place operations directly modify the original data.

In-place operations: Operations that have a _ suffix are in-place. For example: x.copy_(y) and x.t_() will change x.

Ref. Tensors — PyTorch Tutorials 2.7.0+cu126 documentation
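As a quick illustration of the two examples in that quote, here's a minimal sketch of copy_ and t_ (the tensor values below are just made up for demonstration):

```python
import torch

x = torch.zeros(2, 2)
y = torch.tensor([[1., 2.], [3., 4.]])

x.copy_(y)  # copies y's values into x, in place
print(x)    # tensor([[1., 2.], [3., 4.]])

x.t_()      # transposes x in place
print(x)    # tensor([[1., 3.], [2., 4.]])
```

Neither call allocates a new result tensor; both mutate x directly.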

In-place operations can save memory, but because they overwrite a tensor's computation history, they can cause problems with autograd. Their use is generally discouraged.

In-place operations save some memory, but can be problematic when computing derivatives because of an immediate loss of history. Hence, their use is discouraged.

Ref. Tensors — PyTorch Tutorials 2.7.0+cu126 documentation
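To see the kind of autograd problem the documentation warns about, here's a minimal sketch (my own example, not from the docs): exp's backward pass reuses its output tensor, so modifying that output in place makes the subsequent backward call raise a RuntimeError.

```python
import torch

x = torch.ones(3, requires_grad=True)
y = x.exp()  # exp's backward pass needs the saved output y

y.add_(1)    # in-place modification invalidates the saved value

try:
    y.sum().backward()
except RuntimeError as e:
    # autograd detects the mutation via the tensor's version counter
    print("RuntimeError:", e)
```

This is why out-of-place operations are the safer default whenever gradients are involved.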

Example

Here’s a simple example showing the difference between add and add_.

First, the add example:

import torch

x = torch.tensor([1, 2, 3])
print("x: ", x)
y = x.add(1)
print("y: ", y)
print("x: ", x)

### Output ###
# x:  tensor([1, 2, 3])
# y:  tensor([2, 3, 4])
# x:  tensor([1, 2, 3])

The original tensor x remains unchanged after using add.

Now with add_:

x_ = torch.tensor([1, 2, 3])
print("x_: ", x_)
y_ = x_.add_(1)
print("y_: ", y_)
print("x_: ", x_)

### Output ###
# x_:  tensor([1, 2, 3])
# y_:  tensor([2, 3, 4])
# x_:  tensor([2, 3, 4])

You can see that x_ itself has been incremented.

This is an in-place operation.
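One related detail worth noting (my own observation, not from the docs quoted above): in-place methods return the tensor they modified, so the returned value is the same object as the original, not a copy.

```python
import torch

x = torch.tensor([1, 2, 3])
y = x.add_(1)   # in-place ops return the modified tensor itself

print(y is x)   # True: y and x are the same object
print(x)        # tensor([2, 3, 4])
```

This is also why in-place calls can be chained, e.g. x.add_(1).mul_(2).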

Conclusion

That’s a quick memo on what the trailing underscore _ in PyTorch function names means.

Hope it’s helpful to someone!
