All Tensors that have requires_grad set to False are leaf Tensors by convention.

Tensors that have requires_grad set to True are leaf Tensors if they were created by the user. This means they are not the result of an operation, so their grad_fn is None.
Only leaf Tensors will have their grad populated during a call to backward(). To get grad populated for non-leaf Tensors, you can use retain_grad().
Example:
>>> a = torch.rand(10, requires_grad=True)
>>> a.is_leaf
True
>>> b = torch.rand(10, requires_grad=True).cuda()
>>> b.is_leaf
False
# b was created by the operation that cast a cpu Tensor into a cuda Tensor
>>> c = torch.rand(10, requires_grad=True) + 2
>>> c.is_leaf
False
# c was created by the addition operation
>>> d = torch.rand(10).cuda()
>>> d.is_leaf
True
# d does not require gradients and so has no operation creating it
# (that is tracked by the autograd engine)
>>> e = torch.rand(10).cuda().requires_grad_()
>>> e.is_leaf
True
# e requires gradients and has no operations creating it
>>> f = torch.rand(10, requires_grad=True, device="cuda")
>>> f.is_leaf
True
# f requires grad, has no operation creating it
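A minimal sketch of how retain_grad() makes the grad of a non-leaf Tensor available after backward() (variable names are illustrative only):

>>> a = torch.rand(10, requires_grad=True)   # leaf
>>> b = a * 2                                # non-leaf, created by an operation
>>> b.retain_grad()                          # ask autograd to keep b's grad
>>> b.sum().backward()
>>> a.grad is None                           # leaf: grad is populated
False
>>> b.grad is None                           # populated only because retain_grad() was called
False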