
PyTorch - Gradient

Gradient


Example 1: Calculate Derivatives

[Figure: the function y = 8x^4 + 3x^3 + 7x^2 + 6x + 3 and its derivative dy/dx = 32x^3 + 9x^2 + 14x + 6, which evaluates to 326 at x = 2.]

import torch
# requires_grad=True: tell autograd to track operations on this tensor so that gradients can be computed.
x = torch.tensor(2.0, requires_grad=True)
print(x)        # tensor(2., requires_grad=True)
y = 8*x**4+3*x**3+7*x**2+6*x+3      # define function
# compute the derivative of the function
y.backward()
# .grad: tensor attribute that holds the computed gradient; calls to backward() accumulate into it.
print(x.grad)  # tensor(326.)
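Because `backward()` accumulates into `.grad` rather than overwriting it, repeating the computation above without clearing the gradient doubles the stored value. A minimal sketch of this behavior, reusing the same polynomial:

```python
import torch

x = torch.tensor(2.0, requires_grad=True)

y = 8*x**4 + 3*x**3 + 7*x**2 + 6*x + 3
y.backward()
print(x.grad)       # tensor(326.)

# Running the computation again accumulates onto the old gradient.
y = 8*x**4 + 3*x**3 + 7*x**2 + 6*x + 3
y.backward()
print(x.grad)       # tensor(652.) -- 326 + 326

# Clear the accumulated gradient in place before the next backward pass.
x.grad.zero_()
y = 8*x**4 + 3*x**3 + 7*x**2 + 6*x + 3
y.backward()
print(x.grad)       # tensor(326.)
```

This is why training loops conventionally call `optimizer.zero_grad()` (or zero the `.grad` tensors directly) before each backward pass.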

Example 2: Calculate Partial Derivatives

import torch
x = torch.tensor(2.0, requires_grad=True)
z = torch.tensor(4.0, requires_grad=True)

# define function
y = x**2+z**3

y.backward()    # compute the partial derivatives dy/dx and dy/dz
print(x.grad)   # tensor(4.)
print(z.grad)   # tensor(48.)
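For a one-off computation of partial derivatives, `torch.autograd.grad` returns the gradients directly instead of accumulating them into `.grad`. A sketch with the same function:

```python
import torch

x = torch.tensor(2.0, requires_grad=True)
z = torch.tensor(4.0, requires_grad=True)

y = x**2 + z**3

# Returns a tuple of gradients, one per input tensor; .grad is left untouched.
dx, dz = torch.autograd.grad(y, (x, z))
print(dx)   # tensor(4.)   dy/dx = 2x  at x = 2
print(dz)   # tensor(48.)  dy/dz = 3z^2 at z = 4
```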
