Pytorch-PCGrad

PyTorch Implementation of "Gradient Surgery for Multi-Task Learning" using multiprocessing

Usage

import torch
import torch.nn as nn
import torch.optim as optim
from ppcgrad import PPCGrad

# wrap your favorite optimizer
optimizer = PPCGrad(optim.Adam(net.parameters())) 
losses = [...] # a list of per-task losses
assert len(losses) == num_tasks
optimizer.pc_backward(losses) # compute per-task gradients and apply the PCGrad modification
optimizer.step()  # apply gradient step
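For intuition, the core idea of gradient surgery (PCGrad) can be sketched directly: whenever two task gradients conflict (negative dot product), the conflicting component is projected out before the gradients are combined. The helper below is a simplified, hypothetical illustration, not the library's internal API; the actual implementation iterates over tasks in random order and operates on the model's parameter gradients.

```python
import torch

def pcgrad_project(grads):
    """Simplified PCGrad sketch (hypothetical helper, not the PPCGrad API):
    project each task gradient onto the normal plane of any other task
    gradient it conflicts with, then sum the results."""
    projected = []
    for i, g_i in enumerate(grads):
        g = g_i.clone()
        for j, g_j in enumerate(grads):
            if i == j:
                continue
            dot = torch.dot(g, g_j)
            if dot < 0:  # gradients conflict: remove the conflicting component
                g = g - (dot / g_j.norm() ** 2) * g_j
            # non-conflicting gradients are left unchanged
        projected.append(g)
    # the combined update is the sum of the projected gradients
    return torch.stack(projected).sum(dim=0)

# two conflicting task gradients: g1 . g2 = -1 < 0
g1 = torch.tensor([1.0, 0.0])
g2 = torch.tensor([-1.0, 1.0])
update = pcgrad_project([g1, g2])
```

After projection, each task's gradient is orthogonal to the gradients it conflicted with, so the summed update no longer lets one task's descent direction directly undo another's.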


Apache License 2.0
Created May 17, 2025
Updated May 17, 2025