ddpw 5.2.1

Creator: bigcodingguy24

Description:

DDPW

Overview

Distributed Data Parallel Wrapper (DDPW) is a lightweight wrapper that scaffolds PyTorch's (Distributed Data) Parallel. This code is written in Python 3.10. The DDPW documentation contains details on how to use this package.

Installation
conda install -c tvsujal ddpw # with conda
pip install ddpw # with pip from PyPI
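
Either command should make the package importable. As a quick, unofficial sanity check (not part of the project's documented instructions), the two entry points used in the example below can be imported directly:

# minimal sanity check; assumes the installation above succeeded
from ddpw import Platform, Wrapper
print(Platform, Wrapper)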

Usage
from ddpw import Platform, Wrapper

# some task
def task(global_rank, local_rank, group, args):
    print(f'This is GPU {global_rank}(G)/{local_rank}(L); args = {args}')

# platform (e.g., 4 GPUs)
platform = Platform(device='gpu', n_gpus=4)

# wrapper
wrapper = Wrapper(platform=platform)

# start
wrapper.start(task, ('example',))
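
The task above only prints its ranks. As a slightly fuller, hypothetical sketch (the name train, the toy linear model, and the learning rate are illustrative, and it is assumed here that the wrapper sets up the process group it passes to the task before invoking it), a training-style task might place its model on the GPU given by local_rank and hand the received group to PyTorch's DistributedDataParallel:

import torch
from torch.nn.parallel import DistributedDataParallel as DDP

from ddpw import Platform, Wrapper

def train(global_rank, local_rank, group, args):
    # pin this process to its own GPU
    device = torch.device(f'cuda:{local_rank}')

    # toy model; DDP synchronises gradients across processes
    model = torch.nn.Linear(8, 2).to(device)
    model = DDP(model, device_ids=[local_rank], process_group=group)

    optimiser = torch.optim.SGD(model.parameters(), lr=1e-3)

    # dummy batch on this device, standing in for a real dataloader
    x = torch.randn(16, 8, device=device)
    loss = model(x).sum()
    loss.backward()          # gradient all-reduce happens here
    optimiser.step()

    if global_rank == 0:     # log from the main process only
        print(f'Step done; loss = {loss.item():.4f}; args = {args}')

if __name__ == '__main__':   # guard recommended for spawn-based launchers
    platform = Platform(device='gpu', n_gpus=4)
    Wrapper(platform=platform).start(train, ('example',))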

License

For personal and professional use. You cannot resell or redistribute these repositories in their original state.
