What is the batch version of torch.randperm?

Random permutation can be used in a variety of applications. For example, it can shuffle datasets during training or generate random sequences for algorithms. PyTorch’s torch.randperm can generate a random integer permutation. However, when dealing with batch data or sequences, we need to be able to generate a batch of random permutations at once.

Let’s look at how torch.randperm works and why batched permutations are needed.

The torch.randperm function

In PyTorch, torch.randperm is a function that produces a random permutation of the integers from 0 to n-1, where n is the input argument provided to the function. The function shuffles the elements and returns a 1-D tensor representing the random permutation. (In PyTorch, a tensor is a multi-dimensional matrix containing elements of a single data type. Tensors are the fundamental building blocks of PyTorch and are used for representing and manipulating data.)

Here is an example:

import torch
# Produces a random permutation of 20 integers
perm = torch.randperm(20)
print(perm)

In the example above, perm contains the integers from 0 to 19 in random order.
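As mentioned earlier, one common use of such a permutation is shuffling a dataset: indexing a tensor with perm reorders its rows. A minimal sketch (the data tensor below is a made-up placeholder):

```python
import torch

# Hypothetical dataset: 5 samples with 2 features each
data = torch.arange(10).reshape(5, 2)

# Random order of the 5 row indices
perm = torch.randperm(5)

# Indexing with the permutation shuffles the rows
shuffled = data[perm]
print(shuffled)
```

Because perm contains every index exactly once, shuffled holds the same rows as data, just in a random order.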

Point to Ponder

Question

What is the difference between a random permutation and a random integer generator?

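One way to see the difference concretely: torch.randperm returns each integer exactly once, while a random integer generator such as torch.randint draws values independently, so duplicates can occur. A small sketch:

```python
import torch

# A permutation contains each value exactly once, in random order
perm = torch.randperm(5)
print(perm.sort().values)        # always tensor([0, 1, 2, 3, 4])

# Random integers are drawn independently, so values may repeat
rand_ints = torch.randint(0, 5, (5,))
print(rand_ints)                 # e.g. tensor([3, 3, 0, 4, 1]) — duplicates possible
```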

The necessity of batched permutations

In many scenarios, especially in machine learning, a single permutation might not be sufficient. Consider situations where we have batched data, and we need a distinct random order for each batch element. To address this need, PyTorch provides an elegant solution using tensor operations.

Generating batched permutations

To generate batched permutations, we can combine a list comprehension with tensor stacking. Here is an illustrative example:

import torch
batch_size = 4 # Size of the batch
length = 20 # Length of the permutation sequence
# Produce a batch of random permutations
batch_perm = torch.stack([torch.randperm(length) for _ in range(batch_size)])
print(batch_perm)

In this example, a list comprehension creates a list of random permutations, one for each batch element. The torch.stack function then concatenates these permutations along a new dimension, producing a tensor of shape (batch_size, length), i.e., (4, 20) in our case.

This highly flexible approach can be easily adapted to different batch sizes and sequence lengths. Simply adjust the batch_size and length variables according to the specific requirements.
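If the Python-level loop in the list comprehension becomes a bottleneck for large batches, the same result can be obtained in a fully vectorized way by sorting random values. Note that this argsort trick is a common idiom rather than a dedicated PyTorch API for batched permutations:

```python
import torch

batch_size = 4   # Size of the batch
length = 20      # Length of the permutation sequence

# Draw uniform random values and argsort each row: the resulting
# indices of each row form an independent random permutation
batch_perm = torch.rand(batch_size, length).argsort(dim=1)
print(batch_perm.shape)  # torch.Size([4, 20])
```

Each row of batch_perm is a valid permutation of 0 to length-1, and the whole batch is produced in a single tensor operation.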


Copyright ©2025 Educative, Inc. All rights reserved