Transforms

Affine

Affine transforms implemented on torch tensors, requiring only a single interpolation step.

class pywick.transforms.affine_transforms.Affine(tform_matrix, interp='bilinear')[source]
class pywick.transforms.affine_transforms.AffineCompose(transforms, interp='bilinear')[source]
class pywick.transforms.affine_transforms.RandomAffine(rotation_range=None, translation_range=None, shear_range=None, zoom_range=None, interp='bilinear', lazy=False)[source]
class pywick.transforms.affine_transforms.RandomChoiceRotate(values, p=None, interp='bilinear', lazy=False)[source]
class pywick.transforms.affine_transforms.RandomChoiceShear(values, p=None, interp='bilinear', lazy=False)[source]
class pywick.transforms.affine_transforms.RandomChoiceTranslate(values, p=None, interp='bilinear', lazy=False)[source]
class pywick.transforms.affine_transforms.RandomChoiceZoom(values, p=None, interp='bilinear', lazy=False)[source]
class pywick.transforms.affine_transforms.RandomRotate(rotation_range, interp='bilinear', lazy=False)[source]
class pywick.transforms.affine_transforms.RandomShear(shear_range, interp='bilinear', lazy=False)[source]
class pywick.transforms.affine_transforms.RandomSquareZoom(zoom_range, interp='bilinear', lazy=False)[source]
class pywick.transforms.affine_transforms.RandomTranslate(translation_range, interp='bilinear', lazy=False)[source]
class pywick.transforms.affine_transforms.RandomZoom(zoom_range, interp='bilinear', lazy=False)[source]
class pywick.transforms.affine_transforms.Rotate(value, interp='bilinear', lazy=False)[source]
class pywick.transforms.affine_transforms.Shear(value, interp='bilinear', lazy=False)[source]
class pywick.transforms.affine_transforms.Translate(value, interp='bilinear', lazy=False)[source]
class pywick.transforms.affine_transforms.Zoom(value, interp='bilinear', lazy=False)[source]
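
A minimal usage sketch (assumes a CHW torch tensor input; the rotation and translation ranges below are arbitrary example values):
>>> import torch
>>> from pywick.transforms.affine_transforms import AffineCompose, RandomRotate, RandomTranslate
>>> x = torch.rand(3, 64, 64)                       # CHW image tensor
>>> tform = AffineCompose([RandomRotate(30),        # rotate by up to +/- 30 degrees
                           RandomTranslate(0.1)])   # shift by up to 10% of height/width
>>> x_aug = tform(x)                                # composed matrix -> one interpolation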

Distortion

Transforms to distort local or global information of an image

class pywick.transforms.distortion_transforms.Blur(threshold, order=5)[source]

Blurs an image using a Butterworth filter whose frequency cutoff matches the local block size

class pywick.transforms.distortion_transforms.RandomChoiceBlur(thresholds, order=5)[source]
class pywick.transforms.distortion_transforms.RandomChoiceScramble(blocksizes)[source]
class pywick.transforms.distortion_transforms.Scramble(blocksize)[source]

Create blocks of an image and scramble them
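
A minimal usage sketch (assumes a CHW torch tensor input; the threshold and blocksize values are arbitrary examples, so check the expected intensity range of your inputs):
>>> import torch
>>> from pywick.transforms.distortion_transforms import Blur, Scramble
>>> x = torch.rand(1, 128, 128)               # CHW tensor
>>> x_blur = Blur(threshold=0.1)(x)           # Butterworth low-pass blur
>>> x_scr = Scramble(blocksize=16)(x)         # shuffle 16x16 blocks of the image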

Image

Transforms specific to images, such as color, lighting, contrast, and brightness adjustments

NOTE: Most of these transforms assume the input is a torch tensor (NOT numpy or PIL) with intensities between 0 and 1

class pywick.transforms.image_transforms.Brightness(value)[source]
class pywick.transforms.image_transforms.Contrast(value)[source]
class pywick.transforms.image_transforms.DeNormalize(mean, std)[source]

Denormalizes a tensor using the provided mean and std
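
A minimal sketch (the per-channel mean/std values below are arbitrary examples, not values prescribed by the library):
>>> import torch
>>> from pywick.transforms.image_transforms import DeNormalize
>>> x_norm = torch.rand(3, 8, 8)                  # previously normalized CHW tensor
>>> denorm = DeNormalize(mean=[0.485, 0.456, 0.406],
                         std=[0.229, 0.224, 0.225])
>>> x = denorm(x_norm)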

class pywick.transforms.image_transforms.Gamma(value)[source]
class pywick.transforms.image_transforms.Grayscale(keep_channels=False)[source]
class pywick.transforms.image_transforms.MaskPixelsToMap(value_map: dict = None)[source]

Replaces pixel values in the range [0, 255] with class values from the supplied value_map.

Returns: numpy.ndarray with dtype=np.uint8
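
A minimal sketch (the value_map below is a hypothetical mapping from raw mask pixel values to class indices; assumes the mask is supplied as a numpy array):
>>> import numpy as np
>>> from pywick.transforms.image_transforms import MaskPixelsToMap
>>> mask = np.array([[0, 128, 255]], dtype=np.uint8)
>>> tform = MaskPixelsToMap(value_map={0: 0, 128: 1, 255: 2})
>>> tform(mask)                     # -> array of class indices with dtype=np.uint8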

class pywick.transforms.image_transforms.MaskToFloatTensor(divisor: float = None)[source]

Converts a PIL, numpy or CV image to a torch.float32 representation

class pywick.transforms.image_transforms.MaskToSqueezedTensor[source]

Removes empty dimensions from the mask and converts to a torch.float32 tensor. Typically used with B/W masks to remove the “channel” dimension.

Returns: tensor

class pywick.transforms.image_transforms.MaskToTensor[source]

Converts a PIL, numpy or CV image to a torch.long representation

class pywick.transforms.image_transforms.RandomBrightness(min_val, max_val)[source]
class pywick.transforms.image_transforms.RandomChoiceBrightness(values, p=None)[source]
class pywick.transforms.image_transforms.RandomChoiceContrast(values, p=None)[source]
class pywick.transforms.image_transforms.RandomChoiceGamma(values, p=None)[source]
class pywick.transforms.image_transforms.RandomChoiceSaturation(values, p=None)[source]
class pywick.transforms.image_transforms.RandomContrast(min_val, max_val)[source]
class pywick.transforms.image_transforms.RandomGamma(min_val, max_val)[source]
class pywick.transforms.image_transforms.RandomGrayscale(p=0.5)[source]
class pywick.transforms.image_transforms.RandomSaturation(min_val, max_val)[source]
class pywick.transforms.image_transforms.Saturation(value)[source]
pywick.transforms.image_transforms.rgb_to_hsv(x)[source]

Convert from RGB to HSV
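
A minimal usage sketch following the note above (assumes a CHW torch tensor with intensities in [0, 1]; the specific adjustment values are arbitrary examples):
>>> import torch
>>> from pywick.transforms.image_transforms import Brightness, Saturation, RandomGamma
>>> x = torch.rand(3, 64, 64)            # CHW tensor with intensities in [0, 1]
>>> x = Brightness(0.2)(x)               # brighten
>>> x = Saturation(-0.3)(x)              # desaturate
>>> x = RandomGamma(0.8, 1.2)(x)         # gamma sampled from [0.8, 1.2]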

Tensor

class pywick.transforms.tensor_transforms.AddChannel(axis=0)[source]

Adds a dummy channel to an image (also known as expanding an axis or unsqueezing a dim). For example, an image of size (28, 28) becomes (1, 28, 28).

Parameters:axis – (int): dimension to be expanded to a singleton
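
For example (a minimal sketch):
>>> import torch
>>> from pywick.transforms.tensor_transforms import AddChannel
>>> x = torch.rand(28, 28)
>>> AddChannel(axis=0)(x).size()     # -> torch.Size([1, 28, 28])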

pywick.transforms.tensor_transforms.CDHW

alias of pywick.transforms.tensor_transforms.ChannelsFirst

pywick.transforms.tensor_transforms.CHW

alias of pywick.transforms.tensor_transforms.ChannelsFirst

class pywick.transforms.tensor_transforms.ChannelsFirst(safe_check=False)[source]

Transposes a tensor so that the channel dim is first. CHW and CDHW are aliases for this transform.

Parameters:safe_check – (bool): if true, will check if channels are already first and, if so, will just return the inputs
class pywick.transforms.tensor_transforms.ChannelsLast(safe_check=False)[source]

Transposes a tensor so that the channel dim is last. HWC and DHWC are aliases for this transform.

Parameters:safe_check – (bool): if true, will check if channels are already last and, if so, will just return the inputs
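
A minimal sketch covering both directions (assumes an HWC torch tensor input):
>>> import torch
>>> from pywick.transforms.tensor_transforms import ChannelsFirst, ChannelsLast
>>> hwc = torch.rand(64, 64, 3)
>>> chw = ChannelsFirst()(hwc)       # -> size (3, 64, 64)
>>> hwc_again = ChannelsLast()(chw)  # -> size (64, 64, 3)
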
class pywick.transforms.tensor_transforms.Compose(transforms)[source]

Composes (chains) several transforms together.

Parameters:transforms – (list of transforms) to apply sequentially
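
A minimal sketch chaining a few of the transforms documented on this page (assumes a CHW torch tensor input; the crop size and flip settings are arbitrary examples):
>>> import torch
>>> from pywick.transforms.tensor_transforms import Compose, RandomFlip, RandomCrop, TypeCast
>>> tform = Compose([RandomFlip(h=True, v=False, p=0.5),
                     RandomCrop((32, 32)),
                     TypeCast('float')])
>>> x = torch.rand(3, 64, 64)
>>> x_aug = tform(x)
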
pywick.transforms.tensor_transforms.DHWC

alias of pywick.transforms.tensor_transforms.ChannelsLast

pywick.transforms.tensor_transforms.ExpandAxis

alias of pywick.transforms.tensor_transforms.AddChannel

pywick.transforms.tensor_transforms.HWC

alias of pywick.transforms.tensor_transforms.ChannelsLast

class pywick.transforms.tensor_transforms.Pad(size)[source]

Pads an image to the given size

Parameters:size – (tuple or list): target size of the padded image
class pywick.transforms.tensor_transforms.PadNumpy(size)[source]

Pads a Numpy image to the given size. Returns a Numpy image / image pair.

Parameters:size – (tuple or list): target size of the padded image
class pywick.transforms.tensor_transforms.RandomChoiceCompose(transforms)[source]

Randomly chooses one transform from a collection of transforms and applies it.

e.g. to randomly apply EITHER [0, 1] or [-1, 1] normalization to an input:
>>> transform = RandomChoiceCompose([RangeNormalize(0,1),
                                     RangeNormalize(-1,1)])
>>> x_norm = transform(x) # only one of the two normalizations is applied
Parameters:transforms – (list of transforms) to choose from at random
class pywick.transforms.tensor_transforms.RandomCrop(size)[source]

Randomly crop a torch tensor

Parameters:size – (tuple or list): dimensions of the crop
class pywick.transforms.tensor_transforms.RandomFlip(h=True, v=False, p=0.5)[source]

Randomly flip an image horizontally and/or vertically with some probability.

Parameters:
  • h – (bool): whether to horizontally flip w/ probability p
  • v – (bool): whether to vertically flip w/ probability p
  • p – (float between [0,1]): probability with which to apply allowed flipping operations
class pywick.transforms.tensor_transforms.RandomOrder[source]

Randomly permute the channels of an image

class pywick.transforms.tensor_transforms.RangeNormalize(min_val, max_val)[source]

Given min_val: (R, G, B) and max_val: (R,G,B), will normalize each channel of the th.*Tensor to the provided min and max values.

Works by calculating :

a = (max’-min’)/(max-min)

b = max’ - a * max

new_value = a * value + b

where min’ & max’ are given values, and min & max are observed min/max for each channel

Parameters:
  • min_val – (float or integer): Lower bound of normalized tensor
  • max_val – (float or integer): Upper bound of normalized tensor
Example:
>>> x = th.rand(3,5,5)
>>> rn = RangeNormalize((0,0,10),(1,1,11))
>>> x_norm = rn(x)
Also works with just one value for min/max:
>>> x = th.rand(3,5,5)
>>> rn = RangeNormalize(0,1)
>>> x_norm = rn(x)
class pywick.transforms.tensor_transforms.Slice2D(axis=0, reject_zeros=False)[source]

Take a random 2D slice from a 3D image along a given axis. This image should not have a 4th channel dim.

Parameters:
  • axis – (int in {0, 1, 2}): the axis on which to take slices
  • reject_zeros – (bool): whether to reject slices that are all zeros
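
A minimal sketch (assumes a 3D torch tensor with no channel dim, as noted above):
>>> import torch
>>> from pywick.transforms.tensor_transforms import Slice2D
>>> volume = torch.rand(32, 64, 64)        # 3D image without a channel dim
>>> img = Slice2D(axis=0)(volume)          # random 64x64 slice along axis 0
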
class pywick.transforms.tensor_transforms.SpecialCrop(size, crop_type=0)[source]

Performs a special crop: either one of the four corners or a center crop

Parameters:
  • size – (tuple or list): dimensions of the crop
  • crop_type – (int in {0,1,2,3,4}): 0 = center crop, 1 = top left crop, 2 = top right crop, 3 = bottom right crop, 4 = bottom left crop
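
A minimal sketch (assumes a CHW torch tensor input; the crop size is an arbitrary example):
>>> import torch
>>> from pywick.transforms.tensor_transforms import SpecialCrop
>>> x = torch.rand(3, 64, 64)
>>> center = SpecialCrop((32, 32), crop_type=0)(x)    # 32x32 center crop
>>> corner = SpecialCrop((32, 32), crop_type=1)(x)    # 32x32 top-left crop
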
class pywick.transforms.tensor_transforms.StdNormalize[source]

Normalize torch tensor to have zero mean and unit std deviation

class pywick.transforms.tensor_transforms.ToFile(root)[source]

Saves an image to file. Useful as a pass-through transform for observing how augmentation affects the data.

NOTE: Currently only supports saving to Numpy files

Parameters:root – (string): path to main directory in which images will be saved
class pywick.transforms.tensor_transforms.ToNumpyType(type)[source]

Converts an object to a specific numpy type (typically so it can then be passed to ToTensor())

Parameters:type – (one of {numpy.double, numpy.float, numpy.int64, numpy.int32, numpy.uint8})
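
A minimal sketch of the intended pattern (assumes a numpy-compatible input such as an existing numpy array):
>>> import numpy as np
>>> from pywick.transforms.tensor_transforms import ToNumpyType, ToTensor
>>> mask = np.array([[0, 255], [128, 64]])
>>> mask = ToNumpyType(np.uint8)(mask)     # cast to numpy uint8
>>> tensor = ToTensor()(mask)              # then convert to a torch.Tensor
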
class pywick.transforms.tensor_transforms.ToTensor[source]

Converts a numpy array to torch.Tensor

class pywick.transforms.tensor_transforms.Transpose(dim1, dim2)[source]

Swaps two dimensions of a tensor

Parameters:
  • dim1 – (int): first dim to switch
  • dim2 – (int): second dim to switch
class pywick.transforms.tensor_transforms.TypeCast(dtype='float')[source]

Casts a torch.Tensor to a different type.

Parameters:dtype – (string, torch.*Tensor literal, or list of such): data type to which the input(s) will be cast. If a list, it should be the same length as the inputs.
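
A minimal sketch (the multi-input call follows the per-input dtype behavior described above; the dtypes shown are arbitrary examples):
>>> import torch
>>> from pywick.transforms.tensor_transforms import TypeCast
>>> x = torch.randint(0, 255, (3, 8, 8))
>>> x_float = TypeCast('float')(x)                        # cast a single input
>>> a, b = TypeCast(['float', 'long'])(x, x.clone())      # one dtype per input
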
pywick.transforms.tensor_transforms.Unsqueeze

alias of pywick.transforms.tensor_transforms.AddChannel