Deep Learning with PyTorch

Data preparation

PyTorch provides two kinds of data abstractions, called tensors and variables. Tensors are similar to numpy arrays, but they can also be used on GPUs, which provides increased performance, and they offer easy methods for switching between CPUs and GPUs; for certain operations, we can notice a clear boost in performance when running on a GPU. Machine learning algorithms can understand different forms of data only when they are represented as tensors of numbers. Tensors are like Python arrays in that they hold collections of values indexed by position. For example, images can be represented as three-dimensional arrays (height, width, channels (RGB)). It is common in deep learning to use tensors of up to five dimensions. Some of the commonly used tensors are as follows:

  • Scalars (0-D tensors)
  • Vectors (1-D tensors)
  • Matrices (2-D tensors)
  • 3-D tensors
  • Slicing tensors
  • 4-D tensors
  • 5-D tensors
  • Tensors on GPU
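The items above can be illustrated with a short sketch. The snippet below builds tensors of each dimensionality using the standard PyTorch constructors (`torch.tensor`, `torch.ones`, `torch.rand`), slices them numpy-style, and moves one to the GPU when CUDA is available; the image and batch sizes (224 and 64) are arbitrary example values, not anything prescribed by the text:

```python
import torch

scalar = torch.tensor(3.14)            # 0-D tensor (scalar)
vector = torch.tensor([1.0, 2.0, 3.0]) # 1-D tensor (vector)
matrix = torch.ones(2, 3)              # 2-D tensor (matrix)

# 3-D tensor: one RGB image of height 224, width 224, 3 channels
image = torch.rand(224, 224, 3)

# 4-D tensor: a batch of 64 such images
batch = torch.rand(64, 224, 224, 3)

# 5-D tensor: e.g. a batch of 8 video clips, 16 frames each
video = torch.rand(8, 16, 224, 224, 3)

# Slicing works as it does with numpy arrays
first_row = matrix[0]          # shape (3,)
red_channel = image[:, :, 0]   # shape (224, 224)

# Tensors on GPU: move a tensor over only if CUDA is available
if torch.cuda.is_available():
    matrix = matrix.cuda()

print(scalar.dim(), vector.dim(), matrix.dim(),
      image.dim(), batch.dim(), video.dim())
```

Calling `.dim()` on each tensor confirms the number of dimensions (0 through 5), and `.shape` reports the size along each axis.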