
I know about CPU, GPU, and TPU, but this is the first time I have come across XPU, in the PyTorch documentation for Module.

xpu(device=None)


Moves all model parameters and buffers to the XPU.

This also makes associated parameters and buffers different objects. So it should be called before constructing optimizer if the module will live on XPU while being optimized.

Note: This method modifies the module in-place.

Parameters

    device (int, optional) – if specified, all parameters will be copied to that device

Returns

    self

Return type

    Module
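The note about constructing the optimizer after the move can be sketched as follows. This is a minimal illustration, not code from the docs: it falls back to CPU when no XPU-enabled build is present (`torch.xpu` only exists in newer PyTorch builds, hence the guard), since the ordering is what matters, not the device.

```python
def build_model_and_optimizer(lr=0.1):
    """Move the module first, then construct the optimizer."""
    import torch
    from torch import nn

    model = nn.Linear(4, 2)
    # On an XPU-enabled build you would call model.xpu() here; we fall
    # back to CPU so the sketch runs anywhere.
    if getattr(torch, "xpu", None) is not None and torch.xpu.is_available():
        model = model.xpu()
    # Only now build the optimizer, so it holds references to the moved
    # parameters rather than the stale pre-move objects.
    optimizer = torch.optim.SGD(model.parameters(), lr=lr)
    return model, optimizer
```

If the optimizer were created before the move, it would keep updating the old parameter tensors while the forward pass used the new ones.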

CPU stands for Central Processing Unit. GPU stands for Graphical Processing Unit. TPU stands for Tensor Processing Unit.

Most of us know that these processing units are highly useful in computationally intensive domains of AI, including deep learning. So I am wondering whether XPU is also useful in AI research, since it appears in PyTorch.

From the context, I can say that PU stands for processing unit. But I have no idea of what X is.

What is the full form for XPU? Where can I read about XPU in detail?

hanugm
    This blog post by Intel seems to be explaining it: https://medium.com/intel-tech/the-evolution-of-xpu-and-the-critical-role-of-software-c46970dfcbe9 – SpiderRico Jul 21 '21 at 16:36

1 Answer


XPU is a device abstraction for Intel heterogeneous computation architectures, which can be mapped to CPU, GPU, FPGA, and other accelerators. The "X" in XPU is like a variable, as in maths: set X=C and you get CPU acceleration, set X=G and you get GPU acceleration, and so on. That's the intuition behind the abstract name.
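For example, treating the "X" as a variable, a device-selection helper might look like this. This is a hedged sketch, not an official API: `torch.xpu` exists only in newer, XPU-enabled PyTorch builds, hence the guards.

```python
def pick_device():
    """Resolve 'X' to the best available backend: XPU, then CUDA, then CPU."""
    try:
        import torch
    except ImportError:
        return "cpu"  # no PyTorch installed; plain CPU is all we have
    # torch.xpu is only present in XPU-enabled builds, so probe defensively.
    if getattr(torch, "xpu", None) is not None and torch.xpu.is_available():
        return "xpu"
    if torch.cuda.is_available():
        return "cuda"
    return "cpu"
```

A model can then be moved with `model.to(pick_device())`, which is equivalent to `model.xpu()` when an XPU is present.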

In order to integrate a new accelerator, you need two things:

JVGD