
I've read a number of articles on how GPUs can speed up matrix algebra calculations, but I'm wondering how the calculations are performed when one uses various activation functions in a neural network.

If I use Sigmoid functions in my neural network, does the computer use the CPU for the Sigmoid calculation, and then the GPU for the subsequent matrix calculations?

Alternatively, is the GPU capable of doing nonlinear calculations in addition to the linear algebra calculations? If not, what about a simple activation function like ReLU? Can a GPU do the ReLU calculation, or does it defer to the CPU?
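For context, both ReLU and sigmoid are purely elementwise operations, which is exactly the kind of embarrassingly parallel work GPUs handle well. A minimal NumPy sketch (just to illustrate the math, not how a framework dispatches it):

```python
import numpy as np

def relu(x):
    # elementwise max(0, x): independent per element, so trivially parallel
    return np.maximum(x, 0.0)

def sigmoid(x):
    # elementwise 1 / (1 + exp(-x)); exp is also computed per element
    return 1.0 / (1.0 + np.exp(-x))

z = np.array([-2.0, 0.0, 3.0])
print(relu(z))        # [0. 0. 3.]
print(sigmoid(0.0))   # 0.5
```

Since each output element depends only on the corresponding input element, there is no data dependency that would force the work back onto the CPU.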

Specifically, I'm using Keras with a TensorFlow backend and would like to know what TensorFlow can and cannot use the GPU for, but I'm also interested in the general case.
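For what it's worth, TensorFlow can be asked to report where each op actually runs, which is how I've been trying to investigate this myself. A minimal sketch, assuming TensorFlow 2.x (on a machine without a GPU, everything will show up on the CPU):

```python
import tensorflow as tf

# Log the device (CPU or GPU) that each op is placed on
tf.debugging.set_log_device_placement(True)

x = tf.random.normal([4, 4])
# Both the matmul and the sigmoid are individual ops, each with its own placement
y = tf.sigmoid(tf.matmul(x, x))
print(y.device)
```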
