Consider the following paragraph from the Numerical Computation chapter of the deep learning book.
When $f'(x) = 0$, the derivative provides no information about which direction to move. Points where $f'(x) = 0$ are known as critical points, or stationary points. A local minimum is a point where $f(x)$ is lower than at all neighboring points, so it is no longer possible to decrease $f(x)$ by making infinitesimal steps. A local maximum is a point where $f(x)$ is higher than at all neighboring points, so it is not possible to increase $f(x)$ by making infinitesimal steps. Some critical points are neither maxima nor minima. These are known as saddle points.
In short, points where $f'(x) = 0$ are called critical points, or stationary points.
However, in standard mathematical terminology the definitions are as follows:
#1: Critical point
A function $y = f(x)$ has critical points at all points $x_0$ where $f'(x_0) = 0$ or where $f$ is not differentiable at $x_0$.
#2: Stationary point
A point $x_0$ at which the derivative of a function $f(x)$ vanishes, $f'(x_0)=0$. A stationary point may be a minimum, maximum, or inflection point.
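To make the second definition concrete (the following examples are mine, not from either source), all three possibilities already occur for simple polynomials at $x_0 = 0$:

$$f(x) = x^2 \ \text{(minimum at } 0\text{)}, \qquad f(x) = -x^2 \ \text{(maximum at } 0\text{)}, \qquad f(x) = x^3 \ \text{(inflection point at } 0\text{)},$$

and in each case $f'(0) = 0$, so $x_0 = 0$ is a stationary point. The $x^3$ case is the one-dimensional analogue of the saddle point mentioned in the book.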
Notice that the definition given in the deep learning book matches the definition of a stationary point exactly, since the only condition is $f'(x) = 0$. It is not an apt definition of a critical point, since a critical point can also be a point where $f'(x)$ does not exist.
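A standard example of this gap (again my own illustration, not taken from either source) is the absolute value function:

$$f(x) = |x|, \qquad f'(x) = \begin{cases} 1, & x > 0, \\ -1, & x < 0, \end{cases} \qquad f'(0) \ \text{does not exist},$$

so $x_0 = 0$ is a critical point, and in fact a local (even global) minimum, but it is not a stationary point.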
Is there a reason for using the terms critical point and stationary point interchangeably? Is there no need to address points where $f'(x)$ does not exist?