Originally, only sigmoidal activation functions (the logistic sigmoid and the hyperbolic tangent) were commonly used; more recently, the class of ramp functions has also been employed, which, in contrast to sigmoidal functions, is not bounded from above. Artificial neurons with the ramp activation function are called Rectified Linear Units (ReLUs) (KSH12a).
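For concreteness, the standard definitions of these activation functions, in their conventional forms (not stated explicitly above), are:

\[
\sigma(x) = \frac{1}{1 + e^{-x}}, \qquad
\tanh(x) = \frac{e^{x} - e^{-x}}{e^{x} + e^{-x}}, \qquad
\mathrm{ReLU}(x) = \max(0, x).
\]

The logistic sigmoid maps its input into the bounded interval $(0, 1)$ and the hyperbolic tangent into $(-1, 1)$, whereas the ramp function is bounded below by $0$ but grows without bound as $x \to \infty$, which is the sense in which it is unbounded.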