The choice of activation function is a key design decision in a neural network. The activation function used in the hidden layers controls how well the network learns from the training data, while the activation function in the output layer determines the kind of predictions the network can make.
Overview:
1. Activation Functions
2. Activation Functions for Hidden Layers
3. Activation Functions for Output Layers
Activation Functions
An activation function defines how the weighted sum of a node's inputs is transformed into that node's output.
A neural network typically consists of three types of layers: an input layer, hidden layers, and an output layer.
All hidden layers usually share the same activation function, while the output layer typically uses a different one. Activation functions are usually differentiable, so the network can be trained with gradient-based methods.
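To make the definition concrete, here is a minimal sketch of a single node: the weighted sum of the inputs is computed first, then passed through an activation function. The input values, weights, and bias are made up for illustration; none of them come from the original.
# minimal sketch of a single node: weighted sum, then activation
from math import exp
# sigmoid used as the example activation
def sigmoid(x):
    return 1.0 / (1.0 + exp(-x))
# hypothetical inputs, weights, and bias for illustration
inputs = [0.5, -1.2, 3.0]
weights = [0.4, 0.2, -0.1]
bias = 0.7
# weighted sum of the inputs plus the bias
z = sum(w * x for w, x in zip(weights, inputs)) + bias
# the activation function transforms the weighted sum into the node's output
output = sigmoid(z)
print(output)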
Activation Functions for Hidden Layers
Rectified Linear Activation (ReLU)
Logistic (Sigmoid)
Hyperbolic Tangent (Tanh)
ReLU: max(0.0, x)
# example plot for the rectified linear activation function
from matplotlib import pyplot
# rectified linear function
def rectified(x):
    return max(0.0, x)
# define input data
inputs = [x for x in range(-10, 10)]
# calculate outputs
outputs = [rectified(x) for x in inputs]
# plot inputs vs outputs
pyplot.plot(inputs, outputs)
pyplot.show()
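ReLU is the most widely used activation for hidden layers: it is cheap to compute and, unlike sigmoid and tanh, it does not saturate for positive inputs, which makes it less prone to vanishing gradients.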
Sigmoid: 1.0 / (1.0 + e^-x)
# example plot for the sigmoid activation function
from math import exp
from matplotlib import pyplot
# sigmoid activation function
def sigmoid(x):
    return 1.0 / (1.0 + exp(-x))
# define input data
inputs = [x for x in range(-10, 10)]
# calculate outputs
outputs = [sigmoid(x) for x in inputs]
# plot inputs vs outputs
pyplot.plot(inputs, outputs)
pyplot.show()
Tanh: (e^x - e^-x) / (e^x + e^-x)
# example plot for the tanh activation function
from math import exp
from matplotlib import pyplot
# tanh activation function
def tanh(x):
    return (exp(x) - exp(-x)) / (exp(x) + exp(-x))
# define input data
inputs = [x for x in range(-10, 10)]
# calculate outputs
outputs = [tanh(x) for x in inputs]
# plot inputs vs outputs
pyplot.plot(inputs, outputs)
pyplot.show()
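Note that Python's standard library already provides this function as math.tanh, which is also more numerically stable: exp(x) overflows for large x, while math.tanh simply returns values approaching 1.0.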
How to choose a hidden layer activation function: as a rule of thumb, use ReLU for Multilayer Perceptrons (MLPs) and Convolutional Neural Networks (CNNs), and Tanh and/or Sigmoid for Recurrent Neural Networks (RNNs).
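As an illustration of this rule of thumb, here is a minimal sketch of an MLP hidden layer. Keras is an assumption on my part; the original does not name a framework, and the layer width and input shape are made up.
# illustrative MLP hidden layer with ReLU (Keras assumed, not named in the original)
from tensorflow.keras.models import Sequential
from tensorflow.keras.layers import Dense
model = Sequential()
# hidden layer: ReLU is the usual default for an MLP
model.add(Dense(32, activation='relu', input_shape=(8,)))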
Activation Functions for Output Layers
Linear
Logistic (Sigmoid)
Softmax
The linear (identity) activation does not change the weighted sum at all; it simply returns the value it is given.
# example plot for the linear activation function
from matplotlib import pyplot
# linear activation function
def linear(x):
    return x
# define input data
inputs = [x for x in range(-10, 10)]
# calculate outputs
outputs = [linear(x) for x in inputs]
# plot inputs vs outputs
pyplot.plot(inputs, outputs)
pyplot.show()
Softmax outputs a vector of values that sum to 1.0 and can be interpreted as the probabilities of class membership.
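Unlike the functions above, softmax operates on a vector rather than a single value, so plotting it over a scalar range does not apply; here is a minimal sketch instead, with made-up input scores for illustration.
# example of the softmax activation function
from math import exp
# softmax activation function
def softmax(xs):
    # subtract the maximum score for numerical stability
    m = max(xs)
    exps = [exp(x - m) for x in xs]
    total = sum(exps)
    return [e / total for e in exps]
# define input data (one raw score per class)
inputs = [1.0, 3.0, 2.0]
# calculate outputs
outputs = softmax(inputs)
print(outputs)
print(sum(outputs))  # the probabilities sum to 1.0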
Sigmoid squashes its input into the range (0, 1), so a single sigmoid output node can be read as the probability of the positive class.
How to choose an output activation function:
Regression: One node, linear activation.
Binary Classification: One node, sigmoid activation.
Multiclass Classification: One node per class, softmax activation.
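As a sketch of how these rules translate into a model definition (again assuming Keras, which the original does not name, and assuming 10 classes for the multiclass case):
# illustrative output layers for each problem type (Keras assumed)
from tensorflow.keras.layers import Dense
# regression: one node, linear activation
regression_output = Dense(1, activation='linear')
# binary classification: one node, sigmoid activation
binary_output = Dense(1, activation='sigmoid')
# multiclass classification: one node per class (10 classes assumed), softmax activation
multiclass_output = Dense(10, activation='softmax')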