Title#
ReCA: A Parametric ReLU Composite Activation Function
Abstract#
Activation functions have been shown to significantly affect the performance of deep neural networks. While the Rectified Linear Unit (ReLU) remains the dominant choice in practice, the optimal activation function for deep neural networks remains an open research question. In this paper, we propose ReCA, a novel parametric activation function based on ReLU, which has been shown to outperform all baselines on state-of-the-art datasets across different complex neural network architectures.
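The abstract does not give ReCA's functional form. As a minimal illustrative sketch only (the composite form, the per-channel parameterization, and the names `alpha` and `beta` below are assumptions, not the paper's definition), a learnable ReLU-based composite activation could look like this in PyTorch:

```python
import torch
import torch.nn as nn

class ParametricReLUComposite(nn.Module):
    """Illustrative parametric ReLU composite (not the paper's ReCA definition).

    Combines ReLU applied to the input and to its negation with learnable
    per-channel coefficients, so the shape of the activation can be tuned
    during training. `alpha` and `beta` are hypothetical parameter names.
    """
    def __init__(self, num_channels: int):
        super().__init__()
        # One learnable coefficient pair per channel (assumption).
        self.alpha = nn.Parameter(torch.ones(num_channels))
        self.beta = nn.Parameter(torch.full((num_channels,), 0.1))

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # Broadcast the per-channel parameters over (N, C, H, W) inputs.
        a = self.alpha.view(1, -1, 1, 1)
        b = self.beta.view(1, -1, 1, 1)
        return a * torch.relu(x) - b * torch.relu(-x)

# Usage: a drop-in replacement for nn.ReLU inside a convolutional block.
act = ParametricReLUComposite(num_channels=64)
y = act(torch.randn(8, 64, 32, 32))
```

Because the parameters are ordinary `nn.Parameter`s, they are trained jointly with the network's weights by backpropagation, which is the usual mechanism behind parametric activation functions such as PReLU.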