Abstract
In this paper, we report a study on the learning ability of a Deterministic Boltzmann Machine (DBM) with neurons that have a non-monotonic activation function. We use an end-cut-off-type function with a threshold parameter θ as the non-monotonic function. Numerical simulations of learning nonlinear problems, such as the XOR problem and the ADD problem, show that the DBM network with non-monotonic neurons has higher learning ability than the network with monotonic neurons, and that the non-monotonic neural network exhibits a novel effect of adjusting the number of neurons. We have also designed an integrated circuit of the 2-3-1 DBM network. The use of non-monotonic neurons makes it possible to integrate a large-scale neural network because of the simple circuit design.
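The abstract does not give the exact functional form of the end-cut-off-type activation, so the following is only a minimal illustrative sketch: it assumes the neuron responds like a monotonic tanh unit inside the threshold region |u| ≤ θ and has its output cut back to zero once |u| exceeds θ, which makes the overall response non-monotonic. The names `end_cutoff_activation`, `theta`, and `beta` are hypothetical and not taken from the paper.

```python
import numpy as np

def end_cutoff_activation(u, theta=2.0, beta=1.0):
    """Sketch of a non-monotonic, end-cut-off-type activation (assumed form).

    Inside |u| <= theta the unit behaves like a monotonic tanh neuron;
    beyond the threshold theta the output is cut off to zero, so the
    response first rises and then drops, i.e. it is non-monotonic.
    """
    g = np.tanh(beta * u)            # monotonic core response
    inside = np.abs(u) <= theta      # end-cut-off region mask
    return np.where(inside, g, 0.0)  # output forced to 0 beyond the threshold

# Compare with a purely monotonic tanh neuron on a range of net inputs.
u = np.linspace(-4.0, 4.0, 9)
print(end_cutoff_activation(u, theta=2.0))
print(np.tanh(u))
```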
Original language | English |
---|---|
Pages | 2347-2350 |
Number of pages | 4 |
Publication status | Published - 1999 |
Event | International Joint Conference on Neural Networks (IJCNN'99) - Washington, DC, USA; Duration: 1999 Jul 10 → 1999 Jul 16 |
Other
Other | International Joint Conference on Neural Networks (IJCNN'99) |
---|---|
City | Washington, DC, USA |
Period | 99/7/10 → 99/7/16 |
ASJC Scopus subject areas
- Software
- Artificial Intelligence