

CHAO-HUI KO, CHING-TSORNG TSAI^{*,+} AND CHISHYAN LIAW^{*}

^{*}Department of Information Management
Hsiuping Institute of Technology
Taichung, 412 Taiwan

Quadratic Hebbian-type associative memories have superior performance, but they are more difficult to implement in chips than first-order Hebbian-type associative memories because of their large interconnection values. To reduce the interconnection values of a neural network storing *M* patterns, values in [- *M*, *M*] are mapped linearly to [- *H*, *H*], where *H* is the quantization level. The probability-of-direct-convergence equation for quantized quadratic Hebbian-type associative memories is derived, and their performance is explored. Experiments demonstrate that the quantized network approaches the recall capacity of the original network at a small quantization level. Since quadratic Hebbian-type associative memories usually store more patterns, the linear quantization strategy reduces interconnection values more efficiently.
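As a minimal sketch of the quantization idea described above (not code from the paper; the dimensions, rounding rule, and function names are illustrative assumptions), the interconnections of a quadratic Hebbian-type memory can be scaled from [-M, M] to [-H, H] and rounded to integer levels:

```python
import numpy as np

rng = np.random.default_rng(0)

# Store M bipolar (+1/-1) patterns of dimension N with a quadratic
# (second-order) Hebbian rule: T[i, j, k] = sum over patterns of x_i*x_j*x_k.
N, M = 16, 5
X = rng.choice([-1, 1], size=(M, N))
T = np.einsum('mi,mj,mk->ijk', X, X, X)  # entries lie in [-M, M]

def quantize(T, M, H):
    """Linearly map interconnection values from [-M, M] to integer
    levels in [-H, H]; rounding to the nearest level is an assumption."""
    return np.rint(T * (H / M)).astype(int)

H = 3                      # quantization level, H << M in general
Tq = quantize(T, M, H)

def recall(T, x):
    # One synchronous update of the second-order network.
    return np.sign(np.einsum('ijk,j,k->i', T, x, x))

# Compare whether a stored pattern is a fixed point (direct convergence)
# of the original and the quantized network.
x = X[0]
print(np.array_equal(recall(T, x), x), np.array_equal(recall(Tq, x), x))
```

Here only H + 1 magnitude levels per sign must be realized in hardware instead of M + 1, which is the source of the implementation saving the abstract refers to.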

*Keywords:* Hebbian-type associative memories, quadratic Hebbian-type associative memories, linear quantization, interconnection quantization, probability of direct convergence


Received May 27, 2009; revised August 18, 2009; accepted December 4, 2009.

Communicated by Chin-Teng Lin.
^{+} Corresponding author.