

**Pao-Ta Yu and Chih-Chia Yao ^{+}**

National Chung Cheng University

Chiayi, 621 Taiwan

E-mail: csipty@csie.ccu.edu.tw

Chaoyang University of Technology

Taichung, 413 Taiwan

E-mail: ccyao@cyut.edu.tw

Weighted order statistics (WOS) filters are highly effective in processing digital signals due to their simple window structure. This paper proposes a fast and efficient learning algorithm that both speeds up learning and reduces the complexity of designing WOS filters. The algorithm uses a dichotomous approach to reduce the Boolean functions from 255 levels to two levels separated by an optimal hyperplane. The design concept is similar to that of support vector machines (SVMs), which determine the optimal hyperplane from two separate sets of data. A conjugate gradient algorithm is adopted to solve the orthant-constrained optimization problem, reducing the memory required for the large amounts of data involved in the design process. Prior literature includes three learning schemes: one updates the parameters via pattern-mode learning, while the others use batch-mode and semi-batch-mode learning. Our proposed method approximates the optimal weighted order statistics filter far more rapidly than either Yoo's algorithm or adaptive neural filters.
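To make the window structure mentioned above concrete, the following is a minimal sketch of a basic WOS filter: each sample in the sliding window is duplicated according to its integer weight, the duplicates are sorted, and the output is the threshold-th smallest value. The weights and threshold used here are illustrative assumptions, not values from the paper.

```python
def wos_filter(signal, weights, threshold):
    """Basic weighted order statistics filter (illustrative sketch).

    At each window position, duplicate each sample by its integer
    weight, sort the duplicates, and output the threshold-th smallest
    value (1-based). Unit weights with the middle threshold recover
    the running median filter.
    """
    n = len(weights)
    half = n // 2
    # Replicate boundary samples so every output has a full window.
    padded = [signal[0]] * half + list(signal) + [signal[-1]] * half
    out = []
    for i in range(len(signal)):
        window = padded[i:i + n]
        expanded = sorted(
            v for v, w in zip(window, weights) for _ in range(w)
        )
        out.append(expanded[threshold - 1])
    return out

# Example (hypothetical parameters): a center-weighted window with the
# threshold set to the median of the expanded list suppresses impulses.
noisy = [3, 3, 50, 3, 3, 4, 4, 0, 4, 4]
print(wos_filter(noisy, weights=[1, 2, 3, 2, 1], threshold=5))
```

With the weights summing to 9 and the threshold at 5, the output at each position is the weighted median of the window, so the isolated impulses (50 and 0) are removed while the step edge between the 3s and 4s is preserved.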

*Keywords:* weighted order statistics filters, dichotomy, support vector machines, pattern-mode learning, batch-mode learning


Received April 26, 2006; revised August 15, 2006; accepted November 8, 2006.

Communicated by Ja-Ling Wu.
^{*} This work was supported by the National Science Council of Taiwan, R.O.C. under grant No. NSC 95-2221-E-194-066.