
Article title

Random projection RBF nets for multidimensional density estimation

Content

Title variants

Publication languages

EN

Abstracts

EN
The dimensionality and the amount of data that need to be processed when intensive data streams are observed grow rapidly together with the development of sensor arrays, CCD and CMOS cameras, and other devices. The aim of this paper is to propose an approach to dimensionality reduction as a first stage of training RBF nets. As a vehicle for presenting the ideas, the problem of estimating multivariate probability densities is chosen. The linear projection method is briefly surveyed. Using random projections as the first (additional) layer, we are able to reduce the dimensionality of input data. Bounds on the accuracy of RBF nets equipped with a random projection layer in comparison to RBF nets without dimensionality reduction are established. Finally, the results of simulations concerning multidimensional density estimation are briefly reported.
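
Readers who want a quick feel for the construction can prototype it in a few lines. The NumPy sketch below is only an illustration of the idea in the abstract, not the authors' exact method: a Gaussian random matrix (entries drawn as N(0, 1/k), an assumption) plays the role of the random projection layer, and a plain Gaussian-RBF (Parzen-type) estimator with a hand-picked bandwidth h is evaluated on the projected sample. All function names and parameter values are hypothetical.

    import numpy as np

    def random_projection(X, k, rng=None):
        """Reduce the rows of X from d to k dimensions with a Gaussian
        random matrix (Johnson-Lindenstrauss style projection)."""
        rng = np.random.default_rng(rng)
        d = X.shape[1]
        S = rng.normal(0.0, 1.0 / np.sqrt(k), size=(d, k))  # entries ~ N(0, 1/k)
        return X @ S

    def rbf_density(queries, centers, h):
        """Gaussian-RBF (Parzen-type) density estimate in the projected space:
        each center contributes an isotropic Gaussian bump of width h."""
        k = centers.shape[1]
        norm = (2.0 * np.pi * h ** 2) ** (k / 2)
        # squared distances between every query point and every center
        d2 = ((queries[:, None, :] - centers[None, :, :]) ** 2).sum(axis=-1)
        return np.exp(-d2 / (2.0 * h ** 2)).mean(axis=1) / norm

    # Toy usage: 100-dimensional data projected to 5 dimensions before estimation.
    rng = np.random.default_rng(0)
    X = rng.normal(size=(2000, 100))
    X_low = random_projection(X, k=5, rng=1)
    density_at_first_500 = rbf_density(X_low[:500], centers=X_low[500:], h=0.8)

In practice the bandwidth h and the target dimension k would be chosen by cross-validation or from the accuracy bounds discussed in the paper; the fixed values above are purely for illustration.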

Year

2008

Volume

18

Issue

4

Pages

455-464

Physical description

Dates

published
2008
received
2007-12-04
revised
2008-05-15

Authors

  • Institute of Computer Engineering, Automation and Robotics, Wrocław University of Technology, Wybrzeże Wyspiańskiego 27, 50-370 Wrocław, Poland

Bibliography

  • Achlioptas D. (2001). Database friendly random projections, Proceedings of the 20th ACM SIGMOD-SIGACT-SIGART Symposium on Principles of Database Systems, Santa Barbara, CA, USA, pp. 274-281.
  • Ailon N. and Chazelle B. (2006). Approximate nearest neighbors and the fast Johnson-Lindenstrauss transform, Proceedings of the 38th Annual ACM Symposium on Theory of Computing, Seattle, WA, USA, pp. 557-563.
  • Arriaga R. and Vempala S. (1999). An algorithmic theory of learning: Robust concepts and random projection, Proceedings of the 40th Annual Symposium on Foundations of Computer Science, New York, NY, USA, pp. 616-623.
  • Bishop C. M. (1994). Novelty detection and neural-network validation, IEE Proceedings - Vision Image and Signal Processing 141: 217-222.
  • Bishop C.M. (1995). Neural Networks for Pattern Recognition, Oxford University Press, Oxford.
  • Bowman A.W. (1984). An alternative method of cross-validation for the smoothing of density estimates, Biometrika 71(2): 353-360.
  • Broomhead D. and Lowe D. (1988). Multivariable functional interpolation and adaptive networks, Complex Systems 2(11): 321-323.
  • Buhmann M. D. (2003). Radial Basis Functions: Theory and Implementations, Cambridge University Press, Cambridge.
  • Chen S., Cowan C.F.N. and Grant P.M. (1991). Orthogonal least squares learning algorithm for radial basis function networks, IEEE Transactions on Neural Networks 2(2): 302-307.
  • Chen S., Hong X. and Harris C.J. (2004). Sparse kernel density construction using orthogonal forward regression with leave-one-out test score and local regularization, IEEE Transactions on Systems, Man, and Cybernetics, Part B, 34(4): 1708-1717.
  • Dasgupta S. and Gupta A. (2003). An elementary proof of a theorem of Johnson and Lindenstrauss, Random Structures and Algorithms 22(1): 60-65.
  • Devroye L. and Györfi L. (1985). Nonparametric Density Estimation. The L1 View. Wiley, New York, NY.
  • Devroye L., Györfi L. and Lugosi G. (1996). Probabilistic Theory of Pattern Recognition, Springer-Verlag, New York, NY.
  • Frankl P. and Maehara H. (1987). The Johnson-Lindenstrauss lemma and the sphericity of some graphs, Journal of Combinatorial Theory A 44(3): 355-362.
  • Gertler J.J. (1998). Fault Detection and Diagnosis in Engineering Systems, Marcel Dekker, New York, NY.
  • Guh R. (2005). A hybrid learning based model for on-line detection and analysis of control chart patterns, Computers and Industrial Engineering 49(1): 35-62.
  • Holmström L. and Hämäläinen A. (1993). The self-organizing reduced kernel density estimator, Proceedings of the 1993 IEEE International Conference on Neural Networks, San Francisco, CA, USA, 1: 417-421.
  • Haykin S. (1999). Neural Networks. A Comprehensive Foundation, 2nd Ed., Prentice-Hall, Upper Saddle River, NJ.
  • Indyk P. and Motwani R. (1998). Approximate nearest neighbors: Towards removing the curse of dimensionality, Proceedings of the 30th Annual ACM Symposium on Theory of Computing, Dallas, TX, USA, pp. 604-613.
  • Indyk P. and Naor A. (2006). Nearest neighbor preserving embeddings, ACM Transactions on Algorithms (to appear).
  • Johnson W. B. and Lindenstrauss J. (1984). Extensions of Lipschitz mappings into a Hilbert space, Contemporary Mathematics 26: 189-206.
  • Jones M.C., Marron J.S. and Sheather S.J. (1996). A brief survey of bandwidth selection for density estimation, Journal of the American Statistical Association 91(433): 401-407.
  • Karayiannis N.B. (1999). Reformulated radial basis neural networks trained by gradient descent, IEEE Transactions on Neural Networks 10(3): 657-671.
  • Krzyżak A. (2001). Nonlinear function learning using optimal radial basis function networks, Journal on Nonlinear Analysis 47(1): 293-302.
  • Krzyżak A., Linder T. and Lugosi G. (2001). Nonparametric estimation and classification using radial basis function nets and empirical risk minimization, IEEE Transactions on Neural Networks 7(2): 475-487.
  • Krzyżak A. and Niemann H. (2001). Convergence and rates of convergence of radial basis functions networks in function learning, Journal on Nonlinear Analysis 47(1): 281-292.
  • Krzyżak A. and Skubalska-Rafajłowicz E. (2004). Combining space-filling curves and radial basis function networks, Artificial Intelligence and Soft Computing - ICAISC 2004, 7th International Conference, Zakopane, Poland, Lecture Notes in Artificial Intelligence 3070: 229-234, Springer-Verlag, Berlin.
  • Korbicz J., Kościelny J. M., Kowalczuk Z. and Cholewa W. (Eds) (2004). Fault Diagnosis: Models, Artificial Intelligence, Applications, Springer-Verlag, Berlin.
  • Leonard J. A., and Kramer M. A. (1990). Classifying process behaviour with neural networks: Strategies for improved training and generalization, Proceedings of the American Control Conference, San Diego, CA, USA, pp. 2478-2483.
  • Leonard J.A., and Kramer M.A. (1991). Radial basis networks for classifying process faults, IEEE Control Systems Magazine 11(3): 31-38.
  • Li Y., Pont M. J. and Jones N.B. (2002). Improving the performance of radial basis function classifiers in condition monitoring and fault diagnosis applications where 'unknown' faults may occur, Pattern Recognition Letters 23(5): 569-577.
  • Li P., Hastie T.J. and Church K.W. (2007). Nonlinear estimators and tail bounds for dimension reduction in l1 using Cauchy random projections, The Journal of Machine Learning Research 8(10): 2497-2532.
  • Li P., Hastie T.J. and Church K.W. (2006). Sub-Gaussian random projections, Technical report, Stanford University.
  • Magdon-Ismail M. and Atiya A. (2002). Density estimation and random variate generation using multilayer networks, IEEE Transactions on Neural Networks 13(3): 497-520.
  • Moody J. and Darken C.J. (1989). Fast learning in networks of locally tuned processing units, Neural Computation 1(2): 281-294.
  • Patton R.J. (1994). Robust model-based fault diagnosis: The state of the art, Proceedings of the IFAC Symposium on Fault Detection, Supervision and Safety of Technical Processes, Espoo, Finland, pp. 1-24.
  • Patton R. J., Chen J. and Benkhedda H. (2000). A study on neurofuzzy systems for fault diagnosis, International Journal of Systems Science 31(11): 1441-1448.
  • Parzen E. (1962). On estimation of a probability density function and mode, Annals of Mathematical Statistics 33(3): 1065-1076.
  • Poggio T. and Girosi F. (1990). Networks for approximation and learning, Proceedings of the IEEE 78(9): 1481-1497.
  • Powell M.J.D. (1987). Radial basis functions for multivariable interpolation: A review, in (J.C. Mason, M.G. Cox, Eds.) Algorithms for Approximation, Clarendon Press, Oxford, pp. 143-167.
  • Rafajłowicz E. (2006). RBF nets in fault localization, 8th International Conference on Artificial Intelligence and Soft Computing - ICAISC 2006. Zakopane, Poland, LNCS, Springer-Verlag, Berlin/Heidelberg, 4029/2006: 113-122.
  • Rafajłowicz E. and Skubalska-Rafajłowicz E. (2003). RBF nets based on equidistributed points, Proceedings of the 9th IEEE International Conference on Methods and Models in Automation and Robotics MMAR 2003, Szczecin, Poland, 2: 921-926.
  • Roberts S. (2000). Extreme value statistics for novelty detection in biomedical data processing, IEE Proceedings: Science, Measurement and Technology 147 (6): 363-367.
  • Schlorer H. and Hartman U. (1992). Mapping neural networks derived from the Parzen window estimator, Neural Networks 5(6): 903-909.
  • Skubalska-Rafajłowicz E. (2000). On using space-filling curves and vector quantization for constructing multidimensional control charts, Proceedings of the 5th Conference on Neural Networks and Soft Computing, Zakopane, Poland, pp. 162-167.
  • Skubalska-Rafajłowicz E. (2006a). RBF neural network for probability density function estimation and detecting changes in multivariate processes, 8th International Conference: Artificial Intelligence and Soft Computing - ICAISC 2006. Zakopane, Poland, LNCS, Springer-Verlag, Berlin/Heidelberg 4029/2006: 133-141.
  • Skubalska-Rafajłowicz E. (2006b). Self-organizing RBF neural network for probability density function estimation, Proceedings of the 12th IEEE International Conference on Methods and Models in Automation and Robotics, Międzyzdroje, Poland, pp. 985-988.
  • Specht D.F. (1990). Probabilistic neural networks, Neural Networks 3(1): 109-118.
  • Vempala S. (2004). The Random Projection Method, American Mathematical Society, Providence, RI.
  • Wettschereck D. and Dietterich T. (1992). Improving the performance of radial basis function networks by learning center locations, in (B. Spatz, Ed.) Advances in Neural Information Processing Systems, Morgan Kaufmann, San Mateo, CA, Vol. 4, pp. 1133-1140.
  • Willsky A. S. (1976). A survey of design methods for failure detection in dynamic systems, Automatica 12(6): 601-611.
  • Xu L., Krzyżak A. and Yuille A. (1994). On radial basis function nets and kernel regression: Statistical consistency, convergence rates, and receptive field size, Neural Networks 7(4): 609-628.
  • Yee P. V. and Haykin S. (2001). Regularized Radial Basis Function Networks: Theory and Applications, John Wiley, New York, NY.
  • Yin H. and Allinson N.M. (2001). Self-organising mixture networks for probability density estimation, IEEE Transactions on Neural Networks 12(2): 405-411.
  • Zorriassatine F., Tannock J.D.T. and O'Brien C. (2003). Using novelty detection to identify abnormalities caused by mean shifts in bivariate processes, Computers and Industrial Engineering 44(3): 385-408.

Document type

Bibliography

Identifiers

YADDA identifier

bwmeta1.element.bwnjournal-article-amcv18i4p455bwm