ESANN 1996: Bruges, Belgium
4th European Symposium on Artificial Neural Networks, ESANN 1996, Bruges, Belgium, April 24-26, 1996, Proceedings.
Learning and generalization I
- Nicolas Pican: Synaptic efficiency modulations for context integration: The meta ODWE architecture.
- Colin McCormack: Using a Meta Neural Network for RPROP parameter adaptation.
- Christian Goerick, Werner von Seelen: On unlearnable problems -or- A model for premature saturation in backpropagation learning.
- Hubert A. B. te Braake, H. J. L. van Can, Gerrit van Straten, Henk B. Verbruggen: Regulated Activation Weights Neural Network (RAWN).
- Jörg Bruske, Ingo Ahrns, Gerald Sommer: Practicing Q-learning.
Recurrent models
- Jean-Philippe Draye, Davor Pavisic, Guy Cheron, Gaetan Libert: Adaptive time constants improve the dynamic features of recurrent neural networks.
- Myriam Mokhtari, Herman Akdag: An adaptive technique for pattern recognition by the random neural network.
- Davor Pavisic, Jean-Philippe Draye, Roberto Teran, Gustavo Calderon, Guy Cheron, Gaetan Libert: Negative initial weights improve learning in recurrent neural networks.
Fuzzy neural networks
- Adelmo Luis Cechin, Ulrich Epperlein, Wolfgang Rosenstiel, Bernhard Koppenhoefer: The extraction of Sugeno fuzzy rules from neural networks.
- Selwyn Piramuthu: Neural versus neurofuzzy systems for credit approval.
Invited paper I
- Bernd Fritzke: Growing self-organizing networks - Why?
Self-organizing maps
- Monika Köhle, Dieter Merkl: Identification of gait patterns with self-organizing maps based on ground reaction force.
- Stefan Schünemann, Bernd Michaelis: A self-organizing map for analysis of high-dimensional feature spaces with clusters of highly differing feature density.
- Jean-Claude Fort, Gilles Pagès: Quantization vs Organization in the Kohonen S.O.M.
- Wlodzislaw Duch, Antoine Naud: On global self-organizing maps.
- Damien Lamberton, Gilles Pagès: On the critical points of the 1-dimensional competitive learning vector quantization algorithm.
- Marie Cottrell, Eric de Bodt: A Kohonen map representation to avoid misleading interpretations.
Incremental learning
- Karim Mohraz, Peter Protzel: FlexNet - A flexible neural network construction algorithm.
- Christian Scheier: Incremental category learning in a real world artifact using growing dynamic cell structures.
- Stefan Wermter, Manuela Meurer: Towards constructive and destructive dynamic network configuration.
- Rachida Chentouf, Christian Jutten: Combining sigmoids and radial basis functions in evolutive neural architectures.
Invited paper II
- Fabrizio Mura, Nicolas Martin, Nicolas H. Franceschini: Biologically inspired eye movements for visually guided navigation of mobile robots.
Classification
- G. Qiu: A novel two-layer neural network classifier.
- Nigel R. Ball: Representation of obstacles in a neural network based classifier system.
- Malti Patel: Investigating lexical access using neural nets.
Mathematical aspects of neural networks
- Katerina Hlavácková, Vera Kurková: Rates of approximation of real-valued Boolean functions by neural networks.
- Eddy Mayoraz: Bounds on the degree of high order binary perceptrons.
- Andreu Català Mallofré, Joseph Aguilar-Martin, Bernardo Morcego Seix, Núria Piera Carreté: A fast Bayesian algorithm for Boolean functions synthesis by means of perceptron networks.
- Hui Wang, David A. Bell: Accommodating relevance in neural networks.
Natural and artificial vision
- Enno Littmann, Heiko Neumann, Luiz Pessoa: Neural model for visual contrast detection.
- A. de la Hera, Manuel Graña, Alicia D'Anjou, F. Xabier Albizuri: Application of high-order Boltzmann machines in OCR.
- Greg Maguire, Simon X. Yang: Simulation of an inner plexiform layer neural circuit in vertebrate retina leads to sustained and transient excitation.
- Christof Born: Analysis of visual information by receptive field dynamics.
- H. A. K. Mastebroek: Neurotransmitter dynamics in a model of a movement detecting visual system.
Neural networks and statistics
- Wlodzimierz Kasprzak, Andrzej Cichocki: Recurrent least square learning for quasi-parallel principal component analysis.
- Anne Guérin-Dugué, Carlos Avilés-Cruz, Patricia Palagi: Interpreting data through neural and statistical tools.
- A. Varfis, L. Corleto: Error rate estimation via cross-validation and learning curve theory.
- Mikko Lehtokangas, Petri Korpisaari, Kimmo Kaski: Maximum covariance method for weight initialization of multilayer perceptron network.
Invited paper III
- Juha Karhunen: Neural approaches to independent component analysis and source separation.
Learning and generalization II
- Joost N. Kok, Elena Marchiori, Massimo Marchiori, Claudio Rossi: Constraining of weights using regularities.
- E. Schaeffer, P. Bourret, S. Montrozier: Regularization and neural computation: application to aerial images analysis.
- Wojtek Kowalczyk: An algorithm for training multilayer networks on non-numerical data.
- Jey E. E. Ngole: A correlation-based network for real-time processing.
- Simon M. Lucas: Evolving neural network learning behaviours with set-based chromosomes.
Prediction
- Tommy W. S. Chow, Siu-Yeung Cho: Neural network application: rainfall forecasting system in Hong Kong.
- Yves Moreau, Joos Vandewalle: Prediction of dynamical systems with composition networks.
- Radu Dogaru, A. T. Murgan, Cristina Comaniciu: Fast signal recognition and detection using ART1 neural networks and nonlinear preprocessing units based on time delay embeddings.
- Arnfried Ossen, Stefan M. Rüger: An analysis of the metric structure of the weight space of feedforward networks and its application to time series modeling and prediction.
- Rienk S. Venema, Alexander Ypma, J. A. G. Nijhuis, Lambert Spaanenburg: Time series prediction using neural networks and its application to artificial human walking.