Computational studies with equivalent degrees of freedoms in neural networks / Ingrassia, S.; Morlini, Isabella. - In: ADVANCES AND APPLICATIONS IN STATISTICS. - ISSN 0972-3617. - PRINT. - 13:(2009), pp. 49-81.

Computational studies with equivalent degrees of freedoms in neural networks

MORLINI, Isabella
2009

Abstract

The notion of an equivalent number of degrees of freedom (e.d.f.) has recently been proposed in the context of neural network modeling for small data sets. This quantity is much smaller than the number of parameters in the network, and it does not depend on the number of input variables. In this paper we present numerical studies, on both real and simulated data sets, supporting the validity of the e.d.f. in a general framework. The results confirm that the e.d.f. is more reliable than the total number W of adaptive parameters (which common statistical software usually takes as the degrees of freedom of the model) for analyzing and comparing neural models. The numerical studies also show that the e.d.f. works well for estimating the error variance and for constructing approximate confidence intervals. We then compare several model selection criteria; the results show that, for neural networks, generalized cross-validation (GCV) performs slightly better than the others. Finally, we present a simple forward procedure, easy to implement, for automatically selecting a neural model with a good trade-off between learning error and generalization properties.
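As a rough illustration of how an effective degrees-of-freedom count enters these computations, here is a minimal Python sketch using the standard formulas: the error variance estimate RSS / (n - edf) and the generalized cross-validation score RSS / (n * (1 - edf/n)^2). The function names, the simulated residuals, and the edf value of 6.0 are hypothetical illustrations and are not taken from the paper; the point is only that plugging in an e.d.f. much smaller than the raw parameter count W keeps both quantities well behaved on small samples.

```python
import numpy as np

def error_variance(residuals, edf):
    """Estimate the error variance as RSS / (n - edf).

    `edf` is whatever effective degrees-of-freedom count one adopts,
    e.g. an e.d.f. much smaller than the raw parameter count W.
    """
    rss = float(np.sum(residuals ** 2))
    n = residuals.size
    return rss / (n - edf)

def gcv(residuals, edf):
    """Generalized cross-validation score: RSS / (n * (1 - edf/n)**2)."""
    rss = float(np.sum(residuals ** 2))
    n = residuals.size
    return rss / (n * (1.0 - edf / n) ** 2)

# Hypothetical example: 50 observations, e.d.f. of 6.0.  With a raw
# parameter count W close to (or above) n, the denominator n - W would
# shrink toward zero and the estimates would blow up.
rng = np.random.default_rng(0)
residuals = rng.normal(scale=0.5, size=50)
print(error_variance(residuals, edf=6.0))
print(gcv(residuals, edf=6.0))
```

A forward procedure of the kind the abstract describes could then train networks of increasing size and retain the configuration with the lowest GCV score.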
Files in this product:
File: 2009 Adas Ingrassia & Morlini-1.pdf
Type: Author's original version submitted for publication
Format: Adobe PDF
Size: 1.36 MB
Access: Restricted (copy available on request)
Use this identifier to cite or link to this document: https://hdl.handle.net/11380/615418