
Estimating regional noise on neural network predictions. (English) Zbl 1039.68107

Summary: A new method for estimating the variance of noise in nonlinear regression is presented. The noise is modelled as regional, i.e. its variance depends on the input, and it stems from two sources: measurement errors and inherent noise of the underlying function. Our approach consists of two neural networks trained in sequence using Bayesian methods. It relies on the assumption that the network's predictions of the mean are unbiased, and the confidence of these predictions is used to estimate the variance of the noise. We demonstrate our approach on two toy data sets and one real data set.
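The two-stage scheme described in the summary can be illustrated with a minimal sketch. The following Python code is not the authors' Bayesian implementation; it assumes scikit-learn's MLPRegressor and a simple residual-based variance target (training the second network on the log of the squared residuals of the first) purely to show the sequential idea of a mean network followed by an input-dependent noise-variance network.

```python
# Minimal sketch of the two-stage scheme (illustrative only; the paper's
# networks are trained with Bayesian methods, which are omitted here).
import numpy as np
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(0)

# Toy data with input-dependent ("regional") noise: y = sin(x) + eps(x)
X = rng.uniform(-3, 3, size=(500, 1))
noise_std = 0.05 + 0.3 * np.abs(X[:, 0])          # heteroscedastic noise level
y = np.sin(X[:, 0]) + rng.normal(0.0, noise_std)

# Stage 1: train a network to predict the conditional mean.
mean_net = MLPRegressor(hidden_layer_sizes=(20,), max_iter=5000, random_state=0)
mean_net.fit(X, y)

# Stage 2: train a second network on the log of the squared residuals,
# so that exp(...) yields a positive estimate of the local noise variance.
residuals_sq = (y - mean_net.predict(X)) ** 2
var_net = MLPRegressor(hidden_layer_sizes=(20,), max_iter=5000, random_state=1)
var_net.fit(X, np.log(residuals_sq + 1e-8))

# Predicted local noise standard deviation on a grid of inputs.
X_grid = np.linspace(-3, 3, 7).reshape(-1, 1)
sigma_hat = np.sqrt(np.exp(var_net.predict(X_grid)))
print(np.column_stack([X_grid[:, 0], sigma_hat]))
```

Under these assumptions, the printed standard deviations should grow with |x|, mirroring the regional noise built into the toy data.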

MSC:

68T05 Learning and adaptive systems in artificial intelligence
