
Dynamic Schedule for Effective On-Line Connection Pruning

Neural Processing Letters

Abstract

In this letter we investigate a neural network simplification schedule that operates concurrently with regular weight adjustment and applies a pruning strength that varies over the course of training. The underlying connection model incorporates an explicit trainable factor that modulates the classical synaptic weight. Learning under this scheme yields a reduced-size structure with enhanced generalization ability. The effectiveness of the method is explored empirically on an artificial task and on a classical real-world benchmark problem.
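The abstract does not spell out the update rules, but the general idea of a trainable factor gating each weight, pruned on-line under a time-varying threshold, can be illustrated with a minimal sketch. Everything below is an assumption made for illustration, not the paper's actual algorithm: the gate matrix G, the L1-style decay that drives unneeded factors toward zero, the linearly tightening pruning threshold, and all hyperparameter values.

```python
# Minimal illustrative sketch, NOT the paper's algorithm: each connection's
# effective strength is w * g, where g is an explicit trainable factor
# modulating the classical weight w. Training proceeds as usual while a
# dynamically scheduled threshold prunes connections whose factor decays.
import numpy as np

rng = np.random.default_rng(0)

# Toy regression: only the first 3 of 8 inputs matter.
n_in, n_out, n_samples, epochs = 8, 1, 256, 400
W_true = np.zeros((n_in, n_out))
W_true[:3, 0] = [1.5, -2.0, 0.7]
X = rng.normal(size=(n_samples, n_in))
y = X @ W_true + 0.01 * rng.normal(size=(n_samples, n_out))

W = 0.1 * rng.normal(size=(n_in, n_out))   # classical synaptic weights
G = np.ones((n_in, n_out))                 # trainable modulating factors
mask = np.ones((n_in, n_out), dtype=bool)  # pruned connections stay off

lr, decay = 0.1, 0.05                      # assumed hyperparameters
for epoch in range(epochs):
    W_eff = W * G * mask                   # effective connection strength
    err = X @ W_eff - y
    grad_eff = X.T @ err / n_samples       # dLoss/dW_eff (up to a constant)
    # Chain rule through W_eff = W * G; an L1-style decay on G pushes
    # the factors of unneeded connections toward zero.
    W -= lr * grad_eff * G * mask
    G -= lr * mask * (grad_eff * W + decay * np.sign(G))
    # Dynamic pruning strength: the threshold tightens linearly, so
    # pruning is gentle early in training and stricter later on.
    threshold = 0.05 * epoch / epochs
    mask &= np.abs(G) > threshold

print("surviving connections:", int(mask.sum()), "of", mask.size)
```

Run as-is, the factors of the five irrelevant connections typically decay below the tightening threshold midway through training while the three informative ones survive; the redundant W * G parameterization and the linear threshold ramp are design choices made here only to keep the sketch short.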




Cite this article

Rementeria, S. Dynamic Schedule for Effective On-Line Connection Pruning. Neural Processing Letters 14, 1–14 (2001). https://doi.org/10.1023/A:1011321906641
