Artificial neural networks : an introduction, Kevin L. Priddy and Paul E. Keller, (electronic resource)

Label
Artificial neural networks : an introduction
Title
Artificial neural networks
Title remainder
an introduction
Statement of responsibility
Kevin L. Priddy and Paul E. Keller
Creator
Priddy, Kevin L.
Contributor
Keller, Paul E.
Subject
Language
eng
Summary
This tutorial text provides the reader with an understanding of artificial neural networks (ANNs) and their application, beginning with the biological systems that inspired them, through the learning methods that have been developed and the data collection processes, to the many ways ANNs are being used today. The material is presented with a minimum of math (although the mathematical details are included in the appendices for interested readers), and with a maximum of hands-on experience. All specialized terms are included in a glossary. The result is a highly readable text that will teach the engineer the guiding principles necessary to use and apply artificial neural networks.
Member of
Tutorial texts in optical engineering, v. TT68
Additional physical form
Also available in print.
Cataloging source
CaBNVSL
Illustrations
illustrations
Index
index present
Literary form
non fiction
Nature of contents
  • dictionaries
  • bibliography
Series statement
Tutorial texts in optical engineering
Series volume
v. TT68
Target audience
  • adult
  • specialized
Note
"SPIE digital library."
Bibliography note
Includes bibliographical references (p. [151]-162) and index
Color
black and white
Contents
  • Chapter 1. Introduction. 1.1. The neuron -- 1.2. Modeling neurons -- 1.3. The feedforward neural network -- 1.4. Historical perspective on computing with artificial neurons
  • Chapter 2. Learning methods. 2.1. Supervised training methods -- 2.2. Unsupervised training methods
  • Chapter 3. Data normalization. 3.1. Statistical or Z-score normalization -- 3.2. Min-max normalization -- 3.3. Sigmoidal or SoftMax normalization -- 3.4. Energy normalization -- 3.5. Principal components normalization
  • Chapter 4. Data collection, preparation, labeling, and input coding. 4.1. Data collection -- 4.2. Feature selection and extraction
  • Chapter 5. Output coding. 5.1. Classifier coding -- 5.2. Estimator coding
  • Chapter 6. Post-processing
  • Chapter 7. Supervised training methods. 7.1. The effects of training data on neural network performance -- 7.2. Rules of thumb for training neural networks -- 7.3. Training and testing
  • Chapter 8. Unsupervised training methods. 8.1. Self-organizing maps (SOMs) -- 8.2. Adaptive resonance theory network
  • Chapter 9. Recurrent neural networks. 9.1. Hopfield neural networks -- 9.2. The bidirectional associative memory (BAM) -- 9.3. The generalized linear neural network -- 9.4. Real-time recurrent network -- 9.5. Elman recurrent network
  • Chapter 10. A plethora of applications. 10.1. Function approximation -- 10.2. Function approximation-Boston housing example -- 10.3. Function approximation-cardiopulmonary modeling -- 10.4. Pattern recognition-tree classifier example -- 10.5. Pattern recognition-handwritten number recognition example -- 10.6. Pattern recognition-electronic nose example -- 10.7. Pattern recognition-airport scanner texture recognition example -- 10.8. Self organization-serial killer data-mining example -- 10.9. Pulse-coupled neural networks-image segmentation example
  • Chapter 11. Dealing with limited amounts of data. 11.1. K-fold cross-validation -- 11.2. Leave-one-out cross-validation -- 11.3. Jackknife resampling -- 11.4. Bootstrap resampling
  • Appendix A. The feedforward neural network. A.1. Mathematics of the feedforward process -- A.2. The backpropagation algorithm -- A.3. Alternatives to backpropagation
  • Appendix B. Feature saliency
  • Appendix C. Matlab code for various neural networks. C.1. Matlab code for principal components normalization -- C.2. Hopfield network -- C.3. Generalized neural network -- C.4. Generalized neural network example -- C.5. ART-like network -- C.6. Simple perceptron algorithm -- C.7. Kohonen self-organizing feature map
  • Appendix D. Glossary of terms -- References -- Index
Dimensions
unknown
Discovery link
http://opac.lib.rpi.edu/record=b3828657
Extent
1 online resource (ix, 165 p. : ill.)
File format
multiple file formats
Form of item
online
Governing access note
Restricted to subscribers or individual electronic text purchasers
Isbn
9780819478726
Isbn Type
(electronic)
Other physical details
digital file.
Reformatting quality
access
Specific material designation
remote
System details
  • Mode of access: World Wide Web
  • System requirements: Adobe Acrobat Reader

Library Locations

    • Folsom Library
      110 8th St, Troy, NY, 12180, US
      42.729766 -73.682577