Neural networks and pattern recognition, edited by Omid Omidvar, Judith Dayhoff
- Language
- eng
- Extent
- 1 online resource (xvi, 351 pages)
- Contents
-
(Chapter Headings) Preface. Contributors.
J.L. Johnson, H. Ranganath, G. Kuntimad, and H.J. Caulfield, Pulse-Coupled Neural Networks.
H. Li and J. Wang, A Neural Network Model for Optical Flow Computation.
F. Unal and N. Tepedelenlioglu, Temporal Pattern Matching Using an Artificial Neural Network.
J. Dayhoff, P. Palmadesso, F. Richards, and D.-T. Lin, Patterns of Dynamic Activity and Timing in Neural Network Processing.
J. Ghosh, H.-J. Chang, and K. Liano, A Macroscopic Model of Oscillation in Ensembles of Inhibitory and Excitatory Neurons.
P. Tiňo, B. Horne, C.L. Giles, and P. Collingwood, Finite State Machines and Recurrent Neural Networks--Automata and Dynamical Systems Approaches.
R. Anderson, Biased Random-Walk Learning: A Neurobiological Correlate to Trial-and-Error.
A. Nigrin, Using SONNET 1 to Segment Continuous Sequences of Items.
K. Venkatesh, A. Pandya, and S. Hsu, On the Use of High Level Petri Nets in the Modeling of Biological Neural Networks.
J. Principe, S. Celebi, B. de Vries, and J. Harris, Locally Recurrent Networks: The Gamma Operator, Properties, and Extensions.
Preface. Contributors.
J.L. Johnson, H. Ranganath, G. Kuntimad, and H.J. Caulfield, Pulse-Coupled Neural Networks: Introduction. Basic Model. Multiple Pulses. Multiple Receptive Field Inputs. Time Evolution of Two Cells. Space to Time. Linking Waves and Time Scales. Groups. Invariances. Segmentation. Adaptation. Time to Space. Implementations. Integration into Systems. Concluding Remarks. References.
H. Li and J. Wang, A Neural Network Model for Optical Flow Computation: Introduction. Theoretical Background. Discussion on the Reformulation. Choosing Regularization Parameters. A Recurrent Neural Network Model. Experiments. Comparison to Other Work. Summary and Discussion. References.
F. Unal and N. Tepedelenlioglu, Temporal Pattern Matching Using an Artificial Neural Network: Introduction. Solving Optimization Problems Using the Hopfield Network. Dynamic Time Warping Using Hopfield Network. Computer Simulation Results. Conclusions. References.
J. Dayhoff, P. Palmadesso, F. Richards, and D.-T. Lin, Patterns of Dynamic Activity and Timing in Neural Network Processing: Introduction. Dynamic Networks. Chaotic Attractors and Attractor Locking. Developing Multiple Attractors. Attractor Basins and Dynamic Binary Networks. Time Delay Mechanisms and Attractor Training. Timing of Action Potentials in Impulse Trains. Discussion. Acknowledgments. References.
J. Ghosh, H.-J. Chang, and K. Liano, A Macroscopic Model of Oscillation in Ensembles of Inhibitory and Excitatory Neurons: Introduction. A Macroscopic Model for Cell Assemblies. Interactions Between Two Neural Groups. Stability of Equilibrium States. Oscillation Frequency Estimation. Experimental Validation. Conclusion. Appendix. References.
P. Tiňo, B. Horne, C.L. Giles, and P. Collingwood, Finite State Machines and Recurrent Neural Networks--Automata and Dynamical Systems Approaches: Introduction. State Machines. Dynamical Systems. Recurrent Neural Network. RNN as a State Machine. RNN as a Collection of Dynamical Systems. RNN with Two State Neurons. Experiments--Learning Loops of FSM. Discussion. References.
R. Anderson, Biased Random-Walk Learning: A Neurobiological Correlate to Trial-and-Error: Introduction. Hebb's Rule. Theoretical Learning Rules. Biological Evidence. Conclusions. Acknowledgments. References and Bibliography.
A. Nigrin, Using SONNET 1 to Segment Continuous Sequences of Items: Introduction. Learning Isolated and Embedded Spatial Patterns. Storing Items with Decreasing Activity. The LTM Invariance Principle. Using Rehearsal to Process Arbitrarily Long Lists. Implementing the LTM Invariance Principle with an On-Center Off-Surround Circuit. Resetting Items Once They Can Be Classified. Properties of a Classifying System. Simulations. Discussion.
K. Venkatesh, A. Pandya, and S. Hsu, On the Use of High Level Petri Nets in the Modeling of Biological Neural Networks: Introduction. Fundamentals of PNs. Modeling of Biological Neural Systems with High Level PNs. New/Modified Elements Added to HPNs to Model BNNs. Example of a BNN: The Olfactory Bulb. Conclusions. References.
J. Principe, S. Celebi, B. de Vries, and J. Harris, Locally Recurrent Networks: The Gamma Operator, Properties, and Extensions: Introduction. Linear Finite Dimensional Memory Structures. The Gamma Neural Network. Applications of the Gamma Memory. Interpretations of the Gamma Memory. Laguerre and Gamma II Memories. Analog VLSI Implementations of the Gamma Filter. Conclusions. References.
- Pulse-coupled neural networks / J.L. Johnson [and others] -- A neural network model for optical flow computation / Hua Li ; Jun Wang -- Temporal pattern matching using an artificial neural network / Fatih A. Unal ; Nazif Tepedelenlioglu -- Patterns of dynamic activity and timing in neural network processing / Judith E. Dayhoff [and others] -- A macroscopic model of oscillation in ensembles of inhibitory and excitatory neurons / Joydeep Ghosh ; Hung-Jen Chang ; Kadir Liano -- Finite state machines and recurrent neural networks--automata and dynamical systems approaches / Peter Tiňo [and others] -- Biased random-walk learning: a neurobiological correlate to trial-and-error / Russell W. Anderson -- Using SONNET 1 to segment continuous sequences of items / Albert Nigrin -- On the use of high-level Petri nets in the modeling of biological neural networks / Kurapati Venkatesh ; Abhijit Pandya ; Sam Hsu -- Locally recurrent networks: the gamma operator, properties, and extensions / Jose C. Principe [and others]
- Isbn
- 9780080512617
- Label
- Neural networks and pattern recognition
- Title
- Neural networks and pattern recognition
- Statement of responsibility
- edited by Omid Omidvar, Judith Dayhoff
- Language
- eng
- Cataloging source
- DLC
- Illustrations
- illustrations
- Index
- index present
- LC call number
- QA76.87
- LC item number
- .O45 1998
- Literary form
- non fiction
- Nature of contents
- bibliography
- Label
- Neural networks and pattern recognition, edited by Omid Omidvar, Judith Dayhoff
- Bibliography note
- Includes bibliographical references and index
- Carrier category
- online resource
- Carrier category code
- cr
- Carrier MARC source
- rdacarrier
- Color
- multicolored
- Content category
- text
- Content type code
- txt
- Content type MARC source
- rdacontent
- http://library.link/vocab/cover_art
- https://contentcafe2.btol.com/ContentCafe/Jacket.aspx?Return=1&Type=S&Value=9780080512617&userID=ebsco-test&password=ebsco-test
- Dimensions
- unknown
- http://library.link/vocab/discovery_link
- {'f': 'http://opac.lib.rpi.edu/record=b4169692'}
- Extent
- 1 online resource (xvi, 351 pages)
- Form of item
- online
- Isbn
- 9780080512617
- Media category
- computer
- Media MARC source
- rdamedia
- Media type code
- c
- Other physical details
- illustrations
- Specific material designation
- remote