New Theory of Discriminant Analysis after R. Fisher : Advanced Research by the Feature Selection Method for Microarray Data

Title
New Theory of Discriminant Analysis after R. Fisher
Title remainder
Advanced Research by the Feature Selection Method for Microarray Data
Creator
Shinmura, Shuichi
Language
eng
Cataloging source
MiAaPQ
Literary form
non fiction
Nature of contents
dictionaries
Label
New Theory of Discriminant Analysis after R. Fisher : Advanced Research by the Feature Selection Method for Microarray Data
Link
http://libproxy.rpi.edu/login?url=https://ebookcentral.proquest.com/lib/rpi/detail.action?docID=4774258
Carrier category
online resource
Carrier category code
cr
Carrier MARC source
rdacarrier
Color
multicolored
Content category
text
Content type code
txt
Content type MARC source
rdacontent
Contents
  • Preface -- Acknowledgments -- Contents -- Symbols -- 1 New Theory of Discriminant Analysis -- 1.1 Introduction -- 1.1.1 Theory Theme -- 1.1.2 Five Problems -- 1.1.2.1 Problem 1 -- 1.1.2.2 Problem 2 -- 1.1.2.3 Problem 3 -- 1.1.2.4 Problem 4 -- 1.1.2.5 Problem 5 -- 1.1.2.6 Summary -- 1.2 Motivation for Our Research -- 1.2.1 Contribution by Fisher -- 1.2.2 Defect of Fisher's Assumption for Medical Diagnosis -- 1.2.3 Research Outlook -- 1.2.4 Method 1 and Problem 4 -- 1.3 Discriminant Functions -- 1.3.1 Statistical Discriminant Functions -- 1.3.2 Before and After SVM -- 1.3.3 IP-OLDF and Four New Facts of Discriminant Analysis -- 1.3.4 Revised IP-OLDF, Revised LP-OLDF, and Revised IPLP-OLDF -- 1.4 Unresolved Problem (Problem 1) -- 1.4.1 Perception Gap of Problem 1 -- 1.4.2 Student Data -- 1.5 LSD Discrimination (Problem 2) -- 1.5.1 Importance of This Problem -- 1.5.2 Pass/Fail Determination -- 1.5.3 Discrimination by Four Testlets -- 1.6 Generalized Inverse Matrices (Problem 3) -- 1.7 K-Fold Cross-Validation (Problem 4) -- 1.7.1 100-Fold Cross-Validation -- 1.7.2 LOO and K-Fold Cross-Validation -- 1.8 Matroska Feature-Selection Method (Problem 5) -- 1.9 Summary -- References -- 2 Iris Data and Fisher's Assumption -- 2.1 Introduction -- 2.1.1 Evaluation of Iris Data -- 2.1.2 100-Fold Cross-Validation for Small Sample (Method 1) -- 2.2 Iris Data -- 2.2.1 Data Outlook -- 2.2.2 Model Selection by Regression Analysis -- 2.3 Comparison of Seven LDFs -- 2.3.1 Comparison of MNM and Eight NMs -- 2.3.2 Comparison of Seven Discriminant Coefficient -- 2.3.3 LINGO Program 1: Six MP-Based LDFs for Original Data -- 2.4 100-Fold Cross-Validation for Small Sample Method (Method 1) -- 2.4.1 Four Trials to Obtain Validation Sample -- 2.4.1.1 Generate Training and Validation Samples by Random Number -- 2.4.1.2 20,000 Normal Random Sampling
  • 2.4.1.3 20,000 Resampling Samples -- 2.4.1.4 K-Fold Cross-Validation for Small Sample Method -- 2.4.2 Best Model Comparison -- 2.4.3 Comparison of Discriminant Coefficient -- 2.5 Summary -- References -- 3 Cephalo-Pelvic Disproportion Data with Collinearities -- 3.1 Introduction -- 3.2 CPD Data -- 3.2.1 Collinearities -- 3.2.2 How to Find Linear Relationships in Collinearities -- 3.2.3 Comparison Between MNM and Eight NMs -- 3.2.4 Comparison of 95 % CI of Discriminant Coefficient -- 3.3 100-Fold Cross-Validation -- 3.3.1 Best Model -- 3.3.2 95 % CI of Discriminant Coefficient -- 3.4 Trial to Remove Collinearity -- 3.4.1 Examination by PCA (Alternative 2) -- 3.4.2 Third Alternative Approach -- 3.5 Summary -- References -- 4 Student Data and Problem 1 -- 4.1 Introduction -- 4.2 Student Data -- 4.2.1 Data Outlook -- 4.2.2 Different LDFs -- 4.2.3 Comparison of Seven LDFs -- 4.2.4 K-Best Option -- 4.2.5 Evaluation by Regression Analysis -- 4.3 100-Fold Cross-Validation of Student Data -- 4.3.1 Best Model -- 4.3.2 Comparison of Coefficients by LINGO Program 1 and Program 2 -- 4.4 Student Linearly Separable Data -- 4.4.1 Comparison of MNM and Nine "Diff1s" -- 4.4.2 Best Model -- 4.4.3 95 % CI of Discriminant Coefficient -- 4.5 Summary -- References -- 5 Pass/Fail Determination Using Examination Scores -- 5.1 Introduction -- 5.2 Pass/Fail Determination Using Examination Scores Data in 2012 -- 5.3 Pass/Fail Determination by Examination Scores (50 % Level in 2012) -- 5.3.1 MNM and Nine NMs -- 5.3.2 Error Rate Means (M1 and M2) -- 5.3.3 95 % CI of Discriminant Coefficients -- 5.4 Pass/Fail Determination by Examination Scores (90 % Level in 2012) -- 5.4.1 MNM and Nine NMs -- 5.4.2 Error Rate Means (M1 and M2) -- 5.4.3 95 % CI of Discriminant Coefficient -- 5.5 Pass/Fail Determination by Examination Scores (10 % Level in 2012) -- 5.5.1 MNM and Nine NMs
  • 5.5.2 Error Rate Means (M1 and M2) -- 5.5.3 95 % CI of Discriminant Coefficients -- 5.6 Summary -- References -- 6 Best Model for Swiss Banknote Data -- 6.1 Introduction -- 6.2 Swiss Banknote Data -- 6.2.1 Data Outlook -- 6.2.2 Comparison of Seven LDF for Original Data -- 6.3 100-Fold Cross-Validation for Small Sample Method -- 6.3.1 Best Model Comparison -- 6.3.2 95 % CI of Discriminant Coefficient -- 6.3.2.1 Consideration of 27 Models -- 6.3.2.2 Revised IP-OLDF -- 6.3.2.3 Hard-Margin SVM (H-SVM) and Other LDFs -- 6.4 Explanation 1 for Swiss Banknote Data -- 6.4.1 Matroska in Linearly Separable Data -- 6.4.2 Explanation 1 of Method 2 by Swiss Banknote Data -- 6.5 Summary -- References -- 7 Japanese-Automobile Data -- 7.1 Introduction -- 7.2 Japanese-Automobile Data -- 7.2.1 Data Outlook -- 7.2.2 Comparison of Nine Discriminant Functions for Non-LSD -- 7.2.3 Consideration of Statistical Analysis -- 7.3 100-Fold Cross-Validation (Method 1) -- 7.3.1 Comparison of Best Model -- 7.3.2 95 % CI of Coefficients by Six MP-Based LDFs -- 7.3.2.1 Revised IP-OLDF Versus H-SVM -- 7.3.2.2 Revised IPLP-OLDF, Revised LP-OLDF, and other LDFs -- 7.3.3 95 % CI of Coefficients by Fisher's LDF and Logistic Regression -- 7.4 Matroska Feature-Selection Method (Method 2) -- 7.4.1 Feature-Selection by Revised IP-OLDF -- 7.4.2 Coefficient of H-SVM and SVM4 -- 7.5 Summary -- References -- Bibliography -- 8 Matroska Feature-Selection Method for Microarray Dataset (Method 2) -- 8.1 Introduction -- 8.2 Matroska Feature-Selection Method (Method 2) -- 8.2.1 Short Story to Establish Method 2 -- 8.2.2 Explanation of Method 2 by Alon et al. Dataset -- 8.2.2.1 Feature-Selection by Eight LDFs -- 8.2.2.2 Results of Alon et al. Dataset Using the LINGO Program -- 8.2.3 Summary of Six Microarray Datasets in 2016 -- 8.2.4 Summary of Six Datasets in 2015
  • 8.3 Results of the Golub et al. Dataset -- 8.3.1 Outlook of Method 2 by the LINGO Program 3 -- 8.3.2 First Trial to Find the Basic Gene Sets -- 8.3.3 Another BGS in the Fifth SM -- 8.4 How to Analyze the First BGS -- 8.5 Statistical Analysis of SM1 -- 8.5.1 One-Way ANOVA -- 8.5.2 Cluster Analysis -- 8.5.3 PCA -- 8.6 Summary -- References -- 9 LINGO Program 2 of Method 1 -- 9.1 Introduction -- 9.2 Natural (Mathematical) Notation by LINGO -- 9.3 Iris Data in Excel -- 9.4 Six LDFs by LINGO -- 9.5 Discrimination of Iris Data by LINGO -- 9.6 How to Generate Resampling Samples and Prepare Data in Excel File -- 9.7 Set Model by LINGO -- Index
Dimensions
unknown
Discovery link
http://opac.lib.rpi.edu/record=b4387198
Extent
1 online resource (221 pages)
Form of item
online
ISBN
9789811021640
Media category
computer
Media MARC source
rdamedia
Media type code
c
Sound
unknown sound
Specific material designation
remote

Library Locations

    • Folsom Library
      110 8th St, Troy, NY, 12180, US
      42.729766 -73.682577