Essentials of Deep Learning and AI
Language: English
ISBN: 9789391030353
Cover Page
Title Page
Copyright Page
Foreword
Dedication Page
About the Authors
About the Reviewer
Acknowledgement
Preface
Errata
Table of Contents
1. Introduction
Structure
Objectives
1.1 Artificial intelligence
1.1.1 What is Artificial Intelligence?
1.1.2 Definitions of Artificial Intelligence
1.1.3 Applications of Artificial Intelligence
1.1.4 Industry domains and sectors along with sample use cases
1.1.5 Broad classification: what are AI, ML, FL, and DL?
1.2 Machine learning
1.2.1 History and definition of ML
1.2.2 Machine learning and its applications
1.2.3 Classification of ML algorithms
1.3 Deep Learning
1.3.1 What are the prerequisites to understand deep learning?
1.3.2 Difference between machine learning and deep learning
1.3.3 Applications of deep learning
1.4 Tools and frameworks for AI, ML and DL
1.5 Languages used for AI, ML, and DL
1.6 Sample datasets for AI, ML, and DL development
Conclusion
Points to remember
Questions
Multiple choice questions
Answers
2. Supervised Machine Learning
Structure
Objectives
2.1 Introduction to Supervised Machine Learning
2.2 Data Cleanup
2.3 Data preparation
2.4 Classification and regression
2.5 Architecture and realization of algorithms
2.5.1 Linear Regression
2.5.2 Support vector machine (SVM)
2.5.3 Decision trees
2.5.4 Random forest
2.6 Performance statistics
2.6.1 Performance metrics of regression problems
2.6.2 Performance metrics of classification problems
2.7 Optimization and loss methods
2.8 Use cases and examples
2.9 Conclusion
Points to remember
Questions
Multiple choice questions
Answers
3. System Analysis with Machine Learning/Unsupervised Learning
Structure
Objectives
3.1 Introduction and architecture of unsupervised machine learning
3.2 Data preparation methods and steps
3.2.1 Data preprocessing and scaling
3.2.1.1 StandardScaler
Code sample
3.2.1.2 MinMaxScaler
3.2.1.3 RobustScaler
3.2.1.4 Normalizer
3.3 Clustering techniques
3.3.1 K-Means
3.3.2 Hierarchical clustering
3.3.3 Density-based spatial clustering of applications with noise (DBSCAN)
3.4 Other algorithms and methodologies (dimensionality reduction techniques)
3.4.1 t-Distributed stochastic neighbor embedding (t-SNE)
3.4.2 Principal Component Analysis (PCA)
3.4.2.1 Decomposition incremental PCA
3.4.2.2 Decomposition kernel PCA
3.4.2.3 Decomposition MiniBatchSparse PCA
3.4.2.4 Decomposition PCA
3.4.2.5 Decomposition sparse PCA
3.4.3 Singular value decomposition (SVD)
3.4.4 Independent component analysis (ICA)
3.4.5 Dictionary learning
3.5 Error minimization
3.5.1 Distance computing
3.5.2 Threshold limit
3.5.3 Log loss
3.5.4 Euclidean distance
3.5.5 Examples and samples
3.6 Conclusion
Points to remember
Questions
Multiple choice questions
Answers
4. Feature Engineering
Why is feature engineering needed?
Structure
Objectives
4.1 Introducing feature engineering
4.2 What is Feature selection?
4.2.1 Baselining model
4.2.2 Categorical encodings
Nominal encoding types
4.2.2.1 One-hot encoding
4.2.2.2 Mean encoding
4.2.2.3 One-hot encoding with many categories
Ordinal encoding types
4.2.2.4 Label encoding
4.2.2.5 Target guided ordinal encoding
4.2.2.6 Frequency encoding
4.2.2.7 Other Encoding techniques
Hash encoder
Effect encoding
Dummy encoding
Binary encoding
4.2.3 Feature generation
4.2.4 Feature Selection
Correlation
Feature importance methods
Univariate feature selection
4.2.5 Collecting and refining data
4.2.6 Cleaning and organizing data
4.2.7 Data preparation
4.2.8 Mining data for pattern selection
4.3 Other popular techniques of feature engineering
4.3.1 Imputation
4.3.2 Handling outliers
4.3.3 Binning
4.3.4 One-hot encoding
4.3.5 Feature split
4.3.6 Scaling
4.4 Examples and samples
Conclusion
Points to remember
Questions
Multiple choice questions
Answers
5. Classification, Clustering, Association Rules, and Regression
Structure
Objectives
5.1 Introduction
5.2 Classification Techniques
5.3 One-Class Classification
5.4 Zero-Shot Learning
5.5 One-Shot, Few-Shot, K-Shot, or N-Shot Learning
5.6 Clustering techniques
5.7 Distribution-Based Clustering
5.7.1 Density-Based Clustering
5.7.2 Fuzzy Clustering
5.7.3 Grid-Based Clustering
5.8 Association Rules Techniques
5.9 Regression Techniques
5.10 Logistic Regression
5.11 Ridge Regression
5.12 Lasso Regression
5.13 ElasticNet Regression
5.14 Factors for selecting the right regression model
5.15 Use Cases and Examples
Conclusion
Points to remember
Questions
Multiple Choice Questions
Answers
6. Time Series Analysis
Structure
Objectives
6.1 Introduction to time series
6.2 Various types of time series
6.3 Univariate and multivariate time series models
6.4 Time domain and frequency domain time series models
6.5 Linear and non-linear time series models
6.6 Time series models
6.6.1 Autoregression (AR)
6.6.2 Moving average (MA)
6.6.3 Autoregression Moving Average (ARMA)
6.6.4 Autoregressive Integrated Moving Average (ARIMA)
6.6.5 Vector Autoregression (VAR)
6.6.6 Vector Autoregression Moving Average (VARMA)
6.6.7 Fourier Transforms (FT)
6.7 Examples and samples
Conclusion
Points to remember
Questions
Multiple choice questions
Answers
7. Data Cleanup, Characteristics and Feature Selection
Structure
Objectives
7.1 Introduction
7.2 Data formatting
7.3 Normalization
Min-Max normalization
Z-Score Normalization or Standardization
Box-Cox Transformation
Decimal Scaling
7.4 Model Training and Test Splitting
7.5 Bias and Variance Trade-off
7.6 Model Overfitting and Underfitting
7.7 Cross Validation
7.8 Feature Reduction Techniques
7.9 Use Cases and Examples
Conclusion
Points to remember
Questions
Multiple Choice Questions
Answers
8. Ensemble Model Development
Structure
Objectives
8.1 Introduction
8.2 Ensemble methods
8.2.1 Popular ensemble methods
8.2.1.1 Sequential ensemble methods
8.2.1.2 Parallel ensemble methods
8.2.1.3 Averaging ensemble methods
8.2.1.4 Boosting ensemble methods
8.3 Combining weak learners
How to combine weak learners?
8.4 Advanced ensemble model building tips
8.5 Hyperparameter tuning
8.6 Genetic algorithm-based tuning
8.7 Use cases or examples
Conclusion
Points to remember
Questions
Multiple choice questions
Answers
9. Design with Deep Learning
Structure
Objectives
9.1 Introduction
9.2 Architecture of CNN
9.3 AlexNet
9.4 Training a CNN network
9.5 Latest trends and algorithms in CNN
9.6 Use cases or examples
Conclusion
Points to remember
Questions
Multiple choice questions
Answers
10. Design with Multilayer Perceptron (MLP)
Structure
Objectives
10.1 Introduction
10.2 Components of MLP
10.3 Architecture of MLP
10.4 Training mechanisms in MLP
10.5 Latest trends in MLP
10.5.1 Knowledge distillation
10.6 Use cases or examples on how to build an MLP with various frameworks
Conclusion
Points to remember
Questions
Multiple choice questions
Answers
11. Long Short-Term Memory Networks
Structure
Objectives
11.1 Introduction
11.2 Recurrent neural networks (RNN)
11.3 Gated recurrent units (GRU)
11.4 Architectures of Long Short-Term Memory Networks (LSTM)
11.4.1 Bi-directional LSTM
11.4.2 Attention-based LSTM
11.5 Training mechanisms of LSTM
11.6 Use cases and examples
11.7 NLP using LSTM and advancements
Conclusion
Points to remember
Questions
Multiple choice questions
Answers
12. Autoencoders
Structure
Objectives
12.1 Introduction to autoencoders and simple architecture
12.2 Undercomplete autoencoders
12.3 Overcomplete autoencoders
12.4 Denoising autoencoders
12.5 Sparse autoencoders
12.6 Stacked autoencoders
12.7 Variational autoencoders (VAEs)
12.8 Other autoencoders
12.9 Examples and samples
Conclusion
Points to remember
Questions
Multiple choice questions
Answers
13. Applications of Machine Learning and Deep Learning
Structure
Objectives
13.1 Introduction
13.2 Domain-specific applications
13.2.1 Telecommunications
13.3 Technology-specific applications
13.3.1 The working mechanism
13.4 Device-specific applications
13.4.1 Feedback predictor model
13.5 Platform-specific applications
13.6 Solution-specific applications
13.6.1 Contextual advice
13.6.2 Context-based search
Conclusion
Points to remember
Questions
Multiple choice questions
Answers
14. Emerging and Future Technologies
Structure
Objectives
14.1 Introduction to next-generation technologies
14.2 Internet of Things
14.2.1 IoT issues
14.3 Cloud computing
14.4 5G Networks
14.4.1 Resource management in 5G
14.4.2 Prediction-based architecture
14.4.3 Resource allocation
14.5 Quantum computers
14.6 Conversation systems
14.7 Neuromorphic computing
14.8 Deep reinforcement learning
Conclusion
Points to remember
Questions
Multiple choice questions
Answers
Index