1. If any part of the material is not fully understood during the training, attendees may re-attend later sessions of the course free of charge.
2. After the course ends, the instructor provides contact details to attendees and offers free post-course technical support, to ensure the training is effective.
3. Attendees who complete the course successfully qualify for free job-referral opportunities.
Overview of neural networks and deep learning
The concept of Machine Learning (ML)
Why do we need neural networks and deep learning?
Selecting networks for different problems and data types
Learning and validating neural networks
Comparing logistic regression to neural networks
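To make the logistic-regression comparison concrete: a minimal NumPy sketch (weights, bias and input values are illustrative, not from the course) showing that logistic regression is exactly a single neuron with a sigmoid activation.

```python
import numpy as np

def sigmoid(z):
    # Logistic activation squashes any real value into (0, 1).
    return 1.0 / (1.0 + np.exp(-z))

# Logistic regression = one neuron: a weighted sum of inputs
# followed by the logistic (sigmoid) link function.
w = np.array([0.5, -0.25])   # illustrative weights
b = 0.1                      # illustrative bias
x = np.array([2.0, 4.0])     # one input sample

p = sigmoid(np.dot(w, x) + b)   # predicted probability of the positive class
print(round(float(p), 4))       # ≈ 0.525
```

A neural network generalizes this by stacking many such neurons into layers, which is what lets it fit decision boundaries a single logistic unit cannot.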
Neural network
Biological inspirations for neural networks
Neural networks – neuron, perceptron and MLP (multilayer perceptron model)
Training an MLP – the backpropagation algorithm
Activation functions – linear, sigmoid, tanh, softmax
Loss functions for forecasting and classification
Parameters – learning rate, regularization, momentum
Building Neural Networks in Python
Evaluating performance of neural networks in Python
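The MLP and backpropagation topics above can be sketched end to end in plain NumPy. This is a minimal illustration (architecture size, learning rate and iteration count are arbitrary choices, not course material) that trains a 2-4-1 network on XOR, the classic problem a single perceptron cannot solve, and tracks the squared-error loss.

```python
import numpy as np

rng = np.random.default_rng(0)

# XOR: not linearly separable, so it needs a hidden layer.
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([[0], [1], [1], [0]], dtype=float)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# 2-4-1 architecture: 2 inputs, 4 hidden units, 1 output.
W1 = rng.normal(size=(2, 4)); b1 = np.zeros(4)
W2 = rng.normal(size=(4, 1)); b2 = np.zeros(1)

lr = 1.0
losses = []
for _ in range(3000):
    # Forward pass.
    h = sigmoid(X @ W1 + b1)       # hidden activations
    out = sigmoid(h @ W2 + b2)     # network output
    losses.append(float(np.mean((out - y) ** 2)))

    # Backward pass: propagate the error gradient through the layers.
    d_out = (out - y) * out * (1 - out)
    d_h = (d_out @ W2.T) * h * (1 - h)

    # Gradient-descent updates.
    W2 -= lr * h.T @ d_out
    b2 -= lr * d_out.sum(axis=0)
    W1 -= lr * X.T @ d_h
    b1 -= lr * d_h.sum(axis=0)

print(losses[0], losses[-1])   # loss should drop as training proceeds
```

Swapping the activation (tanh, softmax), the loss (cross-entropy for classification), or adding momentum to the updates maps directly onto the parameter topics listed above.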
Basics of Deep Networks
What is deep learning?
Architecture of Deep Networks – parameters, layers, activation functions, loss functions, solvers
Restricted Boltzmann Machines (RBMs)
Autoencoders
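As a taste of the autoencoder topic, here is a minimal sketch (data, sizes and learning rate are made up for illustration): a linear autoencoder with a 2-unit bottleneck trained by gradient descent on 5-dimensional data that actually lies in a 2-dimensional subspace, so the reconstruction error can shrink.

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy data: 100 samples in 5 dimensions that really live on a
# 2-dimensional subspace, so a 2-unit bottleneck suffices.
latent = rng.normal(size=(100, 2))
X = latent @ rng.normal(size=(2, 5))

# Linear autoencoder: encoder W_e (5 -> 2), decoder W_d (2 -> 5).
W_e = rng.normal(scale=0.1, size=(5, 2))
W_d = rng.normal(scale=0.1, size=(2, 5))

lr = 0.01
errors = []
for _ in range(500):
    code = X @ W_e               # compress to the bottleneck
    recon = code @ W_d           # reconstruct the input
    diff = recon - X
    errors.append(float(np.mean(diff ** 2)))

    # Gradients of the mean squared reconstruction error.
    g = 2 * diff / X.size
    grad_Wd = code.T @ g
    grad_We = X.T @ (g @ W_d.T)
    W_d -= lr * grad_Wd
    W_e -= lr * grad_We

print(errors[0], errors[-1])   # reconstruction error decreases
```

Real autoencoders add nonlinear activations and deeper encoder/decoder stacks, but the compress-then-reconstruct training objective is exactly this one.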
Deep Networks Architectures
Deep Belief Networks (DBN) – architecture, applications
Autoencoders
Restricted Boltzmann Machines
Convolutional Neural Networks
Recursive Neural Networks
Recurrent Neural Networks
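The defining operation of the convolutional architecture listed above is easy to show directly. A minimal sketch (image and kernel values chosen only for illustration) of the 2-D "valid" cross-correlation that a CNN layer applies:

```python
import numpy as np

def conv2d_valid(image, kernel):
    """Naive 2-D 'valid' cross-correlation, as applied by a CNN layer."""
    H, W = image.shape
    kh, kw = kernel.shape
    out = np.zeros((H - kh + 1, W - kw + 1))
    for i in range(out.shape[0]):
        for j in range(out.shape[1]):
            # Each output value is a weighted sum over one image patch:
            # the same small kernel is slid across the whole image.
            out[i, j] = np.sum(image[i:i + kh, j:j + kw] * kernel)
    return out

image = np.arange(16, dtype=float).reshape(4, 4)
kernel = np.ones((3, 3))        # a simple summing filter
print(conv2d_valid(image, kernel))
# [[45. 54.]
#  [81. 90.]]
```

Weight sharing (one small kernel reused everywhere) is what makes CNNs far cheaper than fully connected layers on images; recurrent networks achieve an analogous sharing across time steps.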
Overview of libraries and interfaces available in Python
Caffe
Theano
TensorFlow
Keras
MXNet
Choosing the appropriate library for the problem
Building deep networks in Python
Choosing an appropriate architecture for a given problem
Hybrid deep networks
Training a network – choosing an appropriate library, defining the architecture
Tuning a network – initialization, activation functions, loss functions, optimization method
Avoiding overfitting – detecting overfitting problems in deep networks, regularization
Evaluating deep networks
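The regularization topic above has a closed-form illustration. A minimal sketch (the data, feature count, and penalty strength are invented for the example) comparing plain least squares to L2-regularized (ridge) least squares, showing how the penalty shrinks the weights — the same mechanism weight decay uses in deep networks to curb overfitting:

```python
import numpy as np

rng = np.random.default_rng(2)

# Noisy linear data with 10 features but only 2 informative ones:
# a setting where an unregularized fit tends to overfit the noise.
X = rng.normal(size=(30, 10))
true_w = np.zeros(10)
true_w[:2] = [3.0, -2.0]
y = X @ true_w + rng.normal(scale=0.5, size=30)

def fit(X, y, l2=0.0):
    # Closed-form (ridge) least squares: (X^T X + l2*I)^-1 X^T y.
    d = X.shape[1]
    return np.linalg.solve(X.T @ X + l2 * np.eye(d), X.T @ y)

w_plain = fit(X, y)            # no regularization
w_ridge = fit(X, y, l2=10.0)   # L2 penalty shrinks the weights

print(np.linalg.norm(w_plain), np.linalg.norm(w_ridge))
```

Detecting overfitting in practice means watching validation loss diverge from training loss; regularization strength (here `l2`) is then tuned until the two curves stay close.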
Case studies in Python
Image recognition – CNN
Detecting anomalies with Autoencoders
Forecasting time series with RNN
Dimensionality reduction with Autoencoder
Classification with RBM
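For the time-series case study, the core of an RNN is the recurrence itself. A minimal sketch (layer sizes, weight scales and the sine-wave input are illustrative choices, not course data) of an Elman-style forward pass that folds a sequence into a sequence of hidden states:

```python
import numpy as np

rng = np.random.default_rng(3)

# Elman-style recurrent cell: the hidden state is updated from the
# current input and the previous hidden state, one time step at a time.
n_in, n_hidden = 1, 8
W_x = rng.normal(scale=0.5, size=(n_in, n_hidden))
W_h = rng.normal(scale=0.5, size=(n_hidden, n_hidden))
b = np.zeros(n_hidden)

def rnn_forward(seq):
    h = np.zeros(n_hidden)
    states = []
    for x_t in seq:
        h = np.tanh(x_t @ W_x + h @ W_h + b)   # the recurrence
        states.append(h)
    return np.array(states)

# A short sine-wave window, as in time-series forecasting.
seq = np.sin(np.linspace(0, np.pi, 10)).reshape(10, 1)
states = rnn_forward(seq)
print(states.shape)   # (10, 8): one hidden state per time step
```

A forecasting head would map the final hidden state (or every state) to the predicted next value, and backpropagation through time would train `W_x` and `W_h`.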