January 14th, 2021 10:58 AM
vikash
Sathyabama Institute of Science and Technology BE CSE SCSA3015 Deep Learning Syllabus

SATHYABAMA INSTITUTE OF SCIENCE AND TECHNOLOGY
SCHOOL OF COMPUTING

SCSA3015 DEEP LEARNING
L : 3   T : 0   P : 0   Credits : 3   Total Marks : 100

UNIT 1 INTRODUCTION 9 Hrs.
Introduction to machine learning - Linear models (SVMs and Perceptrons, logistic regression) - Introduction to Neural Nets: what a shallow network computes - Training a network: loss functions, backpropagation and stochastic gradient descent - Neural networks as universal function approximators.
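
For readers new to the Unit 1 topics, the sketch below shows what a shallow network computes and how it is trained with a loss function, backpropagation and stochastic gradient descent. It is an illustrative example only, not part of the official syllabus: it assumes plain numpy, and the toy data, layer sizes and learning rate are arbitrary choices.
[CODE]
# Minimal sketch: a one-hidden-layer network trained with binary cross-entropy,
# backpropagation and per-sample stochastic gradient descent (numpy only).
import numpy as np

rng = np.random.default_rng(0)

# Toy data: two Gaussian blobs with labels 0 and 1.
X = np.vstack([rng.normal(-1, 1, (100, 2)), rng.normal(1, 1, (100, 2))])
y = np.concatenate([np.zeros(100), np.ones(100)])

# Shallow network: 2 inputs -> 8 hidden units (tanh) -> 1 sigmoid output.
W1, b1 = rng.normal(0, 0.5, (2, 8)), np.zeros(8)
W2, b2 = rng.normal(0, 0.5, (8, 1)), np.zeros(1)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

lr = 0.1
for epoch in range(200):
    for i in rng.permutation(len(X)):   # SGD: one randomly chosen sample at a time
        x, t = X[i:i+1], y[i]
        h = np.tanh(x @ W1 + b1)        # hidden activations
        p = sigmoid(h @ W2 + b2)        # predicted probability of class 1
        # Backpropagation of the cross-entropy loss: dL/dz = p - t at the output.
        dlogit = p - t
        dW2, db2 = h.T @ dlogit, dlogit.ravel()
        dh = dlogit @ W2.T * (1 - h**2) # gradient through the tanh hidden layer
        dW1, db1 = x.T @ dh, dh.ravel()
        for param, grad in ((W1, dW1), (b1, db1), (W2, dW2), (b2, db2)):
            param -= lr * grad          # SGD parameter update

pred = sigmoid(np.tanh(X @ W1 + b1) @ W2 + b2).ravel() > 0.5
print("training accuracy:", (pred == y).mean())
[/CODE]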

UNIT 2 DEEP NETWORKS 9 Hrs.
History of Deep Learning - A Probabilistic Theory of Deep Learning - Backpropagation and regularization, batch normalization - VC Dimension and Neural Nets - Deep vs. Shallow Networks - Convolutional Networks - Generative Adversarial Networks (GANs) - Semi-supervised Learning.
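
As an illustration of the generative adversarial network (GAN) topic in Unit 2, the sketch below performs one discriminator update and one generator update on toy 2-D data. It assumes PyTorch is available; the network sizes, learning rates and data distribution are illustrative assumptions rather than anything prescribed by the syllabus.
[CODE]
# Minimal sketch: one GAN training step (discriminator update, then generator update).
import torch
import torch.nn as nn

G = nn.Sequential(nn.Linear(8, 32), nn.ReLU(), nn.Linear(32, 2))   # generator: noise -> sample
D = nn.Sequential(nn.Linear(2, 32), nn.ReLU(), nn.Linear(32, 1))   # discriminator: sample -> logit
opt_g = torch.optim.Adam(G.parameters(), lr=1e-3)
opt_d = torch.optim.Adam(D.parameters(), lr=1e-3)
bce = nn.BCEWithLogitsLoss()

real = torch.randn(64, 2) + torch.tensor([2.0, 2.0])   # "real" data: a shifted Gaussian
noise = torch.randn(64, 8)

# Discriminator step: real samples labelled 1, generated samples labelled 0.
fake = G(noise).detach()                                # detach so only D is updated here
d_loss = bce(D(real), torch.ones(64, 1)) + bce(D(fake), torch.zeros(64, 1))
opt_d.zero_grad(); d_loss.backward(); opt_d.step()

# Generator step: push the discriminator towards labelling generated samples as 1.
g_loss = bce(D(G(noise)), torch.ones(64, 1))
opt_g.zero_grad(); g_loss.backward(); opt_g.step()

print(float(d_loss), float(g_loss))
[/CODE]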

UNIT 3 DIMENSIONALITY REDUCTION 9 Hrs.
Linear methods (PCA, LDA) and manifolds, metric learning - Autoencoders and dimensionality reduction in networks - Introduction to ConvNets - Architectures: AlexNet, VGG, Inception, ResNet - Training a ConvNet: weight initialization, batch normalization, hyperparameter optimization.
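
A minimal numpy-only sketch of the linear dimensionality reduction (PCA) topic in Unit 3: centre the data, read the principal directions off the singular value decomposition, and project onto the top components. An autoencoder performs the analogous reduction with a learned nonlinear encoder and decoder. The toy data and the choice of two components are assumptions made for illustration.
[CODE]
# Minimal sketch: PCA via the singular value decomposition (numpy only).
import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(500, 10))            # 500 samples, 10 features
Xc = X - X.mean(axis=0)                   # centre each feature

# Right singular vectors of the centred data are the principal directions.
U, S, Vt = np.linalg.svd(Xc, full_matrices=False)
k = 2
Z = Xc @ Vt[:k].T                         # project onto the top-k principal components

explained = (S[:k] ** 2).sum() / (S ** 2).sum()
print(Z.shape, f"variance explained: {explained:.2%}")
[/CODE]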

UNIT 4 OPTIMIZATION AND GENERALIZATION 9 Hrs.
Optimization in deep learning - Non-convex optimization for deep networks - Stochastic Optimization - Generalization in neural networks - Spatial Transformer Networks - Recurrent networks, LSTM - Recurrent Neural Network Language Models - Word-Level RNNs & Deep Reinforcement Learning - Computational & Artificial Neuroscience.
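
To make the recurrent neural network language model topic of Unit 4 concrete, the sketch below builds a word-level LSTM language model in PyTorch and computes a next-word cross-entropy loss on a random batch. The vocabulary size, embedding size and the dummy token batch are illustrative assumptions.
[CODE]
# Minimal sketch: a word-level LSTM language model with a next-word prediction loss.
import torch
import torch.nn as nn

class WordRNNLM(nn.Module):
    def __init__(self, vocab_size=1000, emb_dim=64, hidden_dim=128):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, emb_dim)
        self.lstm = nn.LSTM(emb_dim, hidden_dim, batch_first=True)
        self.out = nn.Linear(hidden_dim, vocab_size)

    def forward(self, tokens):                 # tokens: (batch, seq_len) word ids
        h, _ = self.lstm(self.embed(tokens))   # h: (batch, seq_len, hidden_dim)
        return self.out(h)                     # logits over the vocabulary at each step

model = WordRNNLM()
tokens = torch.randint(0, 1000, (4, 20))       # dummy batch: 4 sequences of 20 word ids
logits = model(tokens)                         # shape (4, 20, 1000)

# Language-model loss: predict token t+1 from the hidden state at position t.
loss = nn.functional.cross_entropy(
    logits[:, :-1].reshape(-1, 1000), tokens[:, 1:].reshape(-1))
print(logits.shape, float(loss))
[/CODE]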

UNIT 5 CASE STUDY AND APPLICATIONS 9 Hrs.
ImageNet - Detection - Audio: WaveNet - Natural Language Processing: Word2Vec - Joint Detection - Bioinformatics - Face Recognition - Scene Understanding - Generating Image Captions.
Max. 45 Hrs.
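
As a small, concrete example of the Word2Vec topic in Unit 5, the sketch below trains skip-gram embeddings on a toy corpus with the gensim library (assumed to be installed; v4.x API) and queries nearest neighbours. The corpus and hyperparameters are illustrative only.
[CODE]
# Minimal sketch: Word2Vec (skip-gram) embeddings on a toy corpus using gensim.
from gensim.models import Word2Vec

corpus = [
    ["deep", "learning", "uses", "neural", "networks"],
    ["word2vec", "learns", "word", "embeddings"],
    ["neural", "networks", "learn", "representations"],
]

model = Word2Vec(sentences=corpus, vector_size=50, window=2,
                 min_count=1, sg=1, epochs=50, seed=0)

vec = model.wv["neural"]                        # 50-dimensional word embedding
print(vec.shape)
print(model.wv.most_similar("neural", topn=3))  # nearest neighbours by cosine similarity
[/CODE]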

COURSE OUTCOMES
On completion of the course, the student will be able to:
CO1 - Understand the basics of deep learning.
CO2 - Implement various deep learning models.
CO3 - Realign high-dimensional data using dimensionality reduction techniques.
CO4 - Analyze optimization and generalization in deep learning.
CO5 - Explore deep learning applications.
CO6 - Design and create data models.

TEXT / REFERENCE BOOKS
1. Cosma Rohilla Shalizi, Advanced Data Analysis from an Elementary Point of View, 2015.
2. Deng and Yu, Deep Learning: Methods and Applications, Now Publishers, 2013.
3. Ian Goodfellow, Yoshua Bengio, Aaron Courville, Deep Learning, MIT Press, 2016.
4. Michael Nielsen, Neural Networks and Deep Learning, Determination Press, 2015.

END SEMESTER EXAMINATION QUESTION PAPER PATTERN
Max. Marks : 100    Exam Duration : 3 Hrs.
PART A : 10 questions of 2 marks each, no choice : 20 Marks
PART B : 2 questions from each unit with internal choice, each carrying 16 marks : 80 Marks
