
Neural Networks

Course Specification
Program on which the course is given: Computer Science
Department offering the program: Computer Science
Department offering the course: Computer Science
Academic year/level: 2010/2011 – Third-year students
Date of specification approval:

A- Basic Information

Title: Neural Networks
Lecture: Three Hours/Week
Practical: Three Hours/Week
Total: Six Hours/Week
Code: CS361

B- Professional Information

1- Overall Aims of Course:

The course introduces the theory and practice of neural computation. It presents
the principles of neuro-computing with artificial neural networks, which are widely
used to address real-world problems such as classification, regression, pattern
recognition, data mining, and time-series prediction. Knowledge and tools for the
specification, design, and practical implementation of ANNs are also provided.

2- Intended Learning Outcomes of Course:


a) Knowledge and Understanding:
The course aims to give the student:
a1- A good understanding of artificial neural networks and their practical
applications.
a2- An understanding of the fundamentals of neural networks.

b) Intellectual Skills:
At the end of the course, the student will know:
b1- How to think about simulating the human brain with an artificial neural
network.
b2- How to think about building supervised and unsupervised neural networks
for simple applications.

c) Professional and Practical Skills:


At the end of the course, the student will be able to:
c1- Build a simple neural network with MATLAB and perform simple training of
the network on a small dataset.
c2- Work with the activation function and the weight matrix of a given neural
network (see the sketch below).
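
As a rough illustration of c1 and c2, the following is a minimal sketch in plain MATLAB (no toolboxes assumed): a single threshold neuron whose weights, bias, and activation function are written out explicitly and trained with the perceptron learning rule, an algorithm covered under the "Perceptron and Adaline" topic below. The AND dataset, learning rate, and number of epochs are illustrative assumptions, not requirements of the course.

    % Minimal sketch (plain MATLAB, no toolboxes): one threshold neuron
    % trained with the perceptron learning rule on the AND problem.
    % The dataset, learning rate and epoch count are illustrative choices.

    X = [0 0; 0 1; 1 0; 1 1];    % small dataset: four input patterns (rows)
    t = [0; 0; 0; 1];            % desired outputs (logical AND)

    w   = zeros(2, 1);           % weight vector (one weight per input)
    b   = 0;                     % bias
    eta = 0.1;                   % learning rate

    for epoch = 1:20                             % a few passes over the data
        for p = 1:size(X, 1)
            y   = double(X(p, :) * w + b > 0);   % threshold activation
            err = t(p) - y;                      % 0, +1 or -1
            w   = w + eta * err * X(p, :)';      % perceptron weight update
            b   = b + eta * err;                 % bias update
        end
    end

    disp([t double(X * w + b > 0)])              % targets vs. network outputs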

d) General and Transferable Skills:


At the end of the course, the student will have:
d1- The ability to use neural networks in applications such as pattern
recognition and classification.
d2- The ability to adapt the weight matrix of a given neural network
during training on a small dataset (see the sketch below).
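
As a rough illustration of d2, the sketch below (again plain MATLAB, with an invented toy dataset) adapts the weight matrix of a single linear unit with the delta (LMS) rule over several passes through a small dataset, printing the weights as they change. The linear target rule, learning rate, and epoch count are assumptions made for the example only.

    % Minimal sketch (plain MATLAB): a single linear unit (Adaline) whose
    % weight matrix W is adapted with the delta (LMS) rule on a small dataset.
    % The data, target rule, learning rate and epoch count are assumptions
    % made for illustration only.

    X = [1 2; 2 1; 3 4; 4 3; 5 6];    % small dataset: five patterns, two inputs
    t = X * [0.5; -0.2] + 1;          % targets generated by a known linear rule

    W   = zeros(2, 1);                % weight matrix (here a 2x1 vector)
    b   = 0;                          % bias
    eta = 0.02;                       % learning rate

    for epoch = 1:50
        for p = 1:size(X, 1)
            y   = X(p, :) * W + b;             % linear activation
            err = t(p) - y;                    % error for this pattern
            W   = W + eta * err * X(p, :)';    % delta-rule weight update
            b   = b + eta * err;
        end
        if mod(epoch, 10) == 0                 % watch the weights adapt
            fprintf('epoch %2d: W = [%6.3f %6.3f], b = %6.3f\n', epoch, W(1), W(2), b);
        end
    end
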
e) Attitudes:
At the end of the course, the students are expected to:
e1- Have a positive attitude towards the aim of the course.
e2- Enjoy analyzing neural networks with software tools and packages.
e3- Be satisfied with the important points of the course content.

3- Course Content:

For each topic, the total number of contact hours is given, split into lecture and tutorial hours, together with the ILOs it addresses.

Fundamentals (6 hrs: 3 lecture, 3 tutorial; ILOs: a1, b2)
- Introduction
- A framework for distributed representation
- Processing units
- Connections between units
- Activation and output rules
- Network topologies
- Training of artificial neural networks
- Paradigms of learning
- Modifying patterns of connectivity
- Notation and terminology
- Notation
- Terminology

Perceptron and Adaline (6 hrs: 3 lecture, 3 tutorial; ILOs: a2, b1)
- Networks with threshold activation functions
- Perceptron learning rule and convergence theorem
- Example of the Perceptron learning rule
- Convergence theorem
- The original Perceptron
- The adaptive linear element (Adaline)
- Networks with linear activation functions: the delta rule
- Exclusive-OR problem
- Multi-layer perceptrons can do everything

Back-Propagation (12 hrs: 6 lecture, 6 tutorial; ILOs: d2)
- Multi-layer feed-forward networks
- The generalized delta rule
- Understanding back-propagation
- Working with back-propagation
- An example
- Other activation functions
- Deficiencies of back-propagation
- Advanced algorithms
- How good are multi-layer feed-forward networks?
- The effect of the number of learning samples
- The effect of the number of hidden units
- Applications

Recurrent Networks (6 hrs: 3 lecture, 3 tutorial; ILOs: d1)
- The generalized delta rule in recurrent networks
- The Jordan network
- The Elman network
- Back-propagation in fully recurrent networks
- The Hopfield network
- Description
- Hopfield network as associative memory
- Neurons with graded response
- Boltzmann machines

Self-Organizing Networks (12 hrs: 6 lecture, 6 tutorial; ILOs: c2)
- Competitive learning
- Clustering
- Vector quantization
- Kohonen network
- Principal component networks
- Introduction
- Normalized Hebbian rule
- Principal component extractor
- More eigenvectors
- Adaptive resonance theory
- Background: Adaptive resonance theory
- ART1: The simplified neural network model
- ART1: The original model

Reinforcement Learning (6 hrs: 3 lecture, 3 tutorial; ILOs: c1)
- The critic
- The controller network
- Barto's approach: the ASE-ACE combination
- Associative search
- Adaptive critic
- The cart-pole system
- Reinforcement learning versus optimal control

Robot Control (6 hrs: 3 lecture, 3 tutorial; ILOs: d1)
- End-effector positioning
- Camera-robot coordination is function approximation
- Robot arm dynamics
- Mobile robots
- Model-based navigation
- Sensor-based control

Vision (12 hrs: 6 lecture, 6 tutorial; ILOs: c1)
- Introduction
- Feed-forward types of networks
- Self-organizing networks for image compression
- Back-propagation
- Linear networks
- Principal components as features
- The cognitron and neocognitron
- Description of the cells
- Structure of the cognitron
- Simulation results
- Relaxation types of networks
- Depth from stereo
- Image restoration and image segmentation
- Silicon retina

General Purpose Hardware (6 hrs: 3 lecture, 3 tutorial; ILOs: c2)
- The Connection Machine
- Architecture
- Applicability to neural networks
- Systolic arrays

Dedicated Neuro-Hardware
- General issues
- Connectivity constraints
- Analogue vs. digital
- Optics
- Learning vs. non-learning
- Implementation examples
- Carver Mead's silicon retina
- LEP's LNeuro chip

4- Teaching and Learning Methods:


- Lectures
- Tutorials
- Class discussions

5- Assessment:
a) Student Assessment Methods:

- Assignments
- Midterm written exam
- Oral exam
- Practical exam
- Final written exam

b) Assessment Schedule and Weighting:

- Four assignments, at a rate of one assignment every two weeks (8%)
- One written midterm exam in the sixth week of the semester (8%)
- One oral and practical exam at the end of the semester (17%)
- Final written exam (67%)

6- List of Recommended Textbooks:


- Principe, Euliano, and Lefebvre, "Neural and Adaptive Systems: Fundamentals
through Simulations", John Wiley and Sons, ISBN: 0471351679.
- Christopher M. Bishop, "Neural Networks for Pattern Recognition", Oxford
University Press, 1st edition, 1996, ISBN-10: 0198538642.

7- Facilities Required for Teaching and Learning:


a) Vital Facilities:
- Computer lab supported by MATLAB software.
- Data show device.

b) Lecturing Facilities:
- Overhead Projector, Data show device.

Course Lecturer/Coordinator:

Head of the Department: Prof. Dr. Hamed Nassar.
