SET 1

Code No: 37011


JAWAHARLAL NEHRU TECHNOLOGICAL UNIVERSITY HYDERABAD
R05 IV B.Tech. I Semester Supplementary Exams, May/June 2009
NEURAL NETWORKS AND FUZZY LOGIC
(Common to EEE, E.CON.E, MEP, AE, ICE & AME)
Time: 3 hours                                    Max Marks: 80
Answer any FIVE questions. All questions carry equal marks.
-----

1. Give a brief account of neural networks. Also, explain what supervised and unsupervised learning are, with examples. [16]

2. Discuss how a recurrent neural network differs from a feedforward neural network. [16]
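A minimal sketch of the contrast, assuming NumPy and arbitrary layer sizes: a feedforward output depends only on the current input, while a recurrent output also depends on a hidden state fed back from earlier steps.

import numpy as np

rng = np.random.default_rng(0)
W_in = rng.normal(size=(3, 4))                 # input weights (3 units, 4 inputs)
W_rec = rng.normal(size=(3, 3))                # recurrent (feedback) weights

def feedforward(x):
    return np.tanh(W_in @ x)                   # output depends on x alone

def recurrent(x, h):
    return np.tanh(W_in @ x + W_rec @ h)       # feedback loop adds memory

h = np.zeros(3)
for x in rng.normal(size=(5, 4)):              # a sequence of five inputs
    y_ff = feedforward(x)                      # same x always gives the same y_ff
    h = recurrent(x, h)                        # h depends on the whole history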

3. Explain the classification model, features and decision regions in a single-layer perceptron. [16]
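A minimal sketch of a single-layer perceptron acting as a dichotomizer, with hypothetical weights: the decision surface w.x + b = 0 splits the pattern space into two decision regions.

import numpy as np

w, b = np.array([1.0, -2.0]), 0.5              # hypothetical weights and bias

def classify(x):
    return 1 if np.dot(w, x) + b >= 0 else -1  # sign of w.x + b picks the region

print(classify(np.array([3.0, 1.0])))          # +1: one side of w.x + b = 0
print(classify(np.array([0.0, 2.0])))          # -1: the other side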

4. "The choice of learning coefficient is a tricky task in the back-propagation algorithm." Support this statement. [16]
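A minimal sketch of the trade-off on the one-parameter error surface E(w) = w^2, with illustrative step sizes: a small coefficient converges slowly, a moderate one quickly, and one that is too large diverges.

for eta in (0.01, 0.4, 1.1):                   # illustrative learning coefficients
    w = 1.0
    for _ in range(20):
        w -= eta * 2 * w                       # w <- w - eta * dE/dw, E(w) = w^2
    print(f"eta={eta}: w = {w:+.4f}")          # slow / fast / divergent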

5. Explain the following: a] Hetero-associative memory b] Auto-associative memory [8+8]

6. Write about classical set theory and classical sets. [16]

7. Discuss in detail the methods to generate membership functions. [16]
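A minimal sketch of one common parametric form, the triangular membership function; the parameters a <= b <= c are arbitrary choices.

def triangular(x, a, b, c):
    # Grade rises linearly on [a, b], falls on [b, c], zero outside.
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x < b else (c - x) / (c - b)

# Membership grades of "approximately 5" with (a, b, c) = (2, 5, 8):
print([round(triangular(x, 2, 5, 8), 2) for x in range(2, 9)])
# -> [0.0, 0.33, 0.67, 1.0, 0.67, 0.33, 0.0]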

8. Write about the following: a] Indirect learning architecture b] Specialized on-line learning control architecture [8+8]


--oOo--

Code No: 37011

SET 2


JAWAHARLAL NEHRU TECHNOLOGICAL UNIVERSITY HYDERABAD
R05 IV B.Tech. I Semester Supplementary Exams, May/June 2009
NEURAL NETWORKS AND FUZZY LOGIC
(Common to EEE, E.CON.E, MEP, AE, ICE & AME)
Time: 3 hours                                    Max Marks: 80
Answer any FIVE questions. All questions carry equal marks.
-----

1. "An IF (integrate-and-fire) neuron model is considered a special case of the spiking neuron model." Justify this statement. [16]
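A minimal sketch of a leaky integrate-and-fire neuron with illustrative constants: the model keeps only membrane integration, a fixed threshold and an instantaneous reset, which is why it is viewed as a simplified special case of spiking neuron models.

tau, v_rest, v_thresh, v_reset, dt = 10.0, 0.0, 1.0, 0.0, 0.1   # arbitrary units
i_in = 0.15                                    # constant input current

v, spike_times = v_rest, []
for step in range(1000):                       # simulate 100 ms with Euler steps
    v += dt * ((v_rest - v) / tau + i_in)      # membrane potential integration
    if v >= v_thresh:                          # a fixed threshold crossing is
        spike_times.append(step * dt)          # the whole "spike": no spike
        v = v_reset                            # shape is modelled
print(len(spike_times), "spikes in 100 ms")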

2. A fully connected feedforward network has 10 source nodes, two hidden layers (one with 4 neurons and the other with 3 neurons), and a single output neuron. Construct an architectural graph of this network. [16]
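A minimal sketch of the 10-4-3-1 architecture in the question as a forward pass, with random placeholder weights; the layered structure mirrors the architectural graph asked for.

import numpy as np

layers = [10, 4, 3, 1]                         # sources, hidden 1, hidden 2, output
rng = np.random.default_rng(0)
weights = [rng.normal(size=(m, n)) for n, m in zip(layers, layers[1:])]

def forward(x):
    for W in weights:                          # fully connected, layer by layer
        x = np.tanh(W @ x)
    return x

print(forward(rng.normal(size=10)))            # the single output neuron's value
print(sum(W.size for W in weights), "weights") # 10*4 + 4*3 + 3*1 = 55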

3.a] Compare and contrast supervised and unsupervised learning strategies.
  b] Distinguish between batch learning and incremental (stepwise) learning. [8+8]

4. Define the following terms: a] Pattern b] Classes/Categories c] Features and pattern space d] Decision regions and surface [4+4+4+4]

5. Explain the following: a] Address-addressable memory b] Content-addressable memory [8+8]

6. Explain the following terms: a] Relation matrix b] Binary relation c] Identity relation d] Universal relation [4+4+4+4]

7. Explain the following: a] Singular value decomposition b] Combs method [8+8]
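A minimal numerical sketch for question 7a, checking the decomposition A = U * diag(s) * V^T on an arbitrary example matrix:

import numpy as np

A = np.array([[3.0, 1.0], [1.0, 3.0], [0.0, 2.0]])   # arbitrary example matrix
U, s, Vt = np.linalg.svd(A, full_matrices=False)
print(s)                                       # singular values, descending order
print(np.allclose(A, U @ np.diag(s) @ Vt))     # True: A is exactly reconstructed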

8.a] What is the limitation of plant inverse identification?
  b] What are the limitations of the specialized on-line learning control architecture? [8+8]

--oOo--

Code No: 37011

SET 3


JAWAHARLAL NEHRU TECHNOLOGICAL UNIVERSITY HYDERABAD
R05 IV B.Tech. I Semester Supplementary Exams, May/June 2009
NEURAL NETWORKS AND FUZZY LOGIC
(Common to EEE, E.CON.E, MEP, AE, ICE & AME)
Time: 3 hours                                    Max Marks: 80
Answer any FIVE questions. All questions carry equal marks.
-----

1. What are the applications of neural networks? [16]

2. Briefly explain winner-take-all learning. [16]
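A minimal sketch of one winner-take-all learning step, with an assumed learning rate: only the neuron whose weight vector best matches the input is updated.

import numpy as np

rng = np.random.default_rng(0)
W = rng.normal(size=(3, 4))                    # 3 competing neurons, 4-d inputs
W /= np.linalg.norm(W, axis=1, keepdims=True)  # keep weight vectors normalised
eta = 0.2                                      # assumed learning rate

x = rng.normal(size=4)
x /= np.linalg.norm(x)
winner = np.argmax(W @ x)                      # largest activation wins
W[winner] += eta * (x - W[winner])             # only the winner moves toward x
W[winner] /= np.linalg.norm(W[winner])         # losers are left unchanged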

3. Draw the block diagram of a pattern classifier. Explain the terms discriminator, discriminant and dichotomizer. Also explain what an exemplar means. [16]

4. Explain the selection of the number of hidden nodes, sigmoid gain, local minima and learning coefficient in a back-propagation network. [16]

5. Write short notes on: a] Sigmoid gain b] Threshold value in the back-propagation algorithm [8+8]

6. Given three sets A, B and C, prove De Morgan's laws using Venn diagrams. [16]
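A computational check of question 6 on arbitrary example sets; Venn diagrams do the formal work, but the identities can also be verified directly:

U = set(range(10))                             # universe of discourse
A, B, C = {1, 2, 3}, {3, 4, 5}, {5, 6}         # arbitrary example sets

assert U - (A | B | C) == (U - A) & (U - B) & (U - C)   # complement of a union
assert U - (A & B & C) == (U - A) | (U - B) | (U - C)   # complement of an intersection
print("De Morgan's laws hold for these sets")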

7. Discuss the following: a] Fuzzy synthetic evaluation b] Fuzzy ordering c] Preferences and consensus d] Nontransitive ranking [4+4+4+4]

8.a] Explain the top-down and bottom-up approaches in ANN.
  b] Explain the types of faults and their diagnosis using ANN. [8+8]

--oOo--

SET 4

Code No: 37011

JAWAHARLAL NEHRU TECHNOLOGICAL UNIVERSITY HYDERABAD
R05 IV B.Tech. I Semester Supplementary Exams, May/June 2009
NEURAL NETWORKS AND FUZZY LOGIC
(Common to EEE, E.CON.E, MEP, AE, ICE & AME)
Time: 3 hours                                    Max Marks: 80
Answer any FIVE questions. All questions carry equal marks.
-----

1. Explain the following terms: a] Resting potential b] Nernst equation c] Action potential d] Refractory periods e] Chemical synapses [3+3+3+3+4]

2.a] Discuss the simplified model of an artificial neuron.
  b] What are the three basic elements of a neuronal model? [8+8]

3. Illustrate the training and classification of a continuous perceptron with an example. [16]

4. Illustrate the back-propagation algorithm with your own training set, and explain it. [16]
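A compact sketch of the kind of worked example the question asks for, using XOR as the hand-made training set; the network size, learning rate and epoch count are arbitrary choices.

import numpy as np

rng = np.random.default_rng(1)
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], float)   # hand-made training set
T = np.array([[0], [1], [1], [0]], float)               # XOR targets
sig = lambda z: 1.0 / (1.0 + np.exp(-z))
eta, n_hidden = 0.5, 3                                   # assumed choices

W1 = rng.normal(size=(n_hidden, 3))            # 2 inputs + bias -> hidden layer
W2 = rng.normal(size=(1, n_hidden + 1))        # hidden + bias -> output
ones = np.ones((4, 1))
Xb = np.hstack([X, ones])                      # append the bias input

for _ in range(20000):
    H = sig(Xb @ W1.T)                         # forward pass, hidden layer
    Hb = np.hstack([H, ones])
    Y = sig(Hb @ W2.T)                         # forward pass, output layer
    d_out = (Y - T) * Y * (1 - Y)              # output delta (sigmoid derivative)
    d_hid = (d_out @ W2[:, :-1]) * H * (1 - H) # back-propagated hidden delta
    W2 -= eta * d_out.T @ Hb                   # gradient-descent weight updates
    W1 -= eta * d_hid.T @ Xb

print(Y.round(2).ravel())                      # approaches the targets 0 1 1 0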

5. Describe the architecture of BAM. [16]
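A minimal sketch of BAM (bidirectional associative memory) storage and recall on bipolar pattern pairs; the example pairs are arbitrary.

import numpy as np

A = np.array([[1, -1, 1, -1], [1, 1, -1, -1]])         # layer-A bipolar patterns
B = np.array([[1, 1, -1], [-1, 1, 1]])                 # associated layer-B patterns
W = sum(np.outer(a, b) for a, b in zip(A, B))          # Hebbian correlation matrix

sign = lambda v: np.where(v >= 0, 1, -1)
a = A[0]                                       # present a stored A-pattern
for _ in range(3):                             # pass signals back and forth
    b = sign(a @ W)                            # layer A -> layer B through W
    a = sign(W @ b)                            # layer B -> layer A through W^T
print(b, a)                                    # recalls the stored pair (B[0], A[0])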

6. Discuss the measures of fuzziness and dissonance. [16]

7. Discuss the fuzzy rule-based system. [16]

8. Write short notes on decision surfaces. [16]

--oOo--

