SC Imp
Explain approaches, advantages and limitations of AI.
Artificial Intelligence (AI) is the field in which machines are programmed to think and learn like humans in order to perform tasks that usually require human intelligence. The main areas of AI are:
1. Machine Learning: Machines learn from big sets of data to make predictions or decisions, like
recognizing images, understanding speech, or suggesting things.
2. Natural Language Processing (NLP): Machines learn to understand and work with human language,
such as translating languages, analyzing sentiments, or answering questions.
3. Computer Vision: Machines learn to see and interpret images or videos, for example recognizing objects
and faces, or segmenting images into parts.
4. Robotics: AI is used to make robots smart, so they can move around, manipulate things, and interact
with people.
AI Advantages:
1. Automation: AI automates boring and repetitive tasks, making things faster and more efficient.
2. Decision Making: AI can analyze a lot of data and make smart decisions that humans might miss.
3. Accuracy: AI algorithms are really good at tasks like recognizing images or analyzing data, making
fewer mistakes.
4. Handling Complexity: AI can solve difficult problems with lots of factors, giving solutions that humans
might struggle with.
AI Limitations:
1. Common Sense: AI may not understand things that humans find obvious.
2. Data Dependency: AI needs good and enough data to work well. Biased or not enough data
can lead to mistakes.
3. Ethical Concerns: AI raises questions about privacy, fairness, and job loss, so we need rules
to use it responsibly.
4. Limited Creativity: AI is not great at tasks that need imagination, new ideas, or
understanding emotions.
It's important to remember that AI is always evolving, so new ideas, methods, and limitations may come
up over time.
In simple words, Soft Computing is flexible, approximate, and deals with uncertain or fuzzy information,
while Hard Computing is precise, deterministic, and works with exact models and data.
Q. 3. What is Artificial Neural Network? With neat sketch, explain different terms used in
ANN with characteristics
Artificial Neural Network (ANN) is a computational model inspired by the structure and
functioning of the human brain. It consists of interconnected artificial neurons that process and
transmit information through weighted connections.
Terms used in ANN:
1. Neuron (Node): The basic processing unit that receives inputs and produces an output.
2. Input Layer: The layer that receives external input data.
3. Hidden Layer: Intermediate layers that perform computations and extract features.
4. Output Layer: The layer that produces the network's final result.
5. Weights: Numerical values on connections that determine the strength of each input.
6. Bias: An extra adjustable value added to the weighted sum, shifting the activation.
7. Activation Function: Non-linear function applied to the weighted inputs, producing the neuron's output.
8. Forward Propagation: Transmitting input data through the network to produce an output.
9. Backpropagation: Propagating the output error backwards through the network to compute weight updates.
10. Training: Adjusting weights using a training dataset to minimize prediction errors.
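The terms above (weights, bias, activation function, forward propagation) can be illustrated with a minimal sketch of one forward pass. The layer sizes, weights, and biases below are hypothetical values chosen purely for illustration, and a sigmoid is used as the activation function:

```python
import math

def neuron(inputs, weights, bias):
    """One neuron: weighted sum of inputs plus bias, passed through a sigmoid."""
    z = sum(w * x for w, x in zip(weights, inputs)) + bias
    return 1.0 / (1.0 + math.exp(-z))  # sigmoid squashes z into (0, 1)

def layer(inputs, weights, biases):
    """Forward-propagate the same inputs through every neuron in one layer."""
    return [neuron(inputs, w, b) for w, b in zip(weights, biases)]

# Hypothetical 2-input network: one hidden layer of two neurons, one output neuron
x = [1.0, 0.5]
hidden = layer(x, [[0.4, -0.6], [0.3, 0.8]], [0.1, -0.2])
output = neuron(hidden, [0.7, -0.5], 0.05)
```

Training (term 10) would then adjust the weight and bias values so that `output` moves closer to a desired target.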
Characteristics of ANN:
1. Adaptability: ANNs learn from data and adjust their weights to improve performance.
2. Parallelism: Many simple neurons process information simultaneously.
3. Non-linearity: Non-linear activation functions let ANNs model complex relationships.
4. Fault Tolerance: ANNs can function even with damaged neurons or connections.
5. Generalization: ANNs can make predictions on unseen data based on training examples.
Understanding these terms is crucial for comprehending the functioning and training of neural
networks.
Q5) Compare biological neurons with ANN
Difference between Biological Neuron Network and Artificial Neural Network:
Biological Neuron Network:
1. Soma or cell body: where the cell nucleus is located.
2. Dendrites: branching fibres that receive signals from other neurons and carry them to the cell body.
3. Axon: a long fibre that carries the neuron's impulses away to other neurons.
Artificial Neural Network (ANN):
1. Creation: Designed and implemented by humans using computer systems.
2. Structure: Composed of artificial neurons, layers, and connections.
3. Communication: Computation is performed through mathematical operations and weighted
connections.
4. Learning: ANNs can learn and improve through training algorithms.
5. Simplicity: ANNs are simplified models inspired by biological networks but lack the
complexity of the human brain.
6. Scalability: ANNs can be easily scaled and adapted for various applications.
In simple words, biological neuron networks exist in living organisms and rely on electrical and
chemical processes for communication, whereas artificial neural networks are created by
humans using computers and use mathematical operations and weighted connections for
computation. Biological networks are complex and adaptive, while artificial networks are
simplified models that can be scaled and trained for specific tasks.
Q. 6. Explain various learning techniques used in Artificial Neural Network.
1. ANN's main property is its ability to learn, i.e., adapt itself based on input.
2. Learning or training is the process where the neural network adjusts its parameters to
generate the desired response.
3. There are two types of learning in ANNs:
- Parameter learning: Updates the connection weights.
- Structure learning: Focuses on changing the network structure, such as the number of
neurons and their connections.
4. Parameter and structure learning can be performed together or separately.
5. Learning in ANNs can be broadly classified into three categories:
- Supervised learning: Training with labeled input-output pairs.
- Unsupervised learning: Discovering patterns and relationships in unlabeled data.
- Reinforcement learning: Learning through interactions with the environment, maximizing
rewards.
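Supervised parameter learning can be sketched with the classic perceptron rule: for each labeled input-output pair, the error between the target and the network's output is used to nudge the connection weights. The learning rate, epoch count, and the choice of the AND function as training data are illustrative assumptions, not part of the notes above:

```python
def predict(w, b, x):
    """Threshold neuron: fire (1) if the weighted sum plus bias is positive."""
    return 1 if sum(wi * xi for wi, xi in zip(w, x)) + b > 0 else 0

def train(samples, epochs=10, eta=1):
    """Supervised parameter learning (perceptron rule): adjust each weight by
    the error between the desired target and the current output."""
    w, b = [0, 0], 0
    for _ in range(epochs):
        for x, target in samples:
            err = target - predict(w, b, x)          # instructive feedback
            w = [wi + eta * err * xi for wi, xi in zip(w, x)]
            b += eta * err
    return w, b

# Labeled input-output pairs for logical AND (illustrative dataset)
data = [([0, 0], 0), ([0, 1], 0), ([1, 0], 0), ([1, 1], 1)]
w, b = train(data)
```

After training, the network reproduces the labels, showing how repeated weight adjustment minimizes prediction errors on the training set.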
Unsupervised Learning:
1. The network is given input data without target outputs.
2. It discovers patterns, similarities, and structure in the data on its own, for example by grouping similar inputs together.
3. The network self-organizes, adjusting its weights in response to regularities in the input.
Reinforcement Learning:
1. Similar to supervised learning, but with less available information.
2. Feedback indicates the correctness of the output, not precise values.
3. Learning is based on evaluative reinforcement signals.
4. The network receives feedback from its environment.
5. Feedback is evaluative, not instructive.
6. Critic signal generator processes external reinforcement signals.
7. Adjusts weights of the ANN for better future feedback.
8. Aims to improve performance based on received reinforcement signals.
Q7) State and explain McCulloch-Pitts and Hebb Networks.
McCulloch-Pitts Network:
- Proposed by McCulloch and Pitts in 1943.
- Consists of binary threshold neurons.
- Inputs are binary, and outputs are binary based on a predefined threshold.
- Activation function is a simple threshold function.
- If the weighted sum of inputs exceeds the threshold, the neuron fires and produces an output
of 1; otherwise, it produces an output of 0.
Hebb Network:
- Proposed by Donald Hebb in 1949.
- Focuses on synaptic connections between neurons.
- Uses the Hebbian learning rule: "cells that fire together, wire together."
- Synaptic weights are adjusted based on the simultaneous activation of connected neurons.
- Reinforces connections between neurons that frequently activate together.
- Demonstrates synaptic plasticity and is associated with associative learning.
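The Hebbian rule described above can be sketched in a few lines: each weight is increased by the product of its input and the output, so connections between units that activate together are strengthened. The bipolar (+1/-1) encoding and the AND function used here are a common textbook illustration, not something fixed by the rule itself:

```python
def hebb_train(samples):
    """Hebbian rule: w_i += x_i * y, b += y — "cells that fire together,
    wire together": co-active input/output pairs strengthen their connection."""
    n = len(samples[0][0])
    w, b = [0] * n, 0
    for x, y in samples:
        w = [wi + xi * y for wi, xi in zip(w, x)]
        b += y
    return w, b

def recall(w, b, x):
    """Bipolar threshold output: +1 if the net input is positive, else -1."""
    return 1 if sum(wi * xi for wi, xi in zip(w, x)) + b > 0 else -1

# Textbook example: AND function with bipolar (+1/-1) inputs and targets
samples = [([1, 1], 1), ([1, -1], -1), ([-1, 1], -1), ([-1, -1], -1)]
w, b = hebb_train(samples)  # one pass over the data yields w = [2, 2], b = -2
```

Note that, unlike the perceptron rule, Hebbian learning uses no error signal: weights grow purely from correlated activity.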
Further points on the McCulloch-Pitts neuron:
- The neuron fires if it receives "k" or more excitatory inputs and no inhibitory inputs.
- The McCulloch-Pitts neuron does not have a specific training algorithm.
- Analysis is performed instead to determine suitable weights and threshold values for the neuron.
- The McCulloch-Pitts neuron is used as a building block to model various functions or phenomena based on logical operations.
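The McCulloch-Pitts behaviour described above (fire when at least k excitatory inputs are active and no inhibitory input is active) can be sketched directly, and used as a building block for logic gates. The gate constructions below are standard illustrations; the thresholds are analyzed by hand, since the neuron has no training algorithm:

```python
def mp_neuron(excitatory, inhibitory, threshold):
    """McCulloch-Pitts neuron: fires (1) iff at least `threshold` excitatory
    inputs are active and no inhibitory input is active (absolute inhibition)."""
    if any(inhibitory):
        return 0
    return 1 if sum(excitatory) >= threshold else 0

def and_gate(x1, x2):
    # Both excitatory inputs must be active: threshold k = 2
    return mp_neuron([x1, x2], [], threshold=2)

def or_gate(x1, x2):
    # Any one excitatory input suffices: threshold k = 1
    return mp_neuron([x1, x2], [], threshold=1)

def andnot_gate(x1, x2):
    # x1 is excitatory, x2 is inhibitory: fires only for x1=1, x2=0
    return mp_neuron([x1], [x2], threshold=1)
```

Choosing the threshold by hand like this is exactly the "analysis" mentioned above: the weights and k are worked out from the desired truth table rather than learned.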