
Linear Regression Questions

Latest Linear Regression MCQ Objective Questions

Question 1:

A set of observations of the independent variable (x) and the corresponding dependent variable (y) is given below.

x: 5  2  4  3
y: 16 10 13 12

Based on the data, the coefficient a of the linear regression model

y = a + bx is estimated as 6.1.

The coefficient b is ______. (round off to one decimal place)

Answer (Detailed Solution Below) 1.9


Linear Regression Question 1 Detailed Solution

Concept:

The normal equations for fitting a straight line by the least squares method are:

Σy = na + bΣx

Σxy = aΣx + bΣx²

where n = total number of observations, and a and b are the coefficients.

By solving the above two equations, the coefficients a and b can be obtained.


Given Data and Calculation:

 x    y    x²   xy
 5   16    25   80
 2   10     4   20
 4   13    16   52
 3   12     9   36
Σx = 14, Σy = 51, Σx² = 54, Σxy = 188

n = 4, so

51 = 4a + 14b
188 = 14a + 54b

Solving the above two equations: a = 6.1 and b = 1.9
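
As a numerical cross-check, here is a minimal Python/NumPy sketch that solves the same two normal equations. It assumes the data table above (x = 5, 2, 4, 3 and y = 16, 10, 13, 12), which is consistent with the sums Σx = 14, Σy = 51, Σx² = 54 and Σxy = 188 used in the solution.

```python
import numpy as np

# Data assumed from the table above (consistent with the sums in the solution)
x = np.array([5.0, 2.0, 4.0, 3.0])
y = np.array([16.0, 10.0, 13.0, 12.0])
n = len(x)

# Normal equations for the line y = a + bx:
#   Σy  = n·a  + b·Σx
#   Σxy = a·Σx + b·Σx²
A = np.array([[n,       x.sum()],
              [x.sum(), (x ** 2).sum()]])
rhs = np.array([y.sum(), (x * y).sum()])

a, b = np.linalg.solve(A, rhs)
print(round(a, 1), round(b, 1))  # 6.1 1.9
```

The same intercept and slope can also be obtained directly with np.polyfit(x, y, 1), which fits a degree-1 polynomial by least squares.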

Question 2:

For a bivariate data set on (x, y), if the means, standard deviations and correlation coefficient are

x̄ = 1.0, ȳ = 2.0, s_x = 3.0, s_y = 9.0, r = 0.8

Then the regression line of y on x is:

2. y = 2 - 0.27(x - 1)

3. y = 2 + 2.4(x - 1)

4. y = 1 + 0.27(x - 2)

Answer (Detailed Solution Below)

Option 3 : y = 2 + 2.4(x - 1)


Linear Regression Question 2 Detailed Solution

Formula:

The regression line of y on x is given as:

y - ȳ = r (s_y / s_x)(x - x̄)

Calculation

According to the question

⇒ y - 2 = 0.8 × (9/3) × (x - 1) = 2.4(x - 1)

∴ y = 2 + 2.4(x - 1)
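
A minimal Python sketch of this calculation, using only the summary statistics given in the question (x̄ = 1.0, ȳ = 2.0, s_x = 3.0, s_y = 9.0, r = 0.8):

```python
# Regression line of y on x from summary statistics:
#   y - ȳ = r · (s_y / s_x) · (x - x̄)
x_bar, y_bar = 1.0, 2.0   # means
s_x, s_y = 3.0, 9.0       # standard deviations
r = 0.8                   # correlation coefficient

slope = r * s_y / s_x               # 0.8 * 9 / 3 = 2.4
intercept = y_bar - slope * x_bar   # so the same line is y = 2.4x - 0.4

print(f"y = {y_bar} + {slope}(x - {x_bar})")  # y = 2.0 + 2.4(x - 1.0)
```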

Question 3:



In the regression model ȳ = a + bx̄ (where x̄ = 2.50, ȳ = 5.50 and a = 1.50; x̄ and ȳ denote the means of variables x and y, and a is a constant), which one of the following values of parameter 'b' of the model is correct?

1. 1.75

2. 1.60

3. 2.00

4. 2.50

Answer (Detailed Solution Below)

Option 2 : 1.60

Linear Regression Question 3 Detailed Solution

Key Points

Linear regression model:
• Linear regression is a way to model the relationship between two variables.
• You might also recognize the equation as the slope formula.
• The equation has the form Y = a + bX,
• where
  ° Y is the dependent variable (that's the variable that goes on the Y-axis),
  ° X is the independent variable (i.e. it is plotted on the X-axis),
  ° b is the slope of the line and
  ° a is the y-intercept.

a = [(Σy)(Σx²) - (Σx)(Σxy)] / [n(Σx²) - (Σx)²]

b = [n(Σxy) - (Σx)(Σy)] / [n(Σx²) - (Σx)²]

" 160
150
140
Body weight 130 bX
(pounds) 100 wgt = 80 + 2 (hgt)
80
50 60 70 80
X- axis. Hr/. !1 t (inches)
Calculation:

ȳ = a + bx̄, where x̄ = 2.50, ȳ = 5.50 and a = 1.50.

Putting the values in the formula:

5.50 = 1.50 + b × 2.50
b × 2.50 = 4
b = 4/2.5 = 1.60

Therefore, option 2 (b = 1.60) is the correct answer.
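
Since the fitted regression line always passes through the point of means (x̄, ȳ), the value of b can be checked in a couple of lines of Python, using only the numbers given in the question:

```python
# The fitted line passes through the point of means: ȳ = a + b·x̄
x_bar, y_bar, a = 2.50, 5.50, 1.50
b = (y_bar - a) / x_bar
print(b)  # 1.6
```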

Question 4:

There is no value of x that can simultaneously satisfy both the given equations. Therefore, find the 'least squares error' solution to the two equations, i.e., find the value of x that minimizes the sum of squares of the errors in the two equations.
2x = 3
4x = 1

Answer (Detailed Solution Below) 0.5

Linear Regression Question 4 Detailed Solution

Concepts

Least Square Method:


It is an approximation method to minimize the error.

In this method, the distance between the estimated value and the actual value is minimized.

According to the least square method, R = Σ(x̂ᵢ - xᵢ)², where x̂ᵢ are the estimated (or guessed) values and xᵢ are the actual values.
By minimizing this distance R, the least-squares error can be found.
Calculations:

Given equations: 2x = 3 ⇒ 2x - 3 = 0 and 4x = 1 ⇒ 4x - 1 = 0

R = (2x - 3)² + (4x - 1)²

Hence, to minimize the value of R, set dR/dx = 0:

2 × 2(2x - 3) + 4 × 2(4x - 1) = 0
40x - 20 = 0
x = 1/2, and R = (2 × 0.5 - 3)² + (4 × 0.5 - 1)² = 4 + 1 = 5

∴ The value of x that minimizes the sum of squares of the errors in the two equations is 1/2.
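
Equivalently, the two equations can be written as an overdetermined system A·x ≈ b and handed to a least-squares solver. The NumPy sketch below reproduces both the solution x = 0.5 and the minimized sum of squared errors R = 5:

```python
import numpy as np

# Overdetermined system: 2x = 3 and 4x = 1, written as A·x ≈ b
A = np.array([[2.0],
              [4.0]])
b = np.array([3.0, 1.0])

# The least-squares solution minimizes ||A·x - b||² = (2x - 3)² + (4x - 1)²
x, residual, *_ = np.linalg.lstsq(A, b, rcond=None)
print(x[0])         # 0.5
print(residual[0])  # 5.0, the minimized sum of squared errors
```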

Question 5:

Consider the following learning algorithms:

(a) Logistic regression
(b) Back propagation
(c) Linear regression

Which of the following options represents classification algorithms?

1. Only (a) and (b)

2. Only (a) and (c)

3. Only (b) and (c)

4. (a), (b), and (c)

Option 1 : Only (a) and (b)

Linear Regression Question 5 Detailed Solution

Concept:

Learning algorithms: These algorithms are used in machine learning to help a system imitate the human learning process. They process data to extract patterns that can then be applied in a new system.

Explanation:

From the above given options, only (a) and (b) represent classification algorithms.

a) Logistic regression: It is used for binary classification problems. It is used to examine and describe the relationship between a binary variable and a set of predictor variables. The primary objective of logistic regression is to model the mean of the response variable, given a set of predictor variables.

b) Back propagation: It is the essence of neural net training. It is the method of fine tuning the
weights of a neural net based on the error rate obtained in the previous iteration. It is a standard
method of training artificial neural networks.
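
To make the distinction concrete, here is a small, self-contained Python sketch (with made-up toy data, purely for illustration) of logistic regression used as a classifier: it outputs a probability via the sigmoid and tunes its weights from the prediction error, which is the same idea back propagation applies layer by layer in a neural network; linear regression, by contrast, would predict a continuous value rather than a class label.

```python
import numpy as np

# Toy 1-D data with binary class labels (hypothetical, for illustration only)
X = np.array([0.5, 1.0, 1.5, 3.0, 3.5, 4.0])
y = np.array([0, 0, 0, 1, 1, 1])

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# Fit logistic regression by gradient descent on the log-loss:
# the weights are adjusted from the prediction error at each iteration.
w, b = 0.0, 0.0
learning_rate = 0.1
for _ in range(5000):
    p = sigmoid(w * X + b)           # predicted probability of class 1
    grad_w = np.mean((p - y) * X)    # gradient of the log-loss w.r.t. w
    grad_b = np.mean(p - y)          # gradient of the log-loss w.r.t. b
    w -= learning_rate * grad_w
    b -= learning_rate * grad_b

# Classify by thresholding the predicted probability at 0.5
print((sigmoid(w * X + b) >= 0.5).astype(int))  # e.g. [0 0 0 1 1 1]
```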

Top Linear Regression MCQ Objective Questions



Question 6

A set of observations of the independent variable (x) and the corresponding dependent variable (y) is given below.

x: 5  2  4  3
y: 16 10 13 12

Based on the data, the coefficient a of the linear regression model

y = a + bx is estimated as 6.1.

The coefficient b is ______. (round off to one decimal place)

Answer (Detailed Solution Below) 1.9

Linear Regression Question 6 Detailed Solution

Concept:

The normal equations for fitting a straight line by the least squares method are:

Σy = na + bΣx

Σxy = aΣx + bΣx²

where n = total number of observations, and a and b are the coefficients.
By solving the above two equations, the coefficients a and b can be obtained.
Given Data and Calculation:

 x    y    x²   xy
 5   16    25   80
 2   10     4   20
 4   13    16   52
 3   12     9   36
Σx = 14, Σy = 51, Σx² = 54, Σxy = 188

n = 4, so

51 = 4a + 14b
188 = 14a + 54b

Solving the above two equations: a = 6.1 and b = 1.9

Question 7

In the regression model ȳ = a + bx̄ (where x̄ = 2.50, ȳ = 5.50 and a = 1.50; x̄ and ȳ denote the means of variables x and y, and a is a constant), which one of the following values of parameter 'b' of the model is correct?

" 1.7?

4. 2.50

Answe

:? ption 2 : 1.GO

Linear Regression Question 7 Detailed Solution

Key Points

Linear regression model:
• Linear regression is a way to model the relationship between two variables.
• You might also recognize the equation as the slope formula.
• The equation has the form Y = a + bX,
• where
  ° Y is the dependent variable (that's the variable that goes on the Y-axis),
  ° X is the independent variable (i.e. it is plotted on the X-axis),
  ° b is the slope of the line and
  ° a is the y-intercept.
a = [(Σy)(Σx²) - (Σx)(Σxy)] / [n(Σx²) - (Σx)²]

b = [n(Σxy) - (Σx)(Σy)] / [n(Σx²) - (Σx)²]

" 160
150
140
Body weight 130 bX
(pounds) wgt = 80 + 2 (hgt)
100
80 50 60 70 80
X- axis: Height (inçfie¥) ›

Calculation:

ȳ = a + bx̄

where,
• x̄ = 2.50
• ȳ = 5.50
• a = 1.50
(x̄ and ȳ denote the means of variables x and y, and a is a constant)

Putting the values in the formula:
5.50 = 1.50 + b × 2.50
b × 2.50 = 4
b = 4/2.5 = 1.60

Therefore, option 2 (b = 1.60) is the correct answer.



Question 8

For a bivariate data set on (x, y), if the means, standard deviations and correlation coefficient are

x̄ = 1.0, ȳ = 2.0, s_x = 3.0, s_y = 9.0, r = 0.8

Then the regression line of y on x is:

2. y = 2 + 0.27(x - 1)

3. y = 2 + 2.4(x - 1)

4. y = 1 + 0.27(x - 2)

Answer (Detailed Solution Below)

Option 3 : y = 2 + 2.4(x - 1)

Linear Regression Question 8 Detailed Solution

The regression line of y on x is given as:

y - ȳ = r (s_y / s_x)(x - x̄)

Calculation

According to the question

⇒ y - 2 = 0.8 × (9/3) × (x - 1) = 2.4(x - 1)

∴ y = 2 + 2.4(x - 1)
Question 9

There is no value of x that can simultaneously satisfy both the given equations. Therefore, find the
least squares error solution to the two equations, i.e., find the value of x that minimizes the sum of
squares of the errors in the two equations.
2x = 3
4x = 1

Answer (Detailed Solution Below) 0.5

Linear Regression Question 9 Detailed Solution

Concepts:

Least Square Method:

It is an approximation method to minimize the error.

In this method, the distance between the estimated value and the actual value is minimized.

According to the least square method, R = Σ(x̂ᵢ - xᵢ)², where x̂ᵢ are the estimated (or guessed) values and xᵢ are the actual values.
By minimizing this distance R, the least-squares error can be found.
Calculations:

Given equations: 2x = 3 ⇒ 2x - 3 = 0 and 4x = 1 ⇒ 4x - 1 = 0

R = (2x - 3)² + (4x - 1)²

Hence, to minimize the value of R, set dR/dx = 0:

2 × 2(2x - 3) + 4 × 2(4x - 1) = 0
40x - 20 = 0
x = 1/2, and R = (2 × 0.5 - 3)² + (4 × 0.5 - 1)² = 4 + 1 = 5

∴ The value of x that minimizes the sum of squares of the errors in the two equations is 1/2.
Question 10

Consider the following learning algorithms:


(a) Logistic regression

(b) Back propagation


(c) Linear regression

Which of the following option represents classification algorithms?

1. Only (a) and (b)

2. Only (a) and (c)

3. Only (b) and (c)

4. (a), (b), and (c)

Answer (Detailed Solution Below)

Option 1 : Only (a) and (b)

Linear Regression Question 10 Detailed Solution

Concept:

Learning algorithms: These algorithms are used in machine learning to help a system imitate the human learning process. They process data to extract patterns that can then be applied in a new system.
Explanation:

From the above given options, only (a) and (b) represent classification algorithms.

a) Logistic regression: It is used for binary classification problems. It is used to examine and describe the relationship between a binary variable and a set of predictor variables. The primary objective of logistic regression is to model the mean of the response variable, given a set of predictor variables.

b) Back propagation: It is the essence of neural net training. It is the method of fine tuning the
weights of a neural net based on the error rate obtained in the previous iteration. It is a standard
method of training artificial neural networks.


Question 11:

A set of observations of the independent variable (x) and the corresponding dependent variable (y) is given below.

x: 5  2  4  3
y: 16 10 13 12

Based on the data, the coefficient a of the linear regression model


y = a + bx is estimated as 6.1.

The coefficient b is ______. (round off to one decimal place)

Answer (Detailed Solution Below) 1.9

Linear Regression Question 11 Detailed Solution

Concept:

The normal equations for fitting a straight line by the least squares method are:

Σy = na + bΣx

Σxy = aΣx + bΣx²

where n = total number of observations, and a and b are the coefficients.
By solving the above two equations, the coefficients a and b can be obtained.
Given Data and Calculation:

 x    y    x²   xy
 5   16    25   80
 2   10     4   20
 4   13    16   52
 3   12     9   36
Σx = 14, Σy = 51, Σx² = 54, Σxy = 188

n = 4, so

51 = 4a + 14b
188 = 14a + 54b

Solving the above two equations: a = 6.1 and b = 1.9

Question 12:

In the regression model ȳ = a + bx̄ (where x̄ = 2.50, ȳ = 5.50 and a = 1.50; x̄ and ȳ denote the means of variables x and y, and a is a constant), which one of the following values of parameter 'b' of the model is correct?

1. 1.75

2. 1.60
3. 2.00

4. 2.50

Answer (Detailed Solution Below)

Option 2 : 1.60

Linear Regression Question 12 Detailed Solution

Key Points

Linear regression model:

• Linear regression is a way to model the relationship between two variables.
• You might also recognize the equation as the slope formula.
• The equation has the form Y = a + bX,
• where
  ° Y is the dependent variable (that's the variable that goes on the Y-axis),
  ° X is the independent variable (i.e. it is plotted on the X-axis),
  ° b is the slope of the line and
  ° a is the y-intercept.

a = [(Σy)(Σx²) - (Σx)(Σxy)] / [n(Σx²) - (Σx)²]

b = [n(Σxy) - (Σx)(Σy)] / [n(Σx²) - (Σx)²]
[Figure: scatter plot of body weight (pounds) on the Y-axis against height (inches) on the X-axis, with fitted line wgt = 80 + 2(hgt)]

Calculation:

ȳ = a + bx̄

where,
• x̄ = 2.50
• ȳ = 5.50
• a = 1.50
(x̄ and ȳ denote the means of variables x and y, and a is a constant)

Putting the values in the formula:
5.50 = 1.50 + b × 2.50
b × 2.50 = 4

b = 4/2.5 = 1.60

Therefore, option 2 (b = 1.60) is the correct answer.


Question 13:

For a bivariate data set on (x, y), if the means, standard deviations and correlation coefficient are

x̄ = 1.0, ȳ = 2.0, s_x = 3.0, s_y = 9.0, r = 0.8

Then the regression line of y on x is:

1)

2. y = 2 - 0.27(x - 1)

3. y = 2 + 2.4(x - 1)

4. y = 1 + 0.27(x - 2)

Option 3 : y = 2 + 2.4(x - 1)

Linear Regression Question 13 Detailed Solution

Formula:

The regression line of y on x is given as:

y - ȳ = r (s_y / s_x)(x - x̄)


Calculation

According to the question

⇒ y - 2 = 0.8 × (9/3) × (x - 1) = 2.4(x - 1)

∴ y = 2 + 2.4(x - 1)


Question 14:

There is no value of x that can simultaneously satisfy both the given equations. Therefore, find the
least squares error solution to the two equations, i.e., find the value of x that minimizes the sum of
squares of the errors in the two equations.
2x = 3
4x = 1

Answer (Detailed Solution Below) 0.5

Linear Regression Question 14 Detailed Solution

Concepts:

Least Square Method:

It is an approximation method to minimize the error.

In this method, the distance between the estimated value and the actual value is minimized.

According to the least square method, R = Σ(x̂ᵢ - xᵢ)², where x̂ᵢ are the estimated (or guessed) values and xᵢ are the actual values.
Calculations:

Given equations: 2x = 3 ⇒ 2x - 3 = 0 and 4x = 1 ⇒ 4x - 1 = 0

R = (2x - 3)² + (4x - 1)²

Hence, to minimize the value of R, set dR/dx = 0:

2 × 2(2x - 3) + 4 × 2(4x - 1) = 0
40x - 20 = 0
x = 1/2, and R = (2 × 0.5 - 3)² + (4 × 0.5 - 1)² = 4 + 1 = 5

∴ The value of x that minimizes the sum of squares of the errors in the two equations is 1/2.


Question 15:

Consider the following learning algorithms:

(a) Logistic regression

(b) Back propagation

(c) Linear regression

Which of the following option represents classification algorithms?

1. Only (a) and (b)

2. Only (a) and (c)

3. Only (b) and (c)

4. (a), (b), and (c)


Option 1 : Only (a) and (b)

Linear Regression Question 15 Detailed Solution

Concept:

Learning algorithms: These algorithms are used in machine learning to help a system imitate the human learning process. They process data to extract patterns that can then be applied in a new system.

Explanation:

From the above given options, only (a) and (b) represent classification algorithms.

a) Logistic regression: It is used for binary classification problems. It is used to examine and describe the relationship between a binary variable and a set of predictor variables. The primary objective of logistic regression is to model the mean of the response variable, given a set of predictor variables.

b) Back propagation: It is the essence of neural net training. It is the method of fine tuning the
weights of a neural net based on the error rate obtained in the previous iteration. It is a standard
method of training artificial neural networks.
