
Practice Set for (Paid Course) JSO/ASO

(Statistics) Test- 10
By
L1 Classes Prayagraj
Time: 1 Hour        Maximum Marks: 100

1. In multiple regression, what does a high value of R² indicate?
a) A good fit of the model to the data
b) High multicollinearity
c) The residuals are independent
d) All independent variables are significant
Answer: (a)

2. What does the partial correlation coefficient measure when controlling for a third variable?
a) The direct association between two variables, ignoring the third variable
b) The remaining association between two variables after accounting for the third variable
c) The sum of the squared residuals
d) The overall goodness of fit of the regression
Answer: (b)

3. In the context of correlation and regression, what does r_xy.z represent?
a) The partial correlation between X and Y, controlling for Z
b) The multiple correlation coefficient for X, Y, and Z
c) The regression coefficient of X on Z
d) The residual sum of squares
Answer: (a)

4. Multiple correlation can be decomposed into:
a) Several partial correlations
b) Several bivariate correlations
c) Spearman's rank correlations
d) Covariance matrices
Answer: (a)

5. If the partial correlation r_xy.z is close to zero, it suggests that:
a) The relationship between X and Y is not explained by Z
b) X and Y are highly correlated
c) There is no relationship between X and Y
d) Z has no effect on X and Y
Answer: (c)

6. In multiple regression, controlling for additional variables helps to:
a) Remove confounding effects
b) Decrease the goodness of fit
c) Increase the residuals
d) Increase multicollinearity
Answer: (a)

7. When the partial correlation between two variables is high after controlling for a third, it suggests:
a) The two variables are strongly related independently of the third
b) The third variable explains most of the variance
c) There is multicollinearity
d) The dependent variable has no effect
Answer: (a)
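
Worked note (Questions 2, 3 and 5): the first-order partial correlation can be computed as r_xy.z = (r_xy − r_xz·r_yz) / √[(1 − r_xz²)(1 − r_yz²)]. Below is a minimal Python sketch of this formula; the three pairwise correlation values are assumed purely for illustration.

    import math

    # assumed pairwise correlations (illustrative only)
    r_xy, r_xz, r_yz = 0.70, 0.60, 0.50

    # r_xy.z = (r_xy - r_xz*r_yz) / sqrt((1 - r_xz^2) * (1 - r_yz^2))
    r_xy_z = (r_xy - r_xz * r_yz) / math.sqrt((1 - r_xz**2) * (1 - r_yz**2))
    print(round(r_xy_z, 3))   # 0.577: the X-Y association left after removing Z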
8. What happens to the multiple correlation coefficient when more variables are added to the model?
a) It can increase or stay the same
b) It always decreases
c) It remains constant
d) It decreases exponentially
Answer: (a)

9. The range of Cramér's V is:
a) -1 to 1
b) 0 to 1
c) -2 to 2
d) None
Answer: (b)

10. The presence of association between two attributes does not imply:
a) Causation
b) Correlation
c) Independence
d) None
Answer: (a)

11. Which of the following measures the strength of association in a 2×2 table?
a) Yule's Q
b) Chi-square
c) Phi coefficient
d) All of the above
Answer: (d)

12. Multiple regression involves:
a) One dependent variable and two or more independent variables
b) Two dependent variables
c) No dependent variable
d) None
Answer: (a)

13. The equation for a multiple regression model is:
a) Y = a + b1X1 + b2X2 + ⋯ + bkXk
b) Y = a + bX
c) X = a + bY
d) None
Answer: (a)

14. The coefficients b1, b2, … in multiple regression represent:
a) Partial effects of independent variables
b) Total effects
c) Means
d) None
Answer: (a)

15. The dependent variable in multiple regression must be:
a) Continuous
b) Categorical
c) Binary
d) None
Answer: (a)

16. In multiple regression, multicollinearity occurs when:
a) Independent variables are highly correlated
b) Dependent variables are highly correlated
c) Independent and dependent variables are unrelated
d) None
Answer: (a)
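
Worked note (Questions 12–14): the model Y = a + b1X1 + b2X2 can be fitted by ordinary least squares, and b1, b2 are then partial effects. A minimal numpy sketch follows; the small data set is assumed purely for illustration.

    import numpy as np

    # assumed toy data (illustrative only)
    X1 = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
    X2 = np.array([2.0, 1.0, 4.0, 3.0, 5.0])
    Y  = np.array([4.1, 5.9, 9.2, 10.8, 14.1])

    # design matrix with an intercept column: Y = a + b1*X1 + b2*X2
    A = np.column_stack([np.ones_like(X1), X1, X2])
    coef, *_ = np.linalg.lstsq(A, Y, rcond=None)
    print(coef)   # [a, b1, b2]; each slope is the change in Y per unit change in one X, holding the other fixed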
17. The adjusted R² is used in multiple regression to:
a) Penalize for the number of predictors
b) Maximize the variance
c) Remove outliers
d) None
Answer: (a)

18. If R² = 0.75, it means:
a) 75% of the variance in Y is explained by the predictors
b) 25% of the variance in Y is explained by the predictors
c) Y and X are independent
d) None
Answer: (a)

19. The F-test in multiple regression is used to test:
a) Overall significance of the model
b) Individual coefficients
c) Multicollinearity
d) None
Answer: (a)

20. Adding irrelevant variables to a multiple regression model will:
a) Reduce R²
b) Increase R²
c) Have no effect on R²
d) None
Answer: (b)

21. Backward elimination in regression analysis involves:
a) Removing insignificant predictors
b) Adding predictors
c) Changing the dependent variable
d) None
Answer: (a)

22. Partial correlation measures:
a) The correlation between two variables, controlling for a third variable
b) Total correlation
c) Nonlinear correlation
d) None
Answer: (a)

23. The multiple correlation coefficient (R) measures:
a) Strength of association between one dependent variable and multiple independent variables
b) Partial effects
c) Multicollinearity
d) None
Answer: (a)

24. The range of the multiple correlation coefficient is:
a) -1 to 1
b) 0 to 1
c) 0 to 2
d) None
Answer: (b)

25. If R² = 1, it indicates:
a) Perfect prediction
b) No prediction
c) Partial correlation
d) None
Answer: (a)
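
Worked note (Questions 17–18): adjusted R² = 1 − (1 − R²)(n − 1)/(n − k − 1), so it penalizes extra predictors and can fall when an irrelevant one is added. A minimal Python sketch; the values of R², n and k are assumed purely for illustration.

    # adjusted R^2 = 1 - (1 - R^2) * (n - 1) / (n - k - 1)
    def adjusted_r2(r2, n, k):
        # n = number of observations, k = number of predictors
        return 1 - (1 - r2) * (n - 1) / (n - k - 1)

    print(round(adjusted_r2(0.75, n=30, k=3), 3))    # 0.721: slightly below R^2 = 0.75
    print(round(adjusted_r2(0.75, n=30, k=10), 3))   # 0.618: more predictors, larger penalty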
26. The formula for multiple correlation is:
a) R²1.23 = (r12² + r13² − 2r12r13r23) / (1 − r23²)
b) R²1.23 = (r12² + r13² − 4r12r13r23) / (1 − r23²)
c) R²1.23 = (r12² + r13² − 2r12r13r23) / (1 − r12²)
d) None
Answer: (a)

27. If two independent variables are uncorrelated, the partial correlation is equal to:
a) Simple correlation
b) Zero
c) One
d) None
Answer: (a)

28. Partial correlation is zero if:
a) The variables are independent
b) The variables are correlated
c) R² = 1
d) None
Answer: (a)

29. In three variables X, Y, Z, the correlation between X and Y controlling for Z is:
a) Partial correlation
b) Multiple correlation
c) Simple correlation
d) None
Answer: (a)

30. Partial correlation is useful for:
a) Eliminating the effect of confounding variables
b) Testing hypotheses
c) Multicollinearity
d) None
Answer: (a)

31. The correlation coefficient between two variables is 0. What does this imply?
a) No linear relationship exists between the variables
b) No relationship exists between the variables
c) A nonlinear relationship exists between the variables
d) None of these
Answer: (a)

32. The sum of the two regression coefficients bxy and byx is:
a) Equal to the correlation coefficient
b) Always less than 1
c) Always greater than 1
d) None of these
Answer: (d)

33. If the correlation coefficient is positive, then the regression lines will:
a) Intersect at 90°
b) Intersect at 45°
c) Have a positive slope
d) None
Answer: (c)

34. A perfect correlation (r = 1) implies that:
a) All points lie on a straight line
b) Regression lines are perpendicular
c) The scatter plot is random
d) None of these
Answer: (a)
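
Worked note (Question 26): the correct option gives R²1.23 = (r12² + r13² − 2r12r13r23) / (1 − r23²). A minimal Python sketch of this formula follows; the pairwise correlation values are assumed purely for illustration.

    import math

    # assumed pairwise correlations (illustrative only)
    r12, r13, r23 = 0.6, 0.5, 0.4

    # R^2(1.23) = (r12^2 + r13^2 - 2*r12*r13*r23) / (1 - r23^2)
    R2 = (r12**2 + r13**2 - 2 * r12 * r13 * r23) / (1 - r23**2)
    print(R2, math.sqrt(R2))   # ≈ 0.440 and ≈ 0.664; R always lies between 0 and 1 (Question 24)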
35. Which of the following methods can estimate parameters in a multiple regression model?
a) Ordinary Least Squares (OLS)
b) Maximum Likelihood Estimation (MLE)
c) Both (a) and (b)
d) None of these
Answer: (c)

36. If two variables X and Y are independent, then the correlation coefficient between them is:
a) 0
b) 1
c) -1
d) Undefined
Answer: (a)

37. In a regression equation, multicollinearity affects:
a) The standard errors of the coefficients
b) The slope of the regression line
c) The correlation coefficient
d) None
Answer: (a)

38. In regression analysis, residuals are:
a) The differences between observed and predicted values
b) The coefficients of independent variables
c) The intercept of the regression line
d) None
Answer: (a)

39. Which of the following is a violation of the assumptions of multiple regression?
a) Multicollinearity
b) Homoscedasticity
c) Linearity
d) None
Answer: (a)

40. The Durbin-Watson test is used to detect:
a) Multicollinearity
b) Autocorrelation in residuals
c) Normality of residuals
d) Heteroscedasticity
Answer: (b)

41. The coefficient of determination (R²) in regression indicates:
a) The proportion of variance in the dependent variable explained by the independent variables
b) The strength of the relationship between variables
c) Both (a) and (b)
d) None
Answer: (a)

42. If R² = 0.9, it means that:
a) 90% of the variance in the dependent variable is explained
b) 10% of the variance in the dependent variable is explained
c) The correlation coefficient is 0.9
d) None of these
Answer: (a)
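
Worked note (Questions 38, 40 and 41): residuals are observed minus predicted values, R² = 1 − Σe² / Σ(y − ȳ)², and the Durbin-Watson statistic DW = Σ(e_t − e_(t−1))² / Σe_t² is close to 2 when the residuals show no first-order autocorrelation. A minimal numpy sketch follows; the observed and fitted values are assumed purely for illustration.

    import numpy as np

    # assumed observed and fitted values (illustrative only)
    y     = np.array([3.0, 5.0, 7.0, 9.0, 11.0])
    y_hat = np.array([3.4, 4.8, 7.1, 8.7, 11.0])

    e  = y - y_hat                                        # residuals = observed - predicted
    r2 = 1 - np.sum(e**2) / np.sum((y - y.mean())**2)     # proportion of variance explained
    dw = np.sum(np.diff(e)**2) / np.sum(e**2)             # Durbin-Watson statistic
    print(r2, dw)                                         # roughly 0.99 and 2.3 here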
43. The assumption of homoscedasticity in regression implies:
a) The variance of residuals is constant across all levels of the independent variable
b) The mean of residuals is zero
c) Residuals are normally distributed
d) None
Answer: (a)

44. Which diagnostic tool is used to detect influential observations in regression analysis?
a) Cook's Distance
b) Durbin-Watson statistic
c) Variance Inflation Factor (VIF)
d) None
Answer: (a)

45. Heteroscedasticity refers to:
a) Unequal variance of residuals
b) Multicollinearity among independent variables
c) Nonlinearity in data
d) None
Answer: (a)

46. A partial regression plot is used to:
a) Identify the relationship between two variables, controlling for other variables
b) Examine residuals
c) Test for multicollinearity
d) None
Answer: (a)

47. Which plot is used to check normality of residuals in regression?
a) Q-Q plot
b) Scatter plot
c) Line chart
d) None
Answer: (a)

48. If two variables are perfectly correlated, the regression line will:
a) Pass through every point in the scatter plot
b) Be parallel to the x-axis
c) Be parallel to the y-axis
d) None
Answer: (a)

49. Which of the following tests multicollinearity in regression?
a) Variance Inflation Factor (VIF)
b) Durbin-Watson statistic
c) Shapiro-Wilk test
d) None
Answer: (a)

50. When adding a new predictor to a regression model, the adjusted R²:
a) Can decrease if the predictor is irrelevant
b) Always increases
c) Always decreases
d) None
Answer: (a)
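
Worked note (Question 49): the VIF for predictor j is 1 / (1 − R_j²), where R_j² comes from regressing that predictor on the remaining predictors; large values (often above about 10) are read as a sign of serious multicollinearity. A minimal numpy sketch follows; the small predictor matrix is assumed purely for illustration.

    import numpy as np

    # assumed predictor matrix; the first two columns are nearly collinear (illustrative only)
    X = np.array([[1.0, 2.1, 3.0],
                  [2.0, 3.9, 5.1],
                  [3.0, 6.2, 7.0],
                  [4.0, 8.1, 9.2],
                  [5.0, 9.8, 11.1]])

    def vif(X, j):
        y = X[:, j]                                       # regress predictor j on the remaining predictors
        Z = np.column_stack([np.ones(len(X)), np.delete(X, j, axis=1)])
        y_hat = Z @ np.linalg.lstsq(Z, y, rcond=None)[0]
        r2 = 1 - np.sum((y - y_hat)**2) / np.sum((y - y.mean())**2)
        return 1.0 / (1.0 - r2)                           # VIF_j = 1 / (1 - R_j^2)

    print([round(vif(X, j), 1) for j in range(X.shape[1])])   # large values flag multicollinearity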
