Stats Reviewer

1. Cronbach’s Alpha (Reliability Test)


Think of it like this:
If you give your friends 10 questions to measure "how introverted they are", Cronbach’s Alpha
checks if the questions are consistent with each other.
• Acceptable numbers (rule of thumb):
o ≥ 0.90 → Excellent
o 0.80–0.89 → Good
o 0.70–0.79 → Acceptable
o 0.60–0.69 → Questionable
o < 0.60 → Poor
• SPSS path:
Analyze → Scale → Reliability Analysis → Select variables → Choose Cronbach’s Alpha
• Quick tip: If alpha is too low, check “Cronbach’s Alpha if Item Deleted” in SPSS to find
the bad question.
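SPSS computes alpha for you, but the formula is simple enough to check by hand. Here is a minimal Python sketch with invented data (three copies of the same item, which must give alpha = 1.0):

```python
import numpy as np

def cronbach_alpha(items):
    """Cronbach's alpha for an (n_respondents, n_items) array."""
    items = np.asarray(items, dtype=float)
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1)      # variance of each item
    total_var = items.sum(axis=1).var(ddof=1)  # variance of the total scores
    return (k / (k - 1)) * (1 - item_vars.sum() / total_var)

# Perfectly consistent items (each item identical) give alpha = 1.0
base = np.arange(1, 11, dtype=float)
data = np.column_stack([base, base, base])
print(round(cronbach_alpha(data), 2))  # 1.0
```

In practice you would feed it one column per questionnaire item, exactly as the data sits in SPSS.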

2. T-test
A t-test compares the averages of two groups to see if they are significantly different.
Types:
1. Independent Samples T-test → Compare 2 different groups.
Example: Male vs. Female test scores.
SPSS: Analyze → Compare Means → Independent-Samples T Test
2. Paired Samples T-test → Compare the same group twice.
Example: Weight before and after a diet.
SPSS: Analyze → Compare Means → Paired-Samples T Test
3. One-Sample T-test → Compare a sample mean to a known number.
Example: Is your class average different from 75?
SPSS: Analyze → Compare Means → One-Sample T Test
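If you want to verify the SPSS numbers outside the GUI, SciPy has a one-line equivalent for each of the three t-tests. The data below is randomly generated purely for illustration:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)

# Independent samples: two different groups (e.g. male vs. female scores)
group_a = rng.normal(70, 5, 30)
group_b = rng.normal(75, 5, 30)
t_ind, p_ind = stats.ttest_ind(group_a, group_b)

# Paired samples: the same subjects measured twice (e.g. weight before/after a diet)
before = rng.normal(80, 10, 25)
after = before - rng.normal(2.0, 1.0, 25)
t_pair, p_pair = stats.ttest_rel(before, after)

# One sample vs. a known value (e.g. is the class average different from 75?)
t_one, p_one = stats.ttest_1samp(group_a, 75)

print(p_ind, p_pair, p_one)
```

Each call returns the t-statistic and the two-tailed p-value — the same "Sig. (2-tailed)" SPSS prints.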

3. Linear & Multiple Regression


Regression predicts an outcome based on one or more predictor variables.
• Linear Regression → 1 predictor, 1 outcome.
Example: Predict sales from advertising budget.
• Multiple Regression → More than 1 predictor.
Example: Predict salary from years of experience + education level + training.
SPSS path:
Analyze → Regression → Linear → Put outcome in "Dependent" → predictors in
"Independent(s)"
Key outputs:
• R² → % of variation explained by your predictors (closer to 1 is better).
• Beta (β) → Strength & direction of each predictor. Positive means "more increases
outcome", negative means "more decreases outcome".
• Sig. (p-value) → < 0.05 means predictor is statistically significant.
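As a cross-check on the SPSS output, `scipy.stats.linregress` returns the same slope, R², and p-value for the one-predictor case. The budget/sales numbers here are invented:

```python
from scipy import stats

# Hypothetical data: advertising budget vs. sales
budget = [10, 20, 30, 40, 50, 60]
sales = [25, 44, 68, 81, 105, 122]

res = stats.linregress(budget, sales)
print(f"slope = {res.slope:.2f}")      # Beta: strength & direction of the predictor
print(f"R^2 = {res.rvalue ** 2:.3f}")  # share of variation explained
print(f"p = {res.pvalue:.5f}")         # < 0.05 -> significant predictor
```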

4. Deciding Which Test to Use (High Yield)


Here’s your decision tree:

Scenario → Test
• Compare means of 2 different groups → Independent t-test
• Same group, measured twice → Paired t-test
• Compare means of 3+ groups → ANOVA
• Predict outcome from 1+ variables → Regression
• Check relationship strength between 2 variables → Correlation
• Compare proportions/categories → Chi-square test
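Every test in the tree except chi-square gets its own section in this reviewer; for completeness, here is a chi-square test of proportions in SciPy, run on an invented contingency table (counts of late vs. on-time payments by client type):

```python
import numpy as np
from scipy import stats

# Rows = 4 client types, columns = [late, on-time] (hypothetical counts)
observed = np.array([[20, 30],
                     [25, 25],
                     [40, 10],
                     [15, 35]])

chi2, p, dof, expected = stats.chi2_contingency(observed)
print(f"chi2 = {chi2:.2f}, df = {dof}, p = {p:.5f}")
# p < 0.05 -> the late-payment proportion differs across client types
```

In SPSS the same test lives under Analyze → Descriptive Statistics → Crosstabs → Statistics → Chi-square.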

5. ANOVA (Analysis of Variance)


ANOVA is like a t-test for more than 2 groups.
Example: Test scores of students from 3 different schools.
• Null hypothesis (H₀): All groups have the same mean.
• Alternative (H₁): At least one group mean is different.
SPSS path:
Analyze → Compare Means → One-Way ANOVA
Key outputs:
• F-statistic → Ratio of between-group variance to within-group variance.
• Sig. (p-value) → < 0.05 means at least one group is different.
• If significant → run Post Hoc Tests to find exactly which groups differ.
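`scipy.stats.f_oneway` reproduces the F and p from the SPSS ANOVA table. The three "schools" below are made-up scores:

```python
from scipy import stats

# Hypothetical test scores from three schools
school_a = [78, 82, 85, 80, 79]
school_b = [72, 70, 74, 71, 73]
school_c = [88, 90, 86, 91, 89]

f_stat, p_value = stats.f_oneway(school_a, school_b, school_c)
print(f"F = {f_stat:.2f}, p = {p_value:.6f}")
# p < 0.05 here, so a post hoc test (e.g. Tukey HSD) would be the next step
```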

6. Correlation
Checks how strongly 2 variables move together.
• r value range: -1 to +1
o +1 → Perfect positive relationship
o 0 → No relationship
o -1 → Perfect negative relationship
Example: Height & weight. Taller people tend to weigh more → positive correlation.
SPSS path:
Analyze → Correlate → Bivariate
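The same Pearson r that SPSS reports under Analyze → Correlate → Bivariate is available as `scipy.stats.pearsonr` (height/weight values below are invented):

```python
from scipy import stats

# Hypothetical height (cm) and weight (kg) pairs
height = [150, 158, 163, 170, 175, 182, 188]
weight = [50, 55, 61, 66, 72, 79, 85]

r, p = stats.pearsonr(height, weight)
print(f"r = {r:.3f}, p = {p:.5f}")  # r near +1 -> strong positive relationship
```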

7. Assumptions (Must Check Before Tests)


T-test:
• Data is continuous (interval/ratio scale)
• Groups are independent (for independent t-test)
• Normal distribution (roughly bell-shaped)
• Equal variances (check Levene’s test in SPSS)
ANOVA:
• Data is continuous
• Independent groups
• Normality
• Equal variances (Levene’s test again)
Regression:
• Linear relationship between predictors & outcome
• No extreme outliers
• No multicollinearity (predictors shouldn’t be highly correlated with each other)
• Homoscedasticity (residuals have equal spread)
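Two of these assumption checks have direct SciPy equivalents — Levene's test for equal variances and Shapiro-Wilk for normality. The sample data is invented; group_b is group_a shifted by 7, so the variances are identical by construction:

```python
from scipy import stats

group_a = [23, 25, 28, 30, 27, 26, 24, 29]
group_b = [x + 7 for x in group_a]  # same spread, shifted mean

# Levene's test: H0 = equal variances; p > 0.05 means the assumption holds
w_stat, p_levene = stats.levene(group_a, group_b)

# Shapiro-Wilk: H0 = data is normal; p > 0.05 means normality is plausible
sw_stat, p_shapiro = stats.shapiro(group_a)

print(f"Levene p = {p_levene:.3f}, Shapiro-Wilk p = {p_shapiro:.3f}")
```

Note the logic is inverted compared to the main tests: for assumption checks you *want* p > 0.05.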

8. Beta, Sig., F-statistic


When you run regression in SPSS:
• Beta (β): Shows direction & strength of each variable’s effect.
• Sig. (p-value): < 0.05 means the variable significantly predicts the outcome.
• F-statistic: Tests if the overall regression model is significant (p < 0.05 means the
predictors together explain the outcome well).
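These three outputs can be computed by hand for a small multiple regression, which makes the F formula concrete. A NumPy sketch with invented salary/experience/education numbers:

```python
import numpy as np
from scipy import stats

# Hypothetical: predict salary (thousands) from years of experience and education
experience = np.array([1, 3, 5, 7, 9, 11, 13, 15])
education = np.array([12, 14, 12, 16, 14, 16, 18, 18])
salary = np.array([30, 38, 42, 55, 58, 68, 80, 88], dtype=float)

# Design matrix with an intercept column; least squares gives the betas
X = np.column_stack([np.ones_like(experience), experience, education]).astype(float)
beta, *_ = np.linalg.lstsq(X, salary, rcond=None)  # [intercept, b_exp, b_edu]

fitted = X @ beta
ss_res = np.sum((salary - fitted) ** 2)
ss_tot = np.sum((salary - salary.mean()) ** 2)
r2 = 1 - ss_res / ss_tot                 # variance explained

n, k = len(salary), 2                    # k = number of predictors
f_stat = (r2 / k) / ((1 - r2) / (n - k - 1))
p_model = stats.f.sf(f_stat, k, n - k - 1)  # overall model significance

print(f"R^2 = {r2:.3f}, F = {f_stat:.2f}, p = {p_model:.5f}")
```

SPSS reports exactly these quantities in its Model Summary (R²), Coefficients (betas with per-coefficient Sig.), and ANOVA (F, overall Sig.) tables.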

Final Exam – Statistical Analysis with Software Application (SPSS)

50 Multiple Choice Questions with Answers & Explanations

1. Which Cronbach’s Alpha value is generally considered the minimum acceptable for most
academic research?

A. 0.50
B. 0.60
C. 0.70
D. 0.90

Answer: C
Explanation: In most research, a Cronbach’s Alpha ≥ 0.70 indicates acceptable reliability. Values ≥
0.80 are good, and ≥ 0.90 are excellent.

2. In SPSS, which menu path do you follow to compute Cronbach’s Alpha?

A. Analyze > Compare Means > Means
B. Analyze > Scale > Reliability Analysis
C. Analyze > Descriptive Statistics > Descriptives
D. Analyze > Correlate > Bivariate

Answer: B
Explanation: Cronbach’s Alpha is calculated in Analyze > Scale > Reliability Analysis, then
selecting "Alpha" as the model.

3. A researcher has Cronbach’s Alpha = 0.62. What is the best first step to improve it?

A. Add random new questions
B. Delete items with low item–total correlation
C. Reverse code all items
D. Ignore the result

Answer: B
Explanation: Removing items that do not correlate well with the total score can improve alpha, as
they reduce internal consistency.

4. Which of the following is an assumption for an independent samples t-test?

A. Dependent variable is categorical
B. Groups have equal variances
C. Independent variable is continuous
D. Samples are dependent

Answer: B
Explanation: Independent t-tests assume normal distribution, equal variances (Levene’s test), and
independent observations.

5. In SPSS, which menu option performs an independent samples t-test?

A. Analyze > Compare Means > Paired-Samples T Test
B. Analyze > Compare Means > One-Sample T Test
C. Analyze > Compare Means > Independent-Samples T Test
D. Analyze > Compare Means > Means

Answer: C
Explanation: The Independent-Samples T Test is used to compare means between two unrelated
groups.

6. When running a paired-samples t-test, what type of data is required?

A. Nominal data from different groups
B. Continuous data from the same subjects measured twice
C. Ordinal data from independent groups
D. Categorical data

Answer: B
Explanation: Paired t-tests compare two related measurements, such as pre-test and post-test
scores from the same group.

7. A company compares pre-training and post-training scores of employees. Which test should
be used?

A. Independent t-test
B. Paired t-test
C. One-way ANOVA
D. Regression

Answer: B
Explanation: Since the same individuals are measured twice, a paired t-test is appropriate.

8. In an independent t-test output, the p-value is found in which column of the "Independent
Samples Test" table?

A. Sig. (2-tailed)
B. Levene’s Test
C. Mean Difference
D. Standard Error Difference

Answer: A
Explanation: The p-value for the hypothesis test is in the Sig. (2-tailed) column.

9. Which assumption is common to ANOVA, regression, and t-tests?

A. Homoscedasticity
B. Categorical dependent variable
C. Unequal variances
D. Chi-square distribution

Answer: A
Explanation: All these tests assume homoscedasticity (equal variances) when comparing group
means.

10. In SPSS, where do you find the F-statistic for a one-way ANOVA?

A. In the ANOVA table
B. In the Post Hoc table
C. In the Descriptives table
D. In the Levene’s Test table

Answer: A
Explanation: The F-value and associated p-value are presented in the ANOVA table of SPSS
output.

11. An F-statistic in ANOVA represents:

A. Ratio of between-group variance to within-group variance
B. Difference in medians between groups
C. Correlation strength
D. Probability of error

Answer: A
Explanation: F = Between-group variance / Within-group variance. Larger values suggest greater
between-group differences.

12. If the p-value in an ANOVA is less than 0.05, what does this indicate?

A. Groups have equal means
B. At least one group mean is significantly different
C. No significant difference
D. The null hypothesis is true

Answer: B
Explanation: A p < 0.05 means there is evidence that at least one group differs from the others.

13. A study tests weight loss among three diets. Which test is most appropriate?

A. Independent t-test
B. Paired t-test
C. One-way ANOVA
D. Regression

Answer: C
Explanation: Comparing more than two group means uses ANOVA.

14. Which SPSS path runs a one-way ANOVA?

A. Analyze > Compare Means > One-Way ANOVA
B. Analyze > Descriptive Statistics > Explore
C. Analyze > Compare Means > Independent-Samples T Test
D. Analyze > Correlate > Bivariate

Answer: A
Explanation: The correct menu for one-way ANOVA is Analyze > Compare Means > One-Way
ANOVA.

15. Correlation measures:

A. Causal relationship
B. Degree of linear relationship between two variables
C. Difference in means
D. Effect size in ANOVA

Answer: B
Explanation: Correlation assesses how strongly two variables move together in a linear fashion.

16. Which SPSS menu path is used for correlation analysis?

A. Analyze > Compare Means > Means
B. Analyze > Correlate > Bivariate
C. Analyze > Regression > Linear
D. Analyze > Scale > Reliability Analysis

Answer: B
Explanation: Pearson, Spearman, and Kendall correlations are accessed under Analyze >
Correlate > Bivariate.

17. In correlation output, the “Sig. (2-tailed)” value represents:

A. The effect size
B. The p-value testing if correlation ≠ 0
C. The correlation coefficient
D. The slope of the regression line

Answer: B
Explanation: “Sig. (2-tailed)” tests the null hypothesis that the correlation is zero.

18. A Pearson correlation coefficient (r) = 0.85 means:

A. A strong positive linear relationship
B. A strong negative linear relationship
C. A weak positive relationship
D. No relationship

Answer: A
Explanation: r values close to 1 indicate a strong positive relationship.

19. Which assumption is required for Pearson correlation?


A. Categorical variables
B. Linearity and normality
C. Multicollinearity
D. Heteroscedasticity only

Answer: B
Explanation: Pearson correlation assumes both variables are continuous, normally distributed,
and linearly related.

20. In SPSS, to run a simple linear regression, use:

A. Analyze > Regression > Linear
B. Analyze > Correlate > Bivariate
C. Analyze > Compare Means > Means
D. Analyze > Descriptive Statistics > Descriptives

Answer: A
Explanation: Linear regression is done under Analyze > Regression > Linear.

21. In regression, the β (beta) coefficient indicates:

A. The mean difference
B. The strength and direction of predictor’s effect
C. The proportion of variance explained
D. The test statistic for ANOVA

Answer: B
Explanation: Beta shows how much the dependent variable changes per unit change in the
predictor.

22. R² = 0.65 means:

A. The predictors explain 65% of variance in the dependent variable
B. The model is correct 65% of the time
C. Each predictor has a 65% chance of being significant
D. 65% of predictors are independent

Answer: A
Explanation: R² is the proportion of variance in the dependent variable explained by the predictors.

23. In regression output, “Sig.” values test:


A. Whether β is significantly different from zero
B. Whether variances are equal
C. Whether correlation is strong
D. Whether F-statistic is large

Answer: A
Explanation: The p-value for each coefficient tests if the predictor has a statistically significant
effect.

24. In SPSS regression, the F-statistic tests:

A. Each β individually
B. Whether the model explains a significant amount of variance overall
C. Equality of means
D. Multicollinearity

Answer: B
Explanation: The F-test checks if the overall model significantly predicts the dependent variable.

25. Which assumption is specific to regression and ANOVA but not t-test?

A. Independence
B. Homoscedasticity
C. Linearity between predictors and outcome
D. Normality

Answer: C
Explanation: Linearity is crucial in regression; ANOVA can be seen as a special regression case.

26. A researcher wants to see if average sales differ by branch (Branch A, B, C). Which test?

A. Independent t-test
B. One-way ANOVA
C. Regression
D. Correlation

Answer: B
Explanation: More than two group means = one-way ANOVA.

27. Which post-hoc test is common after ANOVA?


A. Levene’s Test
B. Tukey HSD
C. Chi-square
D. Shapiro-Wilk

Answer: B
Explanation: Tukey’s HSD identifies which specific groups differ.

28. In SPSS ANOVA, Levene’s Test checks:

A. Equality of means
B. Equality of variances
C. Normality
D. Independence

Answer: B
Explanation: Levene’s Test assesses the homogeneity of variances assumption.

29. Paired t-test is appropriate when:

A. Two related measurements per subject
B. Two independent groups
C. More than two groups
D. Variables are categorical

Answer: A
Explanation: Paired t-tests compare dependent measurements.

30. Independent t-test is appropriate when:

A. Comparing two unrelated groups’ means
B. Same group pre/post
C. More than two groups
D. Testing correlation

Answer: A
Explanation: It’s for comparing means of two independent samples.

31. Cronbach’s Alpha = 0.94 means:

A. Poor reliability
B. Excellent reliability
C. Acceptable reliability
D. Unacceptable reliability

Answer: B
Explanation: ≥ 0.90 = excellent.

32. In SPSS, the grouping variable for an independent t-test is defined in:

A. Test Variable(s) box
B. Grouping Variable box
C. Options menu
D. Post Hoc menu

Answer: B
Explanation: The grouping variable defines the categories for comparison.

33. Which SPSS feature converts continuous to categorical?

A. Visual Binning
B. Descriptives
C. Crosstabs
D. Means

Answer: A
Explanation: Visual binning groups continuous values into categories.

34. To test if pre-test and post-test scores are significantly different, use:

A. Independent t-test
B. Paired t-test
C. One-way ANOVA
D. Chi-square

Answer: B
Explanation: Same group, two measurements = paired t-test.

35. Which assumption is common to all parametric tests?

A. Normality of data
B. Categorical variables
C. Unequal variances
D. Multicollinearity
Answer: A
Explanation: Normal distribution of the dependent variable is required.

36. Multiple regression is used when:

A. One dependent variable, several predictors
B. Several dependent variables, one predictor
C. One predictor, one outcome
D. Comparing means

Answer: A
Explanation: Multiple regression predicts DV from multiple IVs.

37. In SPSS, to check assumptions for regression, you can:

A. Analyze residual plots
B. Look at frequencies
C. Run a Chi-square
D. Use Crosstabs

Answer: A
Explanation: Residual plots help check linearity, homoscedasticity, and normality.

38. If Sig. < 0.05 in t-test, you:

A. Fail to reject null
B. Reject null hypothesis
C. Accept null hypothesis
D. Ignore results

Answer: B
Explanation: p < 0.05 means significant difference, reject H₀.

39. SPSS command for detailed frequency distribution is:

A. Analyze > Descriptive Statistics > Frequencies
B. Analyze > Descriptive Statistics > Descriptives
C. Analyze > Compare Means > Means
D. Analyze > Descriptive Statistics > Explore

Answer: A
Explanation: Frequencies provides counts, %, cumulative %.

40. Independent t-test menu path is:

A. Analyze > Compare Means > Paired-Samples T Test
B. Analyze > Compare Means > One-Sample T Test
C. Analyze > Compare Means > Independent-Samples T Test
D. Analyze > Descriptive Statistics > Descriptives

Answer: C
Explanation: This option compares two independent group means.

41. Grouping variable for independent t-test is set in:

A. Test Variable(s)
B. Grouping Variable
C. Define Groups
D. Options

Answer: B
Explanation: The grouping variable tells SPSS which groups to compare.

42. p-value for independent t-test is in:

A. Group Statistics
B. Descriptives
C. Independent Samples Test
D. Case Processing Summary

Answer: C
Explanation: This table shows the t-statistic and p-value.

43. One-way ANOVA menu path is:

A. Analyze > Compare Means > One-Way ANOVA
B. Analyze > Compare Means > Independent-Samples T Test
C. Analyze > General Linear Model > Univariate
D. Analyze > Descriptive Statistics > Explore

Answer: A
Explanation: The specific menu for one-way ANOVA is under Compare Means.

44. Dependent variable in ANOVA goes in:

A. Factor box
B. Dependent List box
C. Fixed Factor(s) box
D. Dependent Variable box

Answer: D
Explanation: The dependent variable is placed in the “Dependent Variable” field.

45. F-value and p-value in ANOVA are found in:

A. Descriptives
B. ANOVA table
C. Post Hoc Tests
D. Means Plots

Answer: B
Explanation: The ANOVA table contains these statistics.

46. To test difference in satisfaction between those who got a report vs. not:

A. ANOVA
B. Correlation
C. Independent t-test
D. Regression

Answer: C
Explanation: Two independent groups = independent t-test.

47. Regression with R² = 0.82 means:

A. Model explains 82% of DV variance
B. 82% probability model is correct
C. Variables are independent
D. No multicollinearity

Answer: A
Explanation: R² is variance explained by predictors.

48. Comparing accuracy of 5 preparers:

A. ANOVA
B. Paired t-test
C. Regression
D. Chi-square
Answer: A
Explanation: More than two groups = ANOVA.

49. Testing if late payment proportion is same across 4 client types:

A. Independent t-test
B. Chi-square
C. ANOVA
D. Regression

Answer: B
Explanation: Chi-square is used for proportions/frequencies.

50. Predicting net income from expenses, sales, tax:

A. Correlation
B. Descriptive Statistics
C. Multiple Regression
D. T-test

Answer: C
Explanation: Multiple regression predicts a DV from several IVs.
