
09.05.2025

IE 360 Statistical Forecasting and Time Series


Quiz 3, Spring 2025
Duration: 45 minutes
Instructions:
• This quiz assesses your understanding of time series modeling concepts.
• Read each question carefully. The quiz contains a mix of True/False and Multiple Choice questions. No calculations are required; focus on interpretation and conceptual understanding.
• Answer all questions. For multiple-choice questions, select the best answer. For true/false, simply
indicate True or False (with a brief explanation if instructed).
Good luck!

True/False Questions
Provide a brief explanation if your answer is False (5 points each)

Q1. Regressing two independent non-stationary time series can produce misleadingly high R² and
significant coefficients.

Q2. Differencing a stationary series does not harm its stationarity.

Q3. The ACF of a stationary MA(1) process exhibits a single spike at lag 1 followed by near-zero
autocorrelations.

Q4. The PACF of a white noise process has a sharp cutoff at lag 0.

Q5. Including seasonal dummy variables in a regression model can capture SARIMA-type behavior.

Q6. First-order differencing removes both trend and seasonality if they exist together in a series.

Q7. A seasonal ARIMA model can capture both short-term autocorrelation and repeating seasonal
patterns in the same framework.

Q8. A non-zero mean in residuals from an ARIMA model indicates a violation of model assumptions.

Multiple Choice Questions (5 points each)


Q1. You observe the following for a time series model:
• ACF shows slow decay
• PACF cuts off sharply after lag 1
What model is most consistent with this behavior?
A. AR(1)
B. MA(1)
C. ARIMA(0,1,1)
D. White Noise

Q2. Which of the following would most likely cause a spurious regression?
A. Fitting an AR(1) model to stationary white noise
B. Regressing two trend series without first transforming them to stationarity
C. Regressing differenced series with omitted variables
D. Including seasonal dummies in stationary data

Q3. Which of the following SARIMA models best captures monthly seasonality and a trend that requires differencing?
A. ARIMA(1,1,1)
B. SARIMA(1,1,1)(0,1,1)[12]
C. SARIMA(0,1,1)(1,0,0)[7]
D. ARIMA(0,0,1)

Q4. You are given a seasonal time series. After applying SARIMA(1,0,0)(0,1,1)[12], the
residuals show no autocorrelation. What is the best next step?
A. Add more seasonal AR terms
B. Add a deterministic trend
C. Evaluate residual diagnostics and finalize the model
D. Try differencing again

Q5. Which of the following models could produce an ACF with spikes at lags 12, 24, and 36?
A. ARIMA(0,0,1)
B. SARIMA(0,0,1)(0,1,1)[12]
C. ARIMA(1,1,0)
D. SARIMA(1,0,0)(1,0,0)[12]

Q6. You suspect that temperature and electricity demand have a true relationship. However, when you
regress them in levels (without transformation), the residuals show high autocorrelation and both series
show upward trends. What is the best course of action?
A. Add lagged residuals to the model
B. Transform both series to differences before regression
C. Remove outliers from the residuals
D. Include dummy variables for months

Q7. After differencing a time series, the ACF shows a spike at lag 1 and nothing beyond. What is the
likely model?
A. AR(1)
B. MA(1)
C. ARIMA(0,1,1)
D. White noise

Q8. A researcher fits a model and observes that residuals pass all white noise checks, yet out-of-sample
forecasts are poor. What is the most likely issue?
A. The model is too simple
B. The model is overfit to in-sample noise
C. Residuals should be autocorrelated
D. Differencing was not used

Q9. Why might a researcher prefer BIC over AIC for ARIMA model selection?
A. BIC penalizes model complexity more heavily
B. AIC is not defined for time series
C. BIC always leads to the most accurate forecasts
D. AIC favors underfitting in small samples

Q10. (Select all that apply) You fit an ARIMA(1,1,1) model to a differenced time series, but the AIC is
higher than for ARIMA(0,1,1) and residuals show significant PACF at lag 1. What does this likely
suggest?
A. The AR term is unnecessary
B. The MA(1) model is underfitting
C. Differencing may have overdifferenced the series
D. PACF of residuals should be flat if the model is adequate

Q11. You model a time series and find that the residuals follow a cyclical pattern. What is the best next
modeling step?
A. Include an MA term
B. Add a seasonal differencing term
C. Increase the forecast horizon
D. Reduce the AR order

Q12. An analyst decomposes a daily demand series using STL into trend, seasonal, and residual components. She models the residuals using ARIMA(1,0,1) and observes the following:
• Residuals from the ARIMA model pass the Ljung-Box test,
• Forecasts on held-out data show bias during specific forecast windows,
• Adding exogenous variables or increasing AR/MA order does not reduce forecast error.
Which of the following is the most plausible explanation?
A. The STL decomposition may have misattributed dynamic features (e.g., structural shifts) to the
residual instead of trend or seasonality
B. The ARIMA model is underfitting and requires differencing or seasonal terms
C. The trend and seasonal components should have been modeled jointly with the residual component
D. The residuals may show time-varying patterns in variability, which are not captured by the ARIMA
model
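For reference only (not part of the quiz): below is a minimal Python sketch of the workflow described in Q12, using statsmodels with a synthetic daily demand series and an assumed weekly seasonal period of 7. The actual data, seasonal period, and tooling are not specified in the quiz.

import numpy as np
import pandas as pd
from statsmodels.tsa.seasonal import STL
from statsmodels.tsa.arima.model import ARIMA
from statsmodels.stats.diagnostic import acorr_ljungbox

# Synthetic stand-in for the analyst's daily demand series (hypothetical:
# mild trend + weekly seasonality + noise).
rng = np.random.default_rng(0)
n = 365
idx = pd.date_range("2024-01-01", periods=n, freq="D")
demand = pd.Series(
    100 + 0.05 * np.arange(n)
    + 10 * np.sin(2 * np.pi * np.arange(n) / 7)
    + rng.normal(0, 2, n),
    index=idx,
)

# 1) STL decomposition into trend, seasonal, and residual components
#    (period=7 assumes weekly seasonality for a daily series).
stl_fit = STL(demand, period=7).fit()
remainder = stl_fit.resid

# 2) Model the STL remainder with ARIMA(1,0,1).
arima_fit = ARIMA(remainder, order=(1, 0, 1)).fit()

# 3) Ljung-Box test on the ARIMA residuals; large p-values indicate no
#    evidence of leftover autocorrelation, as stated in the question.
print(acorr_ljungbox(arima_fit.resid, lags=[7, 14], return_df=True))

# 4) Forecast the remainder; a full forecast would add back the
#    extrapolated trend and seasonal components from the STL fit.
print(arima_fit.forecast(steps=14))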
