Drawing conclusions about the dependent variable requires that we make six assumptions, the classic assumptions of the linear regression model:

1. The relationship between the dependent variable Y and the independent variable X is linear in the slope and intercept parameters b and a. This requirement means that neither regression parameter can be multiplied or divided by the other (e.g. a/b), and that both parameters appear raised to the first power only. In other words, we cannot call a model linear if its equation is Y = a + b²X + ε, because a unit change in X would then have a b² effect on Y, and the relation would be nonlinear in the parameter b.
2. The independent variable X is not random.
3. The expected value of the error term ε is 0. Assumptions 2 and 3 allow the linear regression model to produce estimates of the slope b and the intercept a.
4. The variance of the error term is constant for all observations. Assumption 4 is known as the homoskedasticity assumption. When a linear regression is heteroskedastic, the variance of its error terms differs across observations, and the model may not be useful in predicting values of the dependent variable.
5. The error term ε is uncorrelated across observations; in other words, the covariance between the error term of one observation and the error term of any other is assumed to be 0. This assumption is necessary to estimate the variances of the parameters.
6. The distribution of the error terms is normal. Assumption 6 allows hypothesis-testing methods to be applied to linear regression models.
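As an illustration (not part of the source text), the sketch below simulates data that satisfies the assumptions, fits Y = a + bX + ε by ordinary least squares, and runs two crude residual checks: the residuals average to zero (assumption 3), and their spread is similar across the range of X (assumption 4). All names and numeric choices here are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(42)
n = 500
X = np.linspace(0.0, 10.0, n)                 # X is fixed, not random (assumption 2)
eps = rng.normal(loc=0.0, scale=1.0, size=n)  # mean-0, constant-variance, normal errors
Y = 2.0 + 0.5 * X + eps                       # true intercept a=2.0, slope b=0.5

# OLS estimates of slope b and intercept a via the normal equations
b_hat = np.cov(X, Y, bias=True)[0, 1] / np.var(X)
a_hat = Y.mean() - b_hat * X.mean()

residuals = Y - (a_hat + b_hat * X)

# Assumption 3: OLS residuals average to (numerically) zero
mean_resid = residuals.mean()

# Assumption 4 (homoskedasticity): compare residual variance in the
# lower and upper halves of X; a large gap would suggest heteroskedasticity
var_low = residuals[: n // 2].var()
var_high = residuals[n // 2 :].var()
```

Under heteroskedasticity (e.g. `scale=0.2 * X` in the error draw), `var_low` and `var_high` would diverge sharply, which is why plotting residuals against X is a standard diagnostic.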
