I'm doing a simple linear regression. I tried:
    > mod <- lm(rnorm(100, sd = 2) ~ rnorm(100, sd = 2.1))
    > summary(mod)

    Call:
    lm(formula = rnorm(100, sd = 2) ~ rnorm(100, sd = 2.1))

    Residuals:
        Min      1Q  Median      3Q     Max
    -4.0396 -1.0698  0.0803  0.9823  5.4893

    Coefficients:
                          Estimate Std. Error t value Pr(>|t|)
    (Intercept)             0.0280     0.1868   0.150   0.8812
    rnorm(100, sd = 2.1)   -0.1523     0.0900  -1.692   0.0939 .
    ---
    Signif. codes:  0 '***' 0.001 '**' 0.01 '*' 0.05 '.' 0.1 ' ' 1

    Residual standard error: 1.868 on 98 degrees of freedom
    Multiple R-squared:  0.02838,   Adjusted R-squared:  0.01846
    F-statistic: 2.862 on 1 and 98 DF,  p-value: 0.09387
As you can see, the multiple R-squared is very low. It seems I have to reject that model, but what threshold should I use to decide whether the model is acceptable or not?
Thank you
Best Answer
You did a linear regression of two independent random variables: each `rnorm()` call generates a fresh, unrelated sample, so there is no relationship to model in the first place. You don't need to look at R-squared to reject that model.
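To see what is going on, it helps to generate the data once, store it, and then fit the model. The sketch below (my own variable names and seed) contrasts two independent random vectors with a case where a genuine linear relationship exists:

```r
set.seed(42)  # for reproducibility

# Two independent random vectors: any apparent association is pure chance,
# so R-squared will be close to 0 and the slope will usually be insignificant.
x <- rnorm(100, sd = 2.1)
y <- rnorm(100, sd = 2)
summary(lm(y ~ x))$r.squared

# The same predictor with a genuine linear signal added: R-squared is much higher.
y_signal <- 1.5 * x + rnorm(100, sd = 2)
summary(lm(y_signal ~ x))$r.squared
```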
In general, I would not reject or accept models based on R-squared or any other statistic, but on whether they make sense, add to knowledge, help answer questions, and so on.
However, "typical" values of R-squared vary from field to field: they are generally higher in the physical sciences and lower in the social and behavioral sciences.
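As an illustration of why a low R-squared is not, by itself, grounds for rejection, here is a sketch (the coefficients and sample size are made up) of a correctly specified model whose noise swamps the signal: R-squared is tiny, yet the slope is estimated close to its true value and is clearly significant.

```r
set.seed(1)

x <- rnorm(1000)
y <- 0.5 * x + rnorm(1000, sd = 3)   # true slope 0.5, but very noisy

fit <- lm(y ~ x)
summary(fit)$r.squared               # around 0.02-0.03: "very very low"
summary(fit)$coefficients["x", ]     # slope near 0.5 with a small p-value
```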