Where results make sense
About us   |   Why use us?   |   Reviews   |   PR   |   Contact us  

Topic: Regression

Related Topics

  Regression - Wikipedia, the free encyclopedia
Generally, regression is a move backwards; it is the opposite of progress.
In psychology, regression is believed to be a defense mechanism of the ego, where a person reacts to failure with an immature response.
In philosophy and theology, infinite regression is a problem discussed in the cosmological argument.
en.wikipedia.org /wiki/Regression   (271 words)

 Regression analysis - Wikipedia, the free encyclopedia
In statistics, regression analysis is used to model relationships between random variables, to determine the magnitude of those relationships, and to make predictions based on the models.
Simple linear regression and multiple linear regression are related statistical methods for modeling the relationship between two or more random variables using a linear equation.
Linear regression assumes the best estimate of the response is a linear function of some parameters (though not necessarily linear on the predictors).
en.wikipedia.org /wiki/Regression_analysis   (951 words)
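The linear modeling described above can be sketched in a few lines. This is a minimal illustration with invented data, not code from the cited article; the closed-form least-squares estimates use the sample covariance and variance.

```python
import numpy as np

# Hypothetical data for illustration: one predictor x, one response y.
x = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
y = np.array([2.1, 3.9, 6.2, 8.0, 9.9])

# Least-squares estimates for the linear equation y ≈ a + b*x.
b = np.cov(x, y, ddof=1)[0, 1] / np.var(x, ddof=1)  # slope
a = y.mean() - b * x.mean()                          # intercept

prediction = a + b * 6.0  # using the fitted model to predict a new case
```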

 [No title]
Thus the rms of the vertical residuals is a measure of the typical vertical distance from the data to the regression line, that is, the typical error in estimating the value of Y by the height of the regression line.
Recall that the regression line is a smoothed version of the graph of averages: The height of the regression line at the point x is an estimate of the average of the values of Y for individuals whose value of X is close to x.
If the scatterplot is football shaped, the regression line follows the graph of averages reasonably well: In each vertical slice, the deviations of the values of Y from their mean are approximately the vertical residuals of those values of Y from the regression line.
www.stat.berkeley.edu /users/stark/SticiGui/Text/ch6.htm   (3506 words)
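The rms of the vertical residuals described above is straightforward to compute; a small sketch with made-up data (not taken from the SticiGui text):

```python
import numpy as np

# Made-up data for illustration.
x = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
y = np.array([2.0, 2.0, 4.0, 4.0, 8.0])

slope, intercept = np.polyfit(x, y, 1)        # least-squares regression line
residuals = y - (slope * x + intercept)       # vertical residuals
rms_error = np.sqrt(np.mean(residuals ** 2))  # typical error in estimating Y
```

A useful check: the residuals of a least-squares line with an intercept always average to zero, so the rms is the natural measure of their typical size.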

 [No title]
The regression line approximates the relationship between X and Y. The slope and intercept of the regression line can be found from the five numbers.
The regression line is the line that fits the data best, in a sense made precise in this chapter.
Regression is a common statistical tool, better suited to summarizing some scatterplots than to drawing inferences.
www.stat.berkeley.edu /users/stark/SticiGui/Text/ch5.1.htm   (873 words)
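Assuming the "five numbers" are the two means, the two standard deviations, and the correlation coefficient r (the usual summary in this text), the slope and intercept follow directly. The numeric values below are hypothetical:

```python
# Slope and intercept of the least-squares line from summary statistics alone.
def regression_line(mean_x, mean_y, sd_x, sd_y, r):
    slope = r * sd_y / sd_x
    intercept = mean_y - slope * mean_x
    return slope, intercept

# Hypothetical summary values (e.g., height in cm predicting weight in kg).
slope, intercept = regression_line(mean_x=170, mean_y=70, sd_x=10, sd_y=12, r=0.5)
```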

 Logistic Regression
Among other situations, linear least-squares regression is the appropriate choice when one wants the best estimate of the response from the predictor variables and they all have a joint multivariate normal distribution.
Logistic regression is used to determine whether other measurements are related to the presence of some characteristic--for example, whether certain blood measures are predictive of having a disease.
While the response variable in a logistic regression is a 0/1 variable, the logistic regression equation, which is a linear equation, does not predict the 0/1 variable itself.
www.tufts.edu /~gdallal/logistic.htm   (992 words)
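The point that the linear equation predicts log odds rather than the 0/1 response itself can be sketched as follows; the intercept and coefficient are invented for illustration:

```python
import math

def predict_probability(intercept, coefs, xs):
    """Map the linear predictor (the log odds) to a probability
    via the logistic function."""
    log_odds = intercept + sum(b * x for b, x in zip(coefs, xs))
    return 1.0 / (1.0 + math.exp(-log_odds))

# Hypothetical model: one blood measure with coefficient 0.04.
p = predict_probability(intercept=-2.0, coefs=[0.04], xs=[50.0])  # log odds 0 -> p = 0.5
```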

 Regression Analysis
Regression analysis is used to model the relationship between a response variable and one or more predictor variables.
Regression Model Selection - fits all possible regression models for multiple predictor variables and ranks the models by the adjusted R-squared or Mallows' Cp statistic.
If the number of predictors is not excessive, it is possible to fit regression models involving all combinations of 1 predictor, 2 predictors, 3 predictors, etc., and sort the models according to a goodness-of-fit statistic.
www.statgraphics.com /regression_analysis.htm   (993 words)
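The all-combinations procedure described above can be sketched with ordinary least squares and adjusted R-squared as the ranking statistic (Mallows' Cp would work similarly); the data below are synthetic:

```python
import itertools
import numpy as np

def adjusted_r2(X, y):
    """Adjusted R-squared of an OLS fit; an intercept column is added."""
    n, k = X.shape
    A = np.column_stack([np.ones(n), X])
    beta, *_ = np.linalg.lstsq(A, y, rcond=None)
    resid = y - A @ beta
    r2 = 1.0 - (resid @ resid) / ((y - y.mean()) ** 2).sum()
    return 1.0 - (1.0 - r2) * (n - 1) / (n - k - 1)

def all_subsets(X, y, names):
    """Fit every non-empty combination of predictors; best-ranked first."""
    results = []
    for size in range(1, len(names) + 1):
        for combo in itertools.combinations(range(len(names)), size):
            score = adjusted_r2(X[:, combo], y)
            results.append(([names[i] for i in combo], score))
    return sorted(results, key=lambda item: -item[1])

# Synthetic data in which only x0 actually matters.
rng = np.random.default_rng(0)
X = rng.normal(size=(30, 3))
y = 2.0 * X[:, 0] + rng.normal(scale=0.1, size=30)
ranked = all_subsets(X, y, ["x0", "x1", "x2"])
```

With 3 predictors this fits all 7 candidate models; the top-ranked model should include x0.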

 Logistic Regression
Backward stepwise regression appears to be the preferred method for exploratory analyses, in which the analysis begins with a full or saturated model and variables are eliminated in an iterative process.
Since logistic regression calculates the probability of success over the probability of failure, the results of the analysis are in the form of an odds ratio.
For example, logistic regression is often used in epidemiological studies where the result of the analysis is the probability of developing cancer after controlling for other associated risks.
online.sfsu.edu /~efc/classes/biol710/logistic/logisticreg.htm   (1020 words)
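The odds-ratio form of the results follows from exponentiating a coefficient; a minimal sketch (the coefficient value is invented):

```python
import math

# A logistic regression coefficient b is a log odds ratio, so exp(b) is the
# odds ratio for a one-unit increase in that predictor.
def odds_ratio(coef):
    return math.exp(coef)

ratio = odds_ratio(0.6931471805599453)  # a coefficient of ln(2) doubles the odds
```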

 PA 765: Multiple Regression
Multiple regression shares all the assumptions of correlation: linearity of relationships, the same level of relationship throughout the range of the independent variable ("homoscedasticity"), interval or near-interval data, absence of outliers, and data whose range is not truncated.
Cubic regression splines operate similarly to local polynomial regression, but a constraint is imposed that the regression line in a given bin must join the start of the regression line in the next bin, thereby avoiding discontinuities in the curve, albeit at the cost of slightly increased error.
Local regression fits a regression surface not for all the data points as in traditional regression, but for the data points in a "neighborhood." Researchers determine the "smoothing parameter," which is a specified percentage of the sample size, and neighborhoods are the points within the corresponding radius.
www2.chass.ncsu.edu /garson/pa765/regress.htm   (19061 words)
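A crude sketch of the "neighborhood" idea in local regression, assuming a nearest-neighbor definition of the neighborhood and a plain least-squares fit within it (real local regression such as loess also weights points by distance):

```python
import numpy as np

def local_regression(x, y, x0, span=0.5):
    """Fit a least-squares line to the points nearest x0: a crude sketch of
    local regression, with span the fraction of the sample in the neighborhood."""
    k = max(2, int(span * len(x)))
    idx = np.argsort(np.abs(x - x0))[:k]           # the "neighborhood" around x0
    slope, intercept = np.polyfit(x[idx], y[idx], 1)
    return slope * x0 + intercept

# Synthetic curved data: the local fit tracks the curve where a global line cannot.
x = np.linspace(0.0, 10.0, 50)
y = x ** 2
est = local_regression(x, y, 5.0, span=0.2)        # close to the true value 25
```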

 [No title]
In regression problems the purpose is to predict the value of a continuous output variable.
In regression problems, the purpose of the neural network is to learn a mapping from the input variables to a continuous output variable, or variables.
The aim in using a regression network is therefore to produce an estimate that has a lower prediction error standard deviation than the training data standard deviation.
www.statsoft.com /textbook/glosr.html   (4576 words)
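The comparison between prediction-error standard deviation and training-data standard deviation can be expressed as a ratio: below 1 means the network predicts better than simply using the mean response. A small sketch:

```python
import numpy as np

def sd_ratio(y_true, y_pred):
    """Prediction-error SD divided by the SD of the response in the data."""
    return np.std(y_true - y_pred, ddof=1) / np.std(y_true, ddof=1)

y_true = np.array([1.0, 2.0, 3.0, 4.0])
perfect = sd_ratio(y_true, y_true)                        # 0: no prediction error
baseline = sd_ratio(y_true, np.full(4, y_true.mean()))    # 1: no better than the mean
```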

 PA 765: Logistic Regression
Parameter estimates (b coefficients) are logits of explanatory variables used in the logistic regression equation to estimate the log odds that the dependent equals 1 (binomial logistic regression) or that the dependent equals its highest/last value (multinomial logistic regression [though the researcher may select any value as the reference value, overriding the default]).
The convention for binomial logistic regression is to code the dependent class of greatest interest as 1 and the other class as 0, and to code its expected correlates also as +1 to assure positive correlation.
Logistic regression does not require linear relationships between the independent factor or covariates and the dependent, as does OLS regression, but it does assume a linear relationship between the independents and the log odds (logit) of the dependent.
www2.chass.ncsu.edu /garson/pa765/logistic.htm#lltests   (12403 words)

The regression functions are used to determine the relationship between the dependent variable (target field) and one or more independent variables.
This is a linear regression equation predicting the number of insurance claims from prior knowledge of the values of the independent variables age, salary and car location.
This is a stepwise polynomial regression equation predicting the number of insurance claims from prior knowledge of the values of the independent variables salary and car location.
www.dmg.org /v2-0/Regression.html   (914 words)
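A sketch of how such a regression equation is evaluated for one case; the coefficients and category effects below are hypothetical, not the values from the DMG example:

```python
# Hypothetical coefficients (not the DMG example's actual values).
coefficients = {"intercept": 132.4, "age": 7.1, "salary": 0.01}
car_location_effect = {"carpark": 41.1, "street": 325.0}

def predict_claims(age, salary, car_location):
    """Evaluate the linear regression equation for one case: the categorical
    variable contributes the coefficient of its observed category."""
    return (coefficients["intercept"]
            + coefficients["age"] * age
            + coefficients["salary"] * salary
            + car_location_effect[car_location])

claims = predict_claims(age=20, salary=1000, car_location="street")
```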

 DSS - Interpreting Regression Output
When running your regression, you are trying to discover whether the coefficients on your independent variables are really different from 0 (so the independent variables are having a genuine effect on your dependent variable) or if alternatively any apparent differences from 0 are just due to random chance.
In simple or multiple linear regression, the size of the coefficient for each independent variable gives you the size of the effect that variable is having on your dependent variable, and the sign on the coefficient (positive or negative) gives you the direction of the effect.
In regression with a single independent variable, the coefficient tells you how much the dependent variable is expected to increase (if the coefficient is positive) or decrease (if the coefficient is negative) when that independent variable increases by one.
dss.princeton.edu /online_help/analysis/interpreting_regression.htm   (977 words)
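Testing whether a coefficient "is really different from 0" amounts to comparing the estimate with its standard error; a sketch for a single independent variable with made-up data:

```python
import numpy as np

# Made-up data with a clear positive relationship.
x = np.array([1.0, 2.0, 3.0, 4.0, 5.0, 6.0])
y = np.array([1.2, 1.9, 3.2, 3.8, 5.1, 6.0])

n = len(x)
slope, intercept = np.polyfit(x, y, 1)
resid = y - (slope * x + intercept)
s2 = (resid @ resid) / (n - 2)                        # residual variance
se_slope = np.sqrt(s2 / ((x - x.mean()) ** 2).sum())  # standard error of the slope
t_stat = slope / se_slope                             # compare to a t with n-2 df
```

A large |t_stat| (compared to the t distribution with n−2 degrees of freedom) is evidence that the coefficient differs from 0 by more than random chance would produce.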

 Frank Anscombe's Regression Examples
The intimate relationship between correlation and regression raises the question of whether it is possible for a regression analysis to be misleading in the same sense as the set of scatterplots all of which had a correlation coefficient of 0.70.
Figure 1 is the picture drawn by the mind's eye when a simple linear regression equation is reported.
However, the regression equation is determined entirely by the single observation at x=19.
www.tufts.edu /~gdallal/anscombe.htm   (211 words)
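That pathology is easy to reproduce: when every observation but one shares the same x, the least-squares line is determined entirely by the lone point with a different x. The values below are invented to mimic Anscombe's fourth dataset, not his actual numbers:

```python
import numpy as np

# Invented data mimicking Anscombe's fourth dataset: all x equal except one.
x = np.array([8.0, 8.0, 8.0, 8.0, 8.0, 8.0, 8.0, 19.0])
y = np.array([6.5, 5.7, 7.7, 8.8, 8.4, 7.0, 5.2, 12.5])

slope, intercept = np.polyfit(x, y, 1)
# The line passes exactly through (19, y[-1]) and (8, mean of the other y's),
# so the lone point at x=19 determines the fit entirely.
```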

 Regression Analysis
The goal of regression analysis is to obtain estimates of the unknown parameters Beta_1,..., Beta_K which indicate how a change in one of the independent variables affects the values taken by the dependent variable.
In economics, the dependent variable might be a family's consumption expenditure and the independent variables might be the family's income, number of children in the family, and other factors that would affect the family's consumption patterns.
Since the residuals from a regression will generally not be independently or identically distributed (even if the disturbances in the regression model are), it is advisable to weight the residuals by their standard deviations (this is what is meant by …)
elsa.berkeley.edu /sst/regression.html   (2389 words)
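One common way to weight residuals by their standard deviations is internal studentization: dividing each residual by sqrt(s² (1 − h_ii)), where the h_ii are leverages from the hat matrix. The source may intend a different weighting, so treat this as one concrete sketch:

```python
import numpy as np

def studentized_residuals(x, y):
    """Residuals weighted by their estimated standard deviations
    sqrt(s^2 * (1 - h_ii)), where h_ii are leverages from the hat matrix."""
    n = len(y)
    A = np.column_stack([np.ones(n), x])          # design matrix with intercept
    H = A @ np.linalg.inv(A.T @ A) @ A.T          # hat matrix
    resid = y - H @ y
    s2 = (resid @ resid) / (n - A.shape[1])       # residual variance estimate
    h = np.diag(H)
    return resid / np.sqrt(s2 * (1.0 - h))

# Synthetic data: a line plus fixed "noise".
x = np.arange(10.0)
y = 2.0 * x + np.array([0.1, -0.2, 0.3, -0.1, 0.0, 0.2, -0.3, 0.1, -0.1, 0.0])
r = studentized_residuals(x, y)
```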

 Study Skills - Reading Fluency
Regression is the tendency to re-read a sentence, phrase, or passage that has already been read.
Usually, regression is a result of a lack of concentration the first time through.
The best way to control regression is to notice when you do it and make a conscious effort to increase your concentration.
www.wwu.edu /depts/tutorialcenter/studyskills/readingspeed.htm   (285 words)

Linear regression uses one independent variable to explain and/or predict the outcome of Y, while multiple regression uses two or more independent variables to predict the outcome.
Regression takes a group of random variables, thought to predict Y, and tries to find a mathematical relationship between them.
Regression is often used to determine how much specific factors such as the price of a commodity, interest rates, particular industries or sectors influence the price movement of an asset.
www.investopedia.com /terms/r/regression.asp   (273 words)

 Regression Statistical Procedure
Options: If you are doing stepwise regression, you can change the criteria for entry and removal under this submenu.
This is used to determine factors that affect the presence or absence of a characteristic when the dependent variable has two levels (binary logistic) or more (multinomial logistic).
Options: If you are doing stepwise regression, this is where you can set your entry and removal criteria.
calcnet.mth.cmich.edu /org/spss/StaProcRegress.htm   (311 words)

This is a useful and easy-to-use exploratory method that relies on principles of bandwidth regression.
Enter a quadratic polynomial regression equation and ZumaStat calculates the lowest point on the predicted curve, the highest point on the predicted curve and the rate of change between any two points on the curve.
These include power analysis for the test of average correlations, power analysis for contrasts between groups of studies on average correlations, power analysis of WLS regression analyses of correlations and power analysis for the Q test of homogeneity for correlations for a fixed effects model.
www.zumastat.com /regression.htm   (1503 words)
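The lowest or highest point on a quadratic predicted curve has a closed form: the vertex at x = −b1/(2·b2), a minimum when b2 > 0 and a maximum when b2 < 0. The coefficients below are invented:

```python
# For a quadratic regression equation y = b0 + b1*x + b2*x^2, the extremum of
# the predicted curve is at x = -b1 / (2*b2).
def quadratic_extremum(b0, b1, b2):
    x_star = -b1 / (2.0 * b2)
    return x_star, b0 + b1 * x_star + b2 * x_star ** 2

# Hypothetical coefficients: b2 > 0, so this is the lowest point on the curve.
x_star, y_star = quadratic_extremum(b0=5.0, b1=-4.0, b2=1.0)
```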

 regressive fallacy
The regressive fallacy is the failure to take into account natural and inevitable fluctuations of things when ascribing causes to them (Gilovich 1993: 26).
This tendency to move toward the average away from extremes was called "regression" by Sir Francis Galton in a study of the average heights of sons of very tall and very short parents.
(The study was published in 1885 and was called "Regression Toward Mediocrity in Hereditary Stature.") He found that sons of very tall or very short parents tend to be tall or short, respectively, but not as tall or as short as their parents.
skepdic.com /regressive.html   (774 words)
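Galton's observation can be reproduced in a simulation: with standardized parent and child scores correlated at r, the expected child score given the parent is r times the parent's score, so children of extreme parents are less extreme on average. The correlation value here is arbitrary:

```python
import numpy as np

rng = np.random.default_rng(0)
r = 0.5                                   # arbitrary parent-child correlation
parent = rng.standard_normal(100_000)     # standardized parent heights
child = r * parent + np.sqrt(1 - r**2) * rng.standard_normal(100_000)

tall = parent > 2.0                       # very tall parents (z-score above 2)
mean_child = child[tall].mean()           # taller than average, but less extreme
```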

 Stat 5102 (Geyer, Spring 2003) Regression in R
We'll use the example for simple linear regression and the example for quadratic regression which were done above.
Regression coefficients in simple linear regression are intimately related to correlation coefficients.
The null hypothesis is some regression model, and the alternative hypothesis is some other regression model, and the little model is a submodel of the big model (the little model is obtained by setting some of the regression coefficients of the big model to zero).
www.stat.umn.edu /geyer/5102/examp/reg.html   (2148 words)
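The comparison of a little (null) model nested inside a big (alternative) model uses an F statistic built from their residual sums of squares; a sketch with invented values:

```python
# F = ((SSE_little - SSE_big) / extra parameters) / (SSE_big / residual df of big model)
def nested_f(sse_little, sse_big, df_extra, df_big_resid):
    return ((sse_little - sse_big) / df_extra) / (sse_big / df_big_resid)

# Invented sums of squared errors: the big model has 2 extra coefficients.
f_stat = nested_f(sse_little=120.0, sse_big=80.0, df_extra=2, df_big_resid=40)
```

A large F (relative to the F distribution with the corresponding degrees of freedom) favors the big model over the little one.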

 WINKS Statistics Software - Simple Linear Regression
A regression line is the line described by the equation and the regression equation is the formula for the line.
A low p-value for this test (less than 0.05) means that there is evidence to believe that the slope of the line is not 0, or that there is a statistically significant linear relationship between the two variables.
Warning: Using the regression equation to predict values of the dependent variable outside the range of the independent variable is not recommended since you have no evidence that the same linear relationship exists outside the observed range.
www.texasoft.com /winkslr.html   (573 words)
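The extrapolation warning can be demonstrated numerically: a line fit where the true relationship is curved can look adequate inside the observed range yet fail badly outside it. Synthetic data:

```python
import numpy as np

# Synthetic data: the true relationship is quadratic, observed only on [0, 3].
x = np.linspace(0.0, 3.0, 20)
y = x ** 2

slope, intercept = np.polyfit(x, y, 1)                       # linear fit
inside_error = abs((slope * 1.5 + intercept) - 1.5 ** 2)     # within the range
outside_error = abs((slope * 10.0 + intercept) - 10.0 ** 2)  # far outside it
```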

 Multiple Regression
The green crosses are the actual data, and the red squares are the "predicted values" or "y-hats", as estimated by the regression line.
In least-squares regression, the sum of the squared (vertical) distances between the data points and the corresponding predicted values is minimized.
One possible solution is to perform a regression with one independent variable, and then test whether a second independent variable is related to the residuals from this regression.
www.okstate.edu /artsci/botany/ordinate/MULTIPLE.htm   (1400 words)
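The residual-based test suggested above can be sketched with synthetic data: regress y on the first independent variable, then regress the residuals on the second; a clear slope in the second step reveals the second variable's effect. (This two-step slope matches the multiple-regression coefficient only when the predictors are uncorrelated, as they are here by construction.)

```python
import numpy as np

# Synthetic data: y depends on both x1 and x2, which are independent here.
rng = np.random.default_rng(1)
x1 = rng.standard_normal(200)
x2 = rng.standard_normal(200)
y = 2.0 * x1 + 3.0 * x2 + 0.1 * rng.standard_normal(200)

b1, a1 = np.polyfit(x1, y, 1)          # step 1: regress y on x1
resid = y - (b1 * x1 + a1)

b2, a2 = np.polyfit(x2, resid, 1)      # step 2: regress the residuals on x2
```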

 Dr. Sanity: Regression
With this primitive regression from adulthood and maturity, is it any surprise that denial, paranoia and projection run rampant--all three in the class of the most primitive and infantile emotional defenses--defenses that are usually abandoned for the most part by mature and responsible adults?
Is it any surprise that the conspiracy theories that are put forward by the denizens of the DU, ignorant left-leaning blogs, or even supposedly reputable adults, are of the same caliber as those of the most paranoid of schizophrenics, the most psychotic and delusional of the brain damaged?
This psychological regression to childhood on the part of many adults in our country--which is championed and even encouraged by the Left in the cult of victimhood that they worship--has reached a state where it now seriously impedes the adult functioning of this country.
drsanity.blogspot.com /2005/09/regression.html   (1206 words)

The program determines the regression parameters as well as the generalized correlation coefficient and the standard error of estimate.
The regression is performed using the Least-Squares method.
These programs determine the regression parameters as well as the generalized correlation coefficient and the standard error of estimate.
www.numericalmathematics.com /regression.htm   (178 words)

 [No title]
For example, we might consider height and weight of a sample of adults.
Linear regression attempts to explain this relationship with a straight line fit to the data.
Here the "residual" e is a random variable with mean zero.
www.math.csusb.edu /faculty/stanton/m262/regress/regress.html   (92 words)

 Correlation and regression
The applet draws the regression line and the correlation coefficient.
When more than two points are given, the correlation coefficient and the regression line are shown.
Every time a point is added, the correlation coefficient and the regression line are updated.
www.stattucino.com /berrie/dsl/regression/regression.html   (84 words)

 Regression Applet (9-Sep-1996)
The applet below is designed to teach students the effect of leverage points on a regression line.
By adding points far from the existing line, the regression line changes considerably.
This should help students understand the effect of outliers on regression analysis.
www.stat.sc.edu /~west/javahtml/Regression.html   (91 words)
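What the applet demonstrates can be reproduced numerically: one point far from the existing pattern can swing the fitted line dramatically. Invented data:

```python
import numpy as np

# Five well-behaved points lying exactly on the line y = x.
x = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
y = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
slope_before, _ = np.polyfit(x, y, 1)

# Add one leverage point far from the existing pattern.
x2 = np.append(x, 15.0)
y2 = np.append(y, 0.0)
slope_after, _ = np.polyfit(x2, y2, 1)   # the slope even changes sign
```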

 Stata Web Books: Regression with Stata
It is assumed that you have had at least a one quarter/semester course in regression (linear models) or a general statistical methods course that covers simple and multiple regression and have access to a regression textbook that explains the theoretical background of the materials covered in these chapters.
These materials also assume you are familiar with using Stata, for example that you have taken the ATS Stata 1 and Stata 2 classes or have equivalent knowledge of Stata.
There may be a number of regression concepts introduced in the chapters that are new to you.
www.ats.ucla.edu /stat/stata/webbooks/reg   (741 words)


Copyright © 2005-2007 www.factbites.com Usage implies agreement with terms.