David J. Olive - Linear Regression



This text covers both multiple linear regression and some experimental design models. The text uses the response plot to visualize the model and to detect outliers, does not assume that the error distribution has a known parametric distribution, develops prediction intervals that work when the error distribution is unknown, suggests bootstrap hypothesis tests that may be useful for inference after variable selection, and develops prediction regions and large-sample theory for the multivariate linear regression model that has m response variables. A relationship between multivariate prediction regions and confidence regions provides a simple way to bootstrap confidence regions. These confidence regions often provide a practical method for testing hypotheses. There is also a chapter on generalized linear models and generalized additive models. There are many R functions to produce response and residual plots, to simulate prediction intervals and hypothesis tests, to detect outliers, and to choose response transformations for multiple linear regression or experimental design models. This text is for graduates and undergraduates with a strong mathematical background. The prerequisites for this text are linear algebra and a calculus-based course in statistics.


Springer International Publishing AG 2017
David J. Olive, Linear Regression, DOI 10.1007/978-3-319-55252-1_1
1. Introduction
David J. Olive, Department of Mathematics, Southern Illinois University, Carbondale, IL, USA
This chapter provides a preview of the book, but it is presented in a rather abstract setting and will be easier to follow after reading the rest of the book. The reader may omit this chapter on a first reading and refer back to it as necessary. Later chapters illustrate some of these extensions for the generalized linear model (GLM) and the generalized additive model (GAM).
Response variables are the variables of interest, and they are predicted with a p × 1 vector of predictor variables x = (x_1, ..., x_p)^T, where x^T is the transpose of x. A multivariate regression model has m > 1 response variables. For example, predict Y_1 = systolic blood pressure and Y_2 = diastolic blood pressure using a constant x_1, x_2 = age, x_3 = weight, and x_4 = dosage amount of blood pressure medicine. The multivariate location and dispersion model is covered in a later chapter.
A univariate regression model has one response variable Y. Suppose Y is independent of the predictor variables x given a function h(x), written Y ⟂ x | h(x), where h: R^p → R^d and the integer d is as small as possible. Then Y follows a dD regression model, where d ≤ p since Y ⟂ x | h(x) always holds with h(x) = x. If Y ⟂ x, then Y follows a 0D regression model. Hence there are 0D, 1D, ..., pD regression models, and every univariate regression model is a dD regression model for some integer 0 ≤ d ≤ p. Cook (p. 414) uses similar notation.
The remainder of this chapter considers 1D regression models, where h: R^p → R is a real-valued function. The additive error regression model Y = m(x) + e is an important special case with h(x) = m(x); see a later section. An important special case of the additive error model is the linear regression model Y = x^T β + e. Multiple linear regression and many experimental design models are special cases of the linear regression model.
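As a concrete sketch of this special case (the data and coefficients below are synthetic, not an example from the text), the linear regression model Y = x^T β + e with a constant predictor can be fit by ordinary least squares:

```python
import numpy as np

rng = np.random.default_rng(0)
n, p = 100, 3
# Design matrix with a constant column x_1 = 1 and two random predictors
X = np.column_stack([np.ones(n), rng.normal(size=(n, p - 1))])
beta = np.array([1.0, 2.0, -0.5])             # hypothetical true coefficients
Y = X @ beta + rng.normal(scale=0.3, size=n)  # additive error model Y = x^T beta + e

# Least squares estimate of beta
beta_hat, *_ = np.linalg.lstsq(X, Y, rcond=None)
```

With n = 100 observations and small error scale, `beta_hat` lands close to the true β, illustrating why least squares is the default fitting method for this model.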
The multiple linear regression model has at least one predictor x_i that takes on many values. Later chapters consider response plots, plots for response transformations, and prediction intervals for the multiple linear regression model fit by least squares. All of these techniques can be extended to alternative fitting methods.
1.1 Some Regression Models
All models are wrong, but some are useful.
Box ()
In data analysis, an investigator is presented with a problem and data from some population. The population might be the collection of all possible outcomes from an experiment, while the problem might be predicting a future value of the response variable Y or summarizing the relationship between Y and the p × 1 vector of predictor variables x. A statistical model is used to provide a useful approximation to some of the important underlying characteristics of the population which generated the data. Many of the most used models for 1D regression, defined below, are families of conditional distributions Y | x = x_o indexed by x = x_o. A 1D regression model is a parametric model if the conditional distribution is completely specified except for a fixed finite number of parameters; otherwise, the 1D model is a semiparametric model. GLMs and GAMs, defined below, are covered in a later chapter.
Definition 1.1.
Regression investigates how the response variable Y changes with the value of a p × 1 vector x of predictors. Often this conditional distribution Y | x is described by a 1D regression model, where Y is conditionally independent of x given the sufficient predictor SP = h(x), written
Y ⟂ x | SP
(1.1)
where the real-valued function h: R^p → R. The estimated sufficient predictor is ESP = ĥ(x). An important special case is a model with a linear predictor h(x) = α + β^T x, where ESP = α̂ + β̂^T x. This class of models includes the generalized linear model (GLM). Another important special case is a generalized additive model (GAM), where Y is independent of x = (x_1, ..., x_p)^T given the additive predictor AP = α + Σ_{j=1}^p S_j(x_j) for some (usually unknown) functions S_j. The estimated additive predictor is EAP = ESP = α̂ + Σ_{j=1}^p Ŝ_j(x_j).
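As a toy illustration of the additive predictor (the intercept α and the component functions S_j below are invented for this example; in a fitted GAM the Ŝ_j would be estimated smooths), the AP is just α plus a sum of one function of each coordinate of x:

```python
import numpy as np

# Hypothetical additive components S_j for a GAM with p = 2 predictors
alpha = 0.5
S = [np.sin, lambda x: x ** 2]

def additive_predictor(x):
    """AP = alpha + sum_{j=1}^p S_j(x_j) for x = (x_1, ..., x_p)^T."""
    return alpha + sum(S_j(x_j) for S_j, x_j in zip(S, x))

ap = additive_predictor(np.array([np.pi / 2, 2.0]))  # 0.5 + sin(pi/2) + 2^2 = 5.5
```

The linear predictor α + β^T x is the special case S_j(x_j) = β_j x_j, which is why GLMs sit inside the 1D regression class described above.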
Notation: In this text, a plot of x versus Y will have x on the horizontal axis, and Y on the vertical axis.
Plots are extremely important for regression. When p = 1, x is both a sufficient predictor and an estimated sufficient predictor. So a plot of x versus Y is both a sufficient summary plot and a response plot. Usually the SP is unknown, so only the response plot can be made. The response plot will be extremely useful for checking the goodness of fit of the 1D regression model.
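A minimal response-plot sketch (synthetic data; matplotlib is assumed to be available, and the model is fit by least squares): following the notation above, the ESP goes on the horizontal axis and Y on the vertical axis, and for a good fit the points scatter about the identity line:

```python
import numpy as np
import matplotlib
matplotlib.use("Agg")  # non-interactive backend so no display is needed
import matplotlib.pyplot as plt

rng = np.random.default_rng(1)
n = 80
X = np.column_stack([np.ones(n), rng.normal(size=n)])   # constant + one predictor
Y = X @ np.array([1.0, 2.0]) + rng.normal(scale=0.5, size=n)

beta_hat, *_ = np.linalg.lstsq(X, Y, rcond=None)
esp = X @ beta_hat            # estimated sufficient predictor

fig, ax = plt.subplots()
ax.scatter(esp, Y)            # ESP on the horizontal axis, Y on the vertical axis
lims = [Y.min(), Y.max()]
ax.plot(lims, lims)           # identity line: points should cluster around it
ax.set_xlabel("ESP")
ax.set_ylabel("Y")
fig.savefig("response_plot.png")
```

Outliers and lack of fit show up as points far from the identity line, which is the visual check the response plot is meant to provide.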
Definition 1.2.
A sufficient summary plot is a plot of the SP versus Y . An estimated sufficient summary plot (ESSP) or response plot is a plot of the ESP versus Y .
Notation.
Often the index i will be suppressed. For example, the linear regression model
Y_i = x_i^T β + e_i
(1.2)
for i = 1, ..., n, where β is a p × 1 unknown vector of parameters and e_i is a random error. This model could be written Y = x^T β + e. More accurately, Y | x = x^T β + e, but the conditioning on x will often be suppressed. Often the errors e_1, ..., e_n are iid (independent and identically distributed) from a distribution that is known except for a scale parameter. For example, the e_i's might be iid from a normal (Gaussian) distribution with mean 0 and unknown standard deviation σ. For this Gaussian model, estimation of β and σ is important for inference and for predicting a new value of the response variable Y_f given a new vector of predictors x_f.
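A hedged sketch of this Gaussian model (synthetic data; the values of β, σ, and x_f are invented for the illustration): β is estimated by least squares, σ by the residual standard error with n − p degrees of freedom, and a new response Y_f is predicted by x_f^T β̂:

```python
import numpy as np

rng = np.random.default_rng(2)
n, p = 200, 3
X = np.column_stack([np.ones(n), rng.normal(size=(n, p - 1))])
beta = np.array([1.0, -2.0, 0.5])            # hypothetical true parameters
sigma = 0.4
Y = X @ beta + rng.normal(scale=sigma, size=n)   # iid N(0, sigma^2) errors

beta_hat, *_ = np.linalg.lstsq(X, Y, rcond=None)
resid = Y - X @ beta_hat
sigma_hat = np.sqrt(resid @ resid / (n - p))     # residual standard error

x_f = np.array([1.0, 0.3, -1.2])                 # new predictor vector
y_f_hat = x_f @ beta_hat                         # point prediction of Y_f
# Rough 95% prediction interval, ignoring the variance of beta_hat
pi_low, pi_high = y_f_hat - 1.96 * sigma_hat, y_f_hat + 1.96 * sigma_hat
```

The crude normal-theory interval here is exactly the kind of interval the text improves upon with prediction intervals that do not assume a known error distribution.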
The class of 1D regression models is very rich, and many of the most used statistical models, including GLMs and GAMs, are 1D regression models. Nonlinear regression, nonparametric regression, and linear regression are special cases of the additive error regression model Y = m(x) + e.