
Numerical method (curve fitting)

Transcript
Page 1: Numerical method (curve fitting)

Submitted To:
Name: Sujit Kumar Saha
Lecturer at Varendra University, Rajshahi

Submitted By:
Name: Istiaque Ahmed Shuvo
ID: 141311057
5th batch, 7th Semester, Sec-B
Dept. of CSE, Varendra University, Rajshahi

11-Apr-16

Page 2: Numerical method (curve fitting)

Curve Fitting

Page 3: Numerical method (curve fitting)

Topics:

• Linear Regression
• Multiple Linear Regression
• Polynomial Regression
• Newton's Interpolation Polynomial, with an example

Page 4: Numerical method (curve fitting)

Linear Regression

Fitting a straight line to a set of paired observations: (x1, y1), (x2, y2), …, (xn, yn).

y = a0 + a1 x + e

a1 - slope
a0 - intercept
e - error, or residual, between the model and the observations

Page 5: Numerical method (curve fitting)
Page 6: Numerical method (curve fitting)

Linear Regression: Determination of a0 and a1

Minimize the sum of the squared residuals:

Sr = Σ (yi - a0 - a1 xi)^2

Setting the partial derivatives with respect to a0 and a1 to zero:

∂Sr/∂a0 = -2 Σ (yi - a0 - a1 xi) = 0
∂Sr/∂a1 = -2 Σ [(yi - a0 - a1 xi) xi] = 0

which gives the normal equations:

n a0 + (Σ xi) a1 = Σ yi
(Σ xi) a0 + (Σ xi^2) a1 = Σ xi yi

2 equations with 2 unknowns, can be solved simultaneously.

Page 7: Numerical method (curve fitting)

Linear Regression: Determination of a0 and a1

a1 = (n Σ xi yi - Σ xi Σ yi) / (n Σ xi^2 - (Σ xi)^2)

a0 = ȳ - a1 x̄

where x̄ and ȳ are the means of the x and y values.
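These closed-form expressions translate directly into code. The following is a minimal sketch in Python (not part of the original slides); the function name fit_line and the sample data are illustrative assumptions.

```python
# Sketch: straight-line least-squares fit using the closed-form
# expressions for a1 and a0 given above (illustrative, not from the slides).

def fit_line(x, y):
    """Return (a0, a1) for the model y = a0 + a1*x."""
    n = len(x)
    sx = sum(x)
    sy = sum(y)
    sxy = sum(xi * yi for xi, yi in zip(x, y))
    sxx = sum(xi * xi for xi in x)

    a1 = (n * sxy - sx * sy) / (n * sxx - sx * sx)  # slope
    a0 = sy / n - a1 * (sx / n)                     # intercept: y-mean minus a1 * x-mean
    return a0, a1

# Illustrative data (not the slides' example):
x = [1, 2, 3, 4, 5]
y = [2.1, 3.9, 6.2, 8.0, 9.8]
a0, a1 = fit_line(x, y)
print(f"y = {a0:.3f} + {a1:.3f} x")
```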

Page 8: Numerical method (curve fitting)

Multiple Linear Regression

• Another useful extension of linear regression is the case where y is a linear function of two or more independent variables:

  y = a0 + a1 x1 + a2 x2 + … + am xm + e

• Again, the best fit is obtained by minimizing the sum of the squares of the estimate residuals:

  Sr = Σ (yi - a0 - a1 x1,i - a2 x2,i - … - am xm,i)^2
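In practice such a fit is usually set up as a linear least-squares problem. Below is a minimal sketch in Python with two independent variables (not from the original slides; the data and the use of NumPy's lstsq are illustrative assumptions).

```python
import numpy as np

# Sketch: multiple linear regression y = a0 + a1*x1 + a2*x2 + e,
# solved by minimizing the sum of squared residuals (illustrative data).
x1 = np.array([0.0, 1.0, 2.0, 3.0, 4.0])
x2 = np.array([1.0, 0.5, 2.0, 1.5, 3.0])
y  = np.array([2.0, 3.5, 7.0, 7.5, 12.0])

# Design matrix with a column of ones for the intercept a0.
A = np.column_stack([np.ones_like(x1), x1, x2])
coeffs, residuals, rank, sv = np.linalg.lstsq(A, y, rcond=None)
a0, a1, a2 = coeffs
print(f"y = {a0:.3f} + {a1:.3f} x1 + {a2:.3f} x2")
```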

Page 9: Numerical method (curve fitting)

Polynomial Regression

• The least-squares procedure from Chapter 13 can be readily extended to fit data to a higher-order polynomial. Again, the idea is to minimize the sum of the squares of the estimate residuals.

• The figure shows the same data fit with:
  a) a first-order polynomial
  b) a second-order polynomial
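A quick sketch of such a fit in Python, using NumPy's polyfit for the first- and second-order cases (the data points are illustrative, not the figure's data):

```python
import numpy as np

# Sketch: fit the same data with a first-order and a second-order
# polynomial by least squares (illustrative data).
x = np.array([0.0, 1.0, 2.0, 3.0, 4.0, 5.0])
y = np.array([2.1, 7.7, 13.6, 27.2, 40.9, 61.1])

p1 = np.polyfit(x, y, 1)  # first-order (straight line) coefficients
p2 = np.polyfit(x, y, 2)  # second-order (parabola) coefficients

print("1st order:", np.poly1d(p1))
print("2nd order:", np.poly1d(p2))
```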

Page 10: Numerical method (curve fitting)

What is interpolation?

Many times, data is given only at discrete points such as (x0, y0), (x1, y1), …, (xn−1, yn−1), (xn, yn). So, how then does one find the value of y at any other value of x? Well, a continuous function f(x) may be used to represent the n+1 data values, with f(x) passing through the n+1 points (Figure 1). Then one can find the value of y at any other value of x. This is called interpolation.

Of course, if x falls outside the range of x for which the data is given, it is no longer interpolation but is instead called extrapolation.

So what kind of function f(x) should one choose? A polynomial is a common choice for an interpolating function because polynomials are easy to

(A) evaluate,
(B) differentiate, and
(C) integrate,

relative to other choices such as trigonometric and exponential series.

Polynomial interpolation involves finding a polynomial of order n that passes through the n+1 points. One of the methods of interpolation is called Newton's divided difference polynomial method. Other methods include the direct method and the Lagrangian interpolation method. We will discuss Newton's divided difference polynomial method in this presentation.

Page 11: Numerical method (curve fitting)

Newton's Divided-Difference Interpolating Polynomials
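As a sketch of how Newton's divided-difference interpolating polynomial can be computed in Python (not taken from the slides; the function names divided_differences and newton_eval are illustrative assumptions), the first function builds the divided-difference table to obtain the Newton coefficients b0, b1, …, bn, and the second evaluates the Newton-form polynomial by nested multiplication:

```python
# Sketch of Newton's divided-difference interpolating polynomial
# (illustrative implementation; names are assumptions, not from the slides).

def divided_differences(x, y):
    """Return the Newton coefficients b0, b1, ..., bn from the data points."""
    n = len(x)
    coef = list(y)  # zeroth-order divided differences f[x_i]
    for j in range(1, n):
        # After this pass, coef[i] holds f[x_{i-j}, ..., x_i].
        for i in range(n - 1, j - 1, -1):
            coef[i] = (coef[i] - coef[i - 1]) / (x[i] - x[i - j])
    return coef  # coef[i] = f[x_0, ..., x_i]

def newton_eval(coef, x_data, xq):
    """Evaluate the Newton-form polynomial at xq by nested multiplication."""
    result = coef[-1]
    for k in range(len(coef) - 2, -1, -1):
        result = result * (xq - x_data[k]) + coef[k]
    return result
```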

Page 12: Numerical method (curve fitting)

Example of Newton's Interpolation Polynomial
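As an illustration of how the functions sketched above might be used (the data points, a small ln(x) table, and the query point are assumptions, not necessarily the slide's example):

```python
import math

# Illustrative use of the divided-difference sketch above:
# estimate ln(2) from three tabulated points of ln(x).
x_data = [1.0, 4.0, 6.0]
y_data = [math.log(v) for v in x_data]

b = divided_differences(x_data, y_data)
estimate = newton_eval(b, x_data, 2.0)
print(f"ln(2) estimate: {estimate:.4f}   (true value: {math.log(2.0):.4f})")
```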

Page 13: Numerical method (curve fitting)


