I remember having a contest with my Calculus teacher in high school. We both wrote programs on our TI-83 calculators to approximate the definite integral of a function. This was in our free time, and he won by using while statements; I used goto and label statements in a loop. I tried to argue that his calculator might have had better batteries, but I don't know how much that would really have affected the run time.

My approximation technique and his were exactly the same. It was very similar to the Riemann sum technique, but instead of rectangles we used trapezoids. I would assume this converges much faster on curved functions than the plain Riemann sum, since each trapezoid is the average of the two rectangles (left-endpoint and right-endpoint) fit to the boundaries of the curve.
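Written out in the same notation I use in the code below (b bins of width a over the interval from s to e), the trapezoid rule is the standard formula:

```latex
\int_{s}^{e} f(x)\,dx \;\approx\; \sum_{i=1}^{b} \frac{f(x_{i-1}) + f(x_{i})}{2}\,a,
\qquad x_i = s + i\,a, \quad a = \frac{e - s}{b}.
```

Averaging the left-endpoint and right-endpoint Riemann sums term by term gives exactly this expression, which is why the trapezoid rule hugs a curved integrand more closely for the same number of bins.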

Today, for fun, I recreated this approximation technique in R. Interestingly, given my experience in graduate school and the math knowledge I have picked up since, I didn't even use a loop in my new function; I used vectors in R. I wonder now, whose function would compute faster?... My code is below:

int.e <- function(b, s, e) {

  # b = number of bins, s = start of interval, e = end of interval

  a <- (e - s) / b       # width of each bin
  x <- s + a * (0:b)     # the b + 1 grid points from s to e

  # type the integrand below in terms of x
  f <- x^2

  # each trapezoid averages the function at adjacent grid points
  sum((f[1:b] + f[2:(b + 1)]) * a / 2)
}

int.e(225, 0, 2)

#8/3 is the actual integral result
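Out of curiosity about the speed question, here is a sketch of a loop-based version in the spirit of my teacher's while-statement calculator program. `int.loop` is a name I made up for this comparison, and the timing harness just uses base R's `system.time`; actual results will vary by machine.

```r
# Loop-based trapezoid rule: accumulate one trapezoid per pass,
# with the integrand x^2 hardcoded as in int.e above.
int.loop <- function(b, s, e) {
  a <- (e - s) / b
  total <- 0
  i <- 1
  while (i <= b) {
    x0 <- s + a * (i - 1)  # left edge of bin i
    x1 <- s + a * i        # right edge of bin i
    total <- total + (x0^2 + x1^2) * a / 2
    i <- i + 1
  }
  total
}

# Rough timing comparison of the loop vs. the vectorized version:
system.time(for (k in 1:1000) int.loop(225, 0, 2))
system.time(for (k in 1:1000) int.e(225, 0, 2))
```

On most machines the vectorized version should win, since R's arithmetic on whole vectors is handled in compiled code while an explicit while loop is interpreted step by step.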
