Dirk Eddelbuettel has studied the performance of R 2.13 together with Luke Tierney's byte-code compiler when evaluating the expression 1/(1+x) many times. This is, of course, an artificial example, but it is good for comparison. He reports that R's byte-code compiler provides speed-up factors of 3 to 5.

Let us compare the performance of interpreted R and P.

Here is the shortened R code:

f <- function(n, x=1) for (i in 1:n) x=1/(1+x)
h <- function(n, x=1) for (i in 1:n) x=(1+x)^(-1)

## now load some tools
library(rbenchmark)

## now run the benchmark
N <- 1e6
benchmark(f(N,1), h(N,1),
          columns=c("test", "replications", "elapsed", "relative"),
          order="relative", replications=10)

I ran that code on my laptop (dual core, 2.4 GHz):

     test replications elapsed relative
1 f(N, 1)           10  12.558 1.000000
2 h(N, 1)           10  18.150 1.445294

Summed over the 10 repetitions, R needs 12.558 seconds for f, which evaluates 1/(1+x) one million times per call, and 18.150 seconds for h, which evaluates (1+x)^(-1).
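As a back-of-the-envelope sanity check on these figures (an illustrative calculation using the numbers reported above, not part of the benchmark itself), the per-iteration cost of the interpreted R loop can be derived from the totals:

```python
# Per-iteration cost of the interpreted R loop, derived from the
# benchmark totals: 10 replications of 1e6 loop iterations each.
total_f = 12.558  # seconds, f: x = 1/(1+x)
total_h = 18.150  # seconds, h: x = (1+x)^(-1)
iterations = 10 * 1_000_000

print(f"f: {total_f / iterations * 1e6:.2f} microseconds per iteration")
print(f"h: {total_h / iterations * 1e6:.2f} microseconds per iteration")
```

So each trip through the interpreted loop body costs on the order of one to two microseconds.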

Let us look at how P performs. Here is the equivalent P code:

f <- function(n, x=1) { var i; for (i in 1:n) x=1/(1+x); return(0) }
g <- function(n, x=1) { var i; for (i in 1:n) x=(1+x)^(-1); return(0) }

N = 1e6
repetitions = 10
i = 0

tm = time # gives time in seconds
for ( i in (1:repetitions) ) call f( N, 1 )
print( time-tm )

tm = time
for ( i in (1:repetitions) ) call g( N, 1 )
print( time-tm )

P needs only 1.29 seconds for function f and 1.59 seconds for g! These are speed-up factors of 9.73 and 11.4, respectively. The Professional Edition with code optimization needs only 0.89 and 1.05 seconds, corresponding to speed-up factors of 14.1 and 17.2, respectively.
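The quoted speed-up factors follow directly from the elapsed times; the short script below merely reproduces that arithmetic for clarity (it is not part of either benchmark):

```python
# Speed-up factor = interpreted R elapsed time / P elapsed time.
# R's h corresponds to g in the P code.
r_times = {"f": 12.558, "g": 18.150}  # interpreted R, 10 repetitions
p_times = {"f": 1.29,  "g": 1.59}     # P
p_opt   = {"f": 0.89,  "g": 1.05}     # P Professional Edition, optimized

for name in r_times:
    print(f"{name}: {r_times[name] / p_times[name]:.2f}x, "
          f"optimized {r_times[name] / p_opt[name]:.1f}x")
```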