## Monday, July 11, 2011

### You can scrap it and write something better but let me keep R ;)

Ross Ihaka (via Xi'an -- thanks ;) ) gives a nice example of why R is basically impossible to optimize:

```
> f = function() {
+   if (runif(1) > 0.5) {
+     x = 10
+   }
+   x
+ }
```

x in the last expression could be a local or a global, and which one it is won't be known until runtime!

That's pretty good.
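For contrast, here's a sketch of my own (in Python, not from the post) of how a language that resolves scope statically handles the same shape of code: because x is assigned somewhere in the body, Python makes it local everywhere in that body, and the branch that skips the assignment just blows up instead of falling back to the global.

```python
x = 99  # a global x exists

def f(coin_flip):
    # Python decided at compile time that x is local here,
    # because x is assigned somewhere in this function body.
    if coin_flip:
        x = 10
    return x

print(f(True))  # 10

# The other branch cannot fall back to the global x:
try:
    f(False)
except UnboundLocalError:
    print("x was resolved statically as a local")
```

R makes the opposite trade: the lookup walks the environment chain at runtime, so the compiler can't pin x down in advance.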

I wonder if this will always return 10:

```
> f = function() {
+   a = 10
+   a
+ }
> f()
[1] 10
```

Maybe we could optimize it out? No:

```
> regular.equals <- `=`
> `=` = function(...) {
+   with(parent.frame(), a <- 11)
+ }
> f()
[1] 11
> `=` <- regular.equals
```

OK but assigning to = is really cheating. This is cheating too:

```
> regular.brace = `{`
> `{` = function(...) 11
> f()
[1] 11
> `{` = regular.brace
```

If = and { weren't available for assignment, is there any way to make f not return 10? Well, there's this, but replacing the body hardly counts as making f return something else:

```
> body(f) = quote(11)
> f()
[1] 11
```

I can't think of any other way to attack it, so if you block all of those, I guess we can declare this function safe ;)

```
> g = function(x, y) {
+   x + y
+ }
```

Could we safely say that while g() is being called, f() will never be called at the same time (i.e. so that we could overlay their stack frames)?

No, not even close. There are a million ways to break that:

```
> g(f(), f())
[1] 22
```

(Lazy evaluation -- f() is called while executing g())
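If it helps, here's the same trap modelled in Python (an analogy of mine, not R's actual promise machinery): make the laziness explicit with thunks, and it's obvious that f() runs in the middle of g()'s activation, so the two stacks are live at the same time.

```python
def f():
    return 11

def g(x_thunk, y_thunk):
    # Nothing passed in has been evaluated yet; the "arguments"
    # only run when g forces them, right here, inside g's frame.
    return x_thunk() + y_thunk()

result = g(lambda: f(), lambda: f())
print(result)  # 22
```

R does this thunk-wrapping for every argument automatically, which is why no static analysis can assume g's callees are known before g runs.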

```
> normal.plus = `+`
> `+` = function(a, b) UseMethod("+")
> `+.numeric` = function(a, b) { f() * a * b }
> g(1, 2)
[1] 22
> `+` = normal.plus
```

(A generic + is not uncommon.)

Well, can we at least say that g() will always evaluate both its arguments? No:

```
> g(return(2), print("hi"))
[1] 2
```

This can make code that looks like an error perfectly valid:

```
> g(return(2), not.defined)
[1] 2
```

If you assume g() is strict you will knock out some perfectly legitimate R code :)

Likewise an explicit call to return() doesn't have to actually do anything:

```
> g = function(x) {
+   3
+ }
> g(return(2))
[1] 3
```

I also find these hacks delightful:

```
> delayedAssign('b', {b <- 7; 8})
> a <- b
> a
[1] 8
> b
[1] 7
```

```
> x = '@'
> up = function() {
+   x. <- x
+   x <<- intToUtf8(utf8ToInt(x) + 1L)
+   delayedAssign(x, up(), ass = .GlobalEnv)
+   x.
+ }
> up()
[1] "@"
> A
[1] "A"
> B
[1] "B"
> C
[1] "C"
> D
[1] "D"
```

```
> backw = function(...) {
+   args = as.list(substitute(list(...)))[-1L]
+   env = parent.frame()
+
+   for (arg in rev(args)) {
+     res <- eval(arg, env)
+   }
+   res
+ }
> environment(backw) = baseenv()
> `{` = backw
```

```
> fib = function(n) {
+   a
+   for (i in 1:n) {
+     a = b - a
+     b = a + b
+   }
+   b = 1
+   a = 1
+ }
> fib(8)
[1] 34
```

```
> lex.scope = function(...) {
+   args = as.list(substitute(list(...)))[-1L]
+   parent = parent.frame()
+   env = new.env(parent = parent)
+
+   for (arg in args) {
+     res <- eval(arg, env)
+   }
+   res
+ }
> environment(lex.scope) = baseenv()
> `{` = lex.scope
```

```
> {
+   x = 2
+   {
+     x = 3
+   }
+   x
+ }
[1] 2
```

(Hey, that one could actually be useful).

```
> evanescent = function(x, v) {
+   force(v)
+   env = parent.frame()
+   name = as.character(substitute(x))
+
+   delayedAssign(name, {
+     env[[name]] = alist(x=)$x
+     v
+   }, ass = env)
+ }
> environment(evanescent) = baseenv()
> `=` = evanescent
> a = 2
> a
[1] 2
> a
Error in eval(expr, envir, enclos) :
  argument "a" is missing, with no default
```

My thoughts on all this: R is a wonderful, flexible language, but it is simply not the right language for heavy numerical computation: it is too flexible to be fast. And unfortunately, whenever you're trying an algorithm that hasn't been done before, it will by definition not already be written in C, and you find yourself having to choose: at what point does it become too slow to keep in R, so that the best thing to do is to give up and rewrite it in C?

The thing is, as an interactive environment, R's flexibility really buys you something (think about how you'd implement with() in Python). I'm personally OK with the current arrangement of using R for high-level code and dropping down to C for tight loops; it beats trying to make one language good at two different things.
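To make that concrete, here's roughly what a with()-alike costs you in Python (my sketch; the name with_ and the pass-the-expression-as-a-string interface are inventions for illustration). R's with() receives the unevaluated expression and evaluates it inside the data for free; Python has no such facility, so the expression has to be smuggled in as a string for eval(), losing syntax checking and editor support along the way.

```python
import inspect

def with_(data, expr):
    # Grab the caller's frame so its variables stay visible,
    # then let the data's names shadow them.
    caller = inspect.stack()[1].frame
    scope = {**caller.f_globals, **caller.f_locals, **data}
    return eval(expr, scope)

d = {"x": [1, 2, 3], "y": [10, 20, 30]}
print(with_(d, "[a + b for a, b in zip(x, y)]"))  # [11, 22, 33]
```

In R the equivalent is just `with(d, x + y)`, no strings involved, because the language hands every function its arguments as unevaluated expressions.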
