Programming and the feedback loop

In a previous post I spoke about the importance of feedback loops in creative processes, including programming. Here’s a talk and an essay where Bret Victor explores this issue in more depth.

Victor not only identifies the importance of the feedback loop in determining search speed and enabling discovery, but also notes that the intelligibility of the trial-result relationship is just as critical. Understanding this relationship allows the creator to focus on the productive areas of the search space, ignoring bad solutions and thus avoiding unnecessary trials. This aspect, the learnability of the trial-result relationship in programming, is the focus of his essay Learnable Programming.

Handshakes and the young Gauss

If a group of n people meet, and everyone shakes hands with everyone else, how many handshakes occur?

I posed this problem to a friend, to see if she would give me the correct answer, which is n(n – 1) / 2. It’s easy to see why this formula is correct: each of the n persons must shake hands with n – 1 other persons, giving the expression n(n – 1). But this double counts handshakes, counting each one once for each participant, so to fix the double counting we simply divide by 2, which yields n(n – 1) / 2.
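As a sanity check, here is a minimal Scala sketch (the function names are mine, purely for illustration) that counts handshakes by brute force, enumerating every unordered pair of people, and compares the count against the closed formula:

// Brute force: enumerate every unordered pair of people and count them.
def handshakesBruteForce(n: Int): Int =
  (for {
    a <- 0 until n
    b <- (a + 1) until n   // b > a, so each handshake is counted exactly once
  } yield (a, b)).size

// The closed formula n(n - 1) / 2.
def handshakesFormula(n: Int): Int = n * (n - 1) / 2

// Both agree for small n, e.g. in the Scala REPL:
for (n <- 1 to 10) assert(handshakesBruteForce(n) == handshakesFormula(n))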

My friend went another route, solving the problem sequentially instead: the first person must shake hands with n – 1 people. The second person has already shaken hands with the first, so only n – 2 new handshakes are left for them. Repeating this process for each person gives a sum formula like so:

(n – 1) + (n – 2) + (n – 3) + … + 1

You can easily verify that both formulas give the same result by testing several values of n (the sketch further below does exactly this). Thus the sum formula should be reducible to n(n – 1) / 2. The trick to make this reduction is to group the terms of the sum formula into pairs that add to the same value, pairing from the “outer extremes” and moving inward. For example, with

(n – 1) + (n – 2) + (n – 3) + … + 3 + 2 + 1

we can make the following pairs

(n – 1) + 1,  (n – 2) + 2,  (n – 3) + 3, …

and note that each of these pairs adds to n. Using a concrete example with an odd n, say n = 5:

4 + 3 + 2 + 1 = (4 + 1) + (3 + 2)

When n is odd there are (n – 1) / 2 such pairs, each summing to n, which gives

(n – 1) / 2 * n = n(n – 1) / 2

If n is even, as for example with n = 6, there is an odd number of terms, so we can pad the sum with a zero to complete the pairing:

5 + 4 + 3 + 2 + 1 = (5 + 0) + (4 + 1) + (3 + 2)

Here there are n/2 pairs that each sum to n – 1, which gives

(n – 1) * n / 2 = n(n – 1) / 2

So the sum formula reduces to the original answer. Astute readers will have noticed that the sum formula is just the additive analog of the factorial operation. Values obtained from this sum formula are called triangle numbers, denoted Tn[1]. Triangle numbers have a geometric interpretation similar to that of square numbers; adding the next value in the series is equivalent to adding an extra “layer” to the previous triangle:

Triangle numbers (wikipedia)
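As a quick check of the reduction above (again, the names below are mine, just for illustration), the sequential sum, the closed formula, and the triangle numbers can be compared for a few values of n in the Scala REPL:

// Sequential sum: (n - 1) + (n - 2) + ... + 1
def sumFormula(n: Int): Int = (1 until n).sum

// Closed formula: n(n - 1) / 2
def closedFormula(n: Int): Int = n * (n - 1) / 2

// Triangle numbers: T(n) = n(n + 1) / 2
def triangle(n: Int): Int = n * (n + 1) / 2

for (n <- 1 to 10) {
  assert(sumFormula(n) == closedFormula(n))    // the reduction above holds
  assert(closedFormula(n) == triangle(n - 1))  // handshakes for n people = T(n - 1), see [1]
}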

As we have seen, triangle numbers solve the handshake problem, as well as many other problems with the same abstract form: how many connections are needed to connect a group of n nodes such that every node is connected to every other node? This abstract formulation shows up, for example, in computer and telephone networks, where it yields Metcalfe’s law and is thus also related to the network effect.

Telephone connections (wikipedia)

But besides these relationships, the handshake problem also reminds me of the Gauss schoolroom anecdote, in which a young Gauss and his classmates were tasked with adding the numbers from 1 to 100 to keep them occupied. To the teacher’s surprise, young Gauss solved the problem immediately: he too had made the reduction we’ve seen above, and quickly obtained the sum. What Gauss did not know, and what I realized thanks to my friend’s answer, is that he was solving the handshake problem as well!
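And indeed, the two calculations line up; a one-liner in the Scala REPL confirms Gauss’s result:

// Gauss's classroom sum, 1 + 2 + ... + 100, computed directly and in closed form.
(1 to 100).sum    // 5050
100 * 101 / 2     // 5050, i.e. T(100): the number of handshakes among 101 people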

 


[1] Triangle numbers are in fact of the form n(n + 1) / 2, so Tn gives the number of handshakes for n + 1 people, not n.

Two negative views of Scala

It never hurts to read criticism of and downsides to a technology as an antidote to rose-tinted hype. Here are two not-so-positive accounts of Scala. The first one echoes a recurring theme in criticisms of Scala, namely that it is too complex, cryptic, and overly complicated:

http://yz.mit.edu/wp/true-scala-complexity/

Besides the comments addressing the specifics of that post[1], here’s a general response by Martin Odersky to criticisms of Scala as overly complex. The notion of language complexity is a deep subject that I’ve been thinking about lately, so I’ll save that discussion for another post.

Moving on, perhaps Scala’s biggest public embarrassment to date is the Yammer fiasco, where a move from Java to Scala was initially promising but eventually proved not to be worth it, and the code was moved back to Java. Here’s the original critique:

https://gist.github.com/1406238

and some useful coverage on InfoQ and further comments by the original author.

The Yammer story overlaps somewhat with the first criticism above; again the issue of complexity is brought up, and in particular how it becomes a problem given the shortage of engineers with Scala experience. But beyond more academic considerations of language complexity, what the Yammer story seems to point to is the relative immaturity of Scala as an industrial language. I recall comments saying that the problems brought up about Scala today are reminiscent of criticisms of Java in the 1.3 days, and that these issues are a natural part of a language’s evolution and improvement that will be resolved in time. Typesafe seems to be going down the right path.

But if we accept this interpretation, Scala’s successful evolution is a matter of resources. Does Typesafe have the financial muscle to address these issues diligently? Maybe!

 


[1] Commenters point out that the collections exercise the writer attempts as evidence of excessive complexity is unnatural, useless, and in fact impossible in any language, though Scala actually comes close.