Results for the Third Annual ICFP Programming Contest

 

 


Winners

           

First Place     Team PLClub
Second Place    Camls 'R Us
Third Place     Team Galois Connections
Fourth Place    The Merry Mercurians
Judges' Prize   Team Helikopter -- awarded for the coolest test image (GML source, image)


Submissions and Related Statistics

           

We had 39 entries from 38 teams. (Two entries from Team PLClub were sufficiently different to deserve separate submissions; either one would have won the contest.) The table below lists each team's name, the language(s) used, the tier its entry claimed, and the team size. A tier-1 or tier-2 entry would have had to be much faster than a tier-3 entry to beat it; all of the winners were tier-3 entries. Links are to external sites provided by the teams; let us know if your team has a site.

 

Team Name                       | Languages                                  | Tier | Team Size
\f.(\x.f(x x)) (\x.f(x x))      | SML                                        | 3    | 4
Ansuz                           | Perl 5                                     | 3    | 4
Automata for the People         | O'Caml 2.04                                | 2    | 1
Ben Lynn                        | Eiffel (SmallEiffel -0.76beta#4)           | 3    | 1
Cadence                         | C++                                        | 1    | 1
Callisto Diamond                | Smalltalk (Squeak)                         | 3    | 6
Camls 'R Us                     | O'Caml 3.0                                 | 3    | 7
Chris Lahey                     | bison/yacc, lex, C, Gtk+                   | 1    | 1
Cobleigh                        | Haskell 98                                 | 3    | 1
Cobra                           | Python                                     | 3    | 3
Daniel Wright                   | Java                                       | 3    | 1
David Alex Lamb                 | Java                                       | 1    | 1
Dylan Hackers                   | Dylan (Gwydion "d2c" 2.3.4pre1)            | 1    | 4
Exo-Plugarite Zulanga           | SML (MLton and SML/NJ)                     | 3    | 4
Galois Connections              | Haskell 98 (GHC-4.08)                      | 3    | 5
Helikopter                      | SML (SML/NJ)                               | 3    | 2
Ian Lance Taylor                | C++, Bison, flex                           | 2    | 1
jed                             | SML (Moscow ML)                            | 1    | 1
KsHsK                           | Haskell                                    | 1    | 1
Martin Hofmann                  | O'Caml 2.01                                | 1    | 1
Merry Mercurians                | Mercury                                    | 3    | 13
Nederwiet                       | Java                                       | 1    | 2
nibbler                         | Haskell                                    | 1    | 3
O'caml's Razor                  | O'Caml                                     | 1    | 4
PLClub                          | O'Caml 3.0                                 | 3    | 4
PLClubCN                        | O'Caml 3.0                                 | 3    | 4
Radical                         | Perl                                       | 2    | 1
Sandburst                       | Haskell/C                                  | 2    | 1
Shaolin Drunk Monkey            | C++                                        | 2    | 1
Sloneczko                       | Clean                                      | 3    | 2
Snail's Pace Trace              | Smalltalk (VisualWorks 3 non-commercial)   | 1    | 1
Space Leak                      | Haskell 98                                 | 3    | 1
Team OOPS                       | Java 2                                     | 1    | 1
The Corn Field(s) at UIUC       | Scheme (Bigloo)                            | 1    | 2
The Golden Mean                 | C++                                        | 3    | 1
The Seven Types of Ambiguity    | Java                                       | 3    | 3
Three Guys                      | Java                                       | 3    | 3
University of Western Ontario   | Scheme (Bigloo)                            | 3    | 2
Vladimir Dergachev              | C + flex                                   | 3    | 1

 

 


Test Suites and Summary of Results

 

allGmlTests.tar.gz contains the GML files we used for testing. The files cone-fractal.gml, large.gml, pipe.gml, chess.gml, and fractal.gml were used only in the final round of testing. We ran all the entries on all of the other files, except that we did not run an entry on files using features from tier levels higher than the entry claimed to implement.

 

To advance, an entry had to produce images that looked correct for all of the tests on which we ran it.    (We automated the process of determining average pixel errors with respect to our sample implementation so that we could quickly identify which images would most likely look wrong.)  We also eliminated one entry that exhausted our patience; it took over three days to complete one of the tier-3 tests. 
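(For the curious, the sketch below illustrates the idea behind that automated comparison. It is not the tool we actually used; it is a rough OCaml reconstruction that assumes both renderers write binary PPM (P6) images of the same dimensions, and the file names are placeholders.)

(* avgdiff.ml -- rough sketch of an image-difference check: compare two
   binary PPM (P6) images of equal size and report the average per-channel
   pixel difference.  Not the judges' actual tool; file names are examples. *)

(* Read the next whitespace-delimited header token, skipping '#' comment lines. *)
let rec read_token ic =
  let c = input_char ic in
  if c = '#' then begin
    let rec skip () = if input_char ic <> '\n' then skip () in
    skip ();
    read_token ic
  end
  else if c = ' ' || c = '\t' || c = '\n' || c = '\r' then read_token ic
  else begin
    let buf = Buffer.create 8 in
    Buffer.add_char buf c;
    let rec loop () =
      let c = input_char ic in
      if c = ' ' || c = '\t' || c = '\n' || c = '\r' then ()
      else (Buffer.add_char buf c; loop ())
    in
    loop ();
    Buffer.contents buf
  end

(* Read a P6 file: returns (width, height, raw RGB bytes). *)
let read_ppm file =
  let ic = open_in_bin file in
  if read_token ic <> "P6" then failwith (file ^ ": not a binary PPM");
  let w = int_of_string (read_token ic) in
  let h = int_of_string (read_token ic) in
  let _maxval = read_token ic in              (* assume 8-bit channels *)
  let data = really_input_string ic (w * h * 3) in
  close_in ic;
  (w, h, data)

let () =
  let (w1, h1, a) = read_ppm "reference.ppm"      (* our sample rendering  *)
  and (w2, h2, b) = read_ppm "submission.ppm" in  (* an entry's rendering  *)
  if (w1, h1) <> (w2, h2) then failwith "image sizes differ";
  let total = ref 0 in
  String.iteri (fun i c -> total := !total + abs (Char.code c - Char.code b.[i])) a;
  Printf.printf "average per-channel error: %.2f (out of 255)\n"
    (float_of_int !total /. float_of_int (String.length a))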

 

Ten teams remained.  For these teams, we looked at the rendering time for the tier-1 and tier-2 files that took a non-trivial amount of time to render, namely holes.gml, fov.gml, and intercyl.gml.  All rendering times are in seconds.

 

 

Team                 | holes |  fov | intercyl | geometric mean
The Golden Mean      |    90 |   60 |       42 |  60.98
Daniel Wright        |    30 |   30 |       31 |  30.33
Camls 'R Us          |    41 |   36 |       29 |  34.98
Merry Mercurians     |   277 |  503 |      307 | 349.73
Team PLClub          |     5 |    4 |        4 |   4.31
Ben Lynn             |    44 |   43 |       18 |  32.41
Galois Connections   |   319 |  190 |      274 | 255.13
Space Leak           |   575 | 1122 |      614 | 734.42
Sandburst            |    23 |   22 |       33 |  25.56
Shaolin Drunk Monkey |   139 | 2267 |      350 | 479.56
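For reference, the geometric mean of n rendering times is the n-th root of their product. A minimal OCaml sketch of the calculation (our own illustration, not part of the contest infrastructure):

(* Geometric mean of a list of rendering times, computed via logarithms. *)
let geometric_mean times =
  let n = float_of_int (List.length times) in
  exp (List.fold_left (fun acc t -> acc +. log t) 0.0 times /. n)

let () =
  (* The Golden Mean's times on holes, fov, and intercyl from the table above. *)
  Printf.printf "%.2f\n" (geometric_mean [90.; 60.; 42.])   (* prints 60.98 *)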

 

Observing that the tier-1 and tier-2 entries were not faster than the tier-3 entries, we eliminated the non-tier-3 entries. We then added the tier-3 files snowgoon.gml, dice.gml, and golf.gml to the timing tests. The cumulative geometric mean below covers these three files together with the three files from the previous round.

 

 

Team               | snowgoon |  dice | golf | Cumulative geometric mean
The Golden Mean    |      153 |   880 |  488 |  156.87
Daniel Wright      |      105 |   103 |   29 |   45.39
Camls 'R Us        |       70 |   100 |   39 |   47.64
Merry Mercurians   |     1514 |  5788 |  369 |  719.14
Team PLClub        |        8 |    10 |    3 |    5.17
Ben Lynn           |       36 |    75 |   37 |   38.78
Galois Connections |      476 |   745 |  194 |  323.33
Space Leak         |     3328 | 11255 | 1075 | 1586.58
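To make the cumulative figure concrete: Team PLClub's six times so far are 5, 4, and 4 seconds (holes, fov, intercyl) and 8, 10, and 3 seconds (snowgoon, dice, golf), so its cumulative geometric mean is (5 * 4 * 4 * 8 * 10 * 3)^(1/6) = 19200^(1/6), which is approximately 5.17, the value shown above.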

 

We eliminated the slowest entry and ran the seven remaining teams on five more programs: cone-fractal.gml, large.gml, pipe.gml, chess.gml, and fractal.gml. The first- and second-place teams produced correct images for all of these programs. The cumulative geometric mean covers these five files together with the six files from the previous rounds; no cumulative mean is given for entries that failed on any file. A * indicates failure to produce any sort of reasonable output.

 

 

Team               | cone-fractal | large | pipe | chess | fractal | Cumulative geometric mean
The Golden Mean    |          425 |     * |  196 |   720 |    1727 |
Daniel Wright      |          248 |     * |   98 |     * |    1964 |
Camls 'R Us        |           10 |    10 |   16 |    39 |       4 |  25.46
Merry Mercurians   |         1214 |    82 | 2130 | 14326 |    1349 |  949.7
Team PLClub        |           10 |     7 |   14 |    44 |      25 |   8.67
Ben Lynn           |          356 |     * |  718 |     * |      28 |
Galois Connections |           71 |    33 |   96 |   221 |      62 | 170.42

 

At this point the judges breathed an extended sigh of relief.

 


First-Place Team’s Images

 

We have converted Team PLClub’s tier-3 results to JPEG files for your viewing pleasure:

 


Greg Morrisett | John Reppy | icfp00@cs.cornell.edu