unofficial mirror of guile-devel@gnu.org 
* "Pace is nothing without guile"
@ 2008-07-13 17:06 Neil Jerram
  2008-07-13 18:24 ` Greg Troxel
  2008-07-15 19:21 ` Ludovic Courtès
  0 siblings, 2 replies; 4+ messages in thread
From: Neil Jerram @ 2008-07-13 17:06 UTC (permalink / raw)
  To: guile-devel, guile-user

... That's a comment from coverage of the current England v South
Africa cricket match
(http://uk.cricinfo.com/talk/content/current/multimedia/360921.html).

But is Guile nothing without pace?

Well obviously it isn't "nothing", but I think Guile is perceived,
among both Scheme implementations and free scripting languages, as
being a bit slow, and I think that a large part of the reason for this
is that we have no systematic benchmarking.

So this email is about systematic performance data.  I was wondering
what benchmarks we could run to get good coverage of all of Guile's
functionality, and suddenly thought "of course, the test suite!"  The
test suite should, by definition, provide coverage of everything that
we care about.  Therefore I think that we should be able to start
collecting a lot of useful performance data by implementing a version
of "make check" that measures and records the time that each test
takes to run.
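To make the idea concrete, here is a minimal sketch of the measurement
side.  `run-test` is a hypothetical placeholder thunk standing in for
however the test suite actually invokes one test;
`get-internal-real-time` and `internal-time-units-per-second` are
standard Guile:

```scheme
;; Return the wall-clock duration, in seconds, of running a single
;; test.  RUN-TEST is a placeholder thunk for whatever actually
;; executes the test.
(define (measure-test-duration run-test)
  (let ((start (get-internal-real-time)))
    (run-test)
    (exact->inexact
     (/ (- (get-internal-real-time) start)
        internal-time-units-per-second))))
```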

What I'd like input/advice on is exactly how we store and collate
such data.  I think the system should ideally support:

- arbitrary later analysis of the collected data

- correlation of the result for a specific test with the exact source
code of that test at the time it was run...

- ...and hence, being able to work out (later) that the results
changed because the content of the test changed

- anyone running the tests and uploading data, not just Guile core developers

- associating a set of results with the relevant information about the
machine that they were obtained on (CPUs, RAM) in such a way that the
information is trustworthy, but without invading the uploader's
privacy.

So how do we do that?  Perhaps each test's content could be identified
by its Git (SHA-1) hash, together with the path of the repo containing
that version.  And I imagine that the results could take the form of a
file containing lines like:

("numbers.test" SHA1-HASH REPO-PATH DATE+TIME MACHINE-INFO MEASURED-DURATION)

That would allow sets of results to be concatenated for later
analysis.  But I'm not sure what the relevant MACHINE-INFO is and how
to represent that.
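As a sketch of how such lines could be produced and consumed (the
field values here are placeholders, not a settled format), Scheme's
`write` and `read` round-trip the records, so concatenated result
files need no extra parsing; Guile's `(uname)` is one possible, if
incomplete, source of MACHINE-INFO:

```scheme
;; Append one result record to FILE as a single S-expression line.
(define (log-result file test-name sha1 repo date machine duration)
  (let ((port (open-file file "a")))
    (write (list test-name sha1 repo date machine duration) port)
    (newline port)
    (close-port port)))

;; Read back every record from a (possibly concatenated) results file.
(define (read-results file)
  (call-with-input-file file
    (lambda (port)
      (let loop ((rec (read port)) (acc '()))
        (if (eof-object? rec)
            (reverse acc)
            (loop (read port) (cons rec acc)))))))

;; (uname) returns sysname/nodename/release/version/machine, which
;; could seed MACHINE-INFO -- though nodename may raise exactly the
;; privacy concern mentioned above.
```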

Any thoughts / comments / ideas?  Thanks for reading!

      Neil





Thread overview: 4+ messages
2008-07-13 17:06 "Pace is nothing without guile" Neil Jerram
2008-07-13 18:24 ` Greg Troxel
2008-07-13 22:31   ` Neil Jerram
2008-07-15 19:21 ` Ludovic Courtès
