unofficial mirror of guile-devel@gnu.org 
* Syntax checks
@ 2002-04-06  6:25 Dirk Herrmann
  2002-04-06 15:38 ` Neil Jerram
  2002-04-07 10:05 ` Marius Vollmer
  0 siblings, 2 replies; 18+ messages in thread
From: Dirk Herrmann @ 2002-04-06  6:25 UTC (permalink / raw)


Hello everybody,

in the evaluator, there are a lot of syntax checks performed that could
probably better be performed in a previous syntax checking phase, keeping
the evaluator itself free of such checks.

As an example, there is the 'do loop construct: In the body of the do loop
you only have to execute expressions that can have a side effect.  We do
not have a proper analysis for which expressions can have a side effect
and which don't, but there is a simple syntactic criterion that can be
used:  If the expression is not a list, then it is an object or a variable
reference and can't have side effects.

Thus, the body of the do loop could be scanned in the macro transformer
and freed of unnecessary expressions.  Then, the corresponding check in
the evaluator could be removed.
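
For illustration, here is a small sketch of that criterion (hypothetical
code, not from the original mail): the non-list body expressions are
exactly the ones a transformer could discard.

```scheme
;; Sketch of the syntactic criterion: in this do body, the bare
;; variable reference i and the literal 42 are not lists, so they
;; cannot have side effects and could be dropped by the transformer.
;; Only the application (display i) needs to be kept.
(do ((i 0 (+ i 1)))
    ((= i 3))
  i             ; variable reference: side-effect free, removable
  42            ; literal: side-effect free, removable
  (display i))  ; application (a list): may have side effects, kept
```

Removing the first two body expressions leaves the behaviour unchanged;
the loop still prints 012.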

I would like to apply such changes, but I have some questions:

* Removing unnecessary expressions from code would change the source in a
way that can't be restored by unmemoizing.  Do we care?  In the long run
we will probably want to allow more transformations on the source anyway
for the sake of optimization.  Then, memoization/unmemoization won't work
and we will have to provide a different mechanism to record the
relationship between transformed code and the source.

* Should warnings be issued when dead code is eliminated from the source?

Best regards
Dirk Herrmann


_______________________________________________
Guile-devel mailing list
Guile-devel@gnu.org
http://mail.gnu.org/mailman/listinfo/guile-devel



* Re: Syntax checks
  2002-04-06  6:25 Syntax checks Dirk Herrmann
@ 2002-04-06 15:38 ` Neil Jerram
  2002-04-07  7:09   ` Dirk Herrmann
  2002-04-07 10:40   ` Marius Vollmer
  2002-04-07 10:05 ` Marius Vollmer
  1 sibling, 2 replies; 18+ messages in thread
From: Neil Jerram @ 2002-04-06 15:38 UTC (permalink / raw)
  Cc: Guile Development List

>>>>> "Dirk" == Dirk Herrmann <dirk@ida.ing.tu-bs.de> writes:

    Dirk> Hello everybody,
    Dirk> in the evaluator, there are a lot of syntax checks performed that could
    Dirk> probably better be performed in a previous syntax checking phase, keeping
    Dirk> the evaluator itself free of such checks.

    Dirk> As an example, there is the 'do loop construct: In the body of the do loop
    Dirk> you only have to execute expressions that can have a side effect.  We do
    Dirk> not have a proper analysis for which expressions can have a side effect
    Dirk> and which don't, but there is a simple syntactic criterion that can be
    Dirk> used:  If the expression is not a list, then it is an object or a variable
    Dirk> reference and can't have side effects.

    Dirk> Thus, the body of the do loop could be scanned in the macro transformer
    Dirk> and freed of unnecessary expressions.  Then, the corresponding check in
    Dirk> the evaluator could be removed.

What you say makes sense, but I'm struggling to see why it's really
worth doing this.  Are you aware of a lot of code that puts pointless
expressions in `do' bodies?

Also, does this fit into a more general picture of how we should
evolve the interactions between transformation, evaluation,
compilation etc.?  (I don't claim that such a picture exists, but it
would be nicer if it did.)

    Dirk> I would like to apply such changes, but I have some questions:

    Dirk> * Removing unnecessary expressions from code would change the source in a
    Dirk> way that can't be restored by unmemoizing.  Do we care?  In the long run
    Dirk> we will probably want to allow more transformations on the source anyway
    Dirk> for the sake of optimization.  Then, memoization/unmemoization won't work
    Dirk> and we will have to provide a different mechanism to record the
    Dirk> relationship between transformed code and the source.

I don't have much experience here, but your general point sounds right
to me.  That is, rather than unmemoizing, we may eventually need just
to keep a copy of the original source.

From the debugging point of view, the requirements are that

- breakpoint positions are preserved as code is transformed

- when a breakpoint is hit, it is possible to map back from the
  transformed breakpoint location to the coordinates of the breakpoint
  in the original source.

    Dirk> * Should warnings be issued when dead code is eliminated from the source?

I guess it should be configurable :-)

It isn't currently possible to set source properties on something that
isn't a pair, so at least you don't need to worry about eliminating
code with important source properties on it (such as a breakpoint).

        Neil



* Re: Syntax checks
  2002-04-06 15:38 ` Neil Jerram
@ 2002-04-07  7:09   ` Dirk Herrmann
  2002-04-08 18:27     ` Neil Jerram
  2002-04-07 10:40   ` Marius Vollmer
  1 sibling, 1 reply; 18+ messages in thread
From: Dirk Herrmann @ 2002-04-07  7:09 UTC (permalink / raw)
  Cc: Guile Development List

On 6 Apr 2002, Neil Jerram wrote:

> >>>>> "Dirk" == Dirk Herrmann <dirk@ida.ing.tu-bs.de> writes:
> 
>     Dirk> Hello everybody,
>     Dirk> in the evaluator, there are a lot of syntax checks performed that could
>     Dirk> probably better be performed in a previous syntax checking phase, keeping
>     Dirk> the evaluator itself free of such checks.
> 
>     Dirk> As an example, there is the 'do loop construct: In the body of the do loop
>     Dirk> you only have to execute expressions that can have a side effect.  We do
>     Dirk> not have a proper analysis for which expressions can have a side effect
>     Dirk> and which don't, but there is a simple syntactic criterion that can be
>     Dirk> used:  If the expression is not a list, then it is an object or a variable
>     Dirk> reference and can't have side effects.
> 
>     Dirk> Thus, the body of the do loop could be scanned in the macro transformer
>     Dirk> and freed of unnecessary expressions.  Then, the corresponding check in
>     Dirk> the evaluator could be removed.
> 
> What you say makes sense, but I'm struggling to see why it's really
> worth doing this.  Are you aware of a lot of code that puts pointless
> expressions in `do' bodies?

No, certainly not.  However, the evaluator has to check for it.  The
reason is the following:  SCM_CEVAL _must_ be called with a non-immediate.  
That means that wherever SCM_CEVAL is to be called with some expression,
the special case of an immediate has to be checked before doing the actual
call.

In the execution of the 'do body, every expression is checked to be an
immediate, and if it isn't, then SCM_CEVAL is called.  It is this kind of
check (and a couple of similar ones) that I try to remove, since it slows
down the evaluator.

> Also, does this fit into a more general picture of how we should
> evolve the interactions between transformation, evaluation,
> compilation etc.?  (I don't claim that such a picture exists, but it
> would be nicer if it did.)

I at least am not aware of such a general picture.  But, whatever it would
be like, it would require the following:
* a preprocessing (or compilation) of the code in order to reduce the
  actual work of the evaluator to a minimum.
* as a consequence, a more general way to track the relationship between
  source code and preprocessed (aka memoized) code.

My personal strategy is to move forward in small steps and learn during
that process.  A complete re-write of the evaluator and all of its
surrounding code (debugger, stack handling, source tracking, ...) is at
least for me too complex.  I don't know about the status of the other
attempts that are currently made towards this goal, but it seems that
there is little progress.

As you may have noticed, I have already started to clean up the evaluator
code (although that process is far from finished yet).  The number of jump
labels has been reduced, the intra-function communication has been
simplified and the code has been made somewhat easier (IMO) to understand.  
Things are going slowly, but they are proceeding :-)

>     Dirk> * Removing unnecessary expressions from code would change the source in a
>     Dirk> way that can't be restored by unmemoizing.  Do we care?  In the long run
>     Dirk> we will probably want to allow more transformations on the source anyway
>     Dirk> for the sake of optimization.  Then, memoization/unmemoization won't work
>     Dirk> and we will have to provide a different mechanism to record the
>     Dirk> relationship between transformed code and the source.
> 
> I don't have much experience here, but your general point sounds right
> to me.  That is, rather than unmemoizing, we may eventually need just
> to keep a copy of the original source.
> 
> From the debugging point of view, the requirements are that
> 
> - breakpoint positions are preserved as code is transformed
> 
> - when a breakpoint is hit, it is possible to map back from the
>   transformed breakpoint location to the coordinates of the breakpoint
>   in the original source.

I have to admit that I have not taken a look at how debugging works.  If
you agree, we can work together: before making any changes to the
evaluator, I would double-check with you.

>     Dirk> * Should warnings be issued when dead code is eliminated from the source?
> 
> I guess it should be configurable :-)

That's what I think, too.  However, there are some difficulties with
macros:  a macro can be used perfectly fine, yet result in a construct
with some dead code.  You might not be interested in warnings in such a
special situation, while in all other situations you might want to be
warned.  I think it is best to start with issuing warning messages to a
dedicated output port and think of improvements later.

Best regards
Dirk Herrmann



* Re: Syntax checks
  2002-04-06  6:25 Syntax checks Dirk Herrmann
  2002-04-06 15:38 ` Neil Jerram
@ 2002-04-07 10:05 ` Marius Vollmer
  1 sibling, 0 replies; 18+ messages in thread
From: Marius Vollmer @ 2002-04-07 10:05 UTC (permalink / raw)
  Cc: Guile Development List

Dirk Herrmann <dirk@ida.ing.tu-bs.de> writes:

> Hello everybody,
> 
> in the evaluator, there are a lot of syntax checks performed that could
> probably better be performed in a previous syntax checking phase, keeping
> the evaluator itself free of such checks.

This is a very interesting issue!

> * Removing unnecessary expressions from code would change the source in a
> way that can't be restored by unmemoizing.  Do we care?

I'd say we needn't care.  Requiring that unmemoizing always works
perfectly will inhibit almost all interesting work towards a 'real'
compiler.

> In the long run we will probably want to allow more transformations
> on the source anyway for the sake of optimization.  Then,
> memoization/unmemoization won't work and we will have to provide a
> different mechanism to record the relationship between transformed
> code and the source.

Exactly.

> * Should warnings be issued when dead code is eliminated from the source?

Hmm, when the memoization is a distinct phase from execution, I'd say
we might emit warnings (when the user wants them), but when
memoization is intertwined with execution like it is now, the warnings
will come at unexpected times, I'm afraid, and might be more annoying
than helpful.  So, they should be off by default.

(more general musings in my reply to Neil...)


* Re: Syntax checks
  2002-04-06 15:38 ` Neil Jerram
  2002-04-07  7:09   ` Dirk Herrmann
@ 2002-04-07 10:40   ` Marius Vollmer
  2002-04-09 20:48     ` Lynn Winebarger
  1 sibling, 1 reply; 18+ messages in thread
From: Marius Vollmer @ 2002-04-07 10:40 UTC (permalink / raw)


Neil Jerram <neil@ossau.uklinux.net> writes:

> Also, does this fit into a more general picture of how we should
> evolve the interactions between transformation, evaluation,
> compilation etc.?  (I don't claim that such a picture exists, but it
> would be nicer if it did.)

I have a dim picture of a compilation framework in my head.  Here is
what I wrote about it in a mail to Rob:

    But having a compiler for Guile is still an important goal, but we
    should do it Right rather than rush it.  I plan to attack it from
    above, by first straightening out the module system to be friendly to
    compiler semantics, redoing our current memoizer to be no longer
    tangled with the evaluator and then having a way to write memoized
    code to disk and load it back in.  Down that road are enough
    design problems that are important for good compiler support, long
    before we come close to worrying about how to produce native
    machine code (or even cross compiling).

    When we are able to write memoized code to disk and this does
    indeed speed up loading, we can make syntax-case our standard
    macro expander and the memoizer wouldn't have to work with macros
    at all.  We could then also move GOOPS even closer to the core
    [because it will load in tolerable time], using it for ports and
    the exception system, for example.  Maybe even throw out smobs.

    Then (if development is linear, which it needn't be of course), we
    can seriously think about compiling to machine code, by offering a
    choice between the memoizer and other code generators.

    [...]

    I would like to follow the path outlined above.  Hobbit would
    enter quite late, and would not have to handle macros at all.  It
    would get expanded code with certain additional guarantees
    (i.e. unique variable names, correct syntax, only a few core
    forms, maybe even some special language that isn't Scheme).

    > It's not clear that full expansion at compile time is always
    > acceptable since it looks like, given Guile's current API,
    > people could have been using the macro system as a simple way to
    > write memoized code for dynamic programming, which shouldn't be
    > fully expanded at compile-time.

    I don't think we want to support that.  That practice would be
    even worse than unmotivated use of eval, and instead of letting
    people get away with it, we should educate them.

    > And of course there's a whole raft of other issues related to
    > how macros and compilation should interact. i.e.

    Common Lisp will offer good hints, I guess.  This is what I want
    to iron out when separating the memoizer from the evaluator.  I
    want to make these questions 'central' to Guile, that is, the
    basic semantics of Guile are affected by this, not just the
    odd-ball compiler user.  This will ensure that compilation is
    consistent with using Guile interactively.

So I see Dirk's syntax check cleanups as steps towards a separation of
the memoizer and the executor.


* Re: Syntax checks
  2002-04-07  7:09   ` Dirk Herrmann
@ 2002-04-08 18:27     ` Neil Jerram
  0 siblings, 0 replies; 18+ messages in thread
From: Neil Jerram @ 2002-04-08 18:27 UTC (permalink / raw)
  Cc: Guile Development List

>>>>> "Dirk" == Dirk Herrmann <dirk@ida.ing.tu-bs.de> writes:

    Dirk> No, certainly not.  However, the evaluator has to check for it.  The
    Dirk> reason is the following:  SCM_CEVAL _must_ be called with a non-immediate.  
    Dirk> That means that wherever SCM_CEVAL is to be called with some expression,
    Dirk> the special case of an immediate has to be checked before doing the actual
    Dirk> call.

    Dirk> In the execution of the 'do body, every expression is checked to be an
    Dirk> immediate, and if it isn't, then SCM_CEVAL is called.  It is this kind of
    Dirk> check (and a couple of similar ones) that I try to remove, since it slows
    Dirk> down the evaluator.

That sounds fine.

    Dirk> As you may have noticed, I have already started to clean up the evaluator
    Dirk> code (although that process is far from finished yet).  The number of jump
    Dirk> labels has been reduced, the intra-function communication has been
    Dirk> simplified and the code has been made somewhat easier (IMO) to understand.  
    Dirk> Things are going slowly, but they are proceeding :-)

Yes, I had noticed, although I'm not sure I'd call the code
understandable yet :-) (not by me, anyway)  Good luck!

    >> From the debugging point of view, the requirements are that
    >> 
    >> - breakpoint positions are preserved as code is transformed
    >> 
    >> - when a breakpoint is hit, it is possible to map back from the
    >> transformed breakpoint location to the coordinates of the breakpoint
    >> in the original source.

    Dirk> I have to admit that I have not taken a look at how debugging works.  If
    Dirk> you agree, we can work together:  Before doing any changes to the
    Dirk> evaluator, I would double check with you.

Sure; the only possible problem is that I don't have much time during
the week these days, so I might slow you down.  I'd be happy to check
things over the weekend.

        Neil



* Re: Syntax checks
  2002-04-07 10:40   ` Marius Vollmer
@ 2002-04-09 20:48     ` Lynn Winebarger
  2002-04-13  9:01       ` Dirk Herrmann
  2002-04-14 17:52       ` Marius Vollmer
  0 siblings, 2 replies; 18+ messages in thread
From: Lynn Winebarger @ 2002-04-09 20:48 UTC (permalink / raw)


On Sunday 07 April 2002 05:40, Marius Vollmer wrote:
> Neil Jerram <neil@ossau.uklinux.net> writes:
> > Also, does this fit into a more general picture of how we should
> > evolve the interactions between transformation, evaluation,
> > compilation etc.?  (I don't claim that such a picture exists, but it
> > would be nicer if it did.)
>     [snip]
>     But having a compiler for Guile is still an important goal, but we
>     should do it Right rather than rush it.  I plan to attack it from
>     above, by first straightening out the module system to be friendly to
>     compiler semantics, redoing our current memoizer to be no longer
>     tangled with the evaluator and then having a way to write memoized
>     code to disk and load it back in.  Down that road are enough
>     design problems that are important for good compiler support, long
>     before we come close to worrying about how to produce native
>     machine code (or even cross compiling).

        Are you actively working on the module system/first class 
environments?  The TODO list currently only lists 1.6 and "Eventually"
as target times.   I am interested in this particular task, but am still poking
around the source.   It's not entirely clear what the exact difference between
environments and modules is (or should be).  Well, there's import/export and
syntax readers.  
     I did go back and read a lot of discussion from late 2000/early 2001,
but I'm not sure how much has sunk in yet.  
     Also, while I'm aware modules should act orthogonally to objects in
terms of introducing namespaces, it seems to me the module system
should allow constructs similar to the imperative object-oriented languages.
Before I go on, though, I'd like to find out if there's any point to doing so.

>  > [interaction of macros and compilation should interact how? ]       
>     Common Lisp will offer good hints, I guess.  This is what I want
>     to iron out when separating the memoizer from the evaluator.  I
>     want to make these questions 'central' to Guile, that is, the
>     basic semantics of Guile are affected by this, not just the
>     odd-ball compiler user.  This will ensure that compilation is
>     consistent with using Guile interactively.
       
         The real question (lingering from at least late 2000) seems to be
whether lambda abstractions should delay expansion as well as evaluation.
My first impulse is to say it shouldn't, that macros are "evaluated" at read 
time.  Among other effects of using lambda to delay expansion, you have
introduced a definite evaluation ordering of applications.  I'm guessing one
of the appeals of this behaviour is that in
(define (foo x) (bar x))
(define (bar x) (+ x 5))
(define-syntax bar (syntax-rules () ((_ x) (+ x 5))))

    the 2 definitions of bar work "the same".  However, IMO, the second
definition should yield an  error in  (foo 4)  because it's evaluation time 
and bar evaluates to a macro, and 5 is not "syntax". 
    Mainly, however, I see this as a kind of lexical scoping - if you 
re-evaluated macros whenever they changed, you have a kind of 
dynamic scope.  I know this was characterized by Marius in the opposite
way in the earlier (late 2000) discussion.  I.e. that because macro expanding
at read time captures whatever value of the syntax binding was lying around
rather than the binding itself (to be used over and over), it is "dynamic".  Well,
I don't have a great counter to this, it's just my intuition says that expanding
at read time gives you a "what you get is what you write" promise of 
lexicality.  Or actually that the other scheme is even more dynamic than 
expanding at read  time.  Besides which the expansion stage is supposed to 
occur (completely) before either interpretation or compilation, not intertwined
with it.  I guess I sort of see define-syntax as implicitly introducing a new,
inescapable and non-bracketed scope.  
    Probably the most compelling reason to expand at read time, though, is that
any sane compilation system would not go re-compiling every bit of code just
because someone redefines lambda or if at the repl.

Lynn


* Re: Syntax checks
  2002-04-09 20:48     ` Lynn Winebarger
@ 2002-04-13  9:01       ` Dirk Herrmann
  2002-04-13 12:48         ` Neil Jerram
                           ` (3 more replies)
  2002-04-14 17:52       ` Marius Vollmer
  1 sibling, 4 replies; 18+ messages in thread
From: Dirk Herrmann @ 2002-04-13  9:01 UTC (permalink / raw)
  Cc: Marius Vollmer, Guile Development List

On Tue, 9 Apr 2002, Lynn Winebarger wrote:

>          The real question (lingering from at least late 2000) seems to be
> whether lambda abstractions should delay expansion as well as evaluation.
> My first impulse is to say it shouldn't, that macros are "evaluated" at read 
> time.  Among other effects of using lambda to delay expansion, you have
> introduced a definite evaluation ordering of applications.  I'm guessing one
> of the appeals of this behaviour is that in
> (define (foo x) (bar x))
> (define (bar x) (+ x 5))
> (define-syntax bar (syntax-rules () ((_ x) (+ x 5))))
> 
>     the 2 definitions of bar work "the same".  However, IMO, the second
> definition should yield an  error in  (foo 4)  because it's evaluation time 
> and bar evaluates to a macro, and 5 is not "syntax". 
>     Mainly, however, I see this as a kind of lexical scoping - if you 
> re-evaluated macros whenever they changed, you have a kind of 
> dynamic scope.  I know this was characterized by Marius in the opposite
> way in the earlier (late 2000) discussion.  I.e. that because macro expanding
> at read time captures whatever value of the syntax binding was lying around
> rather than the binding itself (to be used over and over), it is "dynamic".  Well,
> I don't have a great counter to this, it's just my intuition says that expanding
> at read time gives you a "what you get is what you write" promise of 
> lexicality.  Or actually that the other scheme is even more dynamic than 
> expanding at read  time.  Besides which the expansion stage is supposed to 
> occur (completely) before either interpretation or compilation, not intertwined
> with it.  I guess I sort of see define-syntax as implicitly introducing a new,
> inescapable and non-bracketed scope.  
>     Probably the most compelling reason to expand at read time, though is that
> any sane compilation system would not go re-compiling every bit of code just
> because someone redefines lambda or if at the repl.

I agree with you that re-compilation in case of a macro redefinition is of
questionable worth.  Moreover, I don't see a demand for it:  Guile does
not provide such a feature, and I can't remember that I ever saw a user
requesting it on the list.

But IMO the way Guile currently does it is broken.  Look at the
following example:

(define (foo x) (if x (bar) (bar)))
(defmacro bar () ''first)
(foo #t)
--> first
(defmacro bar () ''second)
(foo #t)
--> first
(foo #f)
--> second
(foo #t)
--> first

First, we see that macro expansion happens during evaluation.  Otherwise,
if expansion were done after 'read, the references to 'bar would not be
expanded: 'bar is not known to be a macro at that time.

Second, we see that there is no re-compilation.  Otherwise the second
execution of (foo #t) would have resulted in 'second.

Third, we see that macro expansion is only done in those paths which are
actually executed.  Otherwise after the first execution of (foo #t) _all_
references to 'bar would have been replaced by 'first.

IMO, the current solution is broken.  Depending on the evaluation order
and the actual execution paths that are taken, you can get completely
different expansions, although the code is syntactically the same.  It is
easy to create an example where it depends on _input data_ how the
expansion will be done.  Ugh.
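
Such input dependence is easy to sketch (a hypothetical session
extending the example above; the behaviour shown is the lazy, per-path
memoization just described, not something guaranteed by any standard):

```scheme
;; Hypothetical session under lazy, per-path expansion: which
;; definition of bar gets memoized into each branch of foo's if
;; depends on the argument values foo happens to see first,
;; i.e. on input data.
(define (foo x) (if x (bar) (bar)))
(defmacro bar () ''first)
(foo (read))   ; whichever branch this input selects is fixed to 'first
(defmacro bar () ''second)
(foo (read))   ; a branch first reached now expands to 'second instead
```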

Thus, I suggest to first go the way that Lynn suggests: Do expansion
after reading, but don't care about re-compilation.  If we later decide
for re-compilation, we can think about possible solutions then.

Best regards
Dirk Herrmann



* Re: Syntax checks
  2002-04-13  9:01       ` Dirk Herrmann
@ 2002-04-13 12:48         ` Neil Jerram
  2002-04-13 18:28           ` Lynn Winebarger
  2002-04-13 18:10         ` Lynn Winebarger
                           ` (2 subsequent siblings)
  3 siblings, 1 reply; 18+ messages in thread
From: Neil Jerram @ 2002-04-13 12:48 UTC (permalink / raw)
  Cc: Lynn Winebarger, Marius Vollmer, Guile Development List

>>>>> "Dirk" == Dirk Herrmann <dirk@ida.ing.tu-bs.de> writes:

    Dirk> Thus, I suggest to first go the way that Lynn suggests: Do expansion
    Dirk> after reading, but don't care about re-compilation.  If we later decide
    Dirk> for re-compilation, we can think about possible solutions then.

I broadly agree with your argument, but have just one question.  Does
this last point mean that macro definitions will only work if they are
placed _before_ any expressions that use them?  Or am I missing
something cleverer here?

        Neil



* Re: Syntax checks
  2002-04-13  9:01       ` Dirk Herrmann
  2002-04-13 12:48         ` Neil Jerram
@ 2002-04-13 18:10         ` Lynn Winebarger
  2002-04-14 18:18           ` Marius Vollmer
  2002-04-14 18:11         ` Marius Vollmer
  2002-04-23 21:55         ` Thien-Thi Nguyen
  3 siblings, 1 reply; 18+ messages in thread
From: Lynn Winebarger @ 2002-04-13 18:10 UTC (permalink / raw)
  Cc: Marius Vollmer, Guile Development List

On Saturday 13 April 2002 04:01, Dirk Herrmann wrote:
> I agree with you that re-compilation in case of a macro redefinition is of
> questionable worth.  Moreover, I don't see a demand for it:  Guile does
> not provide such a feature, and I can't remember that I ever saw a user
> requesting it on the list.

   In the discussion I was looking at (around December 2000, referenced in 
the module design notes),   Mikael Djurfeldt was arguing (if I understand 
him correctly) that object systems based on macros were inconvenient to
experiment with because changing core definitions didn't change the 
definitions that depended on them [in a course or lab he was teaching].  He
wanted changes in macros to be reflected in code that used them (including
reloading files/modules).
    Maybe there should be some way (for development purposes) to 
re-expand macros, but I don't believe it should be the default semantics 
- or even  that it should supported in normal use (i.e. without an evaluator
option being flipped on somewhere).  In particular, any compiler generated
code should not be required to support it without specially requesting that 
support.

> But, IMO the way guile currently does it is broken (IMO).  Look at the
> following example:
> [snipped - guile currently lazily evaluates macros with dynamic scoping]
>
> IMO, the current solution is broken.  Depending on the evaluation order
> and the actual execution paths that are taken, you can get completely
> different expansions, although the code is syntactically the same.  It is
> easy to create an example where it depends on _input data_ how the
> expansion will be done.  Ugh.

     Ahh, I had only seen that it expanded bar after re-definition; I didn't
realize it didn't consistently do that.  That seems worse to me (like the
worst of both worlds: all the variable capture problems of dynamic scope
with none of the hope of correcting them).

Lynn


* Re: Syntax checks
  2002-04-13 12:48         ` Neil Jerram
@ 2002-04-13 18:28           ` Lynn Winebarger
  0 siblings, 0 replies; 18+ messages in thread
From: Lynn Winebarger @ 2002-04-13 18:28 UTC (permalink / raw)
  Cc: Marius Vollmer, Guile Development List

On Saturday 13 April 2002 07:48, Neil Jerram wrote:
> I broadly agree with your argument, but have just one question.  Does
> this last point mean that macro definitions will only work if they are
> placed _before_ any expressions that use them?
        Yes.

>  Or am I missing something cleverer here?

       Not that I know of.  The productions of macro definitions don't get
expanded until the macro is applied, so recursive use is fine.  Those
productions should (lexically) scope to wherever the macro was defined 
(like a module) so that the problem of exporting extra syntax can be 
avoided.  In some sense this should be 'obvious', as macros are just lambdas
and they scope lexically; on the other hand it's not, since macros just produce
new code and the evaluator isn't itself scoped to the lexical environment of
the macro.  What's the correct evaluation of

(define-syntax foo (syntax-rules () ((_) (bar))))
(define-syntax bar (syntax-rules () ((_) 1)))
(let-syntax ((bar (syntax-rules () ((_) 2))))
                   (foo))

    I'd say it should be 1, even though (foo) produces (bar) in an environment
with a new binding for bar.  It should work the same way for modules.

Lynn


* Re: Syntax checks
  2002-04-09 20:48     ` Lynn Winebarger
  2002-04-13  9:01       ` Dirk Herrmann
@ 2002-04-14 17:52       ` Marius Vollmer
  2002-04-29 23:55         ` Lynn Winebarger
  1 sibling, 1 reply; 18+ messages in thread
From: Marius Vollmer @ 2002-04-14 17:52 UTC (permalink / raw)
  Cc: Guile Development List

Lynn Winebarger <owinebar@free-expression.org> writes:

>         Are you actively working on the module system/first class 
> environments?

No, not actively.

>  The TODO list currently only lists 1.6 and "Eventually" as target
> times.  I am interested in this particular task, but am still poking
> around the source.  It's not entirely clear what the exact
> difference between environments and modules is (or should be).

The environments are intended to provide the run-time data structures
that implement the module system.

I myself am not entirely sure how to use the environments, or whether we
need their--although elegant--richness.  When developing the semantics
of the module system, we should not try to move it into such a
direction that it fits perfectly with the existing environments
interfaces.  If it does fit, so much the better, but that should not
be the main goal.

> Well, there's import/export and syntax readers.  I did go back and
> read a lot of discussion from late 2000/early 2001, but I'm not sure
> how much has sunk in yet.  Also, while I'm aware modules should act
> orthogonally to objects in terms of introducing namespaces, it seems
> to me the module system should allow constructs similar to the
> imperative object-oriented languages.

Can you elaborate?  Are you talking about data hiding, inheritance,
etc?

> Before I go on, though, I'd like to find out if there's any point to
> doing so.

Hmm, I can't really follow you here.  What questions do you need
answered precisely to find out?  I'm not sure what you would like to
"go on" with.

>          The real question (lingering from at least late 2000) seems
> to be whether lambda abstractions should delay expansion as well as
> evaluation.  My first impulse is to say it shouldn't, that macros
> are "evaluated" at read time.

Yep.  I think we should be careful about defining our 'times'.  'Read
time' would be during the call to 'read', but that's not related to
macros yet.  'read' is also used for reading general data structures
in the sexp language.  'Read time' is concerned with evaluating the
"#." construct, etc, but not with Scheme macros.

I think that following the 'read time' (of a Scheme program), we
should always have a 'compile time' (even when we are going to execute
the program right away).  During this compile time, Scheme macros are
expanded.  Then might follow 'execute time'.

>  Among other effects of using lambda to delay expansion, you have
> introduced a definite evaluation ordering of applications.  I'm guessing one
> of the appeals of this behaviour is that in
> (define (foo x) (bar x))
> (define (bar x) (+ x 5))
> (define-syntax bar (syntax-rules () ((_ x) (+ x 5))))
> 
>     the 2 definitions of bar work "the same".  However, IMO, the second
> definition should yield an  error in  (foo 4)  because it's evaluation time 
> and bar evaluates to a macro,

Yes, and a block compiler might warn about this in advance.  (Both
about redefining bar and about using bar as a function and later
defining it as a macro.)

Hmm, I wouldn't say "bar evaluates to a macro".  This makes sense
right now because of the way our evaluator works, but I think it is
wrong to say that 'bar' is evaluated at all.  It is recognized by the
compiler (or memoizer) as a macro.

> and 5 is not "syntax".

I don't understand, could'ya explain?

>     Mainly, however, I see this as a kind of lexical scoping - if
> you re-evaluated macros whenever they changed, you have a kind of
> dynamic scope.  I know this was characterized by Marius in the
> opposite way in the earlier (late 2000) discussion.  I.e. that
> because macro expanding at read time captures whatever value of the
> syntax binding was lying around rather than the binding itself (to
> be used over and over), it is "dynamic".

Hmm, if I remember the discussion right, we were talking about two
kinds of consistencies of a system, not about scoping disciplines.
Consider the following simple model of interactive development of a
'system': you start with a set of files and load them into a freshly
started Guile process.  You then make changes to the files and load
them again.  Do this a couple of times.  At the end, you have a new
set of files, and Guile process in a certain state.  You then start a
second, fresh Guile process and load the new set of files.

When the two Guile processes must be in the same state, we would have
"static consistency".  The state of the system is only determined by
what is in the files that describes it.  When the two Guile processes
are allowed to differ in their states, we would have "dynamic
consistency".  The state of the system is determined by the operations
performed on the Guile process (in our simple model, the only
operation was loading a file).


Now I argue that static consistency is _very_ hard to achieve in full
generality, and then only at great cost, and maybe wouldn't even be
desirable from a user's point of view.  Re-expanding macros when their
definition changes is one example.  You need to keep extensive data
structures around (though maybe not in core) to realize it, it would
take significant time to do, and it would probably make it needlessly
difficult to steer your system through some 'illegal' configurations
when you need to make coordinated changes to a multitude of places.

I think it will be better to not try to achieve static consistency
automatically.  Dynamic consistency, of course, is trivial to have
(basically, you can't avoid it).

For example, instead of magically reexpanding all uses of a redefined
macro, we should simply require the user to reload or recompile all
affected files, if he so desires.  The system can help by providing a
list of affected files, and doing the whole reloading/recompiling upon
a simple command.

Also, I think we should extend this to bindings in modules: when the
list of exported bindings of a module changes (say), the current
proposal is to automatically correct all existing uses of the affected
bindings.  I now think it will be better to fix the list of imported
bindings when executing a 'define-module' or 'use-modules' (if it
survives) statement.  This is Dirk's signatures idea, and I now
finally got it, I think.  When you want a changed export list to take
effect, you need to reload/recompile the files that are affected by
it.


* Re: Syntax checks
  2002-04-13  9:01       ` Dirk Herrmann
  2002-04-13 12:48         ` Neil Jerram
  2002-04-13 18:10         ` Lynn Winebarger
@ 2002-04-14 18:11         ` Marius Vollmer
  2002-04-23 21:55         ` Thien-Thi Nguyen
  3 siblings, 0 replies; 18+ messages in thread
From: Marius Vollmer @ 2002-04-14 18:11 UTC (permalink / raw)
  Cc: Lynn Winebarger, Guile Development List

Dirk Herrmann <dirk@ida.ing.tu-bs.de> writes:

> IMO, the current solution is broken.

Yes, my opinion as well.

> Thus, I suggest to first go the way that Lynn suggests: Do
> expansion after reading, but don't care about re-compilation.  If we
> later decide for re-compilation, we can think about possible
> solutions then.

Yes.  More detailed:

 - revoke macro transformers from being stored in variables.  I.e.

    (define-macro (foo ...) ...)
    (define bar foo)

   will no longer work.  Instead, put macros into the toplevel by
   binding them directly to a symbol.  We now have

      (module lookup)    (variable-ref)
    symbol ------> variable ------> #<macro>

   but should have

      (module lookup)
    symbol ------> #<macro>

 - Make syntax-case use this arrangement, and make it work correctly
   with modules in general.

 - Separate memoization from execution.


* Re: Syntax checks
  2002-04-13 18:10         ` Lynn Winebarger
@ 2002-04-14 18:18           ` Marius Vollmer
  0 siblings, 0 replies; 18+ messages in thread
From: Marius Vollmer @ 2002-04-14 18:18 UTC (permalink / raw)
  Cc: Dirk Herrmann, Guile Development List

Lynn Winebarger <owinebar@free-expression.org> writes:

>     Maybe there should be some way (for development purposes) to 
> re-expand macros, but I don't believe it should be the default semantics 
> - or even that it should be supported in normal use (i.e. without an evaluator
> option being flipped on somewhere).  In particular, any compiler generated
> code should not be required to support it without specially requesting that 
> support.

I say we should support it by making sure reloading of files (source
or compiled) behaves sanely in general.  People who want their changed
macro definitions to take effect can then reload or recompile the
affected code.


* Re: Syntax checks
  2002-04-13  9:01       ` Dirk Herrmann
                           ` (2 preceding siblings ...)
  2002-04-14 18:11         ` Marius Vollmer
@ 2002-04-23 21:55         ` Thien-Thi Nguyen
  3 siblings, 0 replies; 18+ messages in thread
From: Thien-Thi Nguyen @ 2002-04-23 21:55 UTC (permalink / raw)
  Cc: owinebar, mvo, guile-devel

   From: Dirk Herrmann <dirk@ida.ing.tu-bs.de>
   Date: Sat, 13 Apr 2002 11:01:26 +0200 (MEST)

   Thus, I suggest to first go the way that Lynn suggests: Do expansion
   after reading, but don't care about re-compilation.  If we later
   decide for re-compilation, we can think about possible solutions
   then.

could you summarize this under workbook/compilation/ somewhere?
(this is a new directory.)  describing the "define evaluation model"
steps there is also a good idea.

thi


* Re: Syntax checks
  2002-04-14 17:52       ` Marius Vollmer
@ 2002-04-29 23:55         ` Lynn Winebarger
  2002-05-07 19:24           ` Marius Vollmer
  0 siblings, 1 reply; 18+ messages in thread
From: Lynn Winebarger @ 2002-04-29 23:55 UTC (permalink / raw)
  Cc: Guile Development List

Sorry for taking so long, I've been browsing guile.  I still haven't
thoroughly read Waddell/Dybvig's paper on modules, either, but it's
been 2 weeks so I thought I should go ahead and try to reply.

On Sunday 14 April 2002 12:52, Marius Vollmer wrote:
> Lynn Winebarger <owinebar@free-expression.org> writes:
> >  The TODO list currently only lists 1.6 and "Eventually" as target
> > times.  I am interested in this particular task, but am still poking
> > around the source.  It's not entirely clear what the exact
> > difference between environments and modules is (or should be).
> 
> The environments are intended to provide the run-time data structures
> that implement the module system.
> 
      Actually, I had meant environments in the generic sense (what you
supply to code to provide values to variables) rather than any particular
implementation.  At the time I didn't realize there was an 'environment'
type in guile code.

> I myself am not entirely sure how to use the environments, and if we
> need their--although elegant--richness.  When developing the semantics
> of the module system, we should not try to move it into such a
> direction that it fits perfectly with the existing environments
> interfaces.  If it does fit, so much the better, but that should not
> be the main goal.

     Fine by me.  While in principle I think the idea of examinable 
environments is neat, we should think carefully about how features
would interact with a compiler.  I'd say, for non-debugging purposes,
variables/syntax should be visible only if declared "exportable",
and invisible otherwise.
     I am in favor of Waddell/Dybvig's general approach, i.e. modules
provide another form of lexical scoping, evaluate to first class objects
and can be nested.  

> > Also, while I'm aware modules should act
> > orthogonally to objects in terms of introducing namespaces, it seems
> > to me the module system should allow constructs similar to the
> > imperative object-oriented languages.
> 
> Can you elaborate?  Are you talking about data hiding, inheritance,
> etc?
> 
     Yes.  How will generics interact with the module system? How
will classes interact with nested modules?
    I'll try to write some (currently non-interpretable) code to illustrate
what I'm thinking about and how it should evaluate.

> 
> >          The real question (lingering from at least late 2000) seems
> > to be whether lambda abstractions should delay expansion as well as
> > evaluation.  My first impulse is to say it shouldn't, that macros
> > are "evaluated" at read time.
> 
> Yep.  I think we should be careful about defining our 'times'.  'Read
>  [...]
> I think that following the 'read time' (of a Scheme program), we
> should always have a 'compile time' (even when we are going to execute
> the program right away).  During this compile time, Scheme macros are
> expanded.  Then might follow 'execute time'.
> 
      I took another look at R5RS for more insight.  Looks like 
syntax expansion, compilation, and evaluation should be separate stages
(logically if not implementationally).  This seems correct to me.

> > (define (foo x) (bar x))
> > (define (bar x) (+ x 5))
> > (define-syntax bar (syntax-rules () ((_ x) (+ x 5))))
> > 
> >     the 2 definitions of bar work "the same".  However, IMO, the second
> > definition should yield an  error in  (foo 4)  because it's evaluation time 
> > and bar evaluates to a macro,
> 
> Yes, and a block compiler might warn about this in advance.  (Both
> about redefining bar and about using bar as a function and later
> defining it as a macro.)
> 
> Hmm, I wouldn't say "bar evaluates to a macro".  This makes sense
> right now because of the way our evaluator works, but I think it wrong
> to say that 'bar' is evaluated, at all.  It is recognized by the
> compiler (or memoizer) as a macro.

           For a while I was thinking you were wrong, but upon more 
reflection I see you are right.  R5RS does specify them as separate
(Node: Variables; syntactic keywords; and region in r5rs.info).  Chez
5.0b exhibits the following interesting behaviour:
(define foo (lambda (x) (bar x)))
(define bar +)
(define-syntax bar (syntax-rules () ((_ x ...) (+ x ...))))
(foo 5) => 5
(bar 5)  => -5
bar or (cons bar '()) or (define bar +) 
      => error about use of keyword in non-operator position

     It's interesting because the old code continues to refer to the 
variable binding and yet you can't create a new reference to that location.
This seems to be the logical result of requiring the ability to specify syntax
in non-operator positions ("identifier-syntax").  
     My thinking is that while they're logically separate environments, they
should be implemented in one table, where each identifier has two possible
entries (one for syntax and one for variable).
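
A toy model of that one-table arrangement (entirely illustrative; the
representation and all the names are invented for this sketch, not Guile's
actual internals):

```scheme
;; One toplevel table; each identifier maps to a two-slot cell:
;; car = variable binding, cdr = syntax binding.
(define *toplevel* (make-hash-table))

(define (intern! sym)
  (or (hash-ref *toplevel* sym)
      (let ((cell (cons #f #f)))
        (hash-set! *toplevel* sym cell)
        cell)))

(define (define-variable! sym val) (set-car! (intern! sym) val))
(define (define-syntax!   sym tf)  (set-cdr! (intern! sym) tf))

;; Old code can keep a direct reference to a cell, while a fresh lookup
;; behaves Chez-like: in operator position the syntax slot wins;
;; elsewhere a set syntax slot is an error.
(define (lookup sym operator?)
  (let ((cell (intern! sym)))
    (cond ((and operator? (cdr cell)) (cdr cell))
          ((cdr cell) (error "keyword in non-operator position:" sym))
          (else (car cell)))))
```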

> > and 5 is not "syntax".
> 
> I don't understand, could'ya explain?

    I meant it's a value, not a token.  The original message included
a bit about dereferencing a variable location and getting a macro,
and that (a syntax-case macro, anyway) should complain if it's not
acting on "syntax".  Actually, I should have said "4 is not syntax".
    Of course that's not really true as syntax-case is sloppy in what it
will accept to allow for recursive munging.

> >     Mainly, however, I see this as a kind of lexical scoping - if
> > you re-evaluated macros whenever they changed, you have a kind of
> > dynamic scope.  [...]
>
> Hmm, if I remember the discussion right, we were talking about two
> kinds of consistencies of a system, not about scoping disciplines.

     There's a difference?

> Consider the following simple model of interactive development of a
> 'system': you start with a set of files and load them into a freshly
> started Guile process.  You then make changes to the files and load
> them again.  Do this a couple of times.  At the end, you have a new
> set of files, and Guile process in a certain state.  You then start a
> second, fresh Guile process and load the new set of files.
> 
> When the two Guile processes must be in the same state, we would have
> "static consistency".  The state of the system is only determined by
> what is in the files that describes it.  When the two Guile processes
> are allowed to differ in their states, we would have "dynamic
> consistency".  The state of the system is determined by the operations
> performed on the Guile process (in our simple model, the only
> operation was loading a file).
> 
      To me, this description views the files themselves as the source 
of bindings (i.e. a scope and instantiation of a scope at the same time).
That's why the usage is backwards. Since files are dynamic objects (and
completely outside the interpreter's grasp), synchronizing with them is
a dynamic activity.  Consistency with what's been read (which is a static
concept as "what's been read" never changes except in being added
on to) is the static consistency.
    I would actually argue that this brand of "static vs. dynamic" prevents
modules from being completely orthogonal to directories and files.  I.e.
modules should be kept entirely in one file (though more than one module per
file is fine).  Having one module in multiple files is asking for trouble.
Although it may be that the trouble doesn't need to concern the language
implementation and compiler optimizations.

> [...]
> For example, instead of magically reexpanding all uses of a redefined
> macro, we should simply require the user to reload or recompile all
> affected files, if he so desires.  The system can help by providing a
> list of affected files, and doing the whole reloading/recompiling upon
> a simple command.

    Sounds good to me.  

> Also, I think we should extend this to bindings in modules: when the
> list of exported bindings of a module changes (say), the current
> proposal is to automatically correct all existing uses of the affected
> bindings.  I now think it will be better to fix the list of imported
> bindings when executing a 'define-module' or 'use-modules' (if it
> survives) statement.  This is Dirk's signatures idea, and I now
> finally got it, I think.  When you want a changed export list to take
> effect, you need to reload/recompile the files that are affected by
> it.
> 
     I've looked at signatures.texi, but I don't see how this relates.
I'm not actually sure adding bindings to a module _after_ its definition/lexical
scope should be possible at all.  Redefinition of the entire module, yes. 
But then the old definition/bindings would still be in use by the other
modules that used it.
      Another issue - how would you direct the compiler to export C
"bindings" (e.g. a trampoline shared library and header file)?  This
touches on something I've noticed while browsing - why are there 
so many smobs in the core interpreter?  Is it because you can't export 
macros to C automatically, or is there a deeper reason?

Lynn


* Re: Syntax checks
  2002-04-29 23:55         ` Lynn Winebarger
@ 2002-05-07 19:24           ` Marius Vollmer
  2002-05-09  5:59             ` Lynn Winebarger
  0 siblings, 1 reply; 18+ messages in thread
From: Marius Vollmer @ 2002-05-07 19:24 UTC (permalink / raw)
  Cc: Guile Development List

Lynn Winebarger <owinebar@free-expression.org> writes:

> Sorry for taking so long, I've been browsing guile.

Heh, no problem.  I enjoy not getting e-mail more and more... ;-)

> [...] How will generics interact with the module system? How will
> classes interact with nested modules?  I'll try to write some
> (currently non-interpretable) code to illustrate what I'm thinking
> about and how it should evaluate.

My current stance on this is that generics and classes should not be
treated specially by the module system.  Just like functions, generics
and classes are nameless objects that just happen to be accessible via
some name (or more than one, or none).  Thus, modules should only be
there to manage the name-space, not to create new (merged) generics, etc.

>      It's interesting because the old code continues to refer to the
> variable binding and yet you can't create a new reference to that
> location.  This seems to be the logical result of requiring the
> ability to specify syntax in non-operator positions
> ("identifier-syntax").  My thinking is that while they're logically
> > separate environments, they should be implemented in one
> > table, where each identifier has two possible entries (one for syntax and
> one for variable).

Yes, indeed.  Guile is somewhat confused about this right now: it does
symbol -> variable -> macro instead of symbol -> macro.

> > Also, I think we should extend this to bindings in modules: when
> > the list of exported bindings of a module changes (say), the
> > current proposal is to automatically correct all existing uses of
> > the affected bindings.  I now think it will be better to fix the
> > list of imported bindings when executing a 'define-module' or
> > 'use-modules' (if it survives) statement.  This is Dirk's
> > signatures idea, and I now finally got it, I think.  When you want
> > a changed export list to take effect, you need to reload/recompile
> > the files that are affected by it.
> > 
>      I've looked at signatures.texi, but I don't see how this
> relates.

Not much, I'd say.  The main thing I took away from signatures is that
they are compile-time constructs, not run-time constructs.  (If I
understood them right.)  That is, what binding comes from what module
is fixed at compile time, not at load time.  Knowing at compile time
which module to look in for a given symbol is important for macros,
and also for doing important optimizations (such as inlining of
fixnum-+, say).

>  I'm not actually sure adding bindings to a module _after_ its
> definition/lexical scope should be possible at all.  Redefinition of
> the entire module, yes.  But then the old definition/bindings would
> be still in use by the other modules that used it.

Yes, my view as well.

>       Another issue - how would you direct the compiler to export C
> "bindings" (e.g. a trampoline shared library and header file).

What do you mean?  How to export bindings from a module defined in C?

> This touches on something I've noticed while browsing - why are
> there so many smobs in the core interpreter?  Is it because you
> can't export macros to C automatically, or is there a deeper reason?

I don't understand.  How would macros reduce the need for smobs?


* Re: Syntax checks
  2002-05-07 19:24           ` Marius Vollmer
@ 2002-05-09  5:59             ` Lynn Winebarger
  0 siblings, 0 replies; 18+ messages in thread
From: Lynn Winebarger @ 2002-05-09  5:59 UTC (permalink / raw)
  Cc: Guile Development List

On Tuesday 07 May 2002 14:24, Marius Vollmer wrote:
> Lynn Winebarger <owinebar@free-expression.org> writes:
> > [...] How will generics interact with the module system? How will
> > classes interact with nested modules?  I'll try to write some
> > (currently non-interpretable) code to illustrate what I'm thinking
> > about and how it should evaluate.
> 
> My current stance on this is that generics and classes should not be
> treated specially by the module system.  Just like functions, generics
> and classes are nameless objects that just happen to be accessible via
> some name (or more than one, or none).  Thus, modules should only be
> there to manage the name-space, not to create new (merged) generics, etc.

     I've had some trouble coming up with interesting examples.  My problem
is I want to make modules that encapsulate the fields of the object
and then import the fields in methods, but the syntax makes this difficult, 
and so does the MOP.  Of course, it's easy to encapsulate the class and 
be picky about what accessors and methods get exported. 
Essentially, in my view, objects and modules are both lexical environments,
but modules are (generally at least) unique whereas lots of objects might 
provide the same environment (in terms of the variables they close over).
     I'm not sure what you mean by merged generics.  Does GOOPS manage
its own namespace (in terms of finding generics and parent classes) or
do classes and generics really get encapsulated in modules (now and
in principle)?

> >      It's interesting because the old code continues to refer to the
> > variable binding and yet you can't create a new reference to that
> > location.  This seems to be the logical result of requiring the
> > ability to specify syntax in non-operator positions
> > ("identifier-syntax").  My thinking is that while they're logically
> > separate environments, they should should be implemented in one
> > table, where each entry has 2 possible entries (one for syntax and
> > one for variable).
> 
> Yes, indeed.  Guile is somewhat confused about this right now: it does
> symbol -> variable -> macro instead of symbol -> macro.
> 
       I have been looking at the code to see if I can fix it.  There appear
to be some issues with the implementation of fluids that could be fixed
at the same time, though.  See below (I've re-ordered some pieces).

> Not much, I'd say.  The main thing I took away from signatures is that
> they are compile-time constructs, not run-time constructs.  (If I
> understood them right.)  That is, what binding comes from what module
> is fixed at compile time, not at load time.  Knowing at compile time
> which module to look in for a given symbol is important for macros,
> and also for doing important optimizations (such as inlining of
> fixnum-+, say).
> 
> > This touches on something I've noticed while browsing - why are
> > there so many smobs in the core interpreter?  Is it because you
> > can't export macros to C automatically, or is there a deeper reason?
> 
> I don't understand.  How would macros reduce the need for smobs?
> 
      I was specifically thinking of fluids.  Disregard threads for a moment.
It appears (and I emphasize appears, as I am not completely sure) that
fluids in guile only work for global definitions.  If so, that's incorrect.
Any variable, global or lexical, should be fluid-settable (so to speak).  Take
Dybvig's macro definition:
(define-syntax fluid-let
  (syntax-rules ()
    ((_ ((x v)) e1 e2 ...)
     (let ((y v))
       (let ((swap (lambda ()
                     (let ((t x))
                       (set! x y)
                       (set! y t)))))
         (dynamic-wind
           swap
           (lambda () e1 e2 ...)
           swap))))))

Hence
(define x 7)
(define foo (lambda () x))
(let ((x 5))
  (letrec ((bar (lambda () x)))
    (fluid-let ((x 0))
      (list (foo) (bar)))))
=> (7 0)

whereas in the current scheme (I think) you'd get (0 5).  It took me
a while to understand this approach (hygienic macros make "y" just
some generated symbol - new storage - to hide the "real" value
of the variable (lexical or global) until the expressions are done, and
the dynamic-wind ensures that it's seen by any invocation of a continuation
captured inside e1 e2 ...).
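
As an aside (my own sketch, not something from Dybvig): the single-binding
definition above extends to several bindings by simple recursion, with
let*-style sequencing of the swapped-in values.  The name fluid-let* and the
reduction strategy are inventions here, and untested:

```scheme
;; Reduce the multi-binding case to nested uses of the single-binding
;; fluid-let defined above.  Each binding gets its own dynamic-wind,
;; so unwinding restores them in reverse order.
(define-syntax fluid-let*
  (syntax-rules ()
    ((_ () e1 e2 ...)
     (let () e1 e2 ...))
    ((_ ((x v) more ...) e1 e2 ...)
     (fluid-let ((x v))
       (fluid-let* (more ...) e1 e2 ...)))))
```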

     However, there is a problem when you add threads (cooperative or
interruptible): the fluid binding should only apply to the "dynamic
extent" within a single thread (which is what dynamic-root is/was for,
I believe?), and that makes a macro definition more difficult.
     I think the way to attack this is to give each lexical environment
a variable-per-thread as well as an underlying variable, and to have
each variable lookup check the per-thread location first, which might
contain either a reference to the underlying variable or a thread-local
fluid binding.  Macros probably have to be treated the same way to get
fluid-let-syntax to work right.
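
To make the per-thread idea concrete, here is a toy model (purely
illustrative: the representation, the procedure names, and the use of
Guile's hash tables are all assumptions, not how the evaluator actually
stores variables):

```scheme
;; A "variable" is an underlying value cell plus a table of per-thread
;; overlay boxes.  var-ref consults the current thread's overlay first
;; and falls back to the underlying cell.
(define (make-var init)
  (cons init (make-hash-table)))          ; (underlying . overlays)

(define (var-ref var thread)
  (let ((box (hash-ref (cdr var) thread)))
    (if box (car box) (car var))))        ; box is a one-slot list

(define (var-push-fluid! var thread val)  ; enter a fluid binding in `thread'
  (hash-set! (cdr var) thread (list val)))

(define (var-pop-fluid! var thread)       ; leave it again
  (hash-remove! (cdr var) thread))
```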
       By the by, I think this touches on issues in the comments preceding
lookupcar.  If I read it right, the main issue arises when the expansion stage
doesn't properly precede the interpretation stage and two different threads
might try to do a combined expand/execute on the same s-expression.  It
seems to me it would be solved by the proper separation - that is, threads
should only be "sharing s-expressions" through closures stored in common
variables.  One thread should do the reading and expanding before any
storage of the resulting object happens.  The only problem would be if you
purposefully did an eval! operation on a quoted s-expression in two different
threads, in which case you sort of deserve what you get.  I think that applies
to either the coop threads or interruptible threads (I know guile doesn't
work with them now, but if it did it wouldn't make a difference).

> >       Another issue - how would you direct the compiler to export C
> > "bindings" (e.g. a trampoline shared library and header file).
> 
> What do you mean?  How to export bindings from a module defined in C?
> 
   No, I meant if we had a guile compiler, how would we direct it to export
bindings (i.e. header files) and a C-API-compliant shared library that
trampolines into the real code.  And when I say export, I mean that the C
caller wouldn't use SCM values; the exporter would generate translations as
part of the trampoline.  Although this is kind of a specious issue at the
moment.  It's just what I was thinking about in terms of compiled modules
(generally) corresponding to shared libraries, and how guile definitions
could be made "first class" wrt C functions.
    Just to express my opinion, scheme compilers should be written in scheme.  It's
a moral imperative.  Leave the interpreter in C and lift the common data into
a schemey format, autogenerating the C definitions.  That'd be my approach.

Lynn


end of thread, other threads:[~2002-05-09  5:59 UTC | newest]

Thread overview: 18+ messages (download: mbox.gz / follow: Atom feed)
-- links below jump to the message on this page --
2002-04-06  6:25 Syntax checks Dirk Herrmann
2002-04-06 15:38 ` Neil Jerram
2002-04-07  7:09   ` Dirk Herrmann
2002-04-08 18:27     ` Neil Jerram
2002-04-07 10:40   ` Marius Vollmer
2002-04-09 20:48     ` Lynn Winebarger
2002-04-13  9:01       ` Dirk Herrmann
2002-04-13 12:48         ` Neil Jerram
2002-04-13 18:28           ` Lynn Winebarger
2002-04-13 18:10         ` Lynn Winebarger
2002-04-14 18:18           ` Marius Vollmer
2002-04-14 18:11         ` Marius Vollmer
2002-04-23 21:55         ` Thien-Thi Nguyen
2002-04-14 17:52       ` Marius Vollmer
2002-04-29 23:55         ` Lynn Winebarger
2002-05-07 19:24           ` Marius Vollmer
2002-05-09  5:59             ` Lynn Winebarger
2002-04-07 10:05 ` Marius Vollmer
