"Phillip J. Eby" <pje@telecommunity.com> writes: > > > Me, I could probably live with: > > def p_expr_term(self,args) [ > spark.rule( > """expr ::= expr + term > term ::= term * factor""" > ) > ]: > return AST(...) Come to think, I tried to do something rather similar while attempting to write an expect clone in python. I would suggest the syntax def p_expr_term(self, args) {rule : """expr ::= expr + term term ::= term * factor"""}: return AST(...) I.e. allow a dictionary constructor expression between the argument list and the colon; this initializes the property dictionary of the code object. One could stick a call to spark.rule() in there, or have a metaclass do it automatically. This allows the square bracket notation to be reserved for [classmethod] and the like. I suppose there's nothing stopping square brackets from being used for both, but I like having consistency with other dictionary constructor expressions. zw