At 11:29 AM 6/12/03 +0100, Moore, Paul wrote:
>From: Moore, Paul
> > Specifically, requirement (e) in the PEP considers precisely
> > the case where neither the "protocol" nor the "object" need
> > to know about the adapt() mechanism. Sadly, the reference
> > implementation provided does not support requirement (e).
>
>And doubly sadly, I see that PyProtocols doesn't either. I'd
>suggest that you add this, and then maybe your "chicken and egg"
>issues will go away. Or maybe I've missed something :-)

Open protocols solve the chicken-and-egg problem by allowing one to make declarations about third-party objects. The problem with __adapt__ and __conform__ is, as you say, that people have to write things to support them. PyProtocols bootstraps this by providing plenty of objects, base classes, adapters, and wrappers that support __adapt__ and __conform__.

While it's true that you can't use 'adapt(ob,file)' meaningfully, there's nothing stopping you from saying 'declareImplementation(file,[IFile])' and 'adapt(ob,IFile)'. (And the user can say 'adviseObject(ob,provides=[IFile])' before passing you 'ob'.)

The point of the open protocol mechanism is that you can make these declarations relative to a specific package's interfaces, *without* requiring there to be a common standard. (Because you can define the relationships between different packages' interfaces.)

To put it another way... it isn't necessary that 'file' be a protocol in the Python core. If I write code that needs a file, I define a protocol that means whatever I want it to mean, and if files satisfy that requirement, I declare that 'file' satisfies it. If a user has an object that they believe satisfies my requirement, they declare that it does (whether they created it or somebody else did). As long as I use an "open" protocol (one that will accept declarations), this works.

At present, there are only three implementations of open protocols that conform to PyProtocols' API: protocols derived from the PyProtocols base type, Zope interfaces, and Twisted interfaces. However, in principle one could use any other interface type that has the ability to be told what objects support or adapt to the interface. To use such a type with PyProtocols, one would need an adapter that translates its current API to the PyProtocols API, like the two I wrote for Zope and Twisted.

Anyway, PyProtocols provides enough functionality around PEP 246 to allow a great deal of useful development to happen without ever writing an __adapt__ or __conform__ method. So, from my point of view it breaks the chicken-and-egg problem: if you write an app that uses PyProtocols to define interfaces and adapt to them, then end users can extend it. They don't have to write __conform__ methods, and you don't have to write __adapt__ methods. You both use the declaration API, unless you need to do something very unusual indeed.

The declaration API (all the functions named 'declareSomething' or 'adviseSomething') is the main addition to PEP 246, aside from some minor refinements to the specification of adapt().
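To make the scenario concrete, here's a minimal sketch of the above in terms of the declaration API as quoted. 'IFile' and 'ThirdPartyStream' are made-up names for illustration, 'file' is the Python 2 built-in type, and the calls simply mirror the ones shown above:

    from protocols import Interface, adapt, adviseObject, declareImplementation

    class IFile(Interface):
        """Whatever *my* code needs 'file-like' to mean (read/write)."""

    # Declare that instances of the built-in 'file' type satisfy IFile,
    # even though 'file' knows nothing about adapt() or __conform__:
    declareImplementation(file, [IFile])

    class ThirdPartyStream:
        """Hypothetical object from somebody else's library."""
        def read(self): return ''
        def write(self, text): pass

    ob = ThirdPartyStream()

    # The user who holds 'ob' declares that it satisfies my requirement,
    # without touching its class or writing a __conform__ method:
    adviseObject(ob, provides=[IFile])

    # My code just adapts to its own protocol; per PEP 246, an object
    # that already provides the protocol is returned unchanged:
    assert adapt(ob, IFile) is ob

Neither 'file' nor 'ThirdPartyStream' had to know about the adaptation machinery; all the knowledge lives in the declarations.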
That is where I'm most interested in third-party critique: especially from developers of interface systems other than the ones I've already written adapters for, and, for that matter, from the Zope and Twisted developers, for their perspective on whether PyProtocols' architecture would be reusable for them to support additional capabilities in their interface types (since PyProtocols' declaration API is a superset of the declaration facilities currently provided for Zope and Twisted interfaces).