Anthony wrote:

>> Sure - I get that. There's a couple of reasons for me doing it. First is gpg
>> signing the release files, which has to happen on my local machine. There's
>> also the variation in who actually builds the releases; at least one of the
>> Mac builds was done by Bob I. But there could be ways around this. I don't
>> want to have to ensure every builder has scp or ssh access.

the former isn't much of a requirement, really. I would be surprised to find a
developer that didn't already have it on all machines, or didn't know how to
get it off the internet (type "putty download" into google and click "I feel
lucky").

>> all "go live" at once. A while back, the Mac installer would follow up "some
>> time" after the Windows and source builds. Every release, I'd get emails
>> saying "where's the mac build?!"

that's a worthwhile goal, now that we have plenty of build volunteers, but I
think that could be solved simply by delaying the *public* announcement until
everything is in place. this is open source, after all - we don't need to hide
how we're doing things.

Bob Ippolito wrote:

> With most consumer connections it's a lot faster to download than to
> upload. Perhaps it would save you a few minutes if the contributors
> uploaded directly to the destination (or to some other fast server)
> and you could download and sign it, rather than having to scp it back
> up somewhere from your home connection.

that's another interesting advantage of a more asynchronous release process.
if we can reduce the costly parts to a few 8-minute slots, it's a lot easier
for any busy developer to find the time, even on a hectic day. and if we can
distribute those slots, things will be even easier.

</F>
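
for illustration, a minimal sketch of the download-sign-upload flow Bob describes,
written as a small python script; the host name, path and tarball name below are
made up, but the point stands: only the tiny detached .asc signature has to travel
back up the slow home link.

    import subprocess

    # hypothetical locations and file name - adjust to taste
    REMOTE = "downloads.example.org:/srv/downloads"
    TARBALL = "Python-2.5.tgz"

    # 1. pull the contributor-uploaded build down (the fast direction)
    subprocess.run(["scp", REMOTE + "/" + TARBALL, "."], check=True)

    # 2. produce a detached, armored gpg signature locally (creates TARBALL + ".asc")
    subprocess.run(["gpg", "--armor", "--detach-sign", TARBALL], check=True)

    # 3. push only the small .asc file back up (the slow direction, but ~1 KB)
    subprocess.run(["scp", TARBALL + ".asc", REMOTE], check=True)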