Re: End result of Wiki-ish system design + final question

From: Martin Moss
Subject: Re: End result of Wiki-ish system design + final question
Date: 16:40 on 14 Feb 2005
Hi Ben,

I have a few thoughts on this... In my experience,
writing a daemon process is easy (well, ish), but
configuring your system to manage the daemons (have
they died, have they crashed, etc.?) is more trouble
than it's worth.

Is it possible to use some kind of cron-based system
which runs a script every minute and picks up a list
of things to process from, say, a database (which your
Handler writes to in order to communicate with the
backend processes)? Of course, a lockfile mechanism
would be needed to ensure that only one cron script
runs at any one time...
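
Very roughly, and assuming a made-up 'regen_queue' table, DSN and
lockfile path (none of these are from your actual setup, so adjust to
taste), the cron script might look something like:

  #!/usr/bin/perl
  use strict;
  use warnings;
  use Fcntl qw(:flock);
  use DBI;

  # Take an exclusive, non-blocking lock so a second cron run
  # exits immediately instead of piling up behind this one.
  open my $lock, '>', '/tmp/regen-worker.lock'
      or die "can't open lockfile: $!";
  exit 0 unless flock $lock, LOCK_EX | LOCK_NB;

  my $dbh = DBI->connect('dbi:mysql:wiki', 'user', 'pass',
                         { RaiseError => 1 });

  # Grab whatever the handler has queued up since the last run.
  my $jobs = $dbh->selectall_arrayref(
      'SELECT id, page FROM regen_queue ORDER BY id',
      { Slice => {} });

  for my $job (@$jobs) {
      regenerate_page($job->{page});
      $dbh->do('DELETE FROM regen_queue WHERE id = ?',
               undef, $job->{id});
  }

  # Stand-in for the real regeneration code.
  sub regenerate_page { warn "would regenerate '$_[0]'\n" }

The LOCK_NB bit is what stops overlapping runs: a second invocation
can't get the lock, so it just exits and lets the next minute's run
pick things up.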

Regards

Marty


 --- ben syverson <ben@xxxxxxxxxxx.xxx> wrote: 
> 
> On Feb 12, 2005, at 9:44 PM, ben syverson wrote:
> 
> > Maybe the solution is to have 5 or 10 perl processes fire up and
> > stay open as daemons, processing these background regen requests?
> 
> After testing this, that looks like the way to go. The regen code now
> lives as a pre-forking server, accepting connections via either TCP or
> UN*X socket. This opens up the interesting possibility of dedicating a
> server (or multiple servers) just to the regeneration process.
> 
> - ben
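
That said, if you do stick with the pre-forking daemon approach Ben
describes above, something like Net::Server::PreFork takes care of
most of the forking and child management for you. A minimal sketch,
where the port, worker counts and regenerate_page() are placeholders
rather than anything from Ben's actual code:

  #!/usr/bin/perl
  package RegenServer;
  use strict;
  use warnings;
  use base 'Net::Server::PreFork';

  sub process_request {
      my $self = shift;
      # Net::Server ties the client socket to STDIN/STDOUT here,
      # so each line a client sends names a page to regenerate.
      while (my $page = <STDIN>) {
          chomp $page;
          regenerate_page($page);
          print "OK $page\n";
      }
  }

  # Stand-in for the real regeneration code.
  sub regenerate_page { warn "would regenerate '$_[0]'\n" }

  # Start 5 workers, capped at 10, listening on TCP port 8042.
  RegenServer->run(port => 8042, min_servers => 5, max_servers => 10);

Net::Server can also listen on a UNIX-domain socket instead of TCP if
the regen server stays on the same box.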


