[ic] Large file uploads failing

Gert van der Spoel gert at 3edge.com
Tue Aug 12 19:29:52 UTC 2008


> -----Original Message-----
> From: interchange-users-bounces at icdevgroup.org [mailto:interchange-
> users-bounces at icdevgroup.org] On Behalf Of jimbo
> Sent: Tuesday, August 12, 2008 22:11
> To: interchange-users at icdevgroup.org
> Subject: Re: [ic] Large file uploads failing
> 
> Gert van der Spoel wrote:
> 
> > Not totally sure, but he is probably referring to the HammerLock
> > directive:
> > http://www.interchange.rtfm.info/icdocs/config/HammerLock.html
> 
> I've monitored the upload using tcpdump and can confirm the upload
> completes, as far as Apache is concerned. Once the Interchange CGI link
> program is invoked, the Interchange process goes into 99% CPU mode, the
> memory bounces around like a yo-yo, and it eventually fails. Raising
> the Apache TimeOut directive value has no real benefit, as eventually
> the process dies and I get a 500 error in any case.
> 
> I know this is not a network issue as I've set up a test bed on the
> machine and all uploads are done using a small Perl script that
> generates random large uploads.
> 
> > But you do not get any error messages/failures when doing the upload?
> 
> There are no messages in either the catalog error log or the Apache
> error log until the process fails and Apache logs a premature end of
> script headers.

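For anyone who wants to reproduce this, a test harness along the lines
you describe might look like the sketch below. This is a minimal sketch,
not your actual script: the catalog URL and the form field names
(mv_todo, upfile) are assumptions, so substitute your own.

    #!/usr/bin/perl
    # Minimal sketch of a large-upload test harness. The URL and the
    # form field names are placeholders -- substitute your catalog's
    # process link and upload field.
    use strict;
    use warnings;
    use LWP::UserAgent;
    use HTTP::Request::Common qw(POST);

    my $size  = $ARGV[0] || 50 * 1024 * 1024;      # bytes, ~50 MB default
    my $tmp   = "/tmp/upload-test.$$";
    my $chunk = join '', map { chr int rand 256 } 1 .. 4096;

    open my $fh, '>', $tmp or die "open $tmp: $!";
    print {$fh} $chunk for 1 .. int($size / 4096); # random payload
    close $fh;

    my $ua  = LWP::UserAgent->new(timeout => 1800); # don't time out early
    my $res = $ua->request(POST 'http://localhost/cgi-bin/cat/process',
        Content_Type => 'form-data',
        Content      => [ mv_todo => 'upload', upfile => [$tmp] ],
    );
    print $res->status_line, "\n";
    unlink $tmp;

Running that against the test bed while watching the Interchange process
with top should show where the CPU spike begins.
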
Is the file you are trying to upload supposed to be loaded into a MySQL
database? Or Postgres, or GDBM? Does the behavior differ depending on
which database you use?

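If it does go through the database, one thing worth checking first (my
assumption, nothing from your report) is whether the database itself
accepts a blob of that size; MySQL, for instance, refuses any packet
larger than max_allowed_packet. A quick DBI sketch, with placeholder
credentials and table name:

    #!/usr/bin/perl
    # Rough check that the database layer accepts a blob roughly the
    # size of the failing upload. DSN, user, password, and the table
    # name are placeholders.
    use strict;
    use warnings;
    use DBI;

    my $dbh = DBI->connect('dbi:mysql:testdb', 'user', 'pass',
                           { RaiseError => 1 });

    # MySQL rejects packets larger than max_allowed_packet, so report it.
    my ($var, $max) = $dbh->selectrow_array(
        q{SHOW VARIABLES LIKE 'max_allowed_packet'});
    print "max_allowed_packet = $max\n";

    my $blob = 'x' x (50 * 1024 * 1024);           # ~50 MB payload
    $dbh->do('CREATE TABLE IF NOT EXISTS blobtest (id INT, data LONGBLOB)');
    $dbh->do('INSERT INTO blobtest VALUES (1, ?)', undef, $blob);
    print "insert ok\n";

If that insert hangs or errors the same way, the problem is below
Interchange rather than in it.
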
If you set HammerLock to 10 minutes or more, does that change how long
it takes before the premature end of script headers error shows up?

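For reference, HammerLock takes a number of seconds in interchange.cfg
(the default is 30), so ten minutes would be:

    # interchange.cfg -- HammerLock is in seconds; 600 = 10 minutes
    HammerLock 600

The Interchange daemon needs a restart for the change to take effect.
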
CU,

Gert
