[ic] IC not responding

John1 list_subscriber at yahoo.co.uk
Tue Nov 15 18:00:29 EST 2005


On Tuesday, November 15, 2005 2:33 PM, sandy at scotwebshops.com wrote:

> John1 wrote:
>
>> From reading around on the web, it would seem that MaxClients 150
>> ought to be enough even for moderately busy web servers.  Does anyone
>> else have a view on this?  Ron, I can understand your idea of raising
>> MaxClients to help with the problem (although I can't try this myself,
>> as my Virtual Server environment restricts the number of concurrent
>> processes I can run).
>>
>> However, I am sure you will agree that raising MaxClients is really a
>> workaround rather than a solution to the underlying problem.  I can
>> only presume that this problem of scripts looking for security holes
>> and hammering websites must be common to every website on the
>> Internet, and therefore presumably every website is bumping up
>> against MaxClients on a regular basis!?  I presume that most
>> websites come back to life as soon as the script robot goes away,
>> but even so, all websites will be brought down by this sort of robot
>> for the duration of its visit.  So, to get to my point, are there
>> any easy ways to stop these script robots racking up the MaxClients
>> count in the first place?  That is, is there anything in Apache or
>> its modules that can be used to spot these "robot attacks" and
>> drop their requests before they cause Apache to spawn loads of
>> processes?  I'd be grateful for any ideas, especially as upping the
>> MaxClients setting is not really an option for me.  Thanks
>
>
> Try mod_evasive, or use the robot-blocking functionality in
> Interchange, in conjunction with iptables, to drop packets from the
> malicious IP.  Most common robots should not kill your webserver.
>
Thanks Sandy, mod_evasive looks just the job :-)  I will look into 
implementing it as soon as I get a chance.
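From a quick skim of the mod_evasive README, the setup appears to be just 
a handful of directives in httpd.conf, something along these lines (the 
values below are the README's own examples, not anything I have tuned or 
tested):

<IfModule mod_evasive20.c>
    DOSHashTableSize    3097
    DOSPageCount        2
    DOSSiteCount        50
    DOSPageInterval     1
    DOSSiteInterval     1
    DOSBlockingPeriod   10
</IfModule>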

In the meantime, I would like to implement the Interchange lockout option 
immediately.  I have only just realised that although I have set 
"RobotLimit 100", I have not defined a Lockout command, so bad robots are 
not actually being locked out.

I am not up on iptables, but from what I can see the Lockout command I 
should use is:

iptables -I INPUT -s %s -j DROP

Is this correct?
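For context, I am assuming this goes in interchange.cfg alongside the 
robot limit, with Interchange substituting the offending IP address for 
%s, i.e. something like:

RobotLimit  100
Lockout     iptables -I INPUT -s %s -j DROP

Please correct me if I have the syntax wrong.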

The problem I see with this is that the IP address is then *permanently* 
locked out.  What is the best way to lock out IP addresses for a given 
timeframe, and then let them back in again?  I would be really grateful 
if anyone has a script to do this that they wouldn't mind sharing.
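Just to show the sort of thing I have in mind, here is a rough sketch 
(completely untested, and the one-hour timeout is an arbitrary number of 
mine):

#!/bin/sh
# lockout.sh - called from the Interchange Lockout directive as
# "lockout.sh %s".  Blocks the offending IP, then queues an at(1)
# job to delete the rule again an hour later.
IP=$1
iptables -I INPUT -s "$IP" -j DROP
echo "iptables -D INPUT -s $IP -j DROP" | at now + 1 hour

I guess a cron job working from a timestamped list of blocked addresses 
would achieve the same thing if at(1) is not available.  Thanks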



