[ic] Most Common Performance Issues

interchange-users@icdevgroup.org
Mon Aug 26 12:30:02 2002


Quoting Michael Stearne (mstearne@entermix.com):
> Barry Treahy, Jr. wrote:
> 
>  > Michael Stearne wrote:
>  >
>  >> Our site is running IC on a DP 800MHz Red Hat box with 512MB of RAM
>  >> and a SCSI disk.  This machine runs only IC.  The site is based on
>  >> the foundation sample site and contains about 300 products using the
>  >> standard (non-RDBMS) database.  We have seen decreasing performance
>  >> over the 6 months the site has been up.  There are up to 4 IC
>  >> processes running at a time because of traffic.  Each page on the
>  >> site takes 2-10 seconds to generate.  This is across all platforms
>  >> and browsers at LAN speed.  If an IC process has the machine to
>  >> itself (only 1 user on the site), that process will take 97% of the
>  >> CPU.  While this is understandable, even when there is only 1
>  >> process, it still takes ~4 seconds to generate a page.  I don't know
>  >> how a machine this powerful can get floored by 4 concurrent users.
>  >>
>  >> Currently we run expireall each day.  This seems to help a little.
>  >>
>  >> What else could I do (configuration, cron, etc) to work on performance?
>  >
>  >
>  > Hi Michael,
>  >
>  > I had many of the defaults enabled, the random and cross components
>  > for example, which would drag down our development system.  Simply
>  > put, those two components really caused IC to generate a heavy CPU
>  > load.  I can't say how much of that to blame on the fact that the
>  > DBs are in DBM rather than SQL, but I would have expected a poorer
>  > database format to show up as an I/O restriction, not a CPU drain.
>  >
>  > Since I too am coming up to speed, perhaps some of the Perl and IC
>  > wizards could shed some light on that?
>  >
>  > Barry
>  >
> I use the random component on many pages.  I wonder if that could be
> an issue.  I kind of need the random component, though, so I hope
> there is a workaround.
> 

That is controlled as much as anything by the size *and indexing* of the
database. If you don't have an indexed database, you will see speed
problems. The cross component should show no special loading problems.
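
If you are on the default DBM tables, I believe you can have Interchange
maintain an index on the columns your searches hit with the INDEX option
in catalog.cfg -- the field name below is only an example, and the exact
syntax is worth double-checking against the database docs for your
version:

    # catalog.cfg -- example only; "category" is a placeholder for
    # whatever field your searches actually filter on
    Database  products  INDEX  category

    # for a SQL-backed table the equivalent is an ordinary index, e.g.
    #   CREATE INDEX products_category ON products (category);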

Random should be reasonable as long as your database is not too large.
If it is large, it will be an unacceptable drain; limit your "random"
products with a field like "select_as_random", then change the search
inside the "random" component to select only those items as candidates,
via a query or a coordinated search.
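
Something like this in the component's search is what I mean -- just a
sketch, assuming a yes/no "select_as_random" column in products; keep
whatever randomization the component already does and only change which
rows it gets to pick from:

    [comment] restrict candidates to rows flagged select_as_random=1 [/comment]
    [loop search="
        fi=products
        st=db
        co=yes
        sf=select_as_random
        se=1
        op=eq
    "]
        [loop-code]  [loop-field description]
    [/loop]

The point is to shrink the candidate pool before anything random
happens instead of scanning the whole table on every page.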

But in Interchange, decreasing performance over time is most likely
associated with the size of a directory. If you have directories with
thousands of file entries, and the program frequently accesses them,
lookup times will kill you.
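
The session and tmp directories are the usual suspects. A couple of
cron entries along these lines keep their entry counts down -- the
paths here assume a tarball install under /usr/local/interchange with
a catalog named "foundation", so adjust them to your layout:

    # expire old sessions nightly (you are already doing this part)
    13 2 * * *  /usr/local/interchange/bin/expireall

    # prune stale scratch files from the catalog's tmp/ directory
    43 2 * * *  find /usr/local/interchange/catalogs/foundation/tmp -type f -mtime +1 -exec rm -f {} \;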

Prefork mode should be a must for sites with any appreciable amount
of traffic.
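
That is a quick change in interchange.cfg plus a restart. Roughly --
the StartServers/MaxServers numbers below are from memory, so check
the configuration reference for your version:

    # interchange.cfg -- run preforked server processes
    PreFork       Yes
    StartServers  5
    MaxServers    10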

-- 
Mike Heins
Perusion -- Expert Interchange Consulting    http://www.perusion.com/
phone +1.513.523.7621      <mike@perusion.com>

Software axiom: Lack of speed kills.