[ic] A new User Tracking - RFC

Jon Jensen jon at endpoint.com
Fri Mar 10 12:56:32 EST 2006


On Fri, 10 Mar 2006, Paul Jordan wrote:

> Is doing an UPDATE query on almost every page a bad idea? If it is, is 
> it better or worse than the logging that already happens on most pages 
> with the current "usertrack" method?

Database writes are slower than log file appends.

It may still be fast enough, depending on your traffic level, whether or 
not you have a dedicated logging database server, etc. A really important 
question is concurrency. If you're using MySQL with MyISAM tables, any 
write locks the entire table. So you're likely to see things bog down 
under load. With PostgreSQL, and possibly with MySQL using InnoDB, you 
should be able to do frequent INSERT operations cheaply, but an UPDATE 
will be slower and may block on that row.
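
If you do go the database route, an append-only table is the cheapest 
shape for it: one INSERT per hit, aggregate later, never UPDATE a 
per-user row from the request path. Here's a rough sketch of that idea 
in Python (stdlib sqlite3 just for illustration; the table and column 
names are made up, and the same shape carries over to PostgreSQL):

    import sqlite3
    import time

    conn = sqlite3.connect("tracking.db")
    conn.execute("""
        CREATE TABLE IF NOT EXISTS page_hits (
            session_id TEXT,
            path       TEXT,
            hit_time   REAL
        )
    """)

    def log_hit(session_id, path):
        # Append-only: one INSERT per page view and no row is ever
        # updated, so concurrent writers never fight over the same row.
        conn.execute(
            "INSERT INTO page_hits (session_id, path, hit_time) "
            "VALUES (?, ?, ?)",
            (session_id, path, time.time()),
        )
        conn.commit()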

Another problem is that if you have any database trouble, your logging 
won't work, either. It's nice to see database errors logged to a text file 
rather than causing errors themselves. :)

> My alternative to this is to have a nightly cron job go through the new 
> files in orders/session/* and grab the treasure trove of information 
> there.

That would be a nice way of doing it. You could do that right before your 
regularly scheduled purge of old sessions.
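
The cron side can be as simple as walking the session directory and 
picking up anything newer than the last run. An untested sketch in 
Python; the directory name is the one from your message, and the stamp 
file and the extract function are placeholders for whatever actually 
decodes your sessions and stores the data:

    import os

    SESSION_DIR = "orders/session"   # per your message; adjust per catalog
    STAMP_FILE  = ".last_harvest"    # placeholder: when we last ran

    def extract_tracking_data(path):
        # Placeholder: decode the session file and load whatever you
        # want to keep into your reporting database.
        print("would harvest", path)

    def harvest():
        try:
            since = os.path.getmtime(STAMP_FILE)
        except OSError:
            since = 0.0
        for name in os.listdir(SESSION_DIR):
            path = os.path.join(SESSION_DIR, name)
            if os.path.isfile(path) and os.path.getmtime(path) > since:
                extract_tracking_data(path)
        # Touch the stamp so the next run only looks at newer files.
        with open(STAMP_FILE, "w"):
            pass

    if __name__ == "__main__":
        harvest()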

> Now, doing all this seems fairly easy, which makes me think someone has 
> already done it, or tried to. I am looking for your experience, and for 
> you to point out if this might be a bad idea. I know it will work, I 
> just don't know if it's a bad idea :-)

I generally reserve database-level tracking for things like affiliate 
hits, cart-related stuff, etc. that don't happen on every single page of 
the site. You can always parse your text file logs and load them into a 
database later; that way you avoid the concurrency hit and can do the 
work at a non-busy time, or on a separate internal log-parsing machine.
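
For that after-the-fact loading, something along these lines is usually 
enough. Again Python, again untested; I'm assuming a tab-separated log 
of timestamp, session and path, so adjust the split to whatever your 
usertrack lines actually contain:

    import sqlite3

    def load_log(log_path, db_path="tracking.db"):
        conn = sqlite3.connect(db_path)
        conn.execute(
            "CREATE TABLE IF NOT EXISTS hits "
            "(ts TEXT, session TEXT, path TEXT)"
        )
        with open(log_path) as fh:
            rows = (line.rstrip("\n").split("\t", 2) for line in fh)
            # The bulk insert happens off-line, so the web servers never
            # pay the write cost at request time.
            conn.executemany(
                "INSERT INTO hits (ts, session, path) VALUES (?, ?, ?)",
                (r for r in rows if len(r) == 3),
            )
        conn.commit()
        conn.close()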

HTH,
Jon

--
Jon Jensen
End Point Corporation
http://www.endpoint.com/

