perl -d:DProf

I have been running a simple search engine tool on UKlug and I have noticed that things are getting a bit sluggish due to the number of jobs in the database (300K+). It's not an astronomical amount, but the method I am using is starting to strain against the hardware. I am going to rewrite it (article for another day), but for now, is there anything I could do to speed things up?
When something just isn't running as fast as expected, it's time to break out the Perl profiler. The search engine has a mod_perl front end, which is the first pain in the ass. I am fully conversant with the mod_perl performance tuning guide, but trying to profile mod_perl is not as straightforward as the guide suggests.
Luckily I always use modules for the bulk of the work in any CGI scripts, so I created a mock script that calls out to the modules and ran the profiler against it as a standalone program.
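The mock script only needs to exercise the same code paths the CGI front end does. A minimal sketch of the idea; UKlug::Search and its search method here are hypothetical stand-ins for whatever modules the real front end uses:

#!/usr/bin/perl
use strict;
use warnings;

# Hypothetical module name; substitute the module the CGI script actually uses.
use UKlug::Search;

my $engine = UKlug::Search->new();

# Run a representative query several times so the hot paths dominate
# the profile rather than one-off start-up costs.
for (1 .. 10) {
    my $results = $engine->search('perl programmer');
}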

$ perl -d:DProf mock_script.pl
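Devel::DProf dumps its raw profile data to tmon.out in the current directory; dprofpp then turns that into a readable report, by default listing the 15 most expensive subroutines sorted by time spent:

$ dprofpp tmon.out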

This confirmed my suspicion that the main problem was database access. There are a couple of Perl functions that could be faster, but tuning these when the database is such a bottleneck would be an exercise in futility. I know I have tuned the database to the point where it is not going to get any faster, so everything is pointing at either a faster machine or a rewrite.
It just so happens that I have a faster machine to hand, so I ran the offending SQL with timings on and got the following results.
Slow machine:
Time: 3003.434 ms
Fast machine:
Time: 1683.190 ms
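Those Time: lines match psql's \timing output, so assuming the database is PostgreSQL, the next step in pinning down where a query spends its time would be EXPLAIN ANALYZE. The query below is a hypothetical stand-in, since the real search SQL isn't shown here:

uklug=> \timing
Timing is on.
uklug=> EXPLAIN ANALYZE SELECT id, title FROM jobs WHERE description ILIKE '%perl%';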
This is a marked improvement over the slower machine, but it is still a hellish time to wait for results that have yet to be displayed. So how can I reduce the time taken to retrieve the results? More to follow.
