Cheap and easy Drupal MySQL query log analyses and profiling

One thing that will definitely improve in Drupal 6.x and onwards is performance relating to database queries. It is not uncommon for a site with the usual formula of a few Views-generated blocks, the translation module, and a couple of other miscellaneous Views entries to suddenly need 200 to 600 database queries just to generate *one page*.

Fortunately, Drupal is continuously improving its caching methods and data abstraction models, so it doesn't always have to hit the database for something simple, or it can return multiple items in one hit. But I digress.

Here's a simple way to analyse what is hitting your database. This assumes you have the log parameter set in your my.cnf, i.e.

log             = /var/log/mysql/mysql-queries.log

and the GNU command line tools (most Linux and OS X installs?). Remember: having the log enabled in your MySQL configuration will slow performance, but right now we don't care, because we just want to see what is hitting the database.
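If you're not sure whether logging is already on, you can ask the server directly. A quick sketch, with placeholder credentials (adjust for your own setup):

```shell
# Hypothetical check: list the log-related server variables.
# The -u root -p credentials are placeholders for your own.
mysql -u root -p -e "SHOW VARIABLES LIKE 'log%'"
```

If the general log shows as off, add the my.cnf line above and restart MySQL.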

OK, so first off, reset your MySQL log file back to 0 bytes:

> /var/log/mysql/mysql-queries.log

In your browser, load one page of your Drupal site.

Now back in your terminal, make sure you are in your MySQL log directory (i.e. /var/log/mysql), and run:

grep SELECT mysql-queries.log |sed 's/.*SELECT/SELECT/g'|sed 's/WHERE.*$//g'|sort|uniq -c|sort -n|tail -n 20

You will see the top 20 MySQL queries.

eight:/var/log/mysql# grep SELECT mysql-queries.log |sed 's/.*SELECT/SELECT/g'|sed 's/WHERE.*$//g'|sort|uniq -c|sort -n|tail -n 20

      7 SELECT * FROM view_view
      7 SELECT mid, block_title, page_title, name FROM members_page
      7 SELECT pages, visibility FROM blocks
      7 SELECT vid, block_title, page_title, name FROM view_view
      8 SELECT data, created, headers, expire FROM cache_filter
      9 SELECT data, created, headers, expire FROM cache_views
      9 SELECT * FROM term_data
      9 SELECT path FROM taxonomy_redirect
      9 SELECT v.*, n.type FROM vocabulary v LEFT JOIN vocabulary_node_types n ON v.vid = n.vid ORDER BY v.weight,
     10 SELECT b.buddy,, u.mail, u.uid FROM buddylist b
     10 SELECT, f.type, v.value FROM profile_fields f INNER JOIN profile_values v ON f.fid = v.fid
     10 SELECT * FROM location
     10 SELECT * FROM users u
     10 SELECT latitude,longitude FROM location
     11 SELECT r.rid, FROM role r INNER JOIN users_roles ur ON ur.rid = r.rid
     18 SELECT data, created, headers, expire FROM cache
     57 SELECT *
     70 SELECT r.rid, FROM role r
     94 SELECT dst FROM url_alias
    141 SELECT s.lid, t.translation FROM locales_source s INNER JOIN locales_target t ON s.lid = t.lid

What does that command do? It is a series of small commands chained together with pipes (|):

grep SELECT mysql-queries.log - get all the SELECTs
sed 's/.*SELECT/SELECT/g' - strip any leading text, whitespace, junk, etc.
sed 's/WHERE.*$//g' - strip anything after "WHERE". This groups rows by the general shape of the query rather than by the variables in it, which helps you see which tables are being hit; remove this step for more verbose results.
sort - sorts the results so duplicate rows sit next to each other (required for uniq)
uniq -c - collapses the duplicate rows, prepending a count to each
sort -n - sorts them by the count at the start of each line
tail -n 20 - shows the last 20 results
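The steps above can be tried out on a tiny sample file before pointing them at a real log. The file below is a made-up stand-in for mysql-queries.log, just to show how the WHERE-stripping groups queries:

```shell
# Hypothetical sample log lines, standing in for mysql-queries.log
cat > /tmp/sample-queries.log <<'EOF'
3 Query  SELECT dst FROM url_alias WHERE src = 'node/1'
4 Query  SELECT dst FROM url_alias WHERE src = 'node/2'
5 Query  SELECT * FROM users u WHERE u.uid = 1
EOF

# Same pipeline as above, run against the sample file. The two
# url_alias queries collapse into one row with a count of 2,
# because everything from WHERE onwards has been stripped.
grep SELECT /tmp/sample-queries.log \
  | sed 's/.*SELECT/SELECT/g' \
  | sed 's/WHERE.*$//g' \
  | sort | uniq -c | sort -n | tail -n 20
```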

So you can see the top 3 hits are for the locales module (which has lots of improvements in 6.x), roles, and url_alias.

There are probably lots of other MySQL log analysis tools out there; this is just an easy one that you can execute on most servers.

Another idea: you can use this to help fine-tune your MySQL table indexes based on what is actually being hit.
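For example, if url_alias lookups dominate as they do above, EXPLAIN will tell you whether they are using an index. A rough sketch, assuming Drupal's url_alias table (src/dst columns) in a database called drupal - the database name, credentials, and index name are all placeholders:

```shell
# Hypothetical: see whether the hot query can use an index
# ('drupal' and the credentials are placeholders for your own).
mysql -u root -p drupal -e "EXPLAIN SELECT dst FROM url_alias WHERE src = 'node/1'"

# If the 'key' column comes back NULL, an index on src may help:
mysql -u root -p drupal -e "ALTER TABLE url_alias ADD INDEX url_alias_src (src)"
```

Re-run the EXPLAIN afterwards to confirm the key column now names your index.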