Database statistics and reporting, with or without parsing of built-in logs

1

I would prefer a generic statistics utility / plugin / application, but I am not limiting myself to that; something database-specific might do well too.

I want a way to see query statistics for ALL queries.

e.g. displaying, sorting, or filtering:

  1. all queries ever, or
  2. all queries made on all or specific databases, or
  3. queries on specific tables, or
  4. queries on specific columns, or
  5. queries matching a specific regex or parameters,

sorted by how often each query was made (count) or by how long it took to execute. Some graphical charts would be a plus. Are there any applications or built-in features for such things? (One possible built-in option is sketched below.)
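To illustrate the kind of output I am after: PostgreSQL's pg_stat_statements contrib extension aggregates per-statement counts and timings in memory, with no log parsing. A minimal sketch, assuming the extension is available (it must be listed in shared_preload_libraries, and column names vary somewhat by version):

    -- PostgreSQL: per-statement counters, no log parsing needed
    CREATE EXTENSION pg_stat_statements;

    -- Most frequently executed statements, with total and average time (ms):
    SELECT query,
           calls,
           total_time,
           total_time / calls AS avg_time_ms
    FROM pg_stat_statements
    ORDER BY calls DESC
    LIMIT 20;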

To do that, you'll need to log all queries and then run those logs through some sort of analyzer. This might require a custom-built log parser application.
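For example, in MySQL both the general and the slow query log can be switched on at runtime, and with long_query_time = 0 the slow log records every statement together with its execution time. A sketch (the file paths are hypothetical):

    -- MySQL: general query log, written to a separate disk
    SET GLOBAL general_log_file = '/mnt/log_disk/mysql-general.log';
    SET GLOBAL general_log = 'ON';

    -- Slow query log with a zero threshold captures every statement
    -- along with how long it took, which is more useful for analysis:
    SET GLOBAL slow_query_log_file = '/mnt/log_disk/mysql-slow.log';
    SET GLOBAL long_query_time = 0;
    SET GLOBAL slow_query_log = 'ON';

In PostgreSQL, setting log_min_duration_statement = 0 in postgresql.conf has the same effect of logging every statement with its duration.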

The current server is a high-volume one, handling 500-3,000 queries per second. The log could be stored on a separate HDD.

Questions:

  1. Are there any pre-built applications for such things?

  2. Will it slow down regular queries? If so, roughly how much?

  3. Is there a way to see live queries and do the logging elsewhere, or skip logging altogether and have some other app produce a live stats report? (See the first sketch after this list.)

  4. I might not parse the logs with a database-specific statistics tool; instead I would just take the log file, write my own parser application, and produce stats and graphical charts from it. Maybe I would put the parsed stats in another DB, with the query text + time taken to execute + date of the query, then generate reports from it later. Is this a good idea? (A possible schema is sketched after this list.)

  5. I didn't notice any "live" way of doing all that without logging to a file, though.

  6. Any pre-built apps for MySQL and Postgres? Those are what I use. (A built-in MySQL option is sketched after this list.)

  7. Any strategy recommendations for doing all this? I have just started thinking about how to do performance analysis in detail.
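Regarding question 3: both servers expose currently running statements through a live view, without any file logging. A sketch (the pg_stat_activity columns are the 9.2+ names; older versions use procpid and current_query instead):

    -- MySQL: statements executing right now
    SHOW FULL PROCESSLIST;

    -- PostgreSQL 9.2+: the equivalent live view
    SELECT pid, datname, query, now() - query_start AS runtime
    FROM pg_stat_activity
    WHERE state = 'active';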
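Regarding question 4: the table such a parser writes into could be as simple as the following sketch (all names here are hypothetical):

    -- Hypothetical table for parsed log entries
    CREATE TABLE query_log (
        id          bigserial PRIMARY KEY,
        logged_at   timestamptz NOT NULL,
        dbname      text,
        query_text  text NOT NULL,
        duration_ms numeric
    );

    -- Example report: most expensive queries over the last week
    SELECT query_text,
           count(*)         AS times_run,
           avg(duration_ms) AS avg_ms
    FROM query_log
    WHERE logged_at > now() - interval '7 days'
    GROUP BY query_text
    ORDER BY avg_ms DESC
    LIMIT 20;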
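Regarding question 6: on the MySQL side, the performance_schema in 5.6 keeps per-digest (normalized statement) statistics in memory, similar in spirit to pg_stat_statements. A sketch (timer values are in picoseconds):

    -- MySQL 5.6+: per-normalized-statement counters
    SELECT digest_text,
           count_star            AS calls,
           sum_timer_wait / 1e12 AS total_seconds
    FROM performance_schema.events_statements_summary_by_digest
    ORDER BY count_star DESC
    LIMIT 20;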

Mab

Posted 2013-10-27T17:29:30.133

Reputation: 11

Answers

0

This is a complex problem, and there is probably no easy answer.

One approach I have used is pg_logforward to intercept log messages and forward them to a custom daemon, which writes normalized log entries into another PostgreSQL database. (Hadoop etc. would also be possible.) Then run ad hoc queries against that database, or build yourself a pretty frontend. It's not pre-built by any means, but it's the most powerful approach at the moment, IMO.
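For instance, the destination table the daemon writes into might look something like this sketch (the columns are illustrative only, not part of pg_logforward itself):

    -- Hypothetical table of normalized log entries written by the daemon
    CREATE TABLE log_entries (
        received_at timestamptz NOT NULL DEFAULT now(),
        username    text,
        dbname      text,
        query       text,
        duration_ms numeric
    );

    -- Ad hoc reporting is then plain SQL, e.g. the busiest queries:
    SELECT query, count(*) AS calls
    FROM log_entries
    GROUP BY query
    ORDER BY calls DESC
    LIMIT 20;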

Peter Eisentraut

Posted 2013-10-27T17:29:30.133

Reputation: 6 330