2

Just downloaded a trial of Splunk, and I'm thinking of using it to monitor a Windows server estate, along with the associated apps, e.g.:

o Windows event logs / WMI queries (for the Windows OS, SQL Server, Exchange, etc.)
o Apache/JBoss/Tomcat logs
o Oracle listener/db/etc logs
o Home-grown log files

Any short-but-sweet advice or gotchas?
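
For context, the forwarder-side inputs I have in mind would look roughly like the sketch below; the paths, index names and sourcetypes are just placeholders, and I gather the WMI collection is configured separately in wmi.conf:

    # inputs.conf on each Windows host (rough sketch; placeholder paths/indexes)
    [WinEventLog://Application]
    disabled = 0
    index = wineventlog

    [WinEventLog://Security]
    disabled = 0
    index = wineventlog

    # Apache/JBoss/Tomcat logs
    [monitor://C:\Apache\logs\access.log]
    sourcetype = access_combined
    index = web

    [monitor://C:\Tomcat\logs]
    sourcetype = tomcat_app
    index = web

    # Oracle listener/alert logs
    [monitor://D:\oracle\diag]
    sourcetype = oracle_alert
    index = oracle

    # Home-grown application logs
    [monitor://D:\apps\myapp\logs]
    sourcetype = myapp
    index = apps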

Simon Catlin

3 Answers

2

Once you need it, then you have to pay for it :P That's what got me.

However, it's a really good application/tool.

Overall, you will probably be surprised by how much you end up logging (I was)...

Arenstar
  • Haven't even looked at the costings yet... If I can convince myself it's a useful tool, I'll then concern myself with convincing 'da management'. – Simon Catlin Nov 14 '10 at 19:01
  • How many servers do you have? It's quite expensive IMHO. I'm often using Logzilla for simple syslog stuff, but it's definitely not as complex as Splunk; it just depends on whether you need the features it provides. – Arenstar Nov 14 '10 at 19:06
  • We have just under 250 servers, two-thirds of which are virtualised. – Simon Catlin Nov 16 '10 at 22:32
  • Ohh, OK, lots of logs then; you will certainly need to pay for Splunk. As I mentioned, it's worth the money if you use the features. :) If you just need simple log querying etc., syslog-ng -> MySQL works too. :D – Arenstar Nov 16 '10 at 22:41
  • Thanks for the info - it's appreciated. More trialling underway... – Simon Catlin Nov 18 '10 at 22:33
2

I started off by loading a day of data for one of our applications and spending a few weeks just coming up with questions about the data that I wanted answered: how many transactions per second does a particular customer generate, how busy are the busiest times for different transaction types, how can I search the logs for SLA violations, stuff like that.
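
A few of those ended up as searches along these lines (the index, sourcetype and field names below are from our own logs, so treat them as placeholders):

    Transactions per second for one customer:
        index=apps sourcetype=myapp customer="ACME" | timechart span=1s count

    Peak load per transaction type:
        index=apps sourcetype=myapp | timechart span=15m count by transaction_type

    SLA violations (response_time in milliseconds):
        index=apps sourcetype=myapp | where response_time > 5000 | stats count by host, transaction_type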

It's a bit surprising how easy a lot of things are to search for, and the more I searched, the more ideas I had for new searches. Before long you'll have quite a catalog of saved searches.
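
Once a search earns its keep, saving and scheduling it is just another stanza; a savedsearches.conf entry looks roughly like this (the search itself is only an example):

    [SLA violations - daily report]
    search = index=apps sourcetype=myapp | where response_time > 5000 | stats count by host
    dispatch.earliest_time = -24h@h
    dispatch.latest_time = now
    enableSched = 1
    cron_schedule = 0 7 * * *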

The gotcha that got me in the beginning was making sure the timestamp and hostname data are correct at index time. Some of our custom logs were not timestamped in a friendly format, and it took a few iterations to get them indexed correctly. Be sure to index a few small samples first to ensure everything looks correct before indexing a large collection of logs.
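
For our awkward custom logs the fix was to spell out the timestamp handling in props.conf for that sourcetype; something along these lines, assuming a log whose lines start with "2010-11-14 18:59:01,123" (the sourcetype name and format string are just examples):

    [myapp]
    TIME_PREFIX = ^
    TIME_FORMAT = %Y-%m-%d %H:%M:%S,%3N
    MAX_TIMESTAMP_LOOKAHEAD = 23
    TZ = Europe/London
    SHOULD_LINEMERGE = false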

But, yeah, just imagine the questions you want answered about your data.

Cakemox
  • Thanks for that. I am playing with one sample host at present. Interestingly, it's indexed as two different entities, one upper-case, one lower-case. Now that I know the host names are case-sensitive, I'll be sure to maintain a standard in future. – Simon Catlin Nov 14 '10 at 19:00
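
A search-time workaround for data that has already been indexed under both cases is to fold the host field to lower case, e.g.:

    index=* | eval host=lower(host) | stats count by host

Going forward, the host value can also be pinned explicitly on each forwarder in inputs.conf ("myserver01" below is just a placeholder):

    [default]
    host = myserver01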
1

Splunk is great. Be SURE to know all the pricing before you get into it. It is really expensive.