
I have several servers (Linux, various distributions, managed by different people) whose logs I would like to centralize in splunk>. The logs are gathered in /var/log, but the sources either update them directly (Apache, for instance) or via rsyslog. In other words, I can assume that the logs will be in one place, but the way they appear there is not well defined (and changes between servers).

I am therefore looking for a way to handle /var/log as a whole by generating a daily delta which I would then send to splunk>. I can write a script which would do such things (parse the tree, gather the files, add to an archive, zero them, etc.) but I am sure this problem has already been resolved in a better way (something along the lines of logrotate, but for a whole directory).
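
For illustration, a minimal sketch of what such a script could look like (hypothetical state-file and output locations, naive handling of rotation; it is only meant to show the idea, not something I run today):

```python
#!/usr/bin/env python3
"""Sketch of a daily /var/log delta collector (illustration only).

Walks /var/log, remembers the size of each file it has already seen in a
small JSON state file, and copies only the newly appended bytes into a
dated tar.gz archive that could then be fed to splunk>. Rotated or
truncated files are simply re-read from the beginning.
"""
import io
import json
import os
import tarfile
import time

LOG_ROOT = "/var/log"
STATE_FILE = "/var/lib/logdelta/state.json"   # hypothetical location
DELTA_DIR = "/var/lib/logdelta/deltas"        # hypothetical location


def load_state():
    try:
        with open(STATE_FILE) as fh:
            return json.load(fh)
    except (OSError, ValueError):
        return {}


def collect():
    state = load_state()
    os.makedirs(DELTA_DIR, exist_ok=True)
    os.makedirs(os.path.dirname(STATE_FILE), exist_ok=True)
    archive = os.path.join(DELTA_DIR,
                           "var-log-%s.tar.gz" % time.strftime("%Y-%m-%d"))
    with tarfile.open(archive, "w:gz") as tar:
        for root, _dirs, files in os.walk(LOG_ROOT):
            for name in files:
                path = os.path.join(root, name)
                try:
                    size = os.path.getsize(path)
                except OSError:
                    continue                      # vanished or unreadable
                offset = state.get(path, 0)
                if size < offset:                 # rotated or truncated
                    offset = 0
                if size == offset:
                    continue                      # nothing new since last run
                with open(path, "rb") as fh:
                    fh.seek(offset)
                    data = fh.read(size - offset)
                # store the delta under the original path inside the archive
                info = tarfile.TarInfo(name=path.lstrip("/"))
                info.size = len(data)
                info.mtime = int(time.time())
                tar.addfile(info, io.BytesIO(data))
                state[path] = size
    with open(STATE_FILE, "w") as fh:
        json.dump(state, fh)


if __name__ == "__main__":
    collect()
```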

(Note: following up on the comments, I want to stress the fact that I have no control over, nor knowledge of, the files which will be created in /var/log. Specifically, I do not want to rely on solutions which require me to configure the handling log file by log file.)

WoJ
  • Why not use rsyslog (or similar) to send all the logs to a centralised log server? – user9517 Jun 13 '14 at 11:32
  • @Iain: that would have been the easiest solution (and directly implemented in splunk from the receiving side) but not all of the logs are created by `rsyslog`. Some are updated by the application itself (like the example I gave - Apache) and I also cannot assume that I will know all the services which log to a file in `/var/log`. – WoJ Jun 13 '14 at 11:36
  • http://serverfault.com/questions/550070/send-apache-access-logs-to-syslog – user9517 Jun 13 '14 at 11:38
  • How about using Splunk? We do, and it's changed the way we work. – Chopper3 Jun 13 '14 at 11:39
  • http://www.rsyslog.com/doc/imfile.html – user9517 Jun 13 '14 at 11:45
  • Thanks for the comments, but as mentioned in the comment of the first answer I do not know which files will be present in `/var/log` - this is why I need to handle it as a whole. I will stress and highlight this in my question. – WoJ Jun 13 '14 at 12:16
  • So the first problem is that you don't know which processes are creating logs on your server - that seems a fairly urgent issue to me. – Jenny D Jun 13 '14 at 12:25
  • @JennyD: no, this is not a fairly urgent issue. The systems are handled by various people (they do know what will appear), the servers are peripheral helpers, not key servers, etc. The background story is long but the TL;DR version is that it would be good to be able to feed these various logs to splunk> in a reasonably simple way, with the constraints mentioned in my question. – WoJ Jun 13 '14 at 12:29
  • @WoJ I've been in that exact case. I set up a centralized log server and left it up to the people handling the systems to make sure that any log they created was configured to be forwarded to the log server. This is a policy question, not a technical one. – Jenny D Jun 13 '14 at 12:33
  • @JennyD: I do not know about your case but mine, details aside, is that I need to have these logs centralized without relying on the admins to always configure the forwarding - and a technical solution to that. – WoJ Jun 13 '14 at 12:35

2 Answers


You should be able to do this by using rsyslog on your systems to send all the logs to a centralised log server.

  • For applications that log via syslog this is fairly straightforward.
  • For applications that log directly to files, rsyslog provides a Text File Input Monitor module that sends lines from a text file to rsyslog for processing.

Regarding logrotate, it's fairly straightforward to configure but you'll have to analyse each log file and configure logrotate appropriately.

user9517
  • 114,104
  • 20
  • 206
  • 289
  • Thanks - but the Text File Input Monitor module forces me to handle each file separately. I do not know which files will be created, this is why I wanted to handle `/var/log` as a whole (including all the changes: files appearing and disappearing) – WoJ Jun 13 '14 at 12:14
  • I will end up building a wrapper around the Text File Input Monitor to dynamically create configuration entries for each non-rsyslogd-managed file (rough sketch below). Thanks! – WoJ Jun 13 '14 at 13:49
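
For reference, a rough sketch of such a wrapper (hypothetical drop-in path and tag scheme; it uses the legacy imfile directives from the documentation linked above, and does not yet exclude files that rsyslog itself already writes):

```python
#!/usr/bin/env python3
"""Sketch of a wrapper that generates rsyslog imfile entries (illustration only).

Scans /var/log and writes one imfile input stanza per plain-text file into
an rsyslog drop-in file, so files that appear later are picked up on the
next run without per-file manual configuration. Uses the legacy directive
syntax described at http://www.rsyslog.com/doc/imfile.html.
"""
import os

LOG_ROOT = "/var/log"
OUTPUT = "/etc/rsyslog.d/60-imfile-autogen.conf"   # hypothetical drop-in
SKIP_SUFFIXES = (".gz", ".xz", ".0", ".1")         # skip rotated archives

STANZA = """\
$InputFileName {path}
$InputFileTag {tag}:
$InputFileStateFile imfile-{tag}
$InputFileSeverity info
$InputFileFacility local7
$InputRunFileMonitor
"""


def stanzas():
    yield "$ModLoad imfile"
    for root, _dirs, files in os.walk(LOG_ROOT):
        for name in sorted(files):
            path = os.path.join(root, name)
            if path.endswith(SKIP_SUFFIXES):
                continue
            # derive a tag / state-file name that is unique per path
            tag = path.lstrip("/").replace("/", "-").replace(".", "-")
            yield STANZA.format(path=path, tag=tag)


def main():
    with open(OUTPUT, "w") as fh:
        fh.write("\n".join(stanzas()) + "\n")
    # a real wrapper would then reload rsyslog (e.g. "service rsyslog
    # restart") and filter out files that rsyslog already writes itself


if __name__ == "__main__":
    main()
```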

I'm using Splunk at work to handle logs from a bunch of servers, some Linux and some Windows.

I highly advise you to look at the Splunk Universal Forwarder. With it you can choose which logs to send and basically create any log-handling scenario you need. It's much easier to tune than rsyslog, and there is a deployment server that can help you manage the forwarders later on.

Look at the Splunk website; there is a simple explanation of how to start using it.

Hope it will be helpful to you.