Good day!
I have a FreeBSD 11.1 + Rsyslog + MySQL installation, and I need a web interface for viewing logs. Almost every manual suggests LogAnalyzer for this, but it doesn't exist in ports. Maybe it was deleted some time ago?
So what software can I use for centralized log viewing through a web interface?
Echofish of course
https://echothrust.github.io/echofish/
if you want to go that route, but IMHO you are doing everything wrong. Where to begin? Firstly, rsyslog is a Linux logging daemon which should not be used on other OSs. Personally I run a syslog-ng server, but for small installations OpenBSD's syslog server is really sufficient. FreeBSD's syslogd sucks both as a client and particularly as a server. Secondly, an SQL database is ill-suited for storing log data that is written at high frequency. Just for the sake of comparison with the ELK stack mentioned by SirDice: notice that Logstash is used for data collection, so rsyslog, syslog-ng, and syslogd are not necessary. ELK also uses an Elasticsearch cluster to store the log data, and finally Kibana is used to display it. The whole ELK stack is a pretty nifty thing to impress upper management and keep your job, but it is useless.
https://wikitech.wikimedia.org/wiki/Logstash
Namely, the real problem is not collecting the log files (syslog-ng does a good job of that) but parsing them and making sense of the data. The only product that addresses this problem seriously is Splunk:
https://www.splunk.com/
As a matter of disclosure, I will say that a bunch of my former classmates from my PhD math studies work there. Splunk costs a lot of money, so what can a person with a zero budget do, short of becoming an expert in anomaly detection (machine learning) and writing their own application? Not much, if you ask me.
You can look into ELK, but you will find out that you have to teach it how to parse the logs to extract any useful information. You can try fluentd and you will end up with the same shit. Quickly you will come to the conclusion that if you have to teach the tool how to read a log, you might as well write your own damn thing. I use a perl script to mask out the stuff I don't care about, keeping track of how many times each line was seen. It gives me a report of new log lines not in the ignore list and how many times they were seen (with some scrubbing of unique data like PIDs and session IDs so I get a useful count), plus any ignored lines that didn't fall into the expected range of counts.
Pushing that into a database for historical info and visualization wouldn't be too hard.
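To make the idea concrete, something along these lines is all it takes. This is a minimal sketch, not the exact script I run: the ignore.list file name and the scrub patterns are placeholders for illustration, and the range check on ignored lines is left out.

```perl
#!/usr/bin/env perl
use strict;
use warnings;

# Load expected patterns, one regex per line, from ignore.list
# (file name is an assumption for this sketch).
my %ignore;
if (open my $fh, '<', 'ignore.list') {
    while (my $pat = <$fh>) {
        chomp $pat;
        $ignore{$pat} = qr/$pat/ if length $pat;
    }
    close $fh;
}

my %count;
while (my $line = <STDIN>) {
    chomp $line;
    # Scrub unique data so repeated events collapse into one counter.
    $line =~ s/^\w{3}\s+\d+\s+[\d:]{8}\s+//;      # strip syslog timestamp
    $line =~ s/\[\d+\]/[PID]/g;                   # daemon[1234]: -> daemon[PID]:
    $line =~ s/\b[0-9a-fA-F]{16,}\b/SESSIONID/g;  # long hex session ids
    $count{$line}++;
}

# Report lines not covered by the ignore list, busiest first.
for my $line (sort { $count{$b} <=> $count{$a} } keys %count) {
    next if grep { $line =~ $_ } values %ignore;
    printf "%6d  %s\n", $count{$line}, $line;
}
```

Feed it a log on stdin, e.g. `perl logscrub.pl < /var/log/messages`. From there, inserting the scrubbed lines and counts into MySQL with DBI/DBD::mysql would give you the historical data to hang a web view off.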
Check out the following links and the ideas found there:
http://undeadly.org/cgi?action=article&sid=20091215120606
http://www.ranum.com/security/computer_security/papers/ai/
And much more info:
http://www.ranum.com/security/computer_security/archives/logging-notes.pdf
Personally, my interest is in UNIX system logs and IDS/IPS events, with full packet captures. The simplest form I have used is automated processing of IDS events, firewall logs, and full pcap data, published as static files on a webserver. I would be interested in a CLI log viewer with ncurses, or scripted output (maybe using pipecut to process data as you search for what you want in the simplest UNIX way).