Introducing audit-explorer

Steve Grubb sgrubb at redhat.com
Tue Jun 20 17:22:00 UTC 2017


Hello,

On Tuesday, June 20, 2017 12:28:16 PM EDT Vincas Dargis wrote:
> 2017.06.19 23:55, Steve Grubb wrote:
> > I have released the audit-explorer Shiny app that I have been demoing
> > this spring:
> > 
> > https://github.com/stevegrubb/audit-explorer
> 
> Very nice, thanks for sharing!

Thanks.

> Now that we are talking about tools, is there somewhere (maybe on your
> shelf? :-) ) a conveniently configurable tool for generating daily
> plaintext (or HTML) reports that could be sent via email from a machine
> you are interested in?

I am working my way around to this. For one, it's hard to imagine all the 
reports that might be of interest without overwhelming the reader. If we could 
define a small list of what is expected or useful, then we can work towards 
it.

The aureport tool is good at doing summary reports, but it's about 12 years 
old, and there are newer technologies that might be better suited. For 
example, how would people like to see reporting in a Jupyter notebook? If you 
don't know what a Jupyter notebook is, take a few minutes and google it, or 
take a look here:

https://github.com/jupyter/jupyter/wiki/A-gallery-of-interesting-Jupyter-Notebooks


> For example, I had to build a custom bash script at work that uses ausearch,
> aureport, and even grep (for AppArmor events, since it has issues with its
> audit messages) to aggregate the most interesting audit records (for example,
> with -k apache_user_executed_binaries, non-root executed something as root,
> failed logins and such) and send them via email every day.
> 
> Though it is not that complicated to fill your .sh with a bunch of
> ausearch/aureport/grep calls, it feels like I'm reimplementing something...

Yes. Another way forward might be to use the CSV extraction options in 
ausearch and send that into a SQL database. Then any SQL reporting tool out 
there can be used.
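As a rough sketch of that round trip, assuming CSV in the spirit of what 
ausearch emits (the column names below are placeholders, not the actual CSV 
header, which depends on your audit version):

```python
import csv
import io
import sqlite3

# Made-up sample standing in for `ausearch --format csv` output; the real
# column set depends on the audit version, so these names are placeholders.
SAMPLE = """\
EVENT_KIND,SUBJECT,ACTION,OBJECT,RESULT
audit-rule,apache,executed,/usr/bin/perl,success
audit-rule,apache,executed,/usr/bin/php,success
login,root,logged-in,/dev/pts/0,failed
"""

def load_events(conn, text):
    """Load ausearch-style CSV rows into an events table."""
    rows = list(csv.DictReader(io.StringIO(text)))
    conn.execute("CREATE TABLE IF NOT EXISTS events "
                 "(event_kind TEXT, subject TEXT, action TEXT, "
                 "object TEXT, result TEXT)")
    conn.executemany("INSERT INTO events VALUES (?, ?, ?, ?, ?)",
                     [(r["EVENT_KIND"], r["SUBJECT"], r["ACTION"],
                       r["OBJECT"], r["RESULT"]) for r in rows])
    return len(rows)

conn = sqlite3.connect(":memory:")
load_events(conn, SAMPLE)

# From here any SQL reporting tool applies; a trivial per-subject summary:
for subject, count in conn.execute(
        "SELECT subject, COUNT(*) FROM events "
        "GROUP BY subject ORDER BY COUNT(*) DESC"):
    print(subject, count)
```

In real use the SAMPLE text would come from ausearch's CSV output, and the 
database would live on disk where the reporting tool can reach it.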

I'm going to be starting the 2.8 development cycle in the near future. The 
goal for it is to improve remote logging and continue improving the 
auparse_normalizer API. This will directly lead to more and better reporting 
options.

Things that may be possible:
- push events in real time to a data lake such as Kafka
- push events in real time to a SQL database
- demux events out of rsyslog and push them into an audit aggregation server
- enhance collectors for various SIEMs
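For the realtime cases, the natural hook is an audispd plugin, which receives 
each raw record on stdin. A hedged sketch of that shape (the field parsing and 
the sink are illustrative stubs, not a finished plugin; a real one would hand 
the JSON to a Kafka producer or a SQL INSERT instead of print):

```python
import json
import re
import sys

# audispd delivers one raw audit record per line on stdin; this sketch
# parses each record's key=value fields and forwards it as JSON to a sink.
SERIAL_RE = re.compile(r"msg=audit\((\d+\.\d+):(\d+)\):")

def parse_record(line):
    """Split a raw audit record into a dict of its key=value fields."""
    fields = dict(re.findall(r"(\w+)=(\S+)", line))
    m = SERIAL_RE.search(line)
    if m:  # pull the timestamp and event serial out of the msg= field
        fields["timestamp"], fields["serial"] = m.group(1), m.group(2)
    return fields

def run(stream=sys.stdin, sink=print):
    """Read records from `stream` and hand each one, as JSON, to `sink`."""
    for line in stream:
        if line.strip():
            sink(json.dumps(parse_record(line.strip())))

if __name__ == "__main__":
    run()
```

The same parse-then-forward loop works for any of the sinks above; only the 
`sink` callable changes.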

Of course, doing any of this requires participation and collaboration from 
the community. I can't guess what people need to hook the audit trail up to 
reporting tools.

-Steve

More information about the Linux-audit mailing list