New draft standards

Steve Grubb sgrubb at redhat.com
Sat Dec 26 16:38:58 UTC 2015


On Thursday, December 24, 2015 09:44:00 AM Burn Alting wrote:
> On Fri, 2015-12-18 at 16:12 +1100, Burn Alting wrote:
> > On Tue, 2015-12-15 at 08:46 -0500, Steve Grubb wrote:
> > > On Tuesday, December 15, 2015 09:12:54 AM Burn Alting wrote:
> > > > I use a proprietary ELK-like system based on ausearch's -i option.
> > > > I would like to see some variant outputs from ausearch that
> > > > "package" events into parse-friendly formats (json, xml) and also
> > > > incorporate the local transformations Steve proposes. I believe
> > > > this would be the most generic solution to support centralised log
> > > > management.
> > > > 
> > > > I am travelling now, but can write up a specification for review next
> > > > week.
> > > 
> > > Yes, please do send something to the mail list for people to look at and
> > > comment on.
> > 
> > All,
> > 
> > To reiterate, my need is to generate easy-to-parse events to which
> > local interpretation has been applied, retaining the raw input for
> > some of the interpretations if required. I then want to transmit the
> > complete interpreted event to my central event repository.
> > 
> > My proposal is that ausearch gain the following 'interpreted output'
> > options:
> > 
> >         --Xo plain|json|xml
> >         generate plain (cf --interpret), xml or json formatted events
> >         
> >         --Xr key_a'+'key_b'+'key_c
> >         include the raw value for the given keys using the new keys
> >         __r_key_a, __r_key_b, etc. The special key __all__ is
> >         interpreted to retain the complete raw record. If the raw value
> >         has no interpreted form, we end up with two keys holding the
> >         same value.
> > 
> > I have attached the XSD from which the XML and JSON formats could be
> > defined.
> 
> Is there any interest in this? If it were available, would people make
> use of it?
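To make the --Xr semantics concrete, here is a minimal sketch of how the raw-value retention could work for json output. The __r_ prefix and __all__ key follow the proposal above; the sample event, field values, and the helper name retain_raw are hypothetical.

```python
import json

RAW_PREFIX = "__r_"

def retain_raw(interpreted, raw, keys):
    """Merge raw values for the requested keys into the interpreted event.

    A key of __all__ retains the complete raw record. When a field has no
    interpreted form, the raw and interpreted keys simply carry the same
    value, as the proposal notes.
    """
    out = dict(interpreted)
    wanted = raw.keys() if "__all__" in keys else keys
    for k in wanted:
        if k in raw:
            out[RAW_PREFIX + k] = raw[k]  # raw copy alongside interpretation
    return out

# Hypothetical interpreted vs. raw views of the same SYSCALL record
interpreted = {"type": "SYSCALL", "syscall": "open", "uid": "sgrubb", "res": "success"}
raw         = {"type": "SYSCALL", "syscall": "2",    "uid": "500",    "res": "1"}

event = retain_raw(interpreted, raw, ["uid", "syscall"])
print(json.dumps(event, sort_keys=True))
```

A --Xr __all__ run would instead copy every raw field in under its __r_ name.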

I'm somewhat interested in this. I'm just not sure where the best place to do 
all this is. Should it be in ausearch? Should it be in auditd? Should it be in 
the remote logging plugin? Should audit utilities be modified to accept this 
new form of input?

Ultimately, I want to be able to reduce audit records to English sentences,
something like this:

On 1-node at 2-time 3-subj 4-acting-as 5-results 6-action 7-what 8-using

Which maps to:
1) node
2) time
3) auid, failed logins=remote system
4) uid (only when uid != auid) or role (when not unconfined_t)
5) res - successfully / failed to
6) op, syscall, type, key - requires per type classification
7) path,system
8) exe,comm
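The 8-slot template above can be sketched as a small formatter. The slot-selection rules (uid only when it differs from auid, res mapping to "successfully" / "failed to") come from the numbered list; the event dict and function name are hypothetical.

```python
def to_sentence(ev):
    """Render an event dict into the 8-slot English sentence template."""
    slots = [
        ev["node"],                                   # 1 node
        "at " + ev["time"],                           # 2 time
        ev["auid"],                                   # 3 subject (login identity)
        # 4 acting-as: secondary identity, only when it differs from auid
        ("acting as " + ev["uid"]) if ev.get("uid") not in (None, ev["auid"]) else "",
        "successfully" if ev.get("res") == "success" else "failed to",  # 5 results
        ev["action"],                                 # 6 action
        ev.get("path", ""),                           # 7 what
        "using " + ev.get("exe", ""),                 # 8 using
    ]
    return "On " + " ".join(s for s in slots if s)

ev = {"node": "host1", "time": "12/26/2015 10:00:00", "auid": "sgrubb",
      "uid": "root", "res": "success", "action": "opened",
      "path": "/etc/shadow", "exe": "/usr/bin/cat"}
print(to_sentence(ev))
```

The real mapping for slot 6 would need the per-type classification mentioned above; this sketch just takes a pre-resolved action string.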

So, what I was thinking about is looking at the whole event and picking out 
the node, time, subject, object, action, and results. The subject and object 
would be further broken down to primary identity, secondary identity, and 
attributes. I was planning to put this into an extension of auparse so that 
events could be dumped out using the classification system.

My thoughts had been to organize the event data to support something along
these lines. I want to make the events easier to understand.
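The classification described above could be organized roughly as follows: pull node, time, subject, object, action, and results out of a parsed event, with subject and object broken into primary identity, secondary identity, and attributes. The input field names mirror common audit record fields; the output shape and function name are a sketch, not the auparse extension itself.

```python
def classify(fields):
    """Arrange flat audit event fields into the node/time/subject/object/
    action/results structure described above."""
    return {
        "node": fields.get("node"),
        "time": fields.get("time"),
        "subject": {
            "primary":    fields.get("auid"),   # login identity
            "secondary":  fields.get("uid"),    # effective identity
            "attributes": {"subj": fields.get("subj")},
        },
        "object": {
            "primary":    fields.get("path"),
            "attributes": {"mode": fields.get("mode")},
        },
        "action":  fields.get("syscall"),       # would use per-type classification
        "results": fields.get("res"),
    }

fields = {"node": "host1", "time": "12/26/2015 10:00:00", "auid": "sgrubb",
          "uid": "root", "subj": "unconfined_t", "path": "/etc/passwd",
          "mode": "0644", "syscall": "open", "res": "success"}
print(classify(fields))
```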

 
> If so I can modify ausearch and generate a proposed patch over the
> Christmas break.

At the moment, I'm looking at auditd performance improvements to prepare for 
the enrichment of audit records. You're one step ahead of where I am. I hope 
to finish this performance work soon so that I can start thinking about the 
problem you are already working on.  :-)

Of course...we could look at the auditd performance issues together and then 
move on to event formatting.

-Steve

More information about the Linux-audit mailing list