[Open-scap] Let me poll the community

Geoffry Roberts geoffry.roberts at hedronanalytics.com
Wed May 16 15:14:39 UTC 2018


Fen,

Thanks for sharing your situation.

You say you are dealing with ten instances.  I had imagined more, but never
mind, ten is enough.  I do have a solution in mind.  What I do is convert
the SCAP result sets into RDF, then load them into a database (I use
MapReduce for this) so I can work on them in aggregate.  Currently, I front
the database with MATLAB.  One can view the query results using the
graphics MATLAB supports, or export them and view them with the likes of
Spotfire or Tableau.
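For a flavor of the conversion step, here is a minimal sketch in pure
Python (not my actual MapReduce pipeline) that turns XCCDF rule-result
elements into RDF N-Triples.  It assumes the standard XCCDF 1.2 namespace;
the predicate namespace http://example.org/scap# is hypothetical:

```python
import xml.etree.ElementTree as ET

XCCDF_NS = "{http://checklists.nist.gov/xccdf/1.2}"
VOCAB = "http://example.org/scap#"  # hypothetical predicate namespace


def results_to_ntriples(xccdf_xml, host):
    """Emit one N-Triples line per fact found in an XCCDF result document.

    Each rule-result yields a triple for its result value, plus one for
    its severity attribute when present.
    """
    root = ET.fromstring(xccdf_xml)
    triples = []
    for rr in root.iter(XCCDF_NS + "rule-result"):
        rule = rr.get("idref")
        result = rr.findtext(XCCDF_NS + "result")
        subj = "<%s%s/%s>" % (VOCAB, host, rule)
        triples.append('%s <%sresult> "%s" .' % (subj, VOCAB, result))
        sev = rr.get("severity")
        if sev:
            triples.append('%s <%sseverity> "%s" .' % (subj, VOCAB, sev))
    return triples
```

Once the triples are in a store, asking aggregate questions across all
instances becomes an ordinary query rather than a file-by-file read.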

As soon as I can, I intend to see what I can do processing these query
results with, say, TensorFlow, but I'm getting ahead of myself.

I did want to poll the community and see what the interest level might be.

My background is in health information, specifically health systems
interoperability.  The solution I outlined above grew out of efforts to
find a better way of rummaging around in large health datasets--lots of
hefty XML files.  I got into SCAP because of the need for better security
in health systems, and bingo! there again were quantities of sizable XML
files.

I have a few people who have expressed an interest in working on this.
They like the TensorFlow part.  We'll see.
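Incidentally, for the kind of per-severity summary Fen describes below, a
minimal pure-Python tally over a single XCCDF result document might look
like this (a sketch only, assuming the standard XCCDF 1.2 namespace and
the severity attribute on rule-result):

```python
import xml.etree.ElementTree as ET
from collections import Counter

XCCDF_NS = "{http://checklists.nist.gov/xccdf/1.2}"


def summarize(xccdf_xml):
    """Tally result values (pass/fail/notchecked/...) per severity."""
    root = ET.fromstring(xccdf_xml)
    counts = {}  # severity -> Counter of result values
    for rr in root.iter(XCCDF_NS + "rule-result"):
        sev = rr.get("severity", "unknown")
        result = rr.findtext(XCCDF_NS + "result")
        counts.setdefault(sev, Counter())[result] += 1
    return counts


def report(counts):
    """Render one summary line per severity, Fen's-summary style."""
    lines = []
    for sev in ("high", "medium", "low"):
        c = counts.get(sev, Counter())
        lines.append(
            "- %d %s severity controls: %d pass, %d fail, %d notchecked."
            % (sum(c.values()), sev, c["pass"], c["fail"], c["notchecked"])
        )
    return "\n".join(lines)
```

Run over each instance's result file in turn and you have the skeleton of
the multi-instance report; the same counts could just as easily be shipped
to a log aggregator.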

On Tue, May 15, 2018 at 1:06 PM, Fen Labalme <fen.labalme at civicactions.com> wrote:

> Totally agree. I have two federal clients each with about ten instances to
> scan. Easy to read results by hand, but I would like two things: 1) a
> summary something like:
>
> # Results from scanning instances on 20180208
> Using OpenSCAP command line tool (oscap) 1.2.16
> Using profile xccdf_org.ssgproject.content_profile_stig-rhel7-disa
> From XCCDF data stream file ssg-rhel7-ds.xml version v0.1.37
> ## Instance1
> - 30 high severity controls. OpenSCAP says 23 passing,  4 failing, and  3
> notchecked.
> - 177 med severity controls. OpenSCAP says 136 passing, 11 failing, and 30
> notchecked.
> - 32  low severity controls. OpenSCAP says 28 passing,  4 failing, and  0
> notchecked.
> ## Instance2...
>
> and 2) a feed into e.g. Graylog so I can get nice graphs of trends as the
> number of fails shrink ;) over time (or a warning if they go up).
>
> Trevor suggested some XSLT here:  https://lists.fedorahosted.org/archives/list/scap-security-guide at lists.fedorahosted.org/message/H7GOMUHF3LWX5ZSC5JCONLBAM6HCVYG4/
>
> And the summary above was created using XSLT from an old GovReady script
> here:  https://github.com/GovReady/govready/blob/master/govready#L501
>
> I've got ansible-based automation in place that creates the scans now,
> just want to feed them to Graylog. It's not "large volumes" but enough
> that I want more automation.
>
> Do you have any suggestions? (As previously mentioned, I'm not facile in
> XSLT).
>
> Thanks,
> =Fen
>
>
> On Mon, May 14, 2018 at 7:26 PM, Geoffry Roberts <geoffry.roberts@hedronanalytics.com> wrote:
>
>> A few weeks ago I saw a thread or two where some were seeking a means of
>> analyzing large volumes of SCAP result sets.
>>
>> I'd like to ask the community as to what extent this represents a
>> problem?
>>
>> People I know who are using SCAP are scanning on a small scale and can
>> read the results manually.  It makes sense to me that as volumes rise,
>> some form of automation would be in order.
>>
>> What say ye?
>>
>>
>>
>> _______________________________________________
>> Open-scap-list mailing list
>> Open-scap-list at redhat.com
>> https://www.redhat.com/mailman/listinfo/open-scap-list
>>
>
>
