[Date Prev][Date Next]   [Thread Prev][Thread Next]   [Thread Index] [Date Index] [Author Index]

Re: [K12OSN] web filtering with SquidGuard



On 2/6/06, Trond Mæhlum <trond maehlum net> wrote:

> We are blocking these sites as we discover them, but there are just too
> many of them... Our installation of SquidGuard is installed on a Debian
> Sarge machine. It doesn't auto-update its databases in any way as far
> as I know. Is there some way of updating SquidGuard's database of blocked
> URLs and domains?
There are ways.  You can download the latest blacklist from MESD;
Eric hosts that list.  I think it's compiled from a couple of
different lists.

I did a Google search for bypassing proxies.  There are a couple of
sites that keep lists of proxy-bypass pages.  Grabbing domains from
such pages is a good way to add to the sites you block.

One site offered up a Perl script for creating your own.  What I did
was add a regular-expression list to what squidGuard is blocking.
The list has one pattern per line, based on the common file names
that the proxy-bypass software seems to use.
Here's what I have:

(nph-proxy\.(cgi|pl))
(nph-proxya\.(cgi|pl))
(nph-proxyb\.(cgi|pl))
(nph-surf\.(cgi|pl))
(nph-one\.(cgi|pl))
(nph-teste\.(cgi|pl))

(Note the parenthesized alternation and escaped dot: `[cgi|pl]` would
be a character class matching single letters, not the "cgi or pl"
alternation intended here.)
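For context, an expression list like this gets hooked into squidGuard as its own destination. A minimal squidGuard.conf fragment might look like the following (the paths and the redirect page are assumptions; match them to your own layout):

```
# /etc/squid/squidGuard.conf -- fragment; paths are assumptions
dbhome /var/lib/squidguard/db
logdir /var/log/squid

dest bypass-scripts {
    expressionlist bypass-scripts/expressions
}

acl {
    default {
        pass !bypass-scripts all
        redirect http://localhost/blocked.html
    }
}
```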

I think I did it this way so that if someone had a legitimate script
called, say, nph-something.pl, it wouldn't be blocked.
I have not had any false positives reported with this list.
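A quick way to sanity-check patterns like these is `grep -E`, whose extended-regex syntax is close to what squidGuard's expression lists use. A pattern such as `nph-proxy\.(cgi|pl)` should catch the bypass script but leave an arbitrary nph- script alone:

```shell
# Sanity-check a pattern: it should match the bypass script name
# but not an unrelated nph- script.
pattern='nph-proxy\.(cgi|pl)'

echo 'http://example.com/cgi-bin/nph-proxy.cgi' | grep -Eq "$pattern" \
    && echo 'bypass URL: blocked'     # prints "bypass URL: blocked"
echo 'http://example.com/cgi-bin/nph-something.pl' | grep -Eq "$pattern" \
    || echo 'legit URL: allowed'      # prints "legit URL: allowed"
```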

Roger
