rpms/squidGuard/devel squid-getlist.html, NONE, 1.1 squid-getlist.html.patch, NONE, 1.1 squidGuard-destdir.patch, NONE, 1.1 squidGuard-paths.patch, NONE, 1.1 squidGuard.logrotate, NONE, 1.1 squidGuard.spec, NONE, 1.1 squidguard-1.2.0-db4.patch, NONE, 1.1 .cvsignore, 1.1, 1.2 sources, 1.1, 1.2

Oliver Falk (oliver) fedora-extras-commits at redhat.com
Tue Sep 6 10:50:43 UTC 2005


Author: oliver

Update of /cvs/extras/rpms/squidGuard/devel
In directory cvs-int.fedora.redhat.com:/tmp/cvs-serv2253/devel

Modified Files:
	.cvsignore sources 
Added Files:
	squid-getlist.html squid-getlist.html.patch 
	squidGuard-destdir.patch squidGuard-paths.patch 
	squidGuard.logrotate squidGuard.spec 
	squidguard-1.2.0-db4.patch 
Log Message:
auto-import squidGuard-1.2.0-11 on branch devel from squidGuard-1.2.0-11.src.rpm


--- NEW FILE squid-getlist.html ---
<title>Auto Squidguard Filter Update</title>
<pre>
#!/bin/bash
#-----------------------------------------------------------------
#
# -- Changed - 15 June 2003
# --    Change 
#       "squidguard -C domains" to 
#       "squidguard -C all"
#       the old way wasn't doing anything useful
#
# Below is the script that I use. You need to edit the first part
# to tell it where your squid and squidguard binaries are, and also
# where your squidguard blacklists are. This has the recent change
# of address for the squidguard.org-supplied blacklists.
# You will need to obtain and install the utility "wget" for this
# script to work. It is available as an RPM for Red Hat; wget is used
# to fetch the files. You could use scripted ftp instead, but it's
# much more of a pain and less reliable.
#-----------------------------------------------------------------
# --------------------------------------------------------------
# Script to Update squidguard Blacklists
# Rick at Matthews.net with mods by morris at maynidea.com
# Last updated 05/31/2003
#
# This script downloads blacklists from two sites, merges and
# de-dupes them, then makes local changes (+/-). It does this
# in all of the categories (except porn) using the standard
# squidguard .diff files. The porn directory is handled
# differently, in part because of the large volume of changes.
#
# The user maintains local changes to the porn category in the
# files <domains_diff.local> and <urls_diff.local>. These files
# use the standard squidguard .diff file format:
# +domain_A.com
# -domain_B.com
# SquidGuard is a free CIPA-compliant filter.
#
# Current updater at PAISD.net:
#   Leif Johnson, Feb. 4, 2004, leif at paisd.net
#   Port Aransas H.S., Texas 78373
#
# Set up as a cron job to run on mobydick (the squid server) once a
# week; an example crontab entry is shown below.
#
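# Example root crontab entry (hypothetical schedule and install path;
# adjust both to your setup):
#   30 4 * * 0 /usr/local/sbin/squid-getlist >/dev/null 2>&1
#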
# --------------------------------------------------------------
# Set date format for naming files

DATE=`date +%Y-%m-%d`
YEAR=`date +%Y`
DATETIME=`date +"%a %d %h %Y %T %Z"`
UNIQUEDT=`date +"%Y%m%d%H%M%S"`
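# UNIQUEDT expands to e.g. 20050906105043 (YYYYMMDDhhmmss) and tags all
# files created by this run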
#UNIQUEDT="xxx"
WGOPTS=-nv
echo ${UNIQUEDT}

# Give location of squid and squidguard programs
SQUID=/usr/sbin/squid
SQUIDGUARD=/usr/local/bin/squidGuard
# --------------------------------------------------------------
# BLACKDIR should be set to equal the dbhome path declaration
# in your squidguard.conf file
BLACKDIR=/var/lib/squidguard/db
BLKDIRADLT=${BLACKDIR}/blacklists
PORN=${BLACKDIR}/blacklists/porn
ADULT=${BLACKDIR}/blacklists/adult
ADS=${BLACKDIR}/blacklists/ads

# --------------------------------------------------------------
# Create statistics file for porn directory
#
mkdir -p ${PORN}/stats
mkdir -p ${PORN}/archive
mkdir -p ${ADULT}/stats
mkdir -p ${ADULT}/archive

touch ${PORN}/stats/${UNIQUEDT}_stats
echo "Blacklist Line Counts for "${DATETIME} \
  >> ${PORN}/stats/${UNIQUEDT}_stats

# --------------------------------------------------------------
# Download the latest blacklist from the squidguard site
#
# Uses wget (http://wget.sunsite.dk/)
#
# Downloads the current blacklist tar.gz file into the
# ${BLACKDIR} directory (defined above) and names the file
# uniquely with today's date: ${UNIQUEDT}_sg.tar.gz
#

wget ${WGOPTS} --output-document=${BLACKDIR}/${UNIQUEDT}_sg.tar.gz \
  http://ftp.teledanmark.no/pub/www/proxy/squidguard/contrib/blacklists.tar.gz

#
# Downloads the latest adult.tar.gz file from the
# Université Toulouse in France (seems to be updated daily)
#
# see http://cri.univ-tlse1.fr/documentations/cache/squidguard_en.html
#
# Uses wget (http://wget.sunsite.dk/)
#
# Downloads the current adult.tar.gz file into the
# ${BLACKDIR} directory (defined above) and names the file
# uniquely with today's date: ${UNIQUEDT}_fr.tar.gz
#
# If you are inside of a firewall you may need passive ftp.
# For passive ftp change the wget line below to read:
# wget --passive-ftp --output-document=${BLACKDIR}/${UNIQUEDT}_fr.tar.gz \
#

wget ${WGOPTS} --output-document=${BLACKDIR}/${UNIQUEDT}_fr.tar.gz \
  ftp://ftp.univ-tlse1.fr/pub/reseau/cache/squidguard_contrib/adult.tar.gz

# --------------------------------------------------------------
# Install the new squidguard blacklist
#
# Installs the blacklist under the ${BLACKDIR} directory:
#   ${BLACKDIR}/blacklists/ads
#   ${BLACKDIR}/blacklists/aggressive
#   ${BLACKDIR}/blacklists/audio-video
#   ${BLACKDIR}/blacklists/drugs
#   ${BLACKDIR}/blacklists/gambling
#   ${BLACKDIR}/blacklists/hacking
#   ${BLACKDIR}/blacklists/mail
#   ${BLACKDIR}/blacklists/porn
#   ${BLACKDIR}/blacklists/proxy
#   ${BLACKDIR}/blacklists/violence
#   ${BLACKDIR}/blacklists/warez
#

gunzip < ${BLACKDIR}/${UNIQUEDT}_sg.tar.gz | (cd ${BLACKDIR}; tar xvf -)

# --------------------------------------------------------------
# Remove the incremental .diff files that are supplied with the
# squidguard blacklists - they are just clutter
#

rm -f ${PORN}/domains.*.diff
rm -f ${PORN}/urls.*.diff
rm -f ${ADS}/domains.*.diff
rm -f ${ADS}/urls.*.diff

# --------------------------------------------------------------
# Remove the comment lines from the ${PORN}/domains and
# ${PORN}/urls files so they can be sorted
#

grep -v -e '^#' ${PORN}/domains > ${PORN}/domains.temp
mv -f ${PORN}/domains.temp ${PORN}/domains

grep -v -e '^#' ${PORN}/urls > ${PORN}/urls.temp
mv -f ${PORN}/urls.temp ${PORN}/urls

# --------------------------------------------------------------
# Log item counts to porn statistics file
#

echo " " >> ${PORN}/stats/${UNIQUEDT}_stats
echo "Squidguard blacklist files as downloaded" \
  >> ${PORN}/stats/${UNIQUEDT}_stats
echo "----------------------------------------" \
  >> ${PORN}/stats/${UNIQUEDT}_stats

wc --lines ${PORN}/domains >> ${PORN}/stats/${UNIQUEDT}_stats
wc --lines ${PORN}/urls >> ${PORN}/stats/${UNIQUEDT}_stats

# --------------------------------------------------------------
# Install the new adult blacklist from Université Toulouse
#
# Installs the blacklist under the ${BLKDIRADLT} directory:
#   ${BLKDIRADLT}/adult
#
# Also cleans up any entries that begin with a dash (-)
#

# gunzip < ${BLACKDIR}/${UNIQUEDT}_fr.tar.gz | (cd ${BLACKDIR}; tar xvf -)
tar -C ${BLKDIRADLT} -xvzf ${BLACKDIR}/${UNIQUEDT}_fr.tar.gz
perl -pi -e "s#^\-##g" ${BLKDIRADLT}/adult/domains
perl -pi -e "s#^\-##g" ${BLKDIRADLT}/adult/urls

# --------------------------------------------------------------
# Save current files for subsequent processing
# Age older files
# The most recent files will always be domains.0 and urls.0
#

[ -f ${PORN}/archive/domains.-2 ] && mv -f ${PORN}/archive/domains.-2 ${PORN}/archive/domains.-3
[ -f ${PORN}/archive/urls.-2    ] && mv -f ${PORN}/archive/urls.-2 ${PORN}/archive/urls.-3
[ -f ${PORN}/archive/domains.-1 ] && mv -f ${PORN}/archive/domains.-1 ${PORN}/archive/domains.-2
[ -f ${PORN}/archive/urls.-1    ] && mv -f ${PORN}/archive/urls.-1 ${PORN}/archive/urls.-2
[ -f ${PORN}/archive/domains.0  ] && mv -f ${PORN}/archive/domains.0 ${PORN}/archive/domains.-1
[ -f ${PORN}/archive/urls.0     ] && mv -f ${PORN}/archive/urls.0 ${PORN}/archive/urls.-1
cp ${PORN}/domains ${PORN}/archive/domains.0
cp ${PORN}/urls ${PORN}/archive/urls.0

[ -f ${ADULT}/archive/domains.-2 ] && mv -f ${ADULT}/archive/domains.-2 ${ADULT}/archive/domains.-3
[ -f ${ADULT}/archive/urls.-2    ] && mv -f ${ADULT}/archive/urls.-2 ${ADULT}/archive/urls.-3
[ -f ${ADULT}/archive/domains.-1 ] && mv -f ${ADULT}/archive/domains.-1 ${ADULT}/archive/domains.-2
[ -f ${ADULT}/archive/urls.-1    ] && mv -f ${ADULT}/archive/urls.-1 ${ADULT}/archive/urls.-2
[ -f ${ADULT}/archive/domains.0  ] && mv -f ${ADULT}/archive/domains.0 ${ADULT}/archive/domains.-1
[ -f ${ADULT}/archive/urls.0     ] && mv -f ${ADULT}/archive/urls.0 ${ADULT}/archive/urls.-1
cp ${ADULT}/domains ${ADULT}/archive/domains.0
cp ${ADULT}/urls ${ADULT}/archive/urls.0

# --------------------------------------------------------------
# Log item counts to porn statistics file
#

echo " " >> ${PORN}/stats/${UNIQUEDT}_stats
echo "University Toulouse blacklist files as downloaded" \
  >> ${PORN}/stats/${UNIQUEDT}_stats
echo "-------------------------------------------------" \
  >> ${PORN}/stats/${UNIQUEDT}_stats

wc --lines ${ADULT}/domains >> ${PORN}/stats/${UNIQUEDT}_stats
wc --lines ${ADULT}/urls >> ${PORN}/stats/${UNIQUEDT}_stats

# --------------------------------------------------------------
# Sort and de-dupe the _diff.local files
#

sort -u ${PORN}/domains_diff.local > ${PORN}/domains.temp
sort -u ${PORN}/urls_diff.local > ${PORN}/urls.temp
mv -f ${PORN}/domains.temp ${PORN}/domains_diff.local
mv -f ${PORN}/urls.temp ${PORN}/urls_diff.local

# --------------------------------------------------------------
# Log item counts to porn statistics file
#

echo " " >> ${PORN}/stats/${UNIQUEDT}_stats
echo "Local _diff.local files" >> ${PORN}/stats/${UNIQUEDT}_stats
echo "-----------------------" >> ${PORN}/stats/${UNIQUEDT}_stats

wc --lines ${PORN}/domains_diff.local >> ${PORN}/stats/${UNIQUEDT}_stats
wc --lines ${PORN}/urls_diff.local >> ${PORN}/stats/${UNIQUEDT}_stats

# --------------------------------------------------------------
# Create to_add & to_delete files from the _diff.local files.
# The to_add files contain only the adds, and the to_delete files
# contain only the deletes.
# The _diff.local files are unchanged by this process.
#

grep -e '^+' ${PORN}/domains_diff.local > ${PORN}/domains.to_add
grep -e '^-' ${PORN}/domains_diff.local > ${PORN}/domains.to_delete
grep -e '^+' ${PORN}/urls_diff.local > ${PORN}/urls.to_add
grep -e '^-' ${PORN}/urls_diff.local > ${PORN}/urls.to_delete

# --------------------------------------------------------------
# Remove +/- from the to_add & to_delete files
#

perl -pi -e "s#^\+##g" ${PORN}/urls.to_add
perl -pi -e "s#^\-##g" ${PORN}/urls.to_delete
perl -pi -e "s#^\+##g" ${PORN}/domains.to_add
perl -pi -e "s#^\-##g" ${PORN}/domains.to_delete

# --------------------------------------------------------------
# Log item counts to porn statistics file
#

echo " " >> ${PORN}/stats/${UNIQUEDT}_stats
echo "Local to_add and to_delete files" >> ${PORN}/stats/${UNIQUEDT}_stats
echo "--------------------------------" >> ${PORN}/stats/${UNIQUEDT}_stats

wc --lines ${PORN}/domains.to_add >> ${PORN}/stats/${UNIQUEDT}_stats
wc --lines ${PORN}/domains.to_delete >> ${PORN}/stats/${UNIQUEDT}_stats
wc --lines ${PORN}/urls.to_add >> ${PORN}/stats/${UNIQUEDT}_stats
wc --lines ${PORN}/urls.to_delete >> ${PORN}/stats/${UNIQUEDT}_stats

# --------------------------------------------------------------
# Combine the adult, blacklist and to_add files
# Remove garbage and blanks
# Remove duplicate entries
#
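# The tr ranges below are octal: \000-\011 and \013-\037 drop the ASCII
# control characters except newline (\012), and \177-\377 drops DEL plus
# all 8-bit bytes
#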

cat ${PORN}/archive/domains.0 ${ADULT}/archive/domains.0 ${PORN}/domains.to_add \
  > ${PORN}/domains.merged.1
tr -d '\000-\011' < ${PORN}/domains.merged.1 > ${PORN}/domains.merged.2
tr -d '\013-\037' < ${PORN}/domains.merged.2 > ${PORN}/domains.merged.3
tr -d '\177-\377' < ${PORN}/domains.merged.3 > ${PORN}/domains.merged.4
sort -u ${PORN}/domains.merged.4 > ${PORN}/domains.merged

cat ${PORN}/archive/urls.0 ${ADULT}/archive/urls.0 ${PORN}/urls.to_add \
  > ${PORN}/urls.merged.1
tr -d '\000-\011' < ${PORN}/urls.merged.1 > ${PORN}/urls.merged.2
tr -d '\013-\037' < ${PORN}/urls.merged.2 > ${PORN}/urls.merged.3
tr -d '\177-\377' < ${PORN}/urls.merged.3 > ${PORN}/urls.merged.4
sort -u ${PORN}/urls.merged.4 > ${PORN}/urls.merged

# --------------------------------------------------------------
# Log item counts to porn statistics file
#

echo " " >> ${PORN}/stats/${UNIQUEDT}_stats
echo "Combined adult, blacklist and to_add files, deduped" \
  >> ${PORN}/stats/${UNIQUEDT}_stats
echo "---------------------------------------------------" \
  >> ${PORN}/stats/${UNIQUEDT}_stats

wc --lines ${PORN}/domains.merged >> ${PORN}/stats/${UNIQUEDT}_stats
wc --lines ${PORN}/urls.merged >> ${PORN}/stats/${UNIQUEDT}_stats

# --------------------------------------------------------------
# Remove entries that match the content of the to_delete files
#

grep -v -x -F --file=${PORN}/domains.to_delete \
  ${PORN}/domains.merged > ${PORN}/domains.adjusted
grep -v -x -F --file=${PORN}/urls.to_delete \
  ${PORN}/urls.merged > ${PORN}/urls.adjusted

# --------------------------------------------------------------
# Log item counts to porn statistics file
#

echo " " >> ${PORN}/stats/${UNIQUEDT}_stats
echo "After removing the contents of the to_delete files" \
  >> ${PORN}/stats/${UNIQUEDT}_stats
echo "--------------------------------------------------" \
  >> ${PORN}/stats/${UNIQUEDT}_stats

wc --lines ${PORN}/domains.adjusted >> ${PORN}/stats/${UNIQUEDT}_stats
wc --lines ${PORN}/urls.adjusted >> ${PORN}/stats/${UNIQUEDT}_stats

# --------------------------------------------------------------
# Install new text files
#

mv -f ${PORN}/domains.adjusted ${PORN}/domains
mv -f ${PORN}/urls.adjusted ${PORN}/urls

# --------------------------------------------------------------
# Log item counts to porn statistics file
#

echo " " >> ${PORN}/stats/${UNIQUEDT}_stats
echo "Final production files" \
  >> ${PORN}/stats/${UNIQUEDT}_stats
echo "----------------------" \
  >> ${PORN}/stats/${UNIQUEDT}_stats

wc --lines ${PORN}/domains >> ${PORN}/stats/${UNIQUEDT}_stats
wc --lines ${PORN}/urls >> ${PORN}/stats/${UNIQUEDT}_stats

# --------------------------------------------------------------
# Create new databases in all categories
#

${SQUIDGUARD} -C all

# --------------------------------------------------------------
# Update databases from your domains.diff and urls.diff files
# NOTE: The -u[pdate] command only looks for domains.diff and
# urls.diff. It does NOT use the incremental files that are
# included in the blacklist file.
# e.g. domains.20011230.diff, urls.20011230.diff
#

${SQUIDGUARD} -u

# --------------------------------------------------------------
# Change ownership of blacklist files
#

chown -R squid:squid ${BLACKDIR}/blacklists

# --------------------------------------------------------------
# Reconfigure squid, which also restarts its squidGuard helpers
#

${SQUID} -k reconfigure

# --------------------------------------------------------------
# Delete work files
#

rm -f ${PORN}/domains.merged
rm -f ${PORN}/domains.merged.*
rm -f ${PORN}/domains.to_add
rm -f ${PORN}/domains.to_delete

rm -f ${PORN}/urls.merged
rm -f ${PORN}/urls.merged.*
rm -f ${PORN}/urls.to_add
rm -f ${PORN}/urls.to_delete

# --------------------------------------------------------------
# Display stats file
#

cat ${PORN}/stats/${UNIQUEDT}_stats

# --------------------------------------------------------------
# Wait for everything to finish, then exit
#

sleep 5s
exit 0

squid-getlist.html.patch:

--- NEW FILE squid-getlist.html.patch ---
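Disables the cron script by default (ENABLED=0) and adjusts the
squidGuard binary and blacklist paths to match this package's layout.
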
--- squid-getlist.html.1	2004-02-05 03:55:15.000000000 +0100
+++ squid-getlist.html	2005-09-05 17:47:18.000000000 +0200
@@ -1,8 +1,16 @@
-<title>Auto Squidguard Filter Update</title>
-<pre>
 #!/bin/bash
+
+ENABLED=0
+
+if [ $ENABLED == 0 ]; then exit 0; fi
+
 #-----------------------------------------------------------------
 #
+# -- Changed - 05 September 2005
+# --    Change by Oliver Falk <oliver at linux-kernel.at>
+#       Modified paths to work with my RPM
+#       Added ENABLED=0/1 (0 means disabled; default)
+#
 # -- Changed - 15 June 2003
 # --    Change 
 #       "squidguard -C domains" to 
@@ -55,11 +63,11 @@
 
 # Give location of squid and squidguard programs
 SQUID=/usr/sbin/squid
-SQUIDGUARD=/usr/local/bin/squidGuard
+SQUIDGUARD=/usr/bin/squidGuard
 # --------------------------------------------------------------
 # BLACKDIR should be set to equal the dbhome path declaration
 # in your squidguard.conf file
-BLACKDIR=/var/lib/squidguard/db
+BLACKDIR=/var/lib/squidGuard
 BLKDIRADLT=${BLACKDIR}/blacklists
 PORN=${BLACKDIR}/blacklists/porn
 ADULT=${BLACKDIR}/blacklists/adult

squidGuard-destdir.patch:

--- NEW FILE squidGuard-destdir.patch ---
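Honors DESTDIR in the install target so the build can be staged into the
RPM buildroot.
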
--- squidGuard-1.1.4/src/Makefile.in.destdir	Fri Oct 13 19:44:15 2000
+++ squidGuard-1.1.4/src/Makefile.in	Fri Oct 13 19:45:09 2000
@@ -97,8 +97,8 @@
 
 install.bin:: squidGuard
 	@echo making $@ in `basename \`pwd\``
-	@$(MKDIR) $(bindir) $(logdir) $(cfgdir)
-	$(INSTALL_PROGRAM) squidGuard $(bindir)/squidGuard
+	@$(MKDIR) $(DESTDIR)$(bindir) $(DESTDIR)$(logdir) $(DESTDIR)$(cfgdir)
+	$(INSTALL_PROGRAM) squidGuard $(DESTDIR)$(bindir)/squidGuard
 
 uninstall.bin::
 	@echo making $@ in `basename \`pwd\``

squidGuard-paths.patch:

--- NEW FILE squidGuard-paths.patch ---
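Lets configure substitute the actual dbhome and logdir paths (set via
--with-sg-dbhome and --with-sg-logdir) into the sample config.
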
--- squidGuard-1.1.4/samples/sample.conf.in	Mon May 10 18:37:13 1999
+++ squidGuard-1.1.4/samples/sample.conf.in.paths	Fri Oct 13 19:47:31 2000
@@ -2,8 +2,8 @@
 # CONFIG FILE FOR SQUIDGUARD
 #
 
-dbhome @prefix@/squidGuard/db
-logdir @prefix@/squidGuard/logs
+dbhome @sg_dbhome@
+logdir @sg_logdir@
 
 #
 # TIME RULES:


--- NEW FILE squidGuard.logrotate ---
/var/log/squid/squidGuard.log {
    weekly
    compress
    notifempty
    missingok
}


--- NEW FILE squidGuard.spec ---
# $Id: squidGuard.spec,v 1.10 2005/09/06 07:39:02 oliver Exp $

%define			_dbhomedir		%{_var}/lib/%{name}

%define			_dbrpmver		%(eval "rpm -q --queryformat \"%{VERSION}\" db4")
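# _dbrpmver holds the version of the installed db4 package at build time;
# the prep section uses it to decide whether the db4 patch is needed
# (it is skipped on db4-4.0.14)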

Name:			squidGuard
Version:		1.2.0
Release:		11
Summary:		Filter, redirector and access controller plugin for squid

Group:			System Environment/Daemons
License:		GPL

Source0:		http://ftp.teledanmark.no/pub/www/proxy/%{name}/%{name}-%{version}.tar.gz
Source1:		squidGuard.logrotate
Source2:		http://ftp.teledanmark.no/pub/www/proxy/%{name}/contrib/blacklists.tar.gz
Source3:		http://cuda.port-aransas.k12.tx.us/squid-getlist.html

Patch0:			squidGuard-destdir.patch
Patch1:			squidGuard-paths.patch
Patch2:			squidguard-1.2.0-db4.patch
Patch3:			squid-getlist.html.patch
URL:			http://www.squidguard.org/

BuildRoot:		%{_tmppath}/%{name}-%{version}-%{release}-root-%(%{__id_u} -n)
BuildRequires:	db4-devel
Requires:		squid

%description
squidGuard can be used to 
- limit the web access for some users to a list of accepted/well known
  web servers and/or URLs only.
- block access to some listed or blacklisted web servers and/or URLs
  for some users.
- block access to URLs matching a list of regular expressions or words
  for some users.
- enforce the use of domain names/prohibit the use of IP addresses in
  URLs.
- redirect blocked URLs to an "intelligent" CGI based info page.
- redirect unregistered users to a registration form.
- redirect popular downloads like Netscape, MSIE etc. to local copies.
- redirect banners to an empty GIF.
- have different access rules based on time of day, day of the week,
  date etc.
- have different rules for different user groups.
- and much more.

Neither squidGuard nor Squid can be used to
- filter/censor/edit text inside documents
- filter/censor/edit embedded scripting languages like JavaScript or
  VBScript inside HTML

%prep
%setup -q
%{__cp} %{SOURCE3} .
%patch0 -p1 -b .destdir
%patch1 -p1 -b .paths
%if "%{_dbrpmver}" != "4.0.14"
%patch2 -p0 -b .db4
%endif
%patch3 -p0

%build
%configure \
	--with-sg-config=%{_sysconfdir}/squid/squidGuard.conf \
	--with-sg-logdir=%{_var}/log/squid \
	--with-sg-dbhome=%{_dbhomedir}
	
%{__make} %{?_smp_mflags} LIBS=-ldb

%install
%{__rm} -rf $RPM_BUILD_ROOT

%{__make} DESTDIR=$RPM_BUILD_ROOT install

%{__install} -p -D -m 0644 %{SOURCE1} $RPM_BUILD_ROOT%{_sysconfdir}/logrotate.d/squidGuard
%{__install} -p -D -m 0644 samples/sample.conf $RPM_BUILD_ROOT%{_sysconfdir}/squid/squidGuard.conf
%{__install} -p -D -m 0644 %{SOURCE2} $RPM_BUILD_ROOT%{_dbhomedir}/blacklists.tar.gz

# Don't use SOURCE3; use the already patched copy (bug #165689).
# Also install it with mode 0755, not 0750.
%{__install} -p -D -m 0755 squid-getlist.html $RPM_BUILD_ROOT%{_sysconfdir}/cron.daily/squidGuard

pushd $RPM_BUILD_ROOT%{_dbhomedir}
tar xfz $RPM_BUILD_ROOT%{_dbhomedir}/blacklists.tar.gz
popd

sed -i "s,dest/adult/,blacklists/porn/,g" $RPM_BUILD_ROOT%{_sysconfdir}/squid/squidGuard.conf

%clean
%{__rm} -rf $RPM_BUILD_ROOT

%files
%defattr(-,root,root)
%doc samples/*.conf
%doc samples/*.cgi
%doc samples/dest/blacklists.tar.gz
%doc COPYING GPL
%doc doc/*.txt doc/*.html doc/*.gif
%{_bindir}/*
%config(noreplace) %{_sysconfdir}/squid/squidGuard.conf
%config(noreplace) %{_sysconfdir}/logrotate.d/squidGuard
%config(noreplace) %{_sysconfdir}/cron.daily/squidGuard
%{_dbhomedir}/

%changelog
* Tue Sep 06 2005 Oliver Falk <oliver at linux-kernel.at>		- 1.2.0-11
- More fixes for Bug #165689
  Install the cron script with mode 755
  Don't use SOURCE3 in the install section; we need to use the patched one
  
* Mon Sep 05 2005 Oliver Falk <oliver at linux-kernel.at>		- 1.2.0-10
- Include GPL in doc section

* Mon Sep 05 2005 Oliver Falk <oliver at linux-kernel.at>		- 1.2.0-9
- More 'bugs' from Bug #165689
  Make the changes to squid-getlist.html a patch, as sources should
  match upstream sources, so they stay wget-able...

* Mon Sep 05 2005 Oliver Falk <oliver at linux-kernel.at>		- 1.2.0-8
- Bug #165689

* Thu May 19 2005 Oliver Falk <oliver at linux-kernel.at>		- 1.2.0-7
- Update blacklists
- Cleanup specfile

* Fri Apr 08 2005 Oliver Falk <oliver at linux-kernel.at>		- 1.2.0-6
- Fix build on RH 8 with db 4.0.14, by not applying the db4 patch

* Mon Feb 21 2005 Oliver Falk <oliver at linux-kernel.at> 		- 1.2.0-5
- Specfile cleaning
- Make it build with db4 again, by adding the db4-patch

* Mon Apr 12 2002 Oliver Pitzeier <oliver at linux-kernel.at>	- 1.2.0-4
- Tweaks

* Mon Apr 08 2002 Oliver Pitzeier <oliver at linux-kernel.at> 	- 1.2.0-3
- Rebuild

* Mon Apr 08 2002 Oliver Pitzeier <oliver at linux-kernel.at> 	- 1.2.0-2
- Updated the blacklists and put them into the right place
  I also decompress them
- Added a new "forbidden" script - the other ones are too
  old and don't work.

* Fri Apr 05 2002 Oliver Pitzeier <oliver at linux-kernel.at> 	- 1.2.0-1
- Update to version 1.2.0

* Fri Jun  1 2001 Enrico Scholz <enrico.scholz at informatik.tu-chemnitz.de>
- cleaned up for rhcontrib

* Fri Oct 13 2000 Enrico Scholz <enrico.scholz at informatik.tu-chemnitz.de>
- initial build

squidguard-1.2.0-db4.patch:

--- NEW FILE squidguard-1.2.0-db4.patch ---
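Adapts the two DB->open() calls to the Berkeley DB 4.1+ API, which added
a transaction handle as the second argument; passing NULL keeps the opens
non-transactional.
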
--- src/sgDb.c.orig	2004-03-09 03:45:59.000000000 +0100
+++ src/sgDb.c	2004-03-09 03:48:43.000000000 +0100
@@ -98,13 +98,13 @@
     if(createdb)
       flag = flag | DB_TRUNCATE;
     if ((ret = 
-	 Db->dbp->open(Db->dbp, dbfile, NULL, DB_BTREE, flag, 0664)) != 0) {
+	 Db->dbp->open(Db->dbp, NULL, dbfile, NULL, DB_BTREE, flag, 0664)) != 0) {
       (void) Db->dbp->close(Db->dbp, 0);
       sgLogFatalError("Error db_open: %s", strerror(ret));
     }
   } else {
     if ((ret = 
-	 Db->dbp->open(Db->dbp, dbfile, NULL, DB_BTREE, DB_CREATE, 0664)) != 0) {
+	 Db->dbp->open(Db->dbp, NULL, dbfile, NULL, DB_BTREE, DB_CREATE, 0664)) != 0) {
       sgLogFatalError("Error db_open: %s", strerror(ret));
     }
   }


Index: .cvsignore
===================================================================
RCS file: /cvs/extras/rpms/squidGuard/devel/.cvsignore,v
retrieving revision 1.1
retrieving revision 1.2
diff -u -r1.1 -r1.2
--- .cvsignore	6 Sep 2005 10:49:22 -0000	1.1
+++ .cvsignore	6 Sep 2005 10:50:41 -0000	1.2
@@ -0,0 +1,2 @@
+blacklists.tar.gz
+squidGuard-1.2.0.tar.gz


Index: sources
===================================================================
RCS file: /cvs/extras/rpms/squidGuard/devel/sources,v
retrieving revision 1.1
retrieving revision 1.2
diff -u -r1.1 -r1.2
--- sources	6 Sep 2005 10:49:22 -0000	1.1
+++ sources	6 Sep 2005 10:50:41 -0000	1.2
@@ -0,0 +1,2 @@
+63190e4c21688a4185cc3c0b62aae970  blacklists.tar.gz
+c6e2e9112fdbda0602656f94c1ce31fd  squidGuard-1.2.0.tar.gz



