Squid redirector

Written in vi editor

A Squid proxy server redirector replaces certain URLs with others. It can therefore be used as an ad-blocker, speeding up page loads and reducing tracking, and it can also be used to block malicious scripts. Here is a list of Squid-related redirector software.

Squid redirector with RBL support

This redirector matches URLs against entries in a file. It can match against the beginning of a URL, the end of a URL, or a sub-string of a URL.
It can also look up host-names and IP addresses in DNS-based blacklists (RBLs).
Keep in mind that this can lead to false positives: many websites often share the same IP address(es), so blacklisting a single IP address will block access to all of those sites, including the ones which are not malicious.


Current version is: 2019-02-11 17:09:34 UTC (I'm too lazy to make up version numbers).



Directory for conf files.


Optional. 'debug on' will enable debugging.
Mandatory. Syntax:
redirurl URL
The redirurl is the URL the redirector redirects to, usually a link to a small transparent GIF, e.g.:
redirurl http://www.example.org/images/transparant.gif
This won't work for HTTPS: the browser will complain. The site still gets blocked, though.
Optional. Syntax:
dnsbl mode name
dnsbl 4 blacklist.example.net
dnsbl mode

A value between 1 and 15: a bitwise OR of the following.

 Mode  Action                  Lookup type
 1     Check host-name         Domain
 2     If alias, check CNAME   Domain
 4     Check IP address(es)    Address
 8     Log TXT record

Most RBLs list either IP addresses or domains/hostnames. Don't get them mixed up!
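For example, with one domain-based list and one address-based list (both names invented for illustration), compatible modes could be configured like this:

```
dnsbl 9 dbl.example.net
dnsbl 12 sbl.example.net
```

Mode 9 (1+8) checks the host-name against the first list and logs the TXT record; mode 12 (4+8) checks the IP address(es) against the second list and logs the TXT record.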
Below are some lookup examples:

 Host or IP        Type     Modes       Lookup
 www.example.org   Domain   1-3, 9-11   www.example.org.blacklist.example.net
 www.example.org   Address  4, 12       (reversed-octet IP).blacklist.example.net
 2001:db8:2::1     Address  4, 12       1.0.0.0.0.0.0.0.0.0.0.0.0.0.0.0.0.0.0.0.2.0.0.0.8.b.d.0.1.0.0.2.blacklist.example.net

You can use multiple blacklists (one entry per line).

ACL files

hosts.allow Whitelisted hosts
hosts.deny Blacklisted hosts
urls.allow Whitelisted URLs
urls.deny Blacklisted URLs

The ACLs are checked in the above order and before the RBLs. The program stops checking at the first match.

ACL file syntax
 Grep         This program
 ^Foobar      Foobar
 Foobar$      *Foobar
 .*Foobar.*   *Foobar*

The maximum line length is 4094 bytes (4095 including newline).

Host ACL examples
ad.doubleclick.net  Matches any URL with host-name 'ad.doubleclick.net'.
*.doubleclick.net  Matches any URL with host-name in the 'doubleclick.net' domain.
*doubleclick*  Matches any URL whose host-name contains the string 'doubleclick'.

You can put IP addresses in the host ACL files if you like. The software, however, will not look up host-names to see whether their IP addresses are in hosts.allow or hosts.deny. If you want IP-address-based blacklisting, see 'Convert blacklists into zone files' below.

URL ACL examples
http://ad.doubleclick.net/  Matches any URL that begins with 'http://ad.doubleclick.net/'.
*count.gif Matches any URL that ends in 'count.gif'.
*doubleclick* Matches any URL that contains 'doubleclick'.

In case of HTTPS, Squid passes the host-name to the redirector, not the full URL, so the URL ACLs aren't used.



Directory for log files. The directory has to be writable by the Squid process owner.


epoch.ms pid blocked_host_or_ip blacklist A lookup TXT

 epoch.ms            Number of seconds since the 1st of January 1970 00:00:00 UTC, with a millisecond fraction.
 pid                 Process ID of the redirector. Squid may spawn several; this way you can tell their log entries apart.
 blocked_host_or_ip  Host-name or IP address of the blocked website.
 blacklist           The blacklist that blocked it.
 A                   The IP address the RBL returned.
 lookup              The actual lookup that resulted in the above IP address.
 TXT                 TXT record for the same lookup. Usually a link to a web-page explaining why this IP address is blacklisted.
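Put together, a log entry might look something like this (a made-up example; the host, blacklist, addresses and TXT text are all invented, and the exact formatting may differ):

```
1549904974.123 12345 ad.example.net blacklist.example.net 127.0.0.2 ad.example.net.blacklist.example.net "Blocked, see https://blacklist.example.net/why"
```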

Before log-file rotation you need to reload Squid. This will kill the redirectors and close the logfile.



For TXT look-ups to work you need to remove the comments around '#define RSD_TXT_LKP 1'. If you do this, you need to compile with -lresolv:
cc -O2 -Wall -lresolv -o rblsredir rblsredir.c
The maximum number of RBLs is eight. If you want more you need to increase the number next to '#define RSD_MAXLISTS'.
Put the binary in /usr/local/sbin/ (or /usr/sbin/ if you prefer).
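To hook the redirector into Squid, something along these lines in squid.conf should do. The 'url_rewrite_program' and 'url_rewrite_children' directives are standard Squid; the path matches the install location above, but check the rblsredir man page for any required arguments:

```
url_rewrite_program /usr/local/sbin/rblsredir
url_rewrite_children 5
```

After editing squid.conf, reload Squid so it starts the helpers.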

Man pages

Put 'rblsredir.8' in /usr/local/man/man8/ and the rest in /usr/local/man/man5/ (or /usr/share/man/man8/ and /usr/share/man/man5/ if you prefer) and gzip them. It's probably a nice idea to create 'urls.allow.5.gz' and 'urls.deny.5.gz' as symlinks to 'sredir_acls.5.gz'. Do not do the same for 'hosts.allow.5.gz' or 'hosts.deny.5.gz': those names already point to libwrap's 'hosts_access.5.gz'!


Tar of source and man pages: rblsredir.tar.gz.

Convert blacklists into zone files

The Spamhaus DROP (Don't Route Or Peer) list consists of a number of files in network/netmask format. They are meant to be used in a firewall. You can, however, convert them into a blacklist zone file instead.
The stuff below does this for you:

Script that does most of the work. Edit to suit your needs.
Header for Bind style zone file. Edit to suit your needs.
Program that generates the zone file entries.
The '-t' option adds TXT records.

You can add your own entries if you want.

  malice.example.com	IN	A
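By RBL convention (RFC 5782) a listed name answers with an address in 127.0.0.0/8, usually 127.0.0.2, optionally accompanied by a TXT record explaining the listing. A complete manual entry could therefore look like this (the address value and TXT text are illustrative):

```
malice.example.com	IN	A	127.0.0.2
malice.example.com	IN	TXT	"Malicious host"
```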