SQUID ACLs


Why SQUID?

There are two main reasons for installing and configuring SQUID on your network:

1. Reduce Internet bandwidth charges

Administrators configure the client web browsers to use the Squid proxy server instead of going to the web directly. The Squid server then checks its web cache for the web information requested by the user. It will return any matching information that it finds in its cache (which in SQUID terminology is a TCP_HIT), and if not, it will access the Internet to find it on behalf of the user (TCP_MISS). Once it finds the information, SQUID populates its cache with the new page information and forwards it to the user's web browser. This reduces the amount of data fetched from the Internet when the same website is accessed again (illustrative access.log entries for a miss and a hit are shown after this list).

2. Limit access to the Web to only authorized users

You can configure your firewall (IPTABLES, CISCO PIX etc.) to accept HTTP connections only from the Squid server and no one else. Squid can then be configured using access control lists (ACLs) to allow only certain departments, subnets or specific hosts to access the Internet. SQUID can also block specific websites and allow Internet access only during specific hours of the day.
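As mentioned under reason 1, the first fetch of a page is logged in the access log (/var/log/squid/access.log) as a TCP_MISS and a repeat fetch as a TCP_HIT. Illustrative entries in SQUID's native log format (timestamps, byte counts and IPs here are made up) look like:

1370361000.123    152 192.168.2.20 TCP_MISS/200 5120 GET http://www.example.com/ - DIRECT/93.184.216.34 text/html
1370361005.456      2 192.168.2.20 TCP_HIT/200 5120 GET http://www.example.com/ - NONE/- text/html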

In one of my previous posts, I discussed squidclient and the SQUID proxy logs. Now, before starting with the various SQUID rules (ACLs – Access Control Lists), I would like to remind you that SQUID listens on TCP port 3128 by default. Most companies' business rules will ask you to change this default SQUID port to some other port for security purposes. You can change the SQUID port to, say, 8080 as follows:

vi /etc/squid/squid.conf
{Go to the section “Network Options” by searching for this string}
http_port 8080    (changed from the default 3128)
:wq!

You will also see an https_port directive. https_port allows SQUID to act as an accelerator, essentially a middleman between the client and a server, providing HTTPS access.

Client => Squid => HTTPS Web Server

Here, the client makes a request to SQUID, which terminates the SSL session and then connects to the backend server, thus improving the performance of the SSL connection.
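A minimal https_port line for such an accelerator setup might look like the following sketch. This assumes SQUID was built with SSL support; the certificate/key paths and site name are placeholders, and exact options vary across SQUID versions:

https_port 443 accel cert=/etc/squid/example-cert.pem key=/etc/squid/example-key.pem defaultsite=www.example.com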

Reload the SQUID service so the new port 8080 takes effect:

/sbin/service squid reload

(Reloading the SQUID service won’t disconnect or stop existing SQUID sessions, while restarting the service will stop and start it, thereby interrupting existing sessions/connections.)

You can verify the SQUID port using the netstat command:

netstat -ntlp
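You should see a listening socket on the new port, similar to the following (the PID and exact program name will vary):

tcp        0      0 0.0.0.0:8080            0.0.0.0:*               LISTEN      2345/(squid)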

Also, make sure you update the proxy port to 8080 in your web browser so you can access websites.

Access Control Lists (ACLs):

Squid matches each Web access request it receives by checking the http_access list from top to bottom. If it finds a match, it enforces the allow or deny statement and stops reading further. You have to be careful not to place a deny statement in the list that blocks a similar allow statement below it. The final http_access statement denies everything, so it is best to place new http_access statements above it.
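For example, a correctly ordered excerpt might look like this sketch (our_networks stands for an ACL you have defined, as shown in the syntax section below; all is the customary match-everything ACL from the default configuration):

http_access allow our_networks
http_access deny all

If the deny all line came first, the allow below it would never be reached.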

ACL Syntax:

Defining an ACL

acl {acl_name} acl_type (src/dstdomain/srcdomain/time etc.) decision_string

eg: acl our_networks src 192.168.2.0/24 192.168.5.0/24

Applying rules to ACL names

http_access allow/deny acl_name

eg: http_access allow our_networks

There are pre-defined ACLs that SQUID uses to determine whether or not clients are able to make connections to certain outside ports.

eg: vi /etc/squid/squid.conf
acl Safe_ports port 80
acl Safe_ports port 21
acl Safe_ports port 443

These are safe ports to which SQUID is allowed to connect. The rule that allows them is as follows:

http_access deny !Safe_ports

(Note the use of negation: this rule denies any request whose destination port is not in Safe_ports, which effectively allows connections only to the safe ports.)
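For reference, the stock squid.conf typically pairs this with a similar negated rule for the CONNECT method (used to tunnel HTTPS); a typical default excerpt looks like:

acl SSL_ports port 443
acl CONNECT method CONNECT
http_access deny !Safe_ports
http_access deny CONNECT !SSL_ports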

If, on your network, you use a UNIX shell to download and configure various applications, you will have to direct that access through SQUID so you can download the contents via the SQUID server, thereby saving on bandwidth and the time needed to pull contents from the Internet. In order to do this, you need to set the environment variable “http_proxy” to point to the SQUID server as follows:

export http_proxy=192.168.2.50:3128

(If the port was changed to anything else, specify that port instead of the default 3128.)

Now, if you download via wget or curl, you will see that it uses the proxy server 192.168.2.50 to fetch the contents.
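For example, with http_proxy exported as above (the URL here is just a placeholder):

wget http://www.example.com/index.html
curl http://www.example.com/
curl -x 192.168.2.50:3128 http://www.example.com/

Both wget and curl honor the lowercase http_proxy variable; curl can also be pointed at the proxy explicitly with -x, as shown on the last line.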

DENY Internet ACCESS to specific hosts:

In the SQUID configuration file (/etc/squid/squid.conf), include the following to deny access to two hosts on your network:

acl bad_hosts src 192.168.2.10 192.168.2.15
http_access deny bad_hosts

You can verify or confirm this denial via access.log (/var/log/squid/access.log).
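A denied request shows up in access.log with a TCP_DENIED/403 status; an illustrative entry (timestamps and byte counts will differ) looks like:

1370361100.789      0 192.168.2.10 TCP_DENIED/403 1363 GET http://www.example.com/ - NONE/- text/html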

ACL Lists:

There are mainly two methods you can use to create ACL lists:

1. By repeating ACL names

acl bad_hosts src 192.168.2.10
acl bad_hosts src 192.168.2.15
http_access deny bad_hosts

2. By creating a file and including all the host IPs in the file

acl bad_hosts src "/etc/squid/bad_host_file.txt"
http_access deny bad_hosts

In bad_host_file.txt, add each IP on its own line.
Note: Make sure the file “bad_host_file.txt” is readable by the SQUID user; the file can be owned by any user, though.
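For the two hosts from the earlier example, the file would simply contain:

vi /etc/squid/bad_host_file.txt
192.168.2.10
192.168.2.15
:wq!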

Run the command “service squid reload” to reload the squid configuration file without breaking the existing connections/sessions.

ACLs Based on TIME:

Syntax: acl {acl_name} time {SMTWHFAD} hh:mm-hh:mm

Days of Week is represented as follows:

S – Sunday
M – Monday
T – Tuesday
W – Wednesday
H – Thursday
F – Friday
A – Saturday
D – Weekdays (Monday to Friday)

Hours and Minutes are represented as: hh:mm-hh:mm

eg:
acl break time 12:00-13:00
acl work_hours time D 09:00-17:00

eg 1: Deny Internet access during work hours
acl work_hours time 09:00-17:30
http_access deny work_hours

eg 2: Allow specific hosts (e.g. admins) Internet access at all times (place this allow rule above any deny rules so it matches first)
acl admins src 192.168.2.10 192.168.2.15
http_access allow admins

eg 3: Deny/disable Internet access from 9am to 7pm on Monday, Wednesday, Thursday and Friday
acl work_hours time MWHF 09:00-19:00
http_access deny work_hours

ACLs Based on Specific Destination Domains:

You can deny access to destination domains (dstdomain) or source domains (srcdomain) as follows:

eg 1: Using repeated ACL names:
acl bad_sites dstdomain .facebook.com
acl bad_sites dstdomain .orkut.com
acl bad_sites dstdomain .games.com
http_access deny bad_sites

(Note that a given ACL name must keep a single type; to match on source domains instead, define a separate ACL of type srcdomain.)

eg 2: Using a text file:
acl bad_sites dstdomain "/etc/squid/bad_sites_file.txt"
http_access deny bad_sites

vi /etc/squid/bad_sites_file.txt
.facebook.com
.orkut.com
:wq!

Make sure you have a period ‘.’ before the domain names so all subdomains match the domain names. If you don’t put the period before the domain name, then http://www.facebook.com will be allowed and only facebook.com will be blocked by the above rule. So, keep this in mind when constructing rules.

Combining SQUID ACLs:

You can build separate rules, combine them and then apply tags to those combined rules to create ACLs. It’s done as follows:

acl work_hours time MTWHF 08:00-17:00
acl bad_sites dstdomain "/etc/squid/bad_site_file.txt"
http_access deny work_hours bad_sites

Here we are ANDing, or combining, work_hours and bad_sites to deny access to all domains in the file “bad_site_file.txt” from Monday through Friday between 8am and 5pm. (All ACL names on a single http_access line must match for the rule to apply.)

eg: No casual browsing during work hours on weekdays between 8am and 5pm from subnet 192.168.2.0/24, but permit access to work-related websites like wikipedia.org

acl work_site dstdomain .wikipedia.org
http_access allow work_site
acl employees src 192.168.2.0/24
acl work_hours time MTWHF 08:00-17:00
http_access deny employees work_hours

eg: Deny browsing of sites with keyword ‘sex’

acl bad_keyword url_regex -i sex
http_access deny bad_keyword

You can use a file to store more bad keywords, line by line, and block those websites as follows:

acl bad_keyword url_regex -i "/etc/squid/bad_keyword_file.txt"
http_access deny bad_keyword

eg: Deny downloads of prohibited extensions like .exe, .vbs etc.

acl bad_extensions url_regex .*\.exe$
http_access deny bad_extensions

.* -> matches anything (any and all characters)
\ -> escapes the ‘.’ that follows
$ -> anchors the match at the end, so the URL must end in .exe
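Following the same file-based pattern used earlier, multiple extensions can be blocked from a single file (the file name here is hypothetical):

acl bad_extensions url_regex -i "/etc/squid/bad_ext_file.txt"
http_access deny bad_extensions

vi /etc/squid/bad_ext_file.txt
\.exe$
\.vbs$
:wq!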

eg: Block outbound access to certain TLDs like .jp, .cn, .ru etc.

acl bad_tlds dstdom_regex \.cn$
http_access deny bad_tlds

You can block multiple TLDs using a file as follows:

acl bad_tlds dstdom_regex "/etc/squid/bad_tld_file.txt"
http_access deny bad_tlds

vi /etc/squid/bad_tld_file.txt
\.cn$
\.jp$
\.ru$
:wq!

Construct ACL to configure SQUID as a NON-Caching Proxy Server:

Sometimes you might want to construct specific rules so requests from a particular subnet or a specific host don’t get cached on the SQUID server. This can also be implemented for specific destination domains, so access to those websites won’t be cached and all the contents will be pulled from the Internet on each request.

NOTE: Non-caching rules should be set up before any other rules.

The following rule sets SQUID as a non-caching proxy server:

acl non_caching_hosts src 0.0.0.0/0.0.0.0    # {this can be set as 0/0 as well}
no_cache deny non_caching_hosts
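(Note: on SQUID 2.6 and later, the no_cache directive was renamed to simply cache, so the equivalent rule would be cache deny non_caching_hosts.)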

The following rule disables caching for specific websites:

acl block_caching_sites dstdomain .hotmail.com
no_cache deny block_caching_sites

Or using a file as follows:

acl block_caching_sites url_regex -i "/etc/squid/no_cache_file.txt"
no_cache deny block_caching_sites

You can also disable caching of dynamic web pages like .php, .pl, .asp, .jsp etc. as follows:

acl no_dynamic_sites url_regex -i "/etc/squid/dynamic.txt"
no_cache deny no_dynamic_sites

vi /etc/squid/dynamic.txt
\.php$
\.pl$
\.asp$
\.jsp$
:wq!

Also, you can construct a rule that skips caching for specific hosts while caching for everyone else, as follows:

acl no_cache_admins src 192.168.2.10 192.168.2.15
no_cache deny no_cache_admins

The above rule applies only to the admin computers and denies caching for them, so the admins always pull contents from the Internet.

Don’t forget to RELOAD SQUID whenever you apply rules 🙂

That’s it.

Note: I found this URL very helpful: http://wiki.squid-cache.org/SquidFaq/SquidAcl
