To less experienced Squid administrators the concept of ACLs can be confusing at first. But they offer a powerful way of controlling who is allowed to access which web pages, and when.
ACLs
First you define criteria such as "accesses from the marketing department", "accesses to google.com" or "the user needs to authenticate". There is an ACL type for each of these purposes. The complete list of ACL types can be found at http://www.visolve.com/squid/squid24s1/access_controls.php
The syntax of an ACL is:
acl name type definition1 definition2 definition3 ...
Examples:
acl accesses_to_google dstdomain .google.com
acl accesses_to_search_engines dstdomain .yahoo.com .google.com .vivisimo.com
acl accesses_from_marketing_department src 10.52.0.0/16
acl need_to_authenticate proxy_auth REQUIRED
You can also use lists of definitions that are stored in files on your hard disk. Let's assume you have a list of search engine URLs that you want to allow:
/etc/squid/search-engines-urls.txt:
.google.com
.yahoo.com
.altavista.com
.vivisimo.com
Then the ACL for that file would look like:
acl accesses_to_search_engines dstdomain "/etc/squid/search-engines-urls.txt"
The quotes are important here to tell Squid it needs to look up definitions in that file.
Using the ACLs: http_access
Defining the ACLs alone does not actually block anything – it's just a definition. ACLs can be used in various places in your squid.conf. The most useful one is the http_access statement. It works similarly to the way a firewall handles rules: for each request that Squid receives it looks through all the http_access statements in order until it finds a line that matches. It then either allows or denies the request depending on your setting. The remaining rules are ignored.
The general syntax of an http_access line is:
http_access (allow|deny) acl1 acl2 acl3 ...
Example:
http_access allow accesses_from_admins
http_access deny accesses_to_porn_urls
http_access allow accesses_during_lunchtime
http_access deny all
This would allow accesses from the admins (whatever that ACL looks like – probably a src ACL pointing to the subnet the admin workstations are in). For everyone else it denies accesses to porn URLs. Then it allows everyone to access any web site during lunchtime. And finally all other accesses are denied.
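To make the first-match behavior concrete, here is a hypothetical trace through those four rules (assuming the ACLs are defined as their names suggest):

# Request 1: an admin workstation fetches www.example.com
#   allow accesses_from_admins    -> matches: allowed, remaining rules ignored
# Request 2: a non-admin fetches a porn URL during lunchtime
#   allow accesses_from_admins    -> no match, continue
#   deny accesses_to_porn_urls    -> matches: denied, the lunchtime rule is never reached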
Combining ACLs (AND/OR)
Often you need to combine ACLs. Let's say you want to allow access to google.com only for the back office. This combines two ACLs with an AND:
http_access allow accesses_to_google.com accesses_from_back_office
If you wanted an OR instead – either accesses from the back office or accesses to google.com are allowed – then the lines would look like this:
http_access allow accesses_to_google.com
http_access allow accesses_from_back_office
To summarize: AND means putting the conditions on one line. OR means using separate lines.
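Both forms side by side, as a sketch reusing the example ACL names from above:

# AND – all ACLs on one line must match for the rule to apply:
http_access allow accesses_to_google.com accesses_from_back_office
# OR – one line per alternative; the first matching line wins:
http_access allow accesses_to_google.com
http_access allow accesses_from_back_office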
Custom error pages (deny_info)
By default, when you deny access the user gets the error page that is stored in the ERR_ACCESS_DENIED file. Luckily you can define your own custom error pages and display them when you deny certain accesses. A simple example:
acl google dstdomain google.com
deny_info error-google google
http_access deny google
Put an error page into the directory where the HTML files are stored (look for error_directory in your squid.conf) and name it error-google. If the user tries to access www.google.com the access is denied and your error page is shown.
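A minimal error page might look like this sketch (the file name error-google matches the deny_info above; the wording is just an example, and Squid's error page templates can substitute %-codes such as %U for the requested URL):

<!-- error-google, placed in your error_directory -->
<html>
<head><title>Access denied</title></head>
<body>
<h1>Google is blocked on this proxy</h1>
<p>Your request for %U was denied. Contact your administrator if you believe this is wrong.</p>
</body>
</html>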
Be careful when you combine ACLs on an http_access line. Example:
acl google dstdomain google.com
acl admin src 10.0.5.16
deny_info error-google google
http_access deny admin google
This will deny access only for the user at the IP address 10.0.5.16 when www.google.com is accessed. As you can see I have combined the ACLs admin and google. In such a combination the last ACL on the line is the one used for the deny_info lookup. So it's important that you define the deny_info for the google ACL.
Re-Authentication control
Usually once a user is authenticated at the proxy they cannot "log out" and re-authenticate. The user has to close and re-open the browser windows to be able to log in at the proxy again. A simple configuration probably looks like this:
acl my_auth proxy_auth REQUIRED
http_access allow my_auth
http_access deny all
A tricky change was introduced in Squid 2.5.10: it allows you to control when the user is prompted to authenticate. It is now possible to force the user to re-authenticate even though the username and password are still correct. Example configuration:
acl my_auth proxy_auth REQUIRED
acl google dstdomain .google.com
http_access allow my_auth
http_access deny google my_auth
http_access deny all
In this case, if the user requests www.google.com then the second http_access line matches and triggers re-authentication. Remember: it's always the last ACL on an http_access line that "matches". If the matching ACL has to do with authentication, a re-authentication is triggered. If you don't want that, you need to switch the order of the ACLs so that you get http_access deny my_auth google.
You might also run into an authentication loop if you are not careful. Assume that you use LDAP group lookups and want to deny access based on an LDAP group (e.g. only members of a certain LDAP group are allowed to reach certain web sites). In this case you may trigger re-authentication although you don’t intend to. This config is likely wrong for you:
acl ldap-auth proxy_auth REQUIRED
acl ldapgroup-allowed external LDAP_group PROXY_ALLOWED
http_access deny !ldap-auth
http_access deny !ldapgroup-allowed
http_access allow all
The second http_access line would force the user to re-authenticate time and again if he or she is not a member of the PROXY_ALLOWED group. That is probably not what you want – you wanted to deny access to non-members. So you need to rewrite this http_access line so that the last matching ACL has nothing to do with authentication. This is the corrected example:
acl ldap-auth proxy_auth REQUIRED
acl ldapgroup-allowed external LDAP_group PROXY_ALLOWED
acl dummy src 0.0.0.0/0.0.0.0
http_access deny !ldap-auth
http_access deny !ldapgroup-allowed dummy
http_access allow all
This way the second http_access line still matches. But now it's the dummy ACL that comes last on the line. Since dummy is a static ACL (one that always matches) and has nothing to do with authentication, the access is simply denied.
I'm having trouble assigning ACLs to LDAP groups… I have this:
external_acl_type gruposLDAP %LOGIN /usr/lib64/squid/squid_ldap_group -P -R -b OU=USUARIOS,DC=tierradelfuego,DC=gov,DC=ar -D proxyAuth -w password -f (&(objectClass=person)(sAMAccountName=%u)(memberOf=CN=%g,OU=GRUPOS,DC=tierradelfuego,DC=gov,DC=ar)) -h 10.1.9.33 -s sub -v 3 -d
acl ProxyA external gruposLDAP _GP_US_PROXY_A
acl ProxyB external gruposLDAP _GP_US_PROXY_B
acl ad_users proxy_auth REQUIRED
http_access deny !ad_users
http_access deny ProxyB
http_access allow ProxyA
http_access allow all
But no matter whether a user is in ProxyA or ProxyB… it falls into an authentication loop. Can anybody help me?
Step-by-step ACLs. Very easy to follow.
Thanks
Hello, I would like to ask about ACL lists.
I downloaded a definition list from squidblacklist.org. One of the examples I downloaded is squid-torrent.acl, which I renamed to squid-torrent.txt…
This is my squid.conf configuration:
acl squid-torrent dstdomain "/etc/squid/squid-torrent.txt"
http_access deny squid-torrent
After the "service squid restart" command there's an error…
[root@rpidvoproxy squid]# service squid restart
Stopping squid: 2013/05/02 14:49:39| WARNING: '.kicks-ass.org' is a subdomain of '.kicks-ass.org'
2013/05/02 14:49:39| WARNING: because of this '.kicks-ass.org' is ignored to keep splay tree searching predictable
2013/05/02 14:49:39| WARNING: You should probably remove '.kicks-ass.org' from the ACL named 'squid-torrent'
2013/05/02 14:49:39| WARNING: '.podtropolis.com' is a subdomain of '.podtropolis.com'
2013/05/02 14:49:39| WARNING: because of this '.podtropolis.com' is ignored to keep splay tree searching predictable
Help, how do I fix this error? I like using a definition list because it's easier to download a list from the web than to search for every torrent site myself and add it to my blocked-sites list.
Could it be that you are using the same ACL twice? (Two "http_access deny" lines.)
Or does the domain appear twice in the ACL file?
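If you want to check the file quickly, something like this standard shell one-liner should list exact duplicates (a sketch; adjust the path to your ACL file):

# print lines that occur more than once in the ACL file
sort /etc/squid/squid-torrent.txt | uniq -d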
Hello, can you please tell me if this is a wrong declaration of my ACL?
# ACL squid-torrent
#acl squid-torrent src "/etc/squid/squid-torrent.txt"
#acl squid-torrent dst "/etc/squid/squid-torrent.txt"
#acl squid-torrent srcdomain "/etc/squid/squid-torrent.txt"
#acl squid-torrent dstdomain "/etc/squid/squid-torrent.txt"
#acl squid-torrent srcdom_regex -i "/etc/squid/squid-torrent.txt"
#acl squid-torrent dstdom_regex -i "/etc/squid/squid-torrent.txt"
#acl squid-torrent url_regex -i "/etc/squid/squid-torrent.txt"
#acl squid-torrent urlpath_regex -i "/etc/squid/squid-torrent.txt"
# Deny access to squid-torrent
#http_access deny squid-torrent
I found the error – yes, it was a duplicate domain in my squid-torrent.txt ACL file.
Thanks for your help.
hello,
Declaring ACLs is very confusing.
After checking and removing the duplicates in my squid-torrent.txt ACL list, "service squid restart" is OK! No errors found.
[root@rpidvoproxy squid]# service squid restart
Stopping squid: ……………. [ OK ]
Starting squid: . [ OK ]
[root@rpidvoproxy squid]#
Then:
after testing my XP client by opening torrent sites like isohunt.com, torrentz.eu, extratorrent.com etc.,
my XP client can still open the said torrent sites –
except the torrent site thepiratebay.org:
The following error was encountered while trying to retrieve the URL: http://www.thepiratebay.org/
Access Denied.
Access control configuration prevents your request from being allowed at this time. Please contact your service provider if you feel this is incorrect.
At least it's good that my proxy blocks thepiratebay.org.
But when I check my squid-torrent.txt content list (nano /etc/squid/squid-torrent.txt) those torrent sites are all included. How come isohunt.com, torrentz.eu etc. are not denied by my proxy?
Help me out with blocking https://facebook.com – I tried using:
acl fb dstdomain .facebook.com
http_access deny fb
🙁
Doesn't look too bad. But you'd better show us your relevant ACLs and http_access statements.
I found a blacklist ACL you might also find useful; figured I'd share it.
http://www.squidblacklist.org/downloads/squid-facebook.acl
About the solution to the squid3 error "ERROR: '.sub.example.com' is a subdomain of '.example.com'":
I found the solution here:
http://www.maravento.com/p/blacklist.html
Hey… I have a task to configure an external ACL… can you help me please? I don't understand it.
Thanks for this info, it is of great help. I recently installed Squid on my Windows OS. I created several ACL rules and they worked fine. However, I want to filter the organization more extensively. I created several groups defined by their MAC addresses: level1 can access all sites, level2 can access all sites except YouTube and Facebook, and level3 can access only some sites defined in an ACL while everything not defined is denied. I discovered that if someone changes their IP address manually, they can bypass the web filtering. I need help on this one. Thanks in advance.
I want to select among cache_peers on the basis of the user agent. How do I do that?
Can you please help?
I have figured out how to remove the Referer header from requests going to a destination, as follows:
acl referer_allowsrc dstdomain google.com
header_access Referer deny referer_allowsrc
Is it possible to deny this selectively based on the site originating the referrer? I don't want to do it for all sites using the proxy.
Example: the referrer should be stripped for xyz.abc.com and 123.abc.com referring to google.com, but not for other sites like pqr.abc.com etc.
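One way this might work is to put a referer_regex ACL on the same line as the destination ACL, so the header is only stripped when both match. An untested sketch using the example host names above (note that in Squid 3.x the directive is called request_header_access):

acl to_google dstdomain google.com
acl from_restricted referer_regex -i ^https?://(xyz|123)\.abc\.com/
# strip Referer only for requests to google.com coming from xyz.abc.com or 123.abc.com
header_access Referer deny to_google from_restricted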
Very nice article.
Thanks!
Another great tutorial from workaround!
Thanks for this great tutorial. I managed to set up my basic ACL rules and http_access after reading it. I have one question regarding the use of reply_body_max_size. I'm using Squid 3.3.8 and I'm trying to set download limits for my special_group and officers_group in my configuration file below…
……….
………
acl localnet src 192.168.0.0/24
acl special_group src 192.168.0.1
……….
……….
reply_body_max_size 10 MB localnet
reply_body_max_size 300 MB special_group
……..
…….
The above always takes the first match as the download limit and ignores the second. I appreciate your time and assistance.
thank you
I'm using ACLs to block access to a list of sites by IP address. I'm doing it this way because the list I want to use is an IP reputation list that I have access to via a subscription, but they don't provide a domain-based list. What I've found is that our clients can't access the sites on the list if they try to access them by IP address, but if they access a domain name that resolves to one of those IP addresses, Squid lets it pass. Does anyone know a way to resolve this and make it block outright?
Thanks!
I am having a problem with my Squid proxy server. I want to allow YouTube for my localnet2, but somehow it does not work. Please help me.
#
# Recommended minimum configuration:
#
# Example rule allowing access from your local networks.
# Adapt to list your (internal) IP networks from where browsing
# should be allowed
#acl localnet src 10.0.0.0/8 # RFC1918 possible internal network
acl localnet src 172.16.0.0/12 # RFC1918 possible internal network
#acl localnet src 192.168.0.0/16 # RFC1918 possible internal network
acl localnet src fc00::/7 # RFC 4193 local private network range
acl localnet src fe80::/10 # RFC 4291 link-local (directly plugged) machines
acl SSL_ports port 443
acl Safe_ports port 80 # http
acl Safe_ports port 21 # ftp
acl Safe_ports port 443 # https
acl Safe_ports port 70 # gopher
acl Safe_ports port 210 # wais
acl Safe_ports port 1025-65535 # unregistered ports
acl Safe_ports port 280 # http-mgmt
acl Safe_ports port 488 # gss-http
acl Safe_ports port 591 # filemaker
acl Safe_ports port 777 # multiling http
acl CONNECT method CONNECT
#
# Recommended minimum Access Permission configuration:
#
# Deny requests to certain unsafe ports
http_access deny !Safe_ports
acl nightofftime1 time SMTWHFA 22:31-23:59
acl nightofftime2 time SMTWHFA 1:00-4:59
http_access deny nightofftime1 all
http_access deny nightofftime2 all
# ACL Local Sub Network
acl localnet1 src 172.16.150.0/24 # internal network for Staff
acl localnet2 src 172.16.156.0/24 # internal network for Grade 10-12 Students
acl localnet3 src 172.16.155.0/24 # internal network for Grade 3-9 Students
# ACL Domain Name ---------
acl YTDomain dstdomain .youtube.com
acl FBDomain dstdomain .facebook.com
# ACL Time -----------------
acl internettime time SMTWHFA 7:00-22:29
acl facebooktime1 time SMTWHFA 15:30-16:30
acl facebooktime2 time SMTWHFA 21:15-22:30
acl restricted_domain_file dstdomain "/etc/squid/restricted_domain.txt"
http_access deny restricted_domain_file
acl ban_keywords url_regex "/etc/squid/ban_keywords.txt"
http_access deny ban_keywords
# Restrict navigation by IP
#acl ip_restriction dstdom_regex ^(([0-9]+\.[0-9]+\.[0-9]+\.[0-9]+)|(\[([0-9af]+)?:([0-9af:]+)?:([0-9af]+)?\]))
#http_access deny ip_restriction
#Allow Youtube localnet2
http_access allow YTDomain localnet2
acl blockkeyword1 url_regex porn
http_access deny blockkeyword1
# Deny CONNECT to other than secure SSL ports
http_access deny CONNECT !SSL_ports
# Only allow cachemgr access from localhost
http_access allow localhost manager
http_access deny manager
# We strongly recommend the following be uncommented to protect innocent
# web applications running on the proxy server who think the only
# one who can access services on "localhost" is a local user
#http_access deny to_localhost
#
# INSERT YOUR OWN RULE(S) HERE TO ALLOW ACCESS FROM YOUR CLIENTS
#
# Allow Facebook
http_access allow FBDomain facebooktime1
http_access allow FBDomain facebooktime2
http_access deny FBDomain
http_access deny YTDomain
# Example rule allowing access from your local networks.
# Adapt localnet in the ACL section to list your (internal) IP networks
# from where browsing should be allowed
http_access allow localnet
http_access allow localhost
# And finally deny all other access to this proxy
http_access deny all
# Squid normally listens to port 3128
http_port 3128
# Uncomment and adjust the following to add a disk cache directory.
#cache_dir ufs /var/spool/squid 100 16 256
# Leave coredumps in the first cache dir
coredump_dir /var/spool/squid
#
# Add any of your own refresh_pattern entries above these.
#
refresh_pattern ^ftp: 1440 20% 10080
refresh_pattern ^gopher: 1440 0% 1440
refresh_pattern -i (/cgi-bin/|\?) 0 0% 0
refresh_pattern . 0 20% 4320
Allowing *.youtube.com is not enough. Static images are loaded from ytimg.com, and the actual videos are loaded from a Google CDN (content delivery network). Check the access log to see which further accesses are blocked. Or try to leverage the "acl aclname referer_regex" type to allow all requests that are made from *.youtube.com web pages.
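For example, something along these lines might work (a sketch – the exact CDN host names are assumptions and may change over time):

# YouTube pages plus the hosts images and videos are typically served from
acl YTCDN dstdomain .youtube.com .ytimg.com .googlevideo.com
# alternatively, allow anything requested from a youtube.com page
acl YTReferer referer_regex -i ^https?://([^/]+\.)?youtube\.com/
http_access allow YTCDN localnet2
http_access allow YTReferer localnet2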
Dear All,
I am unable to authenticate LDAP groups against Windows AD; I am getting the error below:
— Unit squid.service has begun starting up.
Mar 13 19:12:13 nxtdevlnx09 squid[10383]: ERROR: Invalid ACL: acl ldap-auth proxy_auth REQUIRED
Mar 13 19:12:13 nxtdevlnx09 systemd[1]: squid.service: control process exited, code=exited status=1
Mar 13 19:12:13 nxtdevlnx09 systemd[1]: Failed to start Squid caching proxy.
— Subject: Unit squid.service has failed
— Defined-By: systemd
— Support: http://lists.freedesktop.org/mailman/listinfo/systemd-devel
—
— Unit squid.service has failed.
—
— The result is failed.
Mar 13 19:12:13 nxtdevlnx09 systemd[1]: Unit squid.service entered failed state.
Mar 13 19:12:13 nxtdevlnx09 systemd[1]: squid.service failed.
Mar 13 19:12:13 nxtdevlnx09 cache_swap.sh[10377]: init_cache_dir /cache_dir…
Mar 13 19:12:14 nxtdevlnx09 polkitd[816]: Unregistered Authentication Agent for unix-process:10362:619896 (system bus name :1.125, object path /org/freedesktop/PolicyKit1/AuthenticationAgent, local
Please find the squid.conf configuration below:
#
# Recommended minimum configuration:
#
# Adapt to list your (internal) IP networks from where browsing
# should be allowed
#acl localnet src 10.0.0.0/8 # RFC1918 possible internal network
acl localnet src 172.16.0.0/12 # RFC1918 possible internal network
acl localnet src 192.168.0.0/16 # RFC1918 possible internal network
acl localnet src fc00::/7 # RFC 4193 local private network range
acl localnet src fe80::/10 # RFC 4291 link-local (directly plugged) machines
acl SSL_ports port 443
acl Safe_ports port 80 # http
acl Safe_ports port 21 # ftp
acl Safe_ports port 443 # https
acl Safe_ports port 70 # gopher
acl Safe_ports port 210 # wais
acl Safe_ports port 1025-65535 # unregistered ports
acl Safe_ports port 280 # http-mgmt
acl Safe_ports port 488 # gss-http
acl Safe_ports port 591 # filemaker
acl Safe_ports port 777 # multiling http
acl CONNECT method CONNECT
#acl Dangerous_ports port 7 9 19 22 23 25 53 109 110 119
#http_access deny Dangerous_ports
acl blockExtensions rep_mime_type -i "/etc/squid/extension.acl"
http_reply_access deny blockExtensions
#cache_peer chennai-proxy parent 8080 0 no-query no-digest default
#cache_peer mumbai-proxy parent 8080 0 no-query no-digest
#never_direct allow all
acl blockfiles urlpath_regex "/etc/squid/blocks.files.acl"
deny_info ERR_BLOCKED_FILES blockfiles
http_access deny blockfiles
# Recommended minimum Access Permission configuration:
#
# Deny requests to certain unsafe ports
http_access deny !Safe_ports
# Deny CONNECT to other than secure SSL ports
http_access deny CONNECT !SSL_ports
#String to block keywords in websites
acl blockkeyword1 url_regex yahoo
acl blockkeyword2 url_regex gmail
acl blockkeyword3 url_regex orkut
http_access deny blockkeyword1
http_access deny blockkeyword2
http_access deny blockkeyword3
# Only allow cachemgr access from localhost
http_access allow localhost manager
http_access deny manager
# We strongly recommend the following be uncommented to protect innocent
# web applications running on the proxy server who think the only
# one who can access services on "localhost" is a local user
#http_access deny to_localhost
#
# INSERT YOUR OWN RULE(S) HERE TO ALLOW ACCESS FROM YOUR CLIENTS
#
# Example rule allowing access from your local networks.
# Adapt localnet in the ACL section to list your (internal) IP networks
# from where browsing should be allowed
external_acl_type ldapgroup %LOGIN /usr/lib64/squid/squid_ldap_group -b dc=psldev,dc=com -f (&(objectclass=person)(cn=%v)(groupMembership=cn=%a,ou=Users,dc=psldev,dc=com)) -D cn=squid,ou=Users,dc=psldev,dc=com -w timspassword -h ldapserver
auth_param basic children 50
auth_param basic realm Web-Proxy
auth_param basic credentialsttl 1 minute
#
acl ldap-auth proxy_auth REQUIRED
acl ldapgroup-allowed external LDAP_group PROXY_ALLOWED
acl dummy src 0.0.0.0/0.0.0.0
http_access deny !ldap-auth
http_access deny !ldapgroup-allowed dummy
http_access allow all
#-------------------------------------------------------------------------------------------------
### Allow authenticated users
#-------------------------------------------------------------------------------------------------
acl ban_domains dstdomain .facebook.com .youtube.com
http_access allow my_auth
http_access deny ban_domains
http_access allow localhost
http_access allow localnet
#acl FTP proto FTP
#always_direct allow FTP
# And finally deny all other access to this proxy
http_access deny all
# Squid normally listens to port 3128
http_port 8080
# Uncomment and adjust the following to add a disk cache directory.
#cache_dir ufs /var/spool/squid 100 16 256
cache_dir ext4 /cache_dir 100 16 256
# Leave coredumps in the first cache dir
#coredump_dir /var/spool/squid
coredump_dir /cache_dir
# Add any of your own refresh_pattern entries above these.
#
refresh_pattern ^ftp: 1440 20% 10080
refresh_pattern ^gopher: 1440 0% 1440
refresh_pattern -i (/cgi-bin/|\?) 0 0% 0
refresh_pattern . 0 20% 4320
cache_effective_user squid
cache_effective_group squid
#dns_testnames 8.8.8.8
Please help me, I am new to Squid and Linux.
I have fetched the name field from an external ACL, and I have 10 ACLs with different name fields. How does Squid know which ACL to use?
Example: I have fetched xyz from the external ACL and I have 3 ACLs named abc, xyz and qaz. How can I make the link between the external ACL and these ACLs so Squid knows which one to use?
Quote: “For each request that Squid receives it will look through all the http_access statements in order until it finds a line that matches. It then either accepts or denys depending on your setting. The remaining rules are ignored.”
Also if you want to combine ACLs in an “AND” or “OR” fashion see the section above about it.
Does that explain it?
I want to separate out the destination URLs based on source IPs. Below is my configuration.
It's not working as expected: I am able to access whitelist.txt URLs from IPs in 10.0.8.0/24.
acl allowed_http_sites dstdomain "/etc/squid/whitelist.txt"
acl allowed_http_sites dstdomain "/etc/squid/whitelist2.txt"
http_access allow allowed_http_sites
acl "/etc/squid/whitelist.txt" src 10.0.1.8/32
acl "/etc/squid/whitelist.txt" src 10.0.2.9/32
acl "/etc/squid/whitelist.txt" src 10.0.3.10/32
acl "/etc/squid/whitelist.txt" src 10.0.4.11/32
acl "/etc/squid/whitelist.txt" src 10.0.5.12/32
acl "/etc/squid/whitelist.txt" src 10.0.6.13/32
acl "/etc/squid/whitelist.txt" src 10.0.7.14/32
acl "/etc/squid/whitelist2.txt" src 10.0.8.0/24
Can someone help me verify this and let me know what's wrong with the config?
Hi Arpit. You are doing it wrong. 🙂
acl allowed_http_sites dstdomain "/etc/squid/whitelist.txt"
acl allowed_src_ips src "/etc/squid/source-ips.txt"
http_access allow allowed_http_sites allowed_src_ips
That's an example of how it would work. You define two ACLs, and you allow access only if both ACLs match. The whitelist.txt file would have one "dstdomain"-style entry per line, and the source-ips.txt file would have one "src"-style entry per line (e.g. 10.0.1.8).
Does that make sense?
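For illustration, the two files might look like this (hypothetical entries):

# /etc/squid/whitelist.txt – one domain per line
.example.com
.google.com

# /etc/squid/source-ips.txt – one address or subnet per line
10.0.1.8
10.0.2.9
10.0.8.0/24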
Thanks Christoph for the correction and help. We implemented it and it worked.
Since all our whitelisted URLs are on HTTPS domains, the same configuration doesn't work for HTTPS. Below are our ACLs for HTTPS:
acl allowed_https_sites ssl::server_name "/etc/squid/whitelist.txt"
acl allowed_https_sites ssl::server_name "/etc/squid/whitelist2.txt"
Can you please suggest how we can enable the same config for HTTPS domains?
Thanks in Advance,
Arpit Gupta
Below is my full config for HTTPS:
# Handling HTTPS requests
https_port 3130 cert=/etc/squid/ssl/squid.pem ssl-bump intercept
acl SSL_port port 443
http_access allow SSL_port
acl allowed_https_sites ssl::server_name "/etc/squid/whitelist1.txt"
acl allowed_https_sites ssl::server_name "/etc/squid/whitelist2.txt"
acl allowed_broken_trusted_sites ssl::server_name "/etc/squid/broken_trusted_sites.txt"
acl BrokenButTrustedServers dstdomain "/etc/squid/broken_trusted_sites.txt"
sslproxy_cert_error allow BrokenButTrustedServers
sslproxy_cert_error deny all
acl step1 at_step SslBump1
acl step2 at_step SslBump2
acl step3 at_step SslBump3
ssl_bump peek step1 all
ssl_bump peek step2 allowed_https_sites
ssl_bump splice step3 allowed_https_sites
ssl_bump terminate step2 all
Hi Christoph Haas,
Can we define more than two ACLs on one http_access line?
For Example :
acl ListIP1 src "/etc/squid/ListIP1.txt"
acl ListIP2 src "/etc/squid/ListIP2.txt"
acl ListIP3 src "/etc/squid/ListIP3.txt"
acl WebList1 dstdomain "/etc/squid/WebList1.txt"
acl WebList2 dstdomain "/etc/squid/WebList2.txt"
acl WebList3 dstdomain "/etc/squid/WebList3.txt"
acl WebList3 dstdomain "/etc/squid/WebList4.txt"
http_access allow ListIP1 WebList1 Weblist2
http_access allow ListIP2 WebList1 Weblist3
http_access allow ListIP3 WebList1 Weblist2 Weblist3
http_access deny all
#Will it work?