Providing a Whitelisted Wireless Hotspot?
Ploxis writes "I volunteer some of my day managing a small network (and a ragtag band of computers) for a local nonprofit. I have been asked to set up a second, open, independent wireless network on site that will provide cost-free broadband Internet access to patrons. The catch is that they want to provide access only to a select group of about 25 websites while disallowing everything else. No objectionable sites, no mundane but non-relevant sites such as online banking or YouTube, and no other activities such as P2P or IM. They only want HTTP and HTTPS activity from a set of whitelisted websites." For the rest of Ploxis's question and his initial thoughts on making this happen, read on below.
"They'd also like any non-whitelisted URL to be redirected to a 'splash page,' which would just be some HTML providing a list of allowed sites by category. I'd host this page internally on the network.
Their primary concerns are liability for access of illegal/objectionable materials and conserving their bandwidth, while still providing access to specific relevant tools online.
My initial thought was simply an open wireless router, a set of remarkably restrictive firewall rules, and an in-house server as a custom DNS ... but that's pretty shaky (i.e. anyone specifying their own DNS can still get at whatever they want). I assume they'll need a router with some pretty significant traffic management capabilities as well, but that's not something I've investigated before.
Anyone's experiences, recommendations, case studies, or maps of similar networks would be greatly appreciated."
Get an old machine put Linux on it... (Score:2, Informative)
Re: (Score:1)
There's no need for domains. The firewall just has to forward any non-approved IPs to their "non-approved hosts" page. Easy as pie, and (while non-trivial) not a lot of work.
Squid (Score:5, Informative)
Configure a linux box as a router, put squid on it, set up your whitelist, and you're all set.
Re:Squid (Score:5, Informative)
I should also add there's some iptables stuff involved too, but if you know the terms "squid" and "transparent proxy", Google will give you plenty of pages telling you how to set it up.
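For the curious, the whitelist side of that might look something like the fragment below. This is a sketch only: the domain names, splash-page URL, and port flag are placeholders to adapt, not a drop-in config.

```
# squid.conf fragment -- accept intercepted traffic from the hotspot.
http_port 3128 transparent
# Whitelist: a leading dot matches the domain and all its subdomains.
acl goodsites dstdomain .example.org .example.net
http_access allow goodsites
# Send everything else to the locally hosted splash page instead of an error.
deny_info http://10.0.0.1/splash.html all
http_access deny all
```

The deny_info trick is what gets you the "splash page" behavior the submitter asked for, rather than squid's stock access-denied error.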
Re: (Score:2)
I concur.
This is the way I operate our lab connection (allow connections only to equipment vendor sites for software).
-nB
Re: (Score:3, Informative)
Just a thought, annihilate at will.
Re: (Score:3, Interesting)
Sure, that looks like a better solution, but squid on a Linux router is easier and "good enough".
My caveat is that we have a strict usage policy and if you are caught circumventing my "good enough" solution you are not going to like the written warning. If you want general internet access you are expected to use your notebook and WiFi connection, and not connect to my lab network.
-nB
Re: (Score:3, Informative)
Squid was my first thought as well, configure it as a transparent proxy and redirect all non-allowed traffic to the splash page. Combine that with firewall rules that block all non-DNS and non-HTTP traffic.
/Mikael
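For what it's worth, the firewall side of that could be sketched roughly as below. The interface name, squid port, and the assumption that DNS is answered locally are all placeholders; adapt to your layout.

```shell
# Intercept client HTTP and hand it to the transparent squid instance.
iptables -t nat -A PREROUTING -i wlan0 -p tcp --dport 80 -j REDIRECT --to-ports 3128
# Don't forward DNS anywhere: clients must use the local resolver.
iptables -A FORWARD -i wlan0 -p udp --dport 53 -j DROP
iptables -A FORWARD -i wlan0 -p tcp --dport 53 -j DROP
# Default deny: nothing else gets forwarded off the hotspot segment.
iptables -A FORWARD -i wlan0 -j DROP
```

Order matters here: the REDIRECT happens in the nat table before the FORWARD drops are consulted, so intercepted HTTP still reaches squid.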
Re: (Score:2, Interesting)
Since there can be only one HTTPS site per IP address, that's not a problem. If one of the sites is an HTTPS site, just allow it in the firewall. It's on a different port, so the transparent proxy isn't going to see the connection. Make sure that the address in the firewall rule is kept up to date.
(Yes, I know there is a TLS extension which allows multiple sites to share an IP address, but since that is not universally implemented, no HTTPS site owner uses it, as it would break too many clients.)
Re: (Score:3, Informative)
pfSense has got this built in.
Install it on an old Pentium 266-400 or so, with 256MB RAM, if you can, and check the captive portal section.
Set your client WAP up on a NIC by itself, and you can configure captive portal on that interface. Ensure your login page has no login options...just a "You can't go here" type of thing.
Then, set up your allowed sites in the captive portal whitelist.
Problem solved, and you've stopped another machine from ending up in a landfill.
pfSense (Score:3, Informative)
Sounds like something that pfSense [pfsense.org] might be able to do, between squid and maybe the captive portal.
Re: (Score:3, Informative)
Not even that complex. I wrote a little tutorial, here [bfccomputing.com] - just invert the meaning of the block rule and add a default deny.
mod_proxy (Score:5, Informative)
mod_proxy, mod_rewrite
Your friends at Apache have most of the work done for you. All you have to do is slap it together and write some custom rules.
Use Linux as a firewall to make sure that all HTTP/HTTPS traffic gets redirected through the proxy.
If the hostname in the URL doesn't match what's in your rewrite rules (i.e., allowed to pass through), then rewrite it to your custom splash page.
No need for wacky DNS tricks here.
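As a sketch, those rewrite rules could look something like this. Hostnames and the splash address are placeholders, and each whitelisted host gets its own negated RewriteCond (the conditions AND together by default):

```apache
RewriteEngine On
# Pass through requests for whitelisted hosts; everything else goes to the splash page.
RewriteCond %{HTTP_HOST} !^(www\.)?example\.org$ [NC]
RewriteCond %{HTTP_HOST} !^(www\.)?example\.net$ [NC]
RewriteRule ^ http://10.0.0.1/splash.html [R=302,L]
```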
Re: (Score:3, Insightful)
One of the requirements is that this is wireless. So he wants to cut out the random interlopers leeching his bandwidth.
Re:Forget it (Score:5, Insightful)
If it's only 25 sites (and not going to turn into "hundreds") then why play whack-a-mole? Set the default to Deny, look up those 25 IP addresses, and allow only ports 80 and 443 to those sites. That gets you 90% of the way there (the remaining 10% being virtual hosts on the same IPs). The rest of the IPs can be rewritten to a local webserver, which either is dedicated to this purpose or uses name-based virtual hosts to have its own website, with the "default" vhost being the message you're putting up.
Make a simple script to add and remove IPs from the list and reload the rules, write down instructions on "What to do if www.foo.com stops working" or "What to do if you want to add www.baz.com", and you're done.
There are probably dozens of ways to actually implement this. Most of them will involve either custom wireless router firmware, or the wireless router plugged into a "real" router.
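A minimal sketch of that rule-generating script is below. The WHITELIST chain name is an assumption (FORWARD would jump to it), and the resolved addresses would come from something like `dig +short` run over each allowed domain.

```shell
# gen_rules: read one IP address per line on stdin and print the iptables
# commands that would allow web traffic to each, ending with a default drop.
# "WHITELIST" is an assumed chain name that the FORWARD chain jumps to.
gen_rules() {
    while read -r ip; do
        [ -n "$ip" ] || continue
        printf 'iptables -A WHITELIST -d %s -p tcp -m multiport --dports 80,443 -j ACCEPT\n' "$ip"
    done
    echo 'iptables -A WHITELIST -j DROP'
}
```

Usage would be along the lines of `dig +short www.foo.com | gen_rules > rules.sh`, then flush and reload the chain; review the output before running it as root.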
Re:Forget it (Score:4, Informative)
Won't work. The days of IPs meaning anything are long over. You are best off assuming they will change within a week.
These 25 sites could be using round-robin DNS and change their IP every DNS lookup. They could be using some load balancer that plays games with DNS and hops you around the globe depending on their mood. You have no idea how they manage their IP space and you are insane to try :-)
Squid is a much better solution. You can get squid to whitelist by domain.
But seriously, the greater internet nerd contingent needs to get it in their head that the days of IP addresses being useful as any kind of fixed or even temporary identifier are over.
Re: (Score:3, Insightful)
Why in god's name would you statically encode IP addresses when the DNS system is sitting right there to make sure you don't have to do that manual work?
Because that's how firewalls work, in general. Some firewalls will helpfully resolve domain names into IP addresses, but there's no guarantee that the IP addresses that the firewall gets from DNS are the same the client gets, so that is a dead end too.
To do better you need to look into the actual HTTP session. If the poster had a firewall which could do that, he would most likely know, and therefore wouldn't ask the question in the first place.
Re: (Score:3, Insightful)
Keep honest people honest, and only allow a small subset of sites.
Tell Them No (Score:4, Funny)
Tell them no and strike a blow for Net Neutrality!
M
Re: (Score:1)
Tell them no and strike a blow for Net Neutrality!
M
That's not net neutrality; do you even know what that means? They're not running an ISP, they're just trying to provide access to a handful of websites for free.
Re: (Score:1)
You tell him! Strike a blow for humour*!
*Humor for you Americans and your crazy talkin'.
Untangle Pro (Score:2, Informative)
If you really need to make it bullet proof... (Score:1, Interesting)
You need a web proxy and a DNS proxy: The web proxy to restrict the URLs to those which are whitelisted and the DNS proxy to stop "clever" people from tunneling through DNS.
Re: (Score:1)
The only way a "clever" person could fool a transparent proxy would be to corrupt the DNS of the proxy.
Or perhaps you're talking about one of those things that uses DNS as a protocol transport?
query TXT GET.www.slashdot.org.proxy.home.server.org
query TXT 0..proxy.home.server.org
query TXT 1..proxy.home.server.org
2,3,4,5,...,session_end.
Like that?
Most of these wireless AP thingies have a DNS proxy included already, it gets used to redirect people to the AP IP for the usage agreement page.
Re: (Score:2)
You'd need a firewall too, to drop non HTTP traffic. Otherwise a clever user could just use SSH to tunnel their HTTP traffic and even related DNS requests.
Re: (Score:2, Funny)
Absolutely. Another essential ingredient is electricity. And an internet uplink. Who are you? Captain Obvious?
Re: (Score:2)
Use squid with a domain-based whitelist. Set /etc/resolv.conf to "nameserver 127.0.0.1" and use dnsmasq to provide DNS lookup for squid, which has the added benefit of using your /etc/hosts for lookup exclusively. Then set up a script that looks up the IPs of the desired sites hourly from a trusted external DNS server, probably with nslookup (though there may be a more efficient method). Dnsmasq handles your DHCP as well, adding further simplicity to the setup.
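A dnsmasq configuration along those lines might look like the fragment below; the address range is a placeholder, and the exact behavior of names not present in /etc/hosts depends on your dnsmasq version, so test it.

```
# /etc/dnsmasq.conf fragment -- answer DNS from /etc/hosts only, and serve DHCP.
# With no-resolv and no server= lines, dnsmasq never forwards upstream,
# so names outside /etc/hosts simply fail to resolve for clients.
no-resolv
dhcp-range=10.0.0.50,10.0.0.150,12h
```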
tinyproxy (Score:5, Informative)
Instead of squid, use tinyproxy. You're not primarily interested in caching, you're interested in access control. Tinyproxy gives you much finer control of that, and it's also ... well ... tiny.
Just set up a "no proxy" rule for the sites you want them to get to, and redirect everything else to a 404 server.
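In tinyproxy's terms that might look like the fragment below; the file paths are placeholders, and the filter file holds one allowed domain pattern per line (check the directive names against your version's stock config).

```
# tinyproxy.conf fragment
FilterDefaultDeny Yes                  # deny everything not matched by the filter file
Filter "/etc/tinyproxy/whitelist"      # one allowed domain pattern per line
ErrorFile 403 "/var/www/splash.html"   # serve the splash page on denied requests
```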
Re:tinyproxy (Score:5, Informative)
As the stated goals are to provide access to a very small number of pages and limit bandwidth, caching is a great idea.
Re: (Score:1)
But definitely non-essential, especially if caching can be implemented behind the blocking solution.
Perform a DNS lookup on each of the 25 domain names (Score:2, Interesting)
Of the allowed sites.
Use any commercial router and access point, or even a WRT-54G. Drop the list of allowed IPs into an access list and deny traffic for all other IPs.
Use separate rules to deny traffic to ports other than 80 and 443.
Mod parent up (Score:2)
Just using a firewall; nice idea. You'd have to keep on top of DNS lookups though.
The router I got from my ISP actually allows you to do this by default. It also lets you redirect to another page, which would allow an error message to be displayed. Can you think of a way to do this with kit available in normal routers?
Use scriptable devices (Score:1)
I would expect a simple shell script on a workstation or laptop should be able to assist with maintaining the IP list; or make a Linux virtual machine (or put it in the cloud) and only run the VM while updating the list. It would be sensible to use a script and format the output so it can be pasted into the device to perform updates according to any DNS change.
It is fairly commonplace to have script-generated firewall configurations like this, especially when a single site has several firewalls (i.e. for backup).
OpenDNS? (Score:3, Informative)
OpenDNS were talking about adding this as a pay-for service [opendns.com], which would be cheaper and easier than setting up a dedicated Linux box (the standard proposed solution to any problem posed to Slashdot).
Incidentally, the thread I linked has some other solutions posted in it.
Re: (Score:1)
Anyone with a static DNS server entry (or enough knowledge to look) would get around it instantly.
Options, options. (Score:1)
I would suggest a Linksys WRT54GL/Buffalo/Asus/etc. wifi router running OpenWRT. If you're only allowing a relative handful of sites (~25), iptables rules wouldn't be too cumbersome. Add on a captive portal package (wifidog, nocat, etc.) and you're good. The basic captive portal redirection could be handled simply with iptables too, but one of the packages could make things easier to administer/monitor. I know that wifidog uses libhttpd as a web server that runs on the router, so you could run the splash page from the router itself.
Dont forget to include dependencies! (Score:3, Informative)
Whatever you do, make sure you whitelist any dependencies these 25 websites use. I'm thinking of things like google-analytics, any kind of JavaScript library that is third-party hosted (Google Code or YUI), and ad code. If you don't whitelist those as well, your patrons' browsers might act a little funky depending on your solution.
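One rough way to find those dependencies is to fetch each whitelisted page and list the external hosts it references. A crude sketch (it only looks at absolute src=/href= attributes, so it will miss dynamically loaded resources):

```shell
# list_hosts: read HTML on stdin and print the unique hostnames referenced
# in absolute src= / href= attributes -- candidates for the whitelist.
list_hosts() {
    grep -oE '(src|href)="https?://[^/"]+' |
        sed -E 's#.*https?://##' |
        sort -u
}
```

Used as, e.g., `curl -s http://www.example.org/ | list_hosts`, then eyeball the output before adding anything to the whitelist.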
Re: (Score:2)
Why would you whitelist google-analytics? Isn't it some sort of usage-tracking service? I'd rather minimize the information collected on me.
I use the NoScript Firefox 3 extension, and I set google-analytics to UNTRUSTED. I have no problems browsing with that.
Mikrotik would be perfect (Score:1)
Mikrotik will do everything you need and more.
You would need to build your own using a RB/411A, CA/411, R52H, AC/SWI and a 12-24 volt power supply, and you would be all set.
http://www.mikrotik.com/ [mikrotik.com]
http://forum.mikrotik.com/ [mikrotik.com]
The guys over at http://www.quicklinkwireless.com/ [quicklinkwireless.com] sell preassembled AP's and will even walk you through configuring it.
Dans Guardian (Score:2, Interesting)
Set up a transparent proxy and use dansguardian [dansguardian.org]. I've set this up and had it running for several months. It *easily* supports whitelisted/blacklisted sites and domains (even using regular expressions), and mime types. It can also block objectionable content based on keyword groups and ratings etc. Very good indeed.
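For a whitelist-only setup, the usual DansGuardian pattern (as I understand it; list file locations vary by distribution) is to blanket-block everything in bannedsitelist and put the allowed domains in exceptionsitelist:

```
# /etc/dansguardian/lists/bannedsitelist -- a lone "**" means block every site
**

# /etc/dansguardian/lists/exceptionsitelist -- the whitelist, one domain per line
example.org
example.net
```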
Try a D-Link Router (Score:1)
It does not provide a bounce page, as far as I'm aware.
DD-WRT (Score:1)
Mikrotik RouterOS (Score:2, Interesting)
Simplest, quickest way to do it, and does everything you're looking to do.
They put a relatively decent shell interface on top of linux that hides a lot of the complexity, and also have a good GUI management utility (I don't use it myself, but it can do everything the shell can).
It'll run on most hardware, including x86. You'd have to buy a license, $45, but it's worth the time saved figuring out how to get all the different parts tied in together.
And there is an active community forum with helpful people in it.