Human Web Filter Prototype

1. You have to write a web proxy that allows a human censor to see and decide whether surfers are allowed to visit certain sites.

1) The surfer browses the web and has configured his browser to use a web proxy.

2) He agrees to be censored - he could change his browser's proxy settings to avoid being censored, but we don't worry about that.

3) Every time he visits a new website, the browser will wait as long as necessary until the censor permits or denies viewing the site.

4) Once a site is permitted it enters a white list and is deemed good forever. Likewise, if it is denied, it enters a black list and is denied forever.

5) The decisions are made per hostname: if [url removed, login to view] is allowed, then any page in that domain will be automatically allowed forever, even [url removed, login to view].
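In Java (one of the listed skills), the per-hostname verdict of points 4 and 5 might be sketched as follows. This is a minimal sketch, not part of the spec; the `HostFilter` class and `Verdict` enum are hypothetical names:

```java
import java.net.URI;
import java.util.Set;
import java.util.concurrent.ConcurrentHashMap;

// Hypothetical sketch: decisions are stored per hostname, so any
// path under an approved host is approved as well (requirement 5).
public class HostFilter {
    public enum Verdict { ALLOW, DENY, PENDING }

    // Thread-safe sets, since proxy threads and the censor's
    // console may touch the lists concurrently.
    private final Set<String> whitelist = ConcurrentHashMap.newKeySet();
    private final Set<String> blacklist = ConcurrentHashMap.newKeySet();

    // Decide based on the hostname only, never the full URL.
    public Verdict check(String url) {
        String host = URI.create(url).getHost();
        if (whitelist.contains(host)) return Verdict.ALLOW;
        if (blacklist.contains(host)) return Verdict.DENY;
        return Verdict.PENDING; // hand over to the censor
    }

    public void allow(String host) { blacklist.remove(host); whitelist.add(host); }
    public void deny(String host)  { whitelist.remove(host); blacklist.add(host); }
}
```

A `PENDING` verdict is what would make the proxy hold the response until the censor decides (point 3).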

6) Web search censorship:

a. If the user goes to Google and enters a search, all the results and ads are passed through censorship. (That means: if a result is in the whitelist, it is allowed immediately; if it is in the blacklist, it is denied immediately. All other results are passed to the censor.)

b. The browser stays in “loading…” mode until at least 20 results have been approved or all the results on the page have been processed by the censor.

c. No need to support other search engines.
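The gate described in 6b could be sketched like this in Java; `ResultGate` is a hypothetical name, and the threshold of 20 comes straight from the requirement:

```java
// Hypothetical sketch of requirement 6b: the response to the browser
// is held back until at least 20 results are approved, or until every
// result on the page has received a verdict.
public class ResultGate {
    private final int total;      // results on this search page
    private int approved, decided;

    public ResultGate(int totalResults) { total = totalResults; }

    // Called once per censor verdict on a result.
    public synchronized void record(boolean allowed) {
        decided++;
        if (allowed) approved++;
        notifyAll(); // wake the thread serving the search page
    }

    // The serving thread blocks here while the browser shows "loading...".
    public synchronized void await() throws InterruptedException {
        while (!open()) wait();
    }

    public synchronized boolean open() {
        return approved >= 20 || decided >= total;
    }
}
```

The `await()`/`notifyAll()` pair is one simple way to keep the browser connection open without busy-waiting.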

7) The censor’s console:

a. There will be a censor’s webpage (no need to log in, just a special URL).

b. The censor’s webpage just contains a list with every page that needs to be censored.

c. It should be populated with an AJAX call that is executed every second.

d. Beside each page to censor there should be two buttons: allow and deny. If either button is pressed, the page disappears from the censor's page and enters the white or black list accordingly.
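On the server side, the console in point 7 needs a queue of pages awaiting a verdict: the proxy adds entries, the per-second AJAX call reads a snapshot, and the two buttons remove entries. A minimal sketch, with `PendingQueue` as a hypothetical name:

```java
import java.util.LinkedHashSet;
import java.util.Set;

// Hypothetical sketch of the censor's pending queue (requirement 7).
// LinkedHashSet keeps requests in arrival order and drops duplicates.
public class PendingQueue {
    private final Set<String> pending = new LinkedHashSet<>();

    // Called by the proxy when a hostname needs a verdict.
    public synchronized void submit(String host) { pending.add(host); }

    // Body of the per-second AJAX poll: one hostname per line.
    public synchronized String snapshot() {
        return String.join("\n", pending);
    }

    // Allow/deny button handler: remove the entry; the verdict itself
    // would then be recorded in the white or black list.
    public synchronized boolean resolve(String host) {
        return pending.remove(host);
    }
}
```

The console page itself would render `snapshot()` and issue the allow/deny calls from its polling JavaScript.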

8) Only pages of type text/html are censored - URLs of other types (audio, images, etc.) will pass through without censorship.

9) The proxy doesn't need to support POST requests or file uploads, only GET.
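Requirements 8 and 9 together define what the proxy intercepts: only GET requests are handled, and only text/html responses are held for censorship. A small helper sketch, with `ProxyRules` as a hypothetical name:

```java
// Hypothetical sketch covering requirements 8 and 9: the proxy serves
// only GET requests, and only holds back responses whose Content-Type
// is text/html; everything else streams straight through.
public class ProxyRules {
    // Check the HTTP request line, e.g. "GET http://host/path HTTP/1.1".
    // POST, CONNECT (HTTPS), etc. are out of scope for this prototype.
    public static boolean supportedRequest(String requestLine) {
        return requestLine != null && requestLine.startsWith("GET ");
    }

    // The Content-Type header may carry parameters such as a charset,
    // so compare only the media-type prefix, case-insensitively.
    public static boolean needsCensorship(String contentType) {
        return contentType != null
            && contentType.toLowerCase().startsWith("text/html");
    }
}
```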

10) The database of white and black lists can be stored in plain text files.
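Since the lists live in plain text files (requirement 10), one hostname per line is the simplest format: easy for the proxy to load and easy to dump into the edit page's text area (requirement 11). A sketch, assuming a hypothetical `ListStore` helper:

```java
import java.io.IOException;
import java.nio.file.Files;
import java.nio.file.Path;
import java.util.LinkedHashSet;
import java.util.Set;

// Hypothetical sketch for requirement 10: each list is a plain text
// file, one hostname per line, rewritten whole on every change.
public class ListStore {
    public static Set<String> load(Path file) throws IOException {
        if (!Files.exists(file)) return new LinkedHashSet<>();
        return new LinkedHashSet<>(Files.readAllLines(file));
    }

    public static void save(Path file, Set<String> hosts) throws IOException {
        Files.write(file, hosts); // one line per hostname
    }
}
```

Rewriting the whole file on each verdict is fine at this scale and keeps the edit page trivial: its send button can simply replace the file with the text area's contents.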

11) There should be a web page to edit the white and black lists: something very simple, just put each whole list in a large text area with a send button.

12) No need to support HTTPS.

Skills: AJAX, Java, Javascript


About the Employer:
( 0 reviews ) Haifa, Israel

Project ID: #233448