My company is running a research project studying content and advertising on roughly 10 different web sites, with a few pages to be created for each site.
1) We will be recruiting a few hundred participants to come to our office and read articles on major web sites.
2) We will be tracking their biometric and eye-tracking responses as they read.
3) We will be placing new types of online advertising units on pages from major news web sites to test how participants respond to these new ad units.
1) We need you to create all of the target test pages (up to 40 pages) for this experiment. As this is a research test, they do not have to meet the standards of a live site; they just have to look accurate. They will run off of our local server and be seen by only a few hundred people.
(a) Each test page should be a pixel-perfect copy of the target web site. These will be copies of pages from major web sites like [url removed, login to view] or OMG.com.
(b) All links should be functional, but link back to the live web site if the user clicks something on the page. A first step might be to use an offline downloader like this one: [url removed, login to view]
(c) The pages would ideally maintain interactive elements like navigation rollovers, but more complex elements like interactive features, games, etc. do not need to work. None of these sites will have highly interactive elements that need to be replicated.
(d) Critically important is the need to remove the ad that appears on the page and replace it with one of the 5 or 6 different special units we have created for the test. These might be video units, interactive units, etc. This will require scripting knowledge, but nothing too difficult, as the advertising units are mostly self-contained.
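To give a feel for the scale of the scripting involved, here is a minimal sketch of the ad swap. It assumes the cloned page's ad container has been given the id "ad-slot" and that each test unit ships as a self-contained HTML snippet; both names are placeholders, not from the brief.

```javascript
// Hypothetical list of self-contained test units (paths are placeholders).
const TEST_UNITS = [
  '<iframe src="/units/video-a.html" width="300" height="250"></iframe>',
  '<iframe src="/units/interactive-b.html" width="300" height="250"></iframe>',
];

// Replace the original ad slot's contents with the assigned test unit.
// Takes the document as a parameter so the logic is easy to test.
function swapAd(doc, unitIndex) {
  const slot = doc.getElementById('ad-slot');
  if (!slot) return false; // page layout changed; leave the page untouched
  slot.innerHTML = TEST_UNITS[unitIndex % TEST_UNITS.length];
  return true;
}
```

On a live test page this would be called as `swapAd(document, assignedUnit)` from a small inline script at the end of the body, once per page.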
2) We will also need special landing pages to direct the research. For example, if we were using CNN, we would want a CNN homepage that has the following changes:
a) A "featured articles" section we create. These articles will be the only clickable items on the page. Each article linked will actually be the test pages (the pages we duplicate exactly but change the ad on the page).
b) Once a featured article is clicked, the system will create a cookie recording which article was clicked. When the participant returns to the page, that article will no longer be listed, and the participant will be encouraged to select another article to read from the featured section (which again leads to another target page).
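The cookie bookkeeping behind the featured-articles section could look like the sketch below. It assumes articles are identified by short slugs and the read list lives in a single cookie named "readArticles"; both are assumptions, not requirements from the brief.

```javascript
// Parse the list of already-read article slugs out of a cookie string.
function parseRead(cookieString) {
  const match = cookieString.match(/(?:^|;\s*)readArticles=([^;]*)/);
  return match && match[1] ? decodeURIComponent(match[1]).split(',') : [];
}

// Return the cookie value to assign when a featured article is clicked.
function markRead(cookieString, slug) {
  const read = parseRead(cookieString);
  if (!read.includes(slug)) read.push(slug);
  return 'readArticles=' + encodeURIComponent(read.join(',')) + '; path=/';
}

// Filter the featured list down to articles the participant has not read,
// so clicked articles disappear when they return to the landing page.
function remainingArticles(allSlugs, cookieString) {
  const read = parseRead(cookieString);
  return allSlugs.filter((slug) => !read.includes(slug));
}
```

In the browser, `markRead(document.cookie, slug)` would be assigned back to `document.cookie` in the article link's click handler, and `remainingArticles` would drive which featured items get rendered.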
3) If you have the skills to give us more page insights from this small pool of users, that could be valuable. This might include tracking when each page loads and unloads, creating mouse heatmaps (like these: [url removed, login to view]), or even identifying the times the ad was in view (not scrolled off the page). That said, these would be bonus elements and are not necessary to get this contract.
1) We believe this should be an hourly project, as more items may come up and more iterations may be needed as we go. Feel free to give a ballpark estimate based on what you've seen so far, but we understand this may change dramatically once we give you (a) the actual sites, (b) the ad assets, and (c) other changes or iterations.
2) Timeline: We can get started in a few days (once we pick someone), but we need to finish all sites by the end of May at the latest.
3) Opportunity: we need to conduct projects like this on a regular basis, so we are hoping to find someone who can do this type of work for all of our web research projects.
Ready to start now. Give me a chance to prove my ability and the quality of my work. I am waiting for your positive reply; I am highly interested in working with you.