Web page thumbnails

Closed

I need a high-performance system for taking screenshots of rendered web pages. The screenshots must then be turned into JPEG or GIF thumbnail images.

Operating System: Red Hat Linux ES.

The system must do the following:

* use a browser component to render the page (e.g., Gecko, Firefox, etc.)

* capture the body onLoad event and take a snapshot of the fully rendered page

* prevent JavaScript pop-up windows from obscuring the image of the web page

* reduce the size of the image to a 'thumbnail' while maintaining reasonable quality

* enable screenshots to be taken in parallel/asynchronously

* the minimum required throughput is 10 screenshots per second

* run on Linux with a small memory footprint - less than 30 MB of RAM at maximum throughput

* must be driven from the command line in batch mode (e.g., perl [url removed, login to view] -size 400x140 [url removed, login to view]); a driver sketch follows this list
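
The following is a minimal sketch of how such a batch driver could be put together in Perl, assuming Parallel::ForkManager and Image::Magick (PerlMagick) from CPAN are available. The external capture command render-page is a placeholder for whatever Gecko/Firefox-based renderer is chosen (for example, Firefox running under Xvfb with the framebuffer grabbed by ImageMagick's import); it is an illustration of the intended pipeline, not a finished implementation.

    #!/usr/bin/perl
    use strict;
    use warnings;
    use Parallel::ForkManager;
    use Image::Magick;

    # Hypothetical usage: thumbnailer.pl urls.txt /output/dir 400x140
    # urls.txt holds one "ID URL" pair per line.
    my ($list_file, $out_dir, $geometry) = @ARGV;
    $geometry ||= '400x140';

    my $pm = Parallel::ForkManager->new(10);   # up to 10 concurrent captures

    open my $fh, '<', $list_file or die "Cannot open $list_file: $!";
    while (my $line = <$fh>) {
        chomp $line;
        my ($id, $url) = split /\s+/, $line, 2;
        next unless $id && $url;

        $pm->start and next;                   # fork one worker per URL

        my $shot = "$out_dir/$id.png";
        # 'render-page' is a placeholder for the chosen Gecko/Firefox-based
        # capture tool (e.g. Firefox under Xvfb plus ImageMagick's 'import').
        if (system('render-page', $url, $shot) != 0) {
            warn "capture failed for $url\n";
            $pm->finish(1);                    # child exits on capture failure
        }

        # Shrink the full-size capture to a JPEG thumbnail with PerlMagick.
        my $img = Image::Magick->new;
        $img->Read($shot);
        $img->Resize(geometry => $geometry);
        $img->Set(quality => 80);
        $img->Write("$out_dir/$id.jpg");

        $pm->finish(0);
    }
    close $fh;
    $pm->wait_all_children;

Forking one worker per URL keeps captures independent, so a single slow or hung page does not stall the rest of the batch.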

Skills required:

Gecko, Firefox, XUL, Perl, GD, Red Hat Linux, C, JavaScript, HTML, webthumb, rpm, ppm tools, ImageMagick

Deliverables:

* Ideally, a Perl script that, given a list of URLs and IDs, produces a directory full of thumbnail images

* Benchmarks showing the screenshot tool sustaining more than 10 screenshots per second (a timing sketch follows)
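
For the benchmark deliverable, throughput could be measured with a simple wall-clock wrapper such as the sketch below. Time::HiRes ships with Perl, while thumbnailer.pl, urls.txt and the output directory are placeholder names for the actual driver and its inputs.

    #!/usr/bin/perl
    use strict;
    use warnings;
    use Time::HiRes qw(gettimeofday tv_interval);

    # Count the URLs in the batch (one per line in the list file).
    my $count = `wc -l < urls.txt`;
    chomp $count;

    # Time one full batch run and report screenshots per second.
    my $t0 = [gettimeofday];
    system('perl', 'thumbnailer.pl', 'urls.txt', 'thumbs/', '400x140') == 0
        or die "batch run failed\n";
    my $elapsed = tv_interval($t0);

    printf "%d screenshots in %.1f s = %.1f screenshots/sec\n",
        $count, $elapsed, $count / $elapsed;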

Skills: C Programming, Linux, Perl, PHP, Visual Basic


Project ID: #11826