Java website, with arbitrary performance standards

  • Status Closed
  • Budget $2 - $8 USD / hour
  • Total Bids 3

Project Description

I need a website that doesn't do much and is instrumented with AppNeta's TraceView product.

# The site itself

The site itself should require 2 machines to run. The frontend machine should have 4 endpoints:

* /login - This endpoint should make 3 database calls.

* /overview - This endpoint should make a database call, a memcache call, an internal REST API call, and an external REST API call.

* /top_sites - This endpoint should make 50 of the same database call and 2 memcache calls (one set, one get).

* /profiles/ - This endpoint should make 5 database calls and 5 memcache calls. It should return a page no matter what is passed after /profiles/ in the URL.

All of these should return a page with at least 200 lines of HTML. All of them should include a Twitter share button.
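
As a sketch of the 200-line requirement (the class and method names here are mine, not from the brief), a page body can be padded with filler markup until it reaches the minimum line count; the Twitter share-button snippet would be appended to the same body:

```java
// Hypothetical helper: pad a page body to the required minimum number
// of HTML lines. The real pages (and the Twitter share button markup)
// would come from the Spring MVC views; this only shows the idea.
public class PagePadder {

    static String pad(String body, int minLines) {
        StringBuilder sb = new StringBuilder(body);
        long lines = body.chars().filter(c -> c == '\n').count() + 1;
        for (long i = lines; i < minLines; i++) {
            sb.append("\n<p><!-- filler line ").append(i).append(" --></p>");
        }
        return sb.toString();
    }

    public static void main(String[] args) {
        String page = pad("<html><body>hello</body></html>", 200);
        System.out.println(page.split("\n").length + " lines");  // prints "200 lines"
    }
}
```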

The backend machine should have one endpoint:

* /api/data - This endpoint should make a database call, and 2 memcache calls (set & get).

For each of the calls above, I specifically mean:

* A database call: Make a call to a MySQL database, which is set up on the same machine. I don't care what the query is, but it should take 200-400ms to run.


* A memcache call: Make a call to memcached on the same machine. The set should set a key (named 'obj_cache'), and the get should retrieve the key 'obj_cache' 90% of the time and the key 'cached_obj' the other 10% of the time.

* An external REST API call: Make an HTTP call to [url removed, login to view]. Don't worry about passing any parameters.


* An internal REST API call: Make an HTTP call to the other server (the backend), at /api/data.

In all of these cases, you should ignore the result -- the important part is that the call is made.
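
The two trickier call behaviours above can be sketched as plain-Java helpers, assuming MySQL's SLEEP() function is an acceptable way to hit the 200-400ms target (the brief doesn't say how the query should be made slow) and leaving the actual JDBC and xmemcached calls out:

```java
import java.util.Locale;
import java.util.Random;

// Hypothetical helpers for the call behaviours above. Executing the
// query (JDBC + MySQL Connector/J) and the actual xmemcached set/get
// are assumed; this only shows the query text and the key selection.
public class CallHelpers {

    // A query that takes 200-400ms: MySQL's SLEEP(seconds) function.
    static String slowQuery(Random rnd) {
        double seconds = 0.2 + rnd.nextDouble() * 0.2;
        return String.format(Locale.US, "SELECT SLEEP(%.3f)", seconds);
    }

    // The get should ask for 'obj_cache' 90% of the time and
    // 'cached_obj' the other 10%.
    static String keyToGet(Random rnd) {
        return rnd.nextInt(10) < 9 ? "obj_cache" : "cached_obj";
    }

    public static void main(String[] args) {
        Random rnd = new Random();
        System.out.println(slowQuery(rnd));
        System.out.println(keyToGet(rnd));
    }
}
```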

You should use the following libraries / versions:

* Apache webserver, version 2.2, set up to proxy requests to Tomcat.

* Tomcat webserver, version 6.x.

* Spring MVC framework, version 3.

* xmemcached memcache client, version 1.3.6.

* Apache HTTP client, version 4.

There are also a couple of URL flags that all pages should respond to.

* If any of the pages are passed "qArg=300" in the query string, it should take an extra 300ms to run. Similarly, "qArg=100" should delay the request by 100ms. Something like [url removed, login to view]() is fine here. The delay should be capped at 1000ms (i.e., if qArg=1000000, only sleep for 1000ms).

* If any of the pages are passed "qQuery=true", instead of running the normal database query, it should run a query that takes an extra 200ms to run.
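
The two flags can be sketched as plain-Java helpers (names and placeholder queries are mine; reading the query string and wiring these into the Spring controllers is assumed):

```java
// Hypothetical helpers for the qArg and qQuery flags. Reading the
// query string and executing the query are left to the controllers.
public class UrlFlags {

    // qArg: sleep for the requested milliseconds, capped at 1000ms.
    static long cappedDelayMs(long requestedMs) {
        return Math.min(Math.max(requestedMs, 0), 1000);
    }

    // qQuery=true: swap the normal query for one that takes an extra
    // 200ms (here via MySQL's SLEEP(), which is an assumption -- the
    // brief doesn't say how the slow variant should be made slow).
    static String queryFor(String qQuery) {
        return "true".equals(qQuery) ? "SELECT 1, SLEEP(0.2)" : "SELECT 1";
    }

    public static void main(String[] args) throws InterruptedException {
        Thread.sleep(cappedDelayMs(1_000_000));  // sleeps 1000ms, not forever
        System.out.println(queryFor("true"));
    }
}
```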

# The instrumentation

I want to get visibility into what this site is doing. You should instrument both machines with TraceView. There is a 14-day trial available (link below), which should give you enough time to instrument this site. Go through the installation procedure on both machines, and make sure they have the following:

* Basic install (i.e., the tracelyzer daemon is running)

* Apache install on the frontend (i.e., mod_oboe installed)

* Java installation (i.e., [url removed, login to view] loaded via -javaagent in [url removed, login to view])

* RUM (a Java function that returns the RUM JavaScript on each page.)

* Wrap the [url removed, login to view] that qArg responds to with a Java profile named 'data_process'. (Instructions for that are here: [url removed, login to view])

TraceView trial: [url removed, login to view]
