Prosper’s new RESTful API

Prosper recently re-released their API, completely rewritten from the old SOAP/WSDL service into a REST-compliant service. This is an exciting change for three reasons:

1.) More open. The move to REST allows greater access to Prosper’s API. Not only can you access data directly from your web browser, but I believe developers will find the API much easier to work with too. When you make a request to the API you can specify your content type as XML (default) or JSON, which gives developers the flexibility to consume the data in the format of their choice. There is a lot of freedom in how you process the service results.

2.) Richness in data. Prosper has included endpoint access to your account balances and notes. These are wonderful additions, but it doesn’t stop there. There is a new endpoint called DataDictionary that lists every data element, its description, and its SQL type for a specified endpoint. One of the biggest pains when handling large amounts of data with no definition of the types is guessing what is most reasonable to use for your SQL schema. Not only is this solved, but with the SQL types returned by the service you can create scripts to automatically generate and maintain your schema for Prosper data. This translates to saved time and easier maintenance of your local copy of Prosper data.
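
For example, here is a rough sketch of what such a script might look like in Python. The exact DataDictionary URL, the response field names ("Name" and "SqlType"), and the use of HTTP Basic authentication are all assumptions on my part, so treat this as an illustration rather than working code:

import requests

API_USER = "your-api-username"   # your API-only credentials, not your website login
API_PASS = "your-api-password"

# Hypothetical: assumes the endpoint name is appended to the DataDictionary path
# and that each entry exposes "Name" and "SqlType" fields.
resp = requests.get(
    "https://api.prosper.com/api/DataDictionary/Listings",
    auth=(API_USER, API_PASS),
    headers={"Content-Type": "application/json"},  # JSON instead of the XML default
)
resp.raise_for_status()

# Build a CREATE TABLE statement from the element names and SQL types.
columns = ",\n  ".join(
    "{0} {1}".format(element["Name"], element["SqlType"]) for element in resp.json()
)
print("CREATE TABLE Listings (\n  {0}\n);".format(columns))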

3.) Security. Prosper requires a separate set of credentials to access the API. This means applications will use credentials that can only access API functions, not the main account credentials you use to log in to Prosper’s website.

API Endpoints

Below are brief descriptions of the new endpoints:

Listings: GET https://api.prosper.com/api/Listings/

Listings contains 541 data elements per loan as of this writing. The bulk of these come from the credit pull Prosper performs as part of its expanded underwriting process.
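
As a quick illustration, pulling the current listings from Python might look like the sketch below. The credentials are the separate API login discussed above, and HTTP Basic authentication is my assumption about how they are supplied:

import requests

API_USER = "your-api-username"
API_PASS = "your-api-password"

resp = requests.get(
    "https://api.prosper.com/api/Listings/",
    auth=(API_USER, API_PASS),                     # Basic auth is assumed here
    headers={"Content-Type": "application/json"},  # request JSON instead of the XML default
)
resp.raise_for_status()

listings = resp.json()  # assuming the payload parses to a list of listing objects
print("Retrieved {0} listings".format(len(listings)))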

ListingsHistorical: GET https://api.prosper.com/api/ListingsHistorical/

This is the same data contained in Listings, but for every listing that has ever existed. It’s a very large data set, and we’ll discuss later how to “chunk” the data into smaller downloads. Even so, the best way to get this data will likely still be directly from Prosper’s data download page.

Invest: POST https://api.prosper.com/Invest

The Invest endpoint allows you to place an order for a note. Typically POSTs aren’t something you can do manually in a web browser, but plugins like REST Console (covered later) let you do so. For developers, it’s important to set your Content-Type to application/x-www-form-urlencoded for the investment to work correctly.

To place an investment, all you need to do is POST two parameters:

  1. listingId: the listing number of the note you wish to buy
  2. amount: the dollar amount you wish to invest

If you are trying to buy $100 of a loan that only has $50 left, the service will automatically round down and tell you in the response how much of the investment amount was actually invested.
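
A minimal sketch of that POST from Python is below. The listing number is a placeholder, and Basic authentication is again an assumption; passing a plain dict as the request body is what produces the application/x-www-form-urlencoded content type the endpoint expects:

import requests

resp = requests.post(
    "https://api.prosper.com/Invest",
    auth=("your-api-username", "your-api-password"),  # Basic auth assumed
    # A dict passed as `data` is sent as application/x-www-form-urlencoded
    data={
        "listingId": 123456,  # placeholder listing number
        "amount": 25,         # dollar amount to invest
    },
)
resp.raise_for_status()
print(resp.text)  # the response reports how much was actually invested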

Investments: GET https://api.prosper.com/api/Investments/

Investments is an interesting endpoint because it allows you to see a detailed audit trail of all investment history for your account, not just notes that have been issued. It would be very easy to reconstruct your “issue rate” to see how many of your loans are actually getting issued. This is a great metric, and one I personally monitor for my investments.
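
As a purely illustrative sketch, an issue-rate calculation could look something like this. The "ListingStatus" field and the "Issued" value are hypothetical placeholders; check the DataDictionary endpoint for the real element names:

import requests

resp = requests.get(
    "https://api.prosper.com/api/Investments/",
    auth=("your-api-username", "your-api-password"),  # Basic auth assumed
    headers={"Content-Type": "application/json"},
)
resp.raise_for_status()
investments = resp.json()  # assuming a list of investment records

# "ListingStatus" / "Issued" are made-up names used for illustration only
issued = sum(1 for record in investments if record.get("ListingStatus") == "Issued")
print("Issue rate: {0:.1%}".format(issued / float(len(investments))))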

Notes: GET https://api.prosper.com/api/Notes/

Notes will be critical for those who in the past relied on LendStats or ProsperStats to gain insight into their portfolio. It is essentially Prosper’s equivalent of Lending Club’s notes.csv. In addition to showing you all the notes you own, Prosper gives you some nice additional data such as:

  1. Whether the note has been sold on FolioFN
  2. A complete breakdown of fees
  3. Recovery amounts
  4. Days past due

Later in the blog post I will show you how to get your Notes from your web browser in CSV format.

Loans: GET https://api.prosper.com/api/Loans/

The Loans endpoint provides a high-level overview of loan performance. It won’t contain the expanded underwriting data, but it will contain vital information related to balances. In many ways it shares a lot of the same data elements as the Notes endpoint, but it includes all loans, not just the ones you’ve invested in.

MarketPlace: GET https://api.prosper.com/api/MarketPlace/

MarketPlace is not yet implemented, but it will contain data on overall platform performance, such as total loans issued, interest rates, and other aggregate data points.

Account: GET https://api.prosper.com/api/Account/

Account is not yet implemented. There is no public documentation on this endpoint, but from what I have been told it will allow you to access information such as idle cash and your total investment in terms of outstanding principal and pending notes.

oData Filtering

oData is not an endpoint, but a method for filtering data. It allows you to request subsets of the data from the API. For instance, you may want to look at only A-grade loans in Listings, or find loans in a specific interest rate range. Theoretically, developers could rewrite their filters in oData, make an API call to Listings, and simply buy the loans returned. No need for client-side filtering or even storing data!

A good example would be using oData to filter ListingsHistorical for all issued loans. Since ListingsHistorical also contains listings that were never issued, we have to filter those out.

GET https://api.prosper.com/api/ListingsHistorical/?$filter=LoanOriginationDate ne null

We are searching for all loans where the origination date is not NULL – NULL indicates the loan was not issued.

Keep in mind, this is a huge set of data. I would recommend using oData to break this request into smaller ones using $skip and $top to implement paging.

  • Request 1: https://api.prosper.com/api/ListingsHistorical/?$filter=LoanOriginationDate ne null&$top=50&$skip=0
  • Request 2: https://api.prosper.com/api/ListingsHistorical/?$filter=LoanOriginationDate ne null&$top=50&$skip=50
  • Request n: https://api.prosper.com/api/ListingsHistorical/?$filter=LoanOriginationDate ne null&$top=50&$skip=50(n-1)

You would keep making requests until your result set comes back empty. At that point you would have a complete data set of all issued loans.
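
Putting the filter and the paging together, a Python sketch might look like the following. Basic authentication is assumed, and the requests library takes care of URL-encoding the spaces in the oData filter:

import requests

BASE_URL = "https://api.prosper.com/api/ListingsHistorical/"
AUTH = ("your-api-username", "your-api-password")  # Basic auth assumed
PAGE_SIZE = 50

issued_listings = []
skip = 0
while True:
    resp = requests.get(
        BASE_URL,
        auth=AUTH,
        headers={"Content-Type": "application/json"},
        params={
            "$filter": "LoanOriginationDate ne null",  # only issued loans
            "$top": PAGE_SIZE,
            "$skip": skip,
        },
    )
    resp.raise_for_status()
    page = resp.json()   # assuming each page parses to a list of records
    if not page:         # an empty page means the data set is exhausted
        break
    issued_listings.extend(page)
    skip += PAGE_SIZE

print("Downloaded {0} issued listings".format(len(issued_listings)))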

How to Access the API (Even if you’ve never written a single line of code in your life!)

One of the advantages of a REST-based service is simplicity. REST uses the same protocol the World Wide Web uses, HTTP, which means your browser can act as a client for the API, and you can start accessing data with ease and speed. For all of the service endpoints that support GET, you can go directly to the endpoint in your browser and retrieve data in XML format. However, the API also supports CSV and JSON if you specify them in your Content-Type. A quick summary of the supported content types:

  • text/csv
  • application/json
  • text/xml
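
For example, here is a short sketch of downloading your Notes as CSV, again assuming Basic authentication and following the post’s convention of selecting the format via the content type:

import requests

resp = requests.get(
    "https://api.prosper.com/api/Notes/",
    auth=("your-api-username", "your-api-password"),  # Basic auth assumed
    headers={"Content-Type": "text/csv"},  # text/xml (default) and application/json also work
)
resp.raise_for_status()

with open("prosper_notes.csv", "w") as f:
    f.write(resp.text)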

Please remember, in order to access the API you must contact Prosper’s Investment Services.

Using Chrome’s REST Console to Access Your Notes

Chrome has a plugin called REST Console which allows you to interact with REST services using a graphical interface. By setting a few parameters you can start accessing the data without having to write a single line of code.

1. Install & start REST Console from inside Chrome

2. Configure the Target

[Screenshot: configuring the REST Console target URL]

3. Set your credentials for the API

[Screenshot: setting your API credentials in REST Console’s authorization section]

4. Click Send

You will see your currently owned notes returned in the console output. You can get the exact same data without using REST Console by browsing directly to the endpoint, but it will be in XML format.

Conclusion

The new API offering from Prosper is a game changer. The DataDictionary endpoint is probably the least interesting in terms of functionality, but it will give developers the ability to create applications in less time and with more robustness. The DataDictionary also contains descriptions for all of the expanded underwriting elements, so you will know what the data means and how it may impact your investing. As more developers decide to automate their own investing, it will prove an invaluable resource.

As for the other endpoints, Prosper has exposed virtually all of the data they have on their loans through the new API. There is no doubt this will have an impact on investors and open all sorts of new discussions about how we use data to invest.

  • Guest

    I really, really wish that people would stop labeling everything that’s not SOAP as “RESTful”. Nothing in these APIs has anything to do with REST. I would like to hear people just use the term HTTP Service/API or even JSON API. I’m not being pedantic or picky; there’s no resemblance to REST.

    • http://www.nickelsteamroller.com/ Michael

      I’m inclined to mostly agree with you. The fact that 99% of developers consider REST to mean HTTP verbs makes it difficult to write about services that are not SOAP without using the term REST. The most correct term would be HTTP Service, but I am fine referring to it as RESTful. It’s the first iteration of this service, and there have already been improvements to bring it closer to true REST (as you presumably have it defined, e.g. http://www.intridea.com/blog/2010/4/29/rest-isnt-what-you-think-it-is ).

      This aside, I assume the implementation that annoys you the most is the POST /Invest endpoint, which takes application/x-www-form-urlencoded as the content type. REST purists would demand the request/accept content type be application/json, thereby representing an object, and have a body with something like

      {
        "listingId": NNNNNNNNN,
        "amount": NN
      }

      I think the core issue here is that the .NET ApiController model binding assumes form data. Using the request body instead requires parsing it yourself and converting object members to the appropriate primitives, e.g. var listingId = Convert.ToInt32(investObject["listingId"]);. I believe writing robust REST APIs in .NET gets dicey and difficult to test if you try to create a true REST service; MS didn’t really design it for that. Thanks for commenting.

    • http://www.nickelsteamroller.com/ Michael

      One additional comment. Even among REST purists there isn’t agreement on what makes REST, REST. Some will say the version of the API should be in the Accepts-Version header; others state it should be in the URI, i.e. /v1/. I’ve heard arguments from both sides, and although I lean towards the header, I do not consider making it part of the URI a poor design or “un-RESTful”.

  • http://twitter.com/taiganaut Boreal Explorer

    Great new API. Too bad it won’t be serving up bid data any longer, at the behest of large investors who want to skim the cream off the top before the “retail” investors get at it.

    It’s really disappointing to see a supposedly “peer to peer” lending platform turn into yet another rigged casino.

    • http://www.nickelsteamroller.com/ Michael

      I can understand where they are coming from. The big money doesn’t want people riding on their investment strategy; additionally, they don’t want other hedge funds trying to reverse engineer their strategy. I never used the bid data personally in my investing, but I understand how this seems like they are “turning” their back.

      It’s not rigged because both companies give no preferential treatment to investors. If they for instance allowed a fund to place servers in their data center and gave them some advantage to do HFT then I would completely agree with you. To my knowledge this is not happening.

  • Gab Shepherd

    Are the login credentials for the API the same as the username and password for the Prosper website login? (Is an email address required as one of the login credentials?)

    I am getting a “HTTP Error 401: Unauthorized” when using python’s urllib2 module.

    • http://www.nickelsteamroller.com/ Michael

      They are different. You have to enable this in your Settings and then API Access

      • Gab S

        Thanks for responding. I enabled API Access with the username defaulted to the login email address and the password set to the login password for the website. However, I am still getting the same “HTTP Error 401: Unauthorized” error while trying to access url = ‘https://api.prosper.com/api/Listings/’.

        Where in my account information can I see the unique API credentials, if they are not the same as the login credentials for Prosper’s website?

        • http://www.nickelsteamroller.com/ Michael

          The 401 means the password / login combination is wrong. If you set this up in “Settings -> API” and set the password then you may have to contact Prosper. Something might be off in your account.

          • Gab S

            I fixed the issue with my API credentials, but I am still getting the same error. I was able to log into the API manually via the browser, so that tells me something is wrong with my Python script.

  • TJ

    I just stumbled upon your blog after independently trying to page through the ListingsHistorical call. I was doing the skip and top just like you suggested. I am seeing three very annoying things:

    1) You have to write resilient code with retries as the calls fail quite often.

    2) Paging without any filter (which is perfectly valid in oData) seems to slow down dramatically as page numbers increase, and it also starts erroring out as page numbers increase. For example, I often get an error around this point in the paging: https://api.prosper.com/api/ListingsHistorical?$skip=200000&$top=1000
    3) I am never able to get to the skip/top point where no more data comes back, so I have never been able to walk the entire set…

    Any idea of what I may be doing wrong?

  • TJ

    One more thing: the CSV content type is pretty useless in the Listings and ListingsHistorical calls, because one of the columns has user-entered text which often contains commas, and this makes it impossible to parse the data once the download is finished because there are more commas than column delimiters…

    • Jeff Lunt

      They should be surrounding those fields with quotes to deal with that…if they’re doing it right. If this is still an issue 3 months in I would report an issue as that’s just a broken CSV export, plain and simple.

  • TJ

    And one last thing: for any pageable API (like true oData) to be of any service, it needs to return the total number of records, and as far as I can see Prosper does not do this, so you never know when to expect the last set of records in a paging type of approach.

    • http://www.nickelsteamroller.com/ Michael

      It’s definitely an issue, but I don’t think Prosper intends for all the data to be pulled from their API. I know Rocco at ProsperStats is doing it successfully; it just requires robust code. I know I had a lot of trouble, but it will eventually make its way through the API. If you’re requesting data in blocks of, say, 100, it’s reasonable that anything less than 100 means the end of the data.

  • Jdub_p

    What is the best way of downloading more recent historical data? If I use “$top=10000&$skip=300000,” it does not give me a correct JSON file to download; I guess it takes too much time to skip 300,000 entries. I want to download the data from the API rather than the Data Export page because the API contains more variables. Thanks.

    • http://www.nickelsteamroller.com/ Michael

      You might consider using multiple threads: with, say, 20 requests per second each grabbing 10 records, you could pull 200 records per second, and you should be able to pull down the entire dataset in a reasonable amount of time. What language are you using?