Updated 2009-07-16: read below

Well, it seems that lately I'm not very amused by the tools and sites I find online.
No, it's not me ;-)

This time it happened with Ning. We have a network (no commercials, for now) and would like to add a very little tiny banner. Easy, eh?
Well, not so fast: the banner has to be shown only if the user is logged in and has answered a multiple-choice question with one specific answer (the other answers lead to no banner, or to a generic one). This is because the user will be eligible for a free subscription and services on another website.
Yep, we're not giving free beer to anyone :-P
So that is the partner site's requirement for giving us the service: we filter the users, we get the service. No filter, no service!

First thing I thought: ehm… Ning? I'm not very confident in it, but I know they're smart guys and they made two (wow!) APIs.

  1. REST and XML APIs
  2. OpenSocial APIs

Now, in March they completely disabled the REST APIs and the WebDAV features because they want to concentrate on OpenSocial: Ning platform updates. So option #1 is no more.

The OpenSocial APIs are really a nice idea: you write an app/widget and it works across all OpenSocial-compatible social network platforms! Great! :-D
Well, not so fast!

First, Ning didn't implement many of the features, just the minimum.
Second, even with the full APIs, there is no way to retrieve the users' answers to the custom questions (yes, the custom answers ;-) ), and moreover there is no API to automatically retrieve the whole user base (in case one would like to export the data periodically as a feed and/or use it in an external app).

Ah, and forget about putting OpenSocial apps on your home page (it's not permitted).

“They want to concentrate on OpenSocial”

?!?!? :-x

The only way to recover user data is to download a CSV file from the admin panel: yes, manually!

So I decided I had to hack this :-P

The only way to do that automatically is to use the cURL library to simulate the clicks of a real admin, but unfortunately the admin panel is so sophisticated that it's not just a matter of retrieving the login webpage and then the export page.

The admin panel seems to work like this: since extracting the data can take a long time, an AJAX procedure polls the server, and the polling itself is what makes the server dedicate time to the action (so if you do not poll, you don't get the extraction). When the extraction is done, you can download the CSV file from a specific link.

Gee, I had to sniff the traffic to get the protocol right! :-(

Here is the request sent by the AJAX function on the admin browser:
POST /main/bulk/exportMemberData?xn_out=json HTTP/1.1
Host: www.example.com
User-Agent: Mozilla/5.0 (Windows; U; Windows NT 5.1; it; rv:1.9.0.8) Gecko/2009032609 Firefox/3.0.8
Accept: application/json, text/javascript, */*
Accept-Language: it,en-us;q=0.7,en;q=0.3
Accept-Encoding: gzip,deflate
Accept-Charset: ISO-8859-1,utf-8;q=0.7,*;q=0.7
Keep-Alive: 300
Connection: keep-alive
Content-Type: application/x-www-form-urlencoded; charset=UTF-8
X-Requested-With: XMLHttpRequest
Referer: http://www.example.com/main/membership/listMembers
Content-Length: 51
Cookie: xn_visitor=a111111-111-111-1111-11111111111; __utma=11111110.2111111111111111110.1111111111.1111111111.1111111111.1; __utmb=111111111.1.10.1111111111; __utmc=11111111; __utmz=11111111.11111111111.1.1.utmcsr=(direct)|utmccn=(direct)|utmcmd=(none); xn_id_example=AAAAAAAAAAAAAAAAA*QAA*4ZAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAv-AAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAA*qDaE_; xn_shared=GAAAAAAAAAAAAAAAA*QAA*4AAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAeAAA*rA0NniIQ__
Pragma: no-cache
Cache-Control: no-cache

counter=2&xg_token=81111111111111111111111111111110

Well, I can't post the raw reply because it's gzipped, but the decoded content is:
({"contentChanged":50,"contentRemaining":1})

So basically, one should call the same URL (www.example.com/main/bulk/exportMemberData?xn_out=json) with a POST request and an increasing counter number, and you receive a reply like the above every time. It's time to stop when the contentRemaining value is zero (0).
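Note that the reply body is not plain JSON: it's wrapped in parentheses, so a naive JSON parser will choke on it. A minimal sketch of how to decode it (in Python here, although my own tooling is PHP; the function name is mine):

```python
import json

def parse_export_reply(body: str) -> dict:
    """Decode Ning's paren-wrapped JSON reply,
    e.g. '({"contentChanged":50,"contentRemaining":1})'."""
    return json.loads(body.strip().strip("()"))

reply = parse_export_reply('({"contentChanged":50,"contentRemaining":1})')
print(reply["contentRemaining"])  # → 1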

After that, you just open the URL www.example.com/main/membership/downloadMemberData with a cURL call and you get the updated file.

So, summarizing:

  1. Setup cURL
  2. Call www.example.com/main/bulk/exportMemberData?xn_out=json with POST data counter=0&xg_token=81111111111111111111111111111110
  3. Examine the reply: if contentRemaining != 0, increase counter and repeat step #2
  4. Repeat steps #2 and #3 until contentRemaining == 0
  5. Call www.example.com/main/membership/downloadMemberData and save the CSV to file
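The polling loop above can be sketched like this (a Python sketch, not my actual PHP code; the URL and the xg_token are placeholders, and `post` stands for whatever HTTP client you use — with cURL/requests you must also pass the admin session cookies):

```python
import json

def export_member_data(post, max_polls=1000):
    """Poll exportMemberData until contentRemaining reaches 0.

    `post` is any callable taking (url, data) and returning the response
    body as a string, so a real HTTP client (pycurl, requests, ...) can
    be plugged in. Returns the final counter value.
    """
    url = "http://www.example.com/main/bulk/exportMemberData?xn_out=json"
    token = "81111111111111111111111111111110"  # your session's xg_token
    counter = 0
    while counter < max_polls:
        body = post(url, {"counter": counter, "xg_token": token})
        reply = json.loads(body.strip().strip("()"))  # reply is paren-wrapped
        if reply.get("contentRemaining", 0) == 0:
            return counter  # extraction finished; now fetch the CSV
        counter += 1
    raise RuntimeError("extraction never finished")
```

Once this returns, a plain GET to www.example.com/main/membership/downloadMemberData (with the same cookies) retrieves the fresh CSV.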

Be aware that:

  • if you do not run the extraction procedure, the file can still be downloaded, but it will be stale
  • the extraction takes quite a long time, even with a relatively small number of users
  • the procedure refuses to update if you call it too frequently (it will return contentRemaining:0 even if some data has changed; I tried changing my profile)

Of course, all of the above has to be done off-site with a script or an executable (you can't do it on the network webpage itself with JavaScript, for example), so anyone who doesn't have a separate host to run the scripts on is, well, f*cked.

After that, you may use some JavaScript on your Ning network to get the current user id from the ning.CurrentProfile['id'] variable and pass it to your script via AJAX, jQuery, or simply as a parameter in an iframe src URL.

Here’s an example:
<iframe id="myframe" style="display: block;" scrolling="auto" width="150" height="200" frameborder="0" name="myframe"></iframe>
<script type="text/javascript">
document.getElementById('myframe').src = "http://external.example.com/banner.php?r="+ning.CurrentProfile['id'];
</script>
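On the external host, banner.php (or its equivalent) then only has to look the user up in the exported CSV and pick the banner. A sketch of that lookup (in Python rather than PHP; the column names, file name, and banner names are all made up — check the headers of your own exported file):

```python
import csv

def banner_for(profile_id, csv_path="members.csv",
               id_column="Profile Address", answer_column="Question 1",
               target_answer="Yes"):
    """Pick a banner for a user, based on the exported member CSV.

    Matches profile_id against the id_column (Ning profile ids appear
    inside the profile URL), then checks the custom-question answer.
    """
    with open(csv_path, newline="") as f:
        for row in csv.DictReader(f):
            if profile_id in row.get(id_column, ""):
                if row.get(answer_column) == target_answer:
                    return "partner_banner.png"  # eligible user
                return "generic_banner.png"      # logged in, wrong answer
    return None  # unknown user: no banner
```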

Hope this helps some desperate network owner and: thanx Ning…

P.S. This hack is given as is and it’s not guaranteed it will work forever.


Update 2009-07-16
As requested, I'm uploading the PHP functions that operate as suggested above.
Happy downloading: ningcsv.
Please note that no guarantee is given for this file.
The original piece of code is actually copyrighted by the firm I work for, but I prefer not to mention it, for privacy matters (in case you'd like to sue over it! :-P ).
