How to save 100 objects to the server with AJAX and PHP?


Say I have a page with 100 objects, and each object is around 700 bytes when converted to JSON.

To save the objects to the PHP-based controller, I have the following options.

Option 1

For each of the 100 objects, do the following (a sketch follows this list):

  1. Take the object definition.
  2. Convert it to JSON.
  3. Do an HTTP POST to the PHP controller.
  4. The PHP controller saves it to a file or database.
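
A minimal client-side sketch of option 1, assuming a hypothetical save.php endpoint that accepts one JSON object per POST body (the endpoint name is a placeholder, not from the question):

    // Option 1: one HTTP POST per object, sent sequentially.
    async function saveObjectsOneByOne(objects) {
      for (const obj of objects) {
        const response = await fetch('save.php', {
          method: 'POST',
          headers: { 'Content-Type': 'application/json' },
          body: JSON.stringify(obj), // one ~700-byte object per request
        });
        if (!response.ok) {
          throw new Error('Save failed with HTTP ' + response.status);
        }
      }
    }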

Option 2

Declare a variable bigJsonString.

For each of the 100 objects, do the following:

  1. Take the object definition.
  2. Convert it to JSON.
  3. Append the JSON to the string variable bigJsonString, with a delimiter to mark the end of each object.

After the big fat bigJsonString is constructed (a sketch follows this list):

  1. Do a single HTTP POST to the PHP controller, sending bigJsonString.
  2. The PHP controller saves it to a file or database.
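
A sketch of option 2 as described, assuming a newline as the end-of-object delimiter (JSON.stringify never emits raw newlines, so it cannot collide with the payload); save.php is again a placeholder:

    // Option 2: concatenate every object's JSON into one big delimited
    // string and POST it once.
    async function saveObjectsAsOneString(objects) {
      let bigJsonString = '';
      for (const obj of objects) {
        bigJsonString += JSON.stringify(obj) + '\n'; // newline = end of object
      }
      const response = await fetch('save.php', {
        method: 'POST',
        headers: { 'Content-Type': 'text/plain' },
        body: bigJsonString,
      });
      if (!response.ok) {
        throw new Error('Save failed with HTTP ' + response.status);
      }
    }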

In option 1, I am doing 100 HTTP POSTs one after another. Does this raise any alarms? Is this normal for web applications doing AJAX POSTs?

The second option seems safe, but my concern is what happens when the 100 objects become, say, 500, or grow to the point where bigJsonString is several megabytes long.

A third option would be a hybrid of options 1 and 2: start constructing bigJsonString, and if its length reaches a certain limit, do an AJAX POST, flush the string, and build it up again for the remaining objects (sketched below).
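
A sketch of the hybrid, flushing whenever the accumulated string passes a length limit; the 256 KB threshold is an arbitrary illustration, not a recommendation:

    // Option 3 (hybrid): accumulate JSON until a length limit, then flush.
    // The threshold and the save.php endpoint are illustrative only.
    const FLUSH_LIMIT = 256 * 1024;

    async function saveObjectsInChunks(objects) {
      let bigJsonString = '';

      async function flush() {
        if (bigJsonString.length === 0) return;
        const response = await fetch('save.php', {
          method: 'POST',
          headers: { 'Content-Type': 'text/plain' },
          body: bigJsonString,
        });
        if (!response.ok) {
          throw new Error('Save failed with HTTP ' + response.status);
        }
        bigJsonString = ''; // flush the buffer and keep going
      }

      for (const obj of objects) {
        bigJsonString += JSON.stringify(obj) + '\n';
        if (bigJsonString.length >= FLUSH_LIMIT) {
          await flush();
        }
      }
      await flush(); // send whatever is left over
    }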

What are the pitfalls, and what is the normal or standard practice? If someone can point to resources where this has already been analysed, that would be great.

Thanks very much.


There are 4 answers

strager (best answer)

Browsers generally limit the number of simultaneous connections to a single domain to a low number (under 20 by default for most browsers), so many of your requests will block in the meantime.

On the other hand, larger requests will take longer to fully process because there are fewer opportunities for parallelization.

Ideally, you would try both methods and see which one works most effectively.

A note: for the second method, you could create an array of the objects and then serialize the array as JSON, instead of manually dealing with the JSON string. (JSON supports arrays, not just objects!)
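
For example (a sketch, where objects is the array of 100 objects and save.php is a placeholder; on the PHP side, json_decode would turn the request body back into an array):

    // Serialize the whole array in one call; no hand-rolled delimiters.
    const payload = JSON.stringify(objects); // e.g. '[{"a":1},{"a":2}]'

    fetch('save.php', {
      method: 'POST',
      headers: { 'Content-Type': 'application/json' },
      body: payload,
    }).then(function (response) {
      if (!response.ok) {
        throw new Error('Save failed with HTTP ' + response.status);
      }
    });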

Summer

Go for option 2, bigJsonString. You should have no trouble passing messages that are several megabytes long - the same infrastructure is used to pass much larger HTML, image, stylesheet, script, and video files over the internet.

simshaun

I guess it's situational. However, I don't see any situation in which sending 100 requests to the server all at once is good.

Personally, I'd just push each object onto a JavaScript array and send a JSON representation of the array to PHP; that way you don't have to worry about delimiters.

Byron Whitlock

It depends on how fast you are posting the objects to the server. If the JSON objects are being posted, say, one every second, one object per post isn't too bad. But if you are posting 100 in a second, you really need to build up a larger request.

There will be significant lag for each request, so building a large multi-object JSON string is preferable in terms of performance.

What if there is an error in one of the objects? You will need to make sure it doesn't stop processing of all the other objects, or the user will have to upload all that data again.

If you do multiple requests, you can give better user feedback client-side, since you know exactly where you are in the queue of objects; for example:
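
A sketch of that feedback, assuming a hypothetical onProgress callback that updates the UI (save.php is again a placeholder):

    // Per-request progress: after each POST you know exactly how many
    // objects have been saved.
    async function saveWithProgress(objects, onProgress) {
      for (let i = 0; i < objects.length; i++) {
        const response = await fetch('save.php', {
          method: 'POST',
          headers: { 'Content-Type': 'application/json' },
          body: JSON.stringify(objects[i]),
        });
        if (!response.ok) {
          // Report which object failed so only the rest need resending.
          throw new Error('Object ' + i + ' failed: HTTP ' + response.status);
        }
        onProgress(i + 1, objects.length); // e.g. update a progress bar
      }
    }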

It is up to you to balance all this.

Good luck.