How to increase the speed of a php function that returns a token

Good day,

There's a file named "bfile" that contains binary data; a hex dump of its first bytes looks like this:

7f45 4c46 0201 0103 0000 0000 0000 0000
0200 3e00 0100 0000 300c 4000 0000 0000
4000 0000 0000 0000 5062 0c00 0000 0000
0000 0000 4000 3800 0500 4000 1f00 1c00
0100 0000 0500 0000 0000 0000 0000 0000

The size of the file is 814KB. We're not allowed to change the binary in any way.

The function that accesses it looks like this:

function get_auth_token() {
    // Serialize all incoming request parameters to a JSON string
    $arg = json_encode($_REQUEST);
    // Backticks run the binary in a shell and return its standard output
    return `./bfile $arg`;
}

Based on the function, it returns a token like "z6x6ti5taac1mjn-9wG7w44-", but I don't know how that works or how I can increase its speed.
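(For reference, the backtick operator in PHP is equivalent to calling shell_exec(): the function spawns a shell, runs the binary with the JSON-encoded request as its single argument, and returns whatever the binary prints on stdout. An equivalent version is sketched below; the escapeshellarg() call is an illustrative addition, not part of the original code.)

function get_auth_token_equivalent() {
    // Same behaviour as the backtick version above: encode the request
    // parameters as JSON and pass them to the binary as one argument.
    $arg = json_encode($_REQUEST);

    // shell_exec() spawns a shell, runs ./bfile and returns its stdout.
    // escapeshellarg() quotes the JSON so spaces/quotes don't split it.
    return shell_exec('./bfile ' . escapeshellarg($arg));
}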

With my current test (ApacheBench), the result is this:

Server Software:        Apache/2.4.7
Server Hostname:        127.0.0.1
Server Port:            8000

Document Length:        38 bytes

Concurrency Level:      10
Time taken for tests:   201.117 seconds
Complete requests:      1000
Failed requests:        0
Total transferred:      226000 bytes
HTML transferred:       38000 bytes
Requests per second:    4.97 [#/sec] (mean)
Time per request:       2011.172 [ms] (mean)
Time per request:       201.117 [ms] (mean, across all concurrent requests)
Transfer rate:          1.10 [Kbytes/sec] received

Connection Times (ms)
              min  mean[+/-sd] median   max
Connect:        0    0   0.0      0       0
Processing:     5 1995 1434.6   2010    4055
Waiting:        4 1994 1434.5   2009    4054
Total:          5 1995 1434.6   2010    4055

Percentage of the requests served within a certain time (ms)
  50%   2010
  66%   3009
  75%   3012
  80%   4006
  90%   4010
  95%   4013
  98%   4018
  99%   4022
 100%   4055 (longest request)

For a thousand requests it took 201 seconds to finish. I have to improve the design so it can handle several thousand requests per second, but I have no idea how to do that. Please help. Thank you!

1 Answer

fab2s:

If it still makes sense, there is something you could do to increase the number of requests you can handle: use PHP-FPM (easier to set up with nginx) so that you can call fastcgi_finish_request() as soon as you have received the request and returned the response status. This way, the rest of the script runs in the background without keeping the connection slot busy. This should increase your request rate, but it may still not be enough. If so, you would need to scale horizontally by load-balancing the target API endpoint over several servers, and probably use a queue/worker pattern if the work involves loading data into some repository that could also be overwhelmed when the load gets too high.
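A minimal sketch of that pattern, assuming the slow part (the ./bfile call) can run after the response has been sent; the "accepted" payload and the escapeshellarg() call are illustrative additions, not from the original question:

<?php
// Sketch: respond first, then do the expensive work in the background.
// fastcgi_finish_request() only exists when PHP runs under PHP-FPM.

header('Content-Type: application/json');
echo json_encode(['status' => 'accepted']);

// Flush the response to the client and release the connection slot.
if (function_exists('fastcgi_finish_request')) {
    fastcgi_finish_request();
}

// Everything below runs after the client already has its response.
$arg   = json_encode($_REQUEST);
$token = shell_exec('./bfile ' . escapeshellarg($arg));
// ... store $token or push it onto a queue for a worker to pick up ...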

The whole thing is also achievable with Apache and without FPM, but it's a bit more complicated, as you need to set every bit of the response correctly to free the connection artificially. A rough sketch of that variant follows.
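This is only a sketch: whether the connection is actually released early depends on Apache and PHP buffering settings (e.g. output_buffering, mod_deflate), so test it in your own setup. As above, the response payload and escapeshellarg() are illustrative additions.

<?php
// Sketch: free the connection "artificially" under Apache/mod_php by
// declaring the exact response length and closing the connection, then
// flushing everything before the slow work starts.

ignore_user_abort(true);   // keep running even if the client disconnects
ob_start();

header('Content-Type: application/json');
echo json_encode(['status' => 'accepted']);

header('Connection: close');
header('Content-Length: ' . ob_get_length());

ob_end_flush();            // send the buffered body
flush();                   // push it out to the client

// The client has its response; do the expensive call in the background.
$arg   = json_encode($_REQUEST);
$token = shell_exec('./bfile ' . escapeshellarg($arg));
// ... persist or enqueue $token ...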