EntityStreamSizeException Due to Request Content Size Limit Exceeded


I'm using the Watson Machine Learning service to deploy a DOcplex model (Lite plan). When I make a request to the platform, it fails because the request content exceeds a size limit. The error message received is as follows:

Exception has occurred: ApiRequestFailure Failure during scoring asyncronously. Status code: 413. The request content was malformed: EntityStreamSizeException: incoming entity size (32281635) exceeded size limit (10485760 bytes)! This may have been a parser limit (set via pekko.http.[server|client].parsing.max-content-length), a decoder limit (set via pekko.http.routing.decode-max-size), or a custom limit set with withSizeLimit.

To deploy the model, I'm following this example.

The error occurred while executing:

solve_payload = {
    # Solver parameters: attach a tailed log and return results as JSON
    client.deployments.DecisionOptimizationMetaNames.SOLVE_PARAMETERS: {
        'oaas.logAttachmentName': 'log.txt',
        'oaas.logTailEnabled': 'true',
        'oaas.includeInputData': 'false',
        'oaas.resultsFormat': 'JSON'
    },
    # The LP model content is inlined in the request body; getfileasdata is
    # the file-reading helper from the example I'm following
    client.deployments.DecisionOptimizationMetaNames.INPUT_DATA: [
        {
            "id": cplex_file,
            "content": getfileasdata("model.lp")
        }
    ],
    # Regex patterns for the output files to collect from the job
    client.deployments.DecisionOptimizationMetaNames.OUTPUT_DATA: [
        {
            "id": r".*\.json"  # raw strings keep the regex escapes intact
        },
        {
            "id": r".*\.txt"
        }
    ]
}
job_details = client.deployments.create_job(deployment_uid, solve_payload)
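
For reference, the size the server complains about can be roughly reproduced on the client side before submitting (MAX_CONTENT_LENGTH here is just the 10485760-byte figure quoted in the error, not an official client constant):

import json

# The limit quoted in the 413 error; taken from the message, not an official constant
MAX_CONTENT_LENGTH = 10485760

# Rough size of the serialized request body, to fail fast on the client side
payload_bytes = len(json.dumps(solve_payload).encode("utf-8"))
print(f"payload is {payload_bytes} bytes, limit is {MAX_CONTENT_LENGTH}")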

The suggestion I got from IBM support was to adjust the pekko.http.server.parsing.max-content-length parameter to an unlimited value. But as far as I understand, that setting lives on the server side, not on mine. Or is this limit specific to the Lite plan?
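
In the meantime, I'm wondering whether I can sidestep the limit on the client side by referencing the model from storage instead of inlining it in the request. Below is a minimal sketch of what I mean, assuming INPUT_DATA_REFERENCES is supported on my plan and that model.lp has already been uploaded as a data asset in the deployment space; the href is a placeholder, not a real asset id:

# Sketch: pass a reference to the model instead of inlining ~32 MB of content.
# Assumes the file was uploaded to the space beforehand; the href below is a
# placeholder, not a real asset id.
solve_payload = {
    client.deployments.DecisionOptimizationMetaNames.SOLVE_PARAMETERS: {
        'oaas.logAttachmentName': 'log.txt',
        'oaas.logTailEnabled': 'true',
        'oaas.resultsFormat': 'JSON'
    },
    client.deployments.DecisionOptimizationMetaNames.INPUT_DATA_REFERENCES: [
        {
            "id": "model.lp",
            "type": "data_asset",
            "location": {"href": "/v2/assets/<asset_id>?space_id=<space_id>"}
        }
    ],
    client.deployments.DecisionOptimizationMetaNames.OUTPUT_DATA: [
        {"id": r".*\.json"},
        {"id": r".*\.txt"}
    ]
}
job_details = client.deployments.create_job(deployment_uid, solve_payload)

Would something like this avoid the 10 MB request limit, or does the Lite plan restrict that as well?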
