Details
- Type: Bug
- Status: Open
- Priority: TBD
- Resolution: Unresolved
- Affects Version/s: None
- Fix Version/s: None
- Labels: None
Description
I found that the default maximum memory limit for the Docker container is around 500 MB. Can this be increased to a larger value?
I have an application that updates about 30,000 users. It was working fine, but then I added custom fields to the JSON being updated. At that point, Docker started terminating the user module process after updating about 500 users. I suspect this is due to the extra work the web API does to validate the custom fields. After the process was terminated, I increased the memory limit to 1 GB by running the following commands.
docker update --memory 1G 0dea573c17b7
docker start 0dea573c17b7
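If the container is ever recreated, the same limit could presumably be applied up front at creation time, something like the following (with <image> as a placeholder for the actual image name):
docker run --memory 1g <image>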
Increasing the limit fixed the problem. Before doing so, I had run the following command and saw that Docker had terminated the process for using too much memory.
docker inspect 0dea573c17b7
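A quicker way to check just the out-of-memory status would be something like the following, which prints only the OOMKilled flag from the container state:
docker inspect --format '{{.State.OOMKilled}}' 0dea573c17b7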
I think the 500 MB default limit is too small.