Merge lp:~numerigraphe-team/ocb-addons/7.0-fill-inventory-OOM into lp:ocb-addons
Proposed by
Loïc Bellier - Numérigraphe
Status: | Merged |
---|---|
Merged at revision: | 10210 |
Proposed branch: | lp:~numerigraphe-team/ocb-addons/7.0-fill-inventory-OOM |
Merge into: | lp:ocb-addons |
Diff against target: |
66 lines (+27/-20) 1 file modified
stock/wizard/stock_fill_inventory.py (+27/-20) |
To merge this branch: | bzr merge lp:~numerigraphe-team/ocb-addons/7.0-fill-inventory-OOM |
Related bugs: |
Reviewer | Review Type | Date Requested | Status |
---|---|---|---|
Laurent Mignon (Acsone) (community) | code review | Approve | |
Guewen Baconnier @ Camptocamp | code review, test | Approve | |
Lionel Sausin - Initiatives/Numérigraphe (community) | co-author | Abstain | |
Holger Brunn (Therp) | code review | Approve | |
Review via email: mp+217049@code.launchpad.net |
Description of the change
We're having problems importing an inventory for a location that contains >630000 Stock Moves.
The problem seems to lie in the Python loop over the browse() of the Stock Moves: it preloads too much data and pushes the OS into an out-of-memory condition (which Linux responds to by killing the openerp server).
This patch fixes the bug by splitting the list of stock move ids into separate chunks of 10,000 lines before browsing them.
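The chunking approach described above can be sketched as follows. This is an illustrative sketch only, not the actual diff; `move_obj`, `move_ids`, and `process` are hypothetical names standing in for the wizard's real variables:

```python
# Illustrative sketch (not the actual patch): process record ids in
# fixed-size chunks so browse() never preloads the whole record set
# into memory at once.
CHUNK_SIZE = 10000  # batch size used in the proposed fix


def iter_chunks(ids, size=CHUNK_SIZE):
    """Yield successive slices of at most `size` ids."""
    for start in range(0, len(ids), size):
        yield ids[start:start + size]


# Hypothetical usage inside the wizard (names are assumptions):
# for chunk in iter_chunks(move_ids):
#     for move in move_obj.browse(cr, uid, chunk, context=context):
#         process(move)
```

Each `browse()` call then only loads at most 10,000 records, keeping the ORM's prefetch cache bounded.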
Run GREEN on runbot.
pull request on standard 7.0: https:/
I think you'd be better off using search(..., limit=your_limit, offset=your_offset).
If you don't want to make that change, at least the 10000 should be in a variable, and you should use range(start, end, step).