Reducing memory requirements of a pyomo model
I am building a big pyomo model with over 1 million constraints and 2 million variables, and I am looking for suggestions to reduce the memory requirements of the model that I am building. At the moment it requires over 20 GB of RAM. How would I reduce this?
I've never tested defining variables with or without within=pyomo.NonNegativeReals, but I am assuming it would reduce the amount of memory required for a given variable. Are there other things I could do without reducing the number of variables or constraints?
E.g. the following var will need X bytes of memory:
m.var = pyomo.Var(m.index)
And maybe the following will need X-1 bytes of memory:
m.var = pyomo.Var(m.index, within=pyomo.NonNegativeReals)
Of course this is speculation; without testing one cannot be sure about this. However, I am willing to try anything if someone has an idea or more experience regarding this issue. Any ideas?
Some tests:
Keep in mind this is not the real model, but the same script built with other example data.
index=1000 // Full Consts         // 347580 KB (commit)  // 370652 KB (working set)
index=1000 // 0 Const, Full Rules // 282416 KB (commit)  // 305252 KB (working set)
index=1000 // 0 Const, 0 Rules    // 282404 KB (commit)  // 305200 KB (working set)
index=1000 // 1 Const, 1 Rule     // 290408 KB (commit)  // 313136 KB (working set)
index=8760 // Full Consts         // 1675860 KB (commit) // 1695676 KB (working set)
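For reference, numbers like these can also be captured from inside Python instead of the task manager, using the standard-library tracemalloc module. A minimal sketch, where build_model is only a stand-in for the real pyomo model build:

```python
import tracemalloc

def build_model(n):
    # Stand-in for the real model build: allocates one dict entry per
    # "variable" so there is something measurable to trace.
    return {i: float(i) for i in range(n)}

tracemalloc.start()
model = build_model(100_000)
current, peak = tracemalloc.get_traced_memory()
tracemalloc.stop()

print(f"current: {current / 1024:.0f} KB, peak: {peak / 1024:.0f} KB")
```

Note that tracemalloc only counts Python-level allocations, so its totals will be lower than the commit/working-set figures the task manager reports.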
python memory pyomo reducing
Is your model flat (i.e., it consists only of a top-level ConcreteModel or AbstractModel), or does it consist of many Block objects?
– Gabe Hackebeil
Mar 21 at 16:36
@GabeHackebeil it only consists of a ConcreteModel; it is actually building the same variables for a given m.index. If you optimize for 1 h, the index will be 1; however, if you optimize for a year, the index will be 1 to 8760, which of course increases the number of vars and constraints...
– oakca
Mar 21 at 16:37
I can point you to an alternative interface that has a way of expressing linear constraints using less memory, but first I want to make sure you are not expressing the constraint rows as dense expressions when they can be sparse. Can you narrow the bulk of the memory usage down to a particular indexed constraint (by commenting out everything else)?
– Gabe Hackebeil
Mar 21 at 16:43
Well, not every constraint has the same index; however, in many cases the constraints contain the time index, which makes the model fat... I will try to narrow the model down, get the memory usage of one constraint (with indexes), and add it here... Btw, is there an easy way to check the memory of the created model, better than checking the task manager?
– oakca
Mar 21 at 16:47
You can look into packages like pympler, but the task manager is probably the faster route.
– Gabe Hackebeil
Mar 21 at 16:52
edited Mar 21 at 17:09
asked Mar 21 at 16:15
– oakca
1 Answer
I've used pympler to analyze the test case you pointed me to. Here is what I've found:
After pyomo_model_prep (loads data and places it onto an empty ConcreteModel):
- memory usage is 13.2 MB
After adding all Set and Param objects:
- memory usage is 13.3 MB
After adding all Var objects:
- memory usage is 14.3 MB
After adding all Constraint objects:
- memory usage is 15.0 MB
When I set the timesteps to 60, the results are:
- memory usage is 13.2 MB (data)
- memory usage is 13.3 MB (after sets, params)
- memory usage is 19.6 MB (after vars)
- memory usage is 23.6 MB (after constraints)
So the variables do have a pretty big impact on model memory when there are a larger number of timesteps. The only obvious place I can see for reducing memory usage is to not store all of the data on the model (or delete it from the model after it is no longer needed), then perhaps what is unused will be cleaned up by the garbage collector.
Unfortunately, there isn't really any easy way to reduce the memory of the variable declarations.
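The "don't keep the input data on the model" suggestion amounts to dropping the reference once construction is done. A plain-Python sketch of the idea (the Model class and the data attribute name are illustrative, not taken from the test case):

```python
import gc

class Model:
    """Minimal stand-in for a ConcreteModel that carries its input data."""
    pass

m = Model()
m.data = [list(range(1000)) for _ in range(1000)]  # stand-in for loaded input

# ... Sets/Params/Vars/Constraints would be built from m.data here ...

# Once nothing else references the raw data, remove it from the model so
# the garbage collector can reclaim the memory it occupied.
del m.data
gc.collect()
```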
Update 1: Just an FYI, pretty much all of the memory usage for the variable declarations is a result of the e_pro_in and e_pro_out indexed variables.
Update 2: If a large number of indices of the e_pro_in and e_pro_out variables are not used in the model, you can reduce memory requirements by building a reduced index set for each of them. Here is how that might look:
e_pro_in_index = []
for t in m.tm:
    for i, j in m.pro_tuples:
        for c in m.com:
            if ...:
                e_pro_in_index.append((t, i, j, c))
m.e_pro_in_index = pyomo.Set(dimen=4, initialize=e_pro_in_index)
m.e_pro_in = pyomo.Var(
    m.e_pro_in_index,
    within=pyomo.NonNegativeReals,
    doc='Power flow of commodity into process (MW) per timestep')
You would need to extract the logic from constraint rules to figure out what indices are not needed.
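How much this saves depends on how sparse the valid index combinations are. A plain-Python sketch of the same filtering idea (the site/process/commodity names and the validity rule are made up for illustration):

```python
from itertools import product

timesteps = range(8760)
processes = [("siteA", "pv"), ("siteA", "chp"), ("siteB", "chp")]
commodities = ["elec", "heat", "gas"]

# Hypothetical sparsity rule: each process type only exchanges certain
# commodities, so most (process, commodity) pairs never appear in a rule.
valid = {("pv", "elec"), ("chp", "elec"), ("chp", "heat"), ("chp", "gas")}

# Dense indexing: every combination, whether the model uses it or not.
dense = list(product(timesteps, processes, commodities))

# Reduced index set: only the combinations the constraints actually use.
sparse = [(t, i, j, c)
          for t, (i, j), c in product(timesteps, processes, commodities)
          if (j, c) in valid]

print(len(dense), len(sparse))  # 78840 vs 61320 index tuples
```

Every tuple dropped here is one variable object (and its share of constraint expressions) that never gets allocated.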
A question though: is there a way to remove some specific indexes of e_pro_in and _out after creating all the indexes, because they are stale=True?
– oakca
Mar 22 at 9:20
If you know ahead of time (before the solve) which indices will be stale, you should build a reduced index set for those variables. I'll add an example to the answer. Doing it after the solve isn't really going to help you reduce peak memory usage.
– Gabe Hackebeil
Mar 22 at 13:38
you are right and thank you :)
– oakca
Mar 22 at 13:46
edited Mar 22 at 13:45
answered Mar 21 at 19:32
– Gabe Hackebeil