open-ideas / IDEAS

Modelica library allowing simultaneous transient simulation of thermal and electrical systems at both building and feeder level.

Heat pump model of buildings #731

Closed Mathadon closed 6 years ago

Mathadon commented 7 years ago

Buildings has a new heat pump model by Massimo Cimmino. It performs very well, even outside the region for which the manufacturer specifies performance data. Validation against measurement data:

[Screenshot, 2017-05-19: validation plot comparing simulated and measured heat pump performance]

I would advise using that model instead of ours. They provide a script for 'learning' the heat pump parameters. This requires some reformatting of the data, but given the high accuracy of the model I think that is reasonable. It would also make #536 obsolete.

The question now is: do we delete our model? Do we copy the Buildings model? Either way there won't be a whole lot of IDEAS-specific models left in the Fluid package, which is a bit annoying...

Mathadon commented 6 years ago

@icupeiro @damienpicard , any objections against removing our water-water heat pump model? We then have to start the process within IBPSA to merge the heat pump model from buildings.

damienpicard commented 6 years ago

@Mathadon it looks like a good idea, at least if their model needs the same type of data as the one we use.

icupeiro commented 6 years ago

@Mathadon I agree; this is actually the model that I am using now.

Mathadon commented 6 years ago

@mwetter is it sufficient that I review Massimo's model to have it integrated into the IBPSA library?

mwetter commented 6 years ago

@Mathadon @MassimoCimmino: I think this would be a valuable model for some studies, although at some point we will also need a faster model for standard annual simulations (based on performance curves and/or performance tables). Massimo's model also needs a parameter identification, or someone to build up a package with data records for different heat pumps, where these data records would in turn be based on the parameter identification.

Mathadon commented 6 years ago

The model is indeed quite computationally expensive: my model contains four of these heat pumps, and I recall that together they use about a quarter of the computation time. However, it may well be possible to make the model more efficient. Right now it needs two iteration variables; if we can reduce that to one, I expect performance will improve. The thermodynamic computations could perhaps be simplified too. In theory it should also be possible to couple the Python script to a Modelica model, where a CombiTable2D could then be used to feed the data into the parameter identification algorithm. This could even happen in an initial algorithm section of the heat pump model itself. All of that may take some time to implement though =) However, if it works, it may remove the need for simplified models?

mwetter commented 6 years ago

I think computing time would still be much higher than for a simpler model that computes performance based on tables or functions.

Requiring Python to simulate a model is in my view error-prone; and for EnergyPlus we don't want to impose a Python dependency on users, as distributing a Python runtime that runs on all major operating systems is hard. Requiring Python for developers is fine, however.

Mathadon commented 6 years ago

I agree that it would be higher. I'm just not sure whether it would be that high that it becomes a problem :)

Good point about the Python dependency.

I'll see when and if I have some time to put into this :)

MassimoCimmino commented 6 years ago

Another possibility is to use the Python model to calibrate against the manufacturer data, and then use the calibrated Python model to produce performance tables that cover a broader range of mass flow/temperature conditions.

mwetter commented 6 years ago

That may actually be the preferred setup; e.g., use the "detailed" model to produce performance data for a wider region of operating points, and then store these performance data in polynomials or tables for use in annual simulations.
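The table-generation step described above could be sketched as follows. This is a minimal illustration only: `detailed_model` is a crude stand-in (a Carnot-based guess, not the actual calibrated model), and the grid ranges are made up; a real run would evaluate the calibrated Python/Modelica model at each operating point.

```python
import numpy as np

def detailed_model(T_evap, T_cond):
    """Stand-in for the calibrated detailed model: returns
    (heating capacity [W], COP). Illustrative only; a real run
    would evaluate the calibrated heat pump model here."""
    cop_carnot = (T_cond + 273.15) / (T_cond - T_evap)
    cop = 0.45 * cop_carnot                   # crude second-law efficiency factor
    capacity = 10e3 * (1.0 + 0.02 * T_evap)   # capacity grows with source temperature
    return capacity, cop

# Sweep a wider region of operating points than the manufacturer data covers.
T_evap_grid = np.arange(-15.0, 20.1, 5.0)   # evaporating temperatures [degC]
T_cond_grid = np.arange(25.0, 60.1, 5.0)    # condensing temperatures [degC]

capacity_table = np.empty((T_evap_grid.size, T_cond_grid.size))
cop_table = np.empty_like(capacity_table)
for i, Te in enumerate(T_evap_grid):
    for j, Tc in enumerate(T_cond_grid):
        capacity_table[i, j], cop_table[i, j] = detailed_model(Te, Tc)

# The resulting tables could be written out in the format expected by
# Modelica's CombiTable2D (first row/column holding the grid axes).
```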

lievehelsen commented 6 years ago

I fully agree! Has the detailed model been validated for a broad range of operating conditions? Its validity should be checked …

Mathadon commented 6 years ago

@MassimoCimmino I think the model depends on four external variables: two temperatures and two mass flow rates? So we would need a 4D table, unless the number of independent variables can be reduced somehow. Polynomials could be easier to implement, but are prone to overfitting...
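To make the 4D-table concern concrete, here is a small sketch of multilinear interpolation on a 4D grid. Everything here is illustrative: `fake_cop` is an invented stand-in for the detailed model, and the axes/ranges are assumptions; the point is that even a coarse grid over four independent variables already needs over a thousand entries.

```python
import numpy as np

def interp_nd(axes, table, point):
    """Multilinear interpolation on a regular grid of any dimension.
    axes: list of sorted 1D arrays; table: ndarray of matching shape.
    Reduces one axis per loop iteration by blending two slices."""
    for axis_vals, x in zip(axes, point):
        i = int(np.clip(np.searchsorted(axis_vals, x) - 1, 0, axis_vals.size - 2))
        w = (x - axis_vals[i]) / (axis_vals[i + 1] - axis_vals[i])
        table = (1.0 - w) * table[i] + w * table[i + 1]
    return float(table)

# Hypothetical independent variables: 2 temperatures, 2 mass flow rates.
axes = [np.linspace(-10.0, 15.0, 6),   # evaporator-side inlet temperature [degC]
        np.linspace(25.0, 55.0, 7),    # condenser-side inlet temperature [degC]
        np.linspace(0.1, 0.5, 5),      # source-side mass flow rate [kg/s]
        np.linspace(0.1, 0.5, 5)]      # load-side mass flow rate [kg/s]

def fake_cop(Te, Tc, ms, ml):
    """Invented placeholder; real values would come from the detailed model."""
    return 6.0 + 0.05 * Te - 0.04 * Tc + 1.0 * ms + 0.5 * ml

table = np.array([[[[fake_cop(a, b, c, d) for d in axes[3]]
                    for c in axes[2]] for b in axes[1]] for a in axes[0]])
# Even at this coarse resolution the 4D table holds 6*7*5*5 = 1050 entries.
cop = interp_nd(axes, table, (2.0, 40.0, 0.3, 0.25))
```

Because `fake_cop` is affine in each variable separately, the multilinear lookup reproduces it exactly; a real COP surface would of course incur interpolation error that grows with grid spacing.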

An idea for a generic workflow to automatically derive table/polynomial models from stationary components: we could create a Modelica model (template) that automatically performs a parameter sweep over a developer-defined set of inputs (independent variables) and generates interpolation points for a set of developer-defined outputs (dependent variables), using a Modelica algorithm or something similar. This model could be packaged into an FMU and shipped with the library. The FMU could be called from the initial algorithm section of a simplified model using C functions, passing in the model parameters. The reduced-order model (table or polynomial) could then be generated on the fly using the FMU. Results (table interpolation points or polynomial coefficients) could be cached such that the initialization overhead occurs only when the user changes input parameters. This approach does not require Python and could be used for any component model that has no states and only a limited set of dependent variables. We would, however, need a copy of the (JModelica) FMI shared libraries.
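The caching idea in the workflow above (regenerate the reduced-order data only when parameters change) could be sketched like this. Note this only illustrates the cache-keying logic; `cached_table`, `sweep`, and the `UA_evap` parameter name are all hypothetical, and the real workflow would invoke the packaged FMU through its C interface rather than a Python callable.

```python
import hashlib
import json
import os
import pickle
import tempfile

def cached_table(params, generate, cache_dir):
    """Return reduced-order model data for `params`, regenerating it only
    when the parameter set changes. `generate(params)` stands in for the
    expensive FMU-based parameter sweep (hypothetical)."""
    os.makedirs(cache_dir, exist_ok=True)
    # Key the cache on a stable hash of the full parameter set.
    key = hashlib.sha256(json.dumps(params, sort_keys=True).encode()).hexdigest()
    path = os.path.join(cache_dir, key + ".pkl")
    if os.path.exists(path):            # cache hit: skip the sweep entirely
        with open(path, "rb") as f:
            return pickle.load(f)
    data = generate(params)             # cache miss: run the sweep once
    with open(path, "wb") as f:
        pickle.dump(data, f)
    return data

cache_dir = tempfile.mkdtemp()
sweep_calls = []

def sweep(params):
    """Hypothetical stand-in for evaluating the FMU over a grid of points."""
    sweep_calls.append(1)
    return [params["UA_evap"] * q for q in (0.5, 1.0, 1.5)]

t1 = cached_table({"UA_evap": 2000.0}, sweep, cache_dir)
t2 = cached_table({"UA_evap": 2000.0}, sweep, cache_dir)  # served from cache
```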

Setting this up would be a bit of work, but it would be generic. Very slow, badly designed models could in theory be sped up significantly without having to put in the time to 'fix' the model equations. If we used polynomials, we would still have to check that the resulting fits are not overfitted, or we could use a different set of basis functions that are less prone to overfitting. Ideally the user could specify the causality of the polynomials: whether an output is computed from an input or vice versa.

For optimization purposes, polynomials or other basis functions are preferred over tables, since tables cannot be handled by JModelica.

MassimoCimmino commented 6 years ago

@lievehelsen Right now the model parameters have been calibrated so that the predicted performance (capacity and COP) matches the manufacturer data over their full range.

@Mathadon @mwetter I wonder if some hybrid method in between the thermophysical model and a data-map model could increase the computational efficiency (and maybe also the accuracy). The model in Buildings has 8 calibrated parameters, divided among three components: Evaporator (1), Condenser (1), and Compressor (6). The Compressor model is the one slowing down calculations, because it needs to evaluate the refrigerant's equation of state. Its interfaces are the evaporating temperature, condensing temperature, load-side heat transfer rate, source-side heat transfer rate, and power input. This model could be replaced by a 2D table that returns the last three as a function of the evaporating and condensing temperatures. The Evaporator and Condenser models handle the mass flow rate dependency through the epsilon-NTU method.
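The epsilon-NTU step mentioned above is what lets the mass flow dependency stay outside the table. For a heat exchanger where the refrigerant side is at a constant (phase-change) temperature, the effectiveness reduces to eps = 1 - exp(-NTU). A minimal sketch, with the UA value and operating conditions purely illustrative:

```python
from math import exp

def epsilon_ntu_q(UA, m_dot, cp, T_fluid_in, T_refrigerant):
    """Heat transfer rate [W] between a single fluid stream and a
    refrigerant at constant (phase-change) temperature. With one side
    isothermal, effectiveness is eps = 1 - exp(-NTU)."""
    NTU = UA / (m_dot * cp)                 # number of transfer units [-]
    eps = 1.0 - exp(-NTU)                   # effectiveness for C_r = 0
    return eps * m_dot * cp * (T_fluid_in - T_refrigerant)

# Evaporator example (illustrative values): water entering at 10 degC,
# evaporating temperature 2 degC, UA = 2000 W/K, 0.5 kg/s of water.
Q_evap = epsilon_ntu_q(UA=2000.0, m_dot=0.5, cp=4186.0,
                       T_fluid_in=10.0, T_refrigerant=2.0)
```

Because the mass flow rate enters only through NTU here, the calibrated compressor table can indeed depend on the two refrigerant temperatures alone.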

This would require a new calibration algorithm to fit the parameters of the Evaporator and Condenser and to generate performance tables based on temperatures only.

Mathadon commented 6 years ago

@mwetter is there an interest to merge the heat pump model of @MassimoCimmino into IBPSA?

mwetter commented 6 years ago

That would be fine with me if others are interested.

damienpicard commented 6 years ago

+1 for merging the heat pump model

damienpicard commented 6 years ago

@Mathadon shouldn't we include that heat pump model in v2.0 of IDEAS?

Mathadon commented 6 years ago

This model has been merged into the IBPSA library.