steve-ransome opened this issue 4 years ago
@steve-ransome I'm looking forward to hosting this function. Have you thought about the function's signature? What are the inputs, and what is the output?
I should look at your PVSC paper again to refresh my memory, but Sandia is blocking your website. Would you mind sending it to me directly?
Thanks Cliff.
I've been writing Python code for myself and clients for a few years, but I am not yet familiar with these collaborative efforts, although I am learning. The Python I added is just simple sample code to calculate some of the LFM coefficients and to show my naming convention, which allows variables to be identified as "reference, measured or normalised" and also "uncorrected or temperature corrected" (e.g. "nIsc_T"). I need to know whether this is OK to use, as the naming convention document only allows names like i_sc and v_oc.
Basically the LFM function will transform measured values from an IV curve into normalised values (a rough sketch of the normalisation step is shown below).

Inputs:
- IV curve: (mIsc, mRsc, mImp, mVmp, mRoc, mVoc ...)
- Reference module STC values: (rIsc, rImp, rVmp, rVoc, rIsc_Alpha etc. ...)
- Weather data: (irradiance Gi, module temperature Tm, wind speed WS ...)

Outputs:
- Normalised LFM values: (nIsc_U, nRsc, nImp, nVmp, nRoc, nVoc_U, and temperature corrected _T if desired)
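For illustration, a minimal sketch of that transform with placeholder names (the final code will follow the pvlib naming convention), assuming Gi is given in kW/m² so that current-type values are divided by both their reference value and the irradiance:

```python
def lfm_meas_to_norm(mIsc, mImp, mVmp, mVoc, rIsc, rImp, rVmp, rVoc, Gi_kwm2):
    """Return uncorrected normalised LFM values from one measured IV point.

    Currents are divided by (reference * irradiance in suns); voltages are
    divided by their reference only. Names are placeholders for this sketch.
    """
    return {
        'nIsc_U': mIsc / (rIsc * Gi_kwm2),
        'nImp': mImp / (rImp * Gi_kwm2),
        'nVmp': mVmp / rVmp,
        'nVoc_U': mVoc / rVoc,
    }
```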
The MPM function fits up to n = 6 normalised coefficients to predict a normalised LFM output such as nIdc or nVoc:

nLFM = fn(Gi, Tmod, WS, C1 .. Cn)
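As a rough sketch of the fitting machinery only (the model expression below is a placeholder; the actual MPM terms are described in the paper), something like scipy.optimize.curve_fit can recover C1..C6 from weather data and a normalised LFM output:

```python
import numpy as np
from scipy.optimize import curve_fit

def mpm_placeholder(x, c_1, c_2, c_3, c_4, c_5, c_6):
    """Placeholder 6-coefficient form of nLFM = fn(Gi, Tmod, WS, C1..C6).

    The real MPM terms are defined in the referenced papers; this only
    shows the shape of the fit.
    """
    g_i, t_mod, ws = x  # irradiance (kW/m^2), module temperature (C), wind speed (m/s)
    return (c_1
            + c_2 * (t_mod - 25.0)
            + c_3 * np.log10(g_i)
            + c_4 * g_i
            + c_5 * ws
            + c_6 / g_i)

# usage, given arrays g_i, t_mod, ws and a normalised output n_lfm:
# coeffs, _ = curve_fit(mpm_placeholder, (g_i, t_mod, ws), n_lfm)
```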
I've been reading up on some of the coding examples, such as the 1-diode model, to get an idea of the style used for comments, layout etc., and hope to follow that. Maybe if I follow https://www.python.org/dev/peps/pep-0362/ I can get you code that is closer to what's needed.
Here are my Chicago files uploaded; please tell me if you can access them:
1906_PVSC46_Chicago_Ransome_ppt_Presented.pdf
1906_PVSC46_Chicago_Ransome.pdf
Is anyone else blocked from www.steveransome.com?
Does Sandia give a reason why my website is blocked? Perhaps you could ask them to unblock it, or give me a reason why not.
I appreciate the amount of work you have already done over the years and will try to contribute as best I can, even though I don't yet understand everything mentioned on GitHub.
> ... allows variables to be identified as "reference, measured or normalised" and also "uncorrected or temperature corrected" (e.g. "nIsc_T"). I need to know whether this is OK to use, as the naming convention document only allows names like i_sc and v_oc.
We can work through naming conventions for these new quantities with the pull request. It really helps to see the variables in context.
> I've been reading up on some of the coding examples, such as the 1-diode model, to get an idea of the style used for comments, layout etc., and hope to follow that. Maybe if I follow https://www.python.org/dev/peps/pep-0362/ I can get you code that is closer to what's needed.
I don't think we are using PEP-362. By "signature" I only meant a list of inputs and outputs.
I can get the files from here, thank you. www.steveransome.com is not blocked, but the links to the hosted files are blocked here. Hard to say why without a dozen phone calls.
Thanks. I should be able to send some code over in a few days, where I've tried to follow everything and made it clear where I couldn't achieve something (like naming).
As it's Python it should be clear what I am trying to do, but I do add comments too. You can see the basics from the first py file I sent.
I presented a virtual talk at "August 5, 2020 PVPMC Webinar on PV Performance Modeling Methods" https://pvpmc.sandia.gov/resources-and-events/events/2020-pvpmc-webinars/
"How to use the Loss Factors and Mechanistic Performance Models effectively with PVPMC/PVLIB" https://pvpmc.sandia.gov/download/7879/
After receiving some positive feedback I'll soon be able to upload code as described in slides 15 and 16.
What's the best way to go about this? I've followed as many of the hints and tips as I can find for coding, documentation and naming, but I'm sure there will be changes needed. There'll be an ipynb file, a py library file, some default measurement data and reference measurements, and it will generate the 5 graphs in slide 16. Instructions are provided for you to use your own measurement data.
I am not familiar with the procedures you've already been using and welcome suggestions and links.
Hi Steve,
This is very exciting and welcome news! Thank you for offering your contributions to pvlib!
The best source of info would be the contributing section of the pvlib documentation. Here are the salient points:
pvlib python uses Git & GitHub to manage contributions. There are other methods, but this will give you the most control and ease integration of your code. Therefore you will need a GitHub account with your own fork of pvlib-python, and a Git client such as TortoiseGit.

1. The easiest way to integrate your code into pvlib python is on your machine first! This is done by "cloning" the pvlib Git repository to your computer. If you installed TortoiseGit, browse to the location where you want to work on your development version of pvlib, right-click the folder and select "clone", then enter the URL of your fork of the GitHub pvlib python repo, e.g. https://github.com/steve-ransome/pvlib-python, and enter your GitHub credentials when prompted. (**)
2. pvlib python uses a strategy called feature branching to integrate enhancements like the loss factor model. Using TortoiseGit, navigate to the Git repo you just cloned, right-click anywhere and select "checkout new branch", then enter a descriptive branch name like "loss-factor-model".
3. Now integrate your code into the branch on your computer. After you've made the changes, you need to "commit" them to the repository. Right-click again and select "commit". You should see that there are "untracked" changes. Click the files you want to add, add a descriptive comment and click "commit".
4. Next you need to make your changes known to GitHub by pushing them to your fork. Using TortoiseGit, right-click and select "push". In the dialog select "origin" and the name of your branch, "loss-factor-model", then click OK. Enter your GitHub credentials again if prompted.
5. The last step is to let the maintainers know you want to integrate your changes by opening a "pull request" on GitHub. Go to your online fork of pvlib python and you should see a message noting that GitHub observed your changes and offering to open a pull request from you to the main pvlib python repo. Do it.
Here are some more references to help you get started:
Here are the additional notes where I left asterisks
Thanks @mikofski , I'll try that over the weekend, there's a lot to take in!
Thanks @mikofski, I've been through all of your documentation but am still struggling to get this set up properly. I've used TortoiseGit.
I've tried to add my mlfm files in a directory pvlib_python\pvlib\mlfm. There is a notebook (mlfm_pvpmc1.ipynb), a library (mlfm_lib_1.py), reference files (ref, test and matrix.csv), measurement files (*.csv), and a graphs directory where generated .png files are automatically saved. The ipynb and .py files have a lot of documentation. I've just put in the first couple of graphs and once it's working I can update and add more.
Please can anyone let me know if this is OK, or what else I should do?
@pvlib/pvlib-maintainer @steve-ransome has sent me some refactored code for the module loss factor model (MLFM) that, with some attention to formatting and variable names, could be considered for pvlib-python. I'm soliciting your views on if/how it could be added.
Summary: MLFM uses empirical expressions to predict normalized IV curves at irradiance and temperature conditions of interest. In this regard it is similar to the single diode models in pvlib (e.g. desoto). MLFM requires that the expressions be fit to input IV curve data.
Arguments in favor:
- MLFM complements other IV prediction methods.
- The code is not currently available and is unlikely to be distributed as its own project.
- The code is non-trivial in that it encapsulates Steve and Juergen Sutterluetti's long experience using this technique.

Arguments against:
- MLFM doesn't need or use other components of pvlib, so MLFM could be its own project. I think that would be better in concept, but in reality it is quite unlikely to happen.
- MLFM does not fit nicely into the ModelChain/PVSystem structure, in that it predicts normalized IV curves rather than power and voltage.
MLFM might also be considered for pvlib/pvanalytics but would be an entirely new dimension for that project.
As far as where to put MLFM (assuming we agree it belongs in pvlib), I think a new code module in pvlib/ivtools makes most sense.
I don't have any personal interest in using or maintaining MLFM code. I'll hazard a guess that requiring IV curve inputs is a prohibitive obstacle for the majority of pvlib users. I wouldn't stand in the way if someone else is interested in adding it though. If it doesn't seem like a good fit for pvlib/pvlib-python or pvlib/pvanalytics, it could be its own repo pvlib/mlfm.
[SRCL] Thanks for the helpful comments, Cliff. I've been working on LFM-type approaches for a long time and there are some advanced benefits that need further explanation; I've added comments below. For some recent background, I presented a paper at the virtual PVSC:
- Talk: http://www.steveransome.com/PUBS/2021_06_PVSC48_Florida_Ransome_210626t16_altered.pdf
- Paper: http://www.steveransome.com/PUBS/2021_06_PVSC48_Florida_Ransome_210617t10_submitted.pdf
[Cliff] @pvlib/pvlib-maintainer @steve-ransome has sent me some refactored code for the module loss factor model (MLFM) that, with some attention to formatting and variable names, could be considered for pvlib-python. I'm soliciting your views on if/how it could be added.
[SRCL] I'm working on the formatting and variable names at the moment. I'm using the pvlib naming convention but will need to extend it.
[Cliff] Summary: MLFM uses empirical expressions to predict normalized IV curves at irradiance and temperature conditions of interest. In this regard it is similar to the single diode models in pvlib (e.g. desoto). MLFM requires that the expressions be fit to input IV curve data.
[SRCL] The MLFM is a little closer to the SAPM approach than to the single diode model, in that it "predicts points from measurements, not a full IV curve". Depending on what measured values are available, I can derive 2, 4, 6 or 8 normalised loss factors:

- DC measurements: 1 or 2 points (p_mp, or i_mp and v_mp)
- IEC 61853 matrix: 4 points (i_mp, v_mp, i_sc and v_oc)
- Full IV curve: 6 or 8 points (i_mp, v_mp, i_sc, v_oc, r_sc, r_oc; optionally i@vmp/2 and v@imp/2)

See this for further explanation: http://www.steveransome.com/PUBS/1809_PVSEC35_Bruss)SRCL_5CV1.28%20paper.pdf

The SAPM doesn't use the resistances r_sc (= -1/(di/dv) at v=0) or r_oc (= -1/(di/dv) at i=0), but the MLFM does, as they can be important in finding losses and degradation due to r_shunt and r_series respectively. Also, the SAPM had two currents i_x and i_xx; the only reason I saw for them was in doing a curve fit to an IV curve from i_sc, i_x, i_mp, v_mp, i_xx and v_oc. MLFM can measure i@vmp/2 and v@imp/2 and calculate values referenced to what would be expected extrapolating from (i_sc, r_sc), i.e. i = i_sc - v/r_sc, or from (v_oc, r_oc), i.e. v = v_oc - i * r_oc, to give mathematical values indicating cell mismatch/shading and Schottky rollover if they differ from their usual values of ~98-99%.
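As an illustrative sketch only (not the MLFM code itself), those slope definitions mean r_sc and r_oc can be estimated from a measured IV curve roughly like this:

```python
import numpy as np

def estimate_r_sc_r_oc(v, i, n_pts=5):
    """Estimate r_sc = -1/(di/dv) near v=0 and r_oc = -1/(di/dv) near i=0
    from a measured IV curve, assuming v is sorted ascending from 0 to v_oc.
    Illustrative sketch; n_pts is an arbitrary choice for the local fit.
    """
    v = np.asarray(v, dtype=float)
    i = np.asarray(i, dtype=float)
    didv_sc = np.polyfit(v[:n_pts], i[:n_pts], 1)[0]    # slope near short circuit
    didv_oc = np.polyfit(v[-n_pts:], i[-n_pts:], 1)[0]  # slope near open circuit
    return -1.0 / didv_sc, -1.0 / didv_oc
```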
[Cliff] Arguments in favor:
- MLFM complements other IV prediction methods.
- The code is not currently available and is unlikely to be distributed as its own project.
- The code is non-trivial in that it encapsulates Steve and Juergen Sutterluetti's long experience using this technique.
[SRCL] There's a lot of functionality in the MLFM regarding performance limitation identification (e.g. "r_sc too low"), stability/degradation/astability etc. that I haven't seen in any other models.
[Cliff] Arguments against:
- MLFM doesn't need or use other components of pvlib, so MLFM could be its own project. I think that would be better in concept, but in reality it is quite unlikely to happen.
[SRCL] What I've submitted so far doesn't link to existing pvlib. However, the MLFM can and does use pvlib in the code I'm already using as a consultant, which I can maybe give to pvlib soon. For example, the MLFM works best with effective irradiances (including angle of incidence and spectral corrections) as already calculated in pvlib. I had my own simple methods (http://www.steveransome.com/PUBS/1906_PVSC46_Chicago_Ransome.pdf) but would prefer to use pvlib methods if possible. I also use sol_pos (solar position) and tracking options.
[Cliff] MLFM does not fit nicely into the ModelChain/PVSystem structure, in that it predicts normalized IV curves rather than power and voltage.
[SRCL] It predicts normalised points, e.g. "meas_imp/ref_imp/poa_global_kwm2" and "meas_vmp/ref_vmp", rather than IV curves. Although I think it's easier to use normalised data for analysis, predictions, degradation and loss factor studies, it's a simple method to generate array currents (Idc, A) and voltages (Vdc, V) when modelling large systems, which I do:

v_dc_array = norm_v_mp * ref_v_mp * series_modules

I simulated the PVPMC SANDIA blind modelling study using MLFM methods. I've also compared it against the SAPM in the power_to_energy yield tutorial, and it could feature there if/when the algorithms are added to pvlib.
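For example, a minimal sketch of that scaling step (the voltage line follows the expression above; the current line and the strings_parallel input are my assumptions for illustration):

```python
def mlfm_norm_to_array_dc(norm_i_mp, norm_v_mp, ref_i_mp, ref_v_mp,
                          poa_global_kwm2, series_modules, strings_parallel):
    """Scale normalised mp values to array-level DC current, voltage and power.

    The voltage expression follows the line above; re-multiplying the
    normalised current by irradiance (kW/m^2) and strings_parallel is an
    assumed form for illustration only.
    """
    v_dc_array = norm_v_mp * ref_v_mp * series_modules                       # V
    i_dc_array = norm_i_mp * ref_i_mp * poa_global_kwm2 * strings_parallel   # A
    return i_dc_array, v_dc_array, i_dc_array * v_dc_array                   # A, V, W
```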
[Cliff] MLFM might also be considered for pvlib/pvanalytics but would be an entirely new dimension for that project.
[SRCL] I don't have any comments.
[Cliff] As far as where to put MLFM (assuming we agree it belongs in pvlib), I think a new code module in pvlib/ivtools makes most sense.
[SRCL] I think that's a good idea. While I am taking all the non-essential calculation code out, there should still be some graphical functions in a pvlib module; things like the stacked loss charts are too difficult for users to create for themselves each time.
I'm attaching some graphs from my proposed tutorial to help demonstrate some of the capabilities. One graph shows a Gantner study with 6 points from IV curves; the others show 4-point studies from Marion's NREL dataset (without r_sc and r_oc) and an IEC 61853 indoor matrix test, to show that useful information can be obtained without IV curves.
Please feel free to ask for more information, as it can analyse measurements in ways other models don't seem to be able to do and does not require IV curves (although they give more information).
I support adding MLFM provided that someone else is willing to be the primary maintainer. I am happy to help with python stuff but I don't want to commit to learning the nuances of a model right now. No concerns here over a need for PVSystem/ModelChain integration.
Perhaps only certain base functions need to be in pvlib per se, and the graphical analysis could go into one or more example notebooks?
Thanks for the comments. I'm happy to maintain the Python code, however I would need help with GitHub, testing and decisions about the library structure.
I've had several useful comments from Cliff Hansen already and am trying to follow them all where possible.
Here's a zip file with all the relevant mlfm code and 3 reference files; the html files show the output from running each file. mlfm_0_4.zip
Cliff suggested having separate pandas DataFrames, e.g. meas[i_sc .. v_oc], norm[i_sc .. v_oc], stack[i_sc .. v_oc], rather than the way I had done it, adding new columns to the same frame, e.g. data[meas_i_sc .. meas_v_oc, norm_i_sc .. norm_v_oc, stack_i_sc .. stack_v_oc]. I can see the positives of doing this, but I have problems when doing joins, as I then have to rename several i_sc columns to have the correct prefix, e.g. meas_ and norm_.
Thanks Anton for your comment; that's basically what I am trying to do, to keep as little mlfm-specific code as possible in the py file. I can do simple graphics such as figs 6, 7 and 8 by defining them in the ipynb tutorial; however, figs 3 and 5 are far more complex and I wonder if there could be a library for more complicated graphics.
I welcome all comments and suggestions!
I agree that appending more and more columns to a DataFrame may not be the best way. If you do need to do joins, there are rsuffix and lsuffix arguments that will help to avoid column name conflicts.
For overall code structure/organization, perhaps you can split your current py file into two: one with pure model functions (no matplotlib and no printing), and one with the rest. Perhaps that second file can then just be considered part of the example/tutorial.
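For example, a minimal sketch with two hypothetical frames that both carry an i_sc column:

```python
import pandas as pd

dmeas = pd.DataFrame({'i_sc': [8.10, 7.95]})   # measured values (made-up numbers)
dnorm = pd.DataFrame({'i_sc': [0.98, 0.96]})   # normalised values (made-up numbers)

# join on the index; the overlapping column is disambiguated by the suffixes
# instead of renaming columns by hand
joined = dmeas.join(dnorm, lsuffix='_meas', rsuffix='_norm')
print(joined.columns.tolist())   # ['i_sc_meas', 'i_sc_norm']
```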
Thanks Anton. Will try to adopt all your suggestions. I had forgotten about the suffix options; they should work for the joins.
I've rewritten the mlfm code now so that it follows the helpful suggestions made. I separated the py functions into two files.

mlfm_lib_code_0_41.py:

    def mlfm_meas_to_norm(dmeas, ref, qty_mlfm_vars):
        ...
        return dnorm

    def mlfm_6(dmeas, c_1, c_2, c_3, c_4, c_5, c_6):
        ...
        return mlfm_6

    def mlfm_norm_to_stack(dmeas, dnorm, ref, qty_mlfm_vars):
        ...
        return dstack

    def mlfm_fit(dmeas, dnorm, mlfm_sel):
        ...
        return dnorm, cc, ee, coeffs, errs

mlfm_lib_graphs_0_41.py:

    def plot_mlfm_scatter(dmeas, dnorm, mlfm_file_name, qty_mlfm_vars):
        ...

    def plot_mlfm_stack(dmeas, dnorm, dstack, ref, mlfm_file_name, qty_mlfm_vars,
                        xaxis_labels=12, is_i_sc_self_ref=False,
                        is_v_oc_temp_module_corr=True):
        ...

There are now three different dataframes: dmeas, dnorm and dstack.
There are three different measurement files and one for testing; as I've taken out the print statements, the file is selected by setting meas_file:

    # select one of the following meas files
    meas_file = 0

    if meas_file == -1:
        mlfm_meas_file = 't1_041.csv'
    elif meas_file == 0:
        mlfm_meas_file = 'g78_T16_Xall_F10m_R900_041.csv'
    elif meas_file == 1:
        mlfm_meas_file = 'n05667_Y13_R1k6_fClear_041.csv'
    elif meas_file == 2:
        mlfm_meas_file = 'x19074001_iec61853_041.csv'
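After the meas file is selected, a rough usage sketch of the whole workflow looks like this, assuming the signatures above (the ref, qty_mlfm_vars and mlfm_sel values are placeholders and the details may still change):

```python
import pandas as pd

from mlfm_lib_code_0_41 import mlfm_meas_to_norm, mlfm_norm_to_stack, mlfm_fit
from mlfm_lib_graphs_0_41 import plot_mlfm_scatter, plot_mlfm_stack

dmeas = pd.read_csv('g78_T16_Xall_F10m_R900_041.csv')  # or the mlfm_meas_file selected above
ref = ...                             # placeholder: reference module STC values
qty_mlfm_vars = 6                     # placeholder: number of loss factors available
mlfm_sel = ...                        # placeholder: which normalised value to fit

dnorm = mlfm_meas_to_norm(dmeas, ref, qty_mlfm_vars)            # measured -> normalised
dstack = mlfm_norm_to_stack(dmeas, dnorm, ref, qty_mlfm_vars)   # normalised -> stacked losses
dnorm, cc, ee, coeffs, errs = mlfm_fit(dmeas, dnorm, mlfm_sel)  # fit MPM coefficients

plot_mlfm_scatter(dmeas, dnorm, 'g78_T16_Xall_F10m_R900_041.csv', qty_mlfm_vars)
plot_mlfm_stack(dmeas, dnorm, dstack, ref, 'g78_T16_Xall_F10m_R900_041.csv', qty_mlfm_vars)
```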
This is the new version, which replaces everything in version 0_4.
I've included an html file showing how it looks running the Gantner data.
Again, any thoughts or comments welcome. Steve
Time flies! I think this is better for sure, but I think we need some more guidance from the maintainers here. Is it close enough to have a PR?
> Is it close enough to have a PR?
I think so. Python modules should be put into a new folder, pvlib.mlfm.
What's the next stage to get this going? If I move the python modules to pvlib.mlfm can someone do a PR? I had problems last time I tried and would appreciate some help.
@steve-ransome the next step is a PR so all maintainers can review. What problems did you have? Git has a learning curve, for sure.
Thanks Cliff. I reported problems on 18 Aug 2020 further up this thread; I couldn't seem to understand what I was doing regarding GitHub, TortoiseGit and the PR, despite Mark's (@mikofski) instructions.
@steve-ransome my concern about creating the PR for you is that someone will still need to use git to submit changes during the review. My workload is such that I don't think I can edit the PR to completion.
Thanks Cliff. I will contact you as requested.
I'll be adding the Loss Factors Models and Mechanistic Performance Models to PVLIB.
See http://www.steveransome.com/PUBS/1906_PVSC46_Chicago_Ransome.pdf for more details on what they do.
I hope it will be ready by the conference in Salt Lake City in May.
I've uploaded a couple of graphs and some sample code. Initially I'm looking for feedback and comments, as I will need to extend the present naming convention.
With the LFM I have been using prefix and suffix letters to describe the type of measurement and any corrections. For example, this Python 3 code shows the naming I have been using and computes normalised LFM values with temperature correction. I have just shown nIsc and nVoc for clarity, but will upload code for all variables soon, plus graphs and MPM fits.
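To make the prefix/suffix naming concrete, here is a small illustrative sketch (it is not the attached file; the coefficient values and the linear correction form are only assumptions for the example):

```python
def norm_and_temp_correct(mIsc, mVoc, rIsc, rVoc, Gi_kwm2, Tmod_c,
                          alpha_isc=0.0005, beta_voc=-0.003):
    """Illustrate the naming: m = measured, r = reference, n = normalised,
    _U = uncorrected, _T = corrected to 25 C (assumed linear coefficients).
    """
    nIsc_U = mIsc / (rIsc * Gi_kwm2)   # current normalised by reference and irradiance
    nVoc_U = mVoc / rVoc               # voltage normalised by reference only
    nIsc_T = nIsc_U / (1 + alpha_isc * (Tmod_c - 25.0))
    nVoc_T = nVoc_U / (1 + beta_voc * (Tmod_c - 25.0))
    return nIsc_U, nIsc_T, nVoc_U, nVoc_T
```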
Once I have this in a correct form I can add more features.
MLFM_SRCL_200302.py.txt
Any feedback is welcome. Steve