ZhichaoYg opened this issue 9 months ago
Dear ZhichaoYg,
It seems the MPI address in your MPI_shared.dat file does not match the one expected by your MPI client (the .dll controller file in ServoDyn). Is the file updated when you run SC_Matlab.m?
To help you (and others) gain a bit more understanding of how this works, and hopefully find a way to debug:
1) SC_Matlab.m calls the function MPIServer_Init (source code in MPIServerSubs.f90, compiled into MPIServerSubs.dll) via a mex function of the same name (source code in MPIServer_Init.cpp, compiled into MPIServer_Init.mex64). This initialises the MPI server and broadcasts its address in MPI_shared.dat. The status of the connection is printed in SC_MPIServer/stdout_MPIServerSubs.txt.
2) Once the server is active ("Ready to connect" is displayed), FAST.Farm is started. Each instance of OpenFAST (one per turbine) loads its MPI client controller (SC_Client_64.dll or variants from https://github.com/ValentinChb/SC_MPIClient). The connection is established by turbine no. 1 through the function MPIClient_init in SCClientSubs.f90, which reads the server address from MPI_shared.dat. It then complements MPI_shared.dat with other fields to be read by the other instances of OpenFAST and by the server. The status of the connection is printed in <.fstf folder>/OpenFAST/T1/ControlData/stdout_SCClientSubs.txt.
So, based on the info you provided, the server does start and writes its address, and the client does start, finds the file, and reads the server address, but fails to connect to it. It may be that the client is run before the server has finished initialising, and therefore reads an old address.
To answer your question about the path of MPI_shared.dat: on the client side, it is specified in the third line of <.fstf folder>/OpenFAST/SC_input.dat; on the server side, it is specified in SC_Matlab.m. In your case they seem to match, as both client and server locate the file. Or, perhaps, the client reads a different MPI_shared.dat than the one the server creates, and that is why the connection addresses don't match?
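As a sanity check, the stale-address failure mode can be mocked in a few lines of Python. The file name reuse is from the discussion above, but the address strings and single-line file layout here are purely illustrative assumptions, not the actual format written by MPIServer_Init:

```python
# Mock of the MPI_shared.dat handshake: the server publishes its address
# via a shared file; the client reads it and "connects" only if that
# address is one the server is currently serving. A leftover file from a
# previous run holds a stale address and the connection fails.
import os
import tempfile

def server_publish(path, port_name):
    """Server side: broadcast the (mock) MPI port address via the shared file."""
    with open(path, "w") as f:
        f.write(port_name + "\n")

def client_connect(path, live_ports):
    """Client side: read the advertised address and try to connect.
    Returns the port name on success, None if the address is stale."""
    with open(path) as f:
        port_name = f.readline().strip()
    return port_name if port_name in live_ports else None

path = os.path.join(tempfile.mkdtemp(), "MPI_shared.dat")
server_publish(path, "tag#0$port#old")      # leftover from a previous run
live = {"tag#0$port#new"}                   # address actually opened this run
assert client_connect(path, live) is None   # stale address -> connection fails
server_publish(path, "tag#0$port#new")      # file refreshed after server init
assert client_connect(path, live) == "tag#0$port#new"
```

This is why checking the timestamp of MPI_shared.dat against the time the server printed "Ready to connect" is a quick way to spot the problem.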
Hope this helps.
BR, ValentinChb
Dear ValentinChb,
Thanks for your reply. Under your guidance, I have established the link between MATLAB and FAST.Farm. But I see that you are using the DTU 10MW turbine, and I need to use the DTU 5MW turbine. How should I modify the controller (DISCON.dll) of the individual turbines to establish the link between MATLAB and FAST.Farm? I appreciate your assistance and am looking forward to any further guidance you can provide.
Best Regards,
ZhichaoYg
Dear ZhichaoYg,
I am pleased to learn that you managed to run the co-simulation setup! If you want to change the turbine, you may find/adapt input files for the DTUWEC controller. Do you mean the NREL 5MW? If you prefer, you may link SCClient to the ROSCO controller as explained in the SC_MPIClient repository, and use input files for the NREL 5MW. This may however require some fiddling, as I have only used DTUWEC lately and did not thoroughly verify the link to ROSCO.
May I ask you to describe what you did to make this work, and to consider helping solve issue #1?
Best Regards,
Valentin
Dear ValentinChb,
Glad to receive your reply! I read the issue https://github.com/ValentinChb/FASTFarm2Simulink/issues/1 carefully and I guess we have the same problem. When trying to switch to the NREL 5MW turbines, should the content of BLADED INTERFACE be changed to point to the .dll and input file of the corresponding versions of the ROSCO controller and SCClient? Could you explain how you implemented the link between SCClient and the DTUWEC controller? I read in your paper that the controller was modified to ease linking with MPI. Does that modify the source files of the controller? I appreciate your assistance and look forward to your more detailed guidance! The content of "BLADED INTERFACE" is as follows:
Best Regards,
ZhichaoYg
Dear ZhichaoYg,
Can you describe how you solved the problem by adding a comment in https://github.com/ValentinChb/FASTFarm2Simulink/issues/1?
Regarding linking to controllers, see the instructions in https://github.com/ValentinChb/SC_MPIClient. You do not actually need to modify the ROSCO source code, just recompile it so that it produces the static library libROSCO.a, which is used as a dependency when building the SCClient dll. I recommend you understand (and modify if needed) the way SCClient links to ROSCO by reading the source code.
Then, the BLADED INTERFACE section of the ServoDyn input file should set DLL_FileName to Control/SCClient_ROSCO_x64.dll and DLL_InFile to Control/DISCON.IN (or whatever name your input file has).
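For reference, that section of the ServoDyn input file would look roughly like this (the paths are the examples from above; check the exact parameter names and line layout against your ServoDyn version):

```
---------------------- BLADED INTERFACE ---------------------------------------
"Control/SCClient_ROSCO_x64.dll"  DLL_FileName - Name/location of the dynamic library in the Bladed-DLL format
"Control/DISCON.IN"               DLL_InFile   - Name of input file sent to the DLL
"DISCON"                          DLL_ProcName - Name of procedure in DLL to be called
```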
Best Regards, Valentin
Dear ValentinChb,
Oh, sorry! I haven't solved that problem yet. I was just guessing that the controller linked to SCClient should match the version of its input file. I am now trying to establish the link between SCClient and ROSCO with the corresponding version of the input file.
Best Regards,
ZhichaoYg
Dear ValentinChb,
Sincerely thank you for your guidance! I'm going to address this as soon as possible. May I ask which version of the ROSCO controller you used when linking SCClient to ROSCO? Thank you again for your recent guidance; I look forward to hearing from you!
Best Regards,
ZhichaoYg
Dear ZhichaoYg,
The version should not be important. SCClient calls the DISCON procedure in ROSCO under the hood, so as long as the procedure name and the avrSWAP array are consistent, you are all set. Most of the avrSWAP array follows a convention defined in the Bladed user manual and should not change, but it is possible that the additional components appended to the array have changed in newer versions of ROSCO, which you can easily find out by reading the source code.
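To make the avrSWAP convention concrete, here is a short Python sketch. The record numbers below are the commonly used Bladed ones and the values are made up for illustration; verify both the indices and any appended records against your ROSCO version:

```python
# Sketch of the Bladed-style avrSWAP exchange between the calling code
# (ServoDyn/SCClient) and the DISCON procedure. Records are 1-based in
# the Bladed convention, so a small wrapper hides the 0-based offset.

class AvrSwap:
    """1-based view of the avrSWAP array, as Fortran/Bladed code sees it."""
    def __init__(self, size=165):
        self.data = [0.0] * size
    def __getitem__(self, rec):        # rec = 1-based Bladed record number
        return self.data[rec - 1]
    def __setitem__(self, rec, val):
        self.data[rec - 1] = val

swap = AvrSwap()
# Inputs written by the calling code before each DISCON call:
swap[1] = 0.0       # status flag: 0 = first call, 1 = subsequent, -1 = final
swap[2] = 0.0       # current simulation time [s]
swap[3] = 0.0125    # communication interval [s]
swap[20] = 122.9    # measured generator speed [rad/s]
swap[27] = 11.4     # hub-height wind speed [m/s]
# Outputs the controller writes back into the same array:
swap[45] = 0.0      # demanded collective pitch angle [rad]
swap[47] = 43093.5  # demanded generator torque [Nm]
# Version drift, if any, lives in extra records appended beyond the
# documented Bladed range (e.g. super-controller channels).
assert swap[47] == swap.data[46]
```

As long as both sides agree on this layout, the DISCON call itself is version-agnostic.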
BR
Valentin
Dear ValentinChb,
Thank you very much for your guidance. Next, I will dig deeper into the connection between them and find the key to the problem. Thanks again for your help during this period!
Best Regards,
ZhichaoYg
Dear ZhichaoYg,
I encountered a similar problem when trying to replace the DTU 10MW wind turbine with the NREL 5MW wind turbine. Have you resolved your issue?
Best Regards,
dxsaigao
Dear ValentinChb,
I'm getting an MPI connection error when trying to run. I guess it is related to the MPI_shared.dat file? Compared to your initial MPI_shared.dat, mine is missing the following parameters. Other than that, I did not find where the MPI_shared.dat file is generated. How can I generate an MPI_shared.dat file with these parameters? Or is the MPI connection error something else? I would appreciate it if you could provide me with some guidance. Thank you in advance. Here is the content of my MPI_shared.dat file.
Best Regards,
ZhichaoYg