Closed by maettu-this 7 years ago
Additional background and questions:
My hosting application always builds assemblies into the same folder where the script is located, not the CSScript temp folders, because:
How does CSScript clean up the temp folders? Can it be ensured that the temp folder doesn't exceed a certain size and number of files? Can the cleaning strategy be configured somehow? Can the retention time be configured somehow? I have seen the "DoCleanupAfterNumberOfRuns" option, but that doesn't allow e.g. configuring a retention time (e.g. 5 days).
Would it be possible to execute an assembly fully in memory (RAM), instead of compiling it into a file? That would totally eliminate the need to handle potential file conflicts, though probably at the cost of no longer being able to efficiently cache the assemblies... Do you recommend using CSScript.CompileWithConfig() and specifying InMemoryAssembly = true? After all, its comment talks about "undesired side effects"...
Another comment related to issue 1 "cache probing algorithm fails to detect my hosting application's assemblies":
CSScript.CompileFile() takes an optional refAssemblies parameter. However, it does not make much sense to manually retrieve the file paths of the assemblies currently loaded in the hosting application's AppDomain. I'd rather see CSScript.CompileFile() retrieve these assemblies from AppDomain.CurrentDomain.GetAssemblies().

Hi there,
So many questions. Let's do them one-by-one :)
XML doc has been updated now. Txs.
The cache validating algorithm does verify (MetaDataItems.IsOutOfDate) the time stamps of all dependencies including the host application (except GAC assemblies):
internal static bool ScriptAsmOutOfDateAdvanced(string scriptFileName, string assemblyFileName)
{
    if (assemblyFileName == "" || assemblyFileName == null)
        return true;

    if (File.GetLastWriteTimeUtc(scriptFileName) != File.GetLastWriteTimeUtc(assemblyFileName))
        return true;

    return MetaDataItems.IsOutOfDate(scriptFileName, assemblyFileName);
}
The nature of the exception text is not clear to me. It's not in English, but I assume it is something about the asm file not being found. So it's not directly about the caching. I have just tested your scenario and I see that the timestamp of the host app triggers the recompilation (as it should). If you are implementing your own caching algorithm then you also need to account for this and trigger the recompilation.
I cannot reproduce this one. CSScript.LastCompilingResult is mutex protected and is never set to null; it can only be null at startup. Yes, it may potentially contain not the very latest result (e.g. on Linux where Mutex is compromised), but not null. I am afraid I cannot do much about this one until I have a simple "Hello World" style test sample.
csscript.MetaDataItems.StampFile() is protected by the same system-wide mutex, thus it's the same as for Issue 2 (CacheEnabled).
CS-Script schedules cleanup on the AppDomain.CurrentDomain.ProcessExit event. However, if it is not a CS-Script temp directory then CS-Script doesn't know about it. I suggest you do the same: subscribe to this event and do your cleaning.
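Such host-side cleanup could be sketched as follows. This is only an illustration, not part of CS-Script: the folder path, the helper name ScriptCacheJanitor, and the 5-day retention are assumptions for the hosting application described in this thread.

```csharp
using System;
using System.IO;

static class ScriptCacheJanitor
{
    // Register a ProcessExit handler that deletes compiled script assemblies
    // older than the given retention period from the given output folder.
    public static void Register(string scriptOutputDir, TimeSpan retention)
    {
        AppDomain.CurrentDomain.ProcessExit += (sender, e) =>
        {
            var cutoff = DateTime.UtcNow - retention;
            foreach (var file in Directory.GetFiles(scriptOutputDir, "*.dll"))
            {
                try
                {
                    if (File.GetLastWriteTimeUtc(file) < cutoff)
                        File.Delete(file);
                }
                catch (IOException)
                {
                    // the file may still be in use by a parallel instance; skip it
                }
            }
        };
    }
}
```

Usage at host startup could then be e.g. `ScriptCacheJanitor.Register(@"C:\Scripts", TimeSpan.FromDays(5));`, which would give the retention-time behavior asked about above.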
DoCleanupAfterNumberOfRuns is used for stand-alone script execution with cscs.exe.
Yes, it would, but only partially. You can pass InMemoryAssembly and this will ensure that the assembly file is not locked, though it will still be created. The side effect is minimal: the Assembly instance at runtime will have an empty Location property, which can even throw when you try to access it. But if you don't access it you are fine.
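The side effect can be seen directly with plain .NET, since an in-memory load essentially boils down to Assembly.Load(byte[]). A small sketch (the file path is an example, not from this thread):

```csharp
using System;
using System.IO;
using System.Reflection;

string path = @"C:\Scripts\MyScript.dll"; // example path

// Loading from a path: Location points at the file, and the file gets locked.
Assembly fromFile = Assembly.LoadFrom(path);
Console.WriteLine(fromFile.Location); // prints the full file path

// Loading from raw bytes: the file is not locked afterwards,
// but Location is empty - code that relies on it will misbehave.
Assembly fromBytes = Assembly.Load(File.ReadAllBytes(path));
Console.WriteLine(fromBytes.Location.Length == 0); // empty Location
```

This is the "undesired side effect" the XML doc hints at: anything in the script (or host) that inspects Assembly.Location breaks under the in-memory model.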
The caching algorithm actually takes all the assemblies' locations and embeds them as metadata into the assembly file. These assemblies are not just the ones the user specified, but the assemblies referenced by the C# compiler during the compilation:
[Screenshot: TestPad.exe is the host application]
I have further analyzed "Issue 1 (CacheEnabled)"; here are my findings:
When caching is disabled, an assembly is stamped with the current time (today at 14:45):
When caching is enabled, an assembly is stamped with the time when the script was saved (last month at 20:31):
When caching is enabled, the assembly indeed refers to the very latest version of the hosting application assemblies (1.10.6285.26702):
But, when calling CSScriptLibrary.AsmHelper.Invoke(), there is the mentioned exception up the calling chain (for details see the stack trace in the initial post) and .NET complains: System.IO.FileNotFoundException: The file or assembly "MT.Albatros.Scripting, Version=1.10.6285.25922, Culture=neutral, PublicKeyToken=37469a15c42ef9cb" or one of its dependencies could not be found. The specified module could not be found.
Note that the exception refers to an outdated version 25922, whereas the assembly refers to 26702, so building indeed works fine. The problem happens when the AsmHelper tries to load and invoke a method on the assembly. For some reason, the outdated version 25922 is being requested.
When running the whole thing without caching, all works fine.
Two thoughts on Issues 2 and 3 (both CacheEnabled) that might give a hint on the root cause (assuming they are related):
I do not understand why csscript needs to MetaDataItems.StampFile() when caching is enabled. After all, the caching mechanism should find out that the assembly .dll is up to date and that nothing needs to be compiled or stamped.
And, could this also be the issue with LastCompilingResult? The first instance of the hosting application will compile the assembly. But what will LastCompilingResult of subsequent instances refer to? What will LastCompilingResult contain if caching finds out that nothing needs to be compiled at all? And what will happen if there is a task switch from instance 'A' to instance 'B', i.e. instance 'A' starts compilation of the assembly but doesn't have the result ready yet, then instance 'B' becomes active, attempts to build the assembly and may even succeed, then instance 'A' becomes active again and already finds the assembly? Does the system-wide mutex cover this case?
A bit tricky to reproduce this in a "Hello World" style test sample... I'll give it a try...
I do not understand why csscript needs to MetaDataItems.StampFile() when caching is enabled.
I had another look at it and (as before) I don't see this happening. I used the code below:
static void Issue50()
{
    CSScript.CacheEnabled = true;

    var file = "test_script.cs";

    Action call = () =>
    {
        var asm = CSScript.CompileFile(file, Assembly.GetExecutingAssembly().Location);
        var scr = new AsmHelper(Assembly.LoadFrom(asm));
        scr.Invoke("Script.Main", new object[0]);
    };

    call();
    call();
}
I deliberately introduce the dependency on the host assembly (Assembly.GetExecutingAssembly). This leads to MetaDataItems.StampFile() being invoked while there are no changes in the scripts but there are in the host application (e.g. it's recompiled). Thus a simple adjustment in the host code will always trigger script re-stamping. This is correct and expected behavior, though not immediately obvious. But if you stop recompiling the host, MetaDataItems.StampFile() also stops being triggered (provided the scripts are not changed).
Thus indeed, without a corresponding test case, reproducing this problem is practically impossible.
What will LastCompilingResult contain if caching finds out that nothing needs to be compiled at all?
This is in fact a simple one. It will contain the very last compilation result: the result of the actual compilation of the code, not the result of your CSScript.Load call.
Imagine this: you start the host and call Compile/Load; the engine detects that no compilation needs to be done, so it does not update LastCompilingResult. Thus LastCompilingResult will be null. It makes sense: no actual compilation was triggered, so the result is empty.
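Given these semantics, a defensive pattern on the host side could look like the sketch below: treat null as "the cache was hit, nothing was compiled this time" rather than as an error.

```csharp
// After a CSScript.CompileFile()/Load() call:
var result = CSScript.LastCompilingResult;
if (result != null)
{
    // An actual compilation happened in this process;
    // its inputs/outputs can be inspected here.
}
else
{
    // Cache hit (or first access at startup):
    // no compilation result is available, which is expected behavior.
}
```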
And what will happen if ... instance 'A' starts compilation of the assembly but doesn't have the result ready yet, then instance 'B' becomes active and attempts to build the assembly...
This is OK, as the compilation is synchronized with a system-wide Mutex based on the script name. But as I mentioned, this can be a problem on Linux where Mutex is compromised.
Nevertheless, your research findings have pushed me to check a few extra things, and I found that LastCompilingResult indeed has a minor problem, but not the one you thought: LastCompilingResult is not assigned if the compilation triggers an error.
The intended use-case was that the user tries to compile the script and immediately learns about the success/failure via a compiler error exception. The compiler history and LastCompilingResult were features that compromise the state-less execution model, which should trigger no side effects. I let compiler history and LastCompilingResult in, and now I am not sure their introduction was warranted. While they seem to be useful, there is no real need for them.
Anyway, I have updated the compiling routine so LastCompilingResult is set even when compiling the script fails.
This issue almost drives me crazy... Though I may not have found the root cause yet, I have found more hints:
MyScript.cs, last modified on 2017-03-08 at 12:03, results in the mentioned System.IO.FileNotFoundException because it searches for Version=1.10.6276.25669 of my hosting application's assemblies. Now note:
So, somewhere in the CSScript or .NET or Visual Studio cache there is a lookup for the first build of my hosting application AFTER the script was saved the last time. I have then tried to delete the cache in "C:\Users\
However, after deleting all cached data, I again ran into the System.NullReferenceException when trying to access CSScript.LastCompilingResult.Input.ReferencedAssemblies. I know it's not nice to delete CSScript internal files. Still, I'd prefer that CSScript.LastCompilingResult either always has values assigned, or that its documentation clearly states that the user has to check for null before trying to access LastCompilingResult or one of its properties. I have now improved my code; it now checks each property of LastCompilingResult before trying to access it.
It took me the whole afternoon to finally find this:
"C:\Users\
And guess what's in there: MyScript.DLL created on 2017-03-08 at 14:15 !!!
After deleting that file, there is no more System.IO.FileNotFoundException referring to Version=1.10.6276.25669 of my hosting application's assemblies. But as soon as I recompile my hosting application, things go awry again...
According to "What is cache AppData\Local\assembly\dl3?", this issue is "caused by the ShadowCopyFiles property set to true". Now, can I control how CSScript creates the AppDomain, i.e. set the ShadowCopyFiles property to false?
By the way, I agree that CompilingHistory and LastCompilingResult compromise the state-less execution model. Still, I see that I need the information provided by LastCompilingResult. Would it be an option to provide additional CompileFile() and CompileWithOptions() methods that have an additional result out parameter? If that parameter is provided, i.e. not null, the methods could clone the result info object and the client would have all the needed information in a single call.
Shadow copying is a feature of the .NET Framework that allows assemblies used in an app domain to be updated without unloading the app domain.
Thus changing the default value to "false" may have unexpected side effects. I can provide a config-based override for this property, but I am not sure it is going to help you. I suggest you test the change in your code branch, and if it indeed does help, I will find a way of exposing this value to the user.
I think you have better chances of solving this by clearing the cache on the application startup. The cache location is deterministic so you can always delete the specific cache file:
string cache_to_delete = CSScript.GetCachedScriptPath(scriptFile);
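A hedged sketch of the startup-time cleanup suggested above, wrapped in a try/catch because a parallel instance may still hold the cached file open (as seen in Issue 3):

```csharp
using System.IO;

// Delete the cached assembly for one specific script at host startup,
// so the next CompileFile() is forced to produce a fresh one.
string cache_to_delete = CSScript.GetCachedScriptPath(scriptFile);
try
{
    if (File.Exists(cache_to_delete))
        File.Delete(cache_to_delete);
}
catch (IOException)
{
    // The file is in use by another instance; leave it and let
    // the normal cache probing decide whether to recompile.
}
```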
I'd prefer that CSScript.LastCompilingResult either always has values assigned, or that its documentation clearly states that the user has to check for null before trying to access LastCompilingResult or one of its properties.
I don't think this makes sense. The initial value of LastCompilingResult is null, and it has very intuitive semantics: "the property is not initialized". Thus null is a very valid value. And warning developers in product-specific documentation about the necessity of null checking or other common practices is highly unorthodox.
Would it be an option to provide additional CompileFile() and CompileWithOptions() methods that have an additional result out parameter?
Yes, it would be a compromise. However, I am reluctant to do it without a serious reason/demand; for example, if it had been shown that Mutex doesn't work. But I've seen no evidence of that. Let's see if the latest fix for LastCompilingResult addresses the issue, and if it doesn't, then we may consider the API changes.
BTW I am going to have a new release in a day or two so you will be able to test LastCompilingResult very soon.
I agree, developers have to check for null; that's common knowledge. Note though, many methods of the .NET Framework explicitly state what happens in such corner cases, e.g. int.TryParse() "...or zero if the conversion failed..." or Directory.GetParent() "...or null if path is...". Maybe such a statement would also be helpful for LastCompilingResult. In any case, I consider "Issue 2 (CacheEnabled / LastCompilingResult becomes null)" as fixed for now.
CSScript.GetCachedScriptPath() doesn't solve the issue. This method e.g. returns "C:\Users\..." [text truncated], and I still get the System.IO.FileNotFoundException ... Version=... exception. I have also tried CSScript.CacheEnabled = false, but that isn't beneficial either. The CSScript.CacheEnabled setting seems to have some kind of side effect on .NET assembly resolution. With CSScript.CacheEnabled = false, the .NET cached file in "C:\Users\..." [text truncated]; with CSScript.CacheEnabled = true, .NET unfortunately considers the dependencies stated in that out-of-date assembly instead of the newly built one.
Unfortunately, it's not that easy for me to change to...
var setup = new AppDomainSetup();
setup.ShadowCopyFiles = "false";
AppDomain.CreateDomain("ScriptRunner", null, setup);
...because the AsmHelper initializes the AppDomain.
In an attempt to make these changes within the CSScript source code itself, I have upgraded my solution to VS2015 and modified it such that it uses the CSScriptLibrary source code instead of the .dll file. It builds and my hosting application starts, however it fails as soon as I try CSScript.CompileFile() > CSExecutor > Microsoft.CSharp.CSharpCodeGenerator.System.CodeDom.Compiler.ICodeCompiler.CompileAssemblyFromFileBatch(), because that fails to locate the correct csc.exe... So I am stuck again...
Still, I was able to step into CSScript.CompileFile() to a certain degree, trying to find the root cause of this issue. At first it all looked OK (Mutex, IsOutOfDateAlgorithm, MetaDataItems.IsOutOfDate, ...), but at a second look I think I found the root cause:
GetCompilerLockName() uses a named mutex that is named with the current process ID. Two instances of the same application will result in two different process IDs, resulting in two separate named mutexes!
In my hosting application I use a very similar approach to prevent concurrent access to settings files. I use a combination of application and settings name for the mutex: this.mutex = new Mutex(true, Application.ProductName + "." + this.name, out createdNew). In the case of CSScript, something like "CSScript.\<ScriptFilePath>" could be an adequate name for the mutex.
In anticipation of your findings I made the change in the latest (today) release. It's not an official feature but rather something for testing/troubleshooting. This is how currently the AppDomains are initialized under the hood:
setup.ShadowCopyFiles = Environment.GetEnvironmentVariable("AppDomainSetup.ShadowCopyFiles") ?? "true";
This gives you the ability to modify ShadowCopyFiles from the host application. Though, as you already found, this is still not a solution.
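With the quoted line in place, the host would opt out of shadow copying by setting the environment variable before any script AppDomain is created, e.g.:

```csharp
using System;

// Must run before CS-Script creates any remote AppDomain;
// the variable name comes from the line quoted above.
Environment.SetEnvironmentVariable("AppDomainSetup.ShadowCopyFiles", "false");
```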
Thus ultimately it gets you into a situation where the generic CS-Script caching technique is not compatible with your specific runtime scenario. Thus you have these options:
Implement your own caching technique. Not an attractive approach but... probably the most flexible one. You can benefit from the bunch of AppDomain extensions from CS-Script for cloning, Action
execution and unloading:
var domain = AppDomain.CurrentDomain.Clone();
domain.Execute(Action);
domain.Unload();
Do everything in the same AppDomain. It looks like the problems always occur when you use a remote domain. Review your strategy and maybe you can get by with the single-domain model. For me it is always the default choice; I use remote domains only when I have to.
Try to use CSScript.Evaluator with any of the supported engines. Yes, the evaluator doesn't give you caching when configured to use Mono or Roslyn, but then you may not need caching, as the compiling itself is significantly faster. The drawback: no debugging and no domain unloading. Strictly speaking, unloading is still possible even with the evaluator, but it may not be so flexible. Have a look at the tests project:
public void CreateDelegateRemotelyMono()
{
    var sum = CSScript.MonoEvaluator
                      .CreateDelegateRemotely(@"int Sum(int a, int b)
                                                {
                                                    return a+b;
                                                }");

    Assert.Equal(18, (int)sum(15, 3));

    sum.UnloadOwnerDomain();
}
This is something that I was really waiting for. It seems like a possible concurrency control flaw. Will have a look at it.
Yes, Issue 3 is a defect. I have created a new report for it: https://github.com/oleg-shilo/cs-script/issues/56
It has been fixed now and you can test it from the code base. I am not sure about the release day, but it will not be long.
Thanks for the update. I'm anyway busy until next Monday/Tuesday, so I'll wait for the new release and then verify the fix of issue 3 as well as look at issue 1 again. I'll let you know about the result.
I confirm that running the same script in parallel instances of my hosting application now works perfectly. So, issue 3 has been resolved as well :-)
I will now have another look at issue 1. Given your hints and recommendations, and maybe some others who ran into the same issue with other applications, I hope to find a solution to this as well. I'll let you know about my findings.
OK. I will keep the issue open for a while just in case if anyone wants to contribute to the discussion.
I have again tried to step into the CSScript source code and analyze the way caching works. So far I haven't found anything that could potentially cause issue 1. However, as soon as I get to CSScript.CompileFile() > CSExecutor > Microsoft.CSharp.CSharpCodeGenerator.System.CodeDom.Compiler.ICodeCompiler.CompileAssemblyFromFileBatch(), I get an InvalidOperationException because it fails to locate the correct csc.exe. Can you give me a hint on the preconditions to locate the correct compiler?
I have also quickly upgraded my solution to .NET Runtime 4.0 / .NET Framework 4.6.1 (incl. the CSScriptLibrary to 4.5) in order to see whether that newer version of the runtime fixes the caching issue. (I found a note on the web that .NET 4.0 has improved its shadow copy caching algorithm.)
Next I tried to name the assembly MyScript.something instead of .dll, and it works! So it seems to simply be caused by the file extension?!? I can hardly believe this... I also noted that .exe doesn't work either, but .bin/.com/.ocx/.vbe work fine... Do you have any idea where this could come from? Do you know of any .NET policy that is based on file extensions?
Another thing that has crossed my mind:
Could you provide me a distribution where the time stamp of the assembly no longer matches the time stamp of the script file? Then I could check whether .NET simply dislikes assemblies that pretend to be older than the file actually is. Just a guess; no clue whether this has any chance of circumventing the issue.
Edit:
Based on the most recent findings, I assume that the time stamp isn't the culprit. So we should only take the idea above into account in case the asmName.EndsWith(".exe" || ".dll") finding turns out to be a false alarm.
Perseverance helps to tackle such tricky issues ;-)
For a while I thought that the issue is solely caused by the loading/invocation of the assembly. But that's actually not the case. It rather depends on whether I compile the script with CSScript.CacheEnabled = true or CSScript.CacheEnabled = false. In the case of false, all works fine. In the case of true, it only works if the extension is neither .exe nor .dll.
But then, the exception actually occurs when loading/invoking the assembly. It is caused by the following call:
asmBrowser = new AsmBrowser(Assembly.LoadFrom(AsmFile));
Apparently, Assembly.LoadFrom() retrieves an outdated cached assembly; it somehow doesn't recognize that the .dll in the script folder is newer than the cached assembly.
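One way to check whether the LoadFrom binding context (and its shadow copy) is really the culprit would be a diagnostic variation of the call quoted above: load the raw bytes so the CLR cannot substitute a previously bound or shadow-copied file. A sketch, not a recommendation for production use:

```csharp
using System.IO;
using System.Reflection;

// Diagnostic variant of: new AsmBrowser(Assembly.LoadFrom(AsmFile));
// Assembly.Load(byte[]) bypasses the LoadFrom context entirely,
// so the bytes on disk are guaranteed to be what gets executed.
byte[] raw = File.ReadAllBytes(AsmFile);
Assembly asm = Assembly.Load(raw);

// Note: asm.Location will be empty here, and remote-domain marshalling
// may behave differently, so this only serves to isolate the cause.
asmBrowser = new AsmBrowser(asm);
```

If the outdated version 25669/25922 no longer shows up with this variant, the stale copy is coming from the CLR's binding/shadow cache rather than from the CS-Script compilation.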
Digging a bit deeper I came across the following line of code:
if (asmName.EndsWith(".dll", StringComparison.CurrentCultureIgnoreCase) || asmName.EndsWith(".exe", StringComparison.CurrentCultureIgnoreCase))
These are exactly the two extensions in question! Why are only those removed, but not ".compiled"? Or is removing the extension actually the root cause that lets .NET retrieve the outdated assembly from the cache? Unfortunately, I am not able to complete the analysis of the source code, since I run into the InvalidOperationException when trying to invoke csc.exe. So I leave it for the moment and ask you to read through my bunch of findings and assess them. Maybe I am running into a cul-de-sac, maybe we are revealing a true bug...
I only found the code you are referring to in the Utils.RemoveAssemblyExtension implementation. This method is used to extract the possible namespace part from the assembly file name for a further lookup of this namespace in the GAC. That is why the method only considers exe and dll, as only these two extensions can be in the GAC. Thus I think there is no problem there.
Apparently, Assembly.LoadFrom() retrieves an outdated cached assembly,
Good. This is an important bit. LoadFrom is fed the correct absolute path to the assembly, and if it decides to load something else (e.g. an outdated assembly), then there is nothing we can do about it.
I had this feeling that the CLR may base its decision to look in the assembly cache (Asm Shadow folders) instead of the specified path on the file extension. Maybe... but who knows. You can try to push it by forcing your assemblies to have non-traditional file extensions (e.g. renaming the asm before loading).
It does seem like caching is causing you quite some pain. Why don't you do your own caching that better reflects your business logic? The default CS-Script caching is for use-cases which are apparently not fully consistent with yours.
Well, I don't fully agree. I don't think the root cause is the CS-Script caching. It's rather the CLR shadow copying. But that gets its information from the compilation, so I think that the root cause lies there.
Can you give me a hint on the preconditions to locate the correct compiler (env var, path, ...)? Then I will be able to fully step into the CSScript source code (i.e. compilation and invocation in the same run) and further analyze the issue. I'm sure we will also solve this one :-)
Well, I don't fully agree. I don't think the root cause is the CS-Script caching. It's rather the CLR shadow copying.
We are actually on the same page. I just speculated that by implementing your own caching you may have more flexibility in naming the compiled assemblies, and in this way you may indirectly affect shadow copying.
Can you give me a hint on the preconditions to locate the correct compiler.
The compiler is hidden from CS-Script. The engine invokes the .NET CSharpCodeProvider, which knows how to locate the MS C# compiler csc.exe:
var providerOptions = new Dictionary<string, string>();
providerOptions["CompilerVersion"] = "v4.0";
return new CSharpCodeProvider(providerOptions).CreateCompiler();
I still think that the CLR may decide to disable shadow copying if there is something "unorthodox" about the assembly to be loaded, for example the file extension (.dll vs .compiled). I remember that the first CLR implementation refused to even reference an assembly unless it was a dll; thus all exe assemblies were outlawed.
CS-Script also injects CS-Script metadata into the cached assemblies. It is a small binary footer that is appended to the end of the assembly file. It doesn't affect anything; the assembly can be loaded and executed, but technically speaking it is an altered assembly file, and the CLR may decide to exclude it from the shadow storage.
This metadata is only appended to the *.compiled assemblies, as they are not for distribution. Exe or dll assemblies are never stamped with metadata, because the user may want to distribute them and they need to stay exactly as built by the compiler. Thus there is a binary difference between exe/dll and compiled.
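Conceptually, such a footer amounts to appending context data after the PE image and reading it back from the end of the file. The sketch below illustrates the idea only; it is NOT CS-Script's actual binary format (the length-prefixed layout and UTF-8 encoding are assumptions):

```csharp
using System;
using System.IO;
using System.Text;

// Append a context string to the end of a file, followed by its byte length,
// so a reader can find the payload by seeking backwards from the file end.
static void AppendFooter(string file, string context)
{
    byte[] payload = Encoding.UTF8.GetBytes(context);
    using (var fs = new FileStream(file, FileMode.Append))
    {
        fs.Write(payload, 0, payload.Length);
        fs.Write(BitConverter.GetBytes(payload.Length), 0, sizeof(int));
    }
}

// Recover the context string by reading the trailing length field first.
static string ReadFooter(string file)
{
    using (var fs = new FileStream(file, FileMode.Open, FileAccess.Read))
    {
        var lenBytes = new byte[sizeof(int)];
        fs.Seek(-sizeof(int), SeekOrigin.End);
        fs.Read(lenBytes, 0, lenBytes.Length);
        int len = BitConverter.ToInt32(lenBytes, 0);

        var payload = new byte[len];
        fs.Seek(-sizeof(int) - len, SeekOrigin.End);
        fs.Read(payload, 0, len);
        return Encoding.UTF8.GetString(payload);
    }
}
```

Because the PE loader ignores trailing bytes, an assembly altered this way still loads and runs; but as noted above, it is no longer byte-identical to the compiler's output.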
It's all pure speculation from my side but may be it will help.
I've been quite busy this week, so I didn't have the chance to dive into this topic deeply. And I'm on vacation for the next two weeks, so unfortunately the topic has to rest for a while.
I will again look into it at the end of April / beginning of May. You may set this item to fixed if you prefer to close it (after all, issues 2 and 3 have been resolved), or leave it open until I can look into issue 1 again.
One more thing that might help me: could you roughly explain the purpose of the stamping / the binary difference between .exe/.dll and .compiled, and tell me where I can find this in the source code? This might help me find a solution (or a bug if there is one). Maybe the best solution would be to just switch to .compiled instead of .dll for my application.
Not a problem. I think this thread is no longer appropriate for continuing this very specific discussion. I am marking this issue as fixed and isolating issue 1 in a new dedicated discussion post (https://github.com/oleg-shilo/cs-script/issues/61), which we will use in the future.
As for your question, assembly stamping is done to preserve the whole script execution context inside the assembly file. Thus, if the script engine finds an already compiled script, it can compare the current execution context with the embedded one and decide whether the script needs recompilation or the compiled script can be executed straight away.
Conceptually, the execution context is very similar to the file timestamp, though it covers more than just the time of the compilation (e.g. dependency assemblies).
CS-Script only injects the execution context info into *.compiled assemblies, but never into dll/exe.
The implementation can be found in csscript.MetaDataItems.StampFile(file) and csscript.MetaDataItems.ReadFileStamp(file).
Hi Oleg,
I have run into some issues when invoking scripts in parallel, i.e. multiple processes that refer to (and therefore try to compile and load) the same assembly. With CSScript.CacheEnabled = false I obviously run into an issue, since one instance may re-compile an assembly in the very moment another instance tries to re-compile or load it, so going without the cache makes no sense. However, with CSScript.CacheEnabled = true I also run into several issues:

Issue 1 (CacheEnabled)
When developing, the cache probing algorithm fails to detect that my hosting application's assemblies became newer and that the script must therefore be rebuilt (in order to update the references to the latest hosting application assemblies):
The script in question has e.g. been edited at 10:00 in the morning. Thus, CSScript assumes that nothing needs to be re-compiled, after all, an up-to-date assembly of the script in question can be retrieved from the cache.
However, I am working on my hosting application, thus its assemblies get built e.g. at 15:00 in the afternoon.
As a result, there is a mismatch among the hosting application's assembly versions and the .NET runtime throws:
System.IO.FileNotFoundException: The file or assembly "MyHostingAppAssembly, Version=1.10.6276.25669, Culture=neutral, PublicKeyToken=37469a15c42ef9cb" or one of its dependencies could not be found. The system cannot find the file specified.
Stack:
at System.Signature._GetSignature(SignatureStruct& signature, Void* pCorSig, Int32 cCorSig, IntPtr fieldHandle, IntPtr methodHandle, IntPtr declaringTypeHandle)
at System.Signature..ctor(RuntimeMethodHandle methodHandle, RuntimeTypeHandle declaringTypeHandle)
at System.Reflection.RuntimeMethodInfo.get_Signature()
at System.Reflection.RuntimeMethodInfo.FetchNonReturnParameters()
at System.Reflection.RuntimeMethodInfo.GetParametersNoCopy()
at System.RuntimeType.FilterApplyMethodBaseInfo(MethodBase methodBase, BindingFlags bindingFlags, CallingConventions callConv, Type[] argumentTypes)
at System.RuntimeType.GetMethodCandidates(String name, BindingFlags bindingAttr, CallingConventions callConv, Type[] types, Boolean allowPrefixLookup)
at System.RuntimeType.GetMethodImpl(String name, BindingFlags bindingAttr, Binder binder, CallingConventions callConv, Type[] types, ParameterModifier[] modifiers)
at System.Type.GetMethod(String name, Type[] types)
at CSScriptLibrary.AsmBrowser.FindMethod(String methodName, Type[] args)
at CSScriptLibrary.AsmBrowser.FindMethod(String methodName, Object[] list)
at CSScriptLibrary.AsmBrowser.Invoke(Object obj, String methodName, Object[] list)
at CSScriptLibrary.AsmBrowser.Invoke(String methodName, Object[] list)
at CSScriptLibrary.AsmRemoteBrowser.Invoke(String methodName, Object[] list)
at System.Runtime.Remoting.Messaging.StackBuilderSink._PrivateProcessMessage(IntPtr md, Object[] args, Object server, Int32 methodPtr, Boolean fExecuteInContext, Object[]& outArgs)
at System.Runtime.Remoting.Messaging.StackBuilderSink.SyncProcessMessage(IMessage msg, Int32 methodPtr, Boolean fExecuteInContext)
Exception rethrown at [0]:
at System.Runtime.Remoting.Proxies.RealProxy.HandleReturnMessage(IMessage reqMsg, IMessage retMsg)
at System.Runtime.Remoting.Proxies.RealProxy.PrivateInvoke(MessageData& msgData, Int32 type)
at CSScriptLibrary.IAsmBrowser.Invoke(String methodName, Object[] list)
at CSScriptLibrary.AsmHelper.Invoke(String methodName, Object[] list)
at MyHostingApp.ScriptRunner.InitializeAssembly(AsmHelper helper, String scriptParentProperty) in MySolution\ScriptRunner.cs:line 606.
Issue 2 (CacheEnabled)
There are moments when CSScript.LastCompilingResult becomes null and one of the parallel instances will throw a System.NullReferenceException when trying to access it. It seems that CSScript.LastCompilingResult or one of its underlying dependencies is not multi-process capable. I guess it's caused by the fact that some time passes between CSScript.CompileFile() and accessing CSScript.LastCompilingResult. In case another process invokes CSScript.CompileFile() in the meantime, something obviously goes wrong.

Issue 3 (CacheEnabled)
There sometimes happens an IOException "the process cannot access the file ... because it is in use by another process" in csscript.MetaDataItems.StampFile():
System.IO.IOException: The process cannot access the file "MyTestScriptFolder\LocalUserSettings\TcpAutoSocketTerminalPair.dll" because it is being used by another process.
at System.IO.__Error.WinIOError(Int32 errorCode, String maybeFullPath)
at System.IO.FileStream.Init(String path, FileMode mode, FileAccess access, Int32 rights, Boolean useRights, FileShare share, Int32 bufferSize, FileOptions options, SECURITY_ATTRIBUTES secAttrs, String msgPath, Boolean bFromProxy)
at System.IO.FileStream..ctor(String path, FileMode mode, FileAccess access, FileShare share, Int32 bufferSize, FileOptions options, String msgPath, Boolean bFromProxy)
at System.IO.FileStream..ctor(String path, FileMode mode)
at csscript.MetaDataItems.StampFile(String file)
By the way, CSScript.IsOutOfDateAlgorithm states that the default implementation is <see cref="CSScriptLibrary.CSScript.CachProbing.Simplified"/>, but the static property is actually set to isOutOfDateAlgorithm = CachProbing.Advanced, so this comment is outdated. (Note that the release notes of v3.6.0, "IsOutOfDateAlgorithm set to CachProbing.Advanced by default", are correct.)

So I'm somewhat stuck. Disabling the cache leads to issues, as does enabling it. Any ideas on how to solve this?
Thanks, Matthias