piaste closed this 6 years ago
Could the ContextSchemaPath be used as a local cache if you develop against a remote database? I do have stored procedures etc, but would like to just speedup the fsharp compilation time.
I'm not sure I understand what you mean. As long as the file exists, SQLProvider currently won't connect to the DB at all during compilation, so you can't get any faster than that (when it comes to avoiding remote calls).
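For reference, here is a minimal sketch of the setup I mean, assuming the standard SQLProvider static parameters; the connection string and schema file path are placeholder values:

```fsharp
open FSharp.Data.Sql

[<Literal>]
let ConnString = "Data Source=remote-server;Initial Catalog=MyDb;Integrated Security=True"

[<Literal>]
let SchemaPath = @"C:\dev\MyApp\db.schema.json"

// With ContextSchemaPath pointing at an existing file, the design-time
// component reads the cached schema instead of querying the remote database.
type Sql = SqlDataProvider<DatabaseVendor = Common.DatabaseProviderTypes.MSSQLSERVER,
                           ConnectionString = ConnString,
                           ContextSchemaPath = SchemaPath>
```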
Are you thinking of automatically updating / growing the cache file as you access new tables, instead of manually saving it? That could be nice, though I'd be happier if I could just manually force a cache update - I tried to implement something like that in another branch, but I couldn't get ITypeProvider.Invalidate to work like I expected.
The current problem is that the caches don't span processes: Visual Studio has its own, and each invocation of fsc.exe starts with an empty cache. So with a semi-large project against a remote database, a build takes minutes, not seconds.
This is marked as WIP. 1) I like the idea of parameter combinations. Is it backward compatible? 2) What is still missing, can I already merge this?
I like the idea of parameter combinations. Is it backward compatible?
In testing I found that the full combinatorial generation was a breaking change in a bad way, because it created ambiguous overloads (like (connString : string) and (resPath : string)), which mandated the use of named parameters everywhere.
I added some sugar instead, so now the overload list is explicit but very maintainable. There's a test to ensure no breaking changes.
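For example, building on the Sql type from the sketch above, a call site under the full combinatorial scheme would have had to name the parameter to avoid the string/string ambiguity (connectionString is one of the documented GetDataContext parameters; the value is a placeholder):

```fsharp
// Named argument avoids the (connString : string) vs (resPath : string) ambiguity;
// with the explicit overload list the named form is optional rather than mandatory.
let runtimeConnString = "Data Source=localhost;Initial Catalog=MyDb;Integrated Security=True"
let ctx = Sql.GetDataContext(connectionString = runtimeConnString)
```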
What is still missing, can I already merge this?
I don't have any more code changes planned, but I would still like to do a couple of manual tests; I hope I can get a chance to do them before the week's end. In particular, this is the procedure I have in mind:
1) Save a context schema .json file
2) Close VS, drop a column in the database that was used by the application
3) Restart VS, compile the application, should use the now-outdated schema and compile
4) Add the column back to the database
5) Run the application, should not throw
6) Manually put garbage in the schema .json file, should still not throw
7) Delete the .json file, should still not throw
Ok, I've rebased on master and done the manual testing I described above, I'm OK with merging this.
Hi, is there an ETA for when this will drop on NuGet? I would love to use ContextSchemaPath but I currently can't because of this issue.
Sorry, I checked the 3 changed files and thought that this wouldn't actually change anything; maybe I missed something. I'll release now: 1.1.50.
The issue is somewhat subtle: I'm compiling for .NET Core, so I need to use the .NET Framework compiler, and the serialization format is apparently different between the two for some reason, so the runtime failed to parse the file created at compile time.
It works fine now with 1.1.50, thanks!
Pre-submitting for review / feedback. Tests are green, but I should add a few more to ensure that the design-time / runtime differences don't hold any surprises; the current CI tests change the connection string and occasionally the timeout, but nothing else. And the issue with schema files on dev machines is going to need manual testing.
The urgent concern is addressing #552. This should be solved by passing "" as the contextSchemaPath in the runtime quotation here, preventing runtime loading. The rest is a cleanup of that function, which @Thorium pointed out was in sore need of it. This version currently takes the lazy approach and just generates all 32 possible overloads of .GetDataContext() (which should ensure no breaking changes). But if one wanted, we could manually pick and choose the available overloads, by replacing the five for-loops here with an explicit list:
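To make the counting concrete, here is a rough sketch of the combinatorial idea, not the actual code in the PR: five optional runtime parameters give 2^5 = 32 possible .GetDataContext() overloads when every subset is generated. The parameter names below are illustrative.

```fsharp
// Illustrative names for the optional runtime parameters.
let optionalParams =
    [ "connectionString"; "resolutionPath"; "transactionOptions"; "commandTimeout"; "selectOperations" ]

// Enumerate every subset of the optional parameters; each subset is one overload signature.
let overloads =
    let n = List.length optionalParams
    [ for mask in 0 .. (1 <<< n) - 1 ->
        optionalParams
        |> List.mapi (fun i p -> if (mask >>> i) &&& 1 = 1 then Some p else None)
        |> List.choose id ]

// overloads.Length = 32; the alternative mentioned above is to replace this
// enumeration with an explicit, hand-picked list of parameter combinations.
printfn "%d overloads" (List.length overloads)
```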