nvivo closed this issue 4 years ago
They presumably switched to abstract base classes because interfaces can't be versioned.
I'm guessing the non-virtual methods call another method that is virtual. Virtual methods hurt performance so you don't want to make things virtual unnecessarily.
@JamesNK I see. But with .NET Core being a new API and ADO.NET API being quite stable over almost a decade, do you think this is still a valid concern? Also, talking about database access my guess is that the cost of virtual methods is dwarfed by the cost of the database access.
@NickCraver, @roji, @FransBouma since you guys seem to have an interest in the ADO.NET API, do you have anything to say about this?
@YoungGah, is this something worth pursuing?
> I'm guessing the non-virtual methods call another method that is virtual. Virtual methods hurt performance so you don't want to make things virtual unnecessarily.
In the process of executing a query on a remote database and processing the results, the nanoseconds lost on a virtual call are negligible. Besides, ADO.NET has used this design since the beginning (as do lots of other APIs in .NET) and no-one has complained that their DB code is slow due to virtual method calls ;)
I can see async methods in your list, so I'm guessing that just a couple of years ago MS couldn't have added async to IDbCommand. Who knows what tomorrow will bring that will require new methods or properties.
Interfaces don't version.
Performance is just one reason not to make something virtual. Reducing the surface area for implementations, maybe? I'll let someone at MS say why they decided not to; I don't know much about ADO.NET so I'd just be speculating.
@JamesNK I think your concerns are valid, but there are 2 important points to consider:
Consider also that even a SQL Server on "localhost" will spend at least a few ms just to connect and return an empty query. In practice, most fast queries to relational databases take ~20ms.
Being able to mock the API with standard tools like NSubstitute or Moq is much more valuable to the developer today than saving microseconds in virtual method lookups.
I guess I don't have a very strong opinion here, but here are some remarks:
For example, the base classes let providers implement only the core overloads: the parameterless ExecuteReader() is a non-virtual method that simply passes CommandBehavior.Default to the overload that does take a behavior. If you switch to interfaces, every provider will have to implement ExecuteReader() with the exact same boilerplate... So overall the idea of dropping either the base classes or the interfaces seems good (simpler). Since I don't see any advantage to interfaces (unless I'm wrong about the ease of mocking), and base classes can provide common functionality (i.e. the non-virtual, non-abstract methods), I guess Microsoft's approach of dumping the interfaces seems good to me...
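The base-class pattern being described can be sketched as follows (a minimal stand-in with made-up names, not the real DbCommand source):

```csharp
using System;

// Hypothetical types mimicking the ADO.NET shape: the parameterless
// overload is non-virtual and forwards a default value to the single
// abstract "core" method that providers implement.
public enum FakeCommandBehavior { Default, SchemaOnly }

public abstract class FakeDbCommand
{
    // Convenience overload: implemented once, in the base class.
    public string ExecuteReader() => ExecuteReader(FakeCommandBehavior.Default);

    // The only member a provider actually has to write.
    public abstract string ExecuteReader(FakeCommandBehavior behavior);
}

public class FakeSqlCommand : FakeDbCommand
{
    public override string ExecuteReader(FakeCommandBehavior behavior)
        => $"reader:{behavior}";
}
```

With an interface instead of a base class, every provider would have to repeat the forwarding overload itself.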
I agree with @roji on all his points.
@roji just a note, I'm not proposing dropping the base classes, I'm proposing adding the interfaces as the default API. Base classes can still implement default behavior.
As for testing, I had huge issues testing if my API was calling the right methods. To check if ExecuteDataReader received the right parameters for example, you must check another protected method that is called internally with different parameters. This is far from ideal.
Currently, unless I'm mistaken, the only framework that can mock the ADO.NET API is MS Fakes, which can mock absolutely anything by intercepting calls. Moq and others can't do that.
I'm interested to see if other people had similar issues.
> @roji just a note, I'm not proposing dropping the base classes, I'm proposing adding the interfaces as the default API. Base classes can still implement default behavior.
Sorry, I misunderstood that. In that case, isn't your proposition more or less keeping things the way they are in .NET (not that there's anything wrong with that)?
> As for testing, I had huge issues testing if my API was calling the right methods. To check if ExecuteDataReader received the right parameters for example, you must check another protected method that is called internally with different parameters. This is far from ideal.
If I understand your scenario (not sure), Moq's CallBase is useful for this kind of scenario - default implementations are inherited from the base class.
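For what it's worth, a sketch of what that looks like in practice (this assumes the Moq package and its Moq.Protected helpers; the protected ExecuteDbDataReader is what the real DbCommand's non-virtual ExecuteReader() forwards to):

```csharp
using System.Data;
using System.Data.Common;
using Moq;
using Moq.Protected;

var reader = new Mock<DbDataReader>();

// CallBase = true: members without an explicit setup fall through to the
// real base-class code, so the non-virtual ExecuteReader() still runs and
// forwards to the protected virtual ExecuteDbDataReader we set up below.
var cmd = new Mock<DbCommand> { CallBase = true };
cmd.Protected()
   .Setup<DbDataReader>("ExecuteDbDataReader", ItExpr.IsAny<CommandBehavior>())
   .Returns(reader.Object);

DbDataReader result = cmd.Object.ExecuteReader(); // reaches the mocked overload
```

This only works because DbCommand is a class with a virtual extension point; it's exactly the hole nvivo is describing for the members that aren't virtual or protected.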
@roji
> isn't your proposition more or less keeping things the way they are in .NET (not that there's anything wrong with that)?
Not exactly. The interface API was added in .NET 1.0 and deprecated in 2.0. Since 2.0, the interfaces are there for compatibility, but there is no interface for DbProviderFactory or the other classes in Data.Common. There is also nothing for the async API or for any methods added in 2.0 or later.
Moq can only mock things that are mockable. There must be some method that is either virtual or abstract that it can override, or a protected method it can call. The current API provides that for some cases, but not for most of them. There are many things that are internal or private and out of reach unless you use reflection. Only MS Fakes can do it, because it replaces the reference with a shim, but that is only available in VS Enterprise and useless for open source projects.
It sounds like I have a very specific case, but certainly anyone who has ever tried to mock this API has faced this issue. Just google it; almost every solution ends up with "mock the legacy interface API or build a wrapper you can mock":
@nvivo OK, thanks for the extra details - I admit I haven't gone very far with mocking ADO.NET.
The thing I don't understand is why you would want to mock internal, private and otherwise out-of-reach methods of an API. Shouldn't you be mocking public methods that are directly available to your own application code (which is what you're trying to test)? I do see the issue with the non-virtual methods (e.g. the 0-parameter ExecuteReader() overload), but given that in ADO.NET these always (?) call some virtual overload (e.g. ExecuteReader(CommandBehavior)), is there a real issue here?
Just trying to understand your problem scenario, can you give a simple example?
@nvivo We currently have no plan to bring the interfaces in because of the versioning issue which was already pointed out by several people on this thread. A good example of interfaces falling behind is when async and streaming methods were added in .NET Framework 4.5. When we added those new features, we carefully looked into extending the interfaces. The options we had at that time were to provide either an InterfaceFooV2 or separate interfaces for async and streaming. We didn't want to add InterfaceFooV2, as we could foresee that we would want to add more APIs in the future. Adding a separate interface for each new piece of functionality would be confusing, as they are not tied to the existing interfaces.
@roji I had cases where I wanted to ensure that a specific overload of ExecuteReader was called, not just any of the overloads. It's the kind of thing you have only in libraries, not in user code.
@YoungGah thanks for the info. I'm closing this then.
Do the people responsible for this change have any idea of its effect? The core ADO.NET interfaces have been around for over a decade, and with data access being the center of most business apps, I'm having trouble conceiving how not breaking so many existing code bases isn't the highest priority. These are some of the most critical high-level interfaces in .NET; removing them breaks every ADO.NET data access library out there and, by consequence, every project using one. Removing them creates an artificial fragmentation, causing frustration and confusion that will hamper adoption of CoreCLR.
You can still version interfaces and make them source compatible by just adding extension methods for any new APIs off IDbCommand, e.g.:
```csharp
public interface IDbCommand
{
    //...
}

public class DbCommand : IDbCommand
{
    public void NewApi() { /* ... */ }
}

public static class DbCommandExtensions
{
    public static void NewApi(this IDbCommand cmd)
    {
        ((DbCommand)cmd).NewApi();
    }
}
```
The core IDbCommand interface never has to change after DNX is released, and you can continue adding functionality with the strategy above. You could also later (in a major breaking version) roll up these extensions and merge them into the core interface. Either way we get the stable core ADO.NET interfaces that are critical for migration of existing code bases and consequently for adoption of CoreCLR.
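A self-contained version of that strategy, showing the call-site side (the int return value is made up here so the sketch has something observable; the real proposal used void):

```csharp
using System;

// Frozen interface: never changes after release.
public interface IDbCommand { }

// The base class is where new functionality actually lands.
public class DbCommand : IDbCommand
{
    public int NewApi() => 42;
}

public static class DbCommandExtensions
{
    // Call sites written against IDbCommand pick this up transparently;
    // note the downcast assumes every implementation derives from DbCommand.
    public static int NewApi(this IDbCommand cmd) => ((DbCommand)cmd).NewApi();
}
```

The caveat is the cast: the trick keeps source compatibility, but only works because (as @davkean notes below) providers are expected to derive from the abstract base class anyway.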
I've been asked by @davkean to provide concrete examples of what impact removing the core ADO.NET interfaces will have. I can't imagine this change was considered without evaluating the immeasurable impact it would have on the existing .NET ecosystem, but then again it's also been done, so there's a possibility it wasn't considered - which I'm going to assume herein it wasn't.
Despite EF's role as .NET's default ORM and its outstanding success in capturing a large majority of market share, there's still a large population of .NET developers who prefer to use alternative ORMs, for a number of different reasons. E.g. an important feature as it relates to CoreCLR is that they have first-class support for running on Mono/Linux/OSX as well as supporting multiple alternative RDBMSs. Since CoreCLR is pitching heavily for the Linux/OSX developer market, the more support there is for alternative RDBMSs, the better. Another important trait of the population of devs who adopt micro ORMs is that they've evaluated options outside of MS ecosystem defaults to choose the ORM that's most appropriate for them. From everything I've seen there's a high correlation between active .NET OSS (i.e. anti-dark-matter) devs and devs who adopt micro ORMs; likewise I expect this to have a high correlation with early adopters of CoreCLR - whose major value proposition is to develop on OSX/Linux. These are some of the reasons why it'd be beneficial to include the surrounding .NET ecosystem in your decision making when making fundamental breaking design choices like this.
A cursory glance at NuGet downloads provides an indication of what the non-EF market-share looks like:
- NHibernate - 1M+
- Dapper - 1M+
- OrmLite - 500k+
- Simple.Data - 300k+
- PetaPoco - ~100k
- NPoco - 30k+
The real numbers are a lot higher than this, as many micro ORMs like Dapper, Massive, PetaPoco, NPoco, etc. were designed to fit in a single drop-in .cs file, so NuGet isn't reporting their true usage. There are also closed-source ORMs like LLBLGen Pro which have a large user base but whose usage isn't reported by NuGet; likewise I'm sure I've missed a number of other ORMs I forgot about or don't know about.
Thanks to GitHub we can do a quick search to see how many different source files contain the core IDbConnection, IDbCommand and IDataReader ADO.NET interfaces impacted by this change:
| | IDbConnection | IDbCommand | IDataReader |
|---|---|---|---|
| NHibernate | 59 | 181 | 132 |
| Dapper | 17 | 21 | 17 |
| OrmLite | 179 | 54 | 26 |
| Simple.Data | 29 | 27 | 6 |
| NPoco | 4 | 10 | 3 |
Note: these results only show source files; the actual number of broken references is much higher.
The actual impact of this change also extends to all the projects depending on these ORMs. Unfortunately the effect isn't limited to internal implementations, as it also breaks customer source code, since many micro ORMs are just extension methods over the ADO.NET interfaces, so the client code looks like:
```csharp
IDbConnection db = ...

//Dapper
db.Query<Dog>("select Age = @Age, Id = @Id", new { Age = (int?)null, Id = guid });

//OrmLite
db.Select<Author>(q => q.Name.StartsWith("A"));
```
One extensibility benefit of using extension methods is that these ORMs are "open-ended": customers can extend them with their own first-class APIs by adding extension methods in their own projects - these are broken as well.
Obviously, any source code that passes IDbConnection around is now also prohibited from working on CoreCLR.
E.g. the core ADO.NET interfaces are heavily used throughout high-level frameworks like ServiceStack, adopted because they're the minimal dependency enabling multi-RDBMS data access. It was also assumed that, of all the types, the core ADO.NET interfaces were the least likely to change.
I'm personally astonished there was ever going to be a future in .NET without these interfaces. Interfaces are by design purpose-specific to allow for multiple implementations and ADO.NET is one of the most important "open provider models" in .NET. I've no idea what priorities caused these interfaces to be removed, but both the massive existing .NET code bases that rely on these interfaces along with the alternative-EF .NET ecosystem should be given a much higher priority. This is causing a significant disruption and is a major barrier required to support both existing .NET 4.x and CoreCLR platforms, forcing a non-trivial amount of additional complexity that must be applied to all existing code-bases affected by this.
The current perception is that ADO.NET/CoreCLR is being re-designed to provide first-class support for EF and SQL Server with the rest of the ecosystem being disregarded - non-transparent breaking decisions like this only reinforce that stereotype.
As a previous member of the .NET team (I now work on Roslyn), I was heavily involved in the original design of the new data common, along with the SQL and Entity Framework teams. I'm not involved in it at the moment, but I can add some background to help correct some of the statements that I'm seeing on twitter and above.
The current design of System.Data.Common for .NET Core started in December 2012, approximately 2 years before we open sourced.
Goals:
- Design a surface area that removed the duplication (IDbConnection vs DbConnection), confusions, mistakes and layering issues (split SqlClient from DataCommon, split DataSet from the core abstractions) of the original .NET 1.0 design - one that would be easily picked up, both by existing consumers and by developers new to .NET Framework.

Correction of a few things being spread around:
Something I'm having trouble understanding:
> We cannot add members to interfaces
How is it not OK to add members to CoreCLR interfaces, yet fine to rip them out entirely?
> the proposal above provided by @mythz via extension methods requires that providers derive from the abstract base classes anyway.
The important part is that the interfaces exist and allow source code that references them to compile.
If you don't want to version the interfaces, fine, EOL them; just restore the interfaces as they were before they were ripped out and mitigate the burden now imposed on every other library using them. These core interfaces were never obsoleted, and no warning or migration path was ever provided. Yet we're getting punished for adopting a published, well-known, stable API as an integral part of our libraries?
> it is a very simple port to move to the base classes.
This needs to be added to every source file that references the ADO.NET interfaces, and it forces customers to litter their code with custom build symbols.
There doesn't seem to be the same care for backward compatibility here, but purposely breaking existing customers in a future release is just not an option (I'm surprised it's even considered, given ADO.NET's much larger market share). We can't break existing 4.x customers, and yet we're being asked to support CoreCLR - so where does this leave existing 4.x libraries that want to maintain backward compatibility and also support CoreCLR? Should we be duplicating docs/examples as well?
> How is it not OK to add members to CoreCLR interfaces, yet fine to rip them out entirely?
The surface area in .NET Core needs to be binary compatible with .NET Framework to enable 1st and 3rd parties to build against .NET Core and run without changes on .NET Framework. Adding members to interfaces violates that, as consumers of those members would fail when run on .NET Framework.
I'm not arguing for the removal or addition of these interfaces, I just wanted to add some background to why the design ended up where it is. I'll let the current owners including @YoungGah and @saurabh500 tackle that.
Just to summarize the thread: the reason you believe Microsoft should port these interfaces is to enable the ecosystem to easily port to .NET Core, while maintaining their .NET Framework implementations?
> is to enable the ecosystem to easily port to .NET Core, while maintaining their .NET Framework implementations?
Yes.
> Just to summarize the thread, the reason that you believe Microsoft should port these interfaces, is to enable the ecosystem to easily port to .NET Core, while maintaining their .NET Framework implementations?
Yes. External APIs are now broken if I port my codebase (LLBLGen Pro) to CoreFX: I then have to either expose two APIs or break the existing codebase for all my users.
It might be fine for you people to break our stuff as you don't feel the pain, we do. It's not fine by me: I have to either live with a butchered code base and maintain 2 APIs which do the same thing, OR break my users' code because you thought that was OK.
I also don't get why interfaces "don't version" - it's just an interface; a class has an interface too. CoreFX could perfectly well add the async methods to the interfaces.
> The surface area in .NET Core needs to be binary compatible with .NET Framework to enable 1st parties and 3rd parties to build against .NET Core, and run portability without changes on .NET Framework. Adding members to interfaces violates that, as consumers of those members would fail when they ran on .NET Framework.
Easy solution: add the interfaces as they are now. And once you all come to your senses that this rule above is actually rather stupid, you can add the methods you needed to add to the interfaces long ago to the interfaces and move on.
I work with MS software long enough that rules like the one above are great on paper but in practice are broken the second an important MS team needs it to be broken. If you are so 'open' and 'different' as you say are in the CoreFX marketing/pr hoopla, show it. All I see with respect to System.Data and CoreFX is 'what MS needs is done, what everybody else needs is on the backburner or ignored'.
Another thing I forgot to mention: Fowler mentioned yesterday on Twitter that you need everybody to port their stuff. I have to pay for porting my 500K LoC codebase to CoreFX myself; it takes time and effort and will take away time from other features. Extra friction that's totally artificial (it's a new platform! How can there be restrictive rules?) really doesn't help at all: it adds extra maintenance costs, takes extra time to port and test the code, and places an extra burden on our users.
All that is out of your scope, and not your concern, it seems. But you forget one thing: what if we don't port our code, and more people with me? I'm willing to invest time and thus my own money to port my large codebase to your new shiny framework, but sorry to say it: whenever I run into a problem I'm met with restrictions, odd rules and endless debates ending in silence. I.o.w.: I feel very much left alone, while at the same time you seem to want us so desperately to like your new shiny framework.
Like I said a long time ago : Sell me this framework, this new CoreFX. Well, keeping friction and introducing a lot of moved and taken away cheese is not creating a large incentive to invest a large amount of time (and money) into this.
Just my 2cents.
@FransBouma Please let's try and keep this conversation professional, productive and focused on facts.
I'm not arguing for or against adding the interfaces. However, adding methods to interfaces is not a compatible change. Let's walk through it:
1. Add IDbConnection.OpenAsync to .NET Core.
2. Anyone who calls this method will now fail to run on .NET Framework (breaking a core principle/goal that I called out above). This also breaks the XAML designer and a few other VS features which rely on this very fact.
3. To bring .NET Framework up to date, we ship a new version of .NET Framework "4.7" with IDbConnection.OpenAsync.
4. Every single type that implemented IDbConnection prior to adding this method now fails to load on .NET Framework "4.7".
This is why we cannot add methods to interfaces.
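For contrast, this is why the abstract base classes dodge the problem: a new virtual member with a default body is purely additive. A hedged sketch with made-up names (not the real DbConnection):

```csharp
using System;
using System.Threading.Tasks;

public abstract class FakeDbConnection
{
    public abstract void Open();

    // Added in a later version; types compiled before it existed keep
    // loading and running, because they inherit this default body.
    public virtual Task OpenAsync()
    {
        Open();                   // naive default: wrap the sync path
        return Task.CompletedTask;
    }
}

// A "legacy" provider written before OpenAsync existed: still valid.
public class LegacyConnection : FakeDbConnection
{
    public bool Opened;
    public override void Open() => Opened = true;
}
```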
If I keep my frustration with how things go with respect to communicating issues with MS to myself, you all won't know about it and think everything is roses and rainbows. If that looks unprofessional, so be it, I'm beyond caring whether MS thinks I'm a professional or not.
That said: I'm not married to the interfaces, so if they're gone, the fact that from then on there are classes and no interfaces to work with in theory won't make me a sad panda: what should be done can be done, in theory, through the base classes as well, today, as today all major ADO.NET providers play nice and derive from the base classes (this hasn't always been the case, IIRC, with ODP.NET implementing an interface but not deriving from the base class). This is also the reason why, earlier in this thread, I initially didn't think removing them was a big deal. Since then I've had some time to think about it, and I think it is a big deal.
We don't live in a vacuum on Mars, and middleware/frameworks at the bottom of the stack have a problem now: users of current .NET full versions of these frameworks want to keep using them on CoreFX as they know these frameworks. Porting them over to CoreFX is however a big PITA, because of a myriad of reasons, one of them being often used interfaces exposed in public APIs not being present on CoreFX (and the reason for this thread).
For that reason alone I'd like to see the interfaces back. For me personally not for technical reasons (e.g. async needs base classes, it's a mess already). I know they lack certain methods, but that's your problem, not mine. Removing them makes that my problem and (paraphrasing now) the MS response to that is: throw up your hands with "can't be done!". But I don't have that luxury. You created this mess, you solve it. You want me to port my code, to invest a lot of time and money (which I have to pay for myself) to support your new framework, why are you then making your problem my problem?
Looking at your 4-step scenario: adding methods to interfaces isn't a problem IF you see CoreFX as a separate framework. And isn't that the case anyway? It's the same as with Compact Framework all those years ago (which I did port my framework to, and I learned a couple of hard lessons then that tell me that porting to CoreFX won't be simple, fast or easy, and keeping two code bases won't be either): we start with one API, then someone forgets something or some team within MS needs something, and voilà, a breaking change only a handful of low-level stack devs will run into, and so on, and the two roads will split.
(Example: Compact Framework forgot 'SerializableAttribute'. They added it as a dummy attribute doing nothing in a later version, but that broke code which had anticipated it not being present and defined its own.)
Splitting roads is understandable though: trying to keep things compatible is too restrictive. I predict here now that this rule will be broken in the future.
Seeing things as 'compatible' is important not only at the API signature level, but also on the API behavior level. Trusting that those two will be completely the same (CoreFX and .NET Full) in API behavior is too risky: a framework developer will have to test the same functionality on CoreFX and on .NET full, there's no way testing on CoreFX alone will be enough to assume the code works 100% the same on .NET full in the future: because how can you guarantee that? A call stack 20 calls deep on CoreFX has touched so much other code than on .NET full, a small detail here and there and things change.
The point in all of this is: it's a separate framework: code compiled against CoreFX can be expected to be different from code compiled against .NET full.
There are a couple of situations:
1. A framework has a code base of which 100% compiles on CoreFX. This gives one dll, which is runnable on .NET full.
2. A framework has a code base of which 70% compiles on CoreFX and 100% on .NET full. This gives two dlls: one for CoreFX and one for .NET full. It's silly to run the CoreFX version on .NET full, as one misses 30% of the functionality.
In case 1) I understand your point. In case 2) (which is the case for all current .NET-full-targeting frameworks, among them all 3rd party ORMs) your point is really meaningless, as they'll have to work with two dlls anyway: effectively two codebases which have to be maintained separately, tested separately and migrated to their own new versions separately. Especially if CoreFX gets new features which aren't part of .NET full yet (which will be the case). (Btw: if you add DbDataReader.GetSchemaTable() to CoreFX and it returns a different data structure than a DataTable, because MS refuses to port that, code using DbDataReader.GetSchemaTable on CoreFX will break on .NET full as well. If you name it differently it will break as well, as the method isn't there. I.o.w.: code will break if things which aren't in both frameworks are used. That doesn't mean those things shouldn't be present in CoreFX.)
To have no interfaces on CoreFX makes the situation of the framework in situation 2) a persistent one: they can't move to become a framework which fits in 1) because e.g. their API exposes the interfaces.
Microsoft rewriting their own stuff so their frameworks become frameworks in situation 1) is cool, however we don't have a million$ budget, 15+ people on the ORM runtime and a big PR machine on our side who will smooth over the wrinkles of breaking every app out there. So we're either stuck in 2) or require a little help from MS to move to 1).
That's what's at stake here. You said on twitter "tell us what you need". We did. Repeatedly. Especially regarding System.Data there is no communication. Nothing. No future plans, no discussion what to do, just dead ends and sometimes if a MS person steps in it's one who has no real stake in the matter. I appreciate your time on this, the more background we get the better, but at the same time, it's like talking to a co-worker about this: it won't get solved because the person(s) in charge are absent and not participating in the discussion.
If that makes me sound frustrated and god forbid 'unprofessional', then so be it.
Thanks for listening. Btw I have no illusions wrt System.Data: it will be a trainwreck of an API to port code to and as there's no communication from the people in charge with developers who write key frameworks on top of their API, there's little to no hope things will change. Not your fault, @davkean, it's nothing personal.
I have to echo the frustrations above about the lack of communication. We need bulk inserts and schema information as well. There has been no advancement or communication in over a month (see dotnet/runtime#15269 and dotnet/runtime#14302) on this missing core (in both senses) functionality. Yet Microsoft is labelling the current code as "a candidate for release", which is itself a message of "it's good enough." It's not. Core things are missing that need to be added and, if you follow these threads, need to be in the first version for the same versioning reasons.
Look at the last update on dotnet/runtime#14302 ("Why is DataTable/View/Set Absent?"), it's from 22 days ago asking:
> So System.Data.Common is feature complete now for V1 of CoreCLR?
Yes, frustration can come off as unprofessional. The tone and context of text sucks and always has, but that's what we're restricted to here. I think everyone is trying to be productive here, but we're getting quite a bit of stonewalling from the CoreFX side on actual progress in the System.Data area, and that is, to be blunt, infuriating both as a library author and as a user of these bits.
We need these core functional pieces, interfaces or not - I'm not hard set on interfaces and we've ported Dapper without them. But lack of DataTable, result schema info, bulk insert, and such are unacceptable in a "release candidate". Microsoft is the one increasing the frustration with labelling the current code as RC when it's almost universally agreed that it's not ready for release. Yes, it's just a label, but it's both an incorrect label and one that drastically increases the level of urgency because it's based on an arbitrary schedule (that should have changed to reflect reality). I don't think anyone in this thread is responsible for that schedule, but it's worth stating as a major factor in the frustration level.
Let's get back to the root problem. We need these pieces, and many of our millions of users do too. So let's fix it.
Let's not forget NHibernate, with 1M+ downloads:

| IDbConnection | IDbCommand | IDataReader |
|---|---|---|
| 59 | 181 | 132 |
> The current perception is that ADO.NET/CoreCLR is being re-designed to provide first-class support for EF and SQL Server with the rest of the ecosystem being disregarded - non-transparent breaking decisions like this only goes to re-enforce this stereotype.
That perception is reinforced by things like this: https://github.com/dotnet/corefx/issues/4646
As far as I can tell, there is zero way of implementing that API in any useful way outside of the SqlClient assembly.
I'm currently ok with testing without interfaces. But honestly I don't get the reasoning with interface versioning and compatibility.
Isn't the idea of .NET Core that it's a new framework without the burden of compatibility, bundled with your application, so you don't have to deal with issues like that? The provider is already incompatible with the one in .NET due to the lack of things like schemas and DataTables, so what would it break compatibility with? If the interface changes, just compile against the new version and bundle it with your app.
It just sounds like most of the excuses for the design are worries carried over from the old framework that don't apply to the new one. Anyway, let's see how it turns out in practice.
For those of us who intend to support multiple frameworks and have historically targeted the interfaces... I just want to share a pile of ugly that Dapper uses; I'm not saying this is good, but it is enough to make it compile. Of course, it is duplicated in a huge pile of files... I am mainly sharing this to emphasize yet another of the impacts:
https://github.com/StackExchange/dapper-dot-net/blob/master/Dapper/SqlMapper.cs#L6-L16
> We cannot add members to interfaces
Correct, and that's a good feature of interfaces. The preference for abstract base classes is the safest way to help API entropy along, instead of fighting it.
While you don't have to follow the principles of OOD, I'd suggest that you do, when creating OO APIs. In short, the Interface Segregation Principle (ISP) states that no client should be forced to depend on methods it does not use.
If you add new methods to an existing abstraction, you automatically violate the ISP.
You can decide that you 'don't have to adhere to SOLID' because you're Microsoft, and you're working with the BCL, so 'normal rules don't apply' (not actual quotes; just paraphrasing the usual counter-arguments).
Having maintained a couple of open source projects for 6-7 years, in my experience it's better to keep interfaces small. If you need to add new capabilities to an abstraction, introduce a new interface.
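Applied to this thread, that suggestion might look something like the sketch below (hypothetical interface and member names, deliberately not the real System.Data ones):

```csharp
using System.Threading.Tasks;

// The original, minimal abstraction stays frozen forever.
public interface ICommand
{
    object ExecuteScalar();
}

// A new capability arrives as a new, small interface; existing
// implementers are untouched, and new ones opt in explicitly.
public interface IAsyncCommand
{
    Task<object> ExecuteScalarAsync();
}

public class ModernCommand : ICommand, IAsyncCommand
{
    public object ExecuteScalar() => 42;
    public Task<object> ExecuteScalarAsync() => Task.FromResult(ExecuteScalar());
}

public static class Runner
{
    // Callers feature-test for the extra capability at the seam.
    public static Task<object> RunAsync(ICommand cmd) =>
        cmd is IAsyncCommand asyncCmd
            ? asyncCmd.ExecuteScalarAsync()
            : Task.FromResult(cmd.ExecuteScalar());
}
```

Old providers that only know ICommand still work; callers that want async simply check for the second interface.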
If there were upvotes here, I'd have upvoted @ploeh's comment +1000
Insightful comment from @ploeh as usual.
@FransBouma, we will be adding the replacement functionality for DbDataReader.GetSchemaTable() in a way that will not break the full framework. @NickCraver, SqlBulkCopy is in our future plan and we are working on schema. We are slow in making progress on schema as we also need to make reasonable progress on getting our stack to work cross-platform. @mythz, thanks for providing the examples, numbers, and assessment of customer impact. We will review them.
@YoungGah Please update the issues involved with information so these issues stay up to date, e.g. https://github.com/dotnet/corefx/issues/1039, as otherwise the sparse info is scattered all around. It's nice you will be adding GetSchemaTable (and DbConnection's equivalent, don't forget that one!), but it's so hard to get any info around what will happen and when. Is there any plan what will be added when? All we have to go on now is a hint that in 'the future' something might be added. That's not really great for planning a port of a code base, to be honest.
@FransBouma, yes, I will update the other threads as well. As for your request for more information on what will be available and when, I fully understand why you need it. I will publish a list indicating whether each feature/functionality will be available in v1, was removed purposefully, will be available post-v1, or has a design for future availability that is still pending. I will try to post it during the next 2 weeks. As for the GetSchema functionality on DbDataReader and DbConnection, our plan is to make it available for the rc2 release. If the plan changes for some unforeseeable reason, we will update the community.
Whatever happens here, and for future reference @YoungGah: IDataReader has a dependency on DataTable, which has a dependency on DataSet (which we consider a separate layer, because it's policy-heavy, unlike these types, which are policy-free), so there's some design work here to break that if these interfaces were ever brought back.
I'd place another vote here for @ploeh's approach; have interfaces, but much, much more fine-grained than most of the interfaces currently in the BCL. This deals both with @davkean's comments on decoupling and with versioning.
Why can't you just have a new interface inheriting the old one? Obsolete the old one, and remove it in the future. At least then you can extend it without breaking existing uses.
Or multiple smaller interfaces. I just got to ploehs comment.
I don't understand this need to have perfect compatibility with the original .NET. There is nothing to break here, this is a new framework and the perfect opportunity to break the ties with legacy code and apply changes that are needed for a long time but would hurt in the original framework.
When I proposed interfaces back, I wasn't thinking about bringing the original 1.1 interfaces, but updated interfaces with the new design. There could even be more of them, as @ploeh said.
Interfaces yes, legacy support if possible, but shouldn't be a priority at this point.
Reopening since there is a lot of interest in this topic.
There is nothing to break here, this is a new framework and the perfect opportunity to break the ties with legacy code and apply changes that are needed for a long time but would hurt in the original framework.
So the perfect opportunity to get rid of the original badly designed interfaces and standardize on base classes like ADO.NET is already doing? :trollface:
Seriously though, you have a choice between a clean API or backwards compatibility. Pick one. I don't see a way to have your cake and eat it too.
@JamesNK but that's exactly the point. Backwards compatibility is not required, period.
You joke, but the badly designed APIs with interfaces were bad because they were badly designed, not because they were interfaces. =) It's not like interfaces aren't used absolutely everywhere in .NET, or that this is something new. Just design the thing correctly this time and move on.
ADO.NET is one of the most stable pieces of code in all .NET. It had two or three serious changes in 15 years. It's the perfect API to stabilize to an interface and make it simpler for everybody.
As a note, this topic here is one of the most commented issues too, and had the same long discussion on the interfaces vs virtual methods, testability and stuff.
@nvivo I must admit, I'm confused. After we established that the base classes enabled testability, this thread morphed into bringing back the interfaces to enable porting .NET Framework code to .NET Core. How does redesigning the interfaces and introducing something new help that?
Why can't we have the original interfaces for backward compatibility and go forward with whatever you opt for (either abstract classes or small interfaces)? The original interfaces could sit on top of the new stack and provide backward compatibility. This part could be optional as well. That'd make porting easy and still allow the new way.
@davkean
I can't respond for everyone that commented here. I proposed using interfaces as the API of ADO.NET, updated to the new current API. I didn't ask to bring the original interfaces and all the problems it had. The purpose was to have a cleaner defined API and make it easier to mock it and test code that depends on it, mostly data abstraction libraries and not user code.
Interfaces are better at describing APIs, as @ploeh wisely said, and a hell of a lot easier to mock. The current design is terrible for mocking, and requires you to implement almost an entire provider as a library just to do simple tests. If that's not important to everyone, I can understand. But I definitely don't agree it's testable enough today. Testing whether a method A was called with parameter X by checking whether method B called C with parameters X, Y, Z is not ideal.
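To illustrate the mocking point, a sketch using Moq and xUnit against the classic `IDbCommand` shape: with an interface, stubbing a single member takes a few lines and no provider implementation.

```csharp
using System.Data;
using Moq;
using Xunit;

public class CommandTests
{
    [Fact]
    public void ExecuteScalar_CanBeStubbedDirectly()
    {
        // With an interface, no concrete provider is needed for the test:
        var command = new Mock<IDbCommand>();
        command.Setup(c => c.ExecuteScalar()).Returns(42);

        Assert.Equal(42, command.Object.ExecuteScalar());
        command.Verify(c => c.ExecuteScalar(), Times.Once);
    }
}
```

The equivalent against an abstract class only works for members that are virtual, which is exactly the limitation being discussed.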
Now, just to see how classes are already creating a bad design: why add `Get` methods for some core types but not for others, like `GetByteArray()` and `GetDateTimeOffset()`? And is retrieving a `TextReader` even required if this can be done outside using strings or streams? If this was an interface, would methods like these be added to the API, or created as extension methods or helpers in the concrete classes (like the `GetSql*` family in `SqlDataReader`)? These are all rhetorical questions. I'm sure all of them have compelling answers and things have been considered. The point is that the current design is clearly not something that has been thought out as an API definition, something that probably would receive more attention with interfaces.
Honestly, from the outside the discussion of the design sounds like it went as:
Yes, there is a little rant there, and this discussion is going nowhere =)... But I just wanted to get this off my chest. It's 2015, everything breaks all the time and we're used to it. There will be 20 updates to ASP.NET MVC in the next few years that will cause a lot more breaking changes than interface changes in ADO.NET.
I still love .NET and what you're doing with it in general, I'm sure it's a rush to get .NET Core v1 out in time and not everything will be perfect. I just hope the community can help steer this to other directions as the time goes and you're not afraid to break things as we move.
For ORM maintainers, why not do
#if COREFX
namespace System.Data {
    public interface IDbConnection { ... }
}
#endif
and use the adapter pattern to wrap the new System.Data with your own implementations? In fact you could make an open source code package for it and share.
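Fleshed out, that suggestion might look roughly like this (a minimal sketch; the interface members shown are illustrative, not the full historical `IDbConnection`):

```csharp
#if COREFX
namespace System.Data
{
    // Sketch of the suggestion: the ORM ships its own copy of the interface.
    public interface IDbConnection
    {
        string ConnectionString { get; set; }
        void Open();
        void Close();
    }

    // Adapter wrapping the real provider base class behind that interface.
    public sealed class DbConnectionAdapter : IDbConnection
    {
        private readonly System.Data.Common.DbConnection _inner;

        public DbConnectionAdapter(System.Data.Common.DbConnection inner)
        {
            _inner = inner;
        }

        public string ConnectionString
        {
            get { return _inner.ConnectionString; }
            set { _inner.ConnectionString = value; }
        }

        public void Open() { _inner.Open(); }
        public void Close() { _inner.Close(); }
    }
}
#endif
```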
It's 2015, everything breaks all the time and we're used to it. There will be 20 updates to ASP.NET MVC in the next years that will cause a lot more breaking changes than interface changes in ADO.NET.
I still love .NET and what you're doing with it in general, I'm sure it's a rush to get .NET Core v1 out in time and not everything will be perfect. I just hope the community can help steer this to other directions as the time goes and you're not afraid to break things as we move.
- nvivo
This is the problem; rather than a considered approach we're getting a rushed rewrite in order to meet some arbitrary deadline. I'd sooner it was late, Q4/16 if need be, than have some shiny new broken crap. It's 2015 and everything breaks is a terrible justification.
@thefringeninja that either adds a completely unnecessary and confusing dependency that only works with half the systems (in the shared case), or leads to name collisions requiring `extern alias` to unpick (and a lot of confusion about why the method that takes a `System.Data.IDbConnection` won't accept the different-but-same `System.Data.IDbConnection` that you're offering it). Basically, that would be making things 10 times worse.
Can you give a concrete example @mgravell? I can see how this would break if you used `Type.GetType("System.Data.IDbConnection, System.Data")`, or maybe in PCL scenarios.
If ORM A defines System.Data.IDbConnection, and ORM B defines System.Data.IDbConnection, then there are now two completely different and incompatible interfaces that have the same name/namespace, conflict, and don't actually work with any of the DB providers. It solves nothing, basically. Worse: it leads to unusable APIs where someone expects to pass in a SqlConnection or NpgsqlConnection, and it doesn't work.
In fact you could make an open source code package for it and share.
If it isn't System.Data.Common, then that means DbConnection doesn't implement it, and you might as well not bother.
From what I can see currently on the corefx-progress page for System.Data.Common, the interfaces (IDbCommand, IDbConnection, etc) were removed in favor of the usage of abstract classes.
But in the new API, most of the main methods are not virtual or abstract. On DbCommand alone we can see this:
While these methods can certainly be made virtual or abstract, it would be much more useful to have the real interfaces back, and make any public API depend on these interfaces instead of the abstract classes.
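For context, the non-virtual public methods follow ADO.NET's template-method pattern: the public member forwards to a protected abstract core, which is what providers override. A simplified sketch of the `DbCommand` shape being discussed:

```csharp
using System.Data;
using System.Data.Common;

// Simplified sketch, not the full class.
public abstract class DbCommand
{
    // Public surface: non-virtual, so mocking frameworks
    // cannot intercept this call directly.
    public DbDataReader ExecuteReader()
    {
        return ExecuteDbDataReader(CommandBehavior.Default);
    }

    // Extension point: providers override the protected core method.
    protected abstract DbDataReader ExecuteDbDataReader(CommandBehavior behavior);
}
```

This keeps the public surface fixed for providers, but it is also why a test cannot stub `ExecuteReader` itself.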
This is mostly useful when developing libraries. Today it's very hard to mock a datareader to make it return a specific value for testing purposes. The same for ensuring that ExecuteReaderAsync was called, not ExecuteReader, etc.
I propose the provider factory instead should be made as an interface:
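Such a factory interface might look like the following (a hypothetical sketch; the member list is illustrative, not the proposal's exact code):

```csharp
using System.Data;

// Hypothetical: the provider entry point as an interface rather than
// the abstract DbProviderFactory class.
public interface IDbProviderFactory
{
    IDbConnection CreateConnection();
    IDbCommand CreateCommand();
    IDbDataParameter CreateParameter();
}
```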
And then follow from there to the rest of the provider, to things like `IDbDataReader`, `IDbTransaction`, etc. We know the interfaces became out of sync for some reason in the past and the abstract classes were made the official API, but this doesn't need to be the case anymore in corefx.
Note that this doesn't mean removing System.Data.Common in any way; instead, the Common classes would implement these interfaces, and you wouldn't use System.Data.Common unless you're implementing a provider. Applications would depend only on the interfaces instead.
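The layering described here, sketched with hypothetical simplified members: the `System.Data.Common` base classes implement the interfaces, and application code mentions only the interfaces.

```csharp
// Hypothetical simplified interface.
public interface IDbConnection
{
    string ConnectionString { get; set; }
    void Open();
}

// The provider-facing base class implements it.
public abstract class DbConnection : IDbConnection
{
    public abstract string ConnectionString { get; set; }
    public abstract void Open();
}

// Application code never names the abstract class.
public class OrderRepository
{
    private readonly IDbConnection _connection;

    public OrderRepository(IDbConnection connection)
    {
        _connection = connection;
    }
}
```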
Please consider this to make the API more testable on corefx 1.0.
Related to discussions on dotnet/runtime#14302 and dotnet/runtime#15269.