SpecFlowOSS / SpecFlow

#1 .NET BDD Framework. SpecFlow automates your testing and works with your existing code. Find bugs before they happen. Behavior Driven Development helps developers, testers, and business representatives get a better understanding of their collaboration.
https://www.specflow.org/

[Discussion] Future Roadmap for IDE Integrations #656

Closed SabotageAndi closed 6 years ago

SabotageAndi commented 8 years ago

Because of #651 and a lot of feature requests/bugs against the current Visual Studio integration, I think it is best that we work on a new IDE integration, as the current one is based on the old Gherkin parser.

@gasparnagy started at TechTalk with a new editor which is based on the new Gherkin parser. Current features:

I suggest that the new IDE integration be based on this, as I am allowed to publish it under a BSD license.

As I remember, the following features are included in the current VS integration:

As for the generation of the code-behind file, I am for moving it completely into the build process. We already have the MSBuild generation for this. For .NET Core we also need a solution for generating the code-behind file at build time, because no feature like CustomTool is currently available there. I think code generation is one of the most difficult features; we could save time and effort if we skip it in the IDE.

That way we do not have to load the SpecFlow DLLs that are used in the opened project.

As an additional goal, I would like to design the integration in a way that makes it easier to reuse in other IDEs like Xamarin Studio. AFAIK Microsoft's roadmap is that Xamarin Studio will be the main IDE for Mac OS X.

As a first step, I would create a new repository named SpecFlow.IdeIntegration and push the current code there.

gasparnagy commented 8 years ago

Thx @SabotageAndi! Good news. My feelings/comments:

stajs commented 8 years ago

I'd love to see .NET Core support.

For .Net Core we also need a solution for generating the code-behind file at build time, because there is currently no such feature like CustomTool available.

I hacked around this by hooking in to the prebuild and shelling out to the specflow EXE to generate. It could be a way to get initial support, with the hope that a better way is available in the future.
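A minimal sketch of that kind of pre-build hook (the tool path property and target name here are illustrative, not the actual setup; `generateall` is the SpecFlow command-line tool's batch generation command):

```xml
<!-- Illustrative pre-build hook: shell out to specflow.exe before compilation
     to regenerate the *.feature.cs code-behind files. $(SpecFlowToolPath) is
     a hypothetical property; its value depends on how the package is referenced. -->
<Target Name="RegenerateFeatureCodeBehind" BeforeTargets="BeforeBuild">
  <Exec Command="&quot;$(SpecFlowToolPath)specflow.exe&quot; generateall &quot;$(MSBuildProjectFullPath)&quot; /force" />
</Target>
```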

Good luck!

samholder commented 8 years ago

@SabotageAndi that all sounds good, though as @gasparnagy said, trying to aim for a generalized solution from the start is sometimes difficult. It can be better to have a working version for one IDE, then try a second integration and see which bits can be shared. By the time you have done three integrations, you probably have something that is quite generalized.

Knowing how we can unit test the code used in the VS integration is going to be key for getting people involved, as one of the things I find difficult with the existing VS integration is not knowing whether a change has broken anything.

Also, having documentation about how VS extensions work in general would be really useful, at least for me!

SabotageAndi commented 8 years ago

@gasparnagy I forgot about IntelliSense, oops. Are the stepmaps the Visual Studio way of doing IntelliSense?

I was aware that we cannot write an IDE integration for multiple different IDEs on the first try. But we could take a deeper look at writing it in a reusable way.

On all the other points I am with you two.

We can probably copy some parts straight from the current integration. The step skeleton generation is the first one I am thinking of.

I created the SpecFlow.VisualStudio2 repo and will push the sources there.

For building, we could set up an AppVeyor build, and MyGet supports VS Extension Gallery feeds. That way we can get early versions out to ourselves and to users.

About the code generation: let's create a separate issue for discussing that.

@stajs Have a look at PR https://github.com/techtalk/SpecFlow/pull/649 and branch https://github.com/techtalk/SpecFlow/tree/DotNetCore

gasparnagy commented 8 years ago

@SabotageAndi no, stepmaps is our own solution.

The problem with a generalized solution is that every IDE has its own concept of language support, and since this is a very performance-intensive part (for every keystroke we have to calculate, parse, and create many objects), putting too much abstraction around it is dangerous.

@samholder I really recommend that everyone look at SpecFlow.VisualStudio2 to get a feeling for it. It is not a big codebase. I recommend starting from the syntax coloring (https://github.com/techtalk/SpecFlow.VisualStudio2/tree/master/SpecFlow.VisualStudio.Editor/Classification), which uses the VS-specific parser wrappers at https://github.com/techtalk/SpecFlow.VisualStudio2/tree/master/SpecFlow.VisualStudio.Editor/Parser.

I am also happy to give an intro to anyone in one of our online meetings.

samholder commented 8 years ago

@gasparnagy thanks. I grabbed the code last week and started to have a look. Hopefully I'll get a bit more time this weekend to play.

jmezach commented 8 years ago

@gasparnagy Couldn't the stepmaps problem be solved fairly easily and elegantly by using Roslyn? It should provide us with an in-memory representation of the current code in the solution, so I guess it shouldn't be that hard to determine the available steps from that?

Edit: To prove my point, I made a very crude implementation that utilizes Roslyn to figure out the available step definitions within the current solution. At the moment the entire set is rebuilt every time something changes, which is probably not the most efficient, but that could be improved by keeping track of where the step definitions are located in source and then invalidating those when the source file changes. Before I take this any further, I would love your thoughts on this @gasparnagy. I left the code here
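For illustration, the core of such a Roslyn-based scan could look roughly like the following. This is a hedged sketch with made-up names, not the code linked above; it walks only a single syntax tree rather than the whole solution, and matches attributes purely by name without semantic analysis:

```csharp
// Sketch: find SpecFlow step-definition methods in one C# source file
// by scanning for [Given]/[When]/[Then]/[StepDefinition] attributes.
using System.Collections.Generic;
using System.Linq;
using Microsoft.CodeAnalysis.CSharp;
using Microsoft.CodeAnalysis.CSharp.Syntax;

static class StepDefinitionScanner
{
    static readonly string[] StepAttributes = { "Given", "When", "Then", "StepDefinition" };

    // Returns (attribute name, first attribute argument) pairs, i.e. the
    // step keyword and the step's regex text, for every match in the source.
    public static IEnumerable<(string Keyword, string Pattern)> FindSteps(string sourceText)
    {
        var tree = CSharpSyntaxTree.ParseText(sourceText);
        return tree.GetRoot()
            .DescendantNodes()
            .OfType<AttributeSyntax>()
            .Where(a => StepAttributes.Contains(a.Name.ToString()))
            .Select(a => (
                a.Name.ToString(),
                a.ArgumentList?.Arguments.FirstOrDefault()?.ToString() ?? ""));
    }
}
```

A real implementation would instead enumerate the projects in the Roslyn workspace and use the semantic model to resolve the attribute types, so that aliased or fully qualified attribute names are not missed.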

gasparnagy commented 8 years ago

@jmezach Yes, that would definitely be useful! And your code really looks like a good start. Yes, please keep on with it. We would need to reach a point where the perf can be measured (for a bigger project), so we can decide when to trigger the change and whether we really need to persist the cache (which causes many issues). /cc @samholder @SabotageAndi

jmezach commented 8 years ago

First of all, apologies for the late reply. I've been away on holiday so I had some other things on my mind ;).

Glad you like my attempt at working something out @gasparnagy. I just had a quick look at the performance, and it doesn't actually seem to be that bad right now. I have a solution with three projects, only one of which has a reference to the TechTalk.SpecFlow assembly (which I filter on right now; I think that is okay, since if a project doesn't reference that assembly it can't contain step definitions anyway, can you confirm?).

That project currently has 251 step definitions (at least that is what my code finds at the moment, but it sounds about right), and it takes only 20 to 40 ms to find them with my current implementation.

But we do have to consider that this code runs every time I type something in a file, so it will run very often; it seems a bit wasteful to rediscover the step definitions every time a single character is typed somewhere. Would love your input on this though.

dasMulli commented 8 years ago

better late than never: @stajs

I hacked around this by hooking in to the prebuild and shelling out to the specflow EXE to generate. It could be a way to get initial support, with the hope that a better way is available in the future.

Just like Microsoft moves the .NET build targets to a NuGet package (Microsoft.NET.Sdk) that ships only MSBuild targets, one could build a NuGet package that contains all the targets needed to perform the necessary build steps. This would also remove any dependency on VS-integrated tooling, so SpecFlow-based test projects could at least build and run on any platform. Plus, one could try to move the generated source away from the Gherkin files (e.g. to obj/) so no one accidentally forgets to exclude it in .gitignore files, just like project.lock.json is replaced by obj/project.assets.json and the NuGet-generated props and targets files also land in obj/.
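A sketch of how such a package could route generated output into obj/ (the file name and property name are hypothetical, not an existing SpecFlow package):

```xml
<!-- Hypothetical fragment of a build/SpecFlow.Sdk.targets shipped in a NuGet
     package: place generated code-behind under obj/, mirroring how NuGet puts
     project.assets.json and its generated props/targets files there. -->
<PropertyGroup>
  <SpecFlowGeneratedDir>$(BaseIntermediateOutputPath)specflow\</SpecFlowGeneratedDir>
</PropertyGroup>
```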

dasMulli commented 8 years ago

cont:

An MSBuild target could potentially participate in the project system, and even in incremental compilation, by evaluating a custom item type. A "new-world" csproj could potentially have:

<ItemGroup>
    <Compile Include="**\*.cs" />
    <SpecFlow Include="**\*.feature" />
    <PackageReference Include="SpecFlow.Sdk.XUnit" Version="1.0.0" PrivateAssets="All" />
    <PackageReference Include="Microsoft.NET.Sdk" Version="1.0.0" PrivateAssets="All" />
</ItemGroup>

The sample SpecFlow.Sdk NuGet package could then inject a before-build target that evaluates the SpecFlow items, generates source code, and emits Compile items that the standard build targets then pick up for compilation. It could also include a target that any other language tool could use to inspect the SpecFlow aspect of a project; e.g. msbuild my.csproj /t:ListSpecFlowBindings could be called by a language service that then drives IntelliSense for VSCode.
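A hedged sketch of what such injected targets might look like (all target names and the generator step are hypothetical, not shipped by any existing package):

```xml
<!-- Hypothetical before-build target: transform <SpecFlow> items into
     generated code-behind files under obj/ and hand them to the compiler
     as Compile items. Inputs/Outputs enable incremental builds, so the
     generator only runs when a .feature file has changed. -->
<Target Name="SpecFlowGenerate"
        BeforeTargets="CoreCompile"
        Inputs="@(SpecFlow)"
        Outputs="@(SpecFlow->'$(IntermediateOutputPath)%(Filename).feature.cs')">
  <!-- the generator invocation (an <Exec> or a custom task) would go here -->
  <ItemGroup>
    <Compile Include="@(SpecFlow->'$(IntermediateOutputPath)%(Filename).feature.cs')" />
  </ItemGroup>
</Target>

<!-- Inspection target for editor tooling,
     e.g. msbuild my.csproj /t:ListSpecFlowBindings -->
<Target Name="ListSpecFlowBindings">
  <Message Importance="High" Text="@(SpecFlow)" />
</Target>
```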

stajs commented 8 years ago

The MSBuild option will be a no-brainer when the next Visual Studio tooling drops, which converts project.json back to an MSBuild-based .csproj. It's also quite attractive for VSCode.

SabotageAndi commented 8 years ago

FYI: for generating the code-behind file there is already an MSBuild integration: http://www.specflow.org/documentation/Generate-Tests-from-MsBuild/

SabotageAndi commented 6 years ago

A lot has changed in the last 2 years and I think this is no longer up to date, so I am closing this issue.

lock[bot] commented 5 years ago

This thread has been automatically locked since there has not been any recent activity after it was closed. Please open a new issue for related bugs.