SkyeHoefling opened this issue 4 years ago
I keep going back and forth on what makes the most sense. Having 1 NuGet reference for any project type makes it really easy to mix and match, but it will require quite a bit of convention-based programming. Having 1 NuGet per specific project type is great until you work on a DNN module that combines WebForms and HTML modules into 1.
The most common scenario for me would be combining WebApi modules with MVC or another module type. In that scenario it just requires the DLL, which makes it a moot point, and the specific build NuGets would work just fine.
I think I'd need a better understanding of what differences there are between the different packages. Is it just which files are included in the package by default? With the ability to mix and match, it seems like a single NuGet package would make the most sense, but maybe that main package imports all of the other packages, and you could choose to include only one. I'm not sure if that brings any specific benefits vs. just one NuGet package.
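As a sketch of the "one main package that imports the others" idea, the main package could be nothing but a dependency list. The `DotNetNuke.Build.*` ids below match the ones proposed later in this thread, but the snippet itself and the version numbers are purely hypothetical:

```xml
<!-- Hypothetical meta-package: its only job is to depend on the
     per-project-type packages, so one install covers mix-and-match.
     The ids match this proposal; nothing here is published. -->
<package xmlns="http://schemas.microsoft.com/packaging/2013/05/nuspec.xsd">
  <metadata>
    <id>DotNetNuke.Build</id>
    <version>1.0.0</version>
    <authors>example</authors>
    <description>Meta-package importing all project-type build tools.</description>
    <dependencies>
      <dependency id="DotNetNuke.Build.Mvc" version="1.0.0" />
      <dependency id="DotNetNuke.Build.WebApi" version="1.0.0" />
      <dependency id="DotNetNuke.Build.Html" version="1.0.0" />
      <dependency id="DotNetNuke.Build.WebForms" version="1.0.0" />
    </dependencies>
  </metadata>
</package>
```

Installing only one of the leaf packages would then be the way to opt out of the mix-and-match behavior.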
If we look at the `Module.Package` target from DNN, it loads files for every possible package type:
<Target Name="GetFiles">
  <ItemGroup>
    <TextFiles Include="*.txt" Exclude="license.txt;releasenotes.txt" />
    <SourceFiles Include="*.css;*.htm;*.html;*.xsl;*.png;*.gif" />
    <ConfigFiles Include="*.config" Exclude="web.config;packages.config" />
    <RootConfig Include="*eb.config" />
    <RootViews Include="*.ascx;*.asmx;*.ashx;" />
    <Services Include="*.svc" />
    <Views Include="Views/*.ascx;" />
    <MVCViews Include="Views/*/*.cshtml;Views/*.cshtml" />
    <MVCConfig Include="Views/*.config" />
    <Controls Include="Controls/*.ascx;" />
    <ResourceFiles Include="App_LocalResources/*.resx" />
    <ControlResourceFiles Include="Controls/App_LocalResources/*.resx;" />
    <Images Include="Images/*.*" />
    <Keys Include="keys/*.*" />
    <ClientScripts Include="ClientScripts/*.*" />
    <JsFiles Include="js/*.*" />
    <Scripts Include="Scripts/*.*" />
    <SharedScripts Include="Scripts/Shared/*.*" />
    <Templates Include="Templates/*.htm" />
    <DataFiles Include="Resources/*.xml" />
    <Resources Include="@(ResourceFiles);@(Scripts);@(RootViews);@(Images);@(TextFiles);@(SourceFiles);@(ClientScripts);@(JsFiles);@(ControlResourceFiles);@(Keys);@(Services);@(RootConfig);@(ConfigFiles);@(Templates);@(DataFiles);@(SharedScripts);@(Controls);@(Views);@(MVCViews);@(MVCConfig)" />
  </ItemGroup>
</Target>
I worry that there will be build errors when you start dealing with all possible permutations.
Another point that I have against 1 NuGet is that different projects require different bundling strategies, for example an HTML module vs. an MVC module vs. a PersonaBar module. I can see a toolkit getting confused between HTML and PersonaBar, as they are both just front-end scripts and API calls.
In terms of helping smooth out the process for newer people, it seems like being more explicit, with less magic/convention, would be better.
That said, I'm not sure what the level of your ambition is, and how much is being handled by this project (generating/updating the `.dnn` manifest, processing front-end files, etc.). It seems like if the project is a module extension (vs. a Persona Bar extension), then including all of the project files makes sense (whether that's `html`, `ascx`, or `cshtml`). I don't have a scenario where that would cause a problem, but I know there's a lot I haven't thought through in detail.
My idea is to tightly couple these build tools with special Visual Studio templates: templates that provide the path of least resistance to get up and running.
The goal is to have the build tools package the `.dnn` manifest with everything, but it isn't going to read the manifest to see where files might be. There will have to be some strong opinions and assumptions made to make sure everything works together.
This toolchain could hide the majority of the manifest file from developers and generate it as part of the build step. I am not sure if that should be the goal of this project or not as it goes outside the standard way of doing things.
IMO automatically generating sections of the manifest (version(s), SQL files, assemblies, resource file(s)) is a huge benefit, and having to manually figure it out is a big stumbling block.
That is the kind of insight that is really helpful. What if we took a Xamarin.Forms Android manifest approach? There is a manifest file that is simplified to include the basics, and then the build tool reads the project and determines how to properly create the full manifest file.
That sounds great 👍🏻
We could change an existing manifest file to something like this
<dotnetnuke type="Package" version="5.0">
  <packages>
    <package name="MyCustomModule" type="Module" version="00.00.02">
      <friendlyName>My Custom Module</friendlyName>
      <description>A module built using the new build tools</description>
      <iconFile>Images/extension.png</iconFile>
      <owner>
        <name>Andrew Hoefling</name>
        <organization>Andrew Hoefling</organization>
        <url>http://www.andrewhoefling.com</url>
        <email>andrew@hoefling.me</email>
      </owner>
      <license src="ManifestAssets\License.txt"></license>
      <releaseNotes src="ManifestAssets\ReleaseNotes.txt"></releaseNotes>
      <dependencies>
        <dependency type="CoreVersion">06.00.00</dependency>
      </dependencies>
    </package>
  </packages>
</dotnetnuke>
Then the build tool will insert the necessary `<components>` depending on how the project builds.
I like this format because it would allow a developer to still insert their own `<components>` and the build tool could merge the XML together. For example, if you know you need to do something highly customized, you could still use the build tool to package your manifest file, as it won't overwrite the settings in the project file.
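The merge idea could be sketched roughly like this. This is an illustrative sketch only, not the actual DotNetNuke.Build implementation; the merge rule (generated components fill gaps, hand-written ones always win) is my assumption about the desired behavior:

```python
# Sketch: merge generated <components> into a hand-written DNN manifest,
# never overwriting components the developer already defined.
import xml.etree.ElementTree as ET

def merge_components(manifest_xml: str, generated_components_xml: str) -> str:
    """Insert generated components into each <package> that lacks them."""
    root = ET.fromstring(manifest_xml)
    generated = ET.fromstring(generated_components_xml)
    for package in root.iter("package"):
        existing = package.find("components")
        if existing is None:
            # No hand-written components: take the generated ones wholesale.
            package.append(generated)
        else:
            # Only add generated component types that aren't present yet.
            present = {c.get("type") for c in existing.findall("component")}
            for comp in generated.findall("component"):
                if comp.get("type") not in present:
                    existing.append(comp)
    return ET.tostring(root, encoding="unicode")

manifest = """<dotnetnuke type="Package" version="5.0">
  <packages>
    <package name="MyCustomModule" type="Module" version="00.00.02" />
  </packages>
</dotnetnuke>"""
generated = '<components><component type="Assembly" /></components>'
print(merge_components(manifest, generated))
```

A real implementation would have trickier cases (duplicate component types with different payloads, ordering requirements), which is where the "tricky decisions" mentioned below come in.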
I need to think about this some more, and whether the build tools should be solving this problem or just compiling the zip folder. I do like the idea of having it handle everything for the developer. The tricky part is how the build system knows if the module is an MVC module vs. a PersonaBar module. An easy solution would be adding some type of configuration file, or we could use Roslyn to analyze the code to make that determination.
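A cheaper heuristic than full Roslyn analysis would be scanning the `.csproj` for marker references. This is a sketch only; the marker strings are my illustrative assumptions, not an agreed-upon detection scheme:

```python
# Sketch of a reference-scanning heuristic for guessing the DNN project
# type from .csproj text, as a lighter alternative to Roslyn analysis.
# The marker strings are illustrative assumptions, not a spec.
def guess_project_type(csproj_text: str) -> str:
    """Return Mvc, WebApi, WebForms, or Html based on project markers."""
    if "DotNetNuke.Web.Mvc" in csproj_text:
        return "Mvc"
    if "DotNetNuke.Web.Api" in csproj_text:
        return "WebApi"
    if ".ascx" in csproj_text:
        return "WebForms"
    # No server-side markers found: assume a plain HTML module.
    return "Html"

print(guess_project_type('<Reference Include="DotNetNuke.Web.Mvc" />'))  # Mvc
```

The hybrid-module problem raised later in this thread is exactly where a first-match heuristic like this breaks down, so a configuration-file override would still be needed.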
Yeah, automatically generating the module component could get tricky, but if we can it'd totally be worth it.
The XML merging sounds like a good plan, but I'm sure there'll also be some tricky decisions to make there.
FYI, our internal tooling at Engage also auto-generates the CoreVersion dependency (based on what version the project's compiled against), so that's an option.
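Auto-generating the CoreVersion dependency could look something like the sketch below. This is not Engage's tooling or DotNetNuke.Build code; it assumes the compiled-against version can be read from the `DotNetNuke` reference's `Version` attribute in the `.csproj`:

```python
# Sketch: derive <dependency type="CoreVersion"> from the version of the
# DotNetNuke assembly reference in a .csproj (assumption: that Version
# attribute reflects what the project is compiled against).
import re

def core_version_dependency(csproj_text: str) -> str:
    match = re.search(
        r'Include="DotNetNuke,\s*Version=(\d+)\.(\d+)\.(\d+)', csproj_text
    )
    if match is None:
        raise ValueError("No versioned DotNetNuke reference found")
    # DNN manifests conventionally use zero-padded two-digit segments.
    version = ".".join(f"{int(g):02d}" for g in match.groups())
    return f'<dependency type="CoreVersion">{version}</dependency>'

print(core_version_dependency('<Reference Include="DotNetNuke, Version=9.4.0.0" />'))
# <dependency type="CoreVersion">09.04.00</dependency>
```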
Right now the theory is shrinking this down to 1 NuGet and just having `DotNetNuke.Build` as the NuGet package. The build tools will understand what type of project we have and generate the correct DNN package.
I like the simplicity of having 1 NuGet package.
One issue I've always run into is that I must make modifications to the build scripts to pick up certain types of font files, etc., when using some of the common build scripts out there. I'd love it if it packaged up my front-end related files as they are in the project, without me having to worry about whether they will be ignored.
Thank you @mikesmeltzer for the insight here, this is a big problem with existing scripts.
My first thought is to adopt the .NET Core way of doing things and create a `wwwroot` folder, where everything in that folder gets deployed with the module no matter what. However, I don't think mixing concepts between .NET Core and .NET Framework is a great idea. It appears that a standard across web projects and front-end DNN projects is to generate a `dist` folder. Maybe we can have the build scripts default to pulling everything from a `dist` folder.
If we decide to use a `dist` folder, I see this causing problems with projects that keep it simple with just a css/js folder. To solve that problem we could have a workflow like this:
1. Check for a `dist` folder; if it exists, grab everything in that folder and ignore all other `.js`, `.css`, etc. files. This would still include `.ascx` and `.cshtml` files that are outside of the `dist` folder.
2. If there is no `dist` folder, the build would grab all files that match particular extensions (`.js`, `.css`, all common font files, `.jpg`, `.png`, `.gif`, all common img files).

I think the build can be smart enough to pull all files and respect folder hierarchy. If a module decides to not have a `dist` folder, the folder structure from the root of the project is what is installed.
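That dist-first fallback workflow could be sketched like this. The extension lists are my own illustrative picks, not DotNetNuke.Build's actual rules:

```python
# Sketch of the dist-first fallback: prefer everything under dist/
# (plus views outside it), else match common front-end extensions
# while preserving the project's folder hierarchy.
from pathlib import Path

FRONTEND_EXTENSIONS = {
    ".js", ".css", ".jpg", ".png", ".gif",
    ".woff", ".woff2", ".ttf", ".eot",  # common font files
}
VIEW_EXTENSIONS = {".ascx", ".cshtml"}

def collect_files(project_root: Path) -> list:
    dist = project_root / "dist"
    if dist.is_dir():
        # dist exists: take everything inside it, plus views outside it.
        files = [p for p in dist.rglob("*") if p.is_file()]
        files += [
            p for p in project_root.rglob("*")
            if p.is_file()
            and p.suffix in VIEW_EXTENSIONS
            and dist not in p.parents
        ]
    else:
        # No dist: fall back to matching known extensions.
        files = [
            p for p in project_root.rglob("*")
            if p.is_file() and p.suffix in FRONTEND_EXTENSIONS | VIEW_EXTENSIONS
        ]
    return sorted(files)
```

Because the paths come back relative to the project root, packaging them with their hierarchy intact falls out for free.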
I do like the wwwroot way of doing things, but I completely understand not wanting to mix the two.
I use js/css folders to 'keep it simple' often in many projects. I think your fallback from dist makes sense, I like it.
For years I've been using a similar process to what you have initially highlighted. At last count, I have close to 1,000 extensions under my team's control using the current process. I've personally abandoned efforts to do much with further automation of manifest creation due to a number of things, which I'll try to break down. (Not trying to deflate, just some real-world difficulties; these could most likely be overcome with lots of reflection etc., but the risk/time vs. reward is most likely not there.)
Hybrid Modules
Given that the DNN Platform doesn't have support for all integrations for all project types, we have a need to create hybrid applications. For example, an MVC module that uses a WebForms edit control so that we can get access to the `RichTextEditor` component. In my research, this makes discoverability and auto-creation hard to implement/validate.
Controls Identification
Looking specifically at the `Module` package type, you have the need to create controls to enable DNN to context switch to show the proper control. This is done with the control definitions in the manifest; `Title`, `SupportsPartialRendering`, `SupportsPopUps`, etc. are all things not defined anywhere else, so the user needs the ability to manage/control this.
Component Types
There are multiple component types and an individual project might include multiple of these, each with a different manifest composition. Additional meta-data needs exist for many of these that are similar to the above-noted controls identification issues. Some examples.
I guess my point here is that so much of the extension configuration and manifest has to be done manually, or at least verified, that I don't see it being practical to support dynamic creation at this level.
Other Notes
Just a few other considerations worth noting, as others have mentioned I often end up editing my build file on a particular project-by-project basis for various reasons. A few of the ones I was quickly able to find in my projects.
To minimize the risk, what we have personally gone to doing is adding MSBuild.CommunityTasks via NuGet but leaving `ModulePackage.targets` as a baseline inclusion in our templates. This way we can tweak on an as-needed basis. We have variations/templates for WebForms, MVC, and AuthenticationSystem project types.
I know others have done all kinds of crazy things with Cake, etc, but honestly for us with old & new this is a much smaller problem than some of the others and the power of being able to adjust on the fly is better. Just my $0.02 anyway
Background
DNN Build Tools removes the burden on DNN developers to maintain an independent fork of the community build tools/scripts that just about every DNN Module relies on. These tools automatically package up your module in Debug/Release mode depending on configuration. If you are unaware of what I am talking about, you might have seen something like this in your DNN Module
The `ModulePackage.targets` file provides MSBuild targets that automatically run depending on your configuration. Many DNN modules, including core modules, use these build tasks, and each project maintains its own fork of these scripts, which is fine if you need to do something custom. There are many different build tools out in the wild; there isn't one way to solve this problem, and there shouldn't be, because different projects have different needs. For example, yarn is great for building front-end projects, and don't forget gulp and npm; you can even use webpack with them to simplify how your bundle gets generated. Cake is a great build toolchain for creating a C# make file.

The majority of DNN modules aren't doing anything custom; they are following standard software design techniques and just need a package installer. This is independent of how the front-end is built, including front-end build tools such as gulp, yarn, npm, etc. In my opinion this creates a difficult barrier to entry for DNN module development.
DotNetNuke.Build
Enter my new project, `DotNetNuke.Build`, which is going to be a NuGet package that you install into your module to automate packaging for you. Getting started is as simple as installing the `DotNetNuke.Build` package: there is no need to manage the build steps, as they are just included as a NuGet reference and automatically executed during the MSBuild.
When building a DNN site, a required artifact for everything to work is the DNN module installer, i.e. the package archive zip file. For a long time in DNN, many developers have built these by hand or via custom scripts or the community build tasks. This should be the last thing a developer worries about; they should just build their solution and look in the bin directory for their artifacts. If the project requires any special automation, they can add it on top of the `DotNetNuke.Build` tools.

How does it work
`DotNetNuke.Build` will publish NuGet packages with the following package ids:

- DotNetNuke.Build.Mvc
- DotNetNuke.Build.WebApi
- DotNetNuke.Build.Html
- DotNetNuke.Build.WebForms
Those packages will load the correct packaging build scripts for the specific module type and invoke the steps at build time. All you need to do is add the reference to your csproj file.
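For example, adding the reference might look like this (the package id is from this proposal and the version is a placeholder, since nothing is published yet):

```xml
<!-- Hypothetical reference; the packaging targets would run
     automatically during MSBuild once this is restored. -->
<ItemGroup>
  <PackageReference Include="DotNetNuke.Build.Mvc" Version="1.0.0" />
</ItemGroup>
```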
Request for Feedback
While I research this topic I am interested in community feedback. If you were to adopt this toolchain to simplify your build environment what makes the most sense?