Why not have the reverse? Let r.js read the runtime config and generate modules from bundles. That's what I'm doing right now.
@WarWithinMe - the runtime config on its own might not be enough, because for each bundle, you may also need to declare build options such as which other modules should be excluded.
Having r.js able to emit bundles would be extremely useful. I'm envisaging a config syntax like this:
({
    baseUrl: "scripts",
    ... // other config options
    bundles: {
        // If the given value is an array, treat it as { include: thatArray }
        'dist/home': ['models/home-page', 'text!./templates/home.html'],

        // For cases where you need to exclude items or specify further options,
        // pass an object
        'dist/about': {
            include: ['models/about-page', 'data/repository'],
            exclude: ['jquery'],
            optimize: 'some'
        }
    },
    outBundleConfig: 'dist/bundle-config.js'
})
This is only the app.build.js config, so it only affects r.js's behavior and doesn't affect your app in dev mode. When developing your app, you're not using bundles at all, and the browser would load the modules from their source locations (which is ideal, e.g. for easy debugging). When running r.js, the above config would cause it to emit dist/home.js, dist/about.js, and dist/bundle-config.js in addition to its usual output.
The emitted dist/bundle-config.js file would in this example contain:
requirejs.config({
    bundles: {
        'dist/home': ['models/home-page', 'text!./templates/home.html'],
        'dist/about': ['models/about-page', 'data/repository']
    }
})
...so if you simply add a reference to dist/bundle-config.js, the browser will then start pulling the included modules from the relevant bundle files.
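To make the proposed dev/production switch concrete, here is a small illustrative sketch (the module IDs are taken from the example config above; the feature itself is only a proposal at this point):

// In production the page would load require.js and then the emitted
// dist/bundle-config.js (e.g. via plain script tags) before the app starts.
require(['models/home-page'], function (HomePage) {
    // In dev mode this fetches scripts/models/home-page.js directly; once the
    // emitted bundles config is applied, the exact same call fetches dist/home.js.
});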
I had a go at building a Gulp plugin to do this by wrapping the existing r.js optimizer and invoking it once per bundles value. It works, but what's missing is the ability to automatically exclude modules that are packaged into r.js's primary output, so to avoid duplicating library code like jQuery into every bundle file, you have to remember to exclude it from every bundle. Unfortunately, since there's no API for walking the dependency graph, I didn't find any neat way of automatically excluding the common modules. However, r.js itself does have the dependency-graph info and shouldn't have much trouble doing that.
@WarWithinMe - how are you generating your bundles? Do you have any method for automatically excluding common dependencies (for example jQuery/backbone/whatever, which might be packaged as part of the primary r.js output) from each of the outputted bundles?
Actually I've found a way of getting the list of modules to exclude by hooking into onBuildWrite when the primary r.js output is generated. So it does seem to be possible to build this on top of r.js already :)
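As a rough sketch of that approach when driving the optimizer from Node (the config values and module names below are illustrative, not from an actual build):

var requirejs = require('requirejs');

// Module IDs written into the primary output, collected via onBuildWrite.
var primaryModules = [];

requirejs.optimize({
    baseUrl: 'scripts',
    name: 'main',
    out: 'dist/main.js',
    onBuildWrite: function (moduleName, path, contents) {
        primaryModules.push(moduleName); // remember what the primary build packaged
        return contents;                 // pass the module contents through unchanged
    }
}, function () {
    // Each per-bundle run excludes everything already in dist/main.js,
    // so shared code such as jQuery is not duplicated into the bundle.
    requirejs.optimize({
        baseUrl: 'scripts',
        name: 'models/home-page',
        exclude: primaryModules,
        out: 'dist/home.js'
    }, function () { /* ...next bundle... */ });
});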
@SteveSanderson It would be better if r.js could emit the bundles config, but I guess if you emit a bundles config, you should probably remove some of the paths config (because it would be duplicated if something is defined in both paths and bundles, though requirejs might allow it).
Currently I have some config like this:
requirejs.config({
    paths : {
        "jquery" : "vender/jquery",
        "backbone" : "vender/backbone",
        "popup" : "component/popup",
        "helpers" : "lib/helpers",
        "core" : "lib/core",
        "header" : "module/header/headerController"
    },
    bundles : {
        "vender" : ["jquery", "backbone"],
        "lib" : ["core", "helpers"],
        "header" : []
    },
    bundleExcludes : {
        "header" : ["popup"]
    }
});
// Here, "header" depends on "popup", "jquery" and "backbone"
I wrote a script to transform the require.js config into an r.js config. The script's job is to work out the exclusions for every bundle. In this case, the "header" bundle will include everything except ["popup", "core", "helpers", "jquery", "backbone"].
The concept is that if something is in a previous bundle, it is excluded from the next bundles.
I also modified one of the handlers in build._run to find out what's included in previous bundles.
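For illustration, a minimal sketch of that kind of transform (the config shape mirrors the example above; the function name and the reliance on key ordering are assumptions, not the actual script):

function toBuildModules(runtimeConfig) {
    var seen = [];      // module IDs already packaged by earlier bundles
    var modules = [];
    Object.keys(runtimeConfig.bundles).forEach(function (bundleName) {
        var includes = runtimeConfig.bundles[bundleName];
        var extra = (runtimeConfig.bundleExcludes || {})[bundleName] || [];
        // Each r.js "modules" entry excludes everything claimed by earlier bundles,
        // plus any explicit bundleExcludes, so "header" ends up excluding
        // popup, core, helpers, jquery and backbone.
        modules.push({
            name: bundleName,
            include: includes,
            exclude: seen.concat(extra)
        });
        seen = seen.concat(includes);
    });
    return modules;
}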
I hooked into onModuleBundleComplete to achieve this:
writeBundleConfig : function() {
    var bundleConfig = {};
    if ( !config.__bundles_written ) {
        for (var i = 0; i < config.modules.length; i++) {
            bundleConfig[config.modules[i].name] = config.modules[i].include;
        }
        var fs = nodeRequire("fs"),
            fsPath = config.dir + "/" + config.mainConfigFile,
            contents = "require.config({bundles: " + JSON.stringify(bundleConfig) + "});";
        fs.appendFile(fsPath, contents, function( err ) {
            if (err)
                throw err;
            console.log("+ appended bundles to `" + config.mainConfigFile + "`\r\n");
        });
        config.__bundles_written = true;
    }
},

onModuleBundleComplete: function ( data ) {
    this.writeBundleConfig();
}
This appends the bundle config to my mainConfigFile, i.e.:

require.config({ /* regular config */ });
require.config({ bundles: { ... } });
It's not the prettiest piece of code this way (there's nothing to hook into after all the bundles are done, only a per-bundle hook, which forces me to write a flag into the config to indicate I've already done the work), but it does the job for now. If desired, I can add this functionality to r.js natively and make a pull request.
I hooked into onModuleBundleComplete to achieve this
Nice idea!
Note, however, that you don't need to parse the list of includes for module bundles from the build configuration itself. (In fact, you probably shouldn't: it tends to be incomplete, because it does not list the "includes of includes" that are closed over by the build process.)
Have a look at the data argument to the onModuleBundleComplete method. It holds the normalized name and the full set of normalized includes for the module bundle that is being completed.
https://github.com/jrburke/r.js/blob/2.1.11/build/example.build.js#L548-L564
//A function that is called for each JS module bundle that has been
//completed. This function is called after all module bundles have
//completed, but it is called for each bundle. A module bundle is a
//"modules" entry or if just a single file JS optimization, the
//optimized JS file.
//Introduced in r.js version 2.1.6
onModuleBundleComplete: function (data) {
/*
data.name: the bundle name.
data.path: the bundle path relative to the output directory.
data.included: an array of items included in the build bundle.
If a file path, it is relative to the output directory. Loader
plugin IDs are also included in this array, but depending
on the plugin, may or may not have something inlined in the
module bundle.
*/
}
I do, however, want to include only that shallow list. That's my list of entry points; all other items shouldn't be exposed in my bundles config. You do raise a valid point, though: I don't do any analysis yet on cross-bundle boundaries, so I might run into loading issues where some file in A depends on some file in B, and B isn't loaded yet. In those cases I'll have to manually add that particular entry point to my modules list. Not a big deal currently, as my packaging model is very simple (all bundles are conjunct, except for lib and shared, which will usually be loaded anyway). It's something to fix in the future, though.
+1 It should export the bundles config because, when sharing code between projects where you don't want to include a million copies of require, you have one project outputting a number of modules, and you then provide the consumer of that library a bundles definition that allows them to use it.
I'll be using a modified version of @gsmeets approach as @rjgotten suggested. Thanks!
Thanks for the tips contributed in this discussion thread! I am now successfully using the following build scripts to generate "bundles" configs (i.e. an explicit listing of all module IDs contained within a given bundle). Note how the code differs depending on whether a single file or multiple bundles are generated by the optimizer (my project outputs both types of builds). Obviously, when my app code is optimized into a single file, I use the Almond loader internally, so strictly speaking I do not need the bundles config at runtime (but I still generate it so I can double-check the aggregated result, if needed).
As this technique relies on the config context, onModuleBundleComplete must be located within the top-level RequireJS build configuration file (invoked from the command line), the one that contains the mainConfigFile directive:
onModuleBundleComplete: function(data) {
    // CHOOSE ONE OR THE OTHER:
    var filePath = process.cwd() + "/onModuleBundleComplete_SINGLE.js";
    var filePath = process.cwd() + "/onModuleBundleComplete_MULTIPLE.js";

    var fs = nodeRequire("fs");
    fs.readFile(
        filePath,
        {encoding: 'utf-8'},
        function(err, fileContents) {
            if (!err) {
                var func = eval("(" + fileContents + ")");
                return func(data);
            } else {
                console.log(err);
            }
        }
    );
}
Note how both SINGLE/MULTIPLE scripts log messages into the console / shell, which provides handy "at a glance" information about the module IDs and the corresponding file paths.
onModuleBundleComplete_SINGLE.js
function(data) {
    console.log("========> onModuleBundleComplete");
    console.log(data.name);

    var fs = nodeRequire("fs");

    for (var i = 0; i < config.modules.length; i++) {
        if (config.modules[i].name !== data.name)
            continue;

        var rootPath = process.cwd() + "/build-output/SINGLE/";
        rootPath = rootPath.replace(/\\/g, '/');
        console.log(rootPath);

        var path = config.modules[i].out; //config.modules[i].layer.buildPathMap[config.modules[i].name];
        console.log(path);

        // var shortpath = path.replace(rootPath, './');
        // console.log(shortpath);
        // var pathConfig = {};
        // pathConfig[config.modules[i].name] = shortpath;

        var bundleConfig = {};
        bundleConfig[config.modules[i].name] = [];

        for (var moduleName in config.modules[i].layer.modulesWithNames) {
            bundleConfig[config.modules[i].name].push(moduleName);
            console.log(">> " + moduleName);
        }
        for (var moduleName in config.modules[i].layer.needsDefine) {
            bundleConfig[config.modules[i].name].push(moduleName);
            console.log(">> " + moduleName);
        }

        fs.writeFile(
            path + ".bundles.js",
            "require.config({" +
                //"paths: " + JSON.stringify(pathConfig) + ", " +
                "bundles: " + JSON.stringify(bundleConfig) + "});",
            function(error) {
                if (error) throw error;
            }
        );
    }
}
onModuleBundleComplete_MULTIPLE.js
function(data) {
    console.log("========> onModuleBundleComplete");
    console.log(data.name);

    var fs = nodeRequire("fs");

    for (var i = 0; i < config.modules.length; i++) {
        if (config.modules[i].name !== data.name)
            continue;

        var rootPath = process.cwd() + "/build-output/MULTIPLE/";
        rootPath = rootPath.replace(/\\/g, '/');
        console.log(rootPath);

        var path = config.modules[i].layer.buildPathMap[config.modules[i].name];
        console.log(path);

        // var shortpath = path.replace(rootPath, './');
        // console.log(shortpath);
        // var pathConfig = {};
        // pathConfig[config.modules[i].name] = shortpath;

        data.includedModuleNames = [];
        for (var j = 0; j < data.included.length; j++) {
            var fullPath = rootPath + data.included[j];
            for (var modulePath in config.modules[i].layer.buildFileToModule) {
                if (fullPath === modulePath) {
                    data.includedModuleNames.push(config.modules[i].layer.buildFileToModule[modulePath]);
                    break;
                }
            }
        }

        var bundleConfig = {};
        bundleConfig[config.modules[i].name] = [];

        //for (var moduleName in config.modules[i].layer.modulesWithNames) {
        for (var j = 0; j < data.includedModuleNames.length; j++) {
            var moduleName = data.includedModuleNames[j];
            if (moduleName === config.modules[i].name)
                continue;
            bundleConfig[config.modules[i].name].push(moduleName);
            console.log(">> " + moduleName);
        }

        fs.writeFile(
            path + ".bundles.js",
            "require.config({" +
                //"paths: " + JSON.stringify(pathConfig) + ", " +
                "bundles: " + JSON.stringify(bundleConfig) + "});",
            function(error) {
                if (error) throw error;
            }
        );
    }
}
In both cases, the generated "bundle" config files look like:
require.config({bundles: {"BUNDLE_NAME":["MODULE_1", "MODULE_2", "ETC."]}});
...and in the specific case of multiple bundles, here is how I use them at runtime to let RequireJS load the required modules dynamically, as needed:
<html>
<head>
    <script type="text/javascript" src="RequireJS.js"> </script>
    <script type="text/javascript">
        requirejs.config({
            baseUrl: './build-output/MULTIPLE/'
        });
    </script>
    <script type="text/javascript" src="../build-output/MULTIPLE/BUNDLE_NAME1.js.bundles.js"> </script>
    <script type="text/javascript" src="../build-output/MULTIPLE/BUNDLE_NAME2.js.bundles.js"> </script>
</head>
<body>
</body>
</html>
+1
I had been solving it by parsing the build.txt file. I like the suggested solutions above better.
FYI, I have posted a $150 bounty for this issue.
I am slowly getting to addressing fixes I want to get in for the next release. I made a first pass at this in a write-bundles branch, here is the current snapshot if you want to try it: https://raw.githubusercontent.com/jrburke/r.js/e44431f38271eeca83d4edf7496ca9d643214805/dist/r.js
There is a test, here is the build file for that test: https://github.com/jrburke/r.js/blob/write-bundles/build/tests/lib/bundlesConfig/build.js
Notes about it: it only works with "dir"-style whole-project optimizations. It did not seem to make sense to support single-file optimization runs, since if that file is loaded, all of its define()'d modules will be seen by the loader automatically.
If I get some confirmation from some others that this seems reasonable, then I will look more into finishing it up for merging, including updating the bundles config docs to mention this option for building.
I concur, but just FYI: our implementation (which relies on onModuleBundleComplete) produces "bundle" files even for optimized single JS files (Almond-loaded), so that the list of AMD modules present in any given bundle can easily be checked.
Fantastic news. FYI, I can't use require.config() because of how skipDataMain works. I include a loader and prepend a var require = {} instead. So for this to be useful to me, I would probably need more direct access to the generated bundles object to do whatever I need with it.
@danielweck for almond-style builds: almond itself does not reference bundles config, as all the modules are already in the file. So building a bundles config in that case for loader purposes is not needed. However, if it helps with other processes you have in place, great.
@sholladay I will look at reflecting the full set of IDs in the information passed to the onModuleBundleComplete callback. While the paths are part of the data passed to onModuleBundleComplete, the module IDs are not. Depending on the build config, it may not be possible to backtrace from the paths to the IDs, and it is possible a file could have more than one module in it.
One of the commits in https://github.com/jrburke/r.js/tree/write-bundles branch, https://github.com/jrburke/r.js/commit/e127cc9219245e2bc49eb6cff9fb6d3c9b2f2e85 highlights an issue for getting this fully complete, will need some more time for it, but will not make the 2.1.21 release.
Thanks for the work on this. Your comment about module IDs makes sense.
I landed this for 2.2.0. It uses the "bundlesConfigOutFile" config to specify where the config should be written. So it only works with "dir" styles of whole project optimization.
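For context, a minimal sketch of how that option might sit in a "dir"-style build file (the module names and paths are placeholders; the assumption that bundlesConfigOutFile resolves relative to the dir output directory should be checked against the docs):

({
    appDir: 'www',
    baseUrl: 'js',
    dir: 'www-built',
    mainConfigFile: 'www/js/config.js',
    modules: [
        { name: 'app/main' },
        { name: 'app/section', exclude: ['app/main'] }
    ],
    // Where r.js writes the generated requirejs.config({ bundles: ... }) call
    // (assumed relative to the "dir" output directory).
    bundlesConfigOutFile: 'js/bundles-config.js'
})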
If you wanted to scan a file to get all the named, define()'d module IDs, then the interior parse module has a parse.getAllNamedDefines function that can do this. This can be done via the useLib API:
var requirejs = require('requirejs');

requirejs.tools.useLib(function(require) {
    require(['parse'], function(parse) {
        // contents is the string contents of a file.
        // parsedIds will be an array of module IDs.
        var parsedIds = parse.getAllNamedDefines(contents);
    });
});
I went with just reparsing the final built file looking for all named define calls. While a bit slower than accumulating the IDs while other processing happens, it was difficult to catch all edge cases with the accumulation approach. With the reparse, it is guaranteed to pick up all the named modules.
Hi, I know I'm late to the party, but I'm working on optimizing a few packages in my SPA that encapsulate specific groups of functionality (one is a REST API wrapper, another is a set of UI components, another is a data model).
To get started, I put my different components into a package and used the r.js optimizer to optimize that one package (in this case: cohort builder). My build.js file looks like this:
({
    baseUrl: '../js',
    separateCSS: true,
    mainConfigFile: '../js/main.js',
    optimize: 'none',
    name: 'cohortbuilder',
    include: ['cohortbuilder/CohortDefinition'],
    exclude: ['text', 'css', 'databindings', 'conceptpicker/ConceptPicker', 'conceptsetbuilder/InputTypes/ConceptSet'],
    fileExclusionRegExp: '^.git$',
    out: '../dist/cohortbuilder.min.js'
});
My folder structure looks like this: I have a js dir under the webroot which has all the code. I have a separate folder under root called 'build' (where this build file is; I'll have different build.js files for the different packages I want to bundle up), and finally there's a dist folder that I write the optimized files to.
I've made some good progress: I can get only the assets (.js, .html and .css files) that are directly relevant to the package to bundle up into a single file, and inspecting the non-uglified form of the output I see how all the modules of the package have been given an identifier which starts at the root of the package (cohortbuilder/main, cohortbuilder/SubGroup/ModuleA, cohortbuilder/SubGroup/ModuleB, etc.).
The problem is hooking it back up into the application. Initially, all the modules were resolved using the package configuration, so someone could require(['cohortbuilder/SubGroup/ModuleA']) and it would find the file at js/modules/cohortbuilder/SubGroup/ModuleA.js and load it. But now that everything is bundled up into one file, it wasn't clear what the package config was supposed to say...
But here's what I've done so far to get some of it working: I added a deps entry for 'cohortbuilder.min' so that the optimized file would be loaded at startup. Ideally, I'd like to load it only when one of the modules in the package is requested, but this is what I tried first just to get all of the modules loaded. Since it's loaded up, all the defines register their module IDs, so anyone who does require(['cohortbuilder/SubGroup/ModuleA']) will find the module.
But if instead I don't want to put it into deps, and only want to load the file when some code path requests it, the only thing I could think of doing was adding a bundles option with all the module IDs contained in the optimized file:
bundles: {
    'cohortbuilder.min': ['cohortbuilder/CohortDefinition', 'cohortbuilder/components']
},
Note I stopped after just these two because I was overwhelmed with despair that I'd have to put about 30-50 items in this list, and I didn't want to do this by hand.
That's when I found this thread, that talks about spitting out a bundles config, but I was then double-overwhelmed with despair when I read that it only works on "dir" styles of whole project optimization!
Is there any hope that something in the standard tool-kit could spit out the module IDs from a single-module r.js optimize call? I did read the above comments about adding an onModuleBundleComplete, but I'm such a Node noob and an r.js noob that... is that function declared within the build.js file?
Or is there some other way to go from a package with dozens and dozens of modules in separate files to one optimized file from just the package that I can then easily reference in another host app?
I'm very much trying not to have to restructure my entire package into a single module where main exposes all the sub-modules through a set of nested properties, and have to go back to everywhere I've done

define(['cohortbuilder/SubGroup/ModuleA'], function(moduleA) {} );
And replace it with this:
define(["cohortbuilder'], function(cohortBuilder) {
var moduleA = cohortBuilder.SubGroup.ModuleA;
// use moduleA as before...
} );
Any suggestions would be greatly appreciated!
Since you have more custom build needs, you might try using amodro-trace to do the module bundling. It is a lower level imperative approach to bundling vs the r.js declarative config approach.
So you could call the amodro-trace to get the list of modules that should be bundled together in a file, and since all the module IDs are known in the data structure returned by amodro-trace, you could use that to then write out a bundles config.
There is an interior module in amodro-trace that will allow you to modify a requirejs.config() call in an existing JS file. You can use that to add the new bundles config you can derive from the amodro-trace output. Use the interior module's modifyConfig method:
var amodroTrace = require('amodro-trace');
var allWriteTransforms = require('amodro-trace/write/all');
var transform = require('amodro-trace/lib/transform');
var fs = require('fs');

var writeTransform = allWriteTransforms({});

amodroTrace({
    // amodro config here...
    includeContents: true,
    writeTransform: writeTransform
}, {
    // requirejs loader config here
})
.then(function (traceResult) {
    // derive the bundles info from the traceResult
    var bundlesConfig = {
        myBundle: traceResult.traced.map(function (entry) {
            return entry.id;
        })
    };

    // then write the trace of the modules to a file, pretend it is called bundle.js.
    var bundleContents = traceResult.traced.map(function (entry) {
        return entry.contents;
    }).join('\n');

    fs.writeFileSync('myBundle.js', bundleContents, 'utf8');

    // Now update the config with bundle info
    var configContents = fs.readFileSync('config.js', 'utf8');
    var updatedContents = transform.modifyConfig(configContents, function (config) {
        // Add the bundles config to this config:
        config.bundles = bundlesConfig;
        return config;
    });

    fs.writeFileSync('config.js', updatedContents, 'utf8');
});
Since amodro-trace is lower level, it does require a bit more work to stitch what you want together, but it has the benefit that it can be very custom and, for example, not dependent on dir config.
Thanks, @jrburke, that's really helpful. Could I ask your opinion on whether you detect any code smell in what I'm trying to accomplish by wrapping up packages (with many files) into a single file that can be referenced in a RequireJS / AMD-aware application? The idea is that I have a few packages of functionality that do different things, like statistical modeling, visualizations, and user entry, so being able to manage those separately yet have a simple and effective download path to get them into a target app is the goal. I'm thinking that each package I produce will come with a set of instructions on how to modify an existing requirejs.config(), including how to modify the bundles entry. If I can get your approach to work (as above, with amodro-trace), then that will help me generate some of that documentation (and also get it working together with all these other apps I'm trying to integrate, which are all RequireJS/AMD based).
Thanks again.
-Chris
Separating that functionality into logical source packages, one for modeling, one for visualization, one for user input, makes sense for source code organization. However, I would normally try to do the module concatenation along the app's concerns, vs. bundling by logical source-package unit.
The bundling by app concerns fits well with how other module optimizers like webpack are trying to do split bundle concatenation too: use dynamic require() calls to load the next route or section of UI in the code, which also sets up natural boundary points for code loading. This is how the r.js optimizer was structured to work, with the modules config containing route/ui section module entry points, with perhaps a common layer loaded up front.
This is not always possible though, it could be hard to figure out those layers. Sometimes there are not clear boundaries for those sections, which is where bundles config can help out. But for me it was a last resort option if I could not get the partitioning done based on app segments vs source code package segments.
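As an illustrative sketch of that layout (the module names are placeholders, not from any project in this thread), a whole-project modules config organized along app concerns might look like:

({
    appDir: 'www',
    baseUrl: 'js',
    dir: 'www-built',
    mainConfigFile: 'www/js/config.js',
    modules: [
        // Common layer: shared libraries and utilities, loaded up front on every page.
        { name: 'common', include: ['jquery', 'backbone', 'lib/helpers'] },
        // Route/UI-section layers: each excludes what the common layer already provides.
        { name: 'routes/home', exclude: ['common'] },
        { name: 'routes/reports', exclude: ['common'] }
    ]
})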
You have it right @jrburke, I misspoke slightly: for a few of my packages, it's a combination of data structures, UI components (to manipulate the data structures), and the visualizations that are tied to a specific user role in the system. In some cases, though, there are assets (like the data models) that could span different user roles, so those elements get split out into their own shared package, which creates a requirement that if package A is being used, package B also needs to be included in the app. Unfortunately I can only notify the users who are consuming package A about the dependency via documentation (or they'll just get errors in their app that package B wasn't found when they tried to use something from A that required B).
I know this is sounding like the build/optimization process should take care of finding all the referenced components at the client app at build time (so that when an external app wants to use some functionality from my component library, they grab the source, do their own local optimize, and only the components they use get bundled into their application output file), but I'm trying to save external consumers of this code from having to go through an optimization step (even though it might mean that when they include the library in their app, they are getting more than they potentially need). I see this like the d3 library or a jQuery UI library, where you have the option of downloading the whole library that gives you access to everything (perhaps from a CDN) vs. pulling down the pieces of source that you want and doing a local build (which gives you the smallest footprint possible). I'll probably try to address both of these perspectives, but for now I'm just trying to get my hands around the tool.
Also, one last thing about amodro-trace vs. r.js generating the bundle config: I was looking at your commit, and it seems that regardless of how r.js got kicked off, it parses the output file for the module IDs and spits those out... and I'm wondering if you could tell me whether there's a technical reason why it only works in dir mode. I was thinking of making a local branch where I remove that 'must have a dir' check and let the process execute using the same directory as the out file instead of leveraging the dir option (which also triggers whole-project processing). I'm not going to go down that route, though, if you can tell me that there are other technical reasons why it wouldn't work.
I'm definitely taking your suggestions to heart and I'm fully ready to try your amodro-trace code to solve this, but I'm just a little OCD when it comes to pulling in different tools to solve a problem when it looks like r.js is doing exactly what I want, just not on a package basis.
-Chris
It felt overwhelming to have more options around bundle config, and with the tool doing both single-file and whole-project optimizations, keeping all the configs straight can be difficult. In this case, it seemed odd to single-file optimize but then also mandate that there is a second file, a config file, that is also modified. Where should that modified file live, given that r.js tries very hard not to modify the input files, and a single-file optimization's only output is the out file location for the bundle? It seems like it would require even more r.js config to support that, and it was already feeling unwieldy to me.
Yeah, I can understand the tool complexity getting out of control, and your point is taken about single-file optimization. However, in the case of packages, it's not a single file, and passing the package name as the module ID to optimize means that the only way to get at any module in the package is to leverage a bundles config so that every module contained therein is 'discoverable'.
What would happen if, when an r.js optimize call produces an out file, the set of module IDs that were put into the out file created a new entry in the require config's bundles list?
For example, if I have a package 'someLib' with 5 modules in it (ModA..ModC) and I did an r.js build with name="someLib" (passing the config options to resolve someLib to the appropriate location), it will build the output file someLib.min.js.
The contents of that out file might look something like this:
define('someLib/main',[..deps..], function(..deps...) { } );
define('someLib/ModA',[..deps..], function(..deps...) { } );
define('someLib/ModB',[..deps..], function(..deps...) { } );
define('someLib/ModC',[..deps..], function(..deps...) { } );
I recall seeing a comment you made in either one of the commits or one of the other threads where you parsed the resulting file for module IDs (you mentioned that it was slower, but handled edge cases better). Is there a way to get at the function you built, so that given an optimized file it can generate a list of the module IDs contained within that file? If that existed, then all I'd do is have a post-step of running the function, giving it the out file as input, and getting out the list of module IDs...
Basically the code you have here:
var bundleContents = file.readFile(finalPath);
var excludeMap = {};
excludeMap[module.name] = true;
var parsedIds = parse.getAllNamedDefines(bundleContents, excludeMap);
entryConfig.push.apply(entryConfig, parsedIds);
Basically I want to do the above to the out file. How would I be able to invoke this code as a sort of standalone function call? Again, I apologize that I'm not a Node user, so I'm not completely familiar with the capabilities...
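As a rough sketch (not from the thread itself), that snippet could be run as a small standalone Node script against the already-built file, using the requirejs.tools.useLib / parse API shown earlier; the file names and the empty exclude map below are placeholders:

var fs = require('fs');
var requirejs = require('requirejs');

// Read the already-optimized package file produced by the build above.
var contents = fs.readFileSync('../dist/cohortbuilder.min.js', 'utf8');

requirejs.tools.useLib(function (require) {
    require(['parse'], function (parse) {
        // Every named define() ID found in the optimized file.
        var ids = parse.getAllNamedDefines(contents, {});
        var bundles = { 'cohortbuilder.min': ids };
        fs.writeFileSync(
            '../dist/cohortbuilder.bundles.js',
            'require.config({bundles: ' + JSON.stringify(bundles) + '});',
            'utf8'
        );
    });
});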
Actually, @jrburke, the more I think this through, the more I think that if I define a package, I have to make all the submodules reachable from the top-level main module within the package. It means I have a lot of refactoring to do, but I think in the long run it's the proper way to structure it. Thanks again for all your input on this; I appreciate it.
I was going to suggest the same: normally the "main" module in a package is its primary public interface to the outside world, so if something is consuming the whole package, then it makes sense to use it through the main module.
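For illustration, a minimal sketch of such an aggregating main module, mirroring the refactored consumer code shown a few comments up (the module paths and nested-property shape are illustrative, not taken from the actual cohortbuilder package):

// cohortbuilder/main.js -- hypothetical aggregating entry point
define([
    './SubGroup/ModuleA',
    './SubGroup/ModuleB'
], function (ModuleA, ModuleB) {
    // Expose submodules as nested properties so consumers only depend on the
    // package's main module, e.g. cohortBuilder.SubGroup.ModuleA.
    return {
        SubGroup: {
            ModuleA: ModuleA,
            ModuleB: ModuleB
        }
    };
});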
requirejs 2.1.10 supports a bundles config at runtime, but right now it requires the developer to construct those bundles. Having them generated by r.js would be useful.
A bit tricky, in that it means inserting build config.