francoischalifour opened this issue 6 years ago
To go further, I think it'd be a good idea to go with the array version and to add the `getVersions()` function (like the current fetch function) for each dependency. We'll need that for iOS templates, which need to retrieve versions from CocoaPods and not npm.
The current fetch function would be the default one but could be overridden like so:
```js
const algoliasearch = require('algoliasearch');

// `algoliaConfig` holds the credentials of the Algolia index that stores
// the CocoaPods libraries (defined elsewhere in the file)
const client = algoliasearch(algoliaConfig.appId, algoliaConfig.apiKey);
const index = client.initIndex(algoliaConfig.indexName);

module.exports = {
  dependencies: [
    {
      title: 'InstantSearch iOS',
      name: 'instantsearch-ios',
      async getVersions() {
        // Override: fetch the versions from CocoaPods instead of npm
        const library = await index.getObject('cocoapods');
        return Object.keys(library.versions).reverse();
      },
    },
    {
      title: 'JavaScript Client',
      name: 'algoliasearch',
      getVersions() {
        return ['1.0.0'];
      },
    },
    {
      title: 'JavaScript Helper',
      name: 'algoliasearch-helper',
      // Fetch from npm as default
    },
  ],
};
```
In the CLI, we'll have to call this function if it's specified; otherwise, call the current fetch function.
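Something like this (a sketch only; `fetchLibraryVersions` is a stand-in name for the current fetch function):

```js
// Resolve the selectable versions for one dependency entry.
// Falls back to the default npm-based fetch when no override is provided.
async function resolveVersions(dependency) {
  if (typeof dependency.getVersions === 'function') {
    return dependency.getVersions();
  }

  // `fetchLibraryVersions` stands in for the current fetch function
  return fetchLibraryVersions(dependency.name);
}
```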
Thoughts?
The way we fetch the correct library version for each template right now is defined by the `libraryName` property on each template configuration. Then, we fetch the library versions based on this name. This is fine for now but won't work as soon as we release new major versions (i.e. InstantSearch 3), because we will rely on the libraries `instantsearch.js` and `algoliasearch`.
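In other words, today a template configuration only points at a single library, roughly like this (a simplified sketch, not the exact file):

```js
// .template.js today (simplified)
module.exports = {
  libraryName: 'instantsearch.js',
  // ...other template options
};
```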
Here is my proposal.
Template configuration file
What I propose is to remove this `libraryName` property and to add a `dependencies` property in the `.template.js` file:
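A rough sketch of what this could look like, mirroring the `title`/`name` fields used in the example above (the exact shape is open for discussion):

```js
module.exports = {
  dependencies: [
    { title: 'InstantSearch.js', name: 'instantsearch.js' },
    { title: 'JavaScript Client', name: 'algoliasearch' },
    { title: 'JavaScript Helper', name: 'algoliasearch-helper' },
  ],
};
```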
CLI
Then, in the CLI, we prompt for each of these dependencies (instead of the current "InstantSearch version" prompt). It would look like so:
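A minimal sketch of the prompt loop, assuming Inquirer and a hypothetical `fetchLibraryVersions` helper (like the current fetch function):

```js
const inquirer = require('inquirer');

// `fetchLibraryVersions` is passed in here only to keep the sketch self-contained
async function promptVersions(dependencies, fetchLibraryVersions) {
  const versions = {};

  for (const dependency of dependencies) {
    const { version } = await inquirer.prompt([
      {
        type: 'list',
        name: 'version',
        message: `${dependency.title} version`,
        choices: await fetchLibraryVersions(dependency.name),
      },
    ]);

    versions[dependency.name] = version;
  }

  return versions;
}
```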
We can then add these versions in a `versions` key in the config passed to `createInstantSearchApp()`.
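Roughly like this (assuming `createInstantSearchApp(path, config)` keeps its current signature; other config options are elided and the version numbers are only illustrative):

```js
const createInstantSearchApp = require('create-instantsearch-app');

createInstantSearchApp('my-app', {
  // ...other options (template, etc.)
  // Versions selected during the CLI prompts (illustrative values)
  versions: {
    'instantsearch.js': '2.10.0',
    algoliasearch: '3.30.0',
    'algoliasearch-helper': '2.26.0',
  },
});
```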
Templates
Using Handlebars, we'll be able to access them in the template:
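For example, in a `package.json` template (a sketch; note that Handlebars needs the segment-literal `[...]` syntax for keys containing dots or dashes):

```hbs
{
  "dependencies": {
    "algoliasearch": "{{versions.algoliasearch}}",
    "algoliasearch-helper": "{{versions.[algoliasearch-helper]}}",
    "instantsearch.js": "{{versions.[instantsearch.js]}}"
  }
}
```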
I think this solution is more future-proof; I already needed such a feature for installing templates without dependencies, and for other templates that depend on CSS libraries fetched from npm.
Let me know what you think.