Closed: AllanOricil closed this issue 2 weeks ago.
I like the idea @AllanOricil. It does look like we can configure hooks to run post-retrieval (for the "decompose" command) and pre-deploy (for the "compose" command), which would definitely be a neat way to optimize the use of this plugin with the CLI.
My one concern would be how we would allow a specific user to modify the hook (probably in the sfdx-project.json) so it only fires for the specific metadata types they wish to use this plugin for.
I currently don't see any indication of this level of customization in the docs for plugin hooks.
Ideally, I think the desired implementation for this plugin would be:
Perhaps we should raise a ticket with the Salesforce CLI team to provide this level of customization for hooks?
@mcarvin8 I think you have to create your own config file, like .my_plugin_name.json|js|yml. But I agree that it would be a good feature for oclif plugins. The VS Code API has such a thing: in VS Code you can declare props in your plugin's package.json file and then access those props programmatically by name.
@scolladon I believe he created a config prop parser and file name for his plugin.
@AllanOricil is right, I read the sfdx-project.json to find a special prop and use it in sfdx-git-delta.
The latest release now reads the sfdx-project.json file for package directories, which allows the decomposer to process multiple package directories at once and removes the need to do anything additional for this task regarding package directories.
This should make implementing hooks easier now since the hook only needs to parse the sfdx-project.json for meta suffixes to decompose/recompose. I still need to figure out how to test hooks before building a new release (if possible?). I'm having some issues committing hooks to a branch right now, so I need to look into why.
The 2 hooks should just need to run the default classes, similar to how the tests call them (the getSuffixes import path below is illustrative):
import DecomposerRecompose from '../commands/decomposer/recompose.js';
// hypothetical helper that reads the meta suffixes from sfdx-project.json
import { getSuffixes, SFDX_PROJECT_FILE_NAME } from '../utils/getSuffixes.js';

const metaSuffixes: string | undefined = await getSuffixes(SFDX_PROJECT_FILE_NAME);
if (metaSuffixes) {
  const suffixArray: string[] = metaSuffixes.split(',');
  for (const metaSuffix of suffixArray) {
    await DecomposerRecompose.run(['--metadata-type', metaSuffix]);
  }
}
Thinking the sfdx-project.json will need a new/optional key named decomposerSuffixes:
{
  "packageDirectories": [
    { "path": "force-app", "default": true },
    { "path": "unpackaged" },
    { "path": "utils" }
  ],
  "namespace": "",
  "sfdcLoginUrl": "https://login.salesforce.com",
  "sourceApiVersion": "58.0",
  "decomposerSuffixes": "workflow,labels,profiles"
}
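For illustration, a hook could read that proposed key with a small helper. This is a sketch only: the function name and the decomposerSuffixes key are assumptions, not existing plugin code.

```typescript
import { readFileSync } from 'node:fs';

// Hypothetical helper: parse the proposed "decomposerSuffixes" key
// from an sfdx-project.json file into a list of suffixes.
function getDecomposerSuffixes(projectFile: string): string[] {
  const project = JSON.parse(readFileSync(projectFile, 'utf8'));
  const raw: string = project.decomposerSuffixes ?? '';
  return raw
    .split(',')
    .map((s) => s.trim())
    .filter((s) => s.length > 0);
}
```

Each returned suffix could then be passed to the decompose/recompose command as a --metadata-type argument.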
Deploy and Retrieve hooks may be broken... https://developer.salesforce.com/blogs/2022/02/we-broke-deploy-retrieve-hooks
Based on the current oclif docs, that might explain why retrieve and deploy hooks don't show as lifecycle events: https://oclif.io/docs/hooks/
Also, based on some testing, it looks like we might not be able to add extra keys to the sfdx-project.json file. SFDX seems to flag keys in the JSON it does not recognize, so we would probably need to make a new JSON file for this plugin to not interfere with the Salesforce CLI.
But I'm not sure if this effort would be worth it if Salesforce has intentionally broken deploy/retrieve hooks in newer versions of the CLI due to them pushing SDR. It also looks like they're pushing SDR updates to support custom registry variants soon to handle decomposing/recomposing certain meta types from the package...
@mcarvin8 you probably have to add your schema changes in this repo. There is a schema for sfdx-project.json: https://github.com/forcedotcom/schemas
The post retrieve hook for decompose has been op-tested and will be shipped with v3.3.5.
I have elected to use environment variables since @salesforce/kit provides some methods to parse env variables. It seemed easier than updating the sfdx-project.json schema.
I still need to work out the predeploy hook for the recompose command. Based on https://developer.salesforce.com/blogs/2022/02/we-broke-deploy-retrieve-hooks, it seems like the previous method of changing files before a deployment is impossible with SDR. I need to figure out an alternative for changing files before deployment and then test it out with a new beta build.
@mcarvin8 why do you think it is impossible? I read the blog post and understood it is still possible. Can you expand on that?
None of the workarounds is good anyway. The idea was to let the user use the standard deploy command, not a custom one, to simplify things as much as possible. So I think the best option is to wait for the predeploy hook to work again. I would ask @shane when they will have it fixed.
What about opening a PR in the SDR package to enable custom "pre deploy" hooks? The idea is to enable SDR to have any number of custom pre/post deploy hooks, instead of just a single one
[pre deploy stage hooks] => deploy => [post deploy stage hooks]
Inside a stage hook there can exist any number of hooks, as needed, from any number of plugins. When registering a hook to a stage, the plugin dev must specify an index to instruct the CLI about the order in which each hook must run. Before a hook runs, the list of hooks in a stage is resolved to determine the order hooks will run for that stage. The higher the index, the earlier the hook runs, because it has "high" precedence (like z-index in CSS). Plugin developers can specify the index of their plugin hook for a particular stage to ease setup, but a user must be able to override it in case they have installed 2 plugins that both contain a stage hook with the same index. In this scenario, without letting the user configure the order, there wouldn't be a way for the CLI to resolve the order of the hooks for a stage.
in sfdx-project.json:
...
"hooks": {
  "predeploy": [
    { "name": "bar", "plugin": "plugin1-id", "index": 3 },
    { "name": "foo", "plugin": "plugin2-id", "index": 100 }
  ]
}
...
Resolved order:
[foo, ..., bar, ...] => deploy => []
It is a configurable pipeline of hooks.
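To make the resolution concrete, here's a minimal sketch of sorting a stage's hooks by descending index. The types and function are mine, not an existing oclif API.

```typescript
// Sketch of resolving a stage's hook run order: higher index runs earlier,
// like z-index in CSS. StageHook and resolveStageOrder are illustrative names.
type StageHook = { name: string; plugin: string; index: number };

function resolveStageOrder(hooks: StageHook[]): string[] {
  // Copy before sorting so the registered hook list is not mutated.
  return [...hooks].sort((a, b) => b.index - a.index).map((h) => h.name);
}
```

With the config above, foo (index 100) runs before bar (index 3).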
I think this would have to be added in oclif core, where it specifies how hooks can be created.
@AllanOricil - Based on how I was reading @mshanemc's article, the pre-deploy hooks originally worked like:
Convert to source format in temp dir --> run pre-deploy hook --> deploy
Since SDR doesn't create a temp dir anymore, there's nowhere to update the files before they are deployed if the deploy command does the source conversion before running a predeploy hook.
Unless I misread the article on where the source format conversion takes place pre-deployment and where the predeploy hooks are run.
I can whip up a predeploy hook beta soon since the code would be very similar to the post retrieve hook to just see what happens in a sandbox deploy test.
I was just playing around with making a custom deploy command and I agree that it's not a good idea, to avoid further deviations from the CLI standard. Plus, I would have to worry about full deploys, deploy validate, deploy quick, etc.
@AllanOricil - When I clear out the original custom labels file in my branch, set the environment variable SFDX_DECOMPOSER_METADATA_TYPES to labels, and attempt to run a deployment of custom labels after adding the scopedPreDeploy.ts hook below, I get an Error (1): No source-backed components present in the package. error. So I think that confirms you're unable to change the files via a pre deploy hook before the deploy runs.
sf project deploy start --metadata CustomLabels
import { Command, Hook, Config } from '@oclif/core';
import { ScopedPreDeploy } from '@salesforce/source-deploy-retrieve';
import { env } from '@salesforce/kit';
import DecomposerRecompose from '../commands/decomposer/recompose.js';

type HookFunction = (this: Hook.Context, options: HookOptions) => Promise<void>;

type HookOptions = {
  Command: Command;
  argv: string[];
  commandId: string;
  result?: ScopedPreDeploy;
  config: Config;
};

export const scopedPreDeploy: HookFunction = async function () {
  const postpurge = env.getBoolean('SFDX_DECOMPOSER_POSTPURGE', false);
  const metadataTypes: string = env.getString('SFDX_DECOMPOSER_METADATA_TYPES', '.');
  const format: string = env.getString('SFDX_DECOMPOSER_METADATA_FORMAT', 'xml');

  // '.' is the sentinel default: no metadata types configured, so skip the hook.
  if (metadataTypes.trim() === '.') {
    return;
  }

  const metadataTypesArray: string[] = metadataTypes.split(',');
  const commandArgs: string[] = [];
  for (const metadataType of metadataTypesArray) {
    const sanitizedMetadataType = metadataType.replace(/,/g, '');
    commandArgs.push('--metadata-type');
    commandArgs.push(sanitizedMetadataType);
  }
  commandArgs.push('--format');
  commandArgs.push(format);
  if (postpurge) {
    commandArgs.push('--postpurge');
  }
  await DecomposerRecompose.run(commandArgs);
};
I agree with the idea of allowing multiple pre-deploy hooks based on a priority list in the package.json.
But I think we're still unable to proceed with a pre-deploy hook for the recompose command since it cannot change the source files before starting the deploy due to how they made SDR.
Ok 👍 if there is no context in the hook about the files that are going to be deployed, then transformations can't be done. I hope they fix it, or at least create another built-in hook to enable transformations.
I don't think it is a good idea to use env variables to store metadata types the way you did because there is a limit on how many characters we can store in env variables. Better to store those types of configuration values using your original idea of having your own config file.
Perhaps I can just require a new JSON file called sfdx-decomposer.json and extract the values from there if the user creates the file in their root folder, similar to the sfdx-project.json. That's probably the quickest way to switch to a file without worrying about what Salesforce checks for in their JSON.
I would remove dx from the name because sf did for their CLI, and use a common convention for config files to make the name easier to remember:
.sfdecomposerrc
.sfdecomposer.config.js
.sfdecomposer.config.json
Similar to what other tools do, like Prettier. Maybe there is an npm package which already handles any type of config file and you just need to give it a name, like sfdecomposer. I just did not google it.
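As a rough sketch of that rc-file lookup (filenames from the list above; the helper itself is hypothetical, and the .js variant is omitted since it would need a dynamic import rather than JSON parsing):

```typescript
import { existsSync, readFileSync } from 'node:fs';
import { join } from 'node:path';

// Try the JSON candidate config filenames in order and parse the first match.
const CONFIG_CANDIDATES = ['.sfdecomposerrc', '.sfdecomposer.config.json'];

function findConfig(dir: string): Record<string, unknown> | undefined {
  for (const name of CONFIG_CANDIDATES) {
    const file = join(dir, name);
    if (existsSync(file)) {
      return JSON.parse(readFileSync(file, 'utf8')) as Record<string, unknown>;
    }
  }
  return undefined;
}
```

Returning undefined when no file is found lets the hook skip silently, matching how the env-variable sentinel behaved.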
I hope one day the sf CLI changes its config file to follow any of these common patterns: .sfrc, .sf.config.json, .sf.config.js, like most other tools do.
Thank you @AllanOricil for your continued support in making this plugin better.
I have switched the post retrieve hook from environment variables to a JSON in the repo root and renamed it to .sfdecomposer.config.json. This is working right now:
{
  "metadataSuffixes": "labels,workflow,profile",
  "prePurge": true,
  "postPurge": true,
  "decomposedFormat": "xml"
}
Probably would make sense to rename my plugin to sf-decomposer, but I'm unclear on how renaming NPM packages works (whether that will create a brand new package or if it's smart enough to pick up the name). I know we talked about renaming this plugin in the past. A quick Google search found me this:
I saw some deprecated packages and they all did what you shared in the image. They publish a deprecated version of the current package, with an updated README as well, to send people to the new package. As an additional step, I think you could also bump its major version.
Thanks. This has been renamed to sf-decomposer and released @ 4.0.0. The sfdx-decomposer has been marked as deprecated on NPM.
@mcarvin8 check if the command that lists plugins available to install in the sf CLI is still showing your deprecated plugin. I'm not sure if they have considered filtering out deprecated packages.
@AllanOricil - seems like it was still showing the plugin installed. You do get a deprecation warning now when trying to install sfdx-decomposer, but I think users will need to manually uninstall sfdx-decomposer if they have it locally:
sf plugins uninstall sfdx-decomposer
I'll make an issue here on this repo telling people to uninstall sfdx-decomposer manually if they have it before they can use sf-decomposer, and pin it.
@AllanOricil - I figured this out. Instead of using a predeploy hook, I created a prerun hook for the recompose command which will only run when the sf command is project deploy start/project deploy validate. The prerun hook will use the same JSON file as the postretrieve hook.
This works as expected when testing retrievals and deployments locally. When retrieving, the prerun hook is skipped and the postretrieve hook decomposes the metadata. When deploying, the prerun hook recomposes the files before deploying, no hooks run after deploying. Shipped with 4.0.1.
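The command gating described above can be reduced to a tiny predicate. The names here are mine, not the plugin's actual source; it only illustrates matching the hook's command id against the deploy commands.

```typescript
// Only recompose before deploy-style commands. oclif command ids are
// colon-separated while users type them space-separated, so normalize both.
const DEPLOY_COMMANDS = new Set(['project deploy start', 'project deploy validate']);

function shouldRecompose(commandId: string): boolean {
  return DEPLOY_COMMANDS.has(commandId.replace(/:/g, ' '));
}
```

A prerun hook would call this with the id it receives and return early for retrievals, which is why the postretrieve hook alone handles decomposing.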
Let me know if you have any concerns or feedback, but I'll close this issue as complete now.
Amazing work @mcarvin8 👏
Command would run automatically if sfdx-project.json has a prop that tells the plugin hook to run.
Hook is called; if prop = true, Command runs when extracting (decompose) or deploying (compose) metadata.
Note: if sfdx does not provide such hooks, I can open a PR on the repo.