brookesy2 opened 11 years ago
That's a great idea! I'd love to make a high-performance version of this that grabbed all the metadata in parallel. Maybe just chunking each class of metadata into a separate job and running them all at once?
Does the large-org scenario consist of one particular type of metadata that is overwhelmingly large, like 1M reports that take forever to download? Or is the "largeness" reasonably distributed among the metadata?
And it's made to be kind of "batchy" rather than atomic, so it is a challenge, but definitely doable. The trick is just designing the batches.
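To make "designing the batches" concrete, here's a minimal sketch (not the project's code) of splitting a list of metadata type names into fixed-size batches that separate retrieve jobs could then run in parallel. The type names and the `batchMetadataTypes()` helper are illustrative assumptions:

```php
<?php
// Sketch only: partition metadata types into batches for parallel jobs.
// batchMetadataTypes() and the type names below are hypothetical.
function batchMetadataTypes(array $types, $batchSize) {
    $batches = array();
    // array_chunk preserves order, so types that need to travel together
    // can be listed adjacently and land in the same batch.
    foreach (array_chunk($types, $batchSize) as $chunk) {
        $batches[] = $chunk;
    }
    return $batches;
}

$types = array('CustomObject', 'PermissionSet', 'ApexClass',
               'ApexTrigger', 'Report', 'Dashboard');
$batches = batchMetadataTypes($types, 2);
// 6 types in batches of 2 -> 3 batches
```

Each batch could then become one retrieve job, with the tricky part (as noted above) being which types must share a batch.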
Would be great to do that. The only bummer is that things like Permission Sets will always show up empty unless they are pulled with the objects/classes they reference. Obviously this can get unwieldy very quickly if you have a lot of permission sets, and it can push you over the 400 MB limit. But most other things can be pulled alone without much impact on them.
The large-org scenario varies; usually it's just 200 MB+ of Custom Objects/Standard Objects with a lot of fields. But you are already pulling those out in bits, so it's not too much of an issue.
Sometimes I just wish I knew how to program :) One thing that did help was adding a base dir to settings.php so getOrgData.php could read it in. Makes it easier to move around!
Also thanks a lot for this app, it is great!
Yeah, the oddities around the salesforce dependencies (sometimes they almost seem arbitrary, but there must be underlying reasons) would make the "chunking" tricky. On second thought, this may be difficult. I expect salesforce will come up with solutions for this in the future. May be years, though.
Very true. I guess the easiest chunk would be to do Objects + Permission sets + apex (could tip over 400mb though) and then the rest could be done as is. Would at least speed things up a tiny bit! But yes, definitely years :(
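The grouping suggested above could be written down as a simple config, a sketch along these lines (all names are illustrative, not the app's actual config):

```php
<?php
// Sketch of the suggested chunking. Chunk 1 holds the types that must be
// retrieved together (permission sets come back empty without the things
// they reference); chunk 2 holds everything that is safe to pull alone.
$chunks = array(
    'withDependencies' => array('CustomObject', 'PermissionSet',
                                'ApexClass', 'ApexTrigger'),
    'standalone'       => array('Report', 'Dashboard',
                                'Layout', 'EmailTemplate'),
);
```

The first chunk is the one that risks tipping over the 400 MB limit, so it might still need watching.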
Hello, I'm a newbie to Linux. I'm using an OpenSUSE 13.1 Linux virtual machine with XAMPP 1.8.3-3 on Windows 7.
When I launch it on localhost I get these warnings:
Warning: Invalid argument supplied for foreach() in /opt/lampp/htdocs/www/salesforceMetadataBackup/getOrgData.php on line 667
Warning: Invalid argument supplied for foreach() in /opt/lampp/htdocs/www/salesforceMetadataBackup/getOrgData.php on line 667
here is the function
function stripBlanks($arrInput) {
    $arrStripped = array();
    foreach ($arrInput as $stripFile) {
        // skip blank entries, keep everything else trimmed
        if (trim($stripFile) != "") {
            array_push($arrStripped, trim($stripFile));
        }
    }
    return $arrStripped;
}
a little help?
thanks a lot
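That warning means foreach() was handed something that isn't an array; most likely an upstream call returned false or null instead of a file list, so the problem is in the value reaching stripBlanks(), not the loop itself. A defensive sketch (not the project's code) that would silence the warning looks like this:

```php
<?php
// Sketch: guard against non-array input before the foreach. If the
// upstream file listing failed and returned false/null, we just return
// an empty result instead of triggering the "Invalid argument" warning.
function stripBlanks($arrInput) {
    $arrStripped = array();
    if (!is_array($arrInput)) {
        return $arrStripped;   // nothing to iterate; skip foreach entirely
    }
    foreach ($arrInput as $stripFile) {
        if (trim($stripFile) != "") {
            array_push($arrStripped, trim($stripFile));
        }
    }
    return $arrStripped;
}
```

Worth checking why the caller on line 667 ends up with a non-array in the first place (for example, a failed directory read or an empty metadata response).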
Hi @danieljpeter, are there any videos/steps/instructions for setting up a repo with a daily sync process?
Not really an issue so much as a question: how difficult do you think it would be to make this a more threaded process? Using it on larger orgs can take a LONG time. Would be interesting to see which bits could be threaded together.
Too annoying to do? :)
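Since PHP doesn't have easy threads, one low-tech way to fan the work out would be to launch several worker processes at once and wait for them all. A rough sketch, assuming a hypothetical per-chunk command line (getOrgData.php would first need to learn such a flag):

```php
<?php
// Sketch: run several commands concurrently and collect their stdout.
// In practice each command would be something like
// "php getOrgData.php CustomObject" -- that flag is hypothetical.
function runInParallel(array $commands) {
    $procs = array();
    $outputs = array();
    foreach ($commands as $key => $cmd) {
        $spec = array(1 => array('pipe', 'w'));   // capture stdout only
        $procs[$key] = proc_open($cmd, $spec, $pipes);
        $outputs[$key] = $pipes[1];
    }
    // All workers are running now; reading each pipe waits for that
    // worker to finish while the others keep going.
    $results = array();
    foreach ($procs as $key => $proc) {
        $results[$key] = stream_get_contents($outputs[$key]);
        fclose($outputs[$key]);
        proc_close($proc);   // waits for the worker to exit
    }
    return $results;
}

// Toy stand-ins for real per-chunk retrieve jobs:
$results = runInParallel(array(
    'a' => 'php -r "echo 1+1;"',
    'b' => 'php -r "echo 2+2;"',
));
```

The Salesforce-side dependency oddities mentioned above would still dictate what goes in each worker's chunk.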