As we start to track more and more metadata, our pipeline is going to slow down.
In theory, we could generate a package.xml from only the inbound changes and then use that to sync master from production.
We could have a nightly process (e.g. 4am) that does a full sync. Or, if we wanted to get fancy, the pipeline could check when the last full sync ran and kick off a full sync if it was more than X days ago.
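A rough sketch of that "check the last sync age" branch could look like the snippet below. The `.last-full-sync` marker file and the 7-day threshold are just assumptions standing in for X; the actual full-sync step still needs to be filled in:

```bash
#!/usr/bin/env bash
# Hypothetical check: only run a full sync if the last one is older than the threshold.
# Assumes the pipeline commits an empty .last-full-sync marker file after each full sync.
SYNC_MAX_AGE_DAYS=7  # placeholder for "X days"

# Timestamp of the last commit that touched the marker; default to 0 if it never ran.
last_sync=$(git log -1 --format=%ct -- .last-full-sync)
last_sync=${last_sync:-0}
age_days=$(( ( $(date +%s) - last_sync ) / 86400 ))

if [ "$age_days" -gt "$SYNC_MAX_AGE_DAYS" ]; then
  echo "Last full sync was $age_days days ago; running a full sync"
  # ... full org retrieve + commit to master would go here ...
  touch .last-full-sync
  git add .last-full-sync
fi
```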
Either way, we will need to modify the build package script to look something like this:
```bash
# Build the deployment package from the PR's diff against master
git checkout "$pr_branch"
git merge master
sfdx git:package -d dist/pgk

# Retrieve the same components back from production to detect drift
git checkout master
sfdx force:source:retrieve -x dist/pgk/package.xml
git add .

# I guess if there are ANY changes then we could commit to master and abort?
# Or would it make more sense to commit / merge master into $pr_branch again
# and see if there are conflicts?
# I guess there could be changes that will merge?
```
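For the first option in those comments, the drift check might look something like this sketch. It assumes we treat any dirty working tree on master after the retrieve as production drift, commit it, and fail the build so the PR re-merges master; whether we abort or try to auto-merge instead is still an open question:

```bash
# Sketch only: after `git add .`, a non-empty porcelain status means production drifted
# from what master has for these components.
if [ -n "$(git status --porcelain)" ]; then
  git commit -m "Sync master with production (drift detected during PR build)"
  git push origin master
  echo "Production drift committed to master; aborting so the PR can re-merge."
  exit 1
fi
```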