Closed: devjgm closed this issue 4 years ago.
Here's a more detailed list of the steps that we'll likely need to do:
Here are the work items to do before the actual PR where the code is moved. This work is helpful because it minimizes the changes needed in the PR that actually moves the code.
**Add a `google/cloud/$project/README.md` file**
In the `-cpp` monorepo, each product has its own `README.md` file. Add this now, before even moving the code over. Example: https://github.com/googleapis/google-cloud-cpp-bigquery/pull/54
**Update the `googleapis` and `cpp-cmakefiles` deps**
Make sure these deps are using the same version as what's used in `-cpp`. Example: https://github.com/googleapis/google-cloud-cpp-bigquery/pull/53
**Update `$project`-specific variables in CMake files**
When the code moves into `-cpp`, the variables used in `-cpp`'s CMake files may be different. For example, the top-level `CMakeLists.txt` file might contain `BIGQUERY_CLIENT_VERSION_MAJOR` rather than `GOOGLE_CLOUD_CPP_VERSION_MAJOR`. Update all these settings so that when the code is copied into `-cpp`, it works without modification. Don't forget to also update the `version_info.h.in` file to use the new CMake macro names.
Example: https://github.com/googleapis/google-cloud-cpp-bigquery/pull/52
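The rename can be largely mechanical. Here is a minimal sketch, assuming the project-specific prefix is `BIGQUERY_CLIENT_VERSION_` as in the example above; the fixture file stands in for the real `CMakeLists.txt` and `version_info.h.in`:

```shell
# Sketch: mechanically rewrite project-specific CMake version macros to the
# shared GOOGLE_CLOUD_CPP_* names. The file below is a made-up fixture.
workdir="$(mktemp -d)"
cat > "${workdir}/CMakeLists.txt" <<'EOF'
set(BIGQUERY_CLIENT_VERSION_MAJOR 0)
set(BIGQUERY_CLIENT_VERSION_MINOR 1)
EOF
# GNU sed shown; on macOS use `sed -i ''`.
sed -i 's/BIGQUERY_CLIENT_VERSION_/GOOGLE_CLOUD_CPP_VERSION_/g' \
  "${workdir}/CMakeLists.txt"
cat "${workdir}/CMakeLists.txt"
```

In the real repos, run the `sed` over every file that `grep -rl BIGQUERY_CLIENT_VERSION_` finds, then build to confirm nothing still references the old names.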
**Generated `.bzl` files**
We'd like the generated `.bzl` files to have the same copyright year as they had in the original repo. To make sure this happens, change all the relevant `CMakeLists.txt` files to specify the `YEAR "20XX"` param to the relevant CMake functions.
Example: https://github.com/googleapis/google-cloud-cpp/pull/3819
**`include(GoogleCloudCppCommon)`**
In `-cpp`, most common CMake settings are configured in one place. So before the code is moved into `-cpp`, update the `google/cloud/$project/CMakeLists.txt` file to source this one file; it probably doesn't need to source any others. Again, this is attempting to make the code and build work unchanged when moved into `-cpp`.
Example: https://github.com/googleapis/google-cloud-cpp-bigquery/pull/48
**Add to `-cpp` the `googleapis.BUILD` rules needed by the new repo**
The `bazel/googleapis.BUILD` file contains the build rules for the protos we use. Make sure the build rules that will be needed by the incoming repo already exist in `-cpp`.
Example: #3775
**Move the code into `-cpp` (a single PR)**
Copy the files into `google/cloud/$project`. Think of this as simply copying the files, but ideally we'll copy the files over and preserve their git history. See https://github.com/googleapis/google-cloud-cpp/pull/3728#issuecomment-611804509 for details about how this was done for the `release/` directory. Once the files are in `-cpp/google/cloud/$project`, we'll need to make further changes, otherwise the `-cpp` build will break.
GOAL: Try to make preparations in the original repo so that when the code moves into `-cpp`, you can make zero changes to the newly added files in `google/cloud/$project`.
**`CMakeLists.txt`**
Add an option/flag to make building the new dir optional.
**`ci/...`**
Grep in `ci/...` for places that mention `bigtable` or `storage` to see where you might need to add an additional configuration for the new `$project` being added. In particular, `ci/kokoro/docker/build-in-docker-cmake.sh` has a list of expected directories that get installed, which will likely need to be updated.
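One way to enumerate the candidate files is a recursive grep. The sketch below runs it against a tiny made-up `ci/` tree so it is self-contained; in the real repo you would run the `grep` line from the repository root:

```shell
# Sketch: list files under ci/ that mention existing products; each hit is a
# candidate that may also need an entry for the new $project. The ci/ tree
# created here is a fixture standing in for the real repository.
workdir="$(mktemp -d)"
mkdir -p "${workdir}/ci/kokoro/docker"
printf 'expected="bigtable storage"\n' \
  > "${workdir}/ci/kokoro/docker/build-in-docker-cmake.sh"
printf 'echo unrelated\n' > "${workdir}/ci/other.sh"
# The actual search: files mentioning either product name.
grep -rl -e bigtable -e storage "${workdir}/ci"
```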
This file should contain a link/reference to the newly added code. This file is probably autogenerated, so be sure to edit the right file(s).
- `build.sh asan`, which will use bazel to build
- `build.sh clang-tidy`, which will use cmake
- `build.sh integration`

In both cases, try creating a build break in the newly imported code to make sure the build fails. Also try creating a runtime test failure, to make sure the test fails.
Also, create a DRAFT PR and see what the presubmit CI builds say.
NOTE: All these steps should go in a single PR.
Note: here's the script I used to move the code preserving history. It should be easy to modify for other cases:
```shell
#!/bin/bash
#
# Creates a `NEW_REPO` that contains all code in `google/cloud` from the old
# repo and its git history. Uses the "filter-repo" command from
# https://github.com/newren/git-filter-repo. This could also work with the
# built-in "filter-branch", but the docs for that command say "don't use it;
# use filter-repo instead".
set -eu
readonly OLD_REPO="git@github.com:googleapis/google-cloud-cpp-bigquery.git"
readonly NEW_REPO="git@github.com:devjgm/google-cloud-cpp.git"
readonly TMP_DIR="$(mktemp -d "/tmp/move-repo.XXXXXXXX")"
cd "${TMP_DIR}"
echo "$(date -u): ## In $(pwd)"
git clone "${OLD_REPO}" old
git clone --shallow-since="2020-04-01" "${NEW_REPO}" new
readonly PYTHON_CALLBACK='
message = re.sub(b"\(#(\d+)\)", b"(googleapis/google-cloud-cpp-bigquery#\\1)", message)
return re.sub(b"\s#(\d+)", b" googleapis/google-cloud-cpp-bigquery#\\1", message)
'
cd old
git remote remove origin
git filter-repo --path google/cloud --force \
  --message-callback "${PYTHON_CALLBACK}"
cd ../new
git checkout -b move-repo
git remote add old ../old
git fetch old
git merge --no-edit old/master --allow-unrelated-histories
echo "SUCCESS! Check it out"
echo "cd $(pwd)"
```

... to `google/cloud/bigquery`. We should investigate how to preserve history as well when doing this copy.
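The key step in the script is grafting the filtered history onto the monorepo with `git merge --allow-unrelated-histories`. Here is a self-contained sketch of just that step, using two throwaway repos (all names and paths here are made up):

```shell
# Sketch: demonstrate `git merge --allow-unrelated-histories`, the step the
# script above uses to join the old repo's history into the monorepo branch.
workdir="$(mktemp -d)"
cd "${workdir}"
export GIT_AUTHOR_NAME=demo GIT_AUTHOR_EMAIL=demo@example.com
export GIT_COMMITTER_NAME=demo GIT_COMMITTER_EMAIL=demo@example.com
git init -q old
git -C old commit -q --allow-empty -m "root commit in old repo"
git -C old branch -M master
git init -q new
git -C new commit -q --allow-empty -m "root commit in new repo"
git -C new remote add old ../old
git -C new fetch -q old
git -C new merge --no-edit --allow-unrelated-histories old/master
git -C new rev-list --count HEAD   # 3: both root commits plus the merge
```

Without `--allow-unrelated-histories` the merge would be refused, because the two repos share no common ancestor commit.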