Many academic organizations have a presence on GitHub, but may also run a self-hosted GitLab CE instance.
To automatically and fully back up the former onto the latter, they may want some kind of script that triggers their GitLab to import all repos from their GitHub org. This is possible through the GUI with the github_importer (docs), but doing that manually and continuously is not feasible for many repos.
Using GitLab's Create project and Import file APIs, I pieced together the following Bash snippet, which creates GitLab projects from the given GitHub repos:
curl --request POST \
  --header "PRIVATE-TOKEN: $API_SCOPE" \
  --form "namespace_id=$GROUP_ID" \
  --form "path=$REPO" \
  --form "import_url=https://github.com/$ORG/$REPO" \
  https://git.domain.edu/api/v4/projects/
ORG would be the source organization on GitHub, and GROUP_ID a dedicated GitHub-archive group on the GitLab CE instance.
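For completeness, here is a rough sketch of how that call could be looped over every repository in the org, paging through GitHub's /orgs/:org/repos endpoint. This assumes jq is installed and $GITHUB_TOKEN holds a GitHub API token for listing the org's (possibly private) repos; everything else reuses the variables above:

#!/usr/bin/env bash
set -euo pipefail

page=1
while :; do
  # List one page of the org's repos (100 per page) and extract their names.
  repos=$(curl --silent \
    --header "Authorization: token $GITHUB_TOKEN" \
    "https://api.github.com/orgs/$ORG/repos?per_page=100&page=$page" \
    | jq -r '.[].name')
  [ -z "$repos" ] && break   # empty page: no more repos

  for REPO in $repos; do
    # Same project-creation call as above, once per repo.
    curl --request POST \
      --header "PRIVATE-TOKEN: $API_SCOPE" \
      --form "namespace_id=$GROUP_ID" \
      --form "path=$REPO" \
      --form "import_url=https://github.com/$ORG/$REPO" \
      https://git.domain.edu/api/v4/projects/
  done
  page=$((page + 1))
done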
However, this only imports the files and Git history, not GitHub issues, PRs, labels, etc., which are needed for a full backup. So I wonder:
1. Is there a third API endpoint relevant to this task? /projects/import seems to require a file attribute, but reading the file list from GitHub first would probably exhaust the API rate limit too quickly for a feasible backup solution.
2. Is there another --form "" flag one can add to the /projects request that makes it behave like the github_importer?
3. Does the latter maybe have an undocumented API?
4. Does one need to use gitlab-rake import:github[…]?
Thanks for any hints on the above questions, or pointers to existing solutions/tools/scripts, regardless of language!
PS: Yes, the above script would be overkill compared to GitLab's built-in Repository mirroring, but that doesn't include issues, PRs & labels either.
As of August 2018, 4. seems to be the only answer. Once GitLab's own "Import repositories API endpoint" issue gets implemented, it may become an option for 1.
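For reference, if 4. is indeed the way to go, the invocation would presumably look roughly like this (a sketch based on GitLab's GitHub-import Rake task documentation; it must be run on the GitLab CE host, the github-archive group must already exist, and the exact argument list may vary between versions):

# Arguments: <github_access_token>,<gitlab_username>,<target_project_path>[,<source_github_repo>]
sudo gitlab-rake "import:github[$GITHUB_TOKEN,root,github-archive/$REPO,$ORG/$REPO]"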