I think currently you are supposed to use the "Multi-branch pipeline" for that, where you have a separate branch in your repo for every configuration/job you need.
We decided to go a different route, even though it is kinda held together by duct tape :)
1. We have several shell scripts that use $JENKINS_URL/script to update global configuration. AFAIK the /script endpoint gives you access to almost all of the Jenkins internals, to be operated via Groovy. That's how we set up e.g. plugins, slave providers, secrets, shared config files, etc.
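To give a flavor of those scripts: a minimal sketch, where the Groovy file name and the executor tweak are made-up examples, and JENKINS_USER/JENKINS_API_TOKEN are assumed to already be in the environment:

```shell
# Made-up example of one of our /script-style setup scripts.
cat > bump-executors.groovy <<'EOF'
// This runs inside the Jenkins master JVM, so it can touch internal state.
import jenkins.model.Jenkins
Jenkins.instance.setNumExecutors(4)
Jenkins.instance.save()
EOF

# Only post when we actually have a Jenkins to talk to.
if [ -n "${JENKINS_URL:-}" ]; then
  # /scriptText is the plain-text sibling of the /script form page;
  # both accept a "script=" parameter.
  curl -sf --user "$JENKINS_USER:$JENKINS_API_TOKEN" \
       --data-urlencode "script@bump-executors.groovy" \
       "$JENKINS_URL/scriptText"
fi
```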
2. For job definitions, we use jenkins-job-builder [1] with the pipeline plugin [2]. We then store both the job-builder configs and the Jenkinsfiles they point to in a single repo, achieving the "one repo defines all the builds" setup.
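A job-builder entry pointing a pipeline job at a Jenkinsfile looks roughly like this (the job name, repo URL, and paths are made up, and the exact key names depend on which version of the pipeline support you have):

```yaml
# Hypothetical jobs.yaml entry; name, URL, and script path are illustrative.
- job:
    name: my-app-build
    project-type: pipeline
    pipeline-scm:
      scm:
        - git:
            url: https://example.com/our/builds-repo.git
            branches:
              - master
      script-path: jenkinsfiles/my-app.Jenkinsfile
```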
3. To reduce repetition we use a shared pipeline library, where we put all the (Groovy) functions shared across jobs.
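Such a library function is just a file under vars/ in the library repo; this helper (name and message format are made up) would be callable from any Jenkinsfile:

```groovy
// vars/notifyBuild.groovy in the shared-library repo (hypothetical helper).
def call(String status) {
    // echo and env are provided by the pipeline runtime.
    echo "build ${status}: ${env.JOB_NAME} #${env.BUILD_NUMBER}"
}
```

A Jenkinsfile then pulls it in with something like `@Library('our-shared-lib') _` (library name configured globally) and calls `notifyBuild('started')`.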
Neither the shell scripts for the /script endpoint nor the job-builder configs are particularly nice, but they get the job done. We used them even before Jenkinsfiles/pipelines/2.x.
The 2.x bits do help quite a lot :-)
[1] https://docs.openstack.org/infra/jenkins-job-builder/
[2] https://github.com/rusty-dev/jenkins-job-builder-pipeline