Importing files from source folder into destination #89
Description
Similar issue in gs-plugins repo.
The problem
Developers often need to reuse existing code and configs. This is how companies develop and reuse their infra. This infra could be Dockerfiles, Helm charts, plugins (datasource, eventsource), sample code, project scaffoldings, READMEs, whole projects themselves, or anything under the sun.
Currently there is no way for a developer to import files from another folder (a remote git repository, or a local folder). This is useful when importing sample code and configs from a Godspeed plugin, project template or a starter scaffolding. The developer should be able to pick and choose what they want to use in the project.
For example, a kafka plugin could have a sample docker-compose, .env, config, src, docs, README.md, tsconfig.json, a docs/swagger.json etc. The developer may choose to import selected files and configs.
The developer may choose to do this during the plugin installation process, or anytime after it is installed.
The (proposed) solution
- When a plugin is installed, the developer should be shown everything the plugin exports:
  - Compulsory items like src/{datasources/kafka.ts, eventsources/kafka.ts}
  - Optional items like src/{events,functions,mappings}, config/{default,custom-environment-variables}
- The developer will get the choice to select the files they want to extract.
- Those paths/files which do not exist in the destination folder will get created.
- Those paths/files which already exist in the destination folder will be appended (merged) to or overwritten. Allow the developer to select the action; the default will be append.
- Folders will always get merged - i.e., files which exist in a dest folder but not in the src folder will continue to remain in the dest folder after the merge.
How to solve
- Define the plugin install process
- Files import process
- Write a reusable utility function in godspeed-cli to provide this directory export/import functionality, taking the path of the src folder, the destination folder, and the exports file:
```ts
async function merge(src: Path, dest: Path, exportConfig: Path | JSON): Promise<GSStatus> {}
```
- `exportConfig` will be a yaml/json file path, or a JSON object itself. Define a JSON format in which the importing logic from the source folder is defined - for example, which files and folders to import, and whether to import in append mode or overwrite mode. The merge policy can be set per file or for the whole folder. The default merge mode is append (in case the file already exists in the destination folder).
```yaml
defaults:
  mergePolicy: 'append' # MergePolicy can be append or overwrite. The default value of this default mergePolicy is append. It applies at a file level.
imports: # Each array item is either a string or a key-value pair
  # {[key: string]: MergePolicy | undefined} (as you would express each entry in Typescript)
  - .env: append # value can be append, overwrite or left undefined.
    # If left undefined, the default merge policy will be applied
  - config: overwrite # path can be a folder or a file
  - src/datasources/types/kafka.ts # uses defaults.mergePolicy
  - src/eventsources/types/kafka.ts # uses defaults.mergePolicy
  - src/events/kafka/helloworld.ts # uses defaults.mergePolicy
  - src/functions/kafka/ts/helloworld.ts # uses defaults.mergePolicy
  - src/functions/kafka/yaml/helloworld.yaml # uses defaults.mergePolicy
```
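Normalizing those `imports` entries could look like the sketch below: each item is either a bare path string or a single-key path-to-policy map. The type and function names here are assumptions derived from this issue, not a published schema:

```typescript
type MergePolicy = "append" | "overwrite";
// An entry is either "some/path" or { "some/path": "append" | "overwrite" }.
type ImportEntry = string | Record<string, MergePolicy>;

interface ResolvedImport {
  path: string;
  mergePolicy: MergePolicy;
}

// Flatten the two entry shapes into { path, mergePolicy } records,
// falling back to defaults.mergePolicy when no per-file policy is given.
function resolveImports(
  imports: ImportEntry[],
  defaultPolicy: MergePolicy = "append",
): ResolvedImport[] {
  return imports.map((entry) => {
    if (typeof entry === "string") {
      return { path: entry, mergePolicy: defaultPolicy };
    }
    // Single-key map: the key is the path, the value its per-file policy.
    const [p, policy] = Object.entries(entry)[0];
    return { path: p, mergePolicy: policy ?? defaultPolicy };
  });
}
```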
- Reuse this code in the plugin install process. The developer only needs to edit the exports file. This can be done before, during or after the plugin install process (go back to the 1st point of this list).