# Develop script to update metadata locally #4
**Status:** Open

**Labels:** 📖 documentation (Improvements or additions to documentation), 🏛️ Organize (file, folder, directory, architecture), 🪄 enhancement (New functionality or feature request)
**Project status:** 🏷 Ready
## Description
The current workflow for submitting data and metadata to KNB is a hacky hybrid approach: we created the data package via the website GUI and then pulled the package down locally to change the public viewing options. Moving forward, we would like to provide more detailed and comprehensive information via scripts about the two KNB datasets we now maintain (doi:10.5063/F1CZ35N7 & doi:10.5063/F1862DXT). This would include:
- `knb_submit.R` script to use a locally built EML file to update the data package
- `.parquet` (probably)

## Considerations
- `artis-model` pipeline? Would make sense if we are working towards automating more and more of the pipeline.
- `00-*` scripts? These might eventually be in a configuration YAML; can we use that in conjunction with the EML script to avoid duplicating definitions? Possibly like this:

## Resources
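One minimal sketch of what that shared configuration could look like, assuming the `00-*` scripts and the EML builder both read the same file (all keys and the example attribute below are hypothetical; only the two DOIs come from this issue):

```yaml
# Hypothetical shared config: dataset and attribute definitions are written
# once here, then reused by both the 00-* pipeline scripts and the
# EML-building script, so definitions are not duplicated.
datasets:
  - doi: "doi:10.5063/F1CZ35N7"
    attributes:
      - name: sciname            # example attribute, invented for illustration
        definition: "Scientific name of the species"
  - doi: "doi:10.5063/F1862DXT"
    attributes: []
```

In R, `knb_submit.R` could then load this with `yaml::read_yaml("config.yml")` and build the EML attribute tables from the same definitions the pipeline uses.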
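For `knb_submit.R` itself, the usual DataONE R workflow for updating a package's metadata locally looks roughly like this. This is a hedged sketch, not a tested implementation: the package identifier, file names, and the exact `replaceMember()` arguments are assumptions and should be checked against the `dataone`/`datapack` documentation.

```r
# Sketch: update KNB metadata from a locally built EML file.
# Assumes the dataone, datapack, and EML packages are installed and
# that you are authenticated with a valid KNB token.
library(dataone)
library(datapack)
library(EML)

# Connect to the KNB member node in the production environment.
d1c <- D1Client("PROD", "urn:node:KNB")

# Pull down one of the two packages this issue refers to.
dp <- getDataPackage(d1c, identifier = "doi:10.5063/F1CZ35N7",
                     lazyLoad = TRUE, quiet = FALSE)

# Swap the old metadata object for the locally built EML file
# ("metadata.xml" is a placeholder name).
metadata_id <- selectMember(dp, name = "sysmeta@formatId",
                            value = "https://eml.ecoinformatics.org/eml-2.2.0")
dp <- replaceMember(dp, metadata_id, replacement = "metadata.xml")

# Push the updated package back to KNB with public read access.
uploadDataPackage(d1c, dp, public = TRUE, quiet = FALSE)
```

This keeps the GUI out of the loop entirely, which is the point of the issue: the EML file is built by script, and the public-viewing option is set at upload time rather than edited after the fact.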