Environment:
Docker Compose with OpenJDK 11, MinIO, XTable, Spark 3.4, Hive 2.3.10, Hadoop 2.10.2
If I switch away from OS environment variables and define the same settings in YAML instead:
root@spark:/opt/LakeView# cat delta.yaml

```yaml
version: V1
onehouseClientConfig:
  # can be obtained from the Onehouse console
  projectId: c3eb3868-6979-41cd-9018-952d29a43337
  apiKey: asU2Pb3XaNAc4JwkkWpNUQ==
  apiSecret: IBaLVxloIzU36heBooOBsPp5MhD6ijjyIk88zvH2ggs=
  userId: x2gblCN8xNSurvCsqDaGJ84zy913
fileSystemConfiguration:
  # Provide either s3Config or gcsConfig
  s3Config:
    region: us-east-1
    accessKey: admin
    accessSecret: password
    endpoint: http://minio:9000
metadataExtractorConfig:
  jobRunMode: ONCE
  pathExclusionPatterns:
  parserConfig:
    - lake: <lake1>
      databases:
        - name: people
          basePaths: ["s3://warehouse/people"]
  # Add additional lakes and databases as needed
```
Then I get this error:
```
root@spark:/opt/LakeView# java -jar LakeView-release-v0.10.0-all.jar -p '/opt/LakeView/delta.yaml'
17:05:25.080 [main] INFO com.onehouse.Main - Starting LakeView extractor service
Exception in thread "main" java.lang.RuntimeException: Failed to load config
    at com.onehouse.config.ConfigLoader.loadConfigFromConfigFile(ConfigLoader.java:31)
    at com.onehouse.Main.loadConfig(Main.java:92)
    at com.onehouse.Main.start(Main.java:56)
    at com.onehouse.Main.main(Main.java:41)
Caused by: com.fasterxml.jackson.databind.exc.UnrecognizedPropertyException: Unrecognized field "endpoint" (class com.onehouse.config.models.common.S3Config$S3ConfigBuilder), not marked as ignorable (3 known properties: "accessKey", "region", "accessSecret"])
 at [Source: UNKNOWN; byte offset: #UNKNOWN] (through reference chain: com.onehouse.config.models.configv1.ConfigV1$ConfigV1Builder["fileSystemConfiguration"]->com.onehouse.config.models.common.FileSystemConfiguration$FileSystemConfigurationBuilder["s3Config"]->com.onehouse.config.models.common.S3Config$S3ConfigBuilder["endpoint"])
    at com.fasterxml.jackson.databind.exc.UnrecognizedPropertyException.from(UnrecognizedPropertyException.java:61)
    at com.fasterxml.jackson.databind.DeserializationContext.handleUnknownProperty(DeserializationContext.java:1127)
    at com.fasterxml.jackson.databind.deser.std.StdDeserializer.handleUnknownProperty(StdDeserializer.java:2023)
    at com.fasterxml.jackson.databind.deser.BeanDeserializerBase.handleUnknownProperty(BeanDeserializerBase.java:1700)
    at com.fasterxml.jackson.databind.deser.BeanDeserializerBase.handleUnknownVanilla(BeanDeserializerBase.java:1678)
    at com.fasterxml.jackson.databind.deser.BuilderBasedDeserializer.vanillaDeserialize(BuilderBasedDeserializer.java:298)
    at com.fasterxml.jackson.databind.deser.BuilderBasedDeserializer.deserialize(BuilderBasedDeserializer.java:217)
    at com.fasterxml.jackson.databind.deser.impl.MethodProperty.deserializeSetAndReturn(MethodProperty.java:158)
    at com.fasterxml.jackson.databind.deser.BuilderBasedDeserializer.vanillaDeserialize(BuilderBasedDeserializer.java:293)
    at com.fasterxml.jackson.databind.deser.BuilderBasedDeserializer.deserialize(BuilderBasedDeserializer.java:217)
    at com.fasterxml.jackson.databind.deser.impl.MethodProperty.deserializeSetAndReturn(MethodProperty.java:158)
    at com.fasterxml.jackson.databind.deser.BuilderBasedDeserializer.vanillaDeserialize(BuilderBasedDeserializer.java:293)
    at com.fasterxml.jackson.databind.deser.BuilderBasedDeserializer.deserialize(BuilderBasedDeserializer.java:217)
    at com.fasterxml.jackson.databind.deser.DefaultDeserializationContext.readRootValue(DefaultDeserializationContext.java:323)
    at com.fasterxml.jackson.databind.ObjectMapper._readValue(ObjectMapper.java:4650)
    at com.fasterxml.jackson.databind.ObjectMapper.readValue(ObjectMapper.java:2831)
    at com.fasterxml.jackson.databind.ObjectMapper.treeToValue(ObjectMapper.java:3295)
    at com.onehouse.config.ConfigLoader.loadConfigFromJsonNode(ConfigLoader.java:47)
    at com.onehouse.config.ConfigLoader.loadConfigFromConfigFile(ConfigLoader.java:29)
    ... 3 more
```
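The Jackson message pinpoints the problem: the extractor's `S3Config` accepts only `accessKey`, `accessSecret`, and `region`, so the extra `endpoint` key makes the entire config fail to load. A minimal sketch of the `fileSystemConfiguration` section with the unrecognized key removed (this assumes the custom MinIO endpoint is still supplied through the `ENDPOINT` environment variable, as in the original env-based setup):

```yaml
fileSystemConfiguration:
  s3Config:
    region: us-east-1
    accessKey: admin
    accessSecret: password
    # "endpoint" removed: S3Config rejects unknown fields, so the MinIO
    # endpoint (http://minio:9000) has to come from the environment instead
```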
For reference, the original environment-variable setup was:

```shell
export AWS_SECRET_ACCESS_KEY=password
export AWS_ACCESS_KEY_ID=admin
export ENDPOINT=http://minio:9000
export AWS_REGION=us-east-1
```
Originally posted by @alberttwong in #78 (comment)