This repository streams data from real-time API hits into the corresponding table in Google Cloud BigQuery.
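Conceptually, each API hit travels as a JSON-encoded Kafka message before being written to BigQuery. A minimal sketch of that round trip (the field names are hypothetical, not the repository's actual schema):

```python
import json


def encode_event(payload: dict) -> bytes:
    """Serialize an API payload into a Kafka message body."""
    return json.dumps(payload).encode("utf-8")


def decode_event(message: bytes) -> dict:
    """Decode a Kafka message back into a row dict for a BigQuery insert."""
    return json.loads(message.decode("utf-8"))


# A decoded message is a plain dict, the shape expected by
# BigQuery's insert_rows_json-style streaming inserts.
row = decode_event(encode_event({"user_id": 42, "event": "click"}))
```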
Use git to clone this repository:

```shell
git clone https://github.com/ghazimuharam/bq-api-gatekeeper
```

Make sure you have Python 3.7 installed on your machine:
```shell
> python --version
Python 3.7.10
```

To run the scripts in this repository, install the prerequisite libraries from requirements.txt:
```shell
pip install -r requirements.txt
```

Store all your service account JSON files in the `./service` directory.
To run the scripts, you need a Kafka server running on your machine; see the Kafka Docs for installation instructions.
Create an `api-gatekeeper` topic on your Kafka server.
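With a stock Kafka installation, the topic can be created with the `kafka-topics.sh` tool that ships in Kafka's `bin/` directory. The broker address, partition count, and replication factor below are assumptions for a single-node local setup; adjust them to your environment:

```shell
# Requires a running Kafka broker (assumed here at localhost:9092).
bin/kafka-topics.sh --create \
  --topic api-gatekeeper \
  --bootstrap-server localhost:9092 \
  --partitions 1 \
  --replication-factor 1
```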
Before running the setup.sh script, specify your Google Cloud application credentials using the command below:

```shell
export GOOGLE_APPLICATION_CREDENTIALS="./service/your-credentials.json"
```

Create the dataset in BigQuery using the command below:
```shell
sh setup.sh
```

Run the main application for the API endpoint using the command below:
```shell
uvicorn main:app --reload
```

Run the Kafka consumer script using the command below:
```shell
python kafka-consumer.py
```
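For orientation, a consumer like `kafka-consumer.py` typically reads messages from the topic and streams each one into BigQuery. This is a hedged sketch, not the repository's actual code: only the topic name comes from this README, while the broker address, table id, and payload shape are placeholders (using the `kafka-python` and `google-cloud-bigquery` packages):

```python
"""Sketch of a Kafka-to-BigQuery consumer loop (assumptions noted inline)."""
import json


def to_row(raw: bytes) -> dict:
    """Decode one Kafka message into a JSON row dict for BigQuery."""
    return json.loads(raw.decode("utf-8"))


if __name__ == "__main__":
    # Third-party imports are kept here so the helper above stays importable
    # without kafka-python / google-cloud-bigquery installed.
    from kafka import KafkaConsumer        # pip install kafka-python
    from google.cloud import bigquery      # pip install google-cloud-bigquery

    # Uses GOOGLE_APPLICATION_CREDENTIALS exported earlier in this README.
    client = bigquery.Client()
    consumer = KafkaConsumer("api-gatekeeper", bootstrap_servers="localhost:9092")

    for message in consumer:
        row = to_row(message.value)
        # Placeholder table id; insert_rows_json streams the row into BigQuery
        # and returns a list of per-row errors (empty on success).
        errors = client.insert_rows_json("my-project.my_dataset.my_table", [row])
        if errors:
            print("BigQuery insert failed:", errors)
```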


