Before executing the script, please follow the setup instructions from BigQuery and establish a connection to the BigQuery API:
```python
import os

# Point the Google client libraries at your service-account key file.
os.environ['GOOGLE_APPLICATION_CREDENTIALS'] = "<your.json file path>"
```
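Once that variable is set, a minimal sanity check might look like the sketch below (assuming the google-cloud-bigquery package is installed):

```python
from google.cloud import bigquery

# The client picks up GOOGLE_APPLICATION_CREDENTIALS automatically.
client = bigquery.Client()

# Quick connection test: list the datasets visible to this key.
for dataset in client.list_datasets():
    print(dataset.dataset_id)
```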
As the name implies, given a download link to a CSV file, this script uploads it to your dataset automatically. I could have overcomplicated the script with all the clean-up steps, but I figured it would be better to do that in BigQuery with SQL. A rough sketch of the core flow appears after the input list below.
Inputs:
- Link to CSV
- Project ID and Dataset ID in project_id.dataset_id format
- Table ID to name the new table
- Done!
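For reference, the core flow is roughly the following (a minimal sketch, not the full script, assuming pandas, pyarrow, and google-cloud-bigquery are installed; the URL and table names are placeholders for the inputs above):

```python
import pandas as pd
from google.cloud import bigquery

csv_url = "<link to CSV>"                    # input 1: the download link
table_id = "my_project.my_dataset.my_table"  # inputs 2 and 3 (placeholders)

# pandas can read a CSV straight from a download link.
df = pd.read_csv(csv_url)

client = bigquery.Client()

# Load the dataframe; BigQuery infers the schema automatically.
job = client.load_table_from_dataframe(df, table_id)
job.result()  # block until the load job finishes
print(f"Loaded {job.output_rows} rows into {table_id}")
```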
Planned next steps for uploading directly from APIs to BigQuery (modify the .ipynb to your liking):
- Excel conversion to CSV, then upload (sketched below)
- JSON upload
- Check a folder for new CSV/JSON files, then upload (sketched below)
- After all the above, modularise the code
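For the Excel item, a minimal sketch (assuming pandas and openpyxl are installed; the file names are placeholders):

```python
import pandas as pd

# Read the first sheet of the workbook (needs openpyxl for .xlsx files).
df = pd.read_excel("report.xlsx")    # placeholder file name
df.to_csv("report.csv", index=False)
# From here, the existing CSV upload path takes over.
```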
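And for the folder check, one simple polling approach (a sketch under assumed names, not the script's actual implementation; the folder path and interval are placeholders):

```python
import time
from pathlib import Path

watch_dir = Path("incoming")    # placeholder folder to watch
seen = set(watch_dir.glob("*"))

while True:
    current = set(watch_dir.glob("*"))
    for path in sorted(current - seen):
        if path.suffix.lower() in (".csv", ".json"):
            print(f"New file: {path}")  # hand off to the upload code here
    seen = current
    time.sleep(30)  # poll every 30 seconds
```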