Although ETL tasks are challenging when loading large data sets, there is one scenario in which you can load terabytes of data from Postgres into BigQuery relatively easily and very efficiently: when you have a lot of immutable data distributed across tables by some timestamp. For example, a transactions table with a created_at timestamp column. Both BigQuery and Postgres provide tools that make this fast and convenient.
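To make the scenario concrete, here is a hypothetical sketch of the kind of table this approach suits: rows are inserted once and never updated, and the created_at column makes it easy to export the data in date-based chunks. The table name and columns are illustrative, not from any particular schema.

```sql
-- Hypothetical append-only table: rows are written once and never
-- updated, so a date range exported yesterday stays valid forever.
CREATE TABLE transactions (
    id         bigserial PRIMARY KEY,
    account_id bigint         NOT NULL,
    amount     numeric(12, 2) NOT NULL,
    created_at timestamptz    NOT NULL DEFAULT now()
);

-- An index on created_at keeps per-day range scans cheap,
-- which matters when exporting the table one chunk at a time.
CREATE INDEX ON transactions (created_at);
```

Immutability is what makes the load simple: each timestamp range can be exported and loaded exactly once, with no need to detect updates or re-sync old rows.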
Preparing Postgres Tables