# Save to Storage

Quick Set
  • CHOOSE: "Process" ➡️ "Add new process" ➡️ "Save to storage"
  • DECIDE: whether to save an existing table or the result of a Query Job
  • SPECIFY: the BigQuery table location, or select a Query Job
  • DEFINE: the details of the target file - format and availability
    🎉 READY - give yourself a high-five 🎉

# Introduction

The Save to Storage module saves data to Google Cloud Storage. The source can be either a Query Job result or an existing BigQuery table.

# Example of operation

Usage examples:

  • You want to attach offline transactions to Google Ads
  • You query the enriched session table in WitCloud
  • You publish the results as a CSV file at a URL in Google Cloud Storage
  • You configure Google Ads to regularly check for and retrieve the file
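For the CSV-at-a-URL step above: objects in Google Cloud Storage that are publicly readable can be fetched over HTTPS at a predictable address. A minimal sketch of that address pattern (the bucket and object names are hypothetical, and object names containing special characters would additionally need URL-encoding):

```python
def public_gcs_url(bucket: str, object_name: str) -> str:
    """Build the public HTTPS URL for a Google Cloud Storage object."""
    return f"https://storage.googleapis.com/{bucket}/{object_name}"

# Hypothetical bucket and object names for illustration.
url = public_gcs_url("my-witcloud-exports", "offline_transactions.csv")
print(url)  # https://storage.googleapis.com/my-witcloud-exports/offline_transactions.csv
```

Google Ads (or any other consumer) can then fetch the file from this URL on a schedule, provided "Make file public" was enabled for the export.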

# Configuration

# Before you start

Prepare a table or database query (Query Job) whose results you want to save to Google Cloud Storage.

# Start creating the module

From the menu on the left, select the "Process" tab, then click the "Add new Data Process" button.


Select "Save to Storage" from the list of available modules


# Initial settings

After selecting the module, a configuration form appears with the following fields:

Name - the name of the process; it will appear under this name elsewhere on the WitCloud platform.

Select Source - the location of the source data: a Query Job result, or an existing table in BigQuery.

Query Job - one of the available, predefined Query Jobs.

Current Google Cloud Project - the Google Cloud project containing the source table *(available only when BigQuery is selected as the source)*.

Current Dataset - the dataset containing the source table *(available only when BigQuery is selected as the source)*.

Current Table - the table that serves as the source *(available only when BigQuery is selected as the source)*.
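Together, the three BigQuery fields identify the source as a fully qualified table ID of the form `project.dataset.table`. A small sketch that splits and sanity-checks such an ID (a simplified check; the real BigQuery naming rules are broader):

```python
import re

def parse_table_id(table_id: str) -> tuple[str, str, str]:
    """Split 'project.dataset.table' into its three parts.

    Simplified validation for illustration: real BigQuery naming rules
    are broader (e.g. quoting, unicode table names).
    """
    parts = table_id.split(".")
    if len(parts) != 3 or not all(re.fullmatch(r"[\w-]+", p) for p in parts):
        raise ValueError(f"expected 'project.dataset.table', got {table_id!r}")
    project, dataset, table = parts
    return project, dataset, table

# Hypothetical identifiers for illustration.
print(parse_table_id("my-project.analytics.sessions_enriched"))
```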

The advanced options include:

Output Format - the format of the output file: CSV (comma-separated values) or Avro (a binary row-based format whose schemas are defined in JSON).

Use Compression - compresses the content of the file to reduce its size.

Make file public - marks the file as publicly readable - recommended when the file's content should be downloadable without additional authorization.
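The Output Format and Use Compression options together determine the file name you will see in the bucket. A hedged sketch of one common naming convention (an assumption, not WitCloud's documented behavior; note that Avro compression, e.g. Snappy or Deflate, lives inside the file, so the `.avro` extension does not change):

```python
def output_object_name(base: str, fmt: str, compress: bool) -> str:
    """Derive a destination object name from the export options.

    CSV + compression is typically written as gzip ('.csv.gz');
    Avro compression is internal to the file, so '.avro' stays as-is.
    """
    fmt = fmt.lower()
    if fmt == "csv":
        return f"{base}.csv.gz" if compress else f"{base}.csv"
    if fmt == "avro":
        return f"{base}.avro"
    raise ValueError(f"unsupported format: {fmt!r}")

print(output_object_name("sessions", "csv", True))  # sessions.csv.gz
```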


Ready! After you press the "Finish" button, the process configuration is saved.
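Under the hood, this kind of export corresponds to BigQuery's extract operation. As a sketch only, here is how an equivalent `bq extract` command line could be assembled; the `--destination_format` and `--compression` flags are standard `bq` flags, the table and bucket names are hypothetical, and the command is only built here, not executed:

```python
def build_bq_extract_cmd(table_id: str, destination_uri: str,
                         fmt: str = "CSV", compress: bool = False) -> list[str]:
    """Assemble a 'bq extract' command roughly equivalent to this module."""
    cmd = ["bq", "extract", f"--destination_format={fmt}"]
    if compress:
        cmd.append("--compression=GZIP")  # GZIP applies to CSV/JSON exports
    cmd += [table_id, destination_uri]
    return cmd

# Hypothetical names; we only print the command instead of running it.
print(" ".join(build_bq_extract_cmd(
    "my-project:analytics.sessions_enriched",
    "gs://my-witcloud-exports/sessions.csv.gz",
    compress=True,
)))
```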

# Setting the module in the schedule

Remember the schedule!

Please note that the configured process must be included in the Workflow configuration for it to be executed on a schedule.

Last updated: 2/3/2021, 10:28:38 AM