
‘COPY INTO’ Snowflake

DataGeeks
4 min read · Mar 15, 2023


COPY INTO is a Snowflake command that lets us load data from a stage into an existing Snowflake table, and unload data from an existing table back into a stage.

You can use the Snowflake COPY command to load data from your local system through an internal stage, or from cloud storage such as S3 through an external stage.
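As a quick illustration of the internal-stage path (a minimal sketch, not code from this article; the table name MY_TABLE and the local file path are placeholders):

-- Internal stage: push a local file into the table's stage with SnowSQL's PUT,
-- then load it with COPY INTO.
PUT file:///tmp/load1.csv @%MY_TABLE;
COPY INTO MY_TABLE FROM @%MY_TABLE;

The rest of this article focuses on the external-stage path with S3.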

If you are reading this article, I assume that you are already familiar with the Snowflake stage concept, cloud storage such as S3, how to integrate cloud storage with Snowflake, and Snowflake storage integrations.

Check out my other articles to get some understanding of these topics.
Export Data From Snowflake To S3
Load CSV from Local to Snowflake
Based on the above articles, create a storage integration in Snowflake (in my case it is AYUSH_SI), roughly as sketched below.
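For reference, the storage integration can be created along these lines (a sketch; the IAM role ARN is a placeholder for the role you set up when integrating your bucket, as described in the linked article):

CREATE OR REPLACE STORAGE INTEGRATION AYUSH_SI
  TYPE = EXTERNAL_STAGE
  STORAGE_PROVIDER = 'S3'
  ENABLED = TRUE
  STORAGE_AWS_ROLE_ARN = 'arn:aws:iam::<your-account-id>:role/<your-snowflake-role>'
  STORAGE_ALLOWED_LOCATIONS = ('s3://dev-ayush-test/test/');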

I have two files, load1.csv and load2.csv, each containing two columns: NAME and MARKS.

load1.csv
load2.csv

Now upload the load1.csv file to the S3 bucket (in my case it is s3://dev-ayush-test/test/) and create an external stage to view the data from the loaded CSV file directly in Snowflake.

CREATE OR REPLACE DATABASE AYUSH_DEV;
CREATE OR REPLACE SCHEMA AYUSH_DEV.RAW;

CREATE OR REPLACE STAGE test_stage
  STORAGE_INTEGRATION = AYUSH_SI
  URL = 's3://dev-ayush-test/test/';

SHOW STAGES;

-- Access S3 data
SELECT $1 FROM @test_stage;
The output of ‘SELECT $1 FROM @test_stage;’

We can see that the data has a header and that the columns are separated by semicolons, so we need to tell Snowflake about our file structure. We can do this by creating a file format in Snowflake and then passing that file format into the stage definition.

CREATE OR REPLACE FILE FORMAT csv_semicolon
  TYPE = CSV
  FIELD_DELIMITER = ';'
  SKIP_HEADER = 1
  NULL_IF = ('NULL', 'null')
  EMPTY_FIELD_AS_NULL = TRUE;
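A quick way to check the new file format (a sketch, using Snowflake's option to pass a file format while querying a stage):

-- Read the staged file through the file format: the header row is skipped
-- and the two semicolon-separated columns split correctly.
SELECT $1 AS name, $2 AS marks
FROM @test_stage (FILE_FORMAT => 'csv_semicolon');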

CREATE…
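Putting the pieces together, the load itself looks roughly like this (a sketch based on the steps above; the target table name STUDENT_MARKS is my own placeholder):

-- Recreate the stage with the file format attached
CREATE OR REPLACE STAGE test_stage
  STORAGE_INTEGRATION = AYUSH_SI
  URL = 's3://dev-ayush-test/test/'
  FILE_FORMAT = (FORMAT_NAME = 'csv_semicolon');

-- Hypothetical target table matching the two CSV columns
CREATE OR REPLACE TABLE AYUSH_DEV.RAW.STUDENT_MARKS (
  NAME  VARCHAR,
  MARKS NUMBER
);

-- Load the staged file(s) into the table
COPY INTO AYUSH_DEV.RAW.STUDENT_MARKS
FROM @test_stage;

Note that COPY INTO keeps track of which files it has already loaded, so rerunning it will skip previously loaded files unless FORCE = TRUE is specified.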
