How to load desk export data into Snowflake

Supported Plans

  • Starter
  • Teams
  • Enterprise


Robin's Analytics API helps your team better understand the bigger picture of what’s happening at your workplace. This API helps you bring Robin data into your preferred tool for a more holistic workplace view, helping you drive your hybrid strategy based on criteria unique to your organization. Below is one example of how to do this with Snowflake.

How to find your Account ID & generate an API token in Robin

1. Log into Robin, navigate to Manage > Integrations, and scroll down to the "API Tokens" section.


2. Click +Generate new token.

3. Name your token and check the "Basic_read permissions" box. 


4. Click Generate Token.

5. Copy your token and save it. 


How to create an export request

The code below makes an export request for a single day of desk reservations, then polls the API to determine when the request is complete. Once it is, the data is saved to disk using the from_date as the filename.

The code checks whether the export is ready up to 6 times, waiting 10 seconds between checks – this is to ensure your requests aren't rate-limited.

Note: If you do not already have the requests package installed, you will need to install it like so: pip install requests

It is not required, but it’s recommended to install packages in a virtual environment.
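Put together, the setup amounts to the following commands (a sketch; the `.venv` directory name is just a convention):

```shell
# Create and activate a virtual environment (optional but recommended),
# then install the requests package.
python -m venv .venv
source .venv/bin/activate    # on Windows: .venv\Scripts\activate
pip install requests
```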

import requests
from datetime import datetime
from datetime import timedelta
import time

account_id = 0  # <-- add your account ID here
read_token = '' # <-- add your token here

def make_export_creation_request(export_type, from_date):
    to_date = from_date + timedelta(1)
    print(f'Requesting an export from {from_date} to {to_date}.')

    url = f'https://api.robinpowered.com/v1.0/insights/reports/organizations/{account_id}/{export_type}'
    response_object = requests.post(url,
                             headers={
                                 'Authorization': f'Access-token {read_token}',
                                 'Content-type': 'application/json'
                             },
                             json={'from': from_date.isoformat(), 'to': to_date.isoformat()})

    response_object.raise_for_status()
    response = response_object.json()
    export_id = response['data']['export_id']
    print(f'The export request has been received and assigned the ID {export_id}.')

    return export_id


def retrieve_export(export_id, from_date):
    url = f'https://api.robinpowered.com/v1.0/insights/reports/{export_id}'

    iteration = 0
    while iteration < 6:
        print(f'Attempting to retrieve export {export_id}')
        response_object = requests.get(url,
                                       headers={
                                           'Authorization': f'Access-token {read_token}'
                                       })

        if response_object.status_code != 404:
            filename = f'./{from_date}.csv'
            response_object.raise_for_status()

            with open(filename, 'wb') as file:
                file.write(response_object.content)
            print(f'The CSV content was received and saved as {filename}.')
            break

        iteration += 1
        print(f'{iteration}. Export not ready. Will try again in 10 seconds.')
        time.sleep(10)


# Request that a desk export be prepared.
from_date = datetime.strptime('2022-12-01T00:00:00+0400', '%Y-%m-%dT%H:%M:%S%z')
export_id = make_export_creation_request('desks', from_date)

# Get the export contents and save it to disk.
retrieve_export(export_id, from_date)
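The script above covers a single day. To pull a longer window, you could loop the same two calls over a range of days – a sketch, where daily_dates is a small helper introduced here (not part of the script above):

```python
from datetime import datetime, timedelta

def daily_dates(start, end):
    """Yield one datetime per day, from start (inclusive) to end (exclusive)."""
    current = start
    while current < end:
        yield current
        current += timedelta(days=1)

# Hypothetical usage, assuming the functions from the script above are in scope:
# for day in daily_dates(datetime(2022, 12, 1), datetime(2022, 12, 8)):
#     export_id = make_export_creation_request('desks', day)
#     retrieve_export(export_id, day)
```

Requesting one day at a time keeps each export small and gives you one CSV file per day, which also makes it easy to re-run a single failed day.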

How to load a desk export CSV into Snowflake 

Before you can load the Robin desk export (Analytics > Exports > Desks) into Snowflake, you need to create a database, schema, and table to load the CSV data into. Follow the commands below:

1. Log into Snowflake and create a database: 

create database if not exists robin_test;

2. Create a schema in the new database:

create schema if not exists robin_test.test_schema;

3. Now create the table to load the desk export CSV into:

create or replace table robin_test.test_schema.desk_export (
    organization string,
    building string,
    floor string,
    desk_group string,
    desk_id number,
    desk_name string,
    reservation_id string,
    series_id number,
    type string,
    "START" timestamp_ntz,
    "END" timestamp_ntz,
    hour_duration number,
    checked_in_at timestamp_ntz,
    canceled_at timestamp_ntz,
    cancellation_type string,
    creator_name string,
    creator_department string,
    creator_groups string,
    creator_email string,
    assignee_name string,
    assignee_department string,
    assignee_groups string,
    assignee_email string);

4. In order for Snowflake to know how to handle the uploaded desk export file, you need to specify the format of the file. The example below defines a CSV file format named "robincsv," along with a few settings that ensure the API data loads successfully.

create or replace file format robincsv
  type = 'CSV'
  field_delimiter = ','
  record_delimiter = '\n'
  field_optionally_enclosed_by = '0x22'
  skip_header = 1;

5. Next, define a stage in Snowflake.  A stage is essentially an intermediate area where the data resides in Snowflake before it's loaded into a table. The example below defines the stage named "robin_csv_stage" and tells Snowflake that the files we'll upload to this stage will match the CSV file format we defined above. 

create or replace stage robin_csv_stage
  file_format = robincsv;
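If you'd rather script this setup than paste each statement into the console, the same statements can be driven from Python. The sketch below only builds and runs the statement list; the connection itself (commented out) assumes the snowflake-connector-python package and your own credentials, and the table DDL from step 3 is omitted for brevity:

```python
# Each entry mirrors one setup step from this article (table DDL omitted).
SETUP_STATEMENTS = [
    "create database if not exists robin_test",
    "create schema if not exists robin_test.test_schema",
    """create or replace file format robincsv
         type = 'CSV'
         field_delimiter = ','
         record_delimiter = '\\n'
         field_optionally_enclosed_by = '0x22'
         skip_header = 1""",
    "create or replace stage robin_csv_stage file_format = robincsv",
]

def apply_setup(cursor):
    """Run each setup statement in order on an open cursor."""
    for statement in SETUP_STATEMENTS:
        cursor.execute(statement)

# Hypothetical usage (requires snowflake-connector-python and credentials):
# import snowflake.connector
# conn = snowflake.connector.connect(user='...', password='...', account='...')
# apply_setup(conn.cursor())
```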

6. At this point, you should have a file with data stored locally, a defined file format, and a stage associated with that file format. Now you're ready to load the data into Snowflake. To do that, install the SnowSQL CLI.

7. After the SnowSQL CLI is installed and configured to authenticate to your Snowflake account, run the command below via the CLI to load your desk export data to your stage:

put file:///<path>/<to>/<file>/robin_desk_export.csv @robin_csv_stage auto_compress=true;

This tells the CLI to load your data to your stage and automatically compress the data for efficient storage.  
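Before staging the file, it can help to confirm the CSV's header row matches the column order of the desk_export table; otherwise the copy step may skip the file. A small standard-library sketch – the column list mirrors the table definition above, and the normalization (lowercasing, spaces to underscores) is an assumption about how the export labels its headers:

```python
import csv

# Column order from the desk_export table defined earlier in this article.
EXPECTED_COLUMNS = [
    'organization', 'building', 'floor', 'desk_group', 'desk_id',
    'desk_name', 'reservation_id', 'series_id', 'type', 'start', 'end',
    'hour_duration', 'checked_in_at', 'canceled_at', 'cancellation_type',
    'creator_name', 'creator_department', 'creator_groups', 'creator_email',
    'assignee_name', 'assignee_department', 'assignee_groups', 'assignee_email',
]

def header_matches(path):
    """Return True if the CSV's header row matches the expected columns."""
    with open(path, newline='') as f:
        header = next(csv.reader(f))
    return [h.strip().lower().replace(' ', '_') for h in header] == EXPECTED_COLUMNS

# Hypothetical usage:
# header_matches('robin_desk_export.csv')
```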

8. Finally, you're ready to load the data from your stage to your table using the Snowflake Web console: 

copy into desk_export
  from @robin_csv_stage/robin_desk_export.csv.gz
  file_format = (format_name = robincsv)
  on_error = 'skip_file';

At this point, you can query your table to see the data that you exported from the Robin Analytics API earlier:

select * from robin_test.test_schema.desk_export;
