Before you start, consider whether you'd rather use the recommended method of streaming S3 data: setting up a Lambda function that triggers whenever new S3 objects are created and streams them to Sophie.
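
For reference, here's a minimal sketch of what such a Lambda handler could look like. It assumes gzipped text objects (as in the examples below), and the forward_to_sophie transport - a plain UDP send to a placeholder host and port - is entirely hypothetical; substitute whatever listener your Sophie deployment actually exposes.

import gzip
import socket
import urllib.parse

import boto3

s3 = boto3.client("s3")

# Hypothetical listener - replace with your Sophie deployment's endpoint.
SOPHIE_HOST = "sophie.example.com"
SOPHIE_PORT = 514
sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)

def forward_to_sophie(line):
    # Placeholder transport: one UDP datagram per log line.
    sock.sendto(line.encode("utf-8"), (SOPHIE_HOST, SOPHIE_PORT))

def handler(event, context):
    # Invoked by an s3:ObjectCreated:* notification on the bucket.
    for record in event["Records"]:
        bucket = record["s3"]["bucket"]["name"]
        key = urllib.parse.unquote_plus(record["s3"]["object"]["key"])
        body = s3.get_object(Bucket=bucket, Key=key)["Body"].read()
        for line in gzip.decompress(body).decode("utf-8").splitlines():
            forward_to_sophie(line)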

Prerequisites

To have Loom connect to your AWS S3 bucket and read data directly from it, first make sure that the objects (files) within every "folder" are named so that their alphabetical order matches the time order of the events they contain. ISO 8601 UTC timestamps, as in the example below, have this property: sorting the names alphabetically also sorts them chronologically.

Here's an example of a good "folder" structure:

app1
|-- host1
|    |-- 2019-03-18T08:31:11Z.txt.gz
|    |-- 2019-03-17T11:21:19Z.txt.gz
|    +-- ...
+-- host2
    |-- 2019-03-17T05:30:19Z.txt.gz
    +-- ...

Here's an example of a bad one:

app1
|-- host1-2019-03-18T08:31:11Z.txt.gz
|-- host2-2019-03-17T05:30:19Z.txt.gz
|-- host1-2019-02-10T11:21:19Z.txt.gz
+-- ...
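
If you want to verify a prefix programmatically, here's a small boto3 sketch (the bucket name and prefix are placeholders). It compares S3's alphabetical listing order against the objects' LastModified timestamps - a reasonable proxy for event time when files are uploaded as they're produced:

import boto3

s3 = boto3.client("s3")
bucket = "your-log-bucket"  # placeholder
prefix = "app1/host1/"      # one "folder" per source

# list_objects_v2 returns keys in alphabetical (UTF-8 binary) order.
paginator = s3.get_paginator("list_objects_v2")
objects = [
    obj
    for page in paginator.paginate(Bucket=bucket, Prefix=prefix)
    for obj in page.get("Contents", [])
]

by_name = [o["Key"] for o in objects]
by_time = [o["Key"] for o in sorted(objects, key=lambda o: o["LastModified"])]

print("ordering OK" if by_name == by_time else "ordering MISMATCH")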

First step - granting permissions

Go to the AWS console, then to S3, and open the properties of the target bucket.
Under Permissions, add aws@loomsystems.com as a Grantee and give it read permissions.
If that doesn't work, replace aws@loomsystems.com with Loom's canonical user ID and try
again: 09bf21d91fa6deeeb28e81c2d5755494adf8e0650f623ce8f63b865c7721c489
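
If you'd rather script the grant, here's a minimal boto3 sketch (the bucket name is a placeholder). It appends a READ grant for Loom's canonical user ID while preserving the bucket's existing ACL entries:

import boto3

s3 = boto3.client("s3")
bucket = "your-log-bucket"  # placeholder

# Loom's canonical user ID from the step above.
LOOM_CANONICAL_ID = "09bf21d91fa6deeeb28e81c2d5755494adf8e0650f623ce8f63b865c7721c489"

# Fetch the current ACL first so existing grants aren't overwritten.
acl = s3.get_bucket_acl(Bucket=bucket)
grants = acl["Grants"]
grants.append({
    "Grantee": {"Type": "CanonicalUser", "ID": LOOM_CANONICAL_ID},
    "Permission": "READ",
})

s3.put_bucket_acl(
    Bucket=bucket,
    AccessControlPolicy={"Grants": grants, "Owner": acl["Owner"]},
)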

If you prefer to create a bucket policy for an AWS account ID instead, use the following account ID:
5330-7262-0987
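
For example, a bucket policy granting that account read access could be attached like this. This is a sketch: the bucket name is a placeholder, and the exact actions (s3:GetObject to read objects plus s3:ListBucket to enumerate them) are an assumption about what Loom needs to read your data:

import json

import boto3

s3 = boto3.client("s3")
bucket = "your-log-bucket"  # placeholder

# The account ID above, written without dashes in the principal ARN.
loom_principal = {"AWS": "arn:aws:iam::533072620987:root"}

policy = {
    "Version": "2012-10-17",
    "Statement": [
        {
            "Sid": "LoomReadObjects",
            "Effect": "Allow",
            "Principal": loom_principal,
            "Action": "s3:GetObject",
            "Resource": "arn:aws:s3:::" + bucket + "/*",
        },
        {
            "Sid": "LoomListBucket",
            "Effect": "Allow",
            "Principal": loom_principal,
            "Action": "s3:ListBucket",
            "Resource": "arn:aws:s3:::" + bucket,
        },
    ],
}

s3.put_bucket_policy(Bucket=bucket, Policy=json.dumps(policy))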

Second step - configuring the data-input in Loom 

Open Loom in your browser, go to the data-sources screen, and click "Add a new data source".

Choose S3 and fill out the form. Leave the optional key fields empty - this tells Loom to use its own account keys, relying on the permissions you granted in the first step.

Click "Test & Save" - and you're done!
