Before you start, consider whether you'd rather use the recommended method of streaming S3 data, which is to set up a Lambda that triggers when new S3 objects are created and streams them to Loom.
To have Loom connect to your AWS S3 bucket and read data directly from it, first make sure that the objects (files) within every "folder" sort alphabetically in the same order as they were written, i.e. the alphabetical ordering of the filenames matches the time-ordering of the events inside the objects.
Here's an example of a good "folder" structure:
| |-- 2019-03-17T11:21:19Z.txt.gz
| |-- 2019-03-18T08:31:11Z.txt.gz
| +-- ...
Here's an example of a bad one:
| |-- part-1.txt.gz
| |-- part-10.txt.gz
| |-- part-2.txt.gz
| +-- ...
Here, part-10 sorts before part-2 alphabetically, even though it was written after it.
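You can check whether a folder's listing meets this requirement by comparing the alphabetical and chronological orderings yourself; here is a minimal Python sketch (all filenames are illustrative, not real bucket contents):

```python
# Check that the alphabetical order of S3 object keys matches the order
# in which they were written (all filenames here are illustrative).

def alphabetical_matches_time(keys_in_write_order):
    """True if sorting the keys alphabetically preserves write order."""
    return sorted(keys_in_write_order) == list(keys_in_write_order)

# ISO-8601 timestamps sort alphabetically in chronological order:
good = ["2019-03-17T11:21:19Z.txt.gz", "2019-03-18T08:31:11Z.txt.gz"]
print(alphabetical_matches_time(good))   # True

# Plain counters do not: "part-10" sorts before "part-2".
bad = ["part-1.txt.gz", "part-2.txt.gz", "part-10.txt.gz"]
print(alphabetical_matches_time(bad))    # False
```

ISO-8601 timestamps with a fixed-width format are a safe naming scheme precisely because their lexicographic order equals their chronological order.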
First step - granting permissions
Go to the AWS console, then to S3. Open the properties of the target bucket.
Under Permissions, add firstname.lastname@example.org as a Grantee and give it read permissions.
If this doesn't work, replace firstname.lastname@example.org with the following and try again:
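If you prefer the command line, the same read grant can be applied with the AWS CLI. This is a sketch, assuming a bucket named my-loom-bucket (the bucket name is a placeholder; the address is the one from above):

```shell
# Grant read access on the bucket's ACL to the address above
# (note: put-bucket-acl replaces the bucket's existing ACL, so
# include any other grants you want to keep).
aws s3api put-bucket-acl \
  --bucket my-loom-bucket \
  --grant-read emailaddress=firstname.lastname@example.org

# Verify the resulting ACL.
aws s3api get-bucket-acl --bucket my-loom-bucket
```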
If you prefer to create a policy for an AWS Account ID, use the following:
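For reference, a bucket policy of that shape typically looks like the sketch below. The bucket name my-loom-bucket and the account ID 123456789012 are placeholders for your own values; substitute the AWS Account ID you were given:

```json
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Sid": "AllowAccountListBucket",
      "Effect": "Allow",
      "Principal": { "AWS": "arn:aws:iam::123456789012:root" },
      "Action": "s3:ListBucket",
      "Resource": "arn:aws:s3:::my-loom-bucket"
    },
    {
      "Sid": "AllowAccountReadObjects",
      "Effect": "Allow",
      "Principal": { "AWS": "arn:aws:iam::123456789012:root" },
      "Action": "s3:GetObject",
      "Resource": "arn:aws:s3:::my-loom-bucket/*"
    }
  ]
}
```

Note that s3:ListBucket applies to the bucket itself while s3:GetObject applies to the objects inside it, which is why the two statements use different Resource ARNs.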
Second step - configuring the data-input in Loom
Open Loom in your browser and go to the data-sources screen.
Click "add a new data source":
Choose S3 and fill out the form. Leave the optional key fields empty; this tells Loom to use its own account keys:
Click "Test & Save" - and you're done!