- Have Loom ingest textual, machine-generated data from 2-4 source types. It is very important to keep the different source types in separate files, so that Loom's auto-extraction and structure-recognition algorithms can process each format separately.
- Data is to be captured continuously for a period of no less than two weeks.
- It is recommended that the total data volume be at least 100GB, but no more than 1TB.
- Choose data that your teams currently use for root-cause analysis, i.e. data that is analyzed today in order to solve problems.
- The chosen data sources should be "painful" ones, i.e. sources that often experience issues and that cause business impact when unhealthy.
- Make sure the logs are timestamped. In a real-time streaming scenario Loom can handle log lines as they arrive, but when working with a dump file, a timestamp should appear on each line for best results (see the sketch after this list).
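When preparing a dump file, it can help to verify that every line carries a leading timestamp before uploading. Below is a minimal Python sketch that counts untimestamped lines; the ISO-8601 pattern and the example file name are assumptions, so adjust both to your own log format:

```python
import re
import sys

# Assumed format: an ISO-8601-style prefix such as "2017-03-01T12:34:56"
# or "2017-03-01 12:34:56". Adjust the pattern to match your own logs.
TIMESTAMP_RE = re.compile(r"^\d{4}-\d{2}-\d{2}[T ]\d{2}:\d{2}:\d{2}")

def count_untimestamped(path):
    """Return the number of lines in `path` that lack a leading timestamp."""
    missing = 0
    with open(path, encoding="utf-8", errors="replace") as f:
        for line in f:
            if not TIMESTAMP_RE.match(line):
                missing += 1
    return missing

if __name__ == "__main__":
    path = sys.argv[1]  # e.g. "app-server.log" (hypothetical file name)
    print(f"{count_untimestamped(path)} lines without a timestamp in {path}")
```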
- Upload your files to Amazon S3 - best practice is to upload your files to an AWS S3 bucket and configure Loom to read from that bucket (a minimal upload sketch appears after this list).
- Share the files over Secure FTP (SFTP) - send a request to set up an FTP connection to FTP@loomsystems.com, and you will be granted specific credentials and a guide for uploading files via SFTP (see the SFTP sketch after this list).
- Share your files using any of the common cloud utilities (such as Google Drive, Dropbox, Box, or Microsoft's OneDrive) and provide us with read-only credentials. Contact us at email@example.com.
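For the S3 option, a minimal upload sketch using boto3 might look like the following. The bucket name, file names, and key layout are hypothetical; Loom only needs to be pointed at whatever bucket and prefixes you actually use:

```python
import boto3

# Hypothetical bucket name: replace with your own.
BUCKET = "my-loom-poc-data"

# boto3 picks up credentials from the usual AWS config/environment.
s3 = boto3.client("s3")

# Keep each source type in its own file (and, ideally, its own prefix),
# so Loom can run structure recognition on each format separately.
for local_path, key in [
    ("app-server.log", "app-server/app-server.log"),
    ("db-audit.log", "db-audit/db-audit.log"),
]:
    s3.upload_file(local_path, BUCKET, key)
    print(f"uploaded {local_path} -> s3://{BUCKET}/{key}")
```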
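For the SFTP option, once Loom support has issued credentials, the upload could be scripted with the paramiko library along these lines. The host, port, credentials, and remote path below are placeholders, not values provided by Loom:

```python
import paramiko

# Placeholder values: use the host and credentials issued by Loom support.
HOST, PORT = "sftp.example.com", 22
USERNAME, PASSWORD = "your-username", "your-password"

transport = paramiko.Transport((HOST, PORT))
transport.connect(username=USERNAME, password=PASSWORD)
sftp = paramiko.SFTPClient.from_transport(transport)
try:
    # One file per source type, as required above.
    sftp.put("app-server.log", "/uploads/app-server.log")
finally:
    sftp.close()
    transport.close()
```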