Adding a Source - Amazon S3 Cloud Storage

If your data resides in a cloud storage location, you can integrate this source type to securely extract it from that location.

To begin, select a cloud storage provider. The example below uses Amazon S3.


The data from your cloud storage sources is collected via our secure connector integration. After you configure your cloud storage account with your credentials, our platform extracts the data in batches every hour and transfers it to your secure bunker.


Setup Source - Amazon S3

  1. Source Name - Enter a name for the source, then fill in the following details to configure it.

  2. Key Prefix - The delimiter and prefix combination pointing to your files within the bucket, in case they reside inside folders.

    Note -

    a. Key Prefix - The value should be the folder path in which the ad log files are available in your bucket. The bucket name should not be included in this field value.

    For example, if ad logs are present at the location below:

    s3://lifesight-adlogs/dooh/Campaign_Name/logs/adlog.csv

    the values to be filled in are:

    Bucket Name: lifesight-adlogs

    Key Prefix: dooh/Campaign_Name/logs/

    Please provide the key prefix of the parent folder of the ad logs so that the system can automatically pick up multiple ad log files present inside the logs folder.

    b. Please ensure that within your cloud storage infrastructure each campaign has its own bucket to store campaign-specific ad logs.

    c. To enable rapid identification and processing, we recommend a naming convention that includes the campaign name and the date range of the ad logs contained in each ad log file.

  3. Bucket Region - The AWS region where your bucket resides. For example: "us-east-2"

  4. Bucket Name - The name of the Amazon S3 bucket in which your data resides. Bucket names occupy a single global namespace shared across all AWS accounts.

  5. Secret Token - Your AWS Secret Access Key. Together with the Access Key ID, it is used to access buckets through APIs, much like the username/password pair you use to access your AWS Management Console.

  6. Access Key - Your AWS Access Key ID, the counterpart to the Secret Token above.

  7. After filling in all the above details, you can click Next to move to the Schema Mapping section.
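As a quick sanity check before filling in the form, the Bucket Name and Key Prefix values can be derived from a full s3:// object path. The sketch below is illustrative only (it is not part of the platform); it uses the example path from this article:

```python
from urllib.parse import urlparse
import posixpath

def split_s3_uri(uri: str) -> tuple[str, str]:
    """Split an s3:// object URI into (bucket_name, key_prefix).

    The key prefix is the parent folder of the object, with a trailing
    slash, and never includes the bucket name.
    """
    parsed = urlparse(uri)
    if parsed.scheme != "s3":
        raise ValueError(f"not an s3:// URI: {uri}")
    bucket = parsed.netloc
    # Parent folder of the object key, e.g. "dooh/Campaign_Name/logs"
    prefix = posixpath.dirname(parsed.path.lstrip("/"))
    return bucket, (prefix + "/") if prefix else ""

bucket, prefix = split_s3_uri(
    "s3://lifesight-adlogs/dooh/Campaign_Name/logs/adlog.csv"
)
# bucket -> "lifesight-adlogs" (Bucket Name field)
# prefix -> "dooh/Campaign_Name/logs/" (Key Prefix field)
```

Because the prefix points at the parent folder rather than a single file, every ad log file inside that folder is picked up.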


Schema Mapping

  1. If your cloud storage is successfully integrated, you will be taken to the Schema Mapping section, where you will need to map your data to our data schema. To map the attributes, open the drop-down menu under “Factori Data” and select the corresponding attribute.

  2. You can select the “Primary ID” to indicate the primary identifier that you would like to use.

  3. Click “Save” to complete adding the source.
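Conceptually, the mapping built in this step works like a column-to-attribute lookup applied to each incoming record. The sketch below is a minimal illustration with hypothetical column and attribute names; the real attribute list comes from the “Factori Data” drop-down in the UI:

```python
# Hypothetical mapping from source columns to platform attributes.
schema_mapping = {
    "email_address": "email",       # your column -> mapped attribute
    "device_id": "mobile_ad_id",
    "campaign": "campaign_name",
}
primary_id = "email"  # the attribute chosen as Primary ID

def map_record(record: dict) -> dict:
    """Rename a source record's columns to their mapped attributes,
    dropping any columns that were not mapped."""
    return {schema_mapping[col]: value
            for col, value in record.items()
            if col in schema_mapping}

mapped = map_record({"email_address": "a@b.com",
                     "device_id": "123",
                     "extra": "ignored"})
```

Unmapped columns (like `extra` above) are simply left out, and the Primary ID must be one of the mapped attributes so each record can be uniquely identified.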
