
Integrate Basin with Amazon EC2

Appy Pie Connect allows you to automate multiple workflows between Basin and Amazon EC2

  • No code
  • No Credit Card
  • Lightning Fast Setup
20 Million man hours saved

Award Winning App Integration Platform

About Basin

Basin is a basic form backend that lets you collect data from submissions without writing a single line of code.

About Amazon EC2

Amazon Elastic Compute Cloud (Amazon EC2) is a web service that provides secure, reliable, scalable, and low-cost computational resources. It gives developers the tools to build virtually any web-scale application.

Amazon EC2 Integrations

Best ways to Integrate Basin + Amazon EC2

  • Basin Integration Amazon EC2 Integration

    Basin + Amazon EC2

    Start Stop or Reboot Instance in Amazon EC2 when a New Submission is created in Basin
    When this happens...
    Basin Integration New Submission
     
    Then do this...
    Amazon EC2 Integration Start Stop or Reboot Instance
  • Basin Integration Salesforce Integration

    Basin + Salesforce

    Add new Basin submissions to Salesforce as leads.
    When this happens...
    Basin Integration New Submission
     
    Then do this...
    Salesforce Integration Create Record
    Transform any Basin submission into a lead in Salesforce. This Basin-Salesforce integration will automatically create leads in your Salesforce account corresponding to new Basin submissions so that you can focus on moving them down the funnel, not wrangling with data entry.
    How This Basin-Salesforce Integration Works
    • A new form submission is received on Basin
    • Appy Pie Connect adds new lead to Salesforce
    What You Need
    • Basin account
    • Salesforce account
  • Basin Integration AWeber Integration

    Basin + AWeber

    Add new AWeber subscribers from new form submissions in Basin.
    When this happens...
    Basin Integration New Submission
     
    Then do this...
    AWeber Integration Create Subscriber
    Use this Appy Pie Connect integration to instantly add new customers from Basin into your AWeber account. By enabling this Basin-AWeber integration, every new submission received in Basin will be automatically added to your AWeber account as a new subscriber. This is a great way to kick off successful email campaigns complete with the correct details automatically.
    How This Basin-AWeber Integration Works
    • A new form submission is received on Basin
    • Appy Pie Connect adds that contact to AWeber as new subscriber
    What You Need
    • Basin account
    • AWeber account
  • Basin Integration Google Sheets Integration

    Basin + Google Sheets

    Create Google Sheets rows from new Basin form submissions.
    When this happens...
    Basin Integration New Submission
     
    Then do this...
    Google Sheets Integration Create Spreadsheet Row
    Get the most out of your new Basin forms by connecting them to Google Sheets. This Basin-Google Sheets integration will create a row in a Google sheet each time a user submits your Basin form, allowing you to keep a historical record of all the data you've collected. Each row in your spreadsheet will be a unique submission.
    How This Integration Works
    • A new form submission is received on Basin
    • Appy Pie Connect creates a new row in your Google Sheets spreadsheet
    What You Need
    • Basin account
    • Google Sheets account
  • Basin Integration Gmail Integration

    Basin + Gmail

    Create a draft in Gmail from a New Submission in Basin.
    When this happens...
    Basin Integration New Submission
     
    Then do this...
    Gmail Integration Create Draft
Connect Basin + Amazon EC2 in an easier way

It's easy to connect Basin + Amazon EC2 without coding knowledge. Start creating your own business flow.

    Triggers
  • New Submission

    Triggers when a user submits to your form.

  • New Instance

    Triggers when a new instance is created.

  • New Scheduled Event

    Triggers when a new event is scheduled for one of your instances.

    Actions
  • Start Stop or Reboot Instance

    Starts, stops, or reboots the specified instance.
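
    Under the hood, the single "Start Stop or Reboot Instance" action maps onto three distinct EC2 API operations. A minimal sketch of that dispatch follows; the dispatch table is our illustration, not Appy Pie Connect's actual implementation:

    ```python
    # Sketch: dispatching the "Start Stop or Reboot Instance" action onto
    # the three underlying EC2 API operations (StartInstances, StopInstances,
    # RebootInstances are real EC2 operation names; the dispatch function
    # itself is hypothetical).

    EC2_OPERATIONS = {
        "start": "StartInstances",
        "stop": "StopInstances",
        "reboot": "RebootInstances",
    }

    def ec2_operation_for(action: str) -> str:
        """Return the EC2 API operation name for a requested action."""
        try:
            return EC2_OPERATIONS[action.lower()]
        except KeyError:
            raise ValueError(f"unsupported action: {action!r}")

    print(ec2_operation_for("reboot"))  # -> RebootInstances
    ```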

Compliance Certifications and Memberships

Highly rated by thousands of customers all over the world

We’ve been featured on

Page reviewed by: Abhinav Girdhar  | Last Updated on July 01, 2022 5:55 am

How Basin & Amazon EC2 Integrations Work

  1. Step 1: Choose Basin as a trigger app and authenticate it on Appy Pie Connect.

    (30 seconds)

  2. Step 2: Select a trigger from the Triggers List.

    (10 seconds)

  3. Step 3: Pick Amazon EC2 as an action app and authenticate.

    (30 seconds)

  4. Step 4: Select a resulting action from the Action List.

    (10 seconds)

  5. Step 5: Select the data you want to send from Basin to Amazon EC2.

    (2 minutes)

  6. Your Connect is ready! It's time to start enjoying the benefits of workflow automation.

Integration of Basin and Amazon EC2

Basin is a community effort to build a global decentralized database service. It is open source, and the purpose of the project is to provide unlimited storage via IPFS. Last year Amazon announced Amazon EBS (Elastic Block Store), a block-level storage service for volumes that offers three volume types: Magnetic, General Purpose SSD (gp2), and Provisioned IOPS (io1). With these volumes one can store data as files or as objects, and that data can be used in any cloud such as AWS, Azure, or Google Cloud. Basin can therefore serve as the interface between Amazon EBS and IPFS.

Amazon EBS is already integrated with S3 (Simple Storage Service). S3 provides simple storage services and also makes use of IPFS. S3 offers a way to store data at rest, but it does not by itself provide access control, encryption, or authentication. S3 is often used for web hosting purposes such as websites, photos, and videos.

Basin will act as the interface between IPFS and Amazon EBS. It will store data as files in encrypted form, provide access controls for the data using a cryptographic key based on an encryption scheme, and use Amazon's IAM for authentication.

    Integration of Basin and Amazon EC2

Basin will be implemented as a plugin to the existing IPFS daemon that ships with the EC2 image. The API responsible for interaction between the Basin plugin and the IPFS daemon is provided by the Basin application itself. The complete process of accessing Amazon EBS volumes from Basin through EC2 is explained below:
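
A plugin talking to a local IPFS daemon would typically go through the daemon's HTTP RPC API. A minimal sketch of building such a request URL, assuming the standard go-ipfs/Kubo defaults (RPC on 127.0.0.1:5001, commands under /api/v0/); the Basin plugin and the example CID are hypothetical:

```python
# Sketch: building a request URL for the IPFS daemon's HTTP RPC API,
# which a plugin such as Basin could call. The default RPC address and
# the /api/v0/ endpoint layout are go-ipfs/Kubo conventions.
from urllib.parse import urlencode

IPFS_API = "http://127.0.0.1:5001/api/v0"

def ipfs_api_url(command: str, **params: str) -> str:
    """Return the full RPC URL for an IPFS command, e.g. 'add' or 'cat'."""
    url = f"{IPFS_API}/{command}"
    if params:
        url += "?" + urlencode(params)
    return url

print(ipfs_api_url("cat", arg="QmExampleCID"))  # hypothetical CID
```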

Step 1. First of all, the user logs in to EC2 on AWS. The instance should run a Linux OS, and the user chooses an instance type according to his needs. This involves creating a new instance on EC2: one must have an AWS account and create the instance from an Amazon Machine Image (AMI) built on the Amazon Linux AMI, with the tools needed to run the Basin plugin pre-installed. Here we assume our EC2 instance uses a CentOS 7 AMI with IPFS daemon version 0.4.14 pre-installed. When IPFS starts up, it loads its modules into memory, so making changes to the code requires restarting the IPFS daemon. After launching the instance, you can get its IP address from the AWS console. The console lets users connect to instances from their browser via Secure Shell (SSH) or Remote Desktop Protocol (RDP) using their public and private keys; these keys need to be generated before launching the instance itself.
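
If you script the launch in Step 1 (for example with boto3's ec2_client.run_instances(**launch_params)), the call takes a parameter dictionary along these lines. The AMI ID, instance type, and key pair name below are placeholders, not real identifiers:

```python
# Sketch: parameters for EC2's RunInstances call, matching the launch
# described in Step 1. All identifiers here are placeholders.
launch_params = {
    "ImageId": "ami-0123456789abcdef0",  # hypothetical CentOS 7 AMI ID
    "InstanceType": "t2.micro",          # choose per workload
    "KeyName": "basin-demo-key",         # key pair generated beforehand
    "MinCount": 1,
    "MaxCount": 1,
}

# With boto3 this would be: ec2_client.run_instances(**launch_params)
print(sorted(launch_params))
```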

Step 2. After logging into the EC2 instance, update it with all the latest packages available in the AWS repository using the yum update command in the Linux terminal, and update the IPFS daemon with a git pull.

Step 3. Next, install the Basin application by following the instructions given at https://basin-project.github.io/docs/installation/aws/index.html . Once installed, there are two ways of interacting with Basin: the command line interface (CLI) or the Python API. The CLI provides more options than the Python API and is easier to understand when you are just starting out with Basin, so we will use the CLI. To start, cd into the basin directory inside the GOPATH and run basin commands from there. You can list all available commands by typing basin --help in the Linux terminal, and see the options related to setting up Basin on Amazon EBS volumes with basin setup --help. All essential information about running Basin is in the official documentation at https://basin-project.github.io/docs/running-basin/index.html . There are two ways of accessing Basin from an Amazon EBS volume: via the SFTP protocol, or via the NFS protocol provided by Amazon EBS itself. SFTP is popular among users who want some security over their data, whereas NFS lets you manage permissions on a per-directory basis. We will use NFS here because it gives more flexibility in managing access controls while maintaining some level of encryption over the data: encryption is done using the user's public key, which is stored in the user's home directory on the Amazon EBS volume itself.
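
Choosing the NFS route means mounting the Basin-exported volume on the client. A minimal sketch of composing that mount command; the export host, export path, and mount point are hypothetical placeholders to be replaced with the values your Basin setup reports:

```python
# Sketch: composing the NFS mount command for the Basin-exported volume.
# Host and paths are hypothetical placeholders.
import shlex

def nfs_mount_cmd(host: str, export: str, mountpoint: str) -> str:
    """Build a 'mount -t nfs' command line for the given export."""
    argv = ["sudo", "mount", "-t", "nfs", f"{host}:{export}", mountpoint]
    return shlex.join(argv)

print(nfs_mount_cmd("10.0.0.12", "/basin", "/mnt/basin"))
```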

To get started with setting up the Basin plugin on Amazon EBS volumes, first create a bucket on S3, and then create an IAM policy for the user that needs to access the new bucket. This can be done in the AWS IAM console, under the Policies section, from Policy Templates. The policy must grant exactly four permissions to that user and no others: s3:GetObject, s3:PutObject, s3:DeleteObject, and s3:ListBucket (see https://docs.aws.amazon.com/AmazonS3/latest/dev/access-policy-variables-s3-actions-full-control.html#s3_permissions ). After creating the policy, attach it to the new bucket: in the S3 console, select the bucket under Resources > Buckets, open the Access Control tab, and click Attach Policy. The Show Attachments button lists all policies attached to the bucket; select the newly created IAM policy and click Update bucket. Finally, under the Permissions section, use Set Permissions to grant write access on the bucket to the user that needs read/write access, while granting read permission only to users that merely need to read objects from the bucket.
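
The minimal IAM policy described above can be written out as a JSON document. A sketch, with a placeholder bucket name; scope the Resource ARNs to your own bucket:

```python
# Sketch: the minimal IAM policy granting only the four S3 permissions
# discussed above. The bucket name is a placeholder.
import json

BUCKET = "basin-demo-bucket"  # hypothetical bucket name

policy = {
    "Version": "2012-10-17",
    "Statement": [
        {
            # Object-level permissions apply to keys inside the bucket.
            "Effect": "Allow",
            "Action": ["s3:GetObject", "s3:PutObject", "s3:DeleteObject"],
            "Resource": f"arn:aws:s3:::{BUCKET}/*",
        },
        {
            # ListBucket is a bucket-level permission, so its ARN has no /*.
            "Effect": "Allow",
            "Action": ["s3:ListBucket"],
            "Resource": f"arn:aws:s3:::{BUCKET}",
        },
    ],
}

print(json.dumps(policy, indent=2))
```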

Once this is done, create the directory structure for storing data inside the newly created bucket; here we call it the test directory. Give appropriate permissions to this directory as well.

Now download the IAM user credentials file with the wget command in the Linux terminal, following https://basin-project.github.io/docs/getting-credentials/index.html . Give the file appropriate permissions by running chmod 600 iam_credentials once it is downloaded. The access credentials are then ready for use.
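
The chmod 600 step restricts the credentials file to owner-only read/write. A sketch of the same operation from Python, using a temporary file as a stand-in for the real iam_credentials file:

```python
# Sketch: restricting a credentials file to owner read/write only,
# i.e. the equivalent of "chmod 600 iam_credentials". A temporary
# file stands in for the real credentials file.
import os
import stat
import tempfile

with tempfile.NamedTemporaryFile(delete=False) as f:
    creds_path = f.name  # stand-in for iam_credentials

os.chmod(creds_path, 0o600)  # owner rw, no group/other access
mode = stat.S_IMODE(os.stat(creds_path).st_mode)
print(oct(mode))  # -> 0o600

os.remove(creds_path)  # clean up the stand-in file
```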

We are now ready to set up an AWS security group that admits onto the AWS network only packets originating from the EC2 instance running our Basin plugin, so that there are no major security issues: nobody wants unauthorized traffic coming into their network from outside sources such as hackers. In the AWS console, open Network & Security > Security Groups and apply a TCP rule for port 22 (SSH) restricted to the Basin EC2 instance, so that SSH is open for that instance only. In addition, ports 80, 443, 5500, 8080, 9001, 50070, 50075, 50076, and 50077 are allowed through the firewall, again only for the EC2 instance that runs the Basin plugin.
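
The security-group setup above can be sketched as data: SSH on port 22 plus the application ports, each restricted to the Basin instance. The rule structure below is our illustration, not an AWS API shape:

```python
# Sketch: the inbound TCP rules for the Basin security group.
# The rule dictionaries are illustrative, not an AWS API payload;
# "basin-instance" is a placeholder for the instance's source.
app_ports = [80, 443, 5500, 8080, 9001, 50070, 50075, 50076, 50077]

allowed_ports = [22] + app_ports  # SSH first, then application ports

ingress_rules = [
    {"protocol": "tcp", "port": p, "source": "basin-instance"}
    for p in allowed_ports
]

print(allowed_ports)
```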

The process to integrate Basin and Amazon EC2 may seem complicated and intimidating. This is why Appy Pie Connect has come up with a simple, affordable, and quick solution to help you automate your workflows. Click on the button below to begin.