php – mails from Amazon EC2 server are not sent, they stay stuck in mailq

I host my website on an Amazon EC2 Ubuntu 18.04 instance.
I send the contact form details to my email address using the PHP mail() function.
The emails just sit in the mail queue (mailq) and I never receive the contact form notification.
What do I need to install or configure?
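
By default, EC2 throttles outbound traffic on port 25, and PHP's mail() simply hands the message to the local MTA, which then queues it; that matches messages piling up in mailq. A common workaround is to send through an authenticated SMTP relay on port 587 (Amazon SES or any other SMTP provider) instead of the local MTA. A minimal sketch with PHPMailer, assuming such a relay is available; the host, credentials and addresses below are placeholders:

<?php
// Sketch: send the contact form through an authenticated SMTP relay on port 587
// instead of the local MTA. Host, credentials and addresses are placeholders.
use PHPMailer\PHPMailer\PHPMailer;
use PHPMailer\PHPMailer\Exception;

require 'vendor/autoload.php';

$mail = new PHPMailer(true);
try {
    $mail->isSMTP();
    $mail->Host       = 'smtp.example.com';   // e.g. an SES SMTP endpoint or another relay
    $mail->SMTPAuth   = true;
    $mail->Username   = 'smtp-user';
    $mail->Password   = 'smtp-password';
    $mail->SMTPSecure = PHPMailer::ENCRYPTION_STARTTLS;
    $mail->Port       = 587;                  // port 587 is not throttled the way port 25 is

    $mail->setFrom('noreply@example.com', 'Contact form');
    $mail->addAddress('me@example.com');
    $mail->Subject = 'New contact form submission';
    $mail->Body    = $_POST['message'] ?? '';

    $mail->send();
} catch (Exception $e) {
    error_log('Contact mail could not be sent: ' . $mail->ErrorInfo);
}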

amazon dynamodb – When the partition key is the primary key, is it possible that 2 different items are saved in the same partition?

When the partition key is the primary key, is it possible that 2 different items are saved in the same partition?

I have checked the documentation, and it says that DynamoDB calculates a hash of the partition key to decide which partition an item will be stored in.
Is it possible for 2 different items to have the same hash value?
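
DynamoDB's internal hash function is not public, so the sketch below is only a toy illustration of the general idea (crc32 and the fixed partition count are stand-ins, not what DynamoDB actually uses): a large space of distinct partition-key values is mapped onto a small number of physical partitions, so different items routinely end up in the same partition even when their keys differ.

<?php
// Toy illustration only: DynamoDB's real hash function is internal, and partitions
// are managed automatically. The point is that many distinct partition-key values
// map onto a small, fixed set of partitions, so different items share partitions.
function partition_for(string $partitionKey, int $numPartitions): int
{
    return crc32($partitionKey) % $numPartitions;
}

$numPartitions = 4; // pretend the table currently has 4 physical partitions

foreach (['user#1001', 'user#1002', 'user#1003', 'user#1004', 'user#1005'] as $key) {
    printf("%s -> partition %d\n", $key, partition_for($key, $numPartitions));
}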

Amazon Web Services – AWS Multiple SNS Topics and Single Lambda

I have several Lambdas publishing messages to several SNS topics, and all of these messages will be received by another, single Lambda. Please clarify whether there is a performance issue with this model.

The implementation would be as below.

Lambda 1 -> SNS Topic 1

Lambda 2 -> SNS Topic 2

.
.

Lambda N -> SNS Topic N

SNS topic triggers (1 to N) -> Lambda DM

Lambda DM – separates the messages according to which topic/subscription they came from and sends them to API Gateway.

Please advise on what problems I might face.
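
On the receiving side, the single Lambda can tell which topic each message came from by looking at the TopicArn inside the SNS event records. Below is a minimal sketch of that routing step, written as a plain PHP function over the decoded event payload; the handler wiring, the topic ARNs and the forward_to_api_gateway() helper are assumptions for illustration only:

<?php
// Sketch of the "Lambda DM" routing step. SNS delivers an event containing one or
// more Records, each carrying the TopicArn it was published to. The ARNs and the
// forward_to_api_gateway() helper below are placeholders, not real resources.
function handle_sns_event(array $event): void
{
    foreach ($event['Records'] ?? [] as $record) {
        $topicArn = $record['Sns']['TopicArn'] ?? '';
        $message  = $record['Sns']['Message'] ?? '';

        switch ($topicArn) {
            case 'arn:aws:sns:us-east-1:123456789012:topic-1':
                forward_to_api_gateway('/route-1', $message);
                break;
            case 'arn:aws:sns:us-east-1:123456789012:topic-2':
                forward_to_api_gateway('/route-2', $message);
                break;
            default:
                error_log("Unhandled topic: {$topicArn}");
        }
    }
}

// Placeholder for whatever HTTP call is made to the API Gateway endpoint.
function forward_to_api_gateway(string $path, string $message): void
{
    // e.g. a signed HTTP request to the API Gateway endpoint
}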

amazon web services – How do I determine the cause of slow I/O speeds in Windows on my EBS provisioned io1 disks in AWS?

In AWS, I have a server running Windows Server 2016 Datacenter. There are two EBS io1 disks attached to this server, provisioned for up to 9,600 IOPS. The E: drive is 2.24 TB in size and uses the GPT partition style, while the D: drive is 1.99 TB in size and uses the MBR partition style.

I'm trying to move around 1.7 TB of data (mostly a few large SQL Server database files) from drive D: to drive E:. The write speeds shown in the Windows file-copy dialog had a short burst of around 120 MB/s but averaged 30 MB/s. The write IOPS I see on the E: drive averaged about 200, but are now much lower.

How can I debug the bottleneck behind my local file transfer speeds between the two drives? (Note: I have never seen write IOPS above 200 on drive E:, even though it is provisioned for 9,600 IOPS.)
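
One way to narrow this down is to compare what Windows reports with the volume's own CloudWatch metrics (VolumeWriteOps and VolumeWriteBytes). If CloudWatch also shows only about 200 write operations per second, the limit is upstream of the volume (copy pattern, instance-level EBS throughput, and so on) rather than the provisioned IOPS. A rough sketch with the AWS SDK for PHP, assuming default credentials; the region and volume ID are placeholders:

<?php
// Sketch: pull write IOPS and write throughput for an EBS volume from CloudWatch
// to compare against what Windows reports. Region and volume ID are placeholders.
require 'vendor/autoload.php';

use Aws\CloudWatch\CloudWatchClient;

$cw = new CloudWatchClient([
    'version' => 'latest',
    'region'  => 'us-east-1',
]);

foreach (['VolumeWriteOps', 'VolumeWriteBytes'] as $metric) {
    $result = $cw->getMetricStatistics([
        'Namespace'  => 'AWS/EBS',
        'MetricName' => $metric,
        'Dimensions' => [['Name' => 'VolumeId', 'Value' => 'vol-0123456789abcdef0']],
        'StartTime'  => strtotime('-1 hour'),
        'EndTime'    => time(),
        'Period'     => 300,            // 5-minute buckets
        'Statistics' => ['Sum'],
    ]);

    foreach ($result['Datapoints'] as $dp) {
        // Divide the 5-minute sum by 300 to get a per-second average
        printf("%s %s: %.1f/s\n", $metric, $dp['Timestamp']->format('H:i'), $dp['Sum'] / 300);
    }
}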

amazon web services – How to list all private images (AMIs) in AWS CloudFormation parameters

How can I list all private AMIs in the dropdown menu of the Service Catalog?

AWSTemplateFormatVersion : 2010-09-09
Description: "simple web layer"
Parameters:
  ImageId:
    Description: 'web Layer'
    Type: 'AWS::SSM::Parameter::Value'
    AllowedPattern: "^[a-zA-Z][-a-zA-Z0-9]*$"
    Default: ami-244333
    OwnerId: '836749474673'

Error: Invalid parameter property 'OwnerId'
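
As far as I know, OwnerId is not a valid property on a CloudFormation parameter (valid ones are things like Type, Default, AllowedValues, AllowedPattern and Description), and there is no built-in parameter type that enumerates your private AMIs into a dropdown. One workaround is to look the AMIs up with the EC2 API and feed the result into the template, for example as AllowedValues or by writing the chosen ID into the SSM parameter the template reads. A minimal lookup sketch with the AWS SDK for PHP, assuming default credentials and a placeholder region:

<?php
// Sketch: list the AMIs owned by this account ("private" images) so they can be
// fed into a CloudFormation/Service Catalog parameter. Assumes default credentials.
require 'vendor/autoload.php';

use Aws\Ec2\Ec2Client;

$ec2 = new Ec2Client([
    'version' => 'latest',
    'region'  => 'us-east-1',   // placeholder region
]);

$result = $ec2->describeImages([
    'Owners' => ['self'],       // only AMIs owned by the calling account
]);

foreach ($result['Images'] as $image) {
    printf("%s  %s\n", $image['ImageId'], $image['Name'] ?? '(no name)');
}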

amazon s3 – Uploading photos to AWS S3 Bucket using PHP

I use Heroku to host my application, which lets users upload photos. As you may know, Heroku's filesystem is ephemeral, so I'm trying to save the photos to my S3 bucket instead. But I get the following errors:

PHP Notice:  Undefined variable: s3 in /app/includes/photograph.php on line 93

PHP Fatal error:  Uncaught Error: Call to a member function upload() on null in /app/includes/photograph.php:93

I don't have a lot of experience with S3, so I'm not sure whether I've even put the code in the right place.
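
For comparison, a standalone upload call with the AWS SDK for PHP v3 looks roughly like the sketch below; the bucket, key and file path are placeholders. Note that upload() is called on an S3Client instance, so whatever client you create has to be in scope (or passed in) wherever the call is made.

<?php
// Minimal standalone upload sketch with the AWS SDK for PHP v3.
// Bucket, key and file path are placeholders.
require 'vendor/autoload.php';

use Aws\S3\S3Client;
use Aws\Exception\AwsException;

$s3 = new S3Client([
    'version' => '2006-03-01',
    'region'  => 'us-east-1',
]);

try {
    $result = $s3->upload(
        'my-bucket',                      // bucket
        'uploads/photo.jpg',              // key
        fopen('/tmp/photo.jpg', 'rb'),    // body
        'public-read'                     // ACL
    );
    echo $result['ObjectURL'] . PHP_EOL;
} catch (AwsException $e) {
    echo $e->getMessage() . PHP_EOL;
}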

photograph.php

<?php
// The S3 client is created in the global scope of this include
$s3 = new Aws\S3\S3Client(array(
    'version'  => '2006-03-01',
    'region'   => 'us-east-1',
));

$bucket = getenv('S3_BUCKET')?: die('No "S3_BUCKET" config var in found in env!');
class Photograph extends DatabaseObject {

    protected static $table_name = "photographs";
    protected static $db_fields = array('id','filename', 'type', 'size', 'caption');

    public $id;
    public $filename;
    public $type;
    public $size;
    public $caption;
    private $temp_path;
    protected $upload_dir;
    public $errors = array();
    protected $upload_errors = array(
        UPLOAD_ERR_OK => "No errors.",
        UPLOAD_ERR_INI_SIZE => "Larger than upload_max_filesize.",
        UPLOAD_ERR_FORM_SIZE => "Larger than form MAX_FILE_SIZE.",
        UPLOAD_ERR_PARTIAL => "Partial upload.",
        UPLOAD_ERR_NO_FILE => "No file.",
        UPLOAD_ERR_NO_TMP_DIR => "No temporary directory.",
        UPLOAD_ERR_CANT_WRITE => "Can't write to disk.",
        UPLOAD_ERR_EXTENSION => "File upload stopped by extension."
    );


    // Pass in $_FILES['uploaded_file'] as an argument
    public function attach_file($file) {
        // Perform error checking on the form parameters
        if (!$file || empty($file) || !is_array($file)) {
            // error: nothing uploaded or wrong argument usage
            $this->errors[] = "No file was uploaded.";
            return false;
        } elseif ($file['error'] != 0) {
            // error: report what PHP says went wrong
            $this->errors[] = $this->upload_errors[$file['error']];
            return false;
        } else {
            // Set object attributes to the form parameters.
            $this->temp_path = $file['tmp_name'];
            $this->filename = basename($file['name']);
            $this->type = $file['type'];
            $this->size = $file['size'];
            // Don't worry about saving anything to the database yet.
            return true;
        }
    }

    // Common Database Methods
    public static function find_all() {
        return self::find_by_sql("SELECT * FROM ".self::$table_name);
    }

    public function save() {
        // A new record won't have an id yet.
        if (isset($this->id)) {
          //if(!empty($this->id)){
            // Really just to update the caption
            $this->update();
        } else {
            // Make sure there are no errors
            // Can't save if there are pre-existing errors
            if (!empty($this->errors)) {
                return false;
            }

            // Make sure the caption is not too long for the DB
            if (strlen($this->caption) > 255) {
                $this->errors() = "The caption can only be 255 characters long.";
                return false;
            }

            // filename and temp location
            if (empty($this->filename) || empty($this->temp_path)) {
                $this->errors() = "The file location was not available.";
                return false;
            }

            // Determine the target_path


            // line 93 (the line referenced in the error messages)
            $target_path = $s3->upload($bucket, $_FILES['userfile']['name'], fopen($_FILES['userfile']['tmp_name'], 'rb'), 'public-read');

            // Make sure a file doesn't already exist in the target location
            if (file_exists($target_path)) {
                $this->errors() = "The file {$this->filename} already exists.";
                return false;
            }

            // Attempt to move the file 
            if (move_uploaded_file($this->temp_path, $target_path)) {
                // Success
                // Save a corresponding entry to the database
                if ($this->create()) {
                    // We are done with temp_path, the file isn't there anymore
                    unset($this->temp_path);
                    return true;
                }
            } else {
                // File was not moved.
                $this->errors() = "The file upload failed, possibly due to incorrect permissions on the upload folder.";
                return false;
            }
        }
    }

    public function destroy() {
        // First remove the database entry
        if ($this->delete()) {

             $target_path = SITE_ROOT . DS . 'public' . DS . $this->upload_dir . $this->filename;
            return unlink($target_path) ? true : false;
        } else {
            // database delete failed
            return false;
        }
    }

    public function image_path() {
        return $this->upload_dir . DS . $this->filename;
    }

    public function size_as_text() {
        if ($this->size < 1024) {
            return "{$this->size} bytes";
        } elseif ($this->size < 1048576) {
            $size_kb = round($this->size / 1024);
            return "{$size_kb} KB";
        } else {
            $size_mb = round($this->size / 1048576, 1);
            return "{$size_mb} MB";
        }
    }

    public function comments() {
        return Comment::find_comments_on($this->id);
    }

    public static function count_all() {
        global $database;
        $sql = "SELECT COUNT(*) FROM ".self::$table_name;
        $result_set = $database->query($sql);
        $row = $database->fetch_array($result_set);
        return array_shift($row);
    }
}
?>

photo_upload.php

<?php
// assumption: session guard that redirects users who are not logged in
if (!$session->is_logged_in()) {
    redirect_to("login.php");
}
?>
<?php
// assumption: the surrounding form-POST check and the creation of $photo are
// inferred from the code below
if (isset($_POST['submit'])) {
    $photo = new Photograph();
    $photo->caption = $_POST['caption'];
    echo "attempting to attach file now";
    $photo->attach_file($_FILES['file_upload']);
    echo "attempting to save file now";
    if ($photo->save()) {
        // Success
        $session->message("Photograph uploaded successfully.");
        redirect_to('list_photos.php');
    } else {
        // Failure
        $message = join("\n", $photo->errors);
    }
}
?>

Photo Upload

Caption:

Can anyone help?
thank you,
Iron Man

Image hosting – Amazon, DigitalOcean or Google?

I have a VPS.
I want to offload the hosting of images and all other media.

Who should I go with?

Amazon? What should I choose? The options are so confusing!
Google? Same thing here – confusing.
DigitalOcean? This one is my preference – $5 for the CDN with 250 GB included, that would be good!

I'm confused … with Google, for example …
To stay with Google, there seem to be 3 types of storage (please correct me if I'm wrong):

Google Cloud
Google Cloud CDN – which is faster
Personal Google Cloud storage

I already pay for personal storage on Google Cloud.
Can I just use the space I have there?

I read that the CDN is the fastest.
But I suppose the others also have their place?

Can I use my personal Google storage?
If yes … what is preventing someone from opening 1,000 Google accounts and using the free 15 GB of storage on each?
(I guess they have ways to stop this! And I guess the speeds are much slower or something.)

amazon web services – Prevent CloudFront from forwarding part of the path to the origin server

Context:
I have an S3 bucket (Origin 1) which is served as a static website under the domain example.com using CloudFront.

Goal:

I also want example.com/subfolder to serve content from second.com (Origin 2), so that the following is true: example.com/subfolder = second.com

Currently:

Under the CloudFront distribution, I configured Origin 1 with the behavior Default (*)
and Origin 2 with the behavior /subfolder*

Problem:

Going to example.com/subfolder, I am served second.com/subfolder

Q:
How and where do I adjust the CloudFront behavior so that it does not pass the first part of the path on to the origin?

Selling on the Amazon platform – How do you track changes to your listings? Is there a viable application?

How do you track changes to your listings? Is there a viable application? How can you track whether the changes lead to more sales/sessions?

Hi everyone,
This has been a problem for me for quite some time now. I am using a regular Excel spreadsheet to track listing changes manually for now, which is unsustainable and time-consuming, as you know if you do the same.

I have looked at different seller apps, but couldn't find anything substantial. The various apps track specific things like reviews and feedback; Bindwise apps also track changes in titles, descriptions and the like. However, they send you these change alerts in an email, which is an unusable format. Also, one very important thing we would need to track is the "Amazon's Choice" badge and its changes, as well as other badges and active promotions, and for that I can't find anything.
Ideally, I would like to be able to export this information to a CSV file so that I can import it into Excel.

In the end, the goal is of course to increase or maintain sales, so tracking changes is only the first step. Ideally, I would like an app that also correlates listing changes with sales changes, trying to find the changes that correlate most with sales and to see whether the sales changes are statistically significant or just normal daily variation.

Right now, I'm tracking both the changes and their correlation with sales changes by hand, which is incredibly time-consuming and frustrating. The "Amazon's Choice" badge has a huge influence on our sales, but I only track it manually, which means I am sometimes a few days late in noticing a change.

Also, when I'm trying to see whether a change was statistically significant for better or worse, it's hard to tell from the normal graph, so I have to use Excel and plug everything in. At the end of the day, we just don't have the time to do this for all the listings all the time.

Have any of you tried to do or solve this? Any suggestions for apps, or for how you handle it?

I'm so frustrated with this that I'm thinking of having an application programmed for it … It won't be cheap though … :( Do any of you have a similar problem and would you like to potentially participate in this effort?

Thank you for all suggestions!

amazon web services – Let's Encrypt wildcard SSL not working behind AWS ELB

I have two servers behind an ELB on AWS. I am using the new wildcard certificate functionality from Let's Encrypt, but I am getting the following error: ERR_SSL_PROTOCOL_ERROR. The strange part is that sometimes it will work and sometimes it won't.

I generated the certificate using the following command with the latest certbot:

./certbot-auto certonly \
--manual \
--preferred-challenges=dns \
--email tech@example.com \
--server https://acme-v01.api.letsencrypt.org/directory \
--agree-tos \
-d *.example.net

Here is also my curl output

curl -v https://admin.example.net
* Rebuilt URL to: https://admin.example.net/
*   Trying 54.222.163.142...
* TCP_NODELAY set
* Connected to admin.example.net (xxx.xxx.xxx.xxx) port 443 (#0)
* ALPN, offering http/1.1
* Cipher selection: ALL:!EXPORT:!EXPORT40:!EXPORT56:!aNULL:!LOW:!RC4:@STRENGTH
* successfully set certificate verify locations:
*   CAfile: /etc/ssl/certs/ca-certificates.crt
  CApath: /etc/ssl/certs
* TLSv1.2 (OUT), TLS header, Certificate Status (22):
* TLSv1.2 (OUT), TLS handshake, Client hello (1):
* error:140770FC:SSL routines:SSL23_GET_SERVER_HELLO:unknown protocol
* Closing connection 0
curl: (35) error:140770FC:SSL routines:SSL23_GET_SERVER_HELLO:unknown protocol

Any idea why this is an unknown protocol? Ironically, my front-end server, which is not behind the load balancer, works fine.
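
Since it only fails part of the time, one thing worth checking is whether both instances behind the ELB actually answer TLS on port 443: the curl error "unknown protocol" usually means the peer replied with plain HTTP rather than a TLS handshake, and with two backends the behaviour would then depend on which instance the ELB picks. A small PHP sketch that tests the handshake against each backend directly, assuming the ELB is a TCP pass-through listener; the IPs are placeholders:

<?php
// Sketch: test the TLS handshake against each backend instance directly, bypassing
// the ELB, to see whether one of them answers plain HTTP on port 443.
// The IPs below are placeholders; 'peer_name' sets the SNI hostname to present.
$backends = ['10.0.1.10', '10.0.2.10'];

foreach ($backends as $ip) {
    $context = stream_context_create([
        'ssl' => [
            'peer_name'        => 'admin.example.net',
            'verify_peer'      => false,   // only checking whether TLS answers at all
            'verify_peer_name' => false,
        ],
    ]);

    $errno  = 0;
    $errstr = '';
    $conn = @stream_socket_client("ssl://{$ip}:443", $errno, $errstr, 5, STREAM_CLIENT_CONNECT, $context);

    if ($conn === false) {
        echo "{$ip}: TLS handshake failed ({$errno}: {$errstr})\n";
    } else {
        echo "{$ip}: TLS handshake OK\n";
        fclose($conn);
    }
}

If one backend completes the handshake and the other does not, the intermittent errors simply follow whichever instance the load balancer happens to route to.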