From what I’ve read and seen, CloudFront does not consistently identify itself in requests. But you can get around this problem by overriding robots.txt at the CloudFront distribution.
1) Create a new S3 bucket that only contains one file: robots.txt. That will be the robots.txt for your CloudFront domain.
2) Go to your distribution settings in the AWS Console and click Create Origin. Add the bucket.
3) Go to Behaviors and click Create Behavior. Path Pattern: robots.txt; Origin: (your new bucket).
4) Set the robots.txt behavior at a higher precedence (lower number).
5) Go to invalidations and invalidate /robots.txt.
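The precedence rule in step 4 matters because CloudFront evaluates cache behaviors in order and routes each request to the first behavior whose path pattern matches, with the default `*` behavior catching everything else. A minimal sketch of that first-match routing (the behavior list and origin names here are illustrative, not real AWS API objects):

```python
from fnmatch import fnmatch

# Ordered cache behaviors, lowest precedence number first.
# CloudFront path patterns use * and ? wildcards, which fnmatch
# approximates closely enough for this illustration.
BEHAVIORS = [
    ("robots.txt", "robots-bucket"),   # precedence 0: the new S3 bucket
    ("*", "primary-origin"),           # default behavior: your domain
]

def pick_origin(path):
    """Return the origin for a request path, mimicking CloudFront's
    first-match-by-precedence routing."""
    path = path.lstrip("/")
    for pattern, origin in BEHAVIORS:
        if fnmatch(path, pattern):
            return origin
    return None

print(pick_origin("/robots.txt"))   # robots-bucket
print(pick_origin("/index.html"))   # primary-origin
```

If the order were reversed, `*` would match first and /robots.txt would be served from your primary origin, defeating the whole setup.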
Now domainname.cloudfront.net/robots.txt will be served from the bucket and everything else will be served from your domain. You can choose to allow/disallow crawling at either level independently.
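For example, if you want to stop crawlers from indexing the cloudfront.net mirror of your site while leaving your main domain crawlable, the robots.txt you upload to the new bucket could simply be:

```
User-agent: *
Disallow: /
```

Your main domain keeps serving its own robots.txt untouched, so the two policies stay independent.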
Another domain/subdomain will also work in place of a bucket, but why go to the trouble?