
Posts

Showing posts from December, 2016

Is your SQL Server Database backup good enough? Can it save you during a disaster?

Met a DBA from a company that has been in business for ~2 years. During the conversation, I came to know they have never tested their backup files - not even once. They have also very rarely used the DBCC CHECKDB command. Should I call it surprising or shocking? It has become common to see Database Administrators set up a fancy DB backup plan, automate it, and leave it at that. Their idea is that whenever the need arises (read as: disaster strikes) they can make use of it and restore the database. Although that sounds like a good plan in theory, it actually isn't. Why? All those efforts to take regular backups become completely useless if those files can't be used to recover the database. One important question missed by many companies I have consulted for: "Do we regularly make sure that we are able to restore a database from our backups?" Even though the backup process succeeded, how do we know the file isn't corrupted (or) doesn't have some issue which won't allow it to be rest...
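A minimal restore-drill sketch of the kind this post argues for, assuming a local SQL Server reachable via sqlcmd; the backup path D:\Backups\SalesDB.bak, the database names, and the logical file names are illustrative placeholders, not from the post:

# 1. Quick sanity check: is the backup file readable and complete?
sqlcmd -S localhost -E -Q "RESTORE VERIFYONLY FROM DISK = N'D:\Backups\SalesDB.bak';"
# 2. The real test: restore it as a throwaway copy (the logical names here are
#    assumptions; get the real ones via RESTORE FILELISTONLY).
sqlcmd -S localhost -E -Q "RESTORE DATABASE SalesDB_Verify FROM DISK = N'D:\Backups\SalesDB.bak' WITH MOVE 'SalesDB' TO N'D:\Verify\SalesDB_Verify.mdf', MOVE 'SalesDB_log' TO N'D:\Verify\SalesDB_Verify.ldf';"
# 3. Run the integrity check the post mentions, on the restored copy.
sqlcmd -S localhost -E -Q "DBCC CHECKDB (SalesDB_Verify) WITH NO_INFOMSGS;"
# 4. Clean up the throwaway copy.
sqlcmd -S localhost -E -Q "DROP DATABASE SalesDB_Verify;"

Scheduling something like this regularly, even against a spare instance, is what turns "we have backups" into "we can restore".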

Positives of Cyclone Vardah - Unity, Humanity, Self-help

Chennai, Cyclone Vardah (Dec 12, 2016) - I have never witnessed anything like this in person in my life. The winds were so powerful they uprooted close to 15 trees in our colony, and the howling of the wind was so scary it gives me goosebumps even thinking of it now. It was complete carnage by "Vardah" that day. Our colony had ~50 trees, of which we have now lost 15-odd. A minimum of 2 trees fell on each street. The colony compound wall collapsed as one tree fell straight over it. Everywhere we could see only leaves, fallen branches, fallen trees, and cut cables - not an inch of the actual road was clearly visible. "Bad things do happen in the world, like war, natural disasters, disease. But out of those situations always arise stories of ordinary people doing extraordinary things" - Daryn Kagan. The positive side of it was that from each street at least 2 members automatically volunteered, and in no time we were a solid team ...

How I got billed even though I had registered as an AWS Free Tier user

Amazon AWS offers a 12-month free tier to get started with their services. At a high level, I went through this link and created a free tier account to play around. Approximately a month later, I checked the Billing dashboard and was surprised to see that I was getting billed $0.81 for some usage under the heading "Elastic Compute Cloud" (EC2). I had gone through the points mentioned about EC2 in the AWS Free Tier - 12 Months Introductory Period, so I was mindful of those numbers alone. But only after seeing the detailed bill did I understand that there are a few items which get billed on an hourly basis even within the Free Tier. Billing Dashboard (screenshots: Billing Summary, Billing Details). It looks like NAT Gateway & Elastic IP address usage are charged even in the Free Tier:
$0.056 per GB of data processed by NAT Gateways
$0.056 per NAT Gateway hour (I had used it for 6 hrs)
$0.005 per Elastic IP address not attached to a running instance, per hour, on a pro rata basis (I had use...
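For what it's worth, the arithmetic is easy to check: 6 NAT Gateway hours x $0.056 comes to about $0.34, and the data-processing and idle Elastic IP charges presumably make up the remainder. To stop this kind of trickle charge, the NAT Gateway has to be deleted and the idle Elastic IP released; a hedged sketch, with illustrative resource IDs that you would substitute from the describe call:

# List NAT Gateways in the region to find the one accruing hourly charges.
aws ec2 describe-nat-gateways --region ap-south-1
# Delete it (the ID below is a placeholder).
aws ec2 delete-nat-gateway --nat-gateway-id nat-0123456789abcdef0 --region ap-south-1
# Release the unattached Elastic IP (allocation ID is a placeholder).
aws ec2 release-address --allocation-id eipalloc-0123456789abcdef0 --region ap-south-1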

From a Windows Machine - Connect (ssh) to Linux Instances running in a Private Amazon VPC

To start with, the following tools need to be downloaded:
1. PuTTY
2. PuTTYGen
3. Pageant
If you are a Windows user trying to connect to (SSH into) an AWS EC2 instance, then you need to use PuTTY. During the process of provisioning an EC2 instance, you would have created/downloaded a key pair file, which would be in the .pem format. But PuTTY doesn't support the .pem format; it needs the key pair file to be in the .ppk format. That's where PuTTYGen comes into play: it converts a .pem file into a .ppk file with the click of a button. Connecting to an EC2 instance in a public subnet:
1. Open PuTTYGen
2. Load >> choose the .pem file which you want to convert
2.1 [Optional] Provide a key passphrase & confirm the passphrase. For simplicity's sake, I skip it for now.
3. Click on "Save private key"
4. Open PuTTY
5. Enter the hostname / IP - for example: ec2-user@35.154.74.77
6. Copy-paste that into the Saved Sessions textbox as well
7. ...
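As an aside, the same private-subnet connection can be sketched from an OpenSSH client on Windows (e.g. Git Bash or WSL), with agent forwarding playing Pageant's role; this assumes the bastion is the public-subnet host from step 5, and 10.0.1.25 is a placeholder private IP:

# Load the key into the agent instead of copying the .pem to the bastion.
ssh-add mykeypair.pem
# Forward the agent (-A) and jump (-J) through the bastion to the private instance.
ssh -A -J ec2-user@35.154.74.77 ec2-user@10.0.1.25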

AWS fatal error: An error occurred (400) when calling the HeadObject operation: Bad Request

While using AWS and trying to copy a file from an S3 bucket to my EC2 instance, I ended up with this error message.
Command used: aws s3 cp s3://mybucketname/myfilename.html /var/www/html/
Error: fatal error: An error occurred (400) when calling the HeadObject operation: Bad Request
The error goes away if we add the region information to the command. I am using Asia Pacific (Mumbai), so I used ap-south-1 as the region name.
Modified command: aws s3 cp s3://mybucketname/myfilename.html /var/www/html/ --region ap-south-1
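If you'd rather not pass --region on every call, the default region can be set once in the CLI configuration; a small sketch:

# Set ap-south-1 (Mumbai) as the default region for the current profile.
aws configure set region ap-south-1
# After this, the original command works without the --region flag:
aws s3 cp s3://mybucketname/myfilename.html /var/www/html/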