Saturday, December 17, 2016

Is your SQL Server Database backup good enough? Can it save you during a disaster?

I met a DBA from a company that has been in business for ~2 years. During the conversation, I came to know that they have never tested their backup files - not even once. Also, they have very rarely used the DBCC CHECKDB command. Should I say I was surprised or shocked?

It has become common to see Database Administrators put together a fancy DB backup plan, automate it, and leave it at that. Their idea is that whenever the need arises (read: disaster strikes), they can make use of it and restore the database. Although that sounds like a good plan in theory, it actually isn't. Why?

All those efforts to take regular backups become completely useless if those files can't be used to recover the database. One important question missed by many companies I have consulted for: "Do we regularly make sure that we are able to restore a database from our backups?"

Even if the backup process succeeded, how do we know the file isn't corrupted or doesn't have some issue that prevents it from being restored properly?

Our data is only as good as our last restorable backup

The bottom line is that unless we actually restore the database somewhere, we can't be completely sure that a given backup file is good.

At a high level, based on the business need, we should first decide:
- How much "downtime" the system can tolerate,
- How much "data loss" is acceptable,
- Then put together a "Backup plan" that satisfies those requirements.
- Finally, a "Restore plan" should make sure all backup files are verified to be intact and restorable in a test environment. I believe you are already aware of and regularly using DBCC CHECKDB; if not, check the link provided at the end of this article.

DBAs need to have a system that allows them to periodically review their plan and ensure they can get everything back up & running again.

Takeaway: I can't stress this enough - test your backup files by periodically restoring them in a test environment and making sure they are fine.
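
A minimal sketch of what such a periodic test could look like, run via sqlcmd from a scheduled job. Everything here - the test server name TESTSQL01, the database name, the logical file names, and the paths - is a made-up placeholder; adjust them to your environment. Note that RESTORE VERIFYONLY alone only checks that the backup media is readable, so the full restore plus DBCC CHECKDB is the real proof:

# 1. Quick sanity check that the backup media is readable
sqlcmd -S TESTSQL01 -E -Q "RESTORE VERIFYONLY FROM DISK = N'D:\Backups\MyDB.bak'"

# 2. The real test: restore the backup under a scratch name on the test server
sqlcmd -S TESTSQL01 -E -Q "RESTORE DATABASE MyDB_RestoreTest FROM DISK = N'D:\Backups\MyDB.bak' WITH MOVE N'MyDB' TO N'D:\Data\MyDB_RestoreTest.mdf', MOVE N'MyDB_log' TO N'D:\Data\MyDB_RestoreTest_log.ldf', REPLACE"

# 3. Check the restored copy for corruption
sqlcmd -S TESTSQL01 -E -Q "DBCC CHECKDB (MyDB_RestoreTest) WITH NO_INFOMSGS, ALL_ERRORMSGS"

If all three steps succeed, you know that particular backup file can actually save you.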

Recommended for further reading:

1. Paul Randal's The Accidental DBA (Day 8 of 30): Backups: Planning a Recovery Strategy
2. Ola Hallengren's award-winning maintenance scripts
3. DBCC CheckDB FAQ by Kendra Little

Thursday, December 15, 2016

Positives of Cyclone Vardah - Unity, Humanity, Self-help

Chennai, Cyclone Vardah (Dec 12, 2016) - I have never witnessed anything like this in person in my life. The winds were so powerful that they uprooted close to 15 trees in our colony, and the howling sound of the wind was so scary that it gives me goosebumps even now just thinking of it. It was complete carnage by "Vardah" that day.

Our colony had ~50 trees, and out of those we have now lost 15-odd. A minimum of 2 trees fell on each street. The colony compound wall collapsed as one tree fell straight over it. Everywhere we could see only leaves, fallen branches, fallen trees and cut cables - not an inch of the actual road was clearly visible.

Bad things do happen in the world, like war, natural disasters, disease. But out of those situations always arise stories of ordinary people doing extraordinary things - Daryn Kagan

The positive side of it was that from each street at least 2 members automatically volunteered, and in no time we were a solid team of 25 to 30 members - people from all walks of life, of varied age groups, and a mix of both male & female. A few started cutting the fallen trees to make the roads accessible for others, a few started sweeping the roads, and some served tea & snacks to the volunteers.

After ~3 days of effort, almost 95% of the fallen trees had been cut into small pieces by our volunteers, and most of the debris had been swept off the roads by us. A big share of the credit goes to the younger generation (kids from the age of ~10 up to college-goers), who were very active and contributed enormously to this effort. Pretty happy that our colony has a good number of socially active kids - a healthy sign, and it feels safe.

The icing on the cake was that at the end of the day, a doctor from the colony, with the help of a few of the volunteers, arranged TT (tetanus) injections for all those who were in need of one. I myself had cut my left-hand finger and my left foot on a glass piece, got bruises on my leg, etc., and was planning to visit a doctor for a TT shot, as it has been more than 5 yrs since I had that vaccination. It certainly was a nice, timely gesture from the doctor.

Overall, during these tough times I found that:
no specific gender was superior
no specific caste was superior
no specific religion was superior
no specific profession was superior
It was unity & humanity that won at the end of the day.

I wish it remains the same, and I am really proud of the way our colony people pulled together to recover from this disaster. Once again I would say there is no better place to live in Chennai than "Park Dugar, Ramavaram" :)

For the first time ever I haven't taken any pictures; maybe I will update this post later after collecting snaps from others who might have taken them.

Related:

These videos from Cyclone Vardah might convey the intensity of the winds:

1. Car Topples - https://youtu.be/JsSOuwx-gW8
2. Bus Flipping over - https://youtu.be/q7LjSYpiDTU
3. People trying to get to safety from that flipped bus - https://youtu.be/Dw5iY4hXwXw

Monday, December 12, 2016

How I got billed even though I had registered as an AWS Free Tier user

Amazon AWS offers a 12-month free tier to get started with their services. At a high level, I went through this link and created a free tier account to play around with.

Approximately a month later, I checked the Billing dashboard and was surprised to see that I was getting billed $0.81 for some usage under the heading "Elastic Compute Cloud" (EC2).

Points mentioned about EC2 in the AWS Free Tier - 12 Months Introductory Period: [screenshot]

So I was mindful of those numbers mentioned there alone. But only after seeing the detailed bill did I understand that there are a few items which get billed on an hourly basis even within the Free Tier.

Billing Dashboard: [screenshot]

Billing Summary: [screenshot]

Billing Details: [screenshot]

It looks like NAT Gateway & Elastic IP address usage are charged even in the Free Tier:

$0.056 per GB Data Processed by NAT Gateways
$0.056 per NAT Gateway Hour (I had used it for 6 hrs)
$0.005 per Elastic IP address not attached to a running instance per hour on a pro rata basis (I had used it for 72.4 hrs)
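
For what it's worth, those rates do roughly add up to my bill: 6 hrs x $0.056 ≈ $0.34 for the NAT Gateway hours, 72.4 hrs x $0.005 ≈ $0.36 for the idle Elastic IP, and the remaining ~$0.11 would match about 2 GB of data processed by the NAT Gateway - together about $0.81.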

I blogged this just for those like me who weren't aware of it or didn't read some fine print where they might(!!) have actually mentioned this :) For now, I consider these hidden charges, and frankly I expected Amazon to upfront list all the things which are chargeable in the Free Tier.

I would also suggest reading about Billing Alarms:

"If you want the alarm to trigger as soon as you go over the free tier, set When my total AWS charges for the month exceed to $.01. This means that you'll receive a notification as soon as you incur a charge. Otherwise, set it to the amount you want to trigger the alarm, and you will be notified when you go over that amount."

Friday, December 09, 2016

From a Windows Machine - Connect (ssh) to Linux Instances running in a Private Amazon VPC

To start with, the following tools need to be downloaded:

1. PuTTY
2. PuTTYGen
3. Pageant


If you are a Windows user trying to connect to (SSH into) an AWS EC2 instance, then you need to use PuTTY.

During the process of provisioning an EC2 instance, you would have created/downloaded a key pair file in .pem format. But PuTTY doesn't support the .pem format; it needs the key pair file to be in the .ppk format. That's where PuTTYGen comes into play: PuTTYGen converts a .pem file into a .ppk file with the click of a button.
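
As a side note, if you ever want to script this conversion instead of clicking through the GUI, the Unix build of puttygen can do it in one line (on Windows you would follow the GUI steps below; the file names here are placeholders):

# Convert an AWS .pem private key into PuTTY's .ppk format
puttygen mykey.pem -O private -o mykey.ppk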

Connecting to an EC2 instance in a public subnet:

1. Open PuTTYGen
2. Load >> Choose the .pem file which you want to convert
2.1 [Optional] Provide a key passphrase & confirm it. For simplicity's sake, I skip it for now.

3. Click on "Save private key"
4. Open PuTTY
5. Enter the hostname / IP - For ex: ec2-user@35.154.74.77
6. Copy-paste that into the Saved Sessions textbox as well
7. Category > Connection > SSH > Auth > Choose the .ppk file which you saved in step 3.
8. Category > Session > Click on Save.
9. Now click on "Open" to SSH into that instance.
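
By the way, the same connection can also be made in one line with plink, the command-line connection tool that ships with PuTTY (a sketch using the example address and key file from the steps above):

# One-line equivalent of the saved PuTTY session
plink -i mykey.ppk ec2-user@35.154.74.77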

Connecting to an EC2 instance in a private subnet:

An EC2 instance in a private subnet will not have a public IP; it will only have a private IP. So the question at hand is: from a Windows machine, how would you connect to this EC2 instance with a private IP in a private subnet? The answer is to SSH into an instance in the public subnet first and hop from there, with Pageant forwarding your key.

1. For this, we need to make use of Pageant. It is PuTTY's SSH authentication agent for Windows. Just double-click to run Pageant, and it will sit in the system tray doing nothing until you add a private key into it.

2. Double-clicking the Pageant icon in the system tray will open up the "Pageant Key List" dialogue box.

3. Choose the .ppk file by clicking on "Add Key". Pageant will now load the private key. If the key is protected by a passphrase, Pageant will ask you to type it. When the key has been loaded, it will appear in the list in the Pageant window.

4. Once you see the key in that list, click on "Close".

(Now if you start PuTTY and open an SSH session which expects this key pair file, it will notice that Pageant is running, retrieve the key automatically from Pageant, and use it to authenticate. You can now open as many sessions as you like without having to type your passphrase again.)

5. Now open PuTTY; if you saved the earlier session, its name will be listed there. Click on it and Load.

6. Navigate to Connection > SSH > Auth and check "Allow agent forwarding". For me, it worked only after providing the "Private key file for authentication" here as well. [But you can check by leaving it empty too.]


Do not forget to save this session again so that this change is also saved in PuTTY.

7. With agent forwarding enabled in the PuTTY configuration, from this connection you can connect to any other instance in the VPC without having the SSH private key there.

Assuming our EC2 instance's private IP is 10.0.2.82, we can just run a basic SSH command as shown below and connect into it.
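
Something like this, assuming the default ec2-user login of Amazon Linux - Pageant's forwarded key takes care of the authentication:

# From the public-subnet instance, hop to the private one
ssh ec2-user@10.0.2.82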


Monday, December 05, 2016

AWS fatal error: An error occurred (400) when calling the HeadObject operation: Bad Request

While using AWS, I was trying to copy a file from an S3 bucket to my EC2 instance and ended up with this error message.

Command Used: aws s3 cp s3://mybucketname/myfilename.html /var/www/html/
Error: fatal error: An error occurred (400) when calling the HeadObject operation: Bad Request

The error goes away if we add the region information to the command. I am using Asia Pacific (Mumbai), so I used ap-south-1 as the region name.

Modified Command:
aws s3 cp s3://mybucketname/myfilename.html /var/www/html/ --region ap-south-1
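
In case you are not sure which region a bucket was created in, the S3 API can tell you (the bucket name below is the same placeholder as above):

# Prints the bucket's region, e.g. "ap-south-1"
aws s3api get-bucket-location --bucket mybucketname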