Saturday, January 23, 2016

Amazon Web Service Part 2 - Lou Person

This is a follow-up to my AWS experience so far.  I've learned a great deal since the first 1.5 hours and have more to learn.  I figured today was a good day to continue, since we are snowed in by Blizzard Jonas.

I was having some configuration issues that I'm sure I could have resolved on my own, but I wanted to shorten the troubleshooting cycle.  If I could save myself an hour or two, I figured it was worth close to $200 of my time.  My free subscription to AWS included the "Basic" plan, which I'm sure would have been fine, but it did not include any SLAs, chat or phone support.  Developer support seemed reasonable at $49/month, but it was limited to local business hours and email contact only, and since I am learning AWS during non-selling hours, I needed 24/7 support that I could communicate with easily (which for me meant chat and phone).  I chose Business support for $100/month because it provides 24/7 access via chat, phone and email.  I have used it a few times and it is fantastic.  The engineers I work with solved my issues very quickly and never said "sorry, not covered".  They always provide me with knowledge above and beyond the specific issue I called in about.  By the time I engage support, the issues are complicated ones; otherwise I would have been able to solve them on my own.  I can call as many times as I need, and the support engineers are experts with the technology and have outstanding communication skills.  They are also super nice.  They help me with complex issues such as network routing, Windows Server configuration, DNS, Active Directory, VPN, etc.  For $100/month, I feel as if I have the world's best engineers on my team (on par with the team at brightstack CIS). 

While talking with the engineers, they pointed out that opening up RDP over port 3389 to the world is not secure.  This took me to the marketplace.  From the AWS help topics: "AWS Marketplace is an online store that helps customers find, buy, and immediately start using the software and services they need to build products and run their businesses.  AWS Marketplace complements programs like the Amazon Partner Network and is another example of AWS’s commitment to growing a strong ecosystem of software and solution partners. Visitors to the marketplace can use AWS Marketplace’s 1-Click deployment to quickly launch pre-configured software and pay only for what they use, by the hour or month.  AWS handles billing and payments, and software charges appear on customers’ AWS bill".  I chose an OpenVPN access server, which I spun up in a few minutes.  I then followed the directions to create an encryption key, connect via SSH (PuTTY) and complete the configuration.  I installed the VPN client on my local computer and had others on my team do the same.  Connecting was very easy!  However, accessing the server over RDP took some configuring, and that is where support was so helpful.  The marketplace automatically created a new security group with inbound ports open to allow access to the VPN server.  I then had to allow inbound access to the WebserverSG from the OpenvpnSG.  Finally, the support engineer helped me realize, over the phone via a screen share, that we couldn't use a host name because local DNS (on my computer) did not have the host names for the Amazon servers.  So, for now, we'll just live with accessing services by IP address over the VPN. 
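The idea behind that security-group change can be sketched as a toy model: instead of opening RDP to the whole internet, the web server's group only admits traffic whose source is the VPN's security group. The rule data below is illustrative (the group names mirror this post), not real AWS configuration:

```python
# Toy model of security-group references. A rule's source can be a CIDR
# ("0.0.0.0/0" = anywhere) or another security group ("sg:...").
RULES = {
    "OpenvpnSG":   [(1194, "0.0.0.0/0")],      # the VPN itself is reachable from anywhere
    "WebserverSG": [(3389, "sg:OpenvpnSG")],   # RDP only from members of the VPN group
}

def is_allowed(target_group, port, source):
    """Return True if traffic to target_group:port is permitted from source."""
    for rule_port, allowed in RULES.get(target_group, []):
        if rule_port == port and (allowed == "0.0.0.0/0" or allowed == source):
            return True
    return False

print(is_allowed("WebserverSG", 3389, "203.0.113.5"))    # direct internet RDP: blocked
print(is_allowed("WebserverSG", 3389, "sg:OpenvpnSG"))   # RDP via the VPN: allowed
```

The key design point is that the rule references a group rather than an IP range, so instances can come and go behind the VPN without any rule edits.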

The next service I leveraged was AWS Directory Service.  I had the choice of Microsoft AD (recommended for workloads requiring up to 50,000 users) and Simple AD (recommended for workloads up to 5,000 users).  There was also an option for an AD Connector to create a hybrid cloud by integrating with an on-premises Active Directory.  I chose Simple AD.  I tested it by installing AD Users and Computers on the web server, but couldn't connect to the new domain.  I knew it should be easy, but I was having too much trouble, and the troubleshooting was getting me nowhere.  I value my time at more than $100/month (see above), and this issue prompted me to sign up for the Business support plan.  The engineer who assisted, over chat, pointed out that I needed to allow inbound Microsoft AD traffic between the security group created when I set up the directory and the subnet for my entire VPC.  That did the trick: I was able to connect using AD Users and Computers.  I added my web server Windows AMI to the domain and was then able to create domain users and security groups.  Eventually we'll use it to set up group policies.
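The "inbound Microsoft AD traffic" the engineer mentioned is really a bundle of ports. As a quick sanity check, here is a sketch using the core ports standard Active Directory relies on; this list comes from general AD knowledge, not from the AWS documentation, so verify it against the Directory Service admin guide before relying on it:

```python
# Hypothetical sketch: core Active Directory ports and a helper that
# reports which AD services a security group's open ports fail to cover.
AD_PORTS = {
    53:  "DNS",
    88:  "Kerberos",
    389: "LDAP",
    445: "SMB",
}

def missing_ports(open_ports):
    """Return the AD services whose ports are not in open_ports."""
    return sorted(name for port, name in AD_PORTS.items() if port not in open_ports)

print(missing_ports({53, 88, 389, 445}))  # everything covered: []
print(missing_ports({53, 389}))           # still blocked: ['Kerberos', 'SMB']
```

My original rules were effectively the second case: a couple of ports open, the rest silently dropped, which is why AD Users and Computers could not bind to the domain.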

Next, I needed to add additional storage for the web server because the volume that came with the AMI (the C: drive) was filling up, and it was needed for the operating system and supporting files.  The services I could choose from included:
  • S3 - Simple Storage Service, used to store any amount of data.  I see this more as object storage for low-latency applications, such as file sharing, software distribution and large content such as videos.
  • CloudFront - This is Amazon's content delivery network (CDN), which caches content at edge locations close to users. 
  • Import/Export with Snowball - This allows IT administrators to ship "seed" images of data to AWS on a secure appliance, to be imported into AWS storage.  Seems smart if there are terabytes of data to upload and bandwidth limitations at the local site. 
  • Glacier - This is a low-cost storage service primarily used for backups and archives; retrievals can take hours, so it is not meant for low-latency access.
  • Storage Gateway - This is used to keep local copies of files stored at Amazon for access on the same network where the user resides.  Meaning, the user pulls down a copy of the file close to where they reside, without having to traverse the public Internet for every access.  Changes are synchronized through the gateway between the local copy and AWS.
  • Elastic File System - This is the option I would like to implement for the new volume, but it is only in preview.  AWS EFS is a file storage service for Amazon Elastic Compute Cloud (EC2) instances. 
I wound up adding an Elastic Block Store (EBS) volume by clicking on the Elastic Block Store section of the console.  I had to make sure to provision it in the same Availability Zone as my web server AMI.  Once created, I simply clicked on it and attached it to the instance of my web server (I had to make sure to select the right instance, which I didn't do correctly the first time and couldn't figure out why the volume wasn't appearing in the instance itself).  I then used the disk management tools to format it, create a simple partition and assign it a drive letter.  The business benefits are tremendous.  I can add storage as needed, pretty much on the fly, without having to reboot the server, attach a SAN or power it down to install hard drives.  I am excited, however, to learn more about EFS once accepted into the preview!
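The two things that tripped me up can be captured in a tiny sketch: a volume only attaches to an instance in the same Availability Zone, and you have to pick the right instance. The names and zones below are illustrative, not real AWS identifiers:

```python
# Toy model of the EBS attach constraint the console enforces:
# volume and instance must live in the same Availability Zone.
class AttachError(Exception):
    pass

def attach_volume(volume_az, instance_az):
    """Pretend-attach a volume; raises if the zones don't match."""
    if volume_az != instance_az:
        raise AttachError(
            f"volume in {volume_az} cannot attach to instance in {instance_az}")
    return "attached"

print(attach_volume("us-east-1a", "us-east-1a"))  # attached
try:
    attach_volume("us-east-1a", "us-east-1b")
except AttachError as err:
    print(err)  # cross-zone attach fails
```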

Now, on to Identity and Access Management (IAM).  IAM is described here as: "AWS Identity and Access Management (IAM) is a web service that helps you securely control access to AWS resources for your users. You use IAM to control who can use your AWS resources (authentication) and what resources they can use and in what ways (authorization)."  I think the name is funny, in a good way.  IAM as in "I am" for an identity offering, very clever.  "I am able to access resources because I am authenticated".  The first thing I did was add MFA (Multi-Factor Authentication) by linking my Google Authenticator to my AWS account.  I scanned a QR code and now have an AWS MFA verification code in Authenticator.  This is a very important benefit because I've always said "username and password are not enough".  MFA adds a second factor beyond the username and password.  Once an IAM user authenticates to the console, either through the root Amazon account or another IAM account, the user is then prompted for an authentication code generated by the Google Authenticator application on the user's phone.  Google Authenticator and the AWS login service never exchange the code directly; both independently derive the same time-based code from the secret that was shared when I scanned the QR code.


I created users and groups and assigned roles to the groups.  Finally, I set a password policy, and I now have multi-factor authentication set up for login to my AWS console!  As soon as I set the policy, the AWS console forced a logout and I logged back in using the IAM-integrated root account with multi-factor authentication.  I am also able to log in with this account on my Samsung Galaxy to manage my AWS console from mobile.  I will eventually add other IAM users and grant granular access to my AWS console.  The business benefit of IAM is that I can delegate control of the console to multiple members of my team and give them access to only what they need.  Since it uses multi-factor authentication, I worry far less about a compromised password mistakenly granting access to unauthorized users.  The best part about it?  It's FREE with my AWS plan.
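That "only what they need" delegation works because IAM denies everything by default; a group's policy explicitly grants a small set of actions. A toy model of that evaluation (the group names are made up for this sketch; the action strings follow IAM's real naming style):

```python
# Hypothetical least-privilege setup: each group is granted only a few
# actions, and anything not explicitly allowed is denied (IAM's default).
GROUP_POLICIES = {
    "Monitoring": {"cloudwatch:GetMetricData", "cloudwatch:DescribeAlarms"},
    "Storage":    {"ec2:CreateVolume", "ec2:AttachVolume"},
}

def is_action_allowed(group, action):
    """Deny by default; allow only actions the group's policy grants."""
    return action in GROUP_POLICIES.get(group, set())

print(is_action_allowed("Monitoring", "cloudwatch:GetMetricData"))  # True
print(is_action_allowed("Monitoring", "ec2:AttachVolume"))          # False
print(is_action_allowed("Interns", "ec2:CreateVolume"))             # False (no policy at all)
```

The real service layers explicit denies, resource ARNs and conditions on top of this, but the deny-by-default core is the part that makes delegation safe.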

My next task was to set up monitoring and alerting for the system. This was pretty easy to do using Amazon CloudWatch. This service is FREE (excessive alarm notifications may incur additional charges) and the business benefit is tremendous.  Being able to easily monitor and alarm on key performance metrics ensures a well-run operation.  Monitoring reduces incidents and highlights areas that are low on resources before they run out of capacity.  Quoting from the AWS documentation: "Amazon CloudWatch is a monitoring service for AWS cloud resources and the applications you run on AWS. You can use Amazon CloudWatch to collect and track metrics, collect and monitor log files, set alarms, and automatically react to changes in your AWS resources. Amazon CloudWatch can monitor AWS resources such as Amazon EC2 instances, Amazon DynamoDB tables, and Amazon RDS DB instances, as well as custom metrics generated by your applications and services, and any log files your applications generate. You can use Amazon CloudWatch to gain system-wide visibility into resource utilization, application performance, and operational health. You can use these insights to react and keep your application running smoothly."  It was pretty easy to set up and very fulfilling to see the key metrics on the dashboard.  I also feel comfortable that I'm not missing key events (such as disk storage running low) because I can set alarms based on various metrics and thresholds, and then email the team when an alarm is triggered.  While creating the alarms, I could see previous history in chart form, which made it easy to determine sensible alarm thresholds. 
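The threshold-plus-periods idea behind those alarms can be sketched simply: an alarm fires only when the metric breaches the threshold for N consecutive evaluation periods, which filters out one-off spikes. This is a simplification of the real CloudWatch evaluation, for illustration only:

```python
# Toy CloudWatch-style alarm: ALARM only if the last `periods` datapoints
# all exceed the threshold; a single spike is not enough.
def alarm_state(datapoints, threshold, periods):
    """Return 'ALARM' or 'OK' for a series of metric datapoints."""
    recent = datapoints[-periods:]
    if len(recent) == periods and all(v > threshold for v in recent):
        return "ALARM"
    return "OK"

# Disk usage (%) sampled over time; alarm at >80% for 3 consecutive periods.
print(alarm_state([60, 70, 85, 88, 91], threshold=80, periods=3))  # ALARM
print(alarm_state([60, 70, 85, 75, 91], threshold=80, periods=3))  # OK -- the dip resets it
```

Looking at the historical chart first, as I did, is effectively choosing `threshold` and `periods` so that normal usage stays in the OK band.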

After my initial experience with Amazon, I figure I am about four hours into the project so far.  The support has been world class; I have a fully integrated Active Directory with multi-factor authentication, as much storage as I need when I need it, and great tools to manage the environment.  Things have come a very long way since we built our first cloud applications back in 1994!
