Lab 5: Adding EC2 Virtual Machines and Deploying the Web App

The lab script explains that the initial step in this lab is “to create roles that access other Amazon services so that applications running on EC2 instances don’t have to have credentials baked into the code.”

In the IAM service, a policy can be created using the Policy Generator. This policy has the following settings:

Part 1
Effect: Allow
AWS Service: Amazon DynamoDB
Actions: DeleteItem, DescribeTable, GetItem, PutItem, UpdateItem
ARN: arn:aws:dynamodb:ap-southeast-2:[ACCOUNT NUMBER]:table/ASP.NET_SessionState

Part 2
Effect: Allow
AWS Service: Amazon SQS
Actions: DeleteMessage, DeleteMessageBatch, GetQueueUrl, ReceiveMessage, SendMessage, SendMessageBatch
ARN: arn:aws:sqs:ap-southeast-2:[ACCOUNT NUMBER]:dinoorders

The policy is then named ‘DynamoSqsPolicy’
001 DynamoSQS Policy Generator
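The Policy Generator produces a JSON policy document from the two parts above. As a sketch, the equivalent document can be built like this in Python (the account number is a placeholder, not a real value):

```python
import json

# Placeholder for the real 12-digit AWS account number.
ACCOUNT = "123456789012"

policy = {
    "Version": "2012-10-17",
    "Statement": [
        {   # Part 1: session-state table access for the web app
            "Effect": "Allow",
            "Action": [
                "dynamodb:DeleteItem", "dynamodb:DescribeTable",
                "dynamodb:GetItem", "dynamodb:PutItem", "dynamodb:UpdateItem",
            ],
            "Resource": f"arn:aws:dynamodb:ap-southeast-2:{ACCOUNT}:table/ASP.NET_SessionState",
        },
        {   # Part 2: order-queue access for the web and queue servers
            "Effect": "Allow",
            "Action": [
                "sqs:DeleteMessage", "sqs:DeleteMessageBatch", "sqs:GetQueueUrl",
                "sqs:ReceiveMessage", "sqs:SendMessage", "sqs:SendMessageBatch",
            ],
            "Resource": f"arn:aws:sqs:ap-southeast-2:{ACCOUNT}:dinoorders",
        },
    ],
}

print(json.dumps(policy, indent=2))
```

Scoping each statement to a single table or queue ARN keeps the role limited to exactly the resources the DinoStore app touches.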

Again in IAM, a new role needs to be created. The role is called ‘WebServerRole’, its AWS service role is ‘Amazon EC2’, and it has the customer managed policy ‘DynamoSqsPolicy’ attached.
002 IAM WebServerRole

Then in the EC2 service, a new instance can be created with the following settings:
Instance: Microsoft Windows Server 2012 R2 Base (free tier eligible),
Type: General Purpose t2.micro (free tier eligible)
IAM Role: WebServerRole
Name: Web Server DSL5 18-4
Security Group: Create new security group

Name: WebRDPGroup
Description: Web or RDP access – created for the lab web server EC2 instance
Inbound rules:
RDP – source: my IP address
HTTP – source: anywhere

003 WebRDP Instance
When launching this instance, I attached an already created key pair.

Also in EC2, another instance needs to be created for the queue server. Again, a free tier t2.micro Windows Server 2012 R2 Base instance is launched.
IAM Role: WebServerRole
Name: Queue Server DSL5 18-4
Security Group: Create new security group

Name: RDPGroup
Description: RDP access – created for the lab queue server EC2 instance
Inbound rules:
RDP – source: my IP address

004 Queue Server Instance
I also used a previously created key pair when launching this instance.
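The two security groups can be summed up as data. The sketch below uses the standard ports for RDP (3389) and HTTP (80); the admin IP is a hypothetical placeholder:

```python
# The two security groups from this lab, expressed as data.
# "my_ip" stands for the admin workstation's address (placeholder value).
my_ip = "203.0.113.10/32"

security_groups = {
    "WebRDPGroup": [  # web server: remote admin plus public web traffic
        {"protocol": "tcp", "port": 3389, "source": my_ip},        # RDP
        {"protocol": "tcp", "port": 80,   "source": "0.0.0.0/0"},  # HTTP
    ],
    "RDPGroup": [     # queue server: remote admin only
        {"protocol": "tcp", "port": 3389, "source": my_ip},        # RDP
    ],
}

for name, rules in security_groups.items():
    print(name, [(r["port"], r["source"]) for r in rules])
```

The queue server deliberately has no HTTP rule: nothing outside the VPC ever needs to reach it directly.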

For the web server instance, the remote desktop file is downloaded and the password decrypted using the key pair. Once connected to the server, IIS (including ASP.NET 4.5 with development files), HTTP connectors, and Windows authentication role services need to be installed.
005 Install IIS

In Visual Studio, the DinoStore project needs to be published to the file system so that it can be copied onto the web server.
006 Publishing DinoStore Project

On the web server, the published DinoStore folder is copied into \inetpub\wwwroot. In IIS Manager, the DinoStore folder can be converted to an application by right-clicking the folder and selecting the ‘Convert to Application’ option.

007 Copying Files to wwwroot in RDP

Moving the files into \wwwroot

008 Convert Dinostore File to Application

Converting the folder into an application



To allow instances in the RDP and WebRDP security groups to access the instances in the RDS security group, the security group created by RDS is selected and, in its Inbound tab, two new rules are created. Both have Type: All traffic and Protocol: All, with the Source being the respective security group.
009 RDS Sec Group Access to RDPs

Once again on the web server, the Web.config file is opened in Notepad for editing. The DynamoDBSessionStoreProvider key values should be deleted from between their quotation marks, as should the key values below them; the file can then be saved. With the credentials removed, the application falls back to the instance’s WebServerRole for access.
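The edit amounts to blanking the credential attributes in the XML. As a minimal sketch, assuming illustrative key names (not necessarily the exact ones the lab’s provider uses), the same change could be scripted like this:

```python
import xml.etree.ElementTree as ET

# A cut-down Web.config; the key names and values here are
# illustrative placeholders, not the lab's real contents.
config = """<configuration>
  <appSettings>
    <add key="AWSAccessKey" value="AKIA_OLD_KEY" />
    <add key="AWSSecretKey" value="OLD_SECRET" />
  </appSettings>
</configuration>"""

root = ET.fromstring(config)

# Blank every credential value so the AWS SDK falls back to the
# instance's IAM role for temporary credentials.
for add in root.iter("add"):
    if add.get("key") in ("AWSAccessKey", "AWSSecretKey"):
        add.set("value", "")

print(ET.tostring(root, encoding="unicode"))
```

The keys stay in the file with empty values, matching the lab’s instruction to delete only what is between the quotation marks.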

If Internet Explorer is opened on the web server and pointed at the metadata link given in the lab script, it shows the following information: the temporary credentials supplied by the role.
010 Temp Credentials from Role
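The link in question is the EC2 instance metadata service, which serves the role’s credentials at http://169.254.169.254/latest/meta-data/iam/security-credentials/WebServerRole. The sketch below parses a fabricated sample of the kind of document that endpoint returns (the field names follow the documented metadata format; the values are placeholders):

```python
import json

# Fabricated sample of an instance-metadata credentials document.
sample = """{
  "Code": "Success",
  "Type": "AWS-HMAC",
  "AccessKeyId": "ASIAEXAMPLE",
  "SecretAccessKey": "example-secret",
  "Token": "example-session-token",
  "Expiration": "2018-04-18T12:00:00Z"
}"""

creds = json.loads(sample)

# The credentials rotate automatically before Expiration, which is
# why nothing needs to be baked into Web.config.
print(creds["AccessKeyId"], "expires", creds["Expiration"])
```
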

In IIS Manager on the web server, the website needs to be selected in the left panel and the centre pane changed to ‘Content View’. From there, ‘default.aspx’ can be right-clicked and the ‘Browse’ option chosen. This leads to the DinoStore home page, where the various features such as login and buy can be used.
013 Dinostore Home on VM

The public DNS of the Web Server DSL5 instance needs to be tested over a public internet connection. This is done by copying the DNS name into a new browser window on the local desktop (rather than on the web server itself) and adding the website name to the end of the URL. In this scenario, both IP addresses, from the server and the browser, will be the same.
016 DinoStore Connection over Public IP

The next step is to set up the order processing app on the queue server. Before the project can be published, it needs to be built in Release mode rather than Debug. This is done by selecting ‘Net702.DinoStore.OrderProcessor’ in the Solution Explorer, then changing the configuration dropdown in the toolbar, directly below the Tools menu, from Debug to Release. Once the solution has been built in Release, it needs to be published before being copied onto the queue server’s remote desktop.
017 Configuration Manager in VS

The OrderProcessor application needs to run at the server’s startup. This is done by copying the ‘setup’ executable from the published output into C:\ProgramData\Microsoft\Windows\Start Menu\Programs\StartUp. The application can then be run.
020 OP exe into Startup File

On the local machine, the AWS DinoStore database needs to be opened in order to determine what orders are present in the order table.

Then, in a local browser, the cloud website can be opened to log in and purchase some dinosaurs through the checkout.

While the DinoStore is open, the queue server’s remote desktop needs to be kept within easy reach so that the OrderProcessor console can be watched. As the DinoStore purchase is made, a ‘Queue messages received: count is 1’ line shows up on the console, followed by a ‘Queue message(s) deleted’ line.
024 Polling Queue in QS VM after Order
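Those two console lines come from a receive → process → delete polling loop against the dinoorders SQS queue. As a local stand-in (a Python `queue.Queue` replacing SQS, since the real calls need AWS credentials), the logic looks roughly like this:

```python
import queue

# Local stand-in for the SQS "dinoorders" queue.
order_queue = queue.Queue()
order_queue.put({"order_id": 1, "item": "dinosaur"})

def poll_once(q):
    """One polling pass: receive the waiting messages, then delete them.

    With real SQS this would be a ReceiveMessage call followed by
    DeleteMessage for each handled message.
    """
    received = []
    while not q.empty():
        received.append(q.get())
    if received:
        print(f"Queue messages received: count is {len(received)}")
        # ... the order would be written to the database here ...
        print("Queue message(s) deleted")
    return received

messages = poll_once(order_queue)
```

Deleting only after successful processing is what makes the queue safe: an order that fails mid-processing would reappear on the next poll rather than being lost.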

Finally, the AWS DinoStore database is re-examined to check the new order that has been recorded in the order table.

I also faced a few challenges over the course of this lab.

My first challenge was easily solved, but it involved Internet Explorer on the web server. When Internet Explorer is first opened in the remote desktop, its high security settings make it very hard to do anything in the browser. This problem was solved by checking online for how to reduce the browser’s security level.

Another small problem was that I didn’t know where \inetpub\wwwroot was located. Due to my lack of familiarity with Windows Server 2012, I had trouble locating it on my own. I solved this by looking at a classmate’s blog for assistance; one of their pictures showed the file path for wwwroot, which enabled me to access it myself.

Another error that caused some difficulties was attempting to run my converted folder without realizing that I needed to manually convert another folder inside it. The folder that I copied from my local machine onto the web server contained the DinoStore files within a nested folder. When I converted the outer folder to an application, I was unaware that the conversion had not reached the nested folder containing the DinoStore files. This resulted in the following error screen:
012 Parse Error

I managed to solve this while looking through the main folder in IIS Manager, checking whether there were any other ‘default.aspx’ or ‘web.config’ files that were perhaps being accessed instead of the ones I had adjusted. In hindsight, my poor arrangement of the DinoStore and DinoStore-related folders was the main factor behind this error. Once I realized my mistake, I converted the ‘Net702.DinoStore’ folder within the ‘1-Net702.DinoStore’ folder as well, and this solved the configuration error.


QwikLabs: Introduction to AWS Identity and Access Management (IAM)

AWS facilitates security and user control over accounts through IAM. In a business environment, there would be different restrictions on certain accounts pertaining to their respective level of clearance within the business system. This QwikLabs course aims to provide a basic understanding of how to manage and utilize the IAM system for various accounts.

Topics covered in this Lab

  • Exploring pre-created IAM users and groups
  • Inspecting IAM policies as applied to the pre-created groups
  • Following a real-world scenario, adding users to groups with specific capabilities enabled
  • Updating passwords for users
  • Locating and using the IAM sign-in URL
  • Experimenting with the effects of policies on service access

Exploring Users and Groups

AWS Management Console –> Services –> IAM

In the lab, there are already three users set up: “userone”, “usertwo”, and “userthree”. Since I am using the lab as a guideline rather than working in the lab environment itself, I need to set up three new users in my own AWS account in order to follow the instructions given in QwikLabs.

Username Password

This is where the lab script and my practical application of it start to differ. Rather than add groups to already created users, I will add each user to one of the script-suggested groups as part of the user set-up.

UserUno: Group  = “EC2Support”, Attached Policies = “SupportUser”
UserDos: Group = “EC2Admin”, Attached Policies = “SystemAdministrator”
UserTres: Group = “S3Admin”, Attached Policies = “DatabaseAdministrator”

Determining what policies to attach to these groups was harder than I first anticipated. Having never looked through all the different policies available before, I was slightly overwhelmed. However, I looked around at my choices and realized that I could pick the policies by their job functions. From this point onward I sought help from the AWS user guide and from the policy code shown under each choice.

The AWS policy user guide can be found here:

Setting Passwords
Once the three new users had been created and associated with a group, I logged in as each user and was forced to create a new password. The new passwords had to follow certain specifications that I had arranged beforehand through my Administrator account.

Password Settings

Experimentation of Policies on Access
This part involves testing my accounts to determine whether I have grouped them with the right access restrictions and permissions.

  • UserUno: UserUno is in the group labelled “EC2Support”. As such, I assigned it the job policy of “SupportUser”, which the AWS Policy User Guide records as ‘This policy grants permission to create and update AWS support cases.’ The user in this case can ‘contact AWS support, create support cases, and view the existing cases.’

With this in mind, I gave UserUno three tasks: view other users, access EC2, and access S3 buckets.

Access other users:
InkedUno User Permission_LI
Result: UserUno has permission to see other users.

Access EC2
InkedUno EC2_LI
Result: Uno can access EC2.

Access S3 bucket
InkedS3 uno_LI
Result: Uno cannot access the bucket.

Conclusion: The job policy of SupportUser for Uno has many of the required authorizations, but is not quite correct, as the user shouldn’t be able to access EC2.

  • User Dos: User Dos is labelled under the group “EC2Admin”, and has the job policy of “SystemAdministrator”. The AWS Policy Guide states that ‘This user sets up and maintains resources for development operations.’ To gain a more comprehensive understanding of this user’s role, I gave it the three tasks I had given User Uno.

Access other users:
InkedDos user permission_LI
Result: Dos does not have permission to manipulate the other users.

Access EC2:
InkedInstance Dos_LI
Result: Dos is able to access EC2

Access S3 bucket:
Inkeds3 dos_LI
Result: Dos is able to access S3 buckets

Conclusion: Again, Dos’s job policy incorporates most of the features that I desired. However, as an EC2Admin, Dos should only be able to access EC2, not S3.

  • User Tres: User Tres is labelled under the group “S3Admin” and has been assigned the job policy of “DatabaseAdministrator”. The AWS Policy Guide states that ‘This user sets up, configures, and maintains databases in the AWS cloud.’ Below are the results of the three tasks.

Access other users
InkedTres User Permission_LI
Result: Tres is unable to access other users

Access EC2:
InkedInstance Tres_LI
Result: Tres is unauthorized to access EC2

Access S3 bucket:
InkedS3 tres_LI
Result: Tres is able to access the S3 bucket

Conclusion: The job policy applied to Tres is exactly what I want for a user that is an S3Admin only.

Final Remarks
Although this practical digressed from the QwikLabs script due to the lack of pre-created users, the exercise still provided me with plenty of insight into how to create and define users within AWS. The job policies that I associated with users Uno and Dos weren’t quite what I was after. Perhaps the minor discrepancies of the chosen job policies could be resolved with some inline policies, which, as the AWS guide defines, are policies inherent or unique to a single user.