Budget Report 14/04/2017

I received another billing alarm last night, this time with respect to my EC2 budget.
EC2 Alarm Forecasted50

Upon further inspection, the source of the expenditure was AWS’ EBS service. This seemed strange to me, as I haven’t used EBS during this DinoStore project.

In order to gain a greater understanding of my expenditure, I compiled my AWS bills into a spreadsheet.

Budget Spreadsheet

The EBS charge for April has been $0.64.

I originally thought that it was due to the MySQL RDS instances that I created for Lab 2, and considered that it might relate to the snapshots I had created. However, the EBS volumes were all created in March, while my RDS snapshots were created in April. On closer inspection of the EBS volumes, I was able to determine that they related to previous RDS instances I had created for my QwikLabs project.

By looking into AWS’ documentation on EBS volumes, I was able to determine their use and their cost:

Device use:

Device Cost

My next step was to detach the EBS volumes in order to cease any further charges.
Detaching EBS volumes

As for my DinoStore budgeting: I have been using Free Tier services, so I will not have incurred any other charges.
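To double-check charges like the $0.64 above, it helps to remember that EBS bills per GB-month, pro-rated for how long the volume exists. Below is a minimal sketch; the price is an assumed gp2 rate (check the current ap-southeast-2 pricing page), and the volume size and duration are made-up examples, not my actual figures.

```python
# Rough sketch of how an idle EBS volume accrues cost.
# ASSUMPTION: price below is illustrative, not the verified current rate.
GP2_PRICE_PER_GB_MONTH = 0.12  # assumed USD/GB-month for gp2 in ap-southeast-2

def ebs_monthly_cost(size_gb, days_provisioned, days_in_month=30):
    """EBS bills per GB-month, pro-rated for the time the volume is provisioned."""
    return size_gb * GP2_PRICE_PER_GB_MONTH * (days_provisioned / days_in_month)

# e.g. a forgotten 8 GB volume kept for half a month:
print(round(ebs_monthly_cost(8, 15), 2))  # 0.48
```

Even small leftover volumes add up, which is why detaching (and deleting) unused volumes stops the charge.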

Introduction to Amazon Relational Database Service (RDS)-Linux

Introduction and Aim
The purpose of this lab is to create and use an Amazon Relational Database Service through AWS. Amazon RDS is a cloud-based service for creating, operating, and scaling databases, with the ability to make MySQL, PostgreSQL, Oracle, and SQL Server databases.



  • Create an Amazon Relational Database Service (RDS) instance
  • Connect to the RDS instance with client software


Creating a Relational Database Service (RDS) instance
RDS is a service of its own within the Amazon Management Console, rather than one created through EC2. The lab script requires the MySQL database to have many specific settings. These are as follows:

  1. Specify DB Details
    InkedMySQL database option_LI
  2. DB Instance Class: db.t1.micro (The free tier one)
    DB Instance Class db.t1.micro
    AWS makes mention that this DB instance class is a ‘previous generation instance, that may have lower performance and higher cost than the newer generation.’ Because of this, I looked into the db.t2.micro, which is the current generation instance. The current instance has higher memory and network performance while still being on the free tier, so I will be using the current generation instance in this lab.
    DB Instance Class db.t2.micro
  3. Multi-AZ Deployment: No
  4. Storage Type: General Purpose (SSD)
  5. Allocated Storage: 5
  6. DB Instance Identifier: RDSLab
  7. Master Username: AWSMaster
  8. Master Password: AWS12345
  9. Confirm Password: AWS12345
    Specify DB Details Full
  10. Next Step -> Configure Advanced Settings
  11. Publicly Accessible: No
  12. VPC Security Group(s): Choose a security group containing the text qls. I’m using a security group that I’ve created as I don’t have access to the QwikLabs one.
  13. Database Name: RDSLab
    Configure Advanced Settings
  14. Backup Retention period: 0 days (to disable automatic backups)
    CAS Part 2
  15. Launch DB Instance

Now that the database instance has been launched, it is important to double-check the security groups of the selected VPC and make sure that the inbound rules contain Type: MySQL/Aurora (3306) with Source-
Editing Inbound Rules of SG
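For reference, an equivalent inbound rule can be expressed in the JSON form accepted by the AWS CLI’s `aws ec2 authorize-security-group-ingress --cli-input-json`. This is only a sketch: the group ID and source CIDR below are placeholders, not the values used in the lab.

```json
{
  "GroupId": "sg-xxxxxxxx",
  "IpPermissions": [
    {
      "IpProtocol": "tcp",
      "FromPort": 3306,
      "ToPort": 3306,
      "IpRanges": [
        {"CidrIp": "203.0.113.0/24"}
      ]
    }
  ]
}
```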


Create an Amazon Linux instance from an Amazon Machine Image (AMI)
Under the EC2 ‘Launch Instance’ wizard, the Amazon Linux AMI is selected. The instance type is kept as the default, t2.micro. The next steps, ‘Configure Instance Details’ and ‘Add Storage’, are kept with their default settings. In the ‘Tag Instance’ step, the value given for the Name attribute is RDS Free Lab. The final step is to review and launch.
RDS Free Lab Instance

Connecting to Amazon EC2 instance via SSH
Once the instance is launched, the PuTTY Secure Shell client is used to connect to the server. This involves entering the instance’s public DNS value into the PuTTY Host Name box, prefixed by ‘ec2-user@’. In the category list, under the SSH option, the Auth option can be clicked, which provides a ‘Private key file for authentication’ box. This is where I use the private key that I’ve previously created.

Connecting to the RDS instance
Within the terminal that opens up, the command ‘sudo yum install mysql’ is typed in, and the install agreement is accepted.
Install mysql

Once installed, to connect to MySQL the following command is entered, with the endpoint name of the RDS instance:
‘mysql --host cjcfraykqpwn.rds.ap-southeast-2.amazonaws.com --password --user AWSMaster’
This prompts for the AWS12345 password that was created earlier.
InkedEnter mySQL_LI
The darker text at the top is where I accidentally typed the command incorrectly.

MySQL is now logged into, and the mysql> prompt is visible. The ‘show databases;’ command can be entered in order to check whether any records return.
mySQL Show DatabasesThe returned output shows that the RDS instance has been connected to successfully.


I found this to be an interesting lab, using bash to install MySQL and connect to the RDS. Prior to attempting the Linux RDS lab, I had attempted to complete the Windows RDS lab. I’m curious to find out whether the Windows VM’s command tool would be as successful in connecting to the RDS.

Introduction to AWS Lambda

Introduction and Aim
The purpose of this lab is to gain a basic understanding of AWS Lambda through creating and ‘deploying a lambda function in an event driven environment’, as stated in the QwikLabs lab script.
The lab script states that ‘Lambda is a compute service that runs code in response to events and automatically manages the compute resources, making it easy to build applications that respond quickly to new information.’ Lambda is serverless.



  • Create an AWS Lambda S3 event function
  • Configure an Amazon S3 bucket
  • Upload a file to an Amazon S3 bucket
  • Monitor AWS Lambda S3 functions through Amazon CloudWatch


Configure an Amazon S3 bucket as the Lambda event source
The first step in configuring an Amazon S3 bucket is to determine the region the lab is running in. In my case, it’s Sydney. Under S3 services, the bucket I’ve created is called ql-lambda, and is set in my current region.

Create an S3 function
On the AWS console, Lambda is located in the Services. In the Lambda console, the ‘Get Started Now’ button is pushed, followed by the ‘New Function’ button.
Lambda
The QwikLab instructions for creating the function are as follows:

Select Blueprint: S3-get-object
Configure Triggers: Set the bucket name to the bucket that has just been created
Set Event Type to ‘Object Created (All)’
Enable Checkbox: Enable Trigger
InkedConfigure Triggers_LI

–> Next
Configure Function:
Name: S3Function
Description: S3 Function for Lambda
Runtime: Node.js
-There were two available Node.js runtimes, so I chose Node.js 4.3
Configure part 1
Handler: Leave as index.handler
Role: Choose an existing role
Existing Role: lambda-role
-As I’m not doing this through QwikLabs, there wasn’t an existing role called lambda-role. Instead, I created a new role called lambda-role. The role contained two policies: Simple Microservice Permission, and S3 Object Read-Only Permission. I chose those two policies as they seemed to best fit the role required for this lab.
Configure part 2 (handler and role)
–> Advanced Settings
Memory (MB): 128
Timeout (s): 5

The final section involves the Review section, and then the function can be created.
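The lab’s blueprint is Node.js, but the shape of an S3-triggered handler is easy to sketch. Below is a hypothetical minimal equivalent in Python (my own names, not the blueprint’s code); the event structure mirrors the S3 ‘Object Created’ notification format.

```python
# Hypothetical minimal equivalent of the S3-get-object blueprint.
# ASSUMPTION: function and variable names are illustrative, not from the lab.

def handler(event, context):
    """Extract the bucket name and object key from an S3 'Object Created' event."""
    record = event["Records"][0]
    bucket = record["s3"]["bucket"]["name"]
    key = record["s3"]["object"]["key"]
    # A real function would now fetch the object from S3;
    # here we just return what triggered us.
    return {"bucket": bucket, "key": key}

# Example event shaped like an S3 put notification:
sample_event = {
    "Records": [
        {"s3": {"bucket": {"name": "ql-lambda"},
                "object": {"key": "testfile.txt"}}}
    ]
}
print(handler(sample_event, None))  # {'bucket': 'ql-lambda', 'key': 'testfile.txt'}
```

Each upload to the bucket produces one such event, which is what the monitoring graphs later count as an invocation.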


Upload a file to an Amazon S3 bucket to trigger a Lambda event.
The next step is to upload a file to the S3 bucket in order to trigger a call to the Lambda function.
The file uploaded to the bucket, for the purpose of this test, contains only lowercase lettering with no spaces.
Bucket with Upload

In the Lambda functions page, the function itself can be clicked and then the ‘Monitoring’ tab can be opened. This will provide four graphs: Invocation count, Invocation duration, Invocation errors, and Throttled invocations.
Monitoring in Lambda
Below is a screenshot of the QwikLabs script, which explains what each graph measures.
Graph Explanation

All of this information can be viewed in CloudWatch. This can be accessed by clicking on the ‘View logs in CloudWatch’ button, which is located above the graphs. In the logs section of CloudWatch, the first log stream contains information on ‘Start Request’, ‘End Request’, and ‘Report Request’ of the associated lambda event.
cw log info


The information recorded when a Lambda event is triggered appears to be very informative for an overview of financial transactions. This sort of service implemented into a business may help keep track of business expenditure from staff.

Introduction to AWS CloudFormation

Introduction and Aim
The purpose of this lab is to use an Amazon EC2 instance and install WordPress with a local MySQL database. QwikLabs states that AWS CloudFormation ‘gives developers and systems administrators an easy way to create and manage a collection of related AWS resources, provisioning and updating them in an orderly and predictable fashion.’



  • Create a stack using an AWS CloudFormation template
  • Monitor the progress of the stack creation
  • Use the stack resources
  • Clean up when the stack is no longer required


Create a stack
In this section, I create a stack from an AWS CloudFormation template.
InkedCreate Stack_LI

CloudFormation is one of the services found in the AWS management console. In the service, I can ‘Create Stack’, selecting the ‘WordPress blog’ template.
The details are as follows:
Name: MyWPTestStack
DBPassword: Pa55word
DBRootPassword: Pa55word1
DBUser: AWSQLStudent
Specifying Details

The lab script makes mention here that ‘the same WordPress template contains an input parameter, KeyName, which specifies the EC2 key pair for the Amazon Ec2 instance that is declared in the template. An Amazon key pair has been created for you.’

As I’m only following the lab script, not actually completing the lab through QwikLabs, I don’t have a pre-made key pair. However, I can create EC2 instances alongside the ones I’ve already created, and I already have a key pair.

In the KeyName drop down on the Details page, I select the key pair that I’ve already created.

The automatically filled parameters are kept on their default settings, and no ‘Tags’ or ‘Advanced Options’ settings are changed, so all that is left to do is create the instance.
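The KeyName parameter mentioned above is a standard CloudFormation pattern. The fragment below is an illustrative sketch of how such a parameter is typically declared and referenced, not an excerpt from the actual WordPress template; the resource name is my own.

```yaml
# Illustrative fragment: declaring a key-pair parameter and referencing it.
Parameters:
  KeyName:
    Type: AWS::EC2::KeyPair::KeyName
    Description: Name of an existing EC2 key pair for SSH access
Resources:
  WebServer:            # hypothetical resource name
    Type: AWS::EC2::Instance
    Properties:
      KeyName: !Ref KeyName
```

Because the parameter type is `AWS::EC2::KeyPair::KeyName`, the console renders it as the drop-down of existing key pairs seen on the Details page.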

Stack Review

Monitoring stack creation
The AWS service for CloudFormation monitors the progress of the stack’s creation. Whilst being created, the status will be CREATE_IN_PROGRESS. Once finished, the status notification will show CREATE_COMPLETE.




Using the stack
The WordPress installation still needs to be completed. This is done by clicking on the Outputs tab and using the hyperlink located on the page.
Outputs WP
Once the installation is complete, the WordPress dashboard appears. From here, customization and blog posts can happen.



Deleting the stack
Deleting the stack involves selecting the stack to be deleted, under ‘Actions’ pressing ‘Delete Stack’, and then confirming the deletion process.
Delete Stack

During the process, the stack status changes to DELETE_IN_PROGRESS. When a stack is deleted, all of the resources associated with the stack will also be deleted.


By following this lab, I’ve managed to learn how to use CloudFormation to create a stack and install a WordPress template. The implementation of stacks appears to be very useful for running applications, though I would be interested in comparing it to the AWS Lambda service.

Introduction to Elastic Load Balancing

Introduction and Aim
The purpose of this lab is to gain an understanding of the Amazon Elastic Load Balancer. QwikLabs describes the Amazon Elastic Load Balancer (ELB) as a ‘service that automatically distributes incoming application traffic across multiple EC2 instances.’ This can increase fault tolerance in applications, as the ELB service responds to incoming traffic with the required load balancing capacity. The ELB service can be provided within a single Availability Zone or across many zones, and can also be used in a VPC.



  • Logging into the Amazon Management Console
  • Creating an Elastic Load Balancer
  • Adding Instances to an Elastic Load Balancer


Logging into the Amazon Management console
When using AWS, I log into the console through my administrator account rather than my root account. This is a security measure as my root account has access to the financial aspect of AWS. If I were intending to use AWS in a business scheme or for sensitive information, I would have more users, each with access corresponding to the level of security required.
In order to reduce latency, my AWS account is set in the Sydney region. Although not every service is available at the Sydney zone, I’m currently only working with the basics of what AWS can provide, so I haven’t yet come across any availability issues.


Creating an Elastic Load Balancer
ELBs are located within the EC2 service. For this lab, I choose a classic load balancer which I’ve called ‘free-lab-load-balancer’.
Classic LB
The security group assigned to the ELB is a new one called ELB-SG1. The lab script specifies a preset group, but as the script is being used only as a guideline, I needed to use an existing group or make a new one.
InkedAssign SG (NEW) New SG_LI
The Type is an AWS preset configuration, so I’m keeping it as is.

The next step in the load balancer launch is ‘Configure Security Settings’, in which nothing is changed, so I move on to the ‘Configure Health Check’ screen. When I did this, a warning screen appeared:
Config Sec Settings Warning
This warning is something to be heeded for future professional use, but not for this lab.
The lab script asks for the following values:
Response Timeout: 2 seconds
Health Check Interval: 6 seconds
Unhealthy Threshold: 2
Healthy Threshold: 2
Config Health Check
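To sanity-check what these values mean in practice, here is a quick back-of-the-envelope calculation. This is my own interpretation of the classic ELB settings (an instance is marked unhealthy after the threshold number of consecutive failed checks), not something from the lab script.

```python
# Back-of-the-envelope timing from the health-check settings above.
INTERVAL_S = 6           # Health Check Interval
UNHEALTHY_THRESHOLD = 2  # consecutive failures before marking unhealthy
HEALTHY_THRESHOLD = 2    # consecutive successes before marking healthy again

# Roughly how long a failing instance keeps receiving traffic,
# and how long a recovered one waits to rejoin rotation:
time_to_unhealthy = INTERVAL_S * UNHEALTHY_THRESHOLD
time_to_healthy = INTERVAL_S * HEALTHY_THRESHOLD
print(time_to_unhealthy, time_to_healthy)  # 12 12
```

So with these lab values, a dead instance is taken out of rotation in roughly twelve seconds; production settings would trade this responsiveness against false positives.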

The next step is to add EC2 instances; I chose two arbitrary instances that were displayed in my instance option list.
Adding EC2 Instances

As Tags are not a part of this exercise, I move on to the final step of reviewing all the load balancer specifications.
ELB Review
After checking that everything was according to the script, the load balancer can be created.


Once the load balancer is created, I can click on the ‘Instances’ tab alongside the ‘Description’ tab near the bottom of the screen. The ELB has alt-text that is displayed over the ‘i’ icon next to the instances, which reports on the status of the instances in relation to the load balancer.
Instances Within the ELB
In the ‘Description’ tab, the DNS Name field contains a link that, when copied into the browser window, directs to the load balancer page. QwikLabs states that ‘While it all looks the same on the front end, as you refresh the page, on the back end your requests are being load balanced between your two running instances.’

The DNS link didn’t work for me, and instead just showed a blank screen. Upon further inspection with the Firefox developer tools, the network was reporting a 503 error, which indicates a back-end server problem.
Back End Server Unavailable

I considered that perhaps I had made a mistake during the load balancer launch process, so I created another load balancer, taking a look at a classmate’s blog for assistance and rigorously looking over the lab script again.

The DNS link result this time was: Server not found. Using the developer tools, I was able to see that it wasn’t the same problem as my previous load balancer, which implied that it wasn’t a back-end server issue anymore.
Network Display for LB 2
DNS Resolution
Unfortunately, I still didn’t know what the problem was, or why it was no longer a back-end issue.


This was an interesting lab in the application of a multi-instance service such as the Amazon Elastic Load Balancer. I would like to know why the DNS link failed, and I’m not confident that I could determine that on my own. Having a trained person explain the methods and reasons behind the specifications of launching an ELB may be a beneficial way of helping me understand how to correctly implement the ELB service.

Introduction to Amazon DynamoDB

Introduction and Aim
The purpose of this lab is to create a simple table in Amazon DynamoDB, which is used to store information about a music library. QwikLabs describes Amazon DynamoDB as ‘a fast and flexible NoSQL database service for all applications that need consistent, single-digit millisecond latency at any scale.’



  • Create an Amazon DynamoDB table
  • Load data into an Amazon DynamoDB table
  • Query Amazon DynamoDB
  • Delete an Amazon DynamoDB table


Creating a new table
In DynamoDB, I create a table called ‘Music’. For the primary key, I enter ‘Artist’ with its type as String. The next step is to click ‘Add Sort Key’ and create a new field called ‘SongTitle’ that is also a String.
The table settings are left as default, and now the table can be created.
InkedCreate Table_LI

Creating a DynamoDB table is a very simple process on the user’s end, as AWS controls most of the set-up process.


Adding and modifying table data
In DynamoDB, each item is made up of attributes, similar to the way an entity contains attributes. For DynamoDB, only the primary key attributes are required.
In order to create an item, the specific table is selected; in my case it’s the Music table. Then the Items tab can be clicked and a ‘Create Item’ option will be displayed.
The lab script asks for the following information to be input into the new item:

Artist: No One You Know (String)
SongTitle: Call Me Today (String)

Then, to create another attribute, the ‘Append’ button is used. In this instance, another String is added, with Field: AlbumTitle and Value: Somewhat Famous.
Additional Attribute to Create Item

Another attribute is made, this time of type Number, with Field: Year and Value: 2015.

Then the item can be saved, now with four attributes.
Personalised Item to Create

The lab script asks for two more items to be created. The final table looks like this:
All Items
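Under the hood, DynamoDB stores each item in a typed attribute-value format. The fragment below is my own rendering of what the first item would look like in that format (for example, as input to the AWS CLI’s `put-item`), not something taken from the console.

```json
{
  "Artist": {"S": "No One You Know"},
  "SongTitle": {"S": "Call Me Today"},
  "AlbumTitle": {"S": "Somewhat Famous"},
  "Year": {"N": "2015"}
}
```

Note that numbers are transmitted as strings tagged with `"N"`; only `Artist` and `SongTitle` (the primary key) are mandatory, matching the point above that items need only their key attributes.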


Modifying an existing item in the table
The table can be modified by selecting the Music table and either double-clicking on the cell to edit or, as the lab script suggests, clicking the Items tab, selecting the year, then pressing Edit in the Actions drop-down and saving any changes made. The lab script directs for the year ‘2014’ to be changed to ‘2013’.
Editing Date on Music Item


Querying the table.
The table can be queried to find specific items based on various information. The lab script makes mention that ‘the primary key is made of Artist (partition key) and SongTitle (sort key)’.
In the music table, under the items tab, I can change the drop-down labelled ‘Scan’ to ‘Query’.
The first query requires me to input ‘No One You Know’ into the partition key String value box. Once searched, all tracks with the artist ‘No One You Know’ are displayed.
Query Search -No One You Know
The next query keeps the previous partition key and adds a sort key condition: SongTitle String = ‘Call Me Today’.
Query Search-Call Me Today

For the final query, still keeping the partition key specification, the sort key data is cleared and the ‘Add filter’ button is pressed. In the new filter row, the attribute is set to Year, of type Number, and the value is set to 2013. This limits results to songs by the specified artist with the specified year.
Query Search -Filter Year 2013
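To make the query semantics concrete, here is a small pure-Python sketch (no AWS calls) that mimics partition-key, sort-key, and filter behaviour over items like those in the Music table. The first item is the one created above; the other two are my own illustrative guesses, not necessarily the items from the lab script.

```python
# Pure-Python sketch of DynamoDB query semantics; items are illustrative.
items = [
    {"Artist": "No One You Know", "SongTitle": "Call Me Today",
     "AlbumTitle": "Somewhat Famous", "Year": 2015},
    {"Artist": "No One You Know", "SongTitle": "My Dog Spot",   # hypothetical
     "AlbumTitle": "Hey Now", "Year": 2013},
    {"Artist": "The Acme Band", "SongTitle": "Happy Day",       # hypothetical
     "AlbumTitle": "Songs About Life", "Year": 2013},
]

def query(items, artist, song_title=None, **filters):
    """Partition key must match exactly; sort key and filters are optional."""
    results = [i for i in items if i["Artist"] == artist]
    if song_title is not None:
        results = [i for i in results if i["SongTitle"] == song_title]
    for attr, value in filters.items():
        results = [i for i in results if i.get(attr) == value]
    return results

print(len(query(items, "No One You Know")))             # partition key only
print(len(query(items, "No One You Know", Year=2013)))  # with a Year filter
```

This mirrors the three console queries above: partition key alone, partition plus sort key, and partition key plus a filter.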


Deleting the table
Deleting a table also deletes all the data within it. To delete the table, I select the table, click ‘Actions’, then press ‘Delete Table’. A confirmation pop-up appears and, once confirmed, the table is deleted.
Delete Table



I found DynamoDB to be a very user-friendly AWS environment due to its preset parameters, which resulted in very few configuration adjustments. Although this is suitable for simple tables such as the Music table created here, I can understand that the reduced amount of configuration can limit the use of the database. A thorough run-through and play within DynamoDB is a potential way of gaining an understanding of the extents of this service.

Introduction to Amazon Elastic Block Store (EBS)

Introduction and Aim
The purpose of this lab is to gain a basic understanding of the Elastic Block Store (EBS) in AWS.
Amazon Elastic Block Store is a service that provides block-level storage for EC2 instances in the AWS Cloud. Block-level storage volumes are replicated within their Availability Zone; this redundancy protects against an AWS server failure and ensures consistently low latency. EBS also allows usage to be scaled up and down in very short time frames.


  • Create an EBS Volume in the Amazon Management Console
  • Add an EBS Volume to an instance
  • Snapshot an EBS Volume


Creating an Elastic Block Store Volume
EBS Volumes are found within the EC2 service. QwikLabs describes EBS Volumes as ‘hard drives in a computer. The data on them persists through the lifetime of the volume and can be transported between virtual machines as needed.’

On the side panel of the EC2 display, the EBS section contains two options: Volumes and Snapshots. When creating a volume, the Availability Zone to contain the volume is important to consider, as the volume is replicated within that zone.
In the ‘Create Volume’ dialog box, the settings are:
(a) Type: General Purpose (SSD)
(b) Size (GiB): 1
(c) Availability Zone: ap-southeast-2a (Sydney, the region in which my AWS server is set)

Create Volume

The created volume is able to be attached to an instance. I will use the instance that I created for my Introduction to EC2 with Windows server lab.
InkedRunning Instance_LI

Adding an EBS Volume to an instance
If the state of the volume is ‘available’, then it can be attached to a running instance.
Attaching Instance Cropped


Snapshotting an EBS Volume and increasing performance
QwikLabs explains that ‘a snapshot of volume replicates the data in the volume. It also allows you to change the properties of the volume to enable features like provisioned IOPS’.

In the 1 GiB volume, I can right-click and ‘Force Detach Volume’. The lab script makes mention that the instance would need to be stopped before doing this, so as not to force-detach the drive. However, in this lab the instance will remain running, as there isn’t anything of importance within it, and the lab is focusing more on what can be done with a volume than on whether the order of actions would follow production protocol.
InkedDetaching Volume_LI

Once the volume is detached, I can right click and ‘Create Snapshot’.
I want to ensure that the snapshot dialog box contains the following settings:
(a) Volume field matches the created volume
(b) Name: qlebslab
(c) Description: ql ebs volume snapshot
Create Snapshot

I can then create the snapshot which will be stored within Snapshot under Elastic Block Store.

Then I can right click the snapshot and ‘Create Volume’. I want the following settings to be set within the volume dialog box:
(a) Type: Provisioned IOPS (SSD)
(b) Size (GiB): 10
(c) IOPS: 300
(d) Availability Zone: ap-southeast-2a (Sydney)

Create Volume through Snapshot
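The same settings can be expressed as JSON input to the AWS CLI’s `aws ec2 create-volume --cli-input-json`. This is a sketch only; the snapshot ID below is a placeholder for the one created above.

```json
{
  "SnapshotId": "snap-xxxxxxxx",
  "AvailabilityZone": "ap-southeast-2a",
  "VolumeType": "io1",
  "Size": 10,
  "Iops": 300
}
```

Restoring from a snapshot is also how the volume changes type and grows: the new volume is created with the snapshot’s data but its own size, type, and IOPS.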

Under Volumes, the new volume is present and contains the same data as the original, but is larger in size and has provisioned IOPS.
Final Volume


It seems to me that EBS could be a very useful storage scheme for businesses that require the security of replicated data. However, if I were looking to start my own business, I would want to compare pricing versus storage amount to decide whether this form of storage is worthwhile from a cost perspective compared to S3.

Introduction to Amazon Elastic Compute Cloud (EC2) with Windows Server

Introduction and Aim
The purpose of this QwikLabs session is to run a Windows server through an Amazon EC2 instance.
For more information on EC2, check out my blog ‘Introduction to Amazon Elastic Compute Cloud (EC2)‘.


  • Logging into the Amazon Management Console
  • Creating a Windows Server instance from an Amazon Machine Image (AMI)
  • Finding the instance in the Amazon Management Console
  • Logging into the instance


Logging into the Amazon Management Console
When logging into Amazon services, I ensure that I am logging in through the https://console.aws.amazon.com website, as this provides access to my administration account but not my root account. This is healthy practice as a security measure and as a business technique. The next step is to check my region, as not all AWS services are available in every zone. My region is set to Sydney, which is optimal for what this lab involves: as Sydney is the closest region to where I live, latency is reduced while still providing the resources that I require.


Create an Amazon EC2 instance running Windows server
The Windows server that will be run on the instance is Windows Server 2012 R2 Base, which is available on the free tier so I have no qualms about choosing it.

The next move is to run through the configuration steps:
>Configure Instance Details: Everything is kept as default.
>Add Storage: Everything is kept as default
>Tag Instance: A name is created for the tag to assist in easy identification
>Configure Security Group: Leave the setting as ‘Create a new security group’, which has a rule opening port 3389 for RDP (Remote Desktop Protocol).
>Review Instance Launch: This is a summary of the configuration choices
Review Instance Launch

The final step is to choose or create a key pair, in which I choose my existing key pair. Once the instance has been launched, it is a matter of waiting until the instance state shows ‘running’.


Connect to your Amazon EC2 instance
In order to connect to the instance, I need an RDP client. As I’m already using a Windows computer, I am able to obtain an RDP file when I connect to the instance.
Connect to Instance Popup Window Ws

Once the RDP file is downloaded, I can get the password that will be used for the Windows instance. The password is acquired by providing my private key, which grants me access to the encrypted password; the decrypted form of the password is used in the Windows instance.
InkedCon2Inst Get Password Ws_LI
(The above screenshot has the encrypted password and Key Name whited-out for security reasons).
Now that an RDP program is available and the password has been determined, I can complete the Windows server launch. The RDP client automatically connects to the server, so all that is required is the password input. The result is as expected: a Windows 2012 instance is launched, and appears as in the slideshow below.



Amazon EC2 is proficient at running both Windows servers and Linux servers (which were used in the previous lab). It is interesting to me that the Windows layout is far more application-oriented compared to the command line of the Linux server. This may be due to the different setups of the operating systems, and is something that I could potentially look into further.

Introduction to Amazon Elastic Compute Cloud (EC2)

Introduction and Aim
The purpose of this lab is to gain basic knowledge and experience of Amazon EC2. QwikLabs describes Amazon EC2 as a ‘web service that provides resizable compute capacity in the cloud’. EC2 is designed for developers who want complete control over their compute capacity and resources, with a need or desire to re-scale the capacity as conditions change.


  • Log in to the AWS Management Console
  • Create an Amazon Linux Instance from an Amazon Machine Image (AMI)
  • Find your instance in the Amazon Management Console
  • Log into your instance

Create an Amazon Linux instance from an Amazon Machine Image (AMI)
The Linux instance that I will be using is General purpose t2.micro. The lab script suggests General purpose t2.small, but I have decided not to pick it because the t2.micro is on the free tier, whereas the t2.small is not. I consider this to be a justifiable change, as my understanding is that this part of the lab is focused on learning how to create an instance from an AMI. As AWS charges differ with the size of the instance, I’m choosing to attempt this lab with the instance size available on the free tier.
InkedInstance Type_LI

The next step is to Configure Instance Details. The lab asks to keep all of the options as default but makes note that this is where I could adjust settings such as the network settings, monitoring, and access settings.
InkedConfigure Instance Details_LI

Following the Configure Instance Details, is Add Storage where nothing is adjusted for this lab, then Add Tags. Tags are useful when there are many instances that will be launched, as tagging the instances makes them easier to identify.
The final step before launching the instance is Configure Security Group. The default security group is kept, and it allows port 22 to this instance. At this point, Amazon brings up a security warning, as the source is set to 0.0.0.0/0, which allows all IP addresses and is a potential security risk.
Configure Security Group Instance Creation

While launching the instance, the Amazon Management Console organizes key pairs. These are a set of public and private keys that are used to access the instance. As I already have a key pair that was made during the QwikLab VPC lab, I can choose that rather than create another one.
That is everything that is required for the launching of the instance.

Windows User: Connecting to your Amazon EC2 instance via SSH
In order to run the instance, I need the program PuTTY, which is a secure shell client, and the public DNS corresponding to my instance.

The public DNS is copied into the Host Name of the PuTTY program as an extension to ‘ec2-user@’.
PuTTy host name

Then, in the SSH Connection category, the private key is used for authentication. The previous time I used PuTTY to run an instance, I created a passphrase, which is an extra layer of security available when starting an instance.
PassPhrase w PuTTy

Once the passphrase is entered, I am granted access to the EC2 instance.
PuTTy w Correct Passphrase

On the Linux command line, I can type out basic bash prompts and receive a response from the instance. As I’m not familiar with bash, I searched online for some basic commands. The website I accessed is at the bottom of this blog.
CMD line Linux Output
An interesting observation here is that this instance is set to a different timezone than NZDT.

Amazon EC2 has been interesting to work with as I’m not used to creating elastic compute clouds. The next step in instance configuration for myself, outside of a QwikLab task, would be to experiment with the Configure Instance Details setting and gain further knowledge and experience of how to personalize the configuration. Although I am not familiar with bash for the Linux command line, it would be interesting to learn more in order to gain a greater understanding of the use of the Linux instance in both a personal and professional environment.



SS64 (n.d.). An A-Z Index of the Bash command line for Linux. Retrieved from https://ss64.com/bash/




Introduction to Amazon Simple Storage Service (S3)

Introduction and Aim
The purpose of this lab is to understand Amazon Simple Storage Service. Amazon S3 is an internet storage utility designed for scalability, and is the same infrastructure that Amazon uses for its own websites.


  • Create a Bucket in S3
  • Add an Object to Amazon S3
  • View an Object in a Bucket
  • Delete an Object from a Bucket in Amazon S3

Amazon S3
As stated in the Introduction and Aim section, S3 is a highly scalable global storage infrastructure. In S3, data is stored as objects in buckets. An object contains a file and data describing the file. The AWS Management Console provides the ability to store objects within folders, and folders within folders.

Creating a Bucket in Amazon S3
As with every QwikLab lab, the first action required is to ‘verify your region’. For me, the region is Sydney.
I have already created buckets for previous labs, but will do so again for this lab. Creating a bucket does not cause a charge to my account, so there is no concern in making another bucket for the purpose of the lab.
Creating Bucket
An interesting piece of information about bucket creation from QwikLabs is that the region choice will influence latency and costs, and can address regulatory requirements. Although I knew of the latency issue, I hadn’t considered cost or regulatory requirements before now.

Adding an object to Amazon S3
Again, adding a file to an S3 bucket is not unfamiliar to me. However, I haven’t explored the use of metadata or control access permissions before.
InkedUploaded File to Bucket_LI

As I have just discovered, both can be accessed within the properties of a selected uploaded file.

Whilst I’m not needing to adjust any of the current settings for this lab, this knowledge will likely be beneficial in the future.

Upload a folder
The method for uploading a folder is exactly the same as that for uploading a file. In this upload, I have chosen a folder that contains a .txt document.
Upload Folder
Once fully uploaded, the folder appears just below the file upload in the Transfers window.
Uploaded Folder to Bucket

Viewing an object in Amazon S3
As the objects are now uploaded to internet storage, they can be viewed within a browser or downloaded onto the computer.
Popup file

QwikLabs makes mention that the default setting for S3 buckets and for objects is private. This can be changed by right-clicking on the object that you want to change.
Private v Public
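Besides the right-click menu, object access can also be controlled with a bucket policy. The sketch below would make a single object publicly readable; the object key is a hypothetical example (‘qwiklab-s3’ is the bucket used in this lab), and this is only an illustration of the policy shape, not what the lab asks for.

```json
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Sid": "PublicReadForOneObject",
      "Effect": "Allow",
      "Principal": "*",
      "Action": "s3:GetObject",
      "Resource": "arn:aws:s3:::qwiklab-s3/example.txt"
    }
  ]
}
```

Scoping the `Resource` to a single key keeps the rest of the bucket private, which fits the default-private behaviour QwikLabs describes.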

Moving an object in Amazon S3
Objects can be moved between different buckets. As I already have another bucket, net-702, I can move an object from my qwiklab-s3 bucket to that bucket. This involves a right-click and ‘Copy’ from the qwiklab-s3 bucket, followed by a right-click and ‘Paste’ into the net-702 bucket.
Moving between buckets
The object movement is noted within the Transfers window.

Deleting an object and bucket in Amazon S3
Deleting objects within a bucket is simply a matter of a right-click then ‘Delete’. Deleting a bucket, however, involves a slightly more rigorous method, as the AWS Management Console requires manual input specifying which bucket is to be deleted.
Deleting Bucket containing objects
It is important to delete the uploaded data that I no longer need to use as S3 charges for storage within the buckets.

I have found S3 storage to be a very user-friendly storage environment. If I were to pursue using S3 as a storage facility, I would certainly take a further look into permissions, both for a bucket and for specific files or folders. From a business perspective, this could be an ideal method in managing employee access to confidential information for specific clients.