AWS S3 Storage Concepts for Windows Admins


If you happen to be a Windows Admin (you know who you are) and are beginning to work in the Amazon Web Services environment, there are some things you need to know! Unless you are at least a little familiar with Linux, you are in for an adventure in confusion. As with any product, there is a learning curve when you first begin to explore it. The first challenge is learning to “speak the same language.” Even though I have experience using Linux, I still stumbled until I digested the explanations. Be careful! The language/concept barrier can lead to a great deal of confusion and difficulty. Many AWS terms have special meanings; in fact, it is almost a different way of thinking. This blog will cover how I went about breaking the barrier and how you can too!

The Concepts Problem

Learning to work with Amazon Simple Storage Service (S3) is a good example of the challenges a “Windows Admin” faces when learning to work with Amazon Web Services (AWS). The heart of the issue for me was that the Amazon paradigm is very Linux-oriented in its structure, naming conventions, API support, documentation, and organizational conventions. For Windows Admins this can be daunting. There are many new terms for old familiar concepts, such as “buckets” instead of directories or “keys” instead of filenames. Sadly, the Amazon documentation can be confusing because it is written from a Linux perspective.


While deploying a storage solution using the Amazon Simple Storage Service (S3), I learned the hard way (once again) that a simple task can be very difficult if you don’t understand the key terms. For instance, a recent tutorial that I was working through referred to a user’s login for working with AWS resources through a command line interface as a “Key” and their password as the “Secret Key.” If you are using the management console, these are in addition to the Account ID, User Name, and login Password. In reality, these are the Access Key ID and Secret Access Key, credentials used to sign API requests rather than a true login and password, so describing them as one is confusing at best.
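For reference, the AWS command-line tools read these two values from a shared credentials file (on Windows, %USERPROFILE%\.aws\credentials). A minimal sketch of that file, using the obviously fake placeholder values from Amazon's own documentation:

```ini
[default]
aws_access_key_id = AKIAIOSFODNN7EXAMPLE
aws_secret_access_key = wJalrXUtnFEMI/K7MDENG/bPxRfiCYEXAMPLEKEY
```

Once this file is in place, the CLI and the AWS PowerShell module can authenticate without you typing the keys into every command.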

Command Line Archiving

I began my search by looking for a way to set up a network share (a Windows concept). I started searching for a solution that would allow me to create a mapped drive that could be used with familiar Windows backup solutions. I was approaching the solution from a Windows perspective, which was my first mistake. I found numerous third-party solutions (for a price) that would simplify the process, but I did not have that option, and many of the free solutions simply would not work on Windows servers. I tried, and still use, some of the free browser solutions for other tasks.

The Firefox S3 organizer is my favorite. However, after an exhaustive search, the solutions that I found were manual file transfers and not suitable for scheduled archiving of files. Eventually I learned that I could create a PowerShell script using the AWS module and create a Windows scheduled task that would execute a few command lines and archive the files. It took a long time to find because I was searching for the wrong thing. In Linux, it would be obvious, but from a Windows perspective, it required an adjustment in thinking. I should have been looking for AWS CLI commands. Below is the code that I found (the ********** values need to be entered):

PowerShell Code

# Load the default AWS credentials and region for this session
Initialize-AWSDefaults -ProfileName default -Region us-east-1

# Archive log files older than six days to S3, then remove the local copies
# ($env:root assumes a "root" environment variable pointing at the log drive)
foreach ($i in Get-ChildItem "$env:root\Logs\SPLogs")
{
    if ($i.CreationTime -lt ((Get-Date).AddDays(-6)))
    {
        if ($i.Length -gt 0)
        {
            # Write-Output "$i filename = $i"
            Write-S3Object -BucketName *********** -Key "*******/$i" -File $i.FullName
        }
        Remove-Item $i.FullName
    }
}

Caution: When the AWS module is loaded, some standard PowerShell commands behave in unexpected ways that a Windows Admin simply would not anticipate.


Critical AWS Terms

In order to understand the PowerShell commands and AWS instructions, it is necessary to become familiar with AWS terms. Below is a list of the critical terms that you need to understand when using S3 archiving.

1. Buckets

Top-level directories are referred to as buckets. A bucket is a container for file objects. Buckets have unique properties and only loosely resemble directories. Because buckets are accessed through URLs, their names must be globally unique. In addition, a bucket cannot simply be moved; it must be copied and the old bucket deleted. Most operations are performed through the AWS management console, but buckets can also be managed directly through several free browser tools; I have tried S3Browser and the Firefox S3 tools. One of the drawbacks is latency: it takes a little while for files to transfer and register, which should be kept in mind when creating archiving scripts. There is also a multi-part bulk upload feature that kicks in with large files or a large number of files, which can quickly overload resources.
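Because bucket names become part of a URL, they follow DNS-style rules rather than Windows folder-naming rules. The check below is a rough sketch of those rules (lowercase letters, digits, hyphens and dots, 3 to 63 characters, starting and ending with a letter or digit), not the full AWS specification:

```python
import re

def is_valid_bucket_name(name):
    """Rough check of S3 bucket naming rules; not the full AWS spec."""
    if not 3 <= len(name) <= 63:
        return False
    # Must start and end with a lowercase letter or digit
    return re.fullmatch(r"[a-z0-9][a-z0-9.-]*[a-z0-9]", name) is not None

print(is_valid_bucket_name("my-archive-bucket"))  # True
print(is_valid_bucket_name("MyBucket"))           # False: uppercase not allowed
```

A name like "SPLogs" that would be perfectly normal as a Windows folder fails these rules, which is exactly the kind of surprise this post is about.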

2. Keys

A key is the unique identifier for an object within a bucket. Every object in a bucket has exactly one key. Because the combination of a bucket, key, and version ID uniquely identifies each object, Amazon S3 can be thought of as a basic data map between “bucket + key + version” and the object itself. Every object in Amazon S3 can be uniquely addressed through the combination of the web service endpoint, bucket name, key, and, optionally, a version. For example, in the URL http://doc.s3.amazonaws.com/2006-03-01/AmazonS3.wsdl, “doc” is the name of the bucket and “2006-03-01/AmazonS3.wsdl” is the key.
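The addressing scheme above is just string assembly, which a short sketch makes concrete. This helper (my own illustration, not an AWS function) builds the path-style URL from the same pieces the paragraph names:

```python
def s3_object_url(endpoint, bucket, key, version_id=None):
    """Build the path-style URL that uniquely addresses an S3 object."""
    url = f"https://{endpoint}/{bucket}/{key}"
    if version_id:
        url += f"?versionId={version_id}"  # optional specific version
    return url

print(s3_object_url("s3.amazonaws.com", "doc", "2006-03-01/AmazonS3.wsdl"))
# https://s3.amazonaws.com/doc/2006-03-01/AmazonS3.wsdl
```

Note that the key "2006-03-01/AmazonS3.wsdl" contains a slash, but that slash is just a character in the key, not a real directory separator; the apparent folder hierarchy is a naming convention.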

3. Objects

Objects are the fundamental entities stored in Amazon S3. Objects consist of object data and metadata. The data portion is opaque to Amazon S3. The metadata is a set of name-value pairs that describe the object. These include some default metadata, such as the date last modified, and standard HTTP metadata, such as Content-Type. You can also specify custom metadata at the time the object is stored. An object is uniquely identified within a bucket by a key (name) and a version ID. 
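To make the data-plus-metadata split concrete, here is a toy model (plain Python, not the AWS API) of what one stored object conceptually holds:

```python
# Toy model of one S3 object: opaque data plus name-value metadata.
# S3 never interprets the data portion; it only stores and returns it.
s3_object = {
    "data": b"2017-01-01 10:00:00 ERROR something failed\n",  # opaque bytes
    "metadata": {
        "Content-Type": "text/plain",         # standard HTTP metadata
        "Last-Modified": "2017-01-01",        # default system metadata
        "x-amz-meta-source-server": "web01",  # custom, user-supplied pair
    },
}

print(s3_object["metadata"]["Content-Type"])  # text/plain
```

The "x-amz-meta-" prefix on the custom entry mirrors how S3 distinguishes user-supplied metadata from system metadata; the server name here is a made-up example.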

4. Versioning

Versioning is a means of keeping multiple variants of an object in the same bucket. You can use versioning to preserve, retrieve, and restore every version of every object stored in your Amazon S3 bucket. With versioning, you can easily recover from both unintended user actions and application failures. In one bucket, for example, you can have two objects with the same key but different version IDs.
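The behavior described above can be sketched in a few lines. This toy class (my own illustration, not the AWS API, and using simple counters instead of S3's opaque version IDs) shows how a PUT to an existing key adds a new version rather than overwriting the old one:

```python
from collections import defaultdict
from itertools import count

class VersionedBucket:
    """Toy model of a versioning-enabled bucket; not the AWS API."""

    def __init__(self):
        self._versions = defaultdict(list)  # key -> [(version_id, data), ...]
        self._ids = count(1)

    def put(self, key, data):
        # A write never overwrites: it appends a new version
        version_id = next(self._ids)
        self._versions[key].append((version_id, data))
        return version_id

    def get(self, key, version_id=None):
        versions = self._versions[key]
        if version_id is None:
            return versions[-1][1]           # latest version by default
        return dict(versions)[version_id]    # or a specific older version

b = VersionedBucket()
v1 = b.put("report.txt", "first draft")
b.put("report.txt", "final copy")
print(b.get("report.txt"))      # final copy
print(b.get("report.txt", v1))  # first draft
```

This is why versioning protects against unintended actions: the "overwritten" first draft is still there, retrievable by its version ID.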


The Take-Away

Amazon Web Services offers amazing features at reasonable prices. If you are a Windows Admin and are going to be working in the AWS environment, be aware that the terms and concepts can be confusing. My recommendation: read and digest all of the information before doing anything else, even if you are one of those people who disdain instructions and tutorials.

While you’re here, make yourself comfortable and check out our blog home page to explore other technologies we use on a daily basis and the fixes we’ve found in our day-to-day work. To make your life even easier, subscribe to our blog to get instant updates sent straight to your inbox.


