Amazon Web Services (AWS) Certified Security Specialty (CSS) Beta Exam

*** NOTE: AWS has pulled this certification, refunding all who took the exam. ***

I had the opportunity to take the AWS Certified Security Specialty Exam at re:Invent 2016. The exam is in a beta phase where questions are being tested and refined and the passing score is being set. I won't find out whether I passed until March 2017, and I can't share actual exam questions, but I can share advice for others who are interested in taking the exam in the future. Note that as of January 2017 the beta is closed, as it proved to be very popular.

Preparation:

I entered the exam cold, drawing only on my working knowledge of AWS and its services, so my perspective should be an unbiased view of the exam. There is an exam blueprint, but it has since been pulled from the AWS website.

Format:

  • ~3hr Exam Time
  • > 100 Questions
  • Reading Comprehension Questions
  • Question Nuances Where Important
  • Heavy Focus on Services and Service Components with Security Relationship
    • IAM
    • WAF
    • CloudFront
    • ACM
    • Security Groups
    • NACLs
    • VPC
    • etc.

My Exam Perspective:

I found the questions to be very long, requiring significant reading comprehension to answer, and the possible answers were similarly long. I had to read a number of questions at least twice to pick up on all of their nuances and be able to differentiate the validity of the answers. The questions had substantial parallels to security-related questions on other exams.

Amazon Cognito User Pool Admin Authentication Flow with AWS SDK For .NET

Implementing the Amazon Cognito User Pool Admin Authentication Flow with the AWS SDK for .NET offers a path to user authentication without managing the host of components otherwise needed to sign up, verify, store and authenticate a user. Though Cognito is largely framed as a mobile service, it is well suited to supporting web applications. To implement this process you would use the Admin Auth Flow outlined in the AWS-produced slide below. This example assumes that you have already configured a Cognito User Pool with an App, ensuring that "Enable sign-in API for server-based authentication (ADMIN_NO_SRP_AUTH)" is checked for that App on the App tab and that no App client secret is defined for that App, as App client secrets are not supported by the .NET SDK. It is also assumed that a Federated Identity Pool is configured to point to the aforementioned User Pool.

This auth flow bypasses the Secure Remote Password (SRP) protocol protections heavily used by AWS to prevent passwords from ever being sent over the wire. As a result, when used in a client-server web application, your users' passwords are transmitted to the server, and that communication must be protected with strong encryption to prevent compromise of user credentials. The below code implements a CognitoAdminAuthenticationProvider with Authenticate and GetCredentials members. The Authenticate method returns a wrapped ChallengeNameType and AuthenticationResultType set of responses. A challenge will only be returned if additional details are needed for authentication, in which case you would simply ensure those details are included in the UserCredentials provided to the Authenticate method and call Authenticate again. Once authenticated, an AuthenticationResultType will be included in the result and can be passed to the GetCredentials method to obtain temporary AWS credentials.
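A minimal sketch of such a provider, using the raw SDK response types rather than the wrapped result described above, might look like the following. The user pool ID, app client ID, identity pool ID and region are all hypothetical placeholders, not values from this article.

```csharp
using System;
using System.Collections.Generic;
using System.Threading.Tasks;
using Amazon;
using Amazon.CognitoIdentity;
using Amazon.CognitoIdentityProvider;
using Amazon.CognitoIdentityProvider.Model;

public class CognitoAdminAuthenticationProvider
{
    // Hypothetical placeholder identifiers.
    private const string UserPoolId = "us-east-1_EXAMPLE";
    private const string AppClientId = "exampleappclientid";
    private const string IdentityPoolId = "us-east-1:00000000-0000-0000-0000-000000000000";

    private readonly IAmazonCognitoIdentityProvider _client =
        new AmazonCognitoIdentityProviderClient(RegionEndpoint.USEast1);

    // Initiates the ADMIN_NO_SRP_AUTH flow. The response carries either a
    // ChallengeName (when more details are required) or an AuthenticationResult.
    public async Task<AdminInitiateAuthResponse> Authenticate(string userName, string password)
    {
        var request = new AdminInitiateAuthRequest
        {
            UserPoolId = UserPoolId,
            ClientId = AppClientId,
            AuthFlow = AuthFlowType.ADMIN_NO_SRP_AUTH,
            AuthParameters = new Dictionary<string, string>
            {
                { "USERNAME", userName },
                { "PASSWORD", password }
            }
        };
        return await _client.AdminInitiateAuthAsync(request);
    }

    // Exchanges the User Pool IdToken for temporary AWS credentials via the
    // Federated Identity Pool.
    public CognitoAWSCredentials GetCredentials(string idToken)
    {
        var credentials = new CognitoAWSCredentials(IdentityPoolId, RegionEndpoint.USEast1);
        credentials.AddLogin("cognito-idp.us-east-1.amazonaws.com/" + UserPoolId, idToken);
        return credentials;
    }
}
```

A production implementation would also inspect the ChallengeName on the response (for example NEW_PASSWORD_REQUIRED) and respond with AdminRespondToAuthChallenge before treating the user as authenticated.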

Getting Started with AWS Lambda C# Functions

For those of us who are .NET developers at heart, we finally have the ability to run serverless C# applications on AWS! Support for the C# language in AWS Lambda was announced at AWS re:Invent 2016 (1-Dec-2016). This post is a quick guide to help you get started.

C# support in Lambda requires .NET Core targeted assemblies, as the Core CLR's cross-platform support enables the Linux-based Lambda infrastructure to execute .NET compiled binaries. Lambda accepts the zipped build output of a .NET Core targeted class library rather than raw code for C# Lambda functions. Function handlers are referenced using the syntax <Assembly Name>::<Fully Qualified Class Name>::<Method Name>, which in the case of a project with an output assembly named "myassembly", a namespace of "myassemblynamespace", a class named "myclass" and a method named "mymethod" would be "myassembly::myassemblynamespace.myclass::mymethod". AWS provides a project type and tooling through its Toolkit for Visual Studio that enable creation of C# Lambda functions; however, you can build your own from a standard class library project.
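To make the handler-string mapping concrete, here is a minimal sketch using the example names above; the uppercase echo body is purely illustrative.

```csharp
// Minimal class library Lambda function matching the handler string
// "myassembly::myassemblynamespace.myclass::mymethod" from the example above.
namespace myassemblynamespace
{
    public class myclass
    {
        // Lambda invokes this method with the deserialized input payload
        // and returns the result to the caller.
        public string mymethod(string input)
        {
            return (input ?? string.Empty).ToUpper();
        }
    }
}
```

A real function would typically also reference Amazon.Lambda.Core and declare an assembly-level LambdaSerializer attribute so that JSON event payloads can be deserialized into richer parameter types.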

Prerequisites:

  1. Development Environment (See .NET Core Installation Guide)
    • Visual Studio 2015 with Update 3
    • .NET Core Tools
  2. AWS Visual Studio Toolkit
    • AWS SDK for .NET (v3.3.27.0 or greater required)
    • AWS Toolkit for Visual Studio (v1.11.0.0 or greater required)

Required Project References:

  • Amazon.Lambda.Core (Install-Package Amazon.Lambda.Core)
  • Amazon.Lambda.Serialization.Json (Install-Package Amazon.Lambda.Serialization.Json)
  • Amazon.Lambda.Tools (Install-Package Amazon.Lambda.Tools -Pre)
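The Amazon.Lambda.Tools package also wires a `dotnet lambda` command into the .NET Core CLI, driven by an aws-lambda-tools-defaults.json file in the project directory. A hypothetical example (all values below are placeholders) might look like:

```json
{
  "profile": "default",
  "region": "us-east-1",
  "configuration": "Release",
  "framework": "netcoreapp1.0",
  "function-runtime": "dotnetcore1.0",
  "function-handler": "myassembly::myassemblynamespace.myclass::mymethod",
  "function-name": "my-first-csharp-function",
  "function-memory-size": 256,
  "function-timeout": 30
}
```

With those defaults in place, `dotnet lambda deploy-function` from the project directory builds, packages and uploads the function without opening Visual Studio.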

FISMA, FedRAMP and the DoD CC SRG: A Review of the US Government Cloud Security Policy Landscape

The Federal Information Security Management Act (FISMA), a US law signed in 2002, defines the information protection requirements for US Government ("government") data and applies to all information systems that process any government data, regardless of ownership or control of those systems. Systems Integrators (SIs) under contract to perform work for the government are almost always provided some government furnished information (GFI) or government furnished equipment (GFE), and FISMA requirements extend to the systems owned and/or operated by these SIs if they store or process government data. Government data always remains under the ownership of the source agency, with that agency holding sole responsibility for determining the data's sensitivity level. It is usually a contractual requirement for an SI charged with management of government data to ensure FISMA compliance, and an SI is obligated to destroy or return all GFI and GFE at the end of the contractual period of performance. Government data falls into a number of information sensitivity categories, ranging from public information to the highest levels of classification, and the compliance requirements imposed by FISMA increase in lockstep with that sensitivity.

A large portion of government data under the management or control of most SIs will fall into the public or controlled unclassified information (CUI) buckets. Public data is rather straightforward in that it is publicly releasable and, if compromised, would have little to no impact on the public image, trust, security or mission of the owning government agency and/or its personnel; as such, it requires the least compliance overhead. CUI, on the other hand, is significantly more complex and nuanced. CUI could compromise the public image, trust, security or mission of the owning government agency and/or its personnel, and as such has restrictions applied to its distribution [https://www.archives.gov/cui/registry/category-list.html]. With Department of Defense (DoD) data, there are additional types of distribution restrictions defined in DoD Directive (DoDD) 5200.01 v4 [http://www.dtic.mil/whs/directives/corres/pdf/520001_vol4.pdf] and a host of marking requirements [http://www.dtic.mil/whs/directives/corres/pdf/520001_vol2.pdf]. A common misunderstanding of CUI requirements is that, due to its unclassified nature, it does not require significant security consideration. This is something to be cognizant of in any engagement with a government agency or SI, and it is advisable to inquire about CUI data restrictions, as this area comes with legal as well as contractual ramifications.

Using Linqpad to Query Amazon Redshift Database Clusters

Looking for a quick and easy way to query an Amazon Redshift database cluster? I was, and the first place I turned was my favorite tool for this kind of thing, Linqpad. I was a bit dismayed to find that no one, as far as I could tell, has developed a Linqpad database driver for Redshift. Small note: there are a few PostgreSQL options, and Redshift is supposed to be PostgreSQL compatible; however, none of them seemed to work for Redshift.

Giving credit to the author of this article http://forum.linqpad.net/discussion/384/how-to-connect-to-and-query-a-ms-access-database-mdb-and-accdb describing the use of Linqpad for connections to MS Access, I made a few tweaks and, boom, I have a working way to connect to and query Redshift. So in the pay-it-forward spirit, I thought I'd share.


// PREREQUISITES:
// (1) Copy and paste this entire block of code into a Linqpad query window (no connection needed) and change the language to C# Statement(s).
// (2) To use the .NET ODBC assembly, press F4, click the "Additional Namespace Imports" tab, add "System.Data.Odbc"
//     (no quotes) on a single line, and click OK.
// (3) Install the x86 Amazon Redshift ODBC driver (http://docs.aws.amazon.com/redshift/latest/mgmt/install-odbc-driver-windows.html).
//     The x64 driver does not work.
// (4) Update the query settings.

// ************************************************ Update Settings Below ************************************************
string endpoint = "";
string database = "";
string user = "";
string pass = "";
string port = ""; // Default is 5439

string table = "";
string query = "SELECT * FROM " + table; //Optionally Update Query
// ************************************************ End Update Settings Section ************************************************

// ************************************************ Do Not Modify Below ************************************************
string connectionString = "Driver={Amazon Redshift (x86)}; Server="+endpoint+"; Database="+database+"; UID="+user+"; PWD="+pass+"; Port="+port;

using (OdbcConnection connection = new OdbcConnection(connectionString))
{
    Console.WriteLine("Connecting to [" + connectionString + "]");
    try
    {
        Console.WriteLine("Executing query [" + query + "]");

        if (query.StartsWith("SELECT", StringComparison.OrdinalIgnoreCase))
        {
            // The data adapter opens and closes the connection automatically.
            using (OdbcDataAdapter adapter = new OdbcDataAdapter(query, connection))
            {
                DataSet data = new DataSet();

                adapter.Fill(data, table);

                Console.WriteLine("Found [" + data.Tables[0].Rows.Count + "] rows");

                data.Dump();
            }
        }
        else
        {
            connection.Open();
            using (OdbcCommand command = new OdbcCommand(query, connection))
            {
                var impactedRows = command.ExecuteNonQuery();

                Console.WriteLine("[" + impactedRows + "] rows impacted");
            }
        }
    }
    catch (Exception ex)
    {
        Console.WriteLine(ex.ToString());
    }
}
// ************************************************ End Do Not Modify Section ************************************************

NOTICE: All thoughts/statements in this article are mine alone and do not represent those of Amazon or Amazon Web Services. All referenced AWS services and service names are the property of AWS. Although I have made every effort to ensure that the information in this article was correct at the time of writing, I do not assume and hereby disclaim any liability to any party for any loss, damage, or disruption caused by errors or omissions, whether such errors or omissions result from negligence, accident, or any other cause.