
Amazon Web Services (AWS) Certified Security Specialty (CSS) Beta Exam

*** NOTE: AWS has pulled this certification refunding all that took the exam. ***

I had the opportunity to take the AWS Certified Security Specialty Exam at re:Invent 2016. The exam is in a beta phase in which questions are being tested and refined and the passing line is being set. I won't find out if I passed until March 2017, and I can't share actual exam questions, but I can share advice for others interested in the exam in the future. Note that as of Jan 2017 the beta is closed, as it has proved to be very popular.

Preparation:

I entered the exam cold, drawing only on my working knowledge of AWS and its services, so my perspective should be an unbiased view of the exam. There is an exam blueprint, but it has since been pulled from the AWS website.

Format:

  • ~3hr Exam Time
  • > 100 Questions
  • Reading Comprehension Questions
  • Question Nuances Where Important
  • Heavy Focus on Services and Service Components with a Security Relationship
    • IAM
    • WAF
    • CloudFront
    • ACM
    • Security Groups
    • NACLs
    • VPC
    • etc.

My Exam Perspective:

I found the questions to be very long, requiring significant reading comprehension to answer, and the answer choices were similarly lengthy. I had to read a number of questions at least twice to pick up on all of their nuances and differentiate answer validity. The exam's questions had substantial parallels to security-related questions on other exams.

FISMA, FedRAMP and the DoD CC SRG: A Review of the US Government Cloud Security Policy Landscape

The Federal Information Security Management Act (FISMA), a US law signed in 2002, defines the information protection requirements for US Government ("government") data and applies to all information systems that process any government data, regardless of who owns or controls those systems. Systems Integrators (SIs) under contract to perform work for the government are almost always provided some government furnished information (GFI) or government furnished equipment (GFE), and FISMA requirements extend to systems owned and/or operated by these SIs if they store or process government data. Government data always remains under the ownership of the source agency, with that agency holding sole responsibility for determining the data's sensitivity level. It is usually a contractual requirement for an SI charged with management of government data to ensure FISMA compliance, and an SI is obligated to destroy or return all GFI and GFE at the end of the contractual period of performance. Government data falls into a number of information sensitivity categories, ranging from public information to the highest classification, and the compliance requirements imposed by FISMA increase in lockstep with that sensitivity.

A large portion of the government data under the management or control of most SIs will fall into the public or controlled unclassified information (CUI) buckets. Public data is rather straightforward in that it is publicly releasable and, if compromised, would have little to no impact on the public image, trust, security or mission of the owning government agency and/or its personnel; as such, it requires the least compliance overhead. CUI, on the other hand, is significantly more complex and nuanced. CUI data could compromise the public image, trust, security or mission of the owning government agency and/or its personnel, and as such has restrictions applied to its distribution [https://www.archives.gov/cui/registry/category-list.html]. With Department of Defense (DoD) data, there are additional types of distribution restrictions defined in DoD Directive (DoDD) 5200.01 v4 [http://www.dtic.mil/whs/directives/corres/pdf/520001_vol4.pdf] and a host of marking requirements [http://www.dtic.mil/whs/directives/corres/pdf/520001_vol2.pdf]. A common misunderstanding of CUI requirements is that, due to its unclassified nature, CUI does not require significant security consideration. This misunderstanding is something to be cognizant of in any engagement with a government agency or SI, and it is advisable to inquire about CUI data restrictions, as this area comes with legal as well as contractual ramifications.

ANSWERED: Amazon Web Services (AWS) Certified Solutions Architect (CSA) – Associate Level, Sample Exam Questions

There are many posts with various accounts of the AWS CSA exam, so I will try to keep mine concise and to the point. You need to know the basics of all AWS services. The exam is not weighted towards any one specific service over another, though some, such as IAM, cut across other services and come up several times. Questions are situational and focused on specific knowledge of various AWS services. The sample exam questions accurately represent the format of the questions on the exam. The questions focus on specific technical aspects and nuances of AWS services; it is perhaps more a test of your familiarity with their products than of your ability to apply their services to larger systems architecture and design requirements.

My studies for the AWS Certified Solutions Architect Exam began in the natural starting place: the sample exam questions provided by AWS. AWS does not provide answers to the questions, and though I knew the answers to most and could make a reasonable guess at others, I found myself researching a couple of subjects. Since I cannot give any specifics on questions I saw on the exam, I thought I would answer the sample questions.

*** UPDATE ***: When I first posted this article, there was no official study guide. Since that time, AWS has published an official exam study guide available on Amazon.com.
AWS Certified Solutions Architect Official Study Guide: Associate Exam

AWS Sample Exam Questions:

The 7 sample exam questions can be found at: http://awstrainingandcertification.s3.amazonaws.com/production/AWS_certified_solutions_architect_associate_examsample.pdf 

Questions:

  1. Amazon Glacier is designed for (Choose 2 answers)

    • Answer(s): B - infrequently accessed data, C - data archives.
    • Explanation: Glacier is an archival storage service. You are charged each time you access data beyond the free tier threshold. When you put data in Glacier, you should have a reasonable expectation that you will need to retrieve at most a small portion per month unless there is a disaster/emergency scenario.
    • Other Choices: The other choices suggest scenarios where data access is required much more frequently than the ideal Glacier use case.
  2. Your web application front end consists of multiple EC2 instances behind an Elastic Load Balancer. You configured ELB to perform health checks on these EC2 instances. If an instance fails to pass health checks, which statement will be true?

    • Answer(s): C - The ELB stops sending traffic to the instance that failed its health check.
    • Explanation: ELBs are designed to dynamically forward traffic to the eth0 interface of a set of EC2 instances in one or more Availability Zones of a single region. When health checks are set up, the ELB will see that the instance is not responding and stop sending traffic to the failed instance.
    • Other Choices: The other choices suggest that an ELB will take unsupported or inaccurate actions against your instances or actions that are capabilities of other services, specifically Auto Scaling.
  3. You are building a system to distribute confidential training videos to employees. Using CloudFront, what method could be used to serve content that is stored in S3, but not publicly accessible from S3 directly?

    • Answer(s): A - Create an Origin Access Identity (OAI) for CloudFront and grant access to the objects in your S3 bucket to that OAI.
    • Explanation: CloudFront is a CDN capability that distributes S3 objects geographically. An OAI is sort of like a service account for a CloudFront distribution. Using an OAI, you can restrict access to S3 content, effectively preventing direct access to content in S3 while still allowing CloudFront to distribute that data.
    • Other Choices: The other choices refer to actions that do not make sense in the context of the question.
  4. Which of the following will occur when an EC2 instance in a VPC (Virtual Private Cloud) with an associated Elastic IP is stopped and started? (Choose 2 answers)

    • Answer(s): B - All data on instance-store devices will be lost, E - The underlying host for the instance is changed
    • Explanation: It is important in this question to note that the instance is in a VPC in order to rule out other answers. Instance-store data persists only during the running life of the instance because instance storage is physically attached to the host, rather than being SAN-backed like EBS. Part of the reason instance storage persists only while an instance is powered on is that the underlying host changes when the instance is stopped and started. Remember that instance resources are very loosely coupled with other resources. When you start an instance, it gets a resource reservation on an available host, presumably chosen by some complex placement algorithm.
    • Other Choices: The other choices refer to behaviors of instances not in a VPC, are outright incorrect, or do not make sense in the context of the question. Reference the AWS article on behaviors when stopping or starting an instance: http://docs.aws.amazon.com/AWSEC2/latest/UserGuide/Stop_Start.html.
  5. In the basic monitoring package for EC2, Amazon CloudWatch provides the following metrics:

    • Answer(s): D - hypervisor visible metrics such as CPU utilization
    • Explanation: A responsibility boundary exists between the hypervisor and the guest operating system. AWS does not have access to the guest operating system and therefore cannot see anything that is not visible to the hypervisor. What is visible are the resource demands the guest operating system places on the hypervisor, such as CPU usage. Refer back to the shared responsibility model discussed in the AWS Security Whitepaper.
    • Other Choices: The other choices refer to data that would not be visible to the hypervisor and that would not be visible within CloudWatch unless published by the instance owner. See publishing custom metrics.
  6. Which is an operational process performed by AWS for data security?

    • Answer(s): B - Decommissioning of storage devices using industry-standard practices
    • Explanation: The key to this question is understanding the shared responsibility boundary between AWS and its customers, as well as the specific phrase "operational process". Again, we need to refer to the AWS Security Whitepaper. As a standard practice, AWS shreds all physical disks after magnetically wiping them as part of its decommissioning process.
    • Other Choices: The other options refer to processes or practices that cross the responsibility boundary or that simply do not make sense in the context of the question or AWS operations.
  7. To protect S3 data from both accidental deletion and accidental overwriting, you should:

    • Answer(s): A - enable S3 versioning on the bucket 
    • Explanation: By enabling versioning, you ensure that if an object is overwritten, accidentally or otherwise, every previous version of the object is persisted. In addition, you protect against complete loss from accidental deletion.
    • Other Choices: The other choices, though referring to valid S3 bucket features, would not provide any protection against deletion or overwriting.
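To make the S3-related answers concrete, here is a minimal Python sketch of the request payloads behind the versioning answer (question 7) and the OAI answer (question 3). The bucket name and OAI ID are placeholders of my own, not values from any exam, and the boto3 calls are shown only as comments:

```python
import json

def versioning_config():
    # Payload for S3's PutBucketVersioning (question 7): with versioning
    # enabled, overwrites and deletes keep prior object versions recoverable.
    return {"Status": "Enabled"}

def oai_read_policy(bucket, oai_id):
    # Bucket policy for question 3: only the CloudFront Origin Access
    # Identity may read objects, so the content is not directly public in S3.
    return {
        "Version": "2012-10-17",
        "Statement": [{
            "Sid": "AllowCloudFrontOAIReadOnly",
            "Effect": "Allow",
            "Principal": {
                "AWS": f"arn:aws:iam::cloudfront:user/CloudFront Origin Access Identity {oai_id}"
            },
            "Action": "s3:GetObject",
            "Resource": f"arn:aws:s3:::{bucket}/*",
        }],
    }

# Applied with boto3, roughly:
#   s3 = boto3.client("s3")
#   s3.put_bucket_versioning(Bucket="training-videos",
#                            VersioningConfiguration=versioning_config())
#   s3.put_bucket_policy(Bucket="training-videos",
#                        Policy=json.dumps(oai_read_policy("training-videos", "E2EXAMPLE")))
print(json.dumps(oai_read_policy("training-videos", "E2EXAMPLE"), indent=2))
```

Note how the two pieces reinforce each other: versioning protects the objects from loss, while the OAI-only policy keeps them private without breaking CloudFront delivery.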
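The custom metrics mentioned under question 5 follow the same idea in reverse: since the hypervisor cannot see inside the guest OS, metrics like memory utilization must be published by the instance itself. A minimal sketch (the namespace and metric name are illustrative assumptions, and the boto3 call is a comment):

```python
def custom_metric(name, value, unit="Percent"):
    # One MetricData entry in the shape CloudWatch's PutMetricData API
    # expects; guest-OS details like memory usage are invisible to the
    # hypervisor, so the instance must report them itself.
    return {"MetricName": name, "Value": value, "Unit": unit}

# Applied with boto3, roughly:
#   cw = boto3.client("cloudwatch")
#   cw.put_metric_data(Namespace="Custom/EC2",
#                      MetricData=[custom_metric("MemoryUtilization", 71.5)])
print(custom_metric("MemoryUtilization", 71.5))
```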
NOTICE: All thoughts/statements in this article are mine alone and do not represent those of Amazon or Amazon Web services. All referenced AWS services and service names are the property of AWS. Although I have made every effort to ensure that the information in this article was correct at writing, I do not assume and hereby disclaim any liability to any party for any loss, damage, or disruption caused by errors or omissions, whether such errors or omissions result from negligence, accident, or any other cause.

Cloud Hosting and the US Defense Industry: Who's Winning the Hearts and Minds of Leadership and Security?

With shrinking budgets, many wonder how the US Defense Department will maintain the infrastructure necessary to sustain the force. IT infrastructure supports everything from the acquisition, logistics and sustainment of something as simple as a bandage to something as complex as an aircraft carrier. It would simply not be possible to look away from the IT resources that have become a critical component of our ability to operate, yet the cost of operations has never been higher. The DoD finds itself in a bit of a quandary, and many organizations are rushing to consolidate data centers, reduce support or decommission older systems.

So what, you might ask, are they to do to weather this storm? I personally believe that the commercial world has the solution, the Cloud, and you had better believe that defense contractors are all scrambling to build a show of knowledge and flashy prototypes aimed at courting the fancy of those in the DoD already looking towards this path. One might assume that the federal government would have already embraced the Cloud; however, the DoD is a different animal with its own mindset. Cloud computing providers like Amazon Web Services (AWS) are feverishly working to overcome a mountain of security folks clamoring over the challenges that the Cloud presents and the hordes of naysayers touting everything from concerns over intellectual property rights to the belief that the only people to be trusted with DoD information are the DoD. For those who work in this industry, it is well known that the DoD's IT expertise ranges from the best in the business to the outright embarrassing, with a tendency towards the latter. Nonetheless, it is my belief that the numbers will win out and that the corporate world will triumph, moving from a service industry supporting DoD infrastructure to providing the infrastructure outright.

In June 2012, the DoD CIO released the Cloud Computing Strategy, suggesting that the Defense Information Systems Agency (DISA) lead the charge in establishing a brokerage for Cloud services, be it from a commercial source or a DISA service offering yet to be established. This flies in the face of other offerings, like the one established by the Department of the Navy's Space and Naval Warfare Systems Command (SPAWAR) program in cooperation with AWS, which is already several years ahead of DISA. The DoD has an internal battle for ownership of the Cloud, but here again I believe that those who hold the operationally viable offering will win the war. SPAWAR has nearly gained approval for its Cloud brokerage service, knocking down many of the security barriers to Cloud hosting of systems, and I believe it will set the rules of the road for the DoD's future IT strategy.

Even though many of the security barriers have been knocked down or are soon to fall, would-be system owners still have to overcome the naysayers within their organizations. In my opinion, this is the next big challenge facing Cloud hosting of DoD systems. I believe that in the coming years Cloud hosting will replace most of the unclassified data centers within the DoD, and providers like AWS will see significant market growth as a result. DISA, which has been appointed the lead by the DoD CIO, stands to lose a huge chunk of its core business, which will generate resistance from within and, in my opinion, render its efforts to facilitate Cloud brokerage unsuccessful. I believe we will see several DoD Cloud brokerages emerge and a number of opportunities for the Defense industry to support.