
AWS Systems Manager Cross-Account Parameter Sharing

On Feb 22, 2024, AWS launched a new capability to share Systems Manager parameters across AWS accounts.

AWS Systems Manager Parameter Store is a service that helps you safely store configuration data. It now lets you share advanced-tier parameters with other AWS accounts, which makes it easier to manage all your configuration data from one central location.

Many customers have workloads spread across different AWS accounts, all relying on the same configuration data. With this update, you can avoid the hassle of copying and syncing data manually. Instead, you can share parameters with the other accounts that need access, creating a single, reliable source for all your configuration data.

In this blog post, I will guide you through the process of sharing an AWS Systems Manager parameter with another AWS account. Here are the steps to create and share a parameter:

  1. First, open Parameter Store and click the ‘Create parameter’ button
  2. Provide a name (InstanceType in our case) and make sure to select the Advanced tier. Note that cross-account parameter sharing is only possible for advanced-tier parameters
  3. For this demo, set the type to String and the value to t3.medium
  4. Click the ‘Create parameter’ button
  5. The parameter is now created
  6. Open the parameter, select the ‘Sharing’ tab, and click the Share button
  7. This opens a new window for AWS Resource Access Manager (RAM)
  8. Click ‘Create resource share’
  9. Enter a name and, in the Resources drop-down, select ‘Parameter Store Advanced Parameters’
  10. This shows the advanced parameters created in this account. Select the parameter you want to share with other accounts, in our case InstanceType, and then select Next
  11. Choose the permission for the share, either AWS managed or customer managed. For this demo, let’s choose the AWS managed permission
  12. Select the AWS account you want to share the parameter with. In our case, we share it with the account named ‘AI-POC’
  13. Click Next and then click ‘Create resource share’
  14. Your resource share is now created (an equivalent CLI sketch follows after this list)
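If you prefer the AWS CLI, the same flow can be scripted. This is a minimal sketch, not the exact console walkthrough above: the share name and the account-ID placeholders are illustrative.

    # Create an advanced-tier parameter (required for cross-account sharing)
    aws ssm put-parameter \
        --name "InstanceType" \
        --value "t3.medium" \
        --type String \
        --tier Advanced

    # Share it through AWS RAM with the consumer account
    aws ram create-resource-share \
        --name "InstanceTypeShare" \
        --resource-arns "arn:aws:ssm:us-east-1:<ProducerAWSAccount#>:parameter/InstanceType" \
        --principals "<ConsumerAWSAccount#>"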

Next, let’s see how to access the parameter created above from the consumer account:

  1. Log in to the consumer AWS account, in our case ‘AI-POC’
  2. Go to Resource Access Manager (RAM) and click ‘Resource shares’ under ‘Shared with me’
  3. You will see the parameter that was shared from the producer account
  4. The consumer account can access shared parameters using the AWS CLI and AWS SDKs, referencing the parameter by its full ARN
  5. Here is the CLI command: aws ssm get-parameter --name arn:aws:ssm:us-east-1:<ProducerAWSAccount#>:parameter/InstanceType
  6. Running the command returns the parameter value; a sample invocation and output are sketched below
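For reference, this is roughly what the call and its response look like from the consumer side. The Region and account-ID placeholder are carried over from the command above, and the exact response fields may vary slightly.

    # Run from the consumer account (AI-POC)
    aws ssm get-parameter \
        --name "arn:aws:ssm:us-east-1:<ProducerAWSAccount#>:parameter/InstanceType"

    # Sample (abridged) output:
    # {
    #     "Parameter": {
    #         "Name": "arn:aws:ssm:us-east-1:<ProducerAWSAccount#>:parameter/InstanceType",
    #         "Type": "String",
    #         "Value": "t3.medium",
    #         "Version": 1
    #     }
    # }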

Important Notes:

  • Consumer accounts receive read-only access to your shared parameters
  • The consumer can’t update or delete the parameter
  • The consumer can’t share the parameter with a third account
  • To share a parameter with your organization or an organizational unit in AWS Organizations, you must enable sharing with AWS Organizations
  • To share a SecureString parameter, it must be encrypted with a customer managed key, and you must share the key separately through AWS Key Management Service (a brief sketch follows)
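For a SecureString parameter, that means creating it with a customer managed KMS key and then granting the consumer account access to that key. A rough sketch, where the parameter name, value, and key alias are purely illustrative:

    # Create a shareable SecureString parameter encrypted with a customer managed key
    aws ssm put-parameter \
        --name "DbPassword" \
        --value "s3cr3t" \
        --type SecureString \
        --tier Advanced \
        --key-id "alias/shared-params-key"

    # The KMS key itself is not shared through RAM; grant the consumer account
    # kms:Decrypt on the key (for example via the key policy or a KMS grant)
    # so that it can read the decrypted value.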

I hope you find this blog post enjoyable and that it provides you with a high-level understanding of how cross-account parameter sharing operates. If you appreciate this post, please consider sharing it with others 🙂

Connect to an AWS EC2 Instance Without a Public IP Address

AWS has recently unveiled a great feature that enables direct connections to instances in a private subnet without the need to assign them a public IP address. With the help of EC2 Instance Connect (EIC) Endpoint, you can now conveniently establish SSH/RDP connections to these instances.

Previously, customers had to rely on a bastion host to establish connections with instances in a private subnet. However, managing a bastion host can be burdensome in itself.

In this blog, I will show you simple steps to SSH into an instance in a private subnet, both from the browser and from your local machine.

Note

There is no additional cost for using EIC endpoints. Standard data transfer charges apply.

Here are the steps to set up this configuration:

  1. First, ensure that the required IAM permissions are granted to the users who will use EC2 Instance Connect Endpoint
  2. Set up security groups for the EC2 Instance Connect Endpoint
    • Consider the following sample setup: an EC2 Instance Connect Endpoint with its own security group (EIC Endpoint SG) and an EC2 instance with its own security group (Development SG)
    • The EIC Endpoint SG has one outbound rule that allows TCP traffic to the Development SG. This configuration means the EC2 Instance Connect Endpoint can only send traffic to instances that are assigned the Development SG
    • The EIC Endpoint SG sends outbound TCP traffic (SSH and RDP) to the Development SG
    • The Development SG accepts inbound TCP traffic (SSH and RDP) from the EIC Endpoint SG
  3. Create a new endpoint as shown below
    • Name: MyEndPoint
    • Service category: EC2 Instance Connect Endpoint
    • VPC of your choice
    • Security group: EIC Endpoint Security Group
    • Subnet: select the private subnet in which the instance will be launched
    • Click the ‘Create endpoint’ button
  4. Provision a new EC2 instance in the private subnet and associate the Development SG with it
  5. Note that the new EC2 instance is created without a public IP address; only a private IP is assigned (10.0.4.135)
  6. To connect to the EC2 instance, simply select the instance and click the Connect button
  7. Select ‘Connect using EC2 Instance Connect Endpoint’
  8. Select the EC2 Instance Connect Endpoint created in step #3 and click Connect
  9. You are successfully connected to the EC2 instance from the browser
  10. Alternatively, if you want to connect to the EC2 instance from your local machine using PowerShell, simply run: aws ec2-instance-connect ssh --instance-id i-0a9e3ddcddfaedb2c (a fuller CLI sketch follows after this list)
  11. Ta-da, you are connected to the instance in the private subnet.
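
Here is a rough CLI equivalent of the security group rule, the endpoint creation, and the SSH connection. The subnet and security group IDs are placeholders, and the ssh subcommand requires AWS CLI v2.

    # Allow SSH from the EIC Endpoint SG into the Development SG (step 2)
    aws ec2 authorize-security-group-ingress \
        --group-id sg-0aaa1111developmentsg \
        --protocol tcp --port 22 \
        --source-group sg-0bbb2222eicendpointsg

    # Create the EC2 Instance Connect Endpoint in the private subnet (step 3)
    aws ec2 create-instance-connect-endpoint \
        --subnet-id subnet-0123456789abcdef0 \
        --security-group-ids sg-0bbb2222eicendpointsg

    # Open an SSH session through the endpoint (step 10)
    aws ec2-instance-connect ssh --instance-id i-0a9e3ddcddfaedb2c
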
I hope you found this article enjoyable!! Feel free to share it within your network.

Walkthrough of AWS User Notifications

On May 3, 2023, AWS launched a new service called ‘AWS User Notifications’. AWS User Notifications enables users to centrally set up and view notifications from AWS services, such as AWS Health events, Amazon CloudWatch alarms, or EC2 instance state changes, in a consistent, human-friendly format.

Users can view notifications across accounts, regions, and services in a Console Notifications Center, and configure delivery channels, like email, chat, and mobile push notifications, where they can receive these notifications.

Here are the steps to configure AWS User Notifications in your account:

  • Go to the AWS Console
  • Search for the ‘AWS User Notifications’ service
  • Click the ‘Create notification configuration’ button
  • Fill in the event details
  • In the Event rules section, specify the AWS service for which you want to receive notifications, the event type, and the Region
  • In this blog, we configure an event rule to receive a notification as soon as there is a state change in any EC2 instance within the US East (N. Virginia) Region
  • You can add multiple event rules
  • Next, configure the delivery timing of notifications based on your needs
  • Now configure how you would like to receive notifications. In this example, we selected email notifications to an email address and notifications via AWS Chatbot
  • Once done, click the ‘Create notification configuration’ button
  • At this stage, you have successfully completed the event configuration
 
Now let’s start an EC2 instance in the US East (N. Virginia) Region to trigger the notification.
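For example, from the CLI (the instance ID is a placeholder):

    # Starting a stopped instance in us-east-1 triggers the state-change notification
    aws ec2 start-instances --instance-ids i-0123456789abcdef0 --region us-east-1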
 
Soon you will automatically receive a notification in the Console Notifications Center.

You will also receive an email notification at the configured email address.

Here is the notification in the Microsoft Teams channel configured via AWS Chatbot.

Important Notes:

1. This service is offered by AWS at no additional cost

2. As of today, customization of the notification title or body is not supported

Similarly, you can configure event notifications for other AWS services like S3, ECS, EventBridge, Step Functions, and many more.

AWS Chatbot Integration With Microsoft Teams

Going forward you can use AWS Chatbot to view, troubleshoot and operate AWS resources directly from Microsoft Teams. By leveraging AWS Chatbot for Microsoft Teams or any other chat platforms, you can receive notifications from AWS services in your chat channels and execute infrastructure-related tasks by entering commands, eliminating the need to switch to another tool.

What is ChatOps?

Communicating and collaborating on IT operations tasks through chat channels is known as ChatOps. It allows cloud engineers to centralize the management of infrastructure and applications, and to automate and streamline workflows.

AWS launched Chatbot back in 2020 with Amazon Chime and Slack integrations. Since then, the chat platform ecosystem has evolved rapidly, and a large number of teams now use Microsoft Teams.

Cloud engineers generally want real-time notifications about system health, budgets, new security risks or threats, and the status of their CI/CD pipelines. This is where ChatOps integration with MS Teams can help. Additionally, you can enter most AWS Command Line Interface (AWS CLI) commands directly into the chat channel to retrieve additional telemetry or resource information, or to execute runbooks to resolve issues.

AWS Chatbot allows you to create custom aliases for frequently used commands and their parameters, reducing the steps needed to complete a task. These flexible aliases can include personalized parameters, making command entry easier. AWS Chatbot’s natural language processing allows you to ask questions in everyday language, and receive relevant AWS documentation or support article extracts as answers. You can also use natural language to execute commands. E.g. show me my ec2 instances in us-east-1.
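For illustration, once a Teams channel is configured, commands are sent by mentioning the bot followed by standard AWS CLI syntax. The commands below are typical illustrative examples:

    @aws ec2 describe-instances --region us-east-1
    @aws cloudwatch describe-alarms --state-value ALARM --region us-east-1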

In this video I will showcase how to configure the Integration Between AWS Chatbot and Microsoft Teams.

Enable AWS Systems Manager for all EC2 instances in an account

Recently, on Feb 17, 2023, AWS released a new feature that enables customers to onboard all EC2 instances in an account with AWS Systems Manager, and with minimal configuration at that. Isn’t it great!!

Did you Know?

Any instance or node that is configured for AWS Systems Manager is called a managed instance or managed node, whether it is an AWS EC2 instance, an Azure VM (hybrid environment), or an on-premises server.

Previously, if an EC2 instance needed to be configured as a managed instance, an IAM instance profile or custom role had to be attached to every EC2 instance manually. This could get cumbersome when managing EC2 instances at scale.

This scalability is now possible with a new feature called Default Host Management Configuration (DHMC). DHMC simplifies the experience of managing EC2 instances by attaching permissions at the account level.

You can begin utilizing the benefits of DHMC in just a few clicks from the Fleet Manager console. This feature ensures that Patch Manager, Session Manager, and Inventory are available for all new and existing instances in an account.

 

Important:

  1. To leverage the Default Host Management Configuration feature, ensure that all instances in your account use Instance Metadata Service Version 2 (IMDSv2) and run SSM Agent version 3.2.582.0 or later.
  2. Default Host Management Configuration doesn’t support Instance Metadata Service Version 1.
  3. Instead of attaching an instance profile to each EC2 instance, you attach an IAM role at the Systems Manager level, and Systems Manager uses this role to manage your EC2 instances.
  4. You must turn ON the Default Host Management Configuration setting in each Region you wish to automatically manage your Amazon EC2 instances (a CLI sketch follows after this list).
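If you prefer the CLI over the Fleet Manager console, DHMC can be turned on per Region by updating the corresponding Systems Manager service setting. This is a sketch based on the documented setting ID; the account ID, Region, and role name are placeholders, and the role must be assumable by Systems Manager and carry the AmazonSSMManagedEC2InstanceDefaultPolicy managed policy.

    # Turn on Default Host Management Configuration in us-east-1
    aws ssm update-service-setting \
        --region us-east-1 \
        --setting-id arn:aws:ssm:us-east-1:111122223333:servicesetting/ssm/managed-instance/default-ec2-instance-management-role \
        --setting-value service-role/AWSSystemsManagerDefaultEC2InstanceManagementRole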

In this short video I will demonstrate how to use this new feature. 

Azure Automation Visual Studio Code Extension

In January 2023, Microsoft launched a preview of the Visual Studio Code extension for Azure Automation. Azure Automation is one of the commonly used Azure services, helping IT professionals automate mundane activities.

Azure Automation now provides a VS Code extension to create and manage runbooks. Using this extension, you can perform all runbook management operations, such as creating and editing runbooks, triggering a job, tracking recent job output, linking a schedule, managing assets, and debugging locally.

 

Pros
  • No need to go to the Azure Portal for managing runbooks
  • Improves overall end-to-end turnaround time for support
  • Local debugging – yes, you can debug your runbook locally. This was a headache for support engineers, since there was no provision for debugging a script from the Azure Portal (except relying on the output stream). Though this feature is still in preview, it will definitely be helpful going forward.
 
Limitations as of writing this blog (Feb 2023)
  • Creation of new schedules.
  • Adding new certificates in Assets.
  • Uploading module (PowerShell and Python) packages from the extension.
  • Auto-sync of local runbooks to the Azure Automation account; you have to explicitly Fetch or Publish the runbook.
  • Management of Hybrid Worker groups.
  • Graphical runbooks and workflows.
  • For Python, no debug options are provided; it is recommended to install a debugger extension for your Python scripts.
  • Currently, only unencrypted assets are supported in local runs.
Please watch this video to understand how to create and author a runbook with VS Code.
 

Connect a Non-Azure VM or AWS VM with an Azure Automation Account

If you are a cloud support engineer handling day-to-day cloud operations, then you are very likely responsible for patching and installing VM updates.

As we know, Azure provides an option to automate the installation of updates at scale with the help of an Automation Account. You can handle update installation not only for Azure VMs, but also for non-Azure VMs such as VMs from other cloud providers like AWS and GCP (multi-cloud), on-premises VMs, and so on. Here are the steps to connect an AWS VM with Azure:

1. Set up an Automation Account
  • Go to the Azure Portal https://portal.azure.com/
  • Create a new Automation Account
  • Once done, go into the newly created Automation Account
  • In the left-hand menu, select ‘Update Management’
  • We need to create a Log Analytics workspace, which captures all log data sent from the virtual machines
  • Select ‘Create New Workspace’ in the drop-down and click the Enable button
  • It will take approximately 5 minutes to set up the new workspace (an Azure CLI alternative is sketched after this list)
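If you prefer the Azure CLI over the portal, the Automation Account and Log Analytics workspace from this step can be created roughly as follows. The resource group, names, and location are placeholders, and az automation requires the ‘automation’ CLI extension; linking Update Management is still done in the portal as described above.

    # Create a resource group, an Automation Account, and a Log Analytics workspace
    az group create --name rg-patching --location eastus

    az automation account create \
        --automation-account-name aa-patching \
        --resource-group rg-patching \
        --location eastus

    az monitor log-analytics workspace create \
        --resource-group rg-patching \
        --workspace-name law-patching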
 
2. Get Log Analytics Workspace Details
  • You need to get the agent details from the workspace
  • Go to the workspace created in Step 1
  • In the left-hand menu, select ‘Agents management’
  • Based on the OS type (Windows/Linux), download the agent installation file
  • Also, copy three important details: the Workspace ID, Primary Key, and Secondary Key
 
3. Install Agent on Non-Azure VM
  • Let’s create a new Windows machine in AWS
  • Log in to the virtual machine
  • We need to install the agent which was downloaded in Step 2
  • Copy the agent file MMASetup-AMD64.exe to the AWS VM
  • Run the exe file to start the installation process
  • Follow the instructions and select the ‘Connect the agent to Azure Log Analytics’ option
  • Enter the Workspace ID and Primary Key that you copied in Step 2, then click Next
  • Finish the installation (a silent-install sketch follows after this list)
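Alternatively, the Log Analytics (MMA) agent can be installed silently from an elevated command prompt on the AWS VM. This is a sketch based on the agent’s documented command-line options; the extraction folder is arbitrary, and the Workspace ID and Primary Key are the values copied in Step 2.

    REM Extract the installer, then run setup silently and connect it to the workspace
    MMASetup-AMD64.exe /c /t:C:\MMAInstall
    cd C:\MMAInstall
    setup.exe /qn NOAPM=1 ADD_OPINSIGHTS_WORKSPACE=1 ^
      OPINSIGHTS_WORKSPACE_AZURE_CLOUD_TYPE=0 ^
      OPINSIGHTS_WORKSPACE_ID="<WorkspaceID>" ^
      OPINSIGHTS_WORKSPACE_KEY="<PrimaryKey>" ^
      AcceptEndUserLicenseAgreement=1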

Come back to the ‘Update management’ option in the Automation Account. You will notice it has already started detecting that one new VM is connected and sending logs to the Log Analytics workspace. Just click ‘Click to manage machine’.


This is the EC2 VM; just click the Enable button.



It will take approximately 45 minutes for the AWS VM to show up in Azure Update Management, and from there you can monitor status and compliance and schedule update deployments to the AWS VM. Please note the Platform shows as Non-Azure and the OS as Windows.



This is one of the multi-cloud scenarios. Happy learning!!

Getting Started with OpenAI API and Postman

Recently there has been a big wave of interest in artificial intelligence from OpenAI, and you may have seen people talking about ChatGPT and GPT-3 (Generative Pre-trained Transformer). As the name suggests, GPT-3 is one of the largest pre-trained language models, using deep learning to produce human-like answers.

Did you Know?

OpenAI’s GPT-3 is one of the largest language models, with 175 billion parameters, roughly 10x more than Microsoft’s Turing NLG.

In this article, I will show you how to use the OpenAI APIs, so let’s begin:

1. Create a new account on the OpenAI website
2. Get API keys
  • Click the Personal menu in the top-right ribbon
  • Click ‘View API keys’
  • Click the ‘Create new secret key’ button
  • It will generate a new API key in a format like sk-<random text>. Copy this key and keep it in a safe place; you will need it to call the OpenAI APIs
  • Important note: as of writing this blog, OpenAI does not provide an official API for ChatGPT. You can use an existing model, Davinci, to get answers that are somewhat similar to ChatGPT’s capabilities
3. Call the OpenAI API with Postman
  • Open the Postman application
  • Enter the URL https://api.openai.com/v1/completions
  • Select the POST HTTP method
  • In the Authorization tab, select the type ‘Bearer Token’
  • Paste the API key (from Step 2) in the token field
  • Now we need to send the query/prompt in the API body. Select the Body tab, choose the raw format as JSON, and paste your JSON (an equivalent curl call is sketched after this list)
  • In this example, we use the text-davinci-003 model and ask it to correct the English grammar in a given sentence
  • Once done, click the Send button
  • The JSON response returns the sentence with corrected English grammar.
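The same request can be made from a terminal with curl, mirroring the Postman setup above. The prompt sentence, max_tokens, and temperature values are illustrative, and OPENAI_API_KEY is assumed to hold the secret key from Step 2.

    curl https://api.openai.com/v1/completions \
      -H "Content-Type: application/json" \
      -H "Authorization: Bearer $OPENAI_API_KEY" \
      -d '{
            "model": "text-davinci-003",
            "prompt": "Correct this to standard English: She no went to the market.",
            "max_tokens": 60,
            "temperature": 0
          }'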
 
Congratulations!! With this, you have learned how to consume the OpenAI API using the Postman client. Similarly, you can explore many more features on the examples page of the OpenAI website, including classification, translation, code, and more.