API Gateway - Configuration

1. Create REST API
   - Go to the API Gateway console
   - Create a new REST API
   - Create a new resource and method:
     - Add resource: e.g., “/user-list”
     - Add a GET method
     - Integration type: Lambda Function
     - Select your Lambda function
   - Enable CORS if needed: Actions → Enable CORS, accept the default settings for testing

2. Update “Method request”

3. Update “Integration request”

```json
{
  "limit": "$input.params('limit')"
}
```

4. Deploy and Test
   - Deploy the API
   - Note the API endpoint URL

...
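With the mapping template shown in step 3, API Gateway forwards the `limit` query-string parameter into the Lambda event as a plain string. A minimal sketch of reading it on the handler side (`read_limit` is a hypothetical helper; the default of 10 mirrors the Lambda example later in these notes):

```python
def read_limit(event, default=10):
    """Return the 'limit' query parameter as an int, falling back to a default
    when the key is absent or empty."""
    raw = event.get('limit')
    return int(raw) if raw else default

print(read_limit({'limit': '25'}))  # 25
print(read_limit({}))               # 10
```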

June 15, 2025

API Gateway - API Key

In API Gateway:
- Click “API Keys”
- Generate an API key for each team member

Under the API’s Usage Plans:
- Create a new usage plan
- Add the API stage to the plan
- Associate the API keys with the plan

See also: AWS Credentials for CLI · AWS STS - Temporary Access Tokens · Amazon DynamoDB - Create a Table · Amazon DynamoDB - Import CSV Data · AWS Lambda - Create a Function · AWS Lambda - Grant Access · API Gateway - Usage Plan

...
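The same console steps map onto three API Gateway calls. A minimal boto3 sketch under stated assumptions: the plan name `team-plan`, the throttle values, and the helper itself are illustrative, and `client` would be `boto3.client('apigateway')` in real use:

```python
def provision_api_key(client, api_id, stage, member_name):
    """Console steps as API calls: create a key, create a usage plan bound
    to the API stage, then associate the key with the plan."""
    key = client.create_api_key(name=f"{member_name}-key", enabled=True)
    plan = client.create_usage_plan(
        name="team-plan",                               # assumed plan name
        apiStages=[{"apiId": api_id, "stage": stage}],  # add API stage to plan
        throttle={"rateLimit": 10.0, "burstLimit": 20},
    )
    client.create_usage_plan_key(
        usagePlanId=plan["id"], keyId=key["id"], keyType="API_KEY"
    )
    return key["id"], plan["id"]
```

Each team member gets their own call to `provision_api_key`, matching "generate API key for each team member" above.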

June 15, 2025

API Gateway - Usage Plan

1. Create new usage plan

Rate and Burst
- Rate: set to 10-20 requests per second for development/testing
  - Recommended: start with 10 req/sec for controlled testing
- Burst: set to 2x your rate (20-40)
  - Recommended: start with 20 to handle short traffic spikes

Quota Settings
- Quota period: MONTH (most common); alternative periods: WEEK, DAY
- Requests per quota period: start with 50,000/month
  - This allows approximately 1,600 requests per day
  - Can be adjusted based on actual usage patterns

Recommended Initial Configuration: ...
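The rules of thumb above (burst = 2x rate, monthly quota spread over the month) are easy to check with a few lines. A small sketch; `suggest_plan` is a hypothetical helper and a 31-day month is assumed:

```python
def suggest_plan(rate_per_sec=10, monthly_quota=50_000, days_in_month=31):
    """Derive the burst and per-day budget from the rate/quota guidance above."""
    return {
        "rate": rate_per_sec,
        "burst": 2 * rate_per_sec,                      # burst = 2x rate
        "daily_budget": monthly_quota // days_in_month, # ~1,600/day for 50k/month
    }

print(suggest_plan())  # {'rate': 10, 'burst': 20, 'daily_budget': 1612}
```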

June 15, 2025

Amazon DynamoDB - Create a Table

Sign in to the AWS Console and navigate to DynamoDB:
- Click “Create table”
- Table name: e.g., “user_list”
- Partition key: “user_id” (String)
- Sort key (optional): “first_name” (String)

See also: AWS Credentials for CLI · AWS STS - Temporary Access Tokens · Amazon DynamoDB - Create a Table · Amazon DynamoDB - Import CSV Data · AWS Lambda - Create a Function · AWS Lambda - Grant Access · API Gateway - Usage Plan · API Gateway - API Key

...
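The same table can be defined from code. A sketch of the arguments boto3's `create_table` would take for the keys chosen above; the helper name and on-demand billing mode are assumptions, not part of the console walkthrough:

```python
def user_list_table_spec(table_name="user_list"):
    """Arguments for boto3 create_table matching the console choices above."""
    return {
        "TableName": table_name,
        "KeySchema": [
            {"AttributeName": "user_id", "KeyType": "HASH"},     # partition key
            {"AttributeName": "first_name", "KeyType": "RANGE"}, # sort key
        ],
        "AttributeDefinitions": [
            {"AttributeName": "user_id", "AttributeType": "S"},
            {"AttributeName": "first_name", "AttributeType": "S"},
        ],
        "BillingMode": "PAY_PER_REQUEST",  # assumption; adjust to taste
    }

# Usage (needs credentials): boto3.resource('dynamodb').create_table(**user_list_table_spec())
```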

June 15, 2025

AWS STS - Temporary Access Tokens

1. Generate Temporary Credentials

First, use AWS STS (Security Token Service) to generate temporary credentials:

```shell
# 3600 x 5 = 18000 (5 hours)
aws sts get-session-token --duration-seconds 18000
```

This will return something like:

```json
{
  "Credentials": {
    "AccessKeyId": "ASIA...",
    "SecretAccessKey": "...",
    "SessionToken": "...",
    "Expiration": "2025-06-13T..."
  }
}
```

2. Set Environment Variables

Then set these environment variables:

```shell
# Replace the values with your actual credentials from the previous step.
export AWS_ACCESS_KEY_ID="your_access_key"
export AWS_SECRET_ACCESS_KEY="your_secret_key"
export AWS_SESSION_TOKEN="your_session_token"
export AWS_DEFAULT_REGION="ap-southeast-2"  # Sydney region
```

3. Verify the environment variables

```shell
env | grep AWS
```

After setting these variables, try running your Python script again. The credentials will be picked up automatically by the AWS SDK.

...
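Steps 1 and 2 can also be done from Python. A minimal sketch assuming boto3, where `os.environ` plays the role of the `export` lines (`assume_session` is a hypothetical helper; pass `boto3.client('sts')` as the client):

```python
import os

def assume_session(sts_client, hours=5):
    """Fetch temporary STS credentials and export them as environment
    variables so the AWS SDK picks them up, mirroring steps 1-2 above."""
    resp = sts_client.get_session_token(DurationSeconds=hours * 3600)
    creds = resp["Credentials"]
    os.environ["AWS_ACCESS_KEY_ID"] = creds["AccessKeyId"]
    os.environ["AWS_SECRET_ACCESS_KEY"] = creds["SecretAccessKey"]
    os.environ["AWS_SESSION_TOKEN"] = creds["SessionToken"]
    return creds["Expiration"]
```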

June 15, 2025

AWS Lambda - Create a Function

Navigate to Lambda in the AWS Console:
- Click “Create function”
- Choose “Author from scratch”
- Runtime: Python 3.x
- Name: e.g., “get-user-list”
- Paste the Python code into the “Code” page and click the “Deploy” button

```python
import boto3
from datetime import datetime
from boto3.dynamodb.conditions import Key

dynamodb = boto3.resource('dynamodb')
table = dynamodb.Table('user_list')

def create_nested_structure(data, current_level, max_level):
    if current_level >= max_level:
        return data
    return {
        f"level_{current_level}": {
            "data": data,
            "nested": create_nested_structure(data, current_level + 1, max_level),
            "metadata": {
                "level_info": f"This is level {current_level}",
                "timestamp": datetime.now().isoformat(),
                "metrics": {
                    "depth": current_level,
                    "remaining_levels": max_level - current_level,
                    "complexity_score": max_level * current_level
                }
            }
        }
    }

def create_complex_response(user_data, nested_level):
    base_data = {
        "id": f"user_{user_data['user_id']}",
        "timestamp": datetime.now().isoformat(),
        "category": "Personnel",
        "details": {
            "name": {
                "first": user_data['first_name'],
                "last": user_data['last_name']
            },
            "company": {
                "name": user_data['company_name'],
                "web": user_data['web']
            },
            "contact_info": {
                "address": {
                    "street": user_data['address'],
                    "city": user_data['city'],
                    "state": user_data['state'],
                    "postcode": user_data['post']
                },
                "communication": {
                    "phones": [
                        {"type": "primary", "number": user_data['phone1']},
                        {"type": "secondary", "number": user_data['phone2']}
                    ],
                    "email": user_data['email']
                }
            }
        }
    }
    return create_nested_structure(base_data, 1, nested_level)

def lambda_handler(event, context):
    try:
        # Get parameters from the event body
        limit = int(event.get('limit', 10) if event.get('limit') else 10)
        nested_level = int(event.get('nested_level', 1) if event.get('nested_level') else 1)

        # Validate nested_level
        if nested_level < 1:
            nested_level = 1
        elif nested_level > 30:  # Set a reasonable maximum
            nested_level = 30    # 29 nested is the limit on Blue Prism

        # Scan DynamoDB table with limit
        response = table.scan(Limit=limit)
        items = response.get('Items', [])

        # Transform items into a complex nested structure
        transformed_data = [create_complex_response(item, nested_level) for item in items]

        # Create the final response
        return {
            "statusCode": 200,
            "headers": {
                "Content-Type": "application/json",
                "Access-Control-Allow-Origin": "*"
            },
            "success": True,
            "timestamp": datetime.now().isoformat(),
            "total_records": len(transformed_data),
            "limit_applied": limit,
            "nesting_level": nested_level,
            "data": transformed_data,
            "metadata": {
                "api_version": "1.0",
                "service": "user-data-api",
                "complexity_info": {
                    "max_depth": nested_level,
                    "structure_type": "recursive",
                    "total_nodes": len(transformed_data) * nested_level
                }
            }
        }
    except Exception as e:
        return {
            "statusCode": 500,
            "success": False,
            "message": "Error processing request",
            "error": str(e)
        }
```

...
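A quick way to sanity-check the recursion without deploying: at the maximum depth the data comes back untouched, and every level above it wraps the payload one dict deeper. A standalone sketch of the same wrapping logic, simplified by omitting the metadata block:

```python
def wrap(data, level, max_level):
    """Same shape as create_nested_structure, minus the metadata block."""
    if level >= max_level:
        return data
    return {f"level_{level}": {"data": data, "nested": wrap(data, level + 1, max_level)}}

out = wrap({"id": 1}, 1, 3)
print(list(out))                                      # ['level_1']
print(out["level_1"]["nested"]["level_2"]["nested"])  # {'id': 1}
```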

June 15, 2025

AWS Credentials for CLI

1. Using AWS CLI Configuration

```shell
aws configure
```

This will prompt you to enter:
- AWS Access Key ID
- AWS Secret Access Key
- Default region name
- Default output format

2. Environment Variables

```shell
export AWS_ACCESS_KEY_ID="your_access_key"
export AWS_SECRET_ACCESS_KEY="your_secret_key"
export AWS_DEFAULT_REGION="your_region"
```

3. Credentials File

Create or edit ~/.aws/credentials:

```ini
[default]
aws_access_key_id = your_access_key
aws_secret_access_key = your_secret_key
```

4. Clear AWS CLI Configuration (OPTIONAL)

To clear your AWS CLI credentials, you have several options:
- Delete the credentials file: `rm ~/.aws/credentials`
- Delete the config file: `rm ~/.aws/config`
- Clear a specific profile: run `aws configure --profile your_profile_name` and press Enter without entering values

```shell
# Remove both credentials and config files
rm ~/.aws/credentials ~/.aws/config
```

After clearing the credentials, you can reconfigure them using any of the methods described above.

...
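The `~/.aws/credentials` file from option 3 is plain INI, so it can be inspected with the standard library. A small sketch reusing the placeholder values from the file above (`read_profile` is a hypothetical helper):

```python
import configparser

# Same placeholder values as the credentials file shown above
SAMPLE = """\
[default]
aws_access_key_id = your_access_key
aws_secret_access_key = your_secret_key
"""

def read_profile(text, profile="default"):
    """Parse an INI-style credentials file and return one profile's keys."""
    cfg = configparser.ConfigParser()
    cfg.read_string(text)
    return dict(cfg[profile])

print(read_profile(SAMPLE)["aws_access_key_id"])  # your_access_key
```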

June 15, 2025

AWS Lambda - Grant Access

Go to the AWS IAM Console:
- Find your Lambda’s role
- Click on the role name
- Click “Add permissions” → “Create inline policy”
- In the JSON editor, paste this policy:

```json
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Effect": "Allow",
      "Action": [
        "dynamodb:Scan",
        "dynamodb:GetItem",
        "dynamodb:Query"
      ],
      "Resource": "arn:aws:dynamodb:ap-southeast-2:6850********:table/user_list"
    }
  ]
}
```

- Click “Review policy”
- Name it something like “DynamoDBScanPolicy”
- Click “Create policy”

After adding this policy, wait a few seconds and try your Lambda function again. The error should be resolved.

...
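The console clicks correspond to a single IAM `put_role_policy` call. A minimal boto3 sketch; the `ACCOUNT_ID` placeholder stands in for the masked account number above, and the helper itself is illustrative (pass `boto3.client('iam')` as the client):

```python
import json

POLICY = {
    "Version": "2012-10-17",
    "Statement": [{
        "Effect": "Allow",
        "Action": ["dynamodb:Scan", "dynamodb:GetItem", "dynamodb:Query"],
        # ACCOUNT_ID is a placeholder for the masked account number above
        "Resource": "arn:aws:dynamodb:ap-southeast-2:ACCOUNT_ID:table/user_list",
    }],
}

def attach_inline_policy(iam_client, role_name):
    """Equivalent of the console's 'Create inline policy' step."""
    iam_client.put_role_policy(
        RoleName=role_name,
        PolicyName="DynamoDBScanPolicy",
        PolicyDocument=json.dumps(POLICY),
    )
```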

June 15, 2025

Amazon DynamoDB - Import CSV Data

1. Save the CSV file in the same location as the Python code

```
user_id,first_name,last_name,company_name,address,city,state,post,phone1,phone2,email,web
U001,Rebbecca,Didio,"Brandt, Jonathan F Esq",171 E 24th St,Leith,TAS,7315,03-8174-9123,0458-665-290,rebbecca.didio@didio.com.au,http://www.brandtjonathanfesq.com.au
U002,Stevie,Hallo,Landrum Temporary Services,22222 Acoma St,Proston,QLD,4613,07-9997-3366,0497-622-620,stevie.hallo@hotmail.com,http://www.landrumtemporaryservices.com.au
U003,Mariko,Stayer,"Inabinet, Macre Esq",534 Schoenborn St #51,Hamel,WA,6215,08-5558-9019,0427-885-282,mariko_stayer@hotmail.com,http://www.inabinetmacreesq.com.au
U004,Gerardo,Woodka,Morris Downing & Sherred,69206 Jackson Ave,Talmalmo,NSW,2640,02-6044-4682,0443-795-912,gerardo_woodka@hotmail.com,http://www.morrisdowningsherred.com.au
U005,Mayra,Bena,"Buelt, David L Esq",808 Glen Cove Ave,Lane Cove,NSW,1595,02-1455-6085,0453-666-885,mayra.bena@gmail.com,http://www.bueltdavidlesq.com.au
U006,Idella,Scotland,Artesian Ice & Cold Storage Co,373 Lafayette St,Cartmeticup,WA,6316,08-7868-1355,0451-966-921,idella@hotmail.com,http://www.artesianicecoldstorageco.com.au
U007,Sherill,Klar,Midway Hotel,87 Sylvan Ave,Nyamup,WA,6258,08-6522-8931,0427-991-688,sklar@hotmail.com,http://www.midwayhotel.com.au
U008,Ena,Desjardiws,"Selsor, Robert J Esq",60562 Ky Rt 321,Bendick Murrell,NSW,2803,02-5226-9402,0415-961-606,ena_desjardiws@desjardiws.com.au,http://www.selsorrobertjesq.com.au
U009,Vince,Siena,Vincent J Petti & Co,70 S 18th Pl,Purrawunda,QLD,4356,07-3184-9989,0411-732-965,vince_siena@yahoo.com,http://www.vincentjpettico.com.au
U010,Theron,Jarding,"Prentiss, Paul F Esq",8839 Ventura Blvd,Blanchetown,SA,5357,08-6890-4661,0461-862-457,tjarding@hotmail.com,http://www.prentisspaulfesq.com.au
```

2. Set a temporary token for VS Code

Reference: AWS STS - Temporary Access Tokens

...
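The CSV above includes quoted company names containing commas, which Python's `csv` module handles correctly. A minimal parsing sketch, reusing the header and first data row from the file above; each resulting dict is shaped like a DynamoDB item (all string attributes) and could be written with `table.put_item(Item=...)`:

```python
import csv
import io

# Header plus the first data row from the CSV file above
SAMPLE = """\
user_id,first_name,last_name,company_name,address,city,state,post,phone1,phone2,email,web
U001,Rebbecca,Didio,"Brandt, Jonathan F Esq",171 E 24th St,Leith,TAS,7315,03-8174-9123,0458-665-290,rebbecca.didio@didio.com.au,http://www.brandtjonathanfesq.com.au
"""

def load_items(text):
    """Parse the CSV into dicts keyed by the header row."""
    return list(csv.DictReader(io.StringIO(text)))

items = load_items(SAMPLE)
print(items[0]["user_id"], "|", items[0]["company_name"])
# U001 | Brandt, Jonathan F Esq
```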

June 15, 2025

Create a MS SQL Server Container

This is the current folder structure:

```shell
sh-5.2$ tree .
.
├── Dockerfile
├── backups
│   ├── APP-6.3.2-lab_Stage_2.bak
│   ├── APP-6.3.2-lab_Stage_3.bak
│   ├── APP-6.3.2-lab_Stage_4.bak
│   ├── v9.1.23_APP_632_lab_Stage_3.bak
│   └── v9.1.23_APP_632_lab_Stage_4.bak
├── certs
│   ├── server-bundle.crt
│   └── server.key
├── containers
│   └── sql1
│       ├── data [error opening dir]
│       ├── log [error opening dir]
│       └── secrets [error opening dir]
└── mssql.conf
```

Create the Dockerfile:

```dockerfile
FROM mcr.microsoft.com/mssql/server:2022-latest

USER root

# Install required dependencies
RUN apt-get update && \
    apt-get install -y curl apt-transport-https gnupg2 && \
    mkdir -p /etc/apt/keyrings && \
    curl -sSL https://packages.microsoft.com/keys/microsoft.asc | gpg --dearmor > /etc/apt/keyrings/microsoft.gpg && \
    chmod 644 /etc/apt/keyrings/microsoft.gpg && \
    echo "deb [signed-by=/etc/apt/keyrings/microsoft.gpg] https://packages.microsoft.com/ubuntu/22.04/prod jammy main" > /etc/apt/sources.list.d/mssql-release.list && \
    apt-get update && \
    ACCEPT_EULA=Y apt-get install -y mssql-tools unixodbc-dev && \
    ln -s /opt/mssql-tools/bin/sqlcmd /usr/bin/sqlcmd && \
    ln -s /opt/mssql-tools/bin/bcp /usr/bin/bcp && \
    apt-get clean && \
    rm -rf /var/lib/apt/lists/*

# Switch back to the default user
USER mssql
```

Create the mssql.conf file:

```ini
[network]
tlscert = /var/opt/mssql/secrets/server-bundle.crt
tlskey = /var/opt/mssql/secrets/server.key
tlsprotocols = 1.2
forceencryption = 1
```

Build an image:

```shell
# Build the new image
sudo docker build -t mssql-with-tools .
```

Test locally:

```shell
# Run a new container
sudo docker run -e 'ACCEPT_EULA=Y' -e 'MSSQL_SA_PASSWORD=Password123' \
  -p 1433:1433 \
  -v /data/containers/sql1/data:/var/opt/mssql/data \
  -v /data/containers/sql1/log:/var/opt/mssql/log \
  -v sql-certs:/var/opt/mssql/secrets:ro \
  -v /data/mssql.conf:/var/opt/mssql/mssql.conf:ro \
  -v /data/backups:/var/opt/mssql/backups \
  --restart always \
  --name sql1 \
  -d mssql-with-tools
```

Build the custom container image and push it into ECR in AWS.

```shell
# The container URI is below
ACCOUNTID.dkr.ecr.ap-southeast-2.amazonaws.com/gcs-sql-server:latest
```

Then run the script to deploy a MS SQL container:

```shell
#=============================================================================
# The following approach successfully copies "server.key"
#=============================================================================

# Create a Docker volume for the certificates
sudo docker volume create sql-certs

# Copy the necessary certificate files into the volume
sudo cp /data/certs/server-bundle.crt /var/lib/docker/volumes/sql-certs/_data/
sudo cp /data/certs/server.key /var/lib/docker/volumes/sql-certs/_data/

# Change the ownership
sudo chown -R 10001:0 /var/lib/docker/volumes/sql-certs/_data/
sudo chmod -R 600 /var/lib/docker/volumes/sql-certs/_data/

# Retrieve an authentication token and authenticate your Docker client to
# your registry, using the AWS CLI:
aws ecr get-login-password --region ap-southeast-2 | sudo docker login --username AWS --password-stdin ACCOUNTID.dkr.ecr.ap-southeast-2.amazonaws.com

# Deploy the MS SQL Server container
sudo docker run -e 'ACCEPT_EULA=Y' -e 'MSSQL_SA_PASSWORD=Password123' \
  -p 1433:1433 \
  -v /data/containers/sql1/data:/var/opt/mssql/data \
  -v /data/containers/sql1/log:/var/opt/mssql/log \
  -v sql-certs:/var/opt/mssql/secrets:ro \
  -v /data/mssql.conf:/var/opt/mssql/mssql.conf:ro \
  -v /data/backups:/var/opt/mssql/backups \
  --restart always \
  --name sql1 \
  -d ACCOUNTID.dkr.ecr.ap-southeast-2.amazonaws.com/gcs-sql-server:latest
```

After the deployment, check the status of the container:

```shell
# Check the login
sudo docker exec -it sql1 /opt/mssql-tools/bin/sqlcmd -S localhost -U sa -P 'Password123'

# Check the files
sudo docker exec -it sql1 ls -l /var/opt/mssql/backups
```
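The long `docker run` command can equivalently be captured in a Compose file so the flag soup doesn't have to be retyped. A sketch of that config fragment, assuming Docker Compose v2; the service, image, volume names, and paths are carried over from the commands above:

```yaml
services:
  sql1:
    image: mssql-with-tools
    restart: always
    environment:
      ACCEPT_EULA: "Y"
      MSSQL_SA_PASSWORD: "Password123"
    ports:
      - "1433:1433"
    volumes:
      - /data/containers/sql1/data:/var/opt/mssql/data
      - /data/containers/sql1/log:/var/opt/mssql/log
      - sql-certs:/var/opt/mssql/secrets:ro
      - /data/mssql.conf:/var/opt/mssql/mssql.conf:ro
      - /data/backups:/var/opt/mssql/backups

volumes:
  sql-certs:
    external: true   # created beforehand with: docker volume create sql-certs
```

With this in place, `sudo docker compose up -d` replaces the manual `docker run` invocation.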

April 4, 2025