AutoGuided Onboarding WebApp for Carbon Footprint Reduction

Challenge Statement: Individuals and businesses are increasingly seeking to understand and reduce their carbon footprint. However, the journey to carbon reduction can be complex and overwhelming.

The AGO (AutoGuided Onboarding) challenge is to develop a web application that provides a seamless and guided onboarding experience for users to understand their carbon footprint, leveraging satellite raster data and intelligent carbon modeling.

Core Components & Development Tasks

1. Raster Data Foundation for AutoGuided Insights

Goal: Provide the data foundation for the AutoGuided Onboarding experience by acquiring and processing satellite raster data that will inform carbon footprint insights.
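To make this concrete, the sketch below derives a vegetation index (NDVI) from red and near-infrared raster band values, a common first step when turning satellite imagery into carbon-relevant insights. The band values are toy numbers and the function names are our own; a real pipeline would read bands from imagery with a library such as rasterio.

```python
# Minimal sketch: computing NDVI (Normalized Difference Vegetation Index)
# from red and near-infrared raster bands. Band values here are toy data.

def ndvi(nir, red):
    """NDVI = (NIR - Red) / (NIR + Red), in the range [-1, 1]."""
    if nir + red == 0:
        return 0.0  # avoid division by zero for empty pixels
    return (nir - red) / (nir + red)

def ndvi_grid(nir_band, red_band):
    """Apply NDVI pixel-by-pixel over two equally sized 2-D bands."""
    return [
        [ndvi(n, r) for n, r in zip(nir_row, red_row)]
        for nir_row, red_row in zip(nir_band, red_band)
    ]

if __name__ == "__main__":
    nir = [[0.8, 0.6], [0.7, 0.5]]
    red = [[0.1, 0.2], [0.3, 0.4]]
    print(ndvi_grid(nir, red))
```

Higher NDVI generally indicates denser vegetation, which the carbon model below can use as one input signal.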

2. Intelligent Carbon Modeling for AutoGuided Understanding

Goal: Develop a simplified yet informative carbon model that translates processed raster data into user-understandable carbon footprint metrics.
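A minimal version of such a model is an emission-factor lookup: multiply user activity amounts by per-unit factors to get kg CO2e. The factor values below are illustrative placeholders, not authoritative coefficients (those are derived in the coefficients.ipynb notebook described later).

```python
# Simplified carbon model sketch: activity amounts x emission factors
# -> kg CO2e. Factor values are illustrative placeholders only.

EMISSION_FACTORS = {          # kg CO2e per unit (illustrative)
    "electricity_kwh": 0.4,
    "car_km": 0.2,
    "flight_km": 0.25,
}

def carbon_footprint(activities):
    """Sum kg CO2e over a dict of {activity: amount}; unknown keys are skipped."""
    return sum(
        amount * EMISSION_FACTORS[name]
        for name, amount in activities.items()
        if name in EMISSION_FACTORS
    )

footprint = carbon_footprint({"electricity_kwh": 300, "car_km": 100})
print(f"{footprint:.1f} kg CO2e")
```

Keeping the model a pure function of activity data makes it easy to swap in notebook-derived coefficients later.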

3. Chatbot for AutoGuided Onboarding & Personalized Guidance

Goal: Create a chatbot interface that guides users through carbon insights and recommendations in an intuitive manner.
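One lightweight way to start is a rule-based intent matcher: map keywords in the user's message to canned onboarding guidance. The intent names and replies below are illustrative; a production chatbot would likely use an NLU service or LLM instead.

```python
# Rule-based chatbot sketch for the onboarding flow: match user input
# against keyword intents and return canned guidance. All intent names
# and reply texts are illustrative placeholders.

INTENTS = {
    "footprint": ("footprint", "carbon", "emission"),
    "reduce": ("reduce", "lower", "cut"),
    "greeting": ("hello", "hi", "hey"),
}

REPLIES = {
    "footprint": "Your estimated footprint is shown on the dashboard.",
    "reduce": "Try lowering electricity use and choosing low-carbon transport.",
    "greeting": "Hi! I can help you understand and reduce your carbon footprint.",
}

def respond(message):
    """Return the reply for the first intent whose keywords match the message."""
    words = [w.strip("?!.,") for w in message.lower().split()]
    for intent, keywords in INTENTS.items():
        if any(word in keywords for word in words):
            return REPLIES[intent]
    return "Sorry, I didn't catch that. Ask me about your carbon footprint."

print(respond("How can I reduce my emissions?"))
```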

Technical Setup & AI DevOps Integration

Setting up the Notebook Manually

To set up the notebook manually, follow these steps:

  1. Ensure you have Python 3.9+ installed. You can check your Python version by running python --version in your terminal. If you don't have Python 3.9 or higher, please install it before proceeding.
  2. Install Jupyter (Lab or Notebook):
    pip install jupyterlab
    This command installs JupyterLab, which provides a more feature-rich environment. If you prefer the classic Jupyter Notebook interface, you can install it using pip install notebook.
  3. Clone the repository:
    git clone https://github.com/aimtyaem/CFP-csv-datasets-.git
    This command clones the repository containing the notebook and associated files to your local machine.
  4. Navigate to the cloned directory:
    cd CFP-csv-datasets-
    This command changes your current directory in the terminal to the newly cloned repository folder.
  5. Open the Jupyter notebook:
    jupyter lab
    or
    jupyter notebook
    This command launches JupyterLab (or Jupyter Notebook) in your web browser.
  6. In JupyterLab (or Notebook), open coefficients.ipynb from the file browser and run all cells (typically via the "Run" menu → "Run All Cells") to execute the code and generate the results.

AI DevOps Integration

This section provides examples of integrating various AWS services in an AI DevOps context:

Example: AWS Ground Station Integration

import boto3

# Create a Ground Station client in the region where your contacts run.
client = boto3.client('groundstation', region_name='us-west-2')

# List the dataflow endpoint groups configured for this account and region.
response = client.list_dataflow_endpoint_groups()

for endpoint_group in response['dataflowEndpointGroupList']:
    print(endpoint_group['dataflowEndpointGroupId'])

Example: Carbon Model Deployment with AWS SageMaker

# Note: create-model also requires an execution role that SageMaker assumes;
# the role ARN below is a placeholder.
aws sagemaker create-model \
    --model-name my-model \
    --primary-container Image=123.dkr.ecr.us-west-2.amazonaws.com/my-container-image \
    --execution-role-arn arn:aws:iam::123456789012:role/SageMakerExecutionRole
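The same registration can be expressed as keyword arguments for boto3's SageMaker create_model call. The image URI and role ARN below are placeholders; with credentials configured, you would pass these as boto3.client("sagemaker").create_model(**params). The actual call is omitted here so the sketch runs without AWS access.

```python
# Request parameters for SageMaker CreateModel, mirroring the CLI call above.
# All ARNs and the image URI are placeholders.

params = {
    "ModelName": "my-model",
    "PrimaryContainer": {
        "Image": "123456789012.dkr.ecr.us-west-2.amazonaws.com/my-container-image",
    },
    # CreateModel also requires an execution role that SageMaker assumes:
    "ExecutionRoleArn": "arn:aws:iam::123456789012:role/SageMakerExecutionRole",
}
# boto3.client("sagemaker").create_model(**params)  # requires AWS credentials
print(sorted(params))
```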

Example: Onboarding Dashboard Setup with AWS QuickSight

{
  "Name": "MyDashboard",
  "SourceEntity": {
    "SourceTemplate": {
      "DataSetReferences": [
        {
          "DataSetPlaceholder": {
            "DataSetId": "my-dataset-id"
          },
          "DataSetArn": "arn:aws:quicksight:us-west-2:123456789012:dataset/my-dataset-id"
        }
      ],
      "Arn": "arn:aws:quicksight:us-west-2:123456789012:template/my-template-id"
    }
  },
  "Permissions": [
    {
      "Principal": "arn:aws:iam::123456789012:root",
      "Actions": [
        "quicksight:DescribeDashboard",
        "quicksight:ListDashboardVersions",
        "quicksight:UpdateDashboardPermissions",
        "quicksight:QueryDashboard",
        "quicksight:UpdateDashboard",
        "quicksight:DeleteDashboard",
        "quicksight:UpdateDashboardPublishedVersion"
      ]
    }
  ]
}
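The JSON above is the dashboard definition body; QuickSight's CreateDashboard API additionally requires AwsAccountId and DashboardId. A minimal Python sketch (the IDs are placeholders) that merges them in before a boto3 quicksight.create_dashboard(**params) call:

```python
# Combine a dashboard definition body with the extra identifiers that
# QuickSight's CreateDashboard call requires. IDs are placeholders.

import json

def build_create_dashboard_params(body_json, account_id, dashboard_id):
    """Parse the definition JSON and add the required top-level identifiers."""
    params = json.loads(body_json)
    params["AwsAccountId"] = account_id
    params["DashboardId"] = dashboard_id
    return params

# Abbreviated definition body; the full version is shown above.
body = '{"Name": "MyDashboard", "SourceEntity": {}, "Permissions": []}'
params = build_create_dashboard_params(body, "123456789012", "my-dashboard")
# boto3.client("quicksight").create_dashboard(**params)  # requires credentials
print(sorted(params))
```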

Example: Image Processing Pipeline with AWS Step Functions

{
  "Comment": "Image Processing Workflow",
  "StartAt": "ProcessImage",
  "States": {
    "ProcessImage": {
      "Type": "Task",
      "Resource": "arn:aws:lambda:us-west-2:123456789012:function:processImageFunction",
      "End": true
    }
  }
}
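The ProcessImage state above invokes a Lambda function. A minimal handler sketch is shown below; the event shape (an S3 bucket/key pair) is an assumption, and a real function would fetch and process the image rather than echoing metadata.

```python
# Sketch of the Lambda handler behind the ProcessImage task. The event
# shape is assumed; real processing (band extraction, tiling, etc.)
# would replace the placeholder comment.

def lambda_handler(event, context):
    bucket = event.get("bucket", "unknown")
    key = event.get("key", "unknown")
    # Image processing would happen here.
    return {"status": "processed", "source": f"s3://{bucket}/{key}"}

print(lambda_handler({"bucket": "raster-data", "key": "scene.tif"}, None))
```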

Example: Real-time Dashboard Monitoring with AWS CloudWatch

# get-metric-data also requires a time window; the timestamps below are placeholders.
aws cloudwatch get-metric-data \
    --metric-data-queries file://metric_queries.json \
    --start-time 2024-01-01T00:00:00Z \
    --end-time 2024-01-01T01:00:00Z
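The referenced file must contain the MetricDataQueries array. A minimal example is shown below; the namespace, metric name, and instance ID are illustrative.

```json
[
  {
    "Id": "cpu",
    "MetricStat": {
      "Metric": {
        "Namespace": "AWS/EC2",
        "MetricName": "CPUUtilization",
        "Dimensions": [
          { "Name": "InstanceId", "Value": "i-1234567890abcdef0" }
        ]
      },
      "Period": 300,
      "Stat": "Average"
    }
  }
]
```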


4. Networking Considerations for Carbon Efficiency

What is the carbon cost of networking? While it is difficult to measure precisely, given competing accounting models, two key factors emerge:

Implementation Strategy: