Binary file added docs/services/img/level2_miais.png
242 changes: 137 additions & 105 deletions docs/services/waste_collection.md
</figure>

## Introduction
This guide shows how to deploy an AI-based service that optimizes city waste collection using context information and the [Openroute](https://openrouteservice.org/) optimization service. As a limited example and a possible starting point for building your own, it illustrates a **Minimal Interoperable AI Service (MIAIS)** that follows the [MIMs embraced by the CitCom.ai project](https://oasc.gitbook.io/mims-2024).

!!! abstract "Scenario"
Different sensors are deployed throughout the city to monitor the fill levels of waste containers. These sensors periodically collect fill-level data and send it to the TEF site data platform. The goal is to use this context information to create optimal truck waste collection routes. The solution only considers the current waste container fill levels, the container locations, the available trucks, and the start and end locations.

The **MIAIS** is a conceptual and technical demonstration of how AI can be deployed across heterogeneous smart city platforms. Its purpose is to validate AI integration under [different levels of interoperability maturity](./../getting_started/interoperability.md/#interoperability-levels).
Below, we outline how the MIAIS, a waste collection route optimization service, can be deployed in environments adhering to the **Level 1** and **Level 2** interoperability models.

The AI service uses the gathered context information to offer an interactive service through a web dashboard: once the user provides the desired configuration, the service produces an optimal set of collection routes.

In Level 0 environments, the MIAIS cannot be deployed. However, as an example, the project ships a Docker-based demo with all the necessary components to easily simulate a Level 1 environment, including some dummy data.

At Level 1, interoperability is established by adopting **shared data standards and APIs**. The foundational components are **NGSI-LD (MIM1)** as the standard interface for context information exchange and **Smart Data Models (MIM2)** as a shared vocabulary for entities and attributes (e.g., [WasteContainer](https://github.com/smart-data-models/dataModel.WasteManagement/tree/master/WasteContainer), [Vehicle](https://github.com/smart-data-models/dataModel.Transportation/tree/master/Vehicle)); feel free to explore their corresponding [Smart Data Model](https://smartdatamodels.org) specifications.

Level 2 enhances the Level 1 foundation by introducing **secure, governed, and federated data exchange mechanisms through a [data space connector](./../documentation/data_space_connectors/index.md)**.

### Openroute API key
[Openroute](https://openrouteservice.org/) offers a free vehicle routing optimization service based on the [Vroom](https://github.com/VROOM-Project/vroom) project. The MIAIS uses this service to compute the optimal routes. To access the service you will need a valid API key, so head over to [openrouteservice.org](https://openrouteservice.org) and get one; you will need it later. If you want to learn more, the API and its parameters are documented [in the Vroom repository](https://github.com/VROOM-Project/vroom/blob/master/docs/API.md).
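
To give an idea of what the service sends to Openroute, here is a minimal sketch of an optimization request made directly with `requests`. The two jobs (waste containers), the single vehicle, the coordinates, and the `ORS_API_KEY` environment variable are illustrative assumptions, not values taken from the demo code.

```python
import os

import requests

# Illustrative payload: two waste containers (jobs) and one truck (vehicle).
# Coordinates are [longitude, latitude], as expected by the Vroom-based API.
payload = {
    "jobs": [
        {"id": 1, "location": [-0.3763, 39.4699]},
        {"id": 2, "location": [-0.3550, 39.4833]},
    ],
    "vehicles": [
        {
            "id": 1,
            "profile": "driving-car",
            "start": [-0.3900, 39.4600],
            "end": [-0.3900, 39.4600],
        },
    ],
}

response = requests.post(
    "https://api.openrouteservice.org/optimization",
    json=payload,
    headers={"Authorization": os.environ["ORS_API_KEY"]},  # your Openroute API key
    timeout=30,
)
response.raise_for_status()

# Each route lists the optimized visiting order for one vehicle.
for route in response.json()["routes"]:
    print([step["job"] for step in route["steps"] if step["type"] == "job"])
```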

## Running MIAIS locally
As mentioned, a demo example with some dummy data is provided so partners can quickly test the service regardless of the interoperability level of their platform. Below you will find step-by-step instructions for running the MIAIS locally:

1. Clone the repository and navigate to its root folder:
```bash
git clone https://github.com/CitComAI-Hub/waste-collection-demo.git && cd waste-collection-demo
```

2. Init the git submodules with the following command. This will clone and install a dead-simple [ngsi-ld client library](https://github.com/CitComAI-Hub/ngsild-client) in the `lib` folder. Please **note that the library is for testing purposes only and lacks most functionality**. However, it quickly allows you to implement your own methods to interact with the context broker.
```bash
git submodule init && git submodule update
```

```bash
flask --app server run
```

## Running MIAIS in Level 1
As described above, the MIAIS gets the necessary information from the TEF site data platform using NGSI-LD (MIM1) as the standard interface for context information exchange and Smart Data Models (MIM2) as a shared vocabulary for entities and attributes. An intermediary adapter may be required in cases where the city data platform does not comply with the latest NGSI-LD specification. If your current data platform uses the NGSIv2 specification, check the [documentation section](./../documentation/index.md) for more details about using [Lepus](../documentation/data_federation/ngsiv2_to_ld/lepus.md) or [connecting an NGSIv2 broker with an NGSI-LD broker through subscriptions](../documentation/data_federation/ngsiv2_to_ld/iot_agent.md).

<figure>
```mermaid
graph LR
A["**TEF Context Broker**"] ---|NGSI-LD| B["🤖 **AI Service**"];
```
<figcaption>Figure 1: MIAIS integration at Level 1</figcaption>
</figure>


If your TEF site meets all the Level 1 requirements, you can proceed with deploying the MIAIS in your city. Follow the [previous instructions](#running-miais-locally) to get it up and running (except for the commands aimed at creating the context broker and inserting dummy data). Just remember to update the variables in the `.env` file so they point to the TEF data platform. If everything goes well, the example should work. Of course, some `WasteContainer` and `Vehicle` **entities must be available in your context broker**; otherwise, use and adapt the `upsert_fake_data.py` script to your needs.
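
For reference, the sketch below shows what upserting a minimal `WasteContainer` entity could look like when talking to an NGSI-LD broker directly with `requests`. The broker URL, the entity `id`, and the attribute values are illustrative assumptions; adapt them to your platform and to the [WasteContainer](https://github.com/smart-data-models/dataModel.WasteManagement/tree/master/WasteContainer) Smart Data Model.

```python
import requests

BROKER_URL = "http://localhost:1026"  # illustrative NGSI-LD broker endpoint

# Minimal WasteContainer entity using the Smart Data Models vocabulary.
entity = {
    "id": "urn:ngsi-ld:WasteContainer:demo-001",
    "type": "WasteContainer",
    "fillingLevel": {"type": "Property", "value": 0.8},
    "location": {
        "type": "GeoProperty",
        "value": {"type": "Point", "coordinates": [-0.3763, 39.4699]},
    },
    "@context": "https://uri.etsi.org/ngsi-ld/v1/ngsi-ld-core-context.jsonld",
}

response = requests.post(
    f"{BROKER_URL}/ngsi-ld/v1/entityOperations/upsert",
    json=[entity],  # the batch upsert endpoint expects an array of entities
    headers={"Content-Type": "application/ld+json"},
    timeout=10,
)
response.raise_for_status()
print(response.status_code)  # 201 (created) or 204 (updated)
```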

### Authentication
When working with production brokers, authentication is often required. The [`ngsild-client`](https://github.com/CitComAI-Hub/ngsild-client) library included in the example does not come with authentication support. However, it is quite straightforward to extend it to meet authentication requirements.

As an example, check the following code from the [Valencia TEF site](./../tef/south_connect/valencia.md) implementation (also available on the [`valencia` branch](https://github.com/CitComAI-Hub/waste-collection-demo/tree/valencia) of the demo repository), which integrates [authentication for their NGSIv2 context broker](https://github.com/CitComAI-Hub/ngsild-client/blob/master/Authv2.py).

```python
import os

from lib.ngsildclient.Auth import Authv2
from lib.ngsildclient.Client import Client


# Define service & subservice
service = "tef_city"
subservice = "/containers"

# Authenticate
auth = Authv2()
token = auth.get_auth_token_subservice(service, subservice)

# NGSI-LD broker client
client = Client()

# Fetch WasteContainer entities
context = os.environ.get("WASTECONTAINERS_CONTEXT")
containers = client.get_all_entities_by_type("WasteContainer", context, 100, 0, service, subservice, token).json()
```

Environment variables in the `.env` file:

```bash
AUTH_PROTOCOL="https"
ENDPOINT_KEYSTONE="auth.tef.com:15000"
AUTH_USER="xxxxx"
AUTH_PASSWORD="xxxxx"
```

## Running MIAIS in Level 2
At Level 2, deploying the MIAIS requires integration with the **data space infrastructure**. The AI service no longer connects directly to raw NGSI-LD endpoints; instead, it accesses data via the standardized interface provided by the connector. This approach guarantees that all data exchanges are **traceable, governed, and policy-compliant**.

<figure markdown>
![Level 2 MIAIS](img/level2_miais.png){ loading=lazy }
<figcaption>Figure 2: MIAIS integration at Level 2</figcaption>
</figure>

To illustrate the case (see Figure 2), a simple data space structure is assumed, composed of a **trust anchor, a data space connector in the provider role, and another in the consumer role**. The data provider and the consumer are **registered in the trust anchor**, establishing a trust relationship. Moreover, the **consumer is registered in the provider's Trusted Issuer List**, which allows the consumer to issue credentials to third parties granting them permission to access the provider's data.

In addition, the AI service has a **wallet identity** consisting of a **Decentralized Identifier (DID)** and its associated private key, enabling the service to authenticate itself and sign verifiable presentations. Finally, the AI service must authenticate against the consumer's **Keycloak** identity server to get the corresponding access token.

This authentication process leverages the **OpenID for Verifiable Presentations (OID4VP)** protocol, allowing the AI service to obtain a verifiable credential from the consumer's identity provider and present it as cryptographic proof of authorization. As a wallet, the service generates a verifiable presentation signed with its private key and submits it to the identity server. After successful verification, the service receives an access token, enabling secure and trusted interaction with the provider. For more details about this process, refer to the [Data Space Connectors documentation](./../documentation/data_space_connectors/index.md).


Below, you will find step-by-step instructions for deploying the MIAIS at Level 2 on a Kubernetes cluster:

1. Clone the repository and navigate to its root folder:
```bash
git clone https://github.com/CitComAI-Hub/waste-collection-demo.git && cd waste-collection-demo
```

2. Switch to the `mvds` branch:
```bash
git checkout mvds
```

3. Prepare the wallet identity (a sketch of how the service can load it follows these steps):
```bash
mkdir wallet-identity
chmod o+rw wallet-identity
docker run -v $(pwd)/wallet-identity:/cert quay.io/wi_stefan/did-helper:0.1.1
# insecure, only do this for testing
sudo chmod -R o+rw wallet-identity/private-key.pem
```

4. Create the wallet identity secret:
```bash
kubectl create secret generic wallet-identity-secret \
--from-file=did.json=wallet-identity/did.json \
--from-file=private-key.pem=wallet-identity/private-key.pem \
-n consumer

# Check
kubectl get secrets -n consumer
kubectl describe secret wallet-identity-secret -n consumer
```

5. Create the ORS API key secret:
```bash
kubectl create secret generic ors-api-key --from-literal=OPENROUTESERVICE_API_KEY=your_api_key -n consumer
```

6. Create the Keycloak login secret:
```bash
kubectl create secret generic keycloak-credentials \
--from-literal=KEYCLOAK_USER='test-user' \
--from-literal=KEYCLOAK_PASSWORD='test' \
--from-literal=KEYCLOAK_CLIENT_ID='admin-cli' \
-n consumer

# Check
kubectl get secrets -n consumer
kubectl describe secret keycloak-credentials -n consumer
```

7. Deploy the MIAIS in your Kubernetes cluster:
```bash
kubectl apply -f flask-app.yaml -n consumer
kubectl get pods -n consumer
```
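
For reference, here is a minimal sketch of how the service can load the wallet identity created in step 3 (and mounted through the secret from step 4) before starting the OID4VP flow. It assumes the files generated by the `did-helper` image follow the standard DID document layout and are available under `wallet-identity/`; the path is an assumption, not part of the demo code.

```python
import json
from pathlib import Path

# The wallet-identity files are created in step 3 and mounted into the pod
# through the wallet-identity-secret created in step 4 (path is an assumption).
IDENTITY_DIR = Path("wallet-identity")

# did.json is a standard DID document; its "id" field is the service's DID.
did_document = json.loads((IDENTITY_DIR / "did.json").read_text())
service_did = did_document["id"]

# The private key stays in PEM form; it is only used to sign verifiable
# presentations during the OID4VP authentication flow.
private_key_pem = (IDENTITY_DIR / "private-key.pem").read_text()

print(f"Service DID: {service_did}")
```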

## Extending MIAIS
The Minimal Interoperable AI Service is a starting point; feel free to explore and edit the project and build it up on your own. Here are some tips that can help you adapt this example to your needs:

??? tip "Project structure"
- `static/`: Frontend folder.

Moreover, your situation may need to consider time restrictions or priorities. Check out the [Openroute service API specification](https://github.com/VROOM-Project/vroom/blob/master/docs/API.md), which is powerful and includes many parameters to fit your optimization needs. To change or add query parameters, go over the [`Optimization.py`](https://github.com/CitComAI-Hub/waste-collection-demo/blob/mvs-orionld/services/Optimization.py) and [`Optimizer.js`](https://github.com/CitComAI-Hub/waste-collection-demo/blob/mvs-orionld/static/modules/Optimizer.js) files.
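
As an illustration of the kind of parameters the Vroom-based API accepts, the sketch below adds a priority, a time window, and a capacity constraint to the optimization payload. The field names follow the Vroom API documentation, while the concrete values and IDs are made up for the example and are not part of the demo code.

```python
# Hypothetical extension of the optimization payload: a nearly full container
# gets a higher priority and can only be visited within a time window, and the
# truck advertises a limited capacity.
jobs = [
    {
        "id": 1,
        "location": [-0.3763, 39.4699],
        "priority": 80,                    # 0-100, higher = served first
        "time_windows": [[28800, 43200]],  # e.g. 08:00-12:00 expressed in seconds
        "delivery": [1],                   # one container's worth of waste
    },
]

vehicles = [
    {
        "id": 1,
        "profile": "driving-car",
        "start": [-0.3900, 39.4600],
        "end": [-0.3900, 39.4600],
        "capacity": [40],                  # truck can hold 40 "units"
        "time_window": [25200, 61200],     # working hours of the truck
    },
]

payload = {"jobs": jobs, "vehicles": vehicles}
# `payload` can then be posted to the optimization endpoint as shown earlier,
# or wired into `Optimization.py` / `Optimizer.js` in the demo.
```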

??? tip "Level 1: Authentication"
When working with brokers in a production state, authentication is often required. The [`ngsild-client`](https://github.com/CitComAI-Hub/ngsild-client) library included in the example does not come with authentication support. However, it is quite straightforward to extend it to meet authentication requirements. As an example, see the following code from the [Valencia](https://github.com/CitComAI-Hub/waste-collection-demo/tree/valencia) TEF site implementation, which implementes [authentication for their NGSIv2 setup](https://github.com/CitComAI-Hub/ngsild-client/blob/master/Authv2.py).

```python
from lib.ngsildclient.Auth import Authv2
from lib.ngsildclient.Client import Client


# Define service & subservice
service = "tef_city"
subservice = "/containers"

# Authenticate
auth = Authv2()
token = auth.get_auth_token_subservice(service, subservice)

# Ngsi-ld broker client
client = Client()

# Fetch WasteContainer entities
context = os.environ.get("WASTECONTAINERS_CONTEXT")
containers = client.get_all_entities_by_type("WasteContainer", context, 100, 0, service, subservice, token).json()
```

Environment variables in `.env` file:

```bash
AUTH_PROTOCOL="https"
ENDPOINT_KEYSTONE="auth.tef.com:15000"
AUTH_USER="xxxxx"
AUTH_PASSWORD="xxxxx"
```

??? tip "Level 2: Data Space Connector Authentication"
If your TEF has a data space connector deployed, you can use the `fdsauth` Python library to authenticate and retrieve the corresponding token. To install `fdsauth`, simply use `pip`:
```bash
pip install fdsauth
```
Next, a DID (Decentralized Identifier) and the corresponding key material are required. You can create them via:
```bash
mkdir certs && cd certs
docker run -v $(pwd):/cert quay.io/wi_stefan/did-helper:0.1.1
```

Then, use the following example code to obtain your authentication token:
```python
from fdsauth import Consumer
import requests

consumer = Consumer(
keycloak_protocol="http",
keycloak_endpoint="keycloak.consumer-a.local",
keycloak_realm_path="realms/test-realm/protocol",
keycloak_user_name="test-user",
keycloak_user_password="test",
apisix_protocol="http",
apisix_endpoint="apisix-proxy.provider-a.local",
certs_path="./certs",
)

token = consumer.get_data_service_access_token()

try:
# Attempt to access data using the obtained service token. Get entities of type EnergyReport.
url = f"http://apisix-proxy.provider-a.local/ngsi-ld/v1/entities?type=EnergyReport"
headers = {
"Accept": "application/json",
"Authorization": f"Bearer {token}",
}
response = requests.get(url, headers=headers)
response.raise_for_status()
print(response.json())
except Exception as req_err:
print(f"Request error occurred: {req_err}")
```
For more details, check out the [fdsauth repository](https://github.com/CitComAI-Hub/fdsauth/).

## Issues
Did you find a problem? [Create a new issue](https://github.com/CitComAI-Hub/waste-collection-demo/issues/new).

### Track and status of known problems
- [X] Openroute optimization service has a maximum limit of 70 locations. This can be solved by [deploying your own Openroute instance](https://giscience.github.io/openrouteservice/getting-started).
- [ ] Output from the MIAIS should also be provided following MIM1 and MIM2 recommendations, e.g., using Smart Data Models such as [FleetVehicle](), [FleetVehicleOperation](), [Road]() and [RoadSegment]().
10 changes: 10 additions & 0 deletions docs/stylesheets/extra.css
footer .footer-tagline p {
margin: 0;
padding: 0;
}

figure {
max-width: 100% !important;
width: auto !important;
}

figure > svg {
width: 100% !important;
height: auto !important;
}