# Configuring Caching and Queueing
The technologies the gDcc uses for caching and communications can be changed through configuration. The DCC has the following configurable components:
| Component | Notes | Default | Examples of Alternatives |
|---|---|---|---|
| Cache | Caches the values of lookups and traces. | Custom Redis-based implementation | IDistributedCache using SQL Server or Redis |
| Queue | Sends requests to the backend (via Dapr). | RabbitMQ | Azure Service Bus |
| Pubsub | Sends messages to subscribers (via Dapr). | RabbitMQ | Azure Service Bus, Redis |
| State | Stores state. Used for results, binary data storage and roadnetmanager state (via Dapr). | Redis | Azure Storage |
## Configuring Secrets for Azure Connections
To use Azure Service Bus or Azure Storage, you'll need to configure the connection strings to Azure. To avoid accidentally committing credentials to git, it is recommended to use Dapr secrets. The examples below assume you have configured Dapr secrets.
- Copy `dev\dapr\alt-components\secretstore.yml` to `dev\dapr\components\`.
- Create the file `secrets.json` in `dev\dapr\configuration`. It should contain the following:

  ```json
  {
      "connections": {
          "ASBConnectionString": "Endpoint=sb://jens-dcc-poc.servicebus.windows.net/;SharedAccessKeyName=XXX_YOUR_VALUE_XXX;SharedAccessKey=XXX_YOUR_VALUE_XXX",
          "AzureStorageAccountKey": "XXX_YOUR_VALUE_XXX"
      }
  }
  ```

For the values of these properties:
- `ASBConnectionString` can be found on https://portal.azure.com → Service Bus → Your service bus (ex: jens-dcc-poc) → Shared access policies → Your Policy.
- `AzureStorageAccountKey` can be found on https://portal.azure.com → Storage accounts → Your storage account (ex: jensdccpocsa) → Access keys → Key.
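For reference, `secretstore.yml` defines the Dapr secret store that the other components use to resolve these secrets. The repository's copy is authoritative; a minimal file-based secret store of this kind typically looks like the sketch below (the component name and the exact `secretsFile` path are assumptions):

```yaml
apiVersion: dapr.io/v1alpha1
kind: Component
metadata:
  name: secretstore            # the name other components reference via auth.secretStore
spec:
  type: secretstores.local.file
  version: v1
  metadata:
    # path to secrets.json, relative to where the Dapr sidecar runs (assumption)
    - name: secretsFile
      value: ./configuration/secrets.json
    # lets nested keys be addressed as connections:ASBConnectionString
    - name: nestedSeparator
      value: ":"
```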
## Configuring PubSub and Queueing via Azure Service Bus
If you wish to use Azure Service Bus, you will need an Azure subscription with a Service Bus namespace.
- Start by configuring secrets as described above.
- Copy `dev\dapr\alt-components\asb-pubsub-queue.yml` to the folder `dev\dapr\components\`.
- Copy `dev\dapr\alt-components\asb-pubsub.yml` to the folder `dev\dapr\components\`.
- In `dev\docker-compose\.env`, update the following values:
  - `PubSubQueue=pubsub-queue` → `PubSubQueue=asb-pubsub-queue`
  - `PubSub=pubsub` → `PubSub=asb-pubsub`
  - `RoadNetPubSub=pubsub` → `RoadNetPubSub=asb-pubsub`
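The copied `asb-pubsub.yml` is authoritative for the actual component definition; as a rough sketch, a Dapr Azure Service Bus pub/sub component that resolves its connection string from the secret store configured earlier has roughly this shape (the secret store name and secret key layout are assumptions based on the `secrets.json` above):

```yaml
apiVersion: dapr.io/v1alpha1
kind: Component
metadata:
  name: asb-pubsub
spec:
  type: pubsub.azure.servicebus.topics
  version: v1
  metadata:
    # resolved from the Dapr secret store instead of being hard-coded
    - name: connectionString
      secretKeyRef:
        name: connections:ASBConnectionString
        key: connections:ASBConnectionString
auth:
  secretStore: secretstore
```

`asb-pubsub-queue.yml` would follow the same pattern under its own component name.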
This removes the dependency on RabbitMQ, so you can optionally remove it from your image:

- Delete `dev\dapr\components\pubsub-queue.yml`.
- Delete `dev\dapr\components\pubsub.yml`.
- Delete the `services:rabbitmq` section in `dev\docker-compose\docker-compose.yml`.
- Delete the rabbitmq dependency from the `x-services:dapr-service:depends_on` section in `dev\docker-compose\docker-compose.yml`.
- Delete the `mnesia` entry from the `volumes` section in `dev\docker-compose\docker-compose.yml`.
## Configuring State Storage via Azure Storage
If you wish to use Azure Storage, you will need an Azure subscription with a storage account.
- Start by configuring secrets as described above.
- Copy `dev\dapr\alt-components\asb-state.yml` to the folder `dev\dapr\components\`.
- In `dev\docker-compose\.env`, update the following values:
  - `State=state` → `State=asb-state`
  - `RoadNetState=state` → `RoadNetState=asb-state`
  - `BinaryStorage=state` → `BinaryStorage=asb-state`
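Again, the copied `asb-state.yml` is authoritative; a minimal sketch of a Dapr state store backed by Azure Blob Storage, reading the account key from the secret store, might look like this (the `accountName` and `containerName` values are placeholders you would replace with your own):

```yaml
apiVersion: dapr.io/v1alpha1
kind: Component
metadata:
  name: asb-state
spec:
  type: state.azure.blobstorage
  version: v1
  metadata:
    - name: accountName
      value: XXX_YOUR_STORAGE_ACCOUNT_XXX   # e.g. jensdccpocsa
    - name: containerName
      value: XXX_YOUR_CONTAINER_XXX
    # account key resolved from the Dapr secret store, not stored in this file
    - name: accountKey
      secretKeyRef:
        name: connections:AzureStorageAccountKey
        key: connections:AzureStorageAccountKey
auth:
  secretStore: secretstore
```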
The cache still needs Redis, so you can't remove Redis from the image, but you can optionally remove the Redis component from Dapr:

- Delete `dev\dapr\components\state.yml`.
- Delete the redis dependency from the `x-services:dapr-service:depends_on` section in `dev\docker-compose\docker-compose.yml`.
## Configuring Cache via IDistributedCache
The cache isn't controlled by Dapr, and we currently support three implementations:

- A custom Redis-based cache (the default)
- A cache using `IDistributedCache` backed by Redis
- A cache using `IDistributedCache` backed by SQL Server
### Using IDistributedCache with Redis
- In `dev\docker-compose\.env`, update the following value:
  - `CacheType=RedisSpecialized` → `CacheType=Redis`
### Using IDistributedCache with SQL Server
To use SQL Server you'll have to add a SQL Server image to the compose file. The image is stored in our own cloud, so retrieving it requires some extra steps.
Pull the SQL Server image:

- Run the batch script `dev\PullSqlServerImage.bat` by double-clicking it.
- A browser tab will open asking you to log in to Azure. Do that.
- Wait for the script to finish (it will take some time).
Prepare storage by adding a named volume for the database files under `volumes`. This could look like:

```yaml
volumes:
  mnesia:
  redis:
  sqlserver:
```

Add the SQL Server image by adding the following section under `services` in `dev\docker-compose\docker-compose.yml`:
```yaml
sqlserver:
  # pulling this may require auth: 1. install 'azure CLI' if missing 2. run 'az login' and login via browser/popup 3. run 'az acr login --name amcsmainprdcr'
  image: amcsmainprdcr.azurecr.io/sql-server:2019-win2019-1.0.0
  ports:
    - "1431:1433"
  environment:
    - SA_PASSWORD=dv4ZUD3GGsKeaMsc
    - ACCEPT_EULA=Y
    - MSSQL_PID=Developer
    - 'attach_dbs=[{dbFiles: "c:\\databases\\DccCache.mdf", dbName: "DccCache"}]'
  volumes:
    - sqlserver:c:\databases
  networks:
    - $Network
  healthcheck:
    test: ["CMD", "sqlcmd", "-q", "SELECT 1"]
    interval: 30s
    timeout: 10s
    retries: 5
```

Optionally add a dependency so the components don't start until the SQL Server is running. The easiest is to add it to the Dapr components under the `x-services:dapr-service:depends_on` section:

```yaml
depends_on:
  rabbitmq:
    condition: service_healthy
  redis:
    condition: service_healthy
  sqlserver:
    condition: service_healthy
```

Initialize the cache database:
- In VS, open the "dev" folder in Solution Explorer, right-click the docker-compose project and choose "Set as Startup Project", then press Ctrl+F5 to start without debugging. This will build and start a number of containers, including the one containing the database.
- Open Docker Desktop. In "Containers" there should be a "globaldcc-compose" entry. Unfold it and wait for the sqlserver container to have status running.
- Open SQL Server Management Studio (SSMS). When asked which server to connect to, choose:
  - Server type: Database Engine
  - Server name: 127.0.0.1,1431
  - Authentication: SQL Server Authentication
  - Login: sa
  - Password: dv4ZUD3GGsKeaMsc
- In SSMS open "Databases". This folder should be empty except for a couple of standard subfolders.
- In VS, open the "tools" folder in Solution Explorer, right-click the DccCache.DbUp project and choose "Set as Startup Project", then press F5 to run DbUp. This should create the database and the needed tables.
- In SSMS right-click "Databases" and click "Refresh". There should now be a DccCache database.
- In Docker Desktop, stop globaldcc-compose by clicking the stop button.
## Optional: Removing Redis
If you switch the state to using Azure Storage and the cache to using SQL Server, you can remove the Redis dependency from your image:
- Delete `dev\dapr\components\state.yml`.
- Delete the `services:redis` section in `dev\docker-compose\docker-compose.yml`.
- Delete the redis dependency from the `x-services:dapr-service:depends_on` section in `dev\docker-compose\docker-compose.yml`.
- Delete the `redis` entry from the `volumes` section in `dev\docker-compose\docker-compose.yml`.