This document explains how to configure and run the sample.
If you want to run the sample on Windows, macOS, or Linux, you need the following tools:
- Azure Functions Core Tools (v3 or above)
- Node.js (10 or 12)
- Azure CLI
However, if you can use a DevContainer, you don't need to prepare the development environment yourself. The prerequisites for the DevContainer are:
- Docker for Windows or Docker for Mac
- Visual Studio Code
- Visual Studio Code - Remote Development extension
The DevContainer sets up all of the prerequisites, including the Azure CLI, together with a local Kafka cluster.
Go to the samples/javascript directory, then open Visual Studio Code.
$ cd samples/javascript
$ code .
Visual Studio Code might automatically ask you to start a container. If not, you can click the green icon (><) at the bottom-left of the window, and you will see the following dropdown.
Select Remote-Containers: Reopen in Container.
It starts the DevContainer; wait a couple of minutes, and you will find a Node.js development environment, with a local Kafka cluster already up, inside Visual Studio Code.
In the table below, a Kafka Cluster value of `local` means that the sample uses the Kafka cluster that is started with the DevContainer.
| Name | Description | Kafka Cluster | Enabled |
|---|---|---|---|
| KafkaTrigger | Simple Kafka trigger sample | local | yes |
| KafkaTriggerMany | Kafka batch processing sample with Confluent Cloud | Confluent Cloud | no |
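For orientation, here is a minimal sketch of what the local KafkaTrigger sample looks like: the trigger binding lives in `function.json` and the handler in `index.js`. The topic, broker, and consumer group values below are illustrative placeholders, not necessarily the exact ones used by the sample.

```json
{
  "bindings": [
    {
      "type": "kafkaTrigger",
      "direction": "in",
      "name": "event",
      "topic": "users",
      "brokerList": "broker:29092",
      "consumerGroup": "functions"
    }
  ]
}
```

```javascript
// index.js: logs each Kafka event the trigger delivers.
module.exports = async function (context, event) {
    context.log(`Kafka trigger fired: ${event}`);
};
```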
If you want to use the KafkaTriggerMany sample, rename `KafkaTriggerMany/function.json_` to `KafkaTriggerMany/function.json`. This allows the Azure Functions runtime to detect the function.
Then copy `local.settings.json.example` to `local.settings.json` and configure your Confluent Cloud (ccloud) environment.
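As a rough sketch of the result (the authoritative template is `local.settings.json.example`; the values below are placeholders), the configured file might look like this:

```json
{
  "IsEncrypted": false,
  "Values": {
    "AzureWebJobsStorage": "UseDevelopmentStorage=true",
    "FUNCTIONS_WORKER_RUNTIME": "node",
    "BrokerList": "changeme.eastus.azure.confluent.cloud:9092",
    "ConfluentCloudUsername": "<confluent-api-key>",
    "ConfluentCloudPassword": "<confluent-api-secret>"
  }
}
```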
If you want to run the sample on Windows with Confluent Cloud and you are not using the DevContainer, uncomment the following line in `KafkaTriggerMany/function.json`. It configures the location of the CA certificate: .NET Core, which the Azure Functions host runs on, cannot access the Windows registry, which means it cannot access the CA certificate for Confluent Cloud.

```json
"sslCaLocation": "confluent_cloud_cacert.pem",
```
To download `confluent_cloud_cacert.pem`, you can refer to Connecting to Confluent Cloud in Azure.
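For reference, here is a hedged sketch of what the Confluent Cloud trigger binding in `KafkaTriggerMany/function.json` can look like, with `%...%` placeholders resolved from the application settings; the topic name and consumer group are illustrative:

```json
{
  "bindings": [
    {
      "type": "kafkaTrigger",
      "direction": "in",
      "name": "events",
      "cardinality": "MANY",
      "topic": "message",
      "brokerList": "%BrokerList%",
      "consumerGroup": "$Default",
      "protocol": "SASLSSL",
      "authenticationMode": "PLAIN",
      "username": "%ConfluentCloudUsername%",
      "password": "%ConfluentCloudPassword%",
      "sslCaLocation": "confluent_cloud_cacert.pem"
    }
  ]
}
```

With `cardinality` set to `MANY`, the handler receives a batch (an array) of events rather than a single event, which is what makes this the batch processing sample.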
The following command installs the Kafka extension. It refers to `extensions.csproj` to find the Kafka extension NuGet package.
$ func extensions install
Check if the dll packages are under `target/azure-functions/kafka-function-(some number)/bin`. If the installation succeeded, you will find `Microsoft.Azure.WebJobs.Extensions.Kafka.dll` there.
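One quick way to check (a sketch; the exact folder name contains a generated number, hence the glob):

```
$ ls target/azure-functions/kafka-function-*/bin | grep Kafka
```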
Before running the Kafka extension, you need to set `LD_LIBRARY_PATH` to `/workspace/bin/runtimes/linux-x64/native`. For the DevContainer, the configuration resides in `devcontainer.json`, so you don't need to configure it yourself.
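If you run the host outside the DevContainer on Linux, a minimal sketch is to export the variable in your shell before starting the host (the path assumes the workspace layout above):

```
$ export LD_LIBRARY_PATH=/workspace/bin/runtimes/linux-x64/native
```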
$ func start
Deploy the app to a Premium plan Function App. You can choose from the following guides:
- Quickstart: Create a function in Azure using Visual Studio Code
- Quickstart: Create a function in Azure that responds to HTTP requests
- Azure Functions Premium plan
Go to the Azure Portal, select your Function App, then go to Configuration > Application settings. You need to configure these application settings: `BrokerList`, `ConfluentCloudUsername`, and `ConfluentCloudPassword` are required for the sample. `LD_LIBRARY_PATH` is required for Linux-based Function Apps; it references the shared object (.so) library that is included in the Kafka extension.
| Name | Description | NOTE |
|---|---|---|
| BrokerList | Kafka Broker List | e.g. changeme.eastus.azure.confluent.cloud:9092 |
| ConfluentCloudUsername | Username of Confluent Cloud | - |
| ConfluentCloudPassword | Password of Confluent Cloud | - |
| LD_LIBRARY_PATH | /home/site/wwwroot/bin/runtimes/linux-x64/native | Linux only |
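Instead of the portal, you can also set these settings with the Azure CLI; this is a sketch with placeholder names, so substitute your own Function App name, resource group, and Confluent credentials:

```
$ az functionapp config appsettings set \
    --name <function-app-name> \
    --resource-group <resource-group> \
    --settings "BrokerList=changeme.eastus.azure.confluent.cloud:9092" \
               "ConfluentCloudUsername=<confluent-api-key>" \
               "ConfluentCloudPassword=<confluent-api-secret>" \
               "LD_LIBRARY_PATH=/home/site/wwwroot/bin/runtimes/linux-x64/native"
```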
Send Kafka events from a producer. For Confluent Cloud, you can use the ccloud CLI.
$ ccloud login
$ ccloud kafka topic produce message
For more details, go to ccloud.
If you want to send an event to the local Kafka cluster, you can use kafkacat instead.
$ apt-get update && apt-get install kafkacat
$ kafkacat -b broker:29092 -t users -P
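For example, to produce a single test message from stdin (the payload here is illustrative):

```
$ echo '{ "name": "user1" }' | kafkacat -b broker:29092 -t users -P
```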