How to Install Confluent Kafka on CentOS & Ubuntu
You can install the Confluent Platform using a variety of formats, including Docker, ZIP, TAR, APT, and YUM.
Prerequisites:
Oracle Java 1.7 or later.
You can check your Java version with this command:
$ java -version
If Java is not installed, follow this procedure:
[root@kafkaserveringserver kafkaManager]# yum install java-1.8.0-openjdk java-1.8.0-openjdk-devel -y
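Once the install finishes, it is worth confirming that the JDK is actually present and on the PATH. A quick optional check, assuming the OpenJDK 1.8 packages installed above:
# rpm -q java-1.8.0-openjdk
# java -version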
Port Configuration
By default, the Confluent Platform components use the ports listed below to communicate; these ports must be open. If a host firewall is running, see the sketch after the table.
Component                           Port
Apache Kafka brokers (plain text)   9092
Confluent Control Center            9021
Kafka Connect REST API              8083
REST Proxy                          8082
Schema Registry REST API            8081
ZooKeeper                           2181
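If a host firewall is running (firewalld is the default on CentOS 7 / RHEL 7), the ports for the components you actually use also need to be opened there. A minimal sketch for just the broker, Schema Registry, and ZooKeeper; extend the list to match the components you run:
# firewall-cmd --permanent --add-port=9092/tcp
# firewall-cmd --permanent --add-port=8081/tcp
# firewall-cmd --permanent --add-port=2181/tcp
# firewall-cmd --reload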
RHEL and CentOS
The YUM repositories provide packages for RHEL, CentOS, and Fedora-based distributions. Install the Confluent public key, which is used to sign packages in the YUM repository.
# rpm --import https://packages.confluent.io/rpm/3.3/archive.key
[root@kafkaserver ~]# rpm --import https://packages.confluent.io/rpm/3.3/archive.key
Add the repository to your /etc/yum.repos.d/ directory in a file named confluent.repo.
[root@kafkaserver ~]# vi /etc/yum.repos.d/confluent.repo
[Confluent.dist]
name=Confluent repository (dist)
baseurl=https://packages.confluent.io/rpm/3.3/7
gpgcheck=1
gpgkey=https://packages.confluent.io/rpm/3.3/archive.key
enabled=1
[Confluent]
name=Confluent repository
baseurl=https://packages.confluent.io/rpm/3.3
gpgcheck=1
gpgkey=https://packages.confluent.io/rpm/3.3/archive.key
enabled=1
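Before installing, you can optionally confirm that both repository sections are being picked up (the exact output depends on which other repositories are configured on the host):
# yum repolist enabled | grep -i confluent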
Clear the YUM caches.
# yum clean all
[root@kafkaserver ~]# yum clean all
Loaded plugins: fastestmirror, langpacks, product-id, search-disabled-repos, subscription-manager
1 local certificate has been deleted.
Cleaning repos: Confluent Confluent.dist docker-ce-stable elasticsearch-6.x epel epel-debuginfo ius mariadb pgdg10 pgdg11 pgdg12
: pgdg94 pgdg95 pgdg96 rhel-7-server-eus-rpms rhel-7-server-extras-rpms rhel-7-server-optional-rpms rhel-7-server-rpms
: rhel-7-server-supplementary-rpms treasuredata zabbix zabbix-non-supported
Cleaning up list of fastest mirrors
Other repos take up 75 M of disk space (use --verbose for details)
Install Confluent Platform.
Confluent Enterprise:
# yum install confluent-platform-2.11
Confluent Open Source:
# yum install confluent-platform-oss-2.11
[root@kafkaserver ~]# yum install confluent-platform-oss-2.11 -y
Loaded plugins: fastestmirror, langpacks, product-id, search-disabled-repos, subscription-manager
Determining fastest mirrors
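Once the install completes, the Confluent packages (Kafka, Schema Registry, REST Proxy, and so on) should be visible in the RPM database. An optional sanity check:
# rpm -qa | grep -i confluent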
To start just ZooKeeper, Kafka, and Schema Registry, run:
$ confluent start schema-registry
[root@kafkaserver ~]# confluent start schema-registry
This CLI is intended for development only, not for production
https://docs.confluent.io/current/cli/index.html
Starting zookeeper
zookeeper is [UP]
Starting kafka
kafka is [UP]
Starting schema-registry
schema-registry is [UP]
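The same development CLI can report the state of each service, and you can confirm the broker is listening on its default port (the second check assumes netcat is installed):
# confluent status
# nc -z localhost 9092 && echo "broker is listening on 9092"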
Remove Confluent Platform
[root@kafkaserver ~]# yum remove confluent-platform-oss-2.11
Loaded plugins: fastestmirror, langpacks, product-id, search-disabled-repos, subscription-manager
Resolving Dependencies
--> Running transaction check
---> Package confluent-platform-oss-2.11.noarch 0:3.3.3-1 will be erased
--> Finished Dependency Resolution
rhel-7-server-eus-rpms/7Server/x86_64 | 2.0 kB 00:00:00
rhel-7-server-extras-rpms/x86_64 | 2.0 kB 00:00:00
rhel-7-server-optional-rpms/7Server/x86_64 | 1.8 kB 00:00:00
rhel-7-server-rpms/7Server/x86_64 | 2.0 kB 00:00:00
rhel-7-server-supplementary-rpms/7Server/x86_64 | 2.0 kB 00:00:00
Dependencies Resolved
====================================================================================================================================
Package Arch Version Repository Size
====================================================================================================================================
Removing:
confluent-platform-oss-2.11 noarch 3.3.3-1 @Confluent 0.0
Transaction Summary
====================================================================================================================================
Debian and Ubuntu
The APT repositories provide packages for Debian-based Linux distributions such as Debian and Ubuntu. Install the Confluent public key, which is used to sign the packages in the APT repository.
$ wget -qO - https://packages.confluent.io/deb/3.3/archive.key | sudo apt-key add -
Add the repository to your /etc/apt/sources.list:
# add-apt-repository "deb [arch=amd64] https://packages.confluent.io/deb/3.3 stable main"
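If add-apt-repository is not available on a minimal system, an equivalent approach is to drop the same line into a dedicated sources file; the file name confluent.list below is just a convention chosen for this example:
$ echo "deb [arch=amd64] https://packages.confluent.io/deb/3.3 stable main" | sudo tee /etc/apt/sources.list.d/confluent.list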
Run apt-get update and install Confluent Platform.
Confluent Enterprise:
# apt-get update && sudo apt-get install confluent-platform-2.11
Confluent Open Source:
# apt-get update && sudo apt-get install confluent-platform-oss-2.11
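As on RHEL, you can optionally confirm that the packages were installed:
$ dpkg -l | grep confluent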
To start just ZooKeeper, Kafka, and Schema Registry, run:
$ confluent start schema-registry
[root@kafkaserver ~]# confluent start schema-registry
This CLI is intended for development only, not for production
https://docs.confluent.io/current/cli/index.html
Starting zookeeper
zookeeper is [UP]
Starting kafka
kafka is [UP]
Starting schema-registry
schema-registry is [UP]
Remove Confluent Platform
[root@kafkaserver ~]# apt-get remove confluent-platform-oss-2.11
Let's Test Confluent Kafka
Now that you have all of the services running, you can send some Avro data to a Kafka topic. Although you would normally do this from within your applications, this quick start uses the Kafka Avro Console producer utility (kafka-avro-console-producer) to send the data without having to write any code.
Start the Kafka Avro Console Producer utility. It is directed at your local Kafka cluster and is configured to write to topic test, read each line of input as an Avro message, validate the schema against the Schema Registry at the specified URL, and finally indicate the format of the data.
[root@kafkaserver ~]# kafka-avro-console-producer \
--broker-list localhost:9092 --topic test \
--property value.schema='{"type":"record","name":"myrecord","fields":[{"name":"f1","type":"string"}]}'
Enter a single message per line and press the Enter key to send them immediately. Try entering a couple of messages:
{"f1": "value1"}
{"f1": "value2"}
{"f1": "value3"}
When you’re done, use Ctrl+C to shut down the process.
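The console producer looks up the Schema Registry at http://localhost:8081 by default. If your registry runs elsewhere, the URL can be passed explicitly; a sketch assuming the registry is on its default port on the same host:
# kafka-avro-console-producer \
--broker-list localhost:9092 --topic test \
--property schema.registry.url=http://localhost:8081 \
--property value.schema='{"type":"record","name":"myrecord","fields":[{"name":"f1","type":"string"}]}'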
Now we can check that the data was produced by using Kafka’s console consumer to read data from the topic. We point it at the same test topic and our ZooKeeper instance, tell it to decode each message as Avro (looking up schemas in the same Schema Registry), and finally tell it to start from the beginning of the topic (by default the consumer only reads messages published after it starts).
[root@kafkaserver ~]# kafka-avro-console-consumer --topic test \
--zookeeper localhost:2181 \
--from-beginning
Using the ConsoleConsumer with old consumer is deprecated and will be removed in a future major release. Consider using the new consumer by passing [bootstrap-server] instead of [zookeeper].
{"f1":"value1"}
{"f1":"value2"}
{"f1":"value3"}
^CProcessed a total of 3 messages
[root@kafkaserver ~]#
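As the warning above notes, the ZooKeeper-based consumer is deprecated. The same read can be done with the new consumer by pointing it at the broker instead of ZooKeeper; a sketch using the default broker address from this guide:
# kafka-avro-console-consumer --topic test \
--bootstrap-server localhost:9092 \
--from-beginning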
When you’re done testing, you can use confluent stop to shut down each service in the right order. To completely delete any data produced during this test and start from a clean slate next time, you can run confluent destroy instead. This command deletes all of the services’ data, which is otherwise persisted across restarts.
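For example, to stop the services but keep their data for the next run (same development CLI as above):
# confluent stop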
[root@kafkaserver ~]# confluent destroy
This CLI is intended for development only, not for production
https://docs.confluent.io/current/cli/index.html
Stopping connect
connect is [DOWN]
Stopping kafka-rest
kafka-rest is [DOWN]
Stopping schema-registry
schema-registry is [DOWN]
Stopping kafka
kafka is [DOWN]
Stopping zookeeper
zookeeper is [DOWN]
Deleting: /tmp/confluent.xaXITdrl
[root@kafkaserver ~]#