Python Forum
Integration of apache spark and Kafka on eclipse pyspark
#1
This is my development environment for integrating Kafka and Spark:

IDE : Eclipse 2020-12
Python : Anaconda 2020.02 (Python 3.7)
Kafka : 2.13-2.7.0
Spark : 3.0.1-bin-hadoop3.2

My Eclipse configuration follows the reference site linked here. Simple PySpark code runs without errors, but integrating Kafka with Spark Structured Streaming raises errors. This is the code:

from pyspark.sql import SparkSession

# Local session using all available cores
spark = SparkSession.builder.master("local[*]").appName("appName").getOrCreate()

# Batch-read the Kafka topic from the earliest offset and
# cast the binary message value to a string column
df = spark.read.format("kafka")\
            .option("kafka.bootstrap.servers", "localhost:9092")\
            .option("subscribe", "topicForMongoDB")\
            .option("startingOffsets", "earliest")\
            .load()\
            .selectExpr("CAST(value AS STRING) as column")
df.printSchema()
df.show()
The error thrown is:

Error:
pyspark.sql.utils.AnalysisException: Failed to find data source: kafka. Please deploy the application as per the deployment section of "Structured Streaming + Kafka Integration Guide".;
So I added Python code to bind the related jar files:

import os

os.environ['PYSPARK_SUBMIT_ARGS'] = '--packages org.apache.spark:spark-sql-kafka-0-10_2.12:3.1.0,org.apache.spark:spark-streaming-kafka-0-10_2.12:3.1.0'
But this time another error occurs.

Error:
Error: Missing application resource.
Usage: spark-submit [options] <app jar | python file | R file> [app arguments]
Usage: spark-submit --kill [submission ID] --master [spark://...]
Usage: spark-submit --status [submission ID] --master [spark://...]
Usage: spark-submit run-example [options] example-class [example args]

Options:
  --master MASTER_URL        spark://host:port, mesos://host:port, yarn,
                             k8s://https://host:port, or local (Default: local[*]).
  --deploy-mode DEPLOY_MODE  Whether to launch the driver program locally ("client") or
                             on one of the worker machines inside the cluster ("cluster")
                             (Default: client).
  --class CLASS_NAME         Your application's main class (for Java / Scala apps).
  --name NAME                A name of your application.
  --jars JARS                Comma-separated list of jars to include on the driver
                             and executor classpaths.
  --packages                 Comma-separated list of maven coordinates of jars to include
                             on the driver and executor classpaths. Will search the local
                             maven repo, then maven central and any additional remote
                             repositories given by --repositories. The format for the
                             coordinates should be groupId:artifactId:version.
I am stuck here. My Eclipse configuration and PySpark code clearly have some issue, but I have no idea what causes these errors. Please advise me on the correct configuration for integrating Kafka with Spark and PySpark. Any reply is welcome.
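For reference, a common cause of the "Missing application resource" error is that PYSPARK_SUBMIT_ARGS ends after the --packages list, so spark-submit receives options but no application to run; appending the `pyspark-shell` token is the usual remedy. The sketch below also pins the connector to the installed Spark version (3.0.1 with Scala 2.12 here, rather than 3.1.0) and assumes the environment variable is set before the first pyspark import, since the packages are only resolved when the JVM is launched:

```python
import os

# Match the connector coordinates to the installed Spark/Scala build
# (assumed here: Spark 3.0.1, Scala 2.12).
packages = "org.apache.spark:spark-sql-kafka-0-10_2.12:3.0.1"

# The trailing "pyspark-shell" tells spark-submit what to run;
# without it, spark-submit aborts with "Missing application resource".
os.environ["PYSPARK_SUBMIT_ARGS"] = f"--packages {packages} pyspark-shell"

# Set this *before* importing pyspark / creating the SparkSession,
# because the JVM picks up PYSPARK_SUBMIT_ARGS at startup.
print(os.environ["PYSPARK_SUBMIT_ARGS"])
```

This is a sketch of the commonly suggested fix, not a verified configuration for this exact Eclipse setup.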
Reply


