{"id":46660,"date":"2021-08-20T00:00:00","date_gmt":"2021-08-20T07:00:00","guid":{"rendered":"https:\/\/griddb-linux-hte8hndjf8cka8ht.westus-01.azurewebsites.net\/blog\/using-griddb-as-a-source-for-kafka-with-jdbc\/"},"modified":"2025-11-13T12:55:32","modified_gmt":"2025-11-13T20:55:32","slug":"using-griddb-as-a-source-for-kafka-with-jdbc","status":"publish","type":"post","link":"https:\/\/griddb-linux-hte8hndjf8cka8ht.westus-01.azurewebsites.net\/en\/blog\/using-griddb-as-a-source-for-kafka-with-jdbc\/","title":{"rendered":"Using GridDB as a source for Kafka with JDBC"},"content":{"rendered":"<p>Last year we released a <a href=\"https:\/\/docs.griddb.net\/tutorial\/kafka\/\">guide\/tutorial<\/a> on how to ingest data using GridDB and Kafka. In that guide, we walked developers through the process of feeding data from a CSV into the console producer, through Kafka, and then through the GridDB JDBC <em>Sink<\/em> into GridDB itself.<\/p>\n<p>For this blog, because of the new update to the <a href=\"https:\/\/github.com\/griddbnet\/kafka-connect-jdbc-griddb\/\">JDBC Kafka Connector<\/a>, we will be going in the opposite direction. That is, we will be using the GridDB JDBC <em>Source<\/em> to move data from the GridDB database, through Kafka, and out to the consumer (or another Kafka Sink plugin).<\/p>\n<p>To help demonstrate this process, we will be using the <code>go<\/code> script from our most recent tutorial: <a href=\"https:\/\/docs.griddb.net\/gettingstarted\/go\/#simulating-an-iot-dataset\">Simulating IoT Data with Go<\/a>. 
In that tutorial, we go through the process of creating a simple script that inserts a large amount of generated\/random data to simulate\/mimic an IoT dataset &#8212; the purpose being to create quick-and-dirty IoT proofs of concept.<\/p>\n<p>So, with the help of that tutorial and script, the basic flow will look like this: a <code>go<\/code> script will write data into our GridDB database, which will then be read by the <em>source<\/em> connector, fed into Kafka, and finally output to the console.<\/p>\n<p><a href=\"https:\/\/griddb.net\/wp-content\/uploads\/2021\/08\/jdbc-diagram.png\"><img fetchpriority=\"high\" decoding=\"async\" src=\"https:\/\/griddb.net\/wp-content\/uploads\/2021\/08\/jdbc-diagram.png\" alt=\"\" width=\"889\" height=\"500\" class=\"aligncenter size-full wp-image-27718\" srcset=\"\/wp-content\/uploads\/2021\/08\/jdbc-diagram.png 889w, \/wp-content\/uploads\/2021\/08\/jdbc-diagram-300x169.png 300w, \/wp-content\/uploads\/2021\/08\/jdbc-diagram-768x432.png 768w, \/wp-content\/uploads\/2021\/08\/jdbc-diagram-150x85.png 150w, \/wp-content\/uploads\/2021\/08\/jdbc-diagram-600x337.png 600w\" sizes=\"(max-width: 889px) 100vw, 889px\" \/><\/a><\/p>\n<h2>Installation<\/h2>\n<p>A majority of the instructions for getting this running can be found in the original <a href=\"https:\/\/docs.griddb.net\/tutorial\/kafka\/\">tutorial<\/a>. Essentially you will need to download and set up Kafka, run the ZooKeeper server, and then run the Kafka server. From there, you can build the <a href=\"https:\/\/github.com\/griddbnet\/kafka-connect-jdbc-griddb\/\">Kafka-Connect-JDBC-GridDB<\/a> connector from the Git repo with <code>mvn package<\/code>.<\/p>\n<p>Once you build the <code>.jar<\/code> file, <code>cp<\/code> it into your <code>kafka\/libs<\/code> directory. You should also place your <a href=\"https:\/\/github.com\/griddb\/jdbc\">GridDB JDBC Driver<\/a> <code>.jar<\/code> into that directory as well. 
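<\/p>\n<p>As a rough sketch, those build-and-copy steps might look like the following (the paths and <code>.jar<\/code> names here are assumptions &#8212; adjust them to match your Kafka install location and the versions you actually built or downloaded):<\/p>\n<div class=\"clipboard\">\n<pre><code class=\"language-bash\">$ mvn package   # run from the kafka-connect-jdbc-griddb repo root\n$ cp target\/kafka-connect-jdbc-*.jar \/path\/to\/kafka\/libs\/\n$ cp \/path\/to\/gridstore-jdbc.jar \/path\/to\/kafka\/libs\/<\/code><\/pre>\n<\/div>\n<p>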
And with that out of the way, we will now diverge from the original tutorial into new territory.<\/p>\n<h3>Configuring the GridDB Source Connector<\/h3>\n<p>We will now need to configure our GridDB source connector&#8217;s config file. This file defines the connection parameters and credentials that allow our Kafka server to communicate with our GridDB server. If you have the default GridDB installation, you can almost use this configuration file verbatim.<\/p>\n<p>So, for your <code>config\/connect-jdbc.properties<\/code> file, you can add the following:<\/p>\n<div class=\"clipboard\">\n<pre><code class=\"language-bash\">bootstrap.servers=localhost:9092\nname=griddb-sources\nconnector.class=io.confluent.connect.jdbc.JdbcSourceConnector\ntasks.max=1\nkey.converter.schemas.enable=true\nvalue.converter.schemas.enable=true\nbatch.size=1\nmode=bulk\n\ntopic.prefix=gridstore-02-\ntable.whitelist=\"kafkaBlog\"\n\nconnection.url=jdbc:gs:\/\/239.0.0.1:41999\/defaultCluster\/public\nconnection.user=admin\nconnection.password=admin\nauto.create=true\n\ntransforms=TimestampConverter\ntransforms.TimestampConverter.type=org.apache.kafka.connect.transforms.TimestampConverter$Value\ntransforms.TimestampConverter.format=yyyy-MM-dd hh:mm:ss\ntransforms.TimestampConverter.field=datetime\ntransforms.TimestampConverter.target.type=Timestamp<\/code><\/pre>\n<\/div>\n<p>The main difference here (when compared to the original tutorial) is that the connector class has been changed and we have added a <code>table.whitelist<\/code> parameter. For the purposes of this demo, we will simply look at a single GridDB container that will make up the entirety of our Kafka topic.<\/p>\n<h2>Usage<\/h2>\n<h3>GridDB as a Data Source<\/h3>\n<h4>Inserting<\/h4>\n<p>For this blog, we will use our <code>go<\/code> <a href=\"https:\/\/docs.griddb.net\/gettingstarted\/go\/#simulating-an-iot-dataset\">script\/tutorial<\/a> which will insert random values into our GridDB database. 
The tutorial documentation goes over it in more detail, but essentially the script allows a developer to insert &#8220;generated&#8221; <code>IoT<\/code> data over a span of X hours, from N sensors, into a GridDB server. Because it&#8217;s IoT-based data, all data from the &#8220;sensors&#8221; is inserted into a <code>Time Series container<\/code>.<\/p>\n<p>To run the script:<\/p>\n<div class=\"clipboard\">\n<pre><code class=\"language-bash\">$ go run singlePut.go 24 5<\/code><\/pre>\n<\/div>\n<p>The first number is the total number of hours to simulate, and the second is the interval, in minutes, at which data is &#8220;emitted&#8221;. So in this case, the script will generate data spanning from the <code>current time<\/code> through 24 hours from now, with data coming in every 5 minutes. The script also defaults to 15 sensors, each of which contributes a row for every point in time in the same container.<\/p>\n<p>If you are following along, make sure you change the <code>singlePut.go<\/code> script to update the container name to whatever you like; just make sure the container name matches the table whitelisted in the <code>jdbc.properties<\/code> file. 
For this blog, we are using <code>kafkaBlog<\/code>.<\/p>\n<div class=\"clipboard\">\n<pre><code class=\"language-go\">containerName := \"kafkaBlog\"\nconInfo, err := griddb_go.CreateContainerInfo(map[string]interface{} {\n    \"name\": containerName,\n    \"column_info_list\":[][]interface{}{\n        {\"timestamp\", griddb_go.TYPE_TIMESTAMP},\n        {\"id\", griddb_go.TYPE_SHORT},\n        {\"data\", griddb_go.TYPE_FLOAT},\n        {\"temperature\", griddb_go.TYPE_FLOAT}},\n    \"type\": griddb_go.CONTAINER_TIME_SERIES,\n    \"row_key\": true})\nif (err != nil) {\n    fmt.Println(\"Create containerInfo failed, err:\", err)\n    panic(\"err CreateContainerInfo\")\n}\ndefer griddb_go.DeleteContainerInfo(conInfo)<\/code><\/pre>\n<\/div>\n<p>The rest of the script will handle the actual data generation with a couple of <code>for<\/code> loops.<\/p>\n<div class=\"clipboard\">\n<pre><code class=\"language-go\">for i := 0; i &lt; int(arrLen); i++ {\n\n    innerLen := numSensors\n    fullData[0][i] = make([]interface{}, innerLen)\n    times[i] = make([]time.Time, innerLen)\n    id[i] = make([]int, innerLen)\n    data[i] = make([]float64, innerLen)\n    temp[i] = make([]float64, innerLen)\n\n    var rowList []interface{}\n\n    \/\/ iterates through each sensor (ie. 
each time step emits a row for every sensor)\n    for j := 0; j &lt; innerLen; j++ {\n        addedTime := i * minutes\n        timeToAdd := time.Minute * time.Duration(addedTime)\n        incTime :=  now.Add(timeToAdd)\n        \n        times[i][j] = incTime\n        id[i][j] = j\n        data[i][j] = (r1.Float64() * 100) + numSensors \/\/ using the random seed\n        x := (r1.Float64() * 100) + 2  \n        temp[i][j] = math.Floor(x*100) \/ 100 \/\/ temp should only go 2 decimal places\n\n        var row []interface{}\n        row = append(row, times[i][j])\n        row = append(row, id[i][j])\n        row = append(row, data[i][j])\n        row = append(row, temp[i][j])\n        rowList = append(rowList, row)\n        \/\/ fmt.Println(\"fullData: \", fullData[0][i][j])\n    }\n    fullData[0][i] = rowList\n}<\/code><\/pre>\n<\/div>\n<h4>Using The Source Connector<\/h4>\n<p>Once this is complete, we can use our <code>Source Connector<\/code> to read the table and feed our data into Kafka. So, let&#8217;s run the source connector.<\/p>\n<p>From your Kafka directory, run the following:<\/p>\n<div class=\"clipboard\">\n<pre><code class=\"language-bash\">$ .\/bin\/connect-standalone.sh config\/connect-standalone.properties config\/connect-jdbc.properties<\/code><\/pre>\n<\/div>\n<p>This will kick off the connector and it will begin looking for the table that is whitelisted. 
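<\/p>\n<p>One thing worth noting: with <code>mode=bulk<\/code>, the connector re-publishes the entire table on every poll interval. The stock Confluent JDBC source connector also supports incremental modes; as a sketch (assuming the GridDB fork supports this mode, and using the <code>timestamp<\/code> column from our container), you could swap in properties like these to publish only newly arrived rows:<\/p>\n<div class=\"clipboard\">\n<pre><code class=\"language-bash\">mode=timestamp\ntimestamp.column.name=timestamp<\/code><\/pre>\n<\/div>\n<p>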
If none is found, you will get the following message:<\/p>\n<p><code>[2021-08-14 01:04:08,047] WARN No tasks will be run because no tables were found (io.confluent.connect.jdbc.JdbcSourceConnector:150)<\/code><\/p>\n<p>But once the data is populated, you will see the connector output:<\/p>\n<div class=\"clipboard\">\n<pre><code class=\"language-bash\">[2021-08-14 01:05:18,149] INFO WorkerSourceTask{id=griddb-sources-0} flushing 0 outstanding messages for offset commit (org.apache.kafka.connect.runtime.WorkerSourceTask:487)\n[2021-08-14 01:05:18,156] INFO Started JDBC source task (io.confluent.connect.jdbc.source.JdbcSourceTask:257)\n[2021-08-14 01:05:18,156] INFO WorkerSourceTask{id=griddb-sources-0} Source task finished initialization and start (org.apache.kafka.connect.runtime.WorkerSourceTask:225)\n[2021-08-14 01:05:18,157] INFO Begin using SQL query: SELECT * FROM \"kafkaBlog\" (io.confluent.connect.jdbc.source.TableQuerier:164)<\/code><\/pre>\n<\/div>\n<p>With this message, we can now be sure our database is connected and its data is being streamed to Kafka.<\/p>\n<h3>Reading Kafka Content<\/h3>\n<p>Now that this is working, you can see your Kafka messages in the terminal like so:<\/p>\n<div class=\"clipboard\">\n<pre><code class=\"language-sh\">$ bin\/kafka-console-consumer.sh --topic gridstore-02-kafkaBlog --from-beginning --bootstrap-server localhost:9092<\/code><\/pre>\n<\/div>\n<p>It should output the contents of your container as Kafka messages:<\/p>\n<div class=\"clipboard\">\n<pre><code 
class=\"language-sh\">{\"schema\":{\"type\":\"struct\",\"fields\":[{\"type\":\"int64\",\"optional\":false,\"name\":\"org.apache.kafka.connect.data.Timestamp\",\"version\":1,\"field\":\"timestamp\"},{\"type\":\"int32\",\"optional\":true,\"field\":\"id\"},{\"type\":\"double\",\"optional\":true,\"field\":\"data\"},{\"type\":\"double\",\"optional\":true,\"field\":\"temperature\"}],\"optional\":false,\"name\":\"kafkaBlog\"},\"payload\":{\"timestamp\":1628903073145,\"id\":4,\"data\":9.307559967041016,\"temperature\":38.70000076293945}}\n{\"schema\":{\"type\":\"struct\",\"fields\":[{\"type\":\"int64\",\"optional\":false,\"name\":\"org.apache.kafka.connect.data.Timestamp\",\"version\":1,\"field\":\"timestamp\"},{\"type\":\"int32\",\"optional\":true,\"field\":\"id\"},{\"type\":\"double\",\"optional\":true,\"field\":\"data\"},{\"type\":\"double\",\"optional\":true,\"field\":\"temperature\"}],\"optional\":false,\"name\":\"kafkaBlog\"},\"payload\":{\"timestamp\":1628903373145,\"id\":3,\"data\":35.1313591003418,\"temperature\":79.73999786376953}}\nProcessed a total of 2 messages<\/code><\/pre>\n<\/div>\n<h2>Conclusion<\/h2>\n<p>With that done, you can now use your GridDB database as a source for Kafka via the JDBC Source Connector.<\/p>\n<p>For next steps, you can play around with other Sink connectors. You can, for example, use the <a href=\"https:\/\/docs.confluent.io\/kafka-connect-http\/current\/overview.html#http-connector-authentication\">Kafka HTTP Sink Connector<\/a> to send your payloads to an HTTP endpoint. This allows you to do almost anything, including sending your data to a Slack channel of your choice for alerts.<\/p>\n<p>The possibilities here are endless; you can even try a <a href=\"https:\/\/www.confluent.io\/hub\/jcustenborder\/kafka-connect-twitter\">Twitter Connector<\/a>.<\/p>\n","protected":false},"excerpt":{"rendered":"<p>Last year we released a guide\/tutorial on how to ingest data using GridDB and Kafka. 
In that guide, we walked developers through the process of feeding data from a CSV, into the console producer, through Kafka and then through the GridDB JDBC Sink through to GridDB itself. For this blog, because of the new update [&hellip;]<\/p>\n","protected":false},"author":1,"featured_media":27721,"comment_status":"open","ping_status":"open","sticky":false,"template":"","format":"standard","meta":{"_acf_changed":false,"footnotes":""},"categories":[121],"tags":[],"class_list":["post-46660","post","type-post","status-publish","format-standard","has-post-thumbnail","hentry","category-blog"],"acf":[],"yoast_head":"<!-- This site is optimized with the Yoast SEO plugin v27.1.1 - https:\/\/yoast.com\/product\/yoast-seo-wordpress\/ -->\n<title>Using GridDB as a source for Kafka with JDBC | GridDB: Open Source Time Series Database for IoT<\/title>\n<meta name=\"description\" content=\"Last year we released a guide\/tutorial on how to ingest data using GridDB and Kafka. In that guide, we walked developers through the process of feeding\" \/>\n<meta name=\"robots\" content=\"index, follow, max-snippet:-1, max-image-preview:large, max-video-preview:-1\" \/>\n<link rel=\"canonical\" href=\"https:\/\/griddb-linux-hte8hndjf8cka8ht.westus-01.azurewebsites.net\/en\/blog\/using-griddb-as-a-source-for-kafka-with-jdbc\/\" \/>\n<meta property=\"og:locale\" content=\"en_US\" \/>\n<meta property=\"og:type\" content=\"article\" \/>\n<meta property=\"og:title\" content=\"Using GridDB as a source for Kafka with JDBC | GridDB: Open Source Time Series Database for IoT\" \/>\n<meta property=\"og:description\" content=\"Last year we released a guide\/tutorial on how to ingest data using GridDB and Kafka. 
In that guide, we walked developers through the process of feeding\" \/>\n<meta property=\"og:url\" content=\"https:\/\/griddb-linux-hte8hndjf8cka8ht.westus-01.azurewebsites.net\/en\/blog\/using-griddb-as-a-source-for-kafka-with-jdbc\/\" \/>\n<meta property=\"og:site_name\" content=\"GridDB: Open Source Time Series Database for IoT\" \/>\n<meta property=\"article:publisher\" content=\"https:\/\/www.facebook.com\/griddbcommunity\/\" \/>\n<meta property=\"article:published_time\" content=\"2021-08-20T07:00:00+00:00\" \/>\n<meta property=\"article:modified_time\" content=\"2025-11-13T20:55:32+00:00\" \/>\n<meta property=\"og:image\" content=\"https:\/\/griddb-linux-hte8hndjf8cka8ht.westus-01.azurewebsites.net\/wp-content\/uploads\/2021\/08\/griddb-jdbc-1.png\" \/>\n\t<meta property=\"og:image:width\" content=\"1160\" \/>\n\t<meta property=\"og:image:height\" content=\"653\" \/>\n\t<meta property=\"og:image:type\" content=\"image\/png\" \/>\n<meta name=\"author\" content=\"Israel\" \/>\n<meta name=\"twitter:card\" content=\"summary_large_image\" \/>\n<meta name=\"twitter:creator\" content=\"@GridDBCommunity\" \/>\n<meta name=\"twitter:site\" content=\"@GridDBCommunity\" \/>\n<meta name=\"twitter:label1\" content=\"Written by\" \/>\n\t<meta name=\"twitter:data1\" content=\"Israel\" \/>\n\t<meta name=\"twitter:label2\" content=\"Est. 
reading time\" \/>\n\t<meta name=\"twitter:data2\" content=\"7 minutes\" \/>\n<script type=\"application\/ld+json\" class=\"yoast-schema-graph\">{\"@context\":\"https:\/\/schema.org\",\"@graph\":[{\"@type\":\"Article\",\"@id\":\"https:\/\/griddb-linux-hte8hndjf8cka8ht.westus-01.azurewebsites.net\/en\/blog\/using-griddb-as-a-source-for-kafka-with-jdbc\/#article\",\"isPartOf\":{\"@id\":\"https:\/\/griddb-linux-hte8hndjf8cka8ht.westus-01.azurewebsites.net\/en\/blog\/using-griddb-as-a-source-for-kafka-with-jdbc\/\"},\"author\":{\"name\":\"Israel\",\"@id\":\"https:\/\/griddb.net\/en\/#\/schema\/person\/c8a430e7156a9e10af73b1fbb46c2740\"},\"headline\":\"Using GridDB as a source for Kafka with JDBC\",\"datePublished\":\"2021-08-20T07:00:00+00:00\",\"dateModified\":\"2025-11-13T20:55:32+00:00\",\"mainEntityOfPage\":{\"@id\":\"https:\/\/griddb-linux-hte8hndjf8cka8ht.westus-01.azurewebsites.net\/en\/blog\/using-griddb-as-a-source-for-kafka-with-jdbc\/\"},\"wordCount\":827,\"commentCount\":0,\"publisher\":{\"@id\":\"https:\/\/griddb.net\/en\/#organization\"},\"image\":{\"@id\":\"https:\/\/griddb-linux-hte8hndjf8cka8ht.westus-01.azurewebsites.net\/en\/blog\/using-griddb-as-a-source-for-kafka-with-jdbc\/#primaryimage\"},\"thumbnailUrl\":\"\/wp-content\/uploads\/2021\/08\/griddb-jdbc-1.png\",\"articleSection\":[\"Blog\"],\"inLanguage\":\"en-US\",\"potentialAction\":[{\"@type\":\"CommentAction\",\"name\":\"Comment\",\"target\":[\"https:\/\/griddb-linux-hte8hndjf8cka8ht.westus-01.azurewebsites.net\/en\/blog\/using-griddb-as-a-source-for-kafka-with-jdbc\/#respond\"]}]},{\"@type\":\"WebPage\",\"@id\":\"https:\/\/griddb-linux-hte8hndjf8cka8ht.westus-01.azurewebsites.net\/en\/blog\/using-griddb-as-a-source-for-kafka-with-jdbc\/\",\"url\":\"https:\/\/griddb-linux-hte8hndjf8cka8ht.westus-01.azurewebsites.net\/en\/blog\/using-griddb-as-a-source-for-kafka-with-jdbc\/\",\"name\":\"Using GridDB as a source for Kafka with JDBC | GridDB: Open Source Time Series Database for 
IoT\",\"isPartOf\":{\"@id\":\"https:\/\/griddb.net\/en\/#website\"},\"primaryImageOfPage\":{\"@id\":\"https:\/\/griddb-linux-hte8hndjf8cka8ht.westus-01.azurewebsites.net\/en\/blog\/using-griddb-as-a-source-for-kafka-with-jdbc\/#primaryimage\"},\"image\":{\"@id\":\"https:\/\/griddb-linux-hte8hndjf8cka8ht.westus-01.azurewebsites.net\/en\/blog\/using-griddb-as-a-source-for-kafka-with-jdbc\/#primaryimage\"},\"thumbnailUrl\":\"\/wp-content\/uploads\/2021\/08\/griddb-jdbc-1.png\",\"datePublished\":\"2021-08-20T07:00:00+00:00\",\"dateModified\":\"2025-11-13T20:55:32+00:00\",\"description\":\"Last year we released a guide\/tutorial on how to ingest data using GridDB and Kafka. In that guide, we walked developers through the process of feeding\",\"inLanguage\":\"en-US\",\"potentialAction\":[{\"@type\":\"ReadAction\",\"target\":[\"https:\/\/griddb-linux-hte8hndjf8cka8ht.westus-01.azurewebsites.net\/en\/blog\/using-griddb-as-a-source-for-kafka-with-jdbc\/\"]}]},{\"@type\":\"ImageObject\",\"inLanguage\":\"en-US\",\"@id\":\"https:\/\/griddb-linux-hte8hndjf8cka8ht.westus-01.azurewebsites.net\/en\/blog\/using-griddb-as-a-source-for-kafka-with-jdbc\/#primaryimage\",\"url\":\"\/wp-content\/uploads\/2021\/08\/griddb-jdbc-1.png\",\"contentUrl\":\"\/wp-content\/uploads\/2021\/08\/griddb-jdbc-1.png\",\"width\":1160,\"height\":653},{\"@type\":\"WebSite\",\"@id\":\"https:\/\/griddb.net\/en\/#website\",\"url\":\"https:\/\/griddb.net\/en\/\",\"name\":\"GridDB: Open Source Time Series Database for IoT\",\"description\":\"GridDB is an open source time-series database with the performance of NoSQL and convenience of 
SQL\",\"publisher\":{\"@id\":\"https:\/\/griddb.net\/en\/#organization\"},\"potentialAction\":[{\"@type\":\"SearchAction\",\"target\":{\"@type\":\"EntryPoint\",\"urlTemplate\":\"https:\/\/griddb.net\/en\/?s={search_term_string}\"},\"query-input\":{\"@type\":\"PropertyValueSpecification\",\"valueRequired\":true,\"valueName\":\"search_term_string\"}}],\"inLanguage\":\"en-US\"},{\"@type\":\"Organization\",\"@id\":\"https:\/\/griddb.net\/en\/#organization\",\"name\":\"Fixstars\",\"url\":\"https:\/\/griddb.net\/en\/\",\"logo\":{\"@type\":\"ImageObject\",\"inLanguage\":\"en-US\",\"@id\":\"https:\/\/griddb.net\/en\/#\/schema\/logo\/image\/\",\"url\":\"https:\/\/griddb.net\/wp-content\/uploads\/2019\/04\/fixstars_logo_web_tagline.png\",\"contentUrl\":\"https:\/\/griddb.net\/wp-content\/uploads\/2019\/04\/fixstars_logo_web_tagline.png\",\"width\":200,\"height\":83,\"caption\":\"Fixstars\"},\"image\":{\"@id\":\"https:\/\/griddb.net\/en\/#\/schema\/logo\/image\/\"},\"sameAs\":[\"https:\/\/www.facebook.com\/griddbcommunity\/\",\"https:\/\/x.com\/GridDBCommunity\",\"https:\/\/www.linkedin.com\/company\/griddb-by-toshiba\"]},{\"@type\":\"Person\",\"@id\":\"https:\/\/griddb.net\/en\/#\/schema\/person\/c8a430e7156a9e10af73b1fbb46c2740\",\"name\":\"Israel\",\"image\":{\"@type\":\"ImageObject\",\"inLanguage\":\"en-US\",\"@id\":\"https:\/\/griddb.net\/en\/#\/schema\/person\/image\/\",\"url\":\"https:\/\/secure.gravatar.com\/avatar\/4df8cfc155402a2928d11f80b0220037b8bd26c4f1b19c4598d826e0306e6307?s=96&d=mm&r=g\",\"contentUrl\":\"https:\/\/secure.gravatar.com\/avatar\/4df8cfc155402a2928d11f80b0220037b8bd26c4f1b19c4598d826e0306e6307?s=96&d=mm&r=g\",\"caption\":\"Israel\"},\"url\":\"https:\/\/griddb-linux-hte8hndjf8cka8ht.westus-01.azurewebsites.net\/en\/author\/israel\/\"}]}<\/script>\n<!-- \/ Yoast SEO plugin. 
-->","yoast_head_json":{"title":"Using GridDB as a source for Kafka with JDBC | GridDB: Open Source Time Series Database for IoT","description":"Last year we released a guide\/tutorial on how to ingest data using GridDB and Kafka. In that guide, we walked developers through the process of feeding","robots":{"index":"index","follow":"follow","max-snippet":"max-snippet:-1","max-image-preview":"max-image-preview:large","max-video-preview":"max-video-preview:-1"},"canonical":"https:\/\/griddb-linux-hte8hndjf8cka8ht.westus-01.azurewebsites.net\/en\/blog\/using-griddb-as-a-source-for-kafka-with-jdbc\/","og_locale":"en_US","og_type":"article","og_title":"Using GridDB as a source for Kafka with JDBC | GridDB: Open Source Time Series Database for IoT","og_description":"Last year we released a guide\/tutorial on how to ingest data using GridDB and Kafka. In that guide, we walked developers through the process of feeding","og_url":"https:\/\/griddb-linux-hte8hndjf8cka8ht.westus-01.azurewebsites.net\/en\/blog\/using-griddb-as-a-source-for-kafka-with-jdbc\/","og_site_name":"GridDB: Open Source Time Series Database for IoT","article_publisher":"https:\/\/www.facebook.com\/griddbcommunity\/","article_published_time":"2021-08-20T07:00:00+00:00","article_modified_time":"2025-11-13T20:55:32+00:00","og_image":[{"width":1160,"height":653,"url":"https:\/\/griddb-linux-hte8hndjf8cka8ht.westus-01.azurewebsites.net\/wp-content\/uploads\/2021\/08\/griddb-jdbc-1.png","type":"image\/png"}],"author":"Israel","twitter_card":"summary_large_image","twitter_creator":"@GridDBCommunity","twitter_site":"@GridDBCommunity","twitter_misc":{"Written by":"Israel","Est. 
reading time":"7 minutes"},"schema":{"@context":"https:\/\/schema.org","@graph":[{"@type":"Article","@id":"https:\/\/griddb-linux-hte8hndjf8cka8ht.westus-01.azurewebsites.net\/en\/blog\/using-griddb-as-a-source-for-kafka-with-jdbc\/#article","isPartOf":{"@id":"https:\/\/griddb-linux-hte8hndjf8cka8ht.westus-01.azurewebsites.net\/en\/blog\/using-griddb-as-a-source-for-kafka-with-jdbc\/"},"author":{"name":"Israel","@id":"https:\/\/griddb.net\/en\/#\/schema\/person\/c8a430e7156a9e10af73b1fbb46c2740"},"headline":"Using GridDB as a source for Kafka with JDBC","datePublished":"2021-08-20T07:00:00+00:00","dateModified":"2025-11-13T20:55:32+00:00","mainEntityOfPage":{"@id":"https:\/\/griddb-linux-hte8hndjf8cka8ht.westus-01.azurewebsites.net\/en\/blog\/using-griddb-as-a-source-for-kafka-with-jdbc\/"},"wordCount":827,"commentCount":0,"publisher":{"@id":"https:\/\/griddb.net\/en\/#organization"},"image":{"@id":"https:\/\/griddb-linux-hte8hndjf8cka8ht.westus-01.azurewebsites.net\/en\/blog\/using-griddb-as-a-source-for-kafka-with-jdbc\/#primaryimage"},"thumbnailUrl":"\/wp-content\/uploads\/2021\/08\/griddb-jdbc-1.png","articleSection":["Blog"],"inLanguage":"en-US","potentialAction":[{"@type":"CommentAction","name":"Comment","target":["https:\/\/griddb-linux-hte8hndjf8cka8ht.westus-01.azurewebsites.net\/en\/blog\/using-griddb-as-a-source-for-kafka-with-jdbc\/#respond"]}]},{"@type":"WebPage","@id":"https:\/\/griddb-linux-hte8hndjf8cka8ht.westus-01.azurewebsites.net\/en\/blog\/using-griddb-as-a-source-for-kafka-with-jdbc\/","url":"https:\/\/griddb-linux-hte8hndjf8cka8ht.westus-01.azurewebsites.net\/en\/blog\/using-griddb-as-a-source-for-kafka-with-jdbc\/","name":"Using GridDB as a source for Kafka with JDBC | GridDB: Open Source Time Series Database for 
IoT","isPartOf":{"@id":"https:\/\/griddb.net\/en\/#website"},"primaryImageOfPage":{"@id":"https:\/\/griddb-linux-hte8hndjf8cka8ht.westus-01.azurewebsites.net\/en\/blog\/using-griddb-as-a-source-for-kafka-with-jdbc\/#primaryimage"},"image":{"@id":"https:\/\/griddb-linux-hte8hndjf8cka8ht.westus-01.azurewebsites.net\/en\/blog\/using-griddb-as-a-source-for-kafka-with-jdbc\/#primaryimage"},"thumbnailUrl":"\/wp-content\/uploads\/2021\/08\/griddb-jdbc-1.png","datePublished":"2021-08-20T07:00:00+00:00","dateModified":"2025-11-13T20:55:32+00:00","description":"Last year we released a guide\/tutorial on how to ingest data using GridDB and Kafka. In that guide, we walked developers through the process of feeding","inLanguage":"en-US","potentialAction":[{"@type":"ReadAction","target":["https:\/\/griddb-linux-hte8hndjf8cka8ht.westus-01.azurewebsites.net\/en\/blog\/using-griddb-as-a-source-for-kafka-with-jdbc\/"]}]},{"@type":"ImageObject","inLanguage":"en-US","@id":"https:\/\/griddb-linux-hte8hndjf8cka8ht.westus-01.azurewebsites.net\/en\/blog\/using-griddb-as-a-source-for-kafka-with-jdbc\/#primaryimage","url":"\/wp-content\/uploads\/2021\/08\/griddb-jdbc-1.png","contentUrl":"\/wp-content\/uploads\/2021\/08\/griddb-jdbc-1.png","width":1160,"height":653},{"@type":"WebSite","@id":"https:\/\/griddb.net\/en\/#website","url":"https:\/\/griddb.net\/en\/","name":"GridDB: Open Source Time Series Database for IoT","description":"GridDB is an open source time-series database with the performance of NoSQL and convenience of 
SQL","publisher":{"@id":"https:\/\/griddb.net\/en\/#organization"},"potentialAction":[{"@type":"SearchAction","target":{"@type":"EntryPoint","urlTemplate":"https:\/\/griddb.net\/en\/?s={search_term_string}"},"query-input":{"@type":"PropertyValueSpecification","valueRequired":true,"valueName":"search_term_string"}}],"inLanguage":"en-US"},{"@type":"Organization","@id":"https:\/\/griddb.net\/en\/#organization","name":"Fixstars","url":"https:\/\/griddb.net\/en\/","logo":{"@type":"ImageObject","inLanguage":"en-US","@id":"https:\/\/griddb.net\/en\/#\/schema\/logo\/image\/","url":"https:\/\/griddb.net\/wp-content\/uploads\/2019\/04\/fixstars_logo_web_tagline.png","contentUrl":"https:\/\/griddb.net\/wp-content\/uploads\/2019\/04\/fixstars_logo_web_tagline.png","width":200,"height":83,"caption":"Fixstars"},"image":{"@id":"https:\/\/griddb.net\/en\/#\/schema\/logo\/image\/"},"sameAs":["https:\/\/www.facebook.com\/griddbcommunity\/","https:\/\/x.com\/GridDBCommunity","https:\/\/www.linkedin.com\/company\/griddb-by-toshiba"]},{"@type":"Person","@id":"https:\/\/griddb.net\/en\/#\/schema\/person\/c8a430e7156a9e10af73b1fbb46c2740","name":"Israel","image":{"@type":"ImageObject","inLanguage":"en-US","@id":"https:\/\/griddb.net\/en\/#\/schema\/person\/image\/","url":"https:\/\/secure.gravatar.com\/avatar\/4df8cfc155402a2928d11f80b0220037b8bd26c4f1b19c4598d826e0306e6307?s=96&d=mm&r=g","contentUrl":"https:\/\/secure.gravatar.com\/avatar\/4df8cfc155402a2928d11f80b0220037b8bd26c4f1b19c4598d826e0306e6307?s=96&d=mm&r=g","caption":"Israel"},"url":"https:\/\/griddb-linux-hte8hndjf8cka8ht.westus-01.azurewebsites.net\/en\/author\/israel\/"}]}},"_links":{"self":[{"href":"https:\/\/griddb-linux-hte8hndjf8cka8ht.westus-01.azurewebsites.net\/en\/wp-json\/wp\/v2\/posts\/46660","targetHints":{"allow":["GET"]}}],"collection":[{"href":"https:\/\/griddb-linux-hte8hndjf8cka8ht.westus-01.azurewebsites.net\/en\/wp-json\/wp\/v2\/posts"}],"about":[{"href":"https:\/\/griddb-linux-hte8hndjf8cka8ht.westus-01.a
zurewebsites.net\/en\/wp-json\/wp\/v2\/types\/post"}],"author":[{"embeddable":true,"href":"https:\/\/griddb-linux-hte8hndjf8cka8ht.westus-01.azurewebsites.net\/en\/wp-json\/wp\/v2\/users\/1"}],"replies":[{"embeddable":true,"href":"https:\/\/griddb-linux-hte8hndjf8cka8ht.westus-01.azurewebsites.net\/en\/wp-json\/wp\/v2\/comments?post=46660"}],"version-history":[{"count":1,"href":"https:\/\/griddb-linux-hte8hndjf8cka8ht.westus-01.azurewebsites.net\/en\/wp-json\/wp\/v2\/posts\/46660\/revisions"}],"predecessor-version":[{"id":51335,"href":"https:\/\/griddb-linux-hte8hndjf8cka8ht.westus-01.azurewebsites.net\/en\/wp-json\/wp\/v2\/posts\/46660\/revisions\/51335"}],"wp:featuredmedia":[{"embeddable":true,"href":"https:\/\/griddb-linux-hte8hndjf8cka8ht.westus-01.azurewebsites.net\/en\/wp-json\/wp\/v2\/media\/27721"}],"wp:attachment":[{"href":"https:\/\/griddb-linux-hte8hndjf8cka8ht.westus-01.azurewebsites.net\/en\/wp-json\/wp\/v2\/media?parent=46660"}],"wp:term":[{"taxonomy":"category","embeddable":true,"href":"https:\/\/griddb-linux-hte8hndjf8cka8ht.westus-01.azurewebsites.net\/en\/wp-json\/wp\/v2\/categories?post=46660"},{"taxonomy":"post_tag","embeddable":true,"href":"https:\/\/griddb-linux-hte8hndjf8cka8ht.westus-01.azurewebsites.net\/en\/wp-json\/wp\/v2\/tags?post=46660"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}