{"id":46816,"date":"2024-11-06T00:00:00","date_gmt":"2024-11-06T08:00:00","guid":{"rendered":"https:\/\/griddb-linux-hte8hndjf8cka8ht.westus-01.azurewebsites.net\/blog\/saving-your-iot-data-into-griddb-with-rabbitmq\/"},"modified":"2025-11-13T12:57:08","modified_gmt":"2025-11-13T20:57:08","slug":"saving-your-iot-data-into-griddb-with-rabbitmq","status":"publish","type":"post","link":"https:\/\/griddb-linux-hte8hndjf8cka8ht.westus-01.azurewebsites.net\/en\/blog\/saving-your-iot-data-into-griddb-with-rabbitmq\/","title":{"rendered":"Saving your IoT Data into GridDB with RabbitMQ"},"content":{"rendered":"<p><a href=\"https:\/\/www.rabbitmq.com\/\">RabbitMQ<\/a> is a popular message-queuing system, used in a variety of systems where message delivery are of utmost importance. For our case, we would like to use RabbitMQ to ensure delivery of an on-the-field sensor data to be delivered to GridDB for later processing.<\/p>\n<p>Of course, we could always send data from the field to our main server via other means, namely HTTP, but those methods of data transfer can be finicky and unsafe; how often have you tried listening a song via Apple Music through a sparsely connected rural part of the state, only to be met with a connection error and then dead silence? Once that connection is broken, it won&#8217;t come back until the entire handshake process occurs again, and all of the data sent in the intermediary is completely lost. 
The goal of RabbitMQ in the context of this project will be to ensure that even if there are connection issues, the data will persist until the producer receives acknowledgement from the server that the data has been received and saved into GridDB.<\/p>\n<h2>The Project<\/h2>\n<p>The goal of this article is to create a proof-of-concept for a very basic IoT message-queue system; we will have one physical sensor out &#8220;in the field&#8221; reading data from its environment, pushing its readings to an exchange, which routes the data onto the queue and then finally into our server. Once that server acknowledges that it has received the entirety of the data, the broker will remove that message from the queue and move on to the next one (if one exists).<\/p>\n<p>To accomplish this, first let&#8217;s talk hardware.<\/p>\n<h3>The Hardware<\/h3>\n<p>We have set up a <a href=\"https:\/\/www.raspberrypi.com\/products\/raspberry-pi-4-model-b\/\">Raspberry Pi 4<\/a> to connect with an air quality sensor, the <a href=\"https:\/\/www.adafruit.com\/product\/4632\">Adafruit PMSA003I Air Quality Breakout<\/a>, via this <a href=\"https:\/\/www.adafruit.com\/product\/4688\">STEMMA Hat<\/a> and a STEMMA wire; if you are interested in learning more about this particular sensor, you can read about it in the <a href=\"https:\/\/learn.adafruit.com\/pmsa003i?view=all\">Docs<\/a> page provided by Adafruit.<\/p>\n<p><a href=\"https:\/\/griddb.net\/wp-content\/uploads\/2024\/11\/hardware-scaled.jpg\"><img fetchpriority=\"high\" decoding=\"async\" src=\"https:\/\/griddb.net\/wp-content\/uploads\/2024\/11\/hardware-scaled.jpg\" alt=\"\" width=\"1739\" height=\"2560\" class=\"aligncenter size-full wp-image-30767\" srcset=\"\/wp-content\/uploads\/2024\/11\/hardware-scaled.jpg 1739w, \/wp-content\/uploads\/2024\/11\/hardware-204x300.jpg 204w, \/wp-content\/uploads\/2024\/11\/hardware-696x1024.jpg 696w, \/wp-content\/uploads\/2024\/11\/hardware-768x1131.jpg 768w, 
\/wp-content\/uploads\/2024\/11\/hardware-600x883.jpg 600w, \/wp-content\/uploads\/2024\/11\/hardware-1043x1536.jpg 1043w, \/wp-content\/uploads\/2024\/11\/hardware-1391x2048.jpg 1391w\" sizes=\"(max-width: 1739px) 100vw, 1739px\" \/><\/a><\/p>\n<p>The data will be received from the queue by an Ubuntu server &#8212; the specs are not important.<\/p>\n<p>Next, let&#8217;s take a look at the software.<\/p>\n<h3>The Software<\/h3>\n<p>Of course, we are going to be utilizing RabbitMQ for the pushing and receiving of messages of relevant data. RabbitMQ provides connectors for many programming languages, so we are essentially free to mix and match as we see fit (which is another stealth benefit of utilizing RabbitMQ in your stack). In our case, because we were already provided with a Python library with which we can easily read and translate the raw sensor data, we want to push the payload data with Python. We <em>could<\/em> receive our payload data on the server with another Python script with the aid of the <a href=\"https:\/\/docs.griddb.net\/gettingstarted\/python\/\">GridDB Python Connector<\/a>, but we will instead opt to receive with Java as it is GridDB&#8217;s native interface and doesn&#8217;t require any additional downloads.<\/p>\n<h3>The Plan<\/h3>\n<p>Overall, our plan is as follows:<\/p>\n<ol>\n<li>Install RabbitMQ onto the Ubuntu server<\/li>\n<li>Read sensor readings and translate them into readable data payloads (Python)<\/li>\n<li>Push data onto an Exchange\/Queue of our creation<\/li>\n<li>Consume the queue with Java (and RabbitMQ)<\/li>\n<li>Save received payloads directly into GridDB<\/li>\n<\/ol>\n<h3>How to Run<\/h3>\n<p>The Python script is easy to run: install the required libraries and then simply run the script: <code>python3 app.py<\/code>.<\/p>\n<p>For Java, because we have dependencies on outside libraries, we need to reference them (they&#8217;re in the <code>lib<\/code> directory) when running. 
For example:<\/p>\n<div class=\"clipboard\">\n<pre><code class=\"language-sh\">$ cd lib\/\n$ export CP=.:amqp-client-5.16.0.jar:slf4j-api-1.7.36.jar:slf4j-simple-1.7.36.jar:gridstore-5.6.0.jar:jackson-databind-2.17.2.jar:jackson-core-2.17.2.jar:jackson-annotations-2.17.2.jar\n$ java -cp $CP ..\/Recv.java<\/code><\/pre>\n<\/div>\n<p>The order in which you run these two programs is not important; the receiver will stay on even if the queue is empty.<\/p>\n<h2>Prereqs &amp; Getting Started<\/h2>\n<p>Here is a list of what you will need if you would like to follow this project 1:1:<\/p>\n<ol>\n<li>Raspberry Pi<\/li>\n<li>STEMMA Hat &amp; Wire (or other means of connecting to the board)<\/li>\n<li>Python, RabbitMQ, GridDB, Java, &amp; various other libraries<\/li>\n<\/ol>\n<p>You can install RabbitMQ from their <a href=\"https:\/\/www.rabbitmq.com\/docs\/download\">download<\/a> page; the instructions are straightforward. The only caveat is that you will need to create a new user and set its permissions properly:<\/p>\n<div class=\"clipboard\">\n<pre><code class=\"language-sh\">$ sudo rabbitmqctl add_user username password\n$ sudo rabbitmqctl set_permissions -p \/ username \".*\" \".*\" \".*\"<\/code><\/pre>\n<\/div>\n<p>The credentials here will be the same ones used when forging the connection between the data sender and the data receiver.<\/p>\n<p>One note: I was unsuccessful in using &#8220;special characters&#8221; in my password when making my connection, so I&#8217;d advise keeping the password simple for now (i.e. just letters and digits).<\/p>\n<h2>Implementation: The Producer<\/h2>\n<p>Finally, let&#8217;s get into specifics. We will first focus on our producer (the Raspberry Pi) and then move on to the consumer (our server). We will also be setting some configs to ensure our messages are delivered and saved into the database.<\/p>\n<h3>Python Script for Reading Data<\/h3>\n<p>We are using a modified version of the Python script provided by Adafruit to read the sensor data. 
Essentially, our task is very simple: we read the data, convert it to JSON, and push it to the Exchange\/Queue. First, let&#8217;s look at the hardware part of the code; after that we will get into the code for creating a queue and pushing data onto it on the correct machine.<\/p>\n<div class=\"clipboard\">\n<pre><code class=\"language-python\">import board\nimport busio\nfrom adafruit_pm25.i2c import PM25_I2C\n\nreset_pin = None\ni2c = busio.I2C(board.SCL, board.SDA, frequency=100000)\n# Connect to a PM2.5 sensor over I2C\npm25 = PM25_I2C(i2c, reset_pin)\naqdata = pm25.read()<\/code><\/pre>\n<\/div>\n<p>This snippet of code is all you need to read and translate the sensor readings. With this, assuming everything is connected properly, we will save the current values into the variable we called <code>aqdata<\/code>.<\/p>\n<h3>Python Code to Create and Push Data to RabbitMQ Queue<\/h3>\n<p>Next, let&#8217;s look at the RabbitMQ code. First, we want to establish our connection to our Ubuntu server. We will point the address to the IP of the machine and set the port to the default. 
We will also use the credentials we made earlier on our Ubuntu server.<\/p>\n<div class=\"clipboard\">\n<pre><code class=\"language-python\">import pika\n\ncredentials = pika.PlainCredentials('israel', 'israel')\nparameters = pika.ConnectionParameters('192.168.50.206',\n                                   5672,\n                                   '\/',\n                                   credentials)\n\nconnection = pika.BlockingConnection(parameters)\nchannel = connection.channel()<\/code><\/pre>\n<\/div>\n<p>Next, we want to create and set some parameters for our queue, including how we handle pushing data messages to it.<\/p>\n<div class=\"clipboard\">\n<pre><code class=\"language-python\">channel.confirm_delivery()\nchannel.queue_declare(queue='airQuality', durable=True)<\/code><\/pre>\n<\/div>\n<p>By default, RabbitMQ prioritizes throughput above all else, meaning we need to change some default configuration options to ensure our data is being sent &#8212; even in the case of a weak connection &#8212; to our server (also known as the broker).<\/p>\n<p>First, we want to enable <code>confirm delivery<\/code>. This will raise an exception if the producer receives a negative acknowledgement (also referred to as a nack) from our broker. This means that if our data is being dropped, we will at least have a log of it. Unfortunately for us, there isn&#8217;t very robust handling of failed messages on the Python side; if this were a production project, we would need to migrate from Python to some other language where you can deal with messages in a variety of ways. Namely, I think we&#8217;d like to add batch processing of messages so that there&#8217;s less of a chance of dropped data readings, and an easier time re-sending dropped messages.<\/p>\n<p>Anyway, working with what we have, the next thing we do is turn on <code>durable<\/code>, which will preserve the queue in the event of a broker crash\/reboot. 
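As an aside, the batch processing suggested a moment ago doesn't require leaving Python entirely; one simple approach is to buffer several readings and publish them as a single JSON payload. Here is a minimal sketch of that idea (the helper name <code>batch_payloads<\/code> is hypothetical, not part of the script in this article):

```python
import json

def batch_payloads(readings, max_batch=10):
    """Group buffered sensor readings into JSON payloads of at most
    max_batch readings each: one publish (and one confirm) per batch,
    instead of one per reading."""
    return [
        json.dumps(readings[i:i + max_batch])
        for i in range(0, len(readings), max_batch)
    ]

# 25 buffered readings become 3 publishes instead of 25
payloads = batch_payloads([{"pm25": float(i)} for i in range(25)])
```

Fewer, larger publishes mean fewer confirms to track and an easier time re-sending whatever failed. Back to the <code>durable<\/code> flag we just enabled: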
This means the <code>airQuality<\/code> queue itself won&#8217;t need to be re-created, but the messages inside of the queue won&#8217;t necessarily be saved; message persistence is set separately, per message, via the delivery mode.<\/p>\n<p>After that, we read and send data simultaneously:<\/p>\n<div class=\"clipboard\">\n<pre><code class=\"language-python\">import datetime\nimport json\nimport time\n\nwhile True:\n    time.sleep(1)\n\n    try:\n        aqdata = pm25.read()\n        current_time = datetime.datetime.utcnow().replace(microsecond=0)\n        now = current_time.strftime('%Y-%m-%dT%H:%M:%S.%fZ')\n        aqdata['ts'] = now\n        aqdata['pm1'] = aqdata.pop('pm10 standard')\n        aqdata['pm25'] = aqdata.pop('pm25 standard')\n        aqdata['pm10'] = aqdata.pop('pm100 standard')\n        aqdata['pm1e'] = aqdata.pop('pm10 env')\n        aqdata['pm25e'] = aqdata.pop('pm25 env')\n        aqdata['pm10e'] = aqdata.pop('pm100 env')\n        aqdata['particles03'] = aqdata.pop('particles 03um')\n        aqdata['particles05'] = aqdata.pop('particles 05um')\n        aqdata['particles10'] = aqdata.pop('particles 10um')\n        aqdata['particles25'] = aqdata.pop('particles 25um')\n        aqdata['particles50'] = aqdata.pop('particles 50um')\n        aqdata['particles100'] = aqdata.pop('particles 100um')\n        #print(aqdata)\n    except RuntimeError:\n        print(\"Unable to read from sensor, retrying...\")\n        continue\n    \n    payload = json.dumps(aqdata)\n    try: \n        channel.basic_publish(exchange='',\n                        routing_key='airQuality',\n                        body=payload,\n                        properties=pika.BasicProperties(delivery_mode=pika.DeliveryMode.Persistent),\n                        mandatory=True)\n        print(\" [x] Sent payload: \" + payload)\n    except pika.exceptions.UnroutableError:\n        # If the message is not confirmed, it means something went wrong\n        print(\"Message could not be confirmed\")<\/code><\/pre>\n<\/div>\n<p>For this snippet of code, we are reading the sensor data, changing the column names into the ones we 
want to use on the consumer side, and then pushing the payload through the channel to the queue we made earlier. Some things to note here: we set the mandatory flag to true and set the delivery mode to persistent. Persistent delivery tells the broker to write our messages to disk, and the mandatory flag ensures that messages which cannot be routed to a queue are returned to us rather than silently dropped.<\/p>\n<p>The exception occurs if the broker sends back to our producer a <code>nack<\/code> (negative acknowledgement).<\/p>\n<p>And so now, every second, our script will read the sensor values and push them into the queue. Once the data is confirmed by the broker, the producer no longer cares about that data message.<\/p>\n<h2>Implementation: The Consumer<\/h2>\n<p>Our consumer will be written in Java and its job is to read from the queue in our broker (in our case, the same host machine as our consumer), unmarshal the data into a Java object, and then save the results into GridDB.<\/p>\n<h3>Consuming the Queue in Java<\/h3>\n<p>The consumer portion of the code is rather simple: forge the connection and read from the queue.<\/p>\n<div class=\"clipboard\">\n<pre><code class=\"language-java\">private final static String QUEUE_NAME = \"airQuality\";\nprivate final static boolean AUTO_ACK = false;\n\nConnectionFactory factory = new ConnectionFactory();\nfactory.setHost(\"localhost\");\nConnection connection = factory.newConnection();\nChannel channel = connection.createChannel();\nchannel.queueDeclare(QUEUE_NAME, true, false, false, null);\nSystem.out.println(\" [*] Waiting for messages. To exit press CTRL+C\");<\/code><\/pre>\n<\/div>\n<p>Here we are making our connection to our broker (hosted on the same machine as the consumer, hence <code>localhost<\/code>). 
We declare the queue we want to read from and set some options; we are using the default values for everything except for the first <code>true<\/code>, which corresponds to the <code>durable<\/code> flag. As explained above in the Python section, setting it to true means that our queue will persist even if the broker goes down.<\/p>\n<p>Next, let&#8217;s run the actual consume:<\/p>\n<div class=\"clipboard\">\n<pre><code class=\"language-java\">channel.basicConsume(QUEUE_NAME, AUTO_ACK, deliverCallback, consumerTag -> { });<\/code><\/pre>\n<\/div>\n<p>The only thing I&#8217;d like to point out here is that we&#8217;ve turned off the <code>AUTO_ACK<\/code> option (it&#8217;s set to <code>false<\/code>). This means we will need to manually acknowledge whether the message read from the queue was processed successfully or not.<\/p>\n<p>Next, here&#8217;s the callback function that is run every time a new message is read off of the queue:<\/p>\n<div class=\"clipboard\">\n<pre><code class=\"language-java\">        DeliverCallback deliverCallback = (consumerTag, delivery) -> {\n            byte[] data = delivery.getBody();\n\n            try {\n                AirData ad = mapper.readValue(data, AirData.class);\n                String jsonString = mapper.writerWithDefaultPrettyPrinter().writeValueAsString(ad);\n                System.out.println(jsonString);\n                container.put(ad);\n                channel.basicAck(delivery.getEnvelope().getDeliveryTag(), false);\n            } catch (Exception e) {\n                channel.basicNack(delivery.getEnvelope().getDeliveryTag(), false, true);\n                System.out.println(\"Setting nack\");\n            }\n        };<\/code><\/pre>\n<\/div>\n<p>Here is what&#8217;s going on: we read the message body (an array of bytes), then use the Jackson JSON library to unmarshal the value from raw bytes into a class we declare called <code>AirData<\/code>:<\/p>\n<div class=\"clipboard\">\n<pre><code 
class=\"language-python\">    static public class AirData {\n        @JsonProperty(\"ts\")\n        @RowKey Date ts;\n        @JsonProperty(\"pm1\")\n        double pm1;\n        @JsonProperty(\"pm25\")\n        double pm25;\n        @JsonProperty(\"pm10\")\n        double pm10;\n        @JsonProperty(\"pm1e\")\n        double pm1e;\n        @JsonProperty(\"pm25e\")\n        double pm25e;\n        @JsonProperty(\"pm10e\")\n        double pm10e;\n        @JsonProperty(\"particles03\")\n        double particles03;\n        @JsonProperty(\"particles05\")\n        double particles05;\n        @JsonProperty(\"particles10\")\n        double particles10;\n        @JsonProperty(\"particles25\")\n        double particles25;\n        @JsonProperty(\"particles50\")\n        double particles50;\n        @JsonProperty(\"particles100\")\n        double particles100;\n    }<\/code><\/pre>\n<\/div>\n<p>Next we save that newly made Java object into GridDB and then finally acknowledge to our broker that we received the message. If something goes wrong, we will send a <code>nack<\/code> and the message will remain in the queue until it gets an <code>ack<\/code>.<\/p>\n<h3>GridDB<\/h3>\n<p>Lastly, let&#8217;s go over how GridDB fits into this. We will do our standard connecting to GridDB and then get our timeseries container. 
In this case, I created the table\/container in the shell as it&#8217;s easier than writing a one-time-use Java program.<\/p>\n<div class=\"clipboard\">\n<pre><code class=\"language-sh\">$ sudo su gsadm\n$ gs_sh\ngs> createtimeseries aqdata NO ts timestamp pm1 double pm25 double pm10 double pm1e double pm25e double pm10e double particles03 double particles05 double particles10 double particles25 double particles50 double particles100 double<\/code><\/pre>\n<\/div>\n<p>And now we make our connection in our Java code:<\/p>\n<div class=\"clipboard\">\n<pre><code class=\"language-java\">    public static GridStore GridDBNoSQL() throws GSException {\n\n        GridStore store = null;\n\n        try {\n            Properties props = new Properties();\n            props.setProperty(\"notificationMember\", \"127.0.0.1:10001\");\n            props.setProperty(\"clusterName\", \"myCluster\");\n            props.setProperty(\"user\", \"admin\");\n            props.setProperty(\"password\", \"admin\");\n            store = GridStoreFactory.getInstance().getGridStore(props);\n        } catch (Exception e) {\n            e.printStackTrace();\n        }\n\n        return store;\n    }<\/code><\/pre>\n<\/div>\n<p>Using our <code>AirData<\/code> class from earlier, we grab our newly made container:<\/p>\n<div class=\"clipboard\">\n<pre><code class=\"language-java\">TimeSeries&lt;AirData&gt; container = store.getTimeSeries(\"aqdata\", AirData.class);\nSystem.out.println(\"Connected to GridDB!\");<\/code><\/pre>\n<\/div>\n<p>And then we&#8217;ve already seen this above, but as we receive new payloads, we immediately save to GridDB and then send the positive acknowledgement:<\/p>\n<div class=\"clipboard\">\n<pre><code class=\"language-java\">container.put(ad);\nchannel.basicAck(delivery.getEnvelope().getDeliveryTag(), false);<\/code><\/pre>\n<\/div>\n<h2>Conclusion<\/h2>\n<p>In this article, we set up a robust system in which our IoT data will be safely transferred from our 
Python producer to an unnamed default exchange (<code>''<\/code>), transferred to our broker which houses our queue called <code>airQuality<\/code>, and finally read by our Java consumer.<\/p>\n","protected":false},"author":1,"featured_media":30767,"comment_status":"open","ping_status":"open","sticky":false,"template":"","format":"standard","meta":{"_acf_changed":false,"footnotes":""},"categories":[121],"tags":[],"class_list":["post-46816","post","type-post","status-publish","format-standard","has-post-thumbnail","hentry","category-blog"],"acf":[]}