{"id":46637,"date":"2021-03-18T00:00:00","date_gmt":"2021-03-18T07:00:00","guid":{"rendered":"https:\/\/griddb-linux-hte8hndjf8cka8ht.westus-01.azurewebsites.net\/blog\/use-machine-learning-and-griddb-to-build-a-production-ready-stock-market-anomaly-detector\/"},"modified":"2025-11-13T12:55:14","modified_gmt":"2025-11-13T20:55:14","slug":"use-machine-learning-and-griddb-to-build-a-production-ready-stock-market-anomaly-detector","status":"publish","type":"post","link":"https:\/\/griddb-linux-hte8hndjf8cka8ht.westus-01.azurewebsites.net\/en\/blog\/use-machine-learning-and-griddb-to-build-a-production-ready-stock-market-anomaly-detector\/","title":{"rendered":"Use Machine Learning and GridDB to build a Production-Ready Stock Market Anomaly Detector"},"content":{"rendered":"<p>In this project, we use GridDB to create a Machine Learning platform where Kafka is used to import stock market data from <a href=\"https:\/\/www.alphavantage.co\/\">Alphavantage<\/a>, a market data provider. <a href=\"https:\/\/tensorflow.org\">Tensorflow<\/a> and <a href=\"https:\/\/keras.io\">Keras<\/a> train a model that is then stored in GridDB, and then finally uses LSTM prediction to find anomalies in daily intraday trading history. The last piece is that the data is visualized in Grafana and then we configure GridDB to send notifications via its REST Trigger function to Twilio&#8217;s Sendgrid.<\/p>\n<p>The actual machine learning portion of this project was inspired by posts on <a href=\"https:\/\/towardsdatascience.com\/time-series-of-price-anomaly-detection-with-lstm-11a12ba4f6d9\">Towards Data Science<\/a> and <a href=\"https:\/\/curiousily.com\/posts\/anomaly-detection-in-time-series-with-lstms-using-keras-in-python\/\">Curiously<\/a>. This model and the data flow is also applicable to many other datasets such as predictive maintenance or machine failure prediction or wherever you want to find anomalies in time series data. 
In machine learning, an anomaly occurs when the predicted value differs significantly from the actual value. In this case, stock price anomalies may signal good trading opportunities, while anomalies in sensor data from an engine may mean failure is imminent.<\/p>\n<p>The following GridDB features are showcased:<\/p>\n<ul>\n<li>Key-Container data model<\/li>\n<li>kafka-connect-jdbc &amp; JDBC<\/li>\n<li>Pandas Data Frames<\/li>\n<li>Grafana Connector<\/li>\n<li>Triggers<\/li>\n<\/ul>\n<p>GridDB&#8217;s remarkable write performance is necessary, as hundreds of thousands of rows of data for just a few stocks are loaded daily after the market close. Its Key-Container architecture means queries for individual stocks are efficient: it is not necessary to scan GOOG&#8217;s intraday data to find the price anomalies in AAPL&#8217;s. If you do not already have GridDB set up, follow the <a href=\"https:\/\/docs.griddb.net\/gettingstarted\/using-rpmyum\/#install-with-rpm\">Getting Started<\/a> manual to install GridDB first.<\/p>\n<h1>Kafka<\/h1>\n<p><a href=\"https:\/\/kafka.apache.org\/\">Kafka<\/a> is a data streaming platform with many different possible inputs and outputs that are easy to create. Kafka can be downloaded from its <a href=\"https:\/\/kafka.apache.org\/downloads\">downloads page<\/a>; we&#8217;re using version 2.12-2.5.0. You will also need a Java 1.8 development environment installed on your system. After downloading, we simply untar the archive and start the ZooKeeper and Kafka servers.<\/p>\n<pre><code>$ tar xzvf kafka_2.12-2.5.0.tgz\n$ cd kafka_2.12-2.5.0\n$ export PATH=$PATH:\/path\/to\/kafka_2.12-2.5.0\/bin\n$ zookeeper-server-start.sh --daemon config\/zookeeper.properties\n$ kafka-server-start.sh --daemon config\/server.properties\n<\/code><\/pre>\n<p>We will use Kafka Connect JDBC, a connector that allows Kafka to write to many different JDBC databases. 
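A JDBC sink connector of this kind is driven by a handful of properties: the JDBC connection URL, credentials, the topics to consume, and the insert mode. As a hedged sketch in properties form (the host, port, cluster name, and credentials are placeholders; the project&#8217;s actual configuration is the JSON file created below):<\/p>\n<pre><code># Sketch only; placeholder values throughout.\nname=griddb-sink\nconnector.class=io.confluent.connect.jdbc.JdbcSinkConnector\n# Placeholder GridDB JDBC URL; see the GridDB JDBC documentation for the exact form.\nconnection.url=jdbc:gs:\/\/239.0.0.1:41999\/myCluster\/public\nconnection.user=admin\nconnection.password=admin\ntopics=INTRADAY_FB,INTRADAY_AAPL\ninsert.mode=insert\nauto.create=true\n<\/code><\/pre>\n<p>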
We&#8217;ve already added GridDB support to the connector, but it is only available from our own GitHub repository. Download the <a href=\"https:\/\/github.com\/griddbnet\/kafka-connect-jdbc-griddb\/releases\/download\/e3cc5d9\/kafka-connect-jdbc-6.1.0-SNAPSHOT.jar\">JAR<\/a>.<\/p>\n<p>We will also need the <a href=\"https:\/\/github.com\/griddbnet\/jdbc\/releases\/download\/v4.5.0.1-setTimestamp\/gridstore-jdbc.jar\">gridstore-jdbc.jar<\/a> from GridDB.net&#8217;s GitHub repository. It implements additional JDBC functions required by the Kafka Connect JDBC connector.<\/p>\n<p>Once you&#8217;ve downloaded both JARs, place them in \/path\/to\/kafka_2.12-2.5.0\/libs\/.<\/p>\n<p>Now we&#8217;ll create a config file for kafka-connect-jdbc, configs\/jdbc-connect.json. The config file tells the connector which JDBC URL to connect to, the topics to listen to, and how to transform the string time field into an actual timestamp used by the database.<\/p>\n<p><script src=\"https:\/\/gist.github.com\/Imisrael\/1dacffa99adf98a5d025ee17e986f9ff.js\"><\/script> Before starting Kafka Connect, we&#8217;ll create all the topics for the stocks we intend to use.<\/p>\n<p><script src=\"https:\/\/gist.github.com\/Imisrael\/84319a18ed5e714f7e433c2f4cb0cb95.js\"><\/script> Now start Kafka Connect:<\/p>\n<p><script src=\"https:\/\/gist.github.com\/Imisrael\/3e70aba7d2004a3b64e67f606bb5ef6f.js\"><\/script><\/p>\n<h1>Load Data<\/h1>\n<p>Data is first fetched in CSV format from AlphaVantage, using one set of endpoints to get the past four months of data for training and the daily API to find anomalies. 
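In sketch form, fetching a day&#8217;s data amounts to building a query URL and parsing the returned CSV (the symbol, API key, and sample row below are placeholders illustrating the shape of the data, not real values):<\/p>\n<pre><code>import csv\nimport io\n\nSYMBOL = 'FB'        # placeholder symbol\nAV_API_KEY = 'demo'  # placeholder API key\n\n# Daily endpoint; batch mode swaps in TIME_SERIES_INTRADAY_EXTENDED with a slice parameter.\nurl = ('https:\/\/www.alphavantage.co\/query?function=TIME_SERIES_INTRADAY'\n       '&datatype=csv&symbol={}&interval=1min&apikey={}').format(SYMBOL, AV_API_KEY)\n\n# The response body is CSV with these columns (values here are illustrative only):\nbody = 'timestamp,open,high,low,close,volume\\n2021-03-17 16:00:00,279.2,279.3,279.1,279.2,163813\\n'\nrows = list(csv.DictReader(io.StringIO(body)))\n<\/code><\/pre>\n<p>Each parsed row then becomes one record on the stock&#8217;s Kafka topic. 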
The REST endpoints are:<\/p>\n<ul>\n<li>Batch mode (monthly): https:\/\/www.alphavantage.co\/query?function=TIME_SERIES_INTRADAY_EXTENDED&amp;symbol=${SYMBOL}&amp;interval=1min&amp;slice=year1month${MONTH}&amp;apikey=${AV_API_KEY}<\/li>\n<li>Daily: https:\/\/www.alphavantage.co\/query?function=TIME_SERIES_INTRADAY&amp;datatype=csv&amp;symbol=${SYMBOL}&amp;interval=1min&amp;apikey=${AV_API_KEY}<\/li>\n<\/ul>\n<p>The returned CSV is put into a Python dictionary:<\/p>\n<p><script src=\"https:\/\/gist.github.com\/Imisrael\/ee0f69d0926ef45158be61fd0664cea3.js\"><\/script> The produce() function uses the row data dictionary to generate the JSON record required by Kafka:<\/p>\n<p><script src=\"https:\/\/gist.github.com\/Imisrael\/4e08f7b6ac3c140ab40517418c4eb14c.js\"><\/script> Finally, the data is sent to Kafka on the &#8216;INTRADAY_${SYMBOL}&#8217; topic, where it will be written to GridDB by the Kafka Connect JDBC connector.<\/p>\n<p><script src=\"https:\/\/gist.github.com\/Imisrael\/2cf37a09058b1d885ba698722303330d.js\"><\/script> Before running, we install the Python dependencies with Pip:<\/p>\n<p><script src=\"https:\/\/gist.github.com\/Imisrael\/27de4356b4a50ade514cad36246e7ff2.js\"><\/script> Then manually run loadbatch.py:<\/p>\n<p><script src=\"https:\/\/gist.github.com\/Imisrael\/95f622dc61453017682b43c42d43e4f8.js\"><\/script> The daily import can be added to crontab so it runs every night (this assumes the system clock is set to UTC):<\/p>\n<p><script src=\"https:\/\/gist.github.com\/Imisrael\/a72ebf5faea1ed08543e470f8ad53768.js\"><\/script><\/p>\n<h1>Train<\/h1>\n<p>First, install the Python module dependencies with Pip:<\/p>\n<p><script src=\"https:\/\/gist.github.com\/Imisrael\/ab3c13d611e667b32b8517e3f0b360d2.js\"><\/script> Once the historic batch data is loaded in GridDB, we can train a model. 
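Before fitting, the flat series of prices must be reshaped into overlapping windows of shape (samples, timesteps, features), which is the input layout an LSTM layer expects. A minimal sketch of that step (the window length here is an arbitrary assumption, not the project&#8217;s exact value):<\/p>\n<pre><code>import numpy as np\n\ndef make_sequences(values, window=30):\n    # Slide a fixed-size window over the series; each window becomes\n    # one LSTM input sample of shape (window, 1).\n    X = np.array([values[i:i + window] for i in range(len(values) - window)])\n    return X.reshape((X.shape[0], window, 1))\n\nprices = np.arange(100, dtype='float64')  # stand-in for a close-price column\nX = make_sequences(prices)                # X.shape is (70, 30, 1)\n<\/code><\/pre>\n<p>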
The last three months of data are queried and fetched as a Pandas data frame.<\/p>\n<p><script src=\"https:\/\/gist.github.com\/Imisrael\/7c43f67f311fdc93560404646178a82a.js\"><\/script> From here we modify the data to fit the inputs expected by the Keras LSTM model. 95% of the data will be used for training; the last 5% is used for verification.<\/p>\n<p><script src=\"https:\/\/gist.github.com\/Imisrael\/51a2d390694a29b58f2a76979ca5514f.js\"><\/script><\/p>\n<h1>Model Storage<\/h1>\n<p>Storing the model in GridDB requires a bit of a workaround, as Keras can only export a complete model to the filesystem. First, we create a temporary file to which we save the model. Then we open the temporary file and write its contents to GridDB as a bytearray. Finally, we remove the temporary file.<\/p>\n<p><script src=\"https:\/\/gist.github.com\/Imisrael\/1698dbfd77744ab041c66b8bfa9ff29f.js\"><\/script> To load a model, the reverse is done: read the latest model from GridDB, write it to a temporary file, load it with Keras, and then remove the temporary file.<\/p>\n<p><script src=\"https:\/\/gist.github.com\/Imisrael\/a3ad06af5aa868b58e7be293ecc77597.js\"><\/script><\/p>\n<h1>Anomaly Detection<\/h1>\n<p>As with training, data is queried from GridDB and read into a Pandas data frame before being transformed into the format expected by the model.<\/p>\n<p><script src=\"https:\/\/gist.github.com\/Imisrael\/2800a5261966e63a534d15e3290f10a8.js\"><\/script> Then the model is loaded and prediction is run. 
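The detection rule itself reduces to thresholding the prediction error. A sketch of that logic (a simple absolute-error loss on raw values is assumed here for illustration; the project&#8217;s exact loss computation is in the gists):<\/p>\n<pre><code>import numpy as np\n\ndef find_anomalies(actual, predicted, threshold=0.65):\n    # Loss is the absolute difference between prediction and reality;\n    # indices whose loss exceeds the threshold are flagged as anomalies.\n    loss = np.abs(np.asarray(actual) - np.asarray(predicted))\n    return np.flatnonzero(np.greater(loss, threshold))\n\nidx = find_anomalies([10.0, 10.2, 12.0], [10.1, 10.2, 10.3])  # flags index 2\n<\/code><\/pre>\n<p>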
To find anomalies, the difference between the predicted value and the actual value (the loss) is calculated, and all rows where the loss is greater than 0.65 are written back to GridDB.<\/p>\n<p><script src=\"https:\/\/gist.github.com\/Imisrael\/cafc7299e62eefae5e01ddb6bfa05be9.js\"><\/script> The anomaly detection script is also put in crontab, scheduled to run one hour after the daily data load starts.<\/p>\n<p><script src=\"https:\/\/gist.github.com\/Imisrael\/45d22cd4d551de7ce435501ea17d8025.js\"><\/script><\/p>\n<h1>Notifications<\/h1>\n<p>One of GridDB&#8217;s unique features is its Trigger functionality: whenever a row is written to a specified container, it can be configured to send a JMS message or make a REST API call.<\/p>\n<p>For more architectural and implementation details on Triggers, check out the <a href=\"http:\/\/www.toshiba-sol.co.jp\/en\/pro\/griddb\/docs-en\/v4_1\/GridDB_TechnicalReference.html#sec-4.4.8\">Technical<\/a> and <a href=\"https:\/\/docs.griddb.net\/GridDB_Java_API_Reference.html#TriggerInfo.html__\">API<\/a> Reference documents.<\/p>\n<p>The GridDB Python client does not support adding triggers, so Java must be used:<\/p>\n<p><script src=\"https:\/\/gist.github.com\/Imisrael\/28346092e9ce257ec3378f44c2b63545.js\"><\/script> Since the REST API calls are fairly basic and do not support adding Authorization headers, we create a Flask application that then uses Twilio&#8217;s SendGrid to send an email.<\/p>\n<p><script src=\"https:\/\/gist.github.com\/Imisrael\/73e536e69f21cfceae20b66e38c17326.js\"><\/script> Pip can be used to install all of the dependencies before running the Trigger Handler application.<\/p>\n<p><script src=\"https:\/\/gist.github.com\/Imisrael\/94472b716d1a672ec398e03d5c31663b.js\"><\/script><\/p>\n<h1>Grafana<\/h1>\n<p>Grafana can be installed using their <a href=\"https:\/\/grafana.com\/docs\/grafana\/latest\/installation\/rpm\/\">instructions<\/a>. 
Once installed and running, the first step is to create the default GridDB data source and configure it based on our system settings.<\/p>\n<p>To install the GridDB data source, first download and extract the <a href=\"https:\/\/github.com\/griddb\/griddb-datasource\/archive\/1.1.0.tar.gz\">tar<\/a> and copy the dist directory to the Grafana system directory.<\/p>\n<p><script src=\"https:\/\/gist.github.com\/Imisrael\/71b8f1077daacc37c0c695af2b51006b.js\"><\/script> More detailed instructions for the Grafana GridDB data source are <a href=\"https:\/\/github.com\/griddb\/griddb-datasource\/blob\/master\/README.md\">here<\/a>.<\/p>\n<p><a href=\"https:\/\/griddb.net\/wp-content\/uploads\/2021\/03\/1_z5c8VsGTTotQ5M2xfO1CrQ.png\"><img fetchpriority=\"high\" decoding=\"async\" src=\"https:\/\/griddb.net\/wp-content\/uploads\/2021\/03\/1_z5c8VsGTTotQ5M2xfO1CrQ.png\" alt=\"\" width=\"1916\" height=\"1484\" class=\"aligncenter size-full wp-image-27355\" srcset=\"\/wp-content\/uploads\/2021\/03\/1_z5c8VsGTTotQ5M2xfO1CrQ.png 1916w, \/wp-content\/uploads\/2021\/03\/1_z5c8VsGTTotQ5M2xfO1CrQ-300x232.png 300w, \/wp-content\/uploads\/2021\/03\/1_z5c8VsGTTotQ5M2xfO1CrQ-1024x793.png 1024w, \/wp-content\/uploads\/2021\/03\/1_z5c8VsGTTotQ5M2xfO1CrQ-768x595.png 768w, \/wp-content\/uploads\/2021\/03\/1_z5c8VsGTTotQ5M2xfO1CrQ-1536x1190.png 1536w, \/wp-content\/uploads\/2021\/03\/1_z5c8VsGTTotQ5M2xfO1CrQ-600x465.png 600w\" sizes=\"(max-width: 1916px) 100vw, 1916px\" \/><\/a><\/p>\n<p>Now, create one dashboard for every symbol we&#8217;re monitoring. Each dashboard will have one annotation that will mark each time an anomaly occurred in the given stock. 
In this case, we&#8217;re looking at Facebook&#8217;s (FB) anomalies.<\/p>\n<p><a href=\"https:\/\/griddb.net\/wp-content\/uploads\/2021\/03\/2-1.png\"><img decoding=\"async\" src=\"https:\/\/griddb.net\/wp-content\/uploads\/2021\/03\/2-1.png\" alt=\"\" width=\"1910\" height=\"1474\" class=\"aligncenter size-full wp-image-27356\" srcset=\"\/wp-content\/uploads\/2021\/03\/2-1.png 1910w, \/wp-content\/uploads\/2021\/03\/2-1-300x232.png 300w, \/wp-content\/uploads\/2021\/03\/2-1-1024x790.png 1024w, \/wp-content\/uploads\/2021\/03\/2-1-768x593.png 768w, \/wp-content\/uploads\/2021\/03\/2-1-1536x1185.png 1536w, \/wp-content\/uploads\/2021\/03\/2-1-600x463.png 600w\" sizes=\"(max-width: 1910px) 100vw, 1910px\" \/><\/a><\/p>\n<p>We will create a new panel and build its query, setting the container name to &#8220;INTRADAY_FB&#8221; and removing any limits on returned records; the default limit of 10000 would only give a few weeks of data.<\/p>\n<p><a href=\"https:\/\/griddb.net\/wp-content\/uploads\/2021\/03\/3-1.png\"><img decoding=\"async\" src=\"https:\/\/griddb.net\/wp-content\/uploads\/2021\/03\/3-1.png\" alt=\"\" width=\"1910\" height=\"799\" class=\"aligncenter size-full wp-image-27358\" srcset=\"\/wp-content\/uploads\/2021\/03\/3-1.png 1910w, \/wp-content\/uploads\/2021\/03\/3-1-300x125.png 300w, \/wp-content\/uploads\/2021\/03\/3-1-1024x428.png 1024w, \/wp-content\/uploads\/2021\/03\/3-1-768x321.png 768w, \/wp-content\/uploads\/2021\/03\/3-1-1536x643.png 1536w, \/wp-content\/uploads\/2021\/03\/3-1-600x251.png 600w\" sizes=\"(max-width: 1910px) 100vw, 1910px\" \/><\/a><\/p>\n<p>Finally, we visualize the stock price and anomalies:<\/p>\n<p><a href=\"https:\/\/griddb.net\/wp-content\/uploads\/2021\/03\/1_RJEMXfs-0DHCrb4UHQ9Kyw.png\"><img loading=\"lazy\" decoding=\"async\" src=\"https:\/\/griddb.net\/wp-content\/uploads\/2021\/03\/1_RJEMXfs-0DHCrb4UHQ9Kyw.png\" alt=\"\" width=\"1902\" height=\"1472\" class=\"aligncenter size-full wp-image-27359\" 
srcset=\"\/wp-content\/uploads\/2021\/03\/1_RJEMXfs-0DHCrb4UHQ9Kyw.png 1902w, \/wp-content\/uploads\/2021\/03\/1_RJEMXfs-0DHCrb4UHQ9Kyw-300x232.png 300w, \/wp-content\/uploads\/2021\/03\/1_RJEMXfs-0DHCrb4UHQ9Kyw-1024x792.png 1024w, \/wp-content\/uploads\/2021\/03\/1_RJEMXfs-0DHCrb4UHQ9Kyw-768x594.png 768w, \/wp-content\/uploads\/2021\/03\/1_RJEMXfs-0DHCrb4UHQ9Kyw-1536x1189.png 1536w, \/wp-content\/uploads\/2021\/03\/1_RJEMXfs-0DHCrb4UHQ9Kyw-600x464.png 600w\" sizes=\"(max-width: 1902px) 100vw, 1902px\" \/><\/a><\/p>\n<p>In the above graph, stock price is the primary time series shown and the anomalies (or where the predicted price significantly differs from the actual price) are denoted by the dashed red lines.<\/p>\n<h1>Conclusion<\/h1>\n<p>This project has demonstrated how effective GridDB can be as the data store in the deployment of a real world, production ready machine learning project, mainly by showcasing its ability to store both the models and the input\/output data. Open source tools such as Kafka and Grafana were also successfully integrated into the project to help implement effective data streaming and visualization. And lastly, GridDB&#8217;s Trigger functionality was used in conjunction with SendGrid to ensure the anomalies were not missed.<\/p>\n<p>The full source code for this project can be found on GridDB.net&#8217;s GitHub page (here)[https:\/\/github.com\/griddbnet\/stock-anomaly-ml-project].<\/p>\n","protected":false},"excerpt":{"rendered":"<p>In this project, we use GridDB to create a Machine Learning platform where Kafka is used to import stock market data from Alphavantage, a market data provider. Tensorflow and Keras train a model that is then stored in GridDB, and then finally uses LSTM prediction to find anomalies in daily intraday trading history. 
The last [&hellip;]<\/p>\n","protected":false},"author":71,"featured_media":27349,"comment_status":"open","ping_status":"open","sticky":false,"template":"","format":"standard","meta":{"_acf_changed":false,"footnotes":""},"categories":[121],"tags":[],"class_list":["post-46637","post","type-post","status-publish","format-standard","has-post-thumbnail","hentry","category-blog"],"acf":[],"yoast_head":"<!-- This site is optimized with the Yoast SEO plugin v27.1.1 - https:\/\/yoast.com\/product\/yoast-seo-wordpress\/ -->\n<title>Use Machine Learning and GridDB to build a Production-Ready Stock Market Anomaly Detector | GridDB: Open Source Time Series Database for IoT<\/title>\n<meta name=\"description\" content=\"In this project, we use GridDB to create a Machine Learning platform where Kafka is used to import stock market data from Alphavantage, a market data\" \/>\n<meta name=\"robots\" content=\"index, follow, max-snippet:-1, max-image-preview:large, max-video-preview:-1\" \/>\n<link rel=\"canonical\" href=\"https:\/\/griddb.net\/en\/blog\/use-machine-learning-and-griddb-to-build-a-production-ready-stock-market-anomaly-detector\/\" \/>\n<meta property=\"og:locale\" content=\"en_US\" \/>\n<meta property=\"og:type\" content=\"article\" \/>\n<meta property=\"og:title\" content=\"Use Machine Learning and GridDB to build a Production-Ready Stock Market Anomaly Detector | GridDB: Open Source Time Series Database for IoT\" \/>\n<meta property=\"og:description\" content=\"In this project, we use GridDB to create a Machine Learning platform where Kafka is used to import stock market data from Alphavantage, a market data\" \/>\n<meta property=\"og:url\" content=\"https:\/\/griddb.net\/en\/blog\/use-machine-learning-and-griddb-to-build-a-production-ready-stock-market-anomaly-detector\/\" \/>\n<meta property=\"og:site_name\" content=\"GridDB: Open Source Time Series Database for IoT\" \/>\n<meta property=\"article:publisher\" 
content=\"https:\/\/www.facebook.com\/griddbcommunity\/\" \/>\n<meta property=\"article:published_time\" content=\"2021-03-18T07:00:00+00:00\" \/>\n<meta property=\"article:modified_time\" content=\"2025-11-13T20:55:14+00:00\" \/>\n<meta property=\"og:image\" content=\"https:\/\/griddb-linux-hte8hndjf8cka8ht.westus-01.azurewebsites.net\/wp-content\/uploads\/2021\/03\/1_Ol10dytvo9NAOwkifhzlxQ.png\" \/>\n\t<meta property=\"og:image:width\" content=\"1672\" \/>\n\t<meta property=\"og:image:height\" content=\"946\" \/>\n\t<meta property=\"og:image:type\" content=\"image\/png\" \/>\n<meta name=\"author\" content=\"Owen\" \/>\n<meta name=\"twitter:card\" content=\"summary_large_image\" \/>\n<meta name=\"twitter:creator\" content=\"@GridDBCommunity\" \/>\n<meta name=\"twitter:site\" content=\"@GridDBCommunity\" \/>\n<meta name=\"twitter:label1\" content=\"Written by\" \/>\n\t<meta name=\"twitter:data1\" content=\"Owen\" \/>\n\t<meta name=\"twitter:label2\" content=\"Est. reading time\" \/>\n\t<meta name=\"twitter:data2\" content=\"7 minutes\" \/>\n<script type=\"application\/ld+json\" class=\"yoast-schema-graph\">{\"@context\":\"https:\/\/schema.org\",\"@graph\":[{\"@type\":\"Article\",\"@id\":\"https:\/\/griddb.net\/en\/blog\/use-machine-learning-and-griddb-to-build-a-production-ready-stock-market-anomaly-detector\/#article\",\"isPartOf\":{\"@id\":\"https:\/\/griddb.net\/en\/blog\/use-machine-learning-and-griddb-to-build-a-production-ready-stock-market-anomaly-detector\/\"},\"author\":{\"name\":\"Owen\",\"@id\":\"https:\/\/griddb.net\/en\/#\/schema\/person\/0f2f6d4b593adde8c43cf3ea5c794c66\"},\"headline\":\"Use Machine Learning and GridDB to build a Production-Ready Stock Market Anomaly 
Detector\",\"datePublished\":\"2021-03-18T07:00:00+00:00\",\"dateModified\":\"2025-11-13T20:55:14+00:00\",\"mainEntityOfPage\":{\"@id\":\"https:\/\/griddb.net\/en\/blog\/use-machine-learning-and-griddb-to-build-a-production-ready-stock-market-anomaly-detector\/\"},\"wordCount\":1308,\"commentCount\":0,\"publisher\":{\"@id\":\"https:\/\/griddb.net\/en\/#organization\"},\"image\":{\"@id\":\"https:\/\/griddb.net\/en\/blog\/use-machine-learning-and-griddb-to-build-a-production-ready-stock-market-anomaly-detector\/#primaryimage\"},\"thumbnailUrl\":\"\/wp-content\/uploads\/2021\/03\/1_Ol10dytvo9NAOwkifhzlxQ.png\",\"articleSection\":[\"Blog\"],\"inLanguage\":\"en-US\",\"potentialAction\":[{\"@type\":\"CommentAction\",\"name\":\"Comment\",\"target\":[\"https:\/\/griddb.net\/en\/blog\/use-machine-learning-and-griddb-to-build-a-production-ready-stock-market-anomaly-detector\/#respond\"]}]},{\"@type\":\"WebPage\",\"@id\":\"https:\/\/griddb.net\/en\/blog\/use-machine-learning-and-griddb-to-build-a-production-ready-stock-market-anomaly-detector\/\",\"url\":\"https:\/\/griddb.net\/en\/blog\/use-machine-learning-and-griddb-to-build-a-production-ready-stock-market-anomaly-detector\/\",\"name\":\"Use Machine Learning and GridDB to build a Production-Ready Stock Market Anomaly Detector | GridDB: Open Source Time Series Database for IoT\",\"isPartOf\":{\"@id\":\"https:\/\/griddb.net\/en\/#website\"},\"primaryImageOfPage\":{\"@id\":\"https:\/\/griddb.net\/en\/blog\/use-machine-learning-and-griddb-to-build-a-production-ready-stock-market-anomaly-detector\/#primaryimage\"},\"image\":{\"@id\":\"https:\/\/griddb.net\/en\/blog\/use-machine-learning-and-griddb-to-build-a-production-ready-stock-market-anomaly-detector\/#primaryimage\"},\"thumbnailUrl\":\"\/wp-content\/uploads\/2021\/03\/1_Ol10dytvo9NAOwkifhzlxQ.png\",\"datePublished\":\"2021-03-18T07:00:00+00:00\",\"dateModified\":\"2025-11-13T20:55:14+00:00\",\"description\":\"In this project, we use GridDB to create a Machine Learning 
platform where Kafka is used to import stock market data from Alphavantage, a market data\",\"inLanguage\":\"en-US\",\"potentialAction\":[{\"@type\":\"ReadAction\",\"target\":[\"https:\/\/griddb.net\/en\/blog\/use-machine-learning-and-griddb-to-build-a-production-ready-stock-market-anomaly-detector\/\"]}]},{\"@type\":\"ImageObject\",\"inLanguage\":\"en-US\",\"@id\":\"https:\/\/griddb.net\/en\/blog\/use-machine-learning-and-griddb-to-build-a-production-ready-stock-market-anomaly-detector\/#primaryimage\",\"url\":\"\/wp-content\/uploads\/2021\/03\/1_Ol10dytvo9NAOwkifhzlxQ.png\",\"contentUrl\":\"\/wp-content\/uploads\/2021\/03\/1_Ol10dytvo9NAOwkifhzlxQ.png\",\"width\":1672,\"height\":946},{\"@type\":\"WebSite\",\"@id\":\"https:\/\/griddb.net\/en\/#website\",\"url\":\"https:\/\/griddb.net\/en\/\",\"name\":\"GridDB: Open Source Time Series Database for IoT\",\"description\":\"GridDB is an open source time-series database with the performance of NoSQL and convenience of SQL\",\"publisher\":{\"@id\":\"https:\/\/griddb.net\/en\/#organization\"},\"potentialAction\":[{\"@type\":\"SearchAction\",\"target\":{\"@type\":\"EntryPoint\",\"urlTemplate\":\"https:\/\/griddb.net\/en\/?s={search_term_string}\"},\"query-input\":{\"@type\":\"PropertyValueSpecification\",\"valueRequired\":true,\"valueName\":\"search_term_string\"}}],\"inLanguage\":\"en-US\"},{\"@type\":\"Organization\",\"@id\":\"https:\/\/griddb.net\/en\/#organization\",\"name\":\"Fixstars\",\"url\":\"https:\/\/griddb.net\/en\/\",\"logo\":{\"@type\":\"ImageObject\",\"inLanguage\":\"en-US\",\"@id\":\"https:\/\/griddb.net\/en\/#\/schema\/logo\/image\/\",\"url\":\"https:\/\/griddb.net\/wp-content\/uploads\/2019\/04\/fixstars_logo_web_tagline.png\",\"contentUrl\":\"https:\/\/griddb.net\/wp-content\/uploads\/2019\/04\/fixstars_logo_web_tagline.png\",\"width\":200,\"height\":83,\"caption\":\"Fixstars\"},\"image\":{\"@id\":\"https:\/\/griddb.net\/en\/#\/schema\/logo\/image\/\"},\"sameAs\":[\"https:\/\/www.facebook.com\/griddbcomm
unity\/\",\"https:\/\/x.com\/GridDBCommunity\",\"https:\/\/www.linkedin.com\/company\/griddb-by-toshiba\"]},{\"@type\":\"Person\",\"@id\":\"https:\/\/griddb.net\/en\/#\/schema\/person\/0f2f6d4b593adde8c43cf3ea5c794c66\",\"name\":\"Owen\",\"image\":{\"@type\":\"ImageObject\",\"inLanguage\":\"en-US\",\"@id\":\"https:\/\/griddb.net\/en\/#\/schema\/person\/image\/\",\"url\":\"https:\/\/secure.gravatar.com\/avatar\/47438a5c81215c7a9043be1b427e0bbd8dc0f77bd536f147f8495575149e4325?s=96&d=mm&r=g\",\"contentUrl\":\"https:\/\/secure.gravatar.com\/avatar\/47438a5c81215c7a9043be1b427e0bbd8dc0f77bd536f147f8495575149e4325?s=96&d=mm&r=g\",\"caption\":\"Owen\"},\"url\":\"https:\/\/griddb-linux-hte8hndjf8cka8ht.westus-01.azurewebsites.net\/en\/author\/owen\/\"}]}<\/script>\n<!-- \/ Yoast SEO plugin. -->","yoast_head_json":{"title":"Use Machine Learning and GridDB to build a Production-Ready Stock Market Anomaly Detector | GridDB: Open Source Time Series Database for IoT","description":"In this project, we use GridDB to create a Machine Learning platform where Kafka is used to import stock market data from Alphavantage, a market data","robots":{"index":"index","follow":"follow","max-snippet":"max-snippet:-1","max-image-preview":"max-image-preview:large","max-video-preview":"max-video-preview:-1"},"canonical":"https:\/\/griddb.net\/en\/blog\/use-machine-learning-and-griddb-to-build-a-production-ready-stock-market-anomaly-detector\/","og_locale":"en_US","og_type":"article","og_title":"Use Machine Learning and GridDB to build a Production-Ready Stock Market Anomaly Detector | GridDB: Open Source Time Series Database for IoT","og_description":"In this project, we use GridDB to create a Machine Learning platform where Kafka is used to import stock market data from Alphavantage, a market data","og_url":"https:\/\/griddb.net\/en\/blog\/use-machine-learning-and-griddb-to-build-a-production-ready-stock-market-anomaly-detector\/","og_site_name":"GridDB: Open Source Time Series Database for 
IoT","article_publisher":"https:\/\/www.facebook.com\/griddbcommunity\/","article_published_time":"2021-03-18T07:00:00+00:00","article_modified_time":"2025-11-13T20:55:14+00:00","og_image":[{"width":1672,"height":946,"url":"https:\/\/griddb-linux-hte8hndjf8cka8ht.westus-01.azurewebsites.net\/wp-content\/uploads\/2021\/03\/1_Ol10dytvo9NAOwkifhzlxQ.png","type":"image\/png"}],"author":"Owen","twitter_card":"summary_large_image","twitter_creator":"@GridDBCommunity","twitter_site":"@GridDBCommunity","twitter_misc":{"Written by":"Owen","Est. reading time":"7 minutes"},"schema":{"@context":"https:\/\/schema.org","@graph":[{"@type":"Article","@id":"https:\/\/griddb.net\/en\/blog\/use-machine-learning-and-griddb-to-build-a-production-ready-stock-market-anomaly-detector\/#article","isPartOf":{"@id":"https:\/\/griddb.net\/en\/blog\/use-machine-learning-and-griddb-to-build-a-production-ready-stock-market-anomaly-detector\/"},"author":{"name":"Owen","@id":"https:\/\/griddb.net\/en\/#\/schema\/person\/0f2f6d4b593adde8c43cf3ea5c794c66"},"headline":"Use Machine Learning and GridDB to build a Production-Ready Stock Market Anomaly 
Detector","datePublished":"2021-03-18T07:00:00+00:00","dateModified":"2025-11-13T20:55:14+00:00","mainEntityOfPage":{"@id":"https:\/\/griddb.net\/en\/blog\/use-machine-learning-and-griddb-to-build-a-production-ready-stock-market-anomaly-detector\/"},"wordCount":1308,"commentCount":0,"publisher":{"@id":"https:\/\/griddb.net\/en\/#organization"},"image":{"@id":"https:\/\/griddb.net\/en\/blog\/use-machine-learning-and-griddb-to-build-a-production-ready-stock-market-anomaly-detector\/#primaryimage"},"thumbnailUrl":"\/wp-content\/uploads\/2021\/03\/1_Ol10dytvo9NAOwkifhzlxQ.png","articleSection":["Blog"],"inLanguage":"en-US","potentialAction":[{"@type":"CommentAction","name":"Comment","target":["https:\/\/griddb.net\/en\/blog\/use-machine-learning-and-griddb-to-build-a-production-ready-stock-market-anomaly-detector\/#respond"]}]},{"@type":"WebPage","@id":"https:\/\/griddb.net\/en\/blog\/use-machine-learning-and-griddb-to-build-a-production-ready-stock-market-anomaly-detector\/","url":"https:\/\/griddb.net\/en\/blog\/use-machine-learning-and-griddb-to-build-a-production-ready-stock-market-anomaly-detector\/","name":"Use Machine Learning and GridDB to build a Production-Ready Stock Market Anomaly Detector | GridDB: Open Source Time Series Database for IoT","isPartOf":{"@id":"https:\/\/griddb.net\/en\/#website"},"primaryImageOfPage":{"@id":"https:\/\/griddb.net\/en\/blog\/use-machine-learning-and-griddb-to-build-a-production-ready-stock-market-anomaly-detector\/#primaryimage"},"image":{"@id":"https:\/\/griddb.net\/en\/blog\/use-machine-learning-and-griddb-to-build-a-production-ready-stock-market-anomaly-detector\/#primaryimage"},"thumbnailUrl":"\/wp-content\/uploads\/2021\/03\/1_Ol10dytvo9NAOwkifhzlxQ.png","datePublished":"2021-03-18T07:00:00+00:00","dateModified":"2025-11-13T20:55:14+00:00","description":"In this project, we use GridDB to create a Machine Learning platform where Kafka is used to import stock market data from Alphavantage, a market 
data","inLanguage":"en-US","potentialAction":[{"@type":"ReadAction","target":["https:\/\/griddb.net\/en\/blog\/use-machine-learning-and-griddb-to-build-a-production-ready-stock-market-anomaly-detector\/"]}]},{"@type":"ImageObject","inLanguage":"en-US","@id":"https:\/\/griddb.net\/en\/blog\/use-machine-learning-and-griddb-to-build-a-production-ready-stock-market-anomaly-detector\/#primaryimage","url":"\/wp-content\/uploads\/2021\/03\/1_Ol10dytvo9NAOwkifhzlxQ.png","contentUrl":"\/wp-content\/uploads\/2021\/03\/1_Ol10dytvo9NAOwkifhzlxQ.png","width":1672,"height":946},{"@type":"WebSite","@id":"https:\/\/griddb.net\/en\/#website","url":"https:\/\/griddb.net\/en\/","name":"GridDB: Open Source Time Series Database for IoT","description":"GridDB is an open source time-series database with the performance of NoSQL and convenience of SQL","publisher":{"@id":"https:\/\/griddb.net\/en\/#organization"},"potentialAction":[{"@type":"SearchAction","target":{"@type":"EntryPoint","urlTemplate":"https:\/\/griddb.net\/en\/?s={search_term_string}"},"query-input":{"@type":"PropertyValueSpecification","valueRequired":true,"valueName":"search_term_string"}}],"inLanguage":"en-US"},{"@type":"Organization","@id":"https:\/\/griddb.net\/en\/#organization","name":"Fixstars","url":"https:\/\/griddb.net\/en\/","logo":{"@type":"ImageObject","inLanguage":"en-US","@id":"https:\/\/griddb.net\/en\/#\/schema\/logo\/image\/","url":"https:\/\/griddb.net\/wp-content\/uploads\/2019\/04\/fixstars_logo_web_tagline.png","contentUrl":"https:\/\/griddb.net\/wp-content\/uploads\/2019\/04\/fixstars_logo_web_tagline.png","width":200,"height":83,"caption":"Fixstars"},"image":{"@id":"https:\/\/griddb.net\/en\/#\/schema\/logo\/image\/"},"sameAs":["https:\/\/www.facebook.com\/griddbcommunity\/","https:\/\/x.com\/GridDBCommunity","https:\/\/www.linkedin.com\/company\/griddb-by-toshiba"]},{"@type":"Person","@id":"https:\/\/griddb.net\/en\/#\/schema\/person\/0f2f6d4b593adde8c43cf3ea5c794c66","name":"Owen","image":{"@type"
:"ImageObject","inLanguage":"en-US","@id":"https:\/\/griddb.net\/en\/#\/schema\/person\/image\/","url":"https:\/\/secure.gravatar.com\/avatar\/47438a5c81215c7a9043be1b427e0bbd8dc0f77bd536f147f8495575149e4325?s=96&d=mm&r=g","contentUrl":"https:\/\/secure.gravatar.com\/avatar\/47438a5c81215c7a9043be1b427e0bbd8dc0f77bd536f147f8495575149e4325?s=96&d=mm&r=g","caption":"Owen"},"url":"https:\/\/griddb-linux-hte8hndjf8cka8ht.westus-01.azurewebsites.net\/en\/author\/owen\/"}]}},"_links":{"self":[{"href":"https:\/\/griddb-linux-hte8hndjf8cka8ht.westus-01.azurewebsites.net\/en\/wp-json\/wp\/v2\/posts\/46637","targetHints":{"allow":["GET"]}}],"collection":[{"href":"https:\/\/griddb-linux-hte8hndjf8cka8ht.westus-01.azurewebsites.net\/en\/wp-json\/wp\/v2\/posts"}],"about":[{"href":"https:\/\/griddb-linux-hte8hndjf8cka8ht.westus-01.azurewebsites.net\/en\/wp-json\/wp\/v2\/types\/post"}],"author":[{"embeddable":true,"href":"https:\/\/griddb-linux-hte8hndjf8cka8ht.westus-01.azurewebsites.net\/en\/wp-json\/wp\/v2\/users\/71"}],"replies":[{"embeddable":true,"href":"https:\/\/griddb-linux-hte8hndjf8cka8ht.westus-01.azurewebsites.net\/en\/wp-json\/wp\/v2\/comments?post=46637"}],"version-history":[{"count":1,"href":"https:\/\/griddb-linux-hte8hndjf8cka8ht.westus-01.azurewebsites.net\/en\/wp-json\/wp\/v2\/posts\/46637\/revisions"}],"predecessor-version":[{"id":51313,"href":"https:\/\/griddb-linux-hte8hndjf8cka8ht.westus-01.azurewebsites.net\/en\/wp-json\/wp\/v2\/posts\/46637\/revisions\/51313"}],"wp:featuredmedia":[{"embeddable":true,"href":"https:\/\/griddb-linux-hte8hndjf8cka8ht.westus-01.azurewebsites.net\/en\/wp-json\/wp\/v2\/media\/27349"}],"wp:attachment":[{"href":"https:\/\/griddb-linux-hte8hndjf8cka8ht.westus-01.azurewebsites.net\/en\/wp-json\/wp\/v2\/media?parent=46637"}],"wp:term":[{"taxonomy":"category","embeddable":true,"href":"https:\/\/griddb-linux-hte8hndjf8cka8ht.westus-01.azurewebsites.net\/en\/wp-json\/wp\/v2\/categories?post=46637"},{"taxonomy":"post_tag","embeddable":tr
ue,"href":"https:\/\/griddb-linux-hte8hndjf8cka8ht.westus-01.azurewebsites.net\/en\/wp-json\/wp\/v2\/tags?post=46637"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}