Raspberry Pi and Sensor Tag - Part IV

This post is part of what is becoming a very well rounded project about the leverage of Raspberry Pi and Sensor Tag by TI to collect sensor data. In part III I mentioned that a much better approach to store data besides redirecting the console output to file was to leverage a database. Storing time-series data can be pretty tricky and mostly unreliable if not well architected.

You might want to take a look at Carbon and Whisper, part of the Graphite project. Carbon can handle vast amounts of time-series data; to give you an idea of how well it scales, when Graphite was first put into production at Orbitz it was handling 160,000 metrics per minute.

In this case, though, I picked Redis.

It’s an in-memory data structure store, used as a database, cache, and message broker. It supports data structures such as strings, hashes, lists, sets, sorted sets with range queries, bitmaps, hyperloglogs, and geospatial indexes with radius queries. Redis has built-in replication, Lua scripting, LRU eviction, transactions, and different levels of on-disk persistence, and it provides high availability via Redis Sentinel and automatic partitioning with Redis Cluster.
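To make those data structures concrete, here is a hypothetical redis-cli session (the key names `reading:1` and `readings:by_time` are made up for this sketch, not taken from the project code) that stores one accelerometer sample as a hash and indexes it by timestamp in a sorted set:

```shell
# Store one sample as a hash (field/value pairs)
redis-cli HSET reading:1 x 0.01 y -0.98 z 0.12

# Index it by Unix timestamp in a sorted set, so range queries work
redis-cli ZADD readings:by_time 1457000000 reading:1

# Fetch every sample key recorded inside a time window
redis-cli ZRANGEBYSCORE readings:by_time 1456999000 1457001000
```

The hash holds the payload, while the sorted set gives you cheap time-range lookups, which is the part that makes time-series storage in Redis workable.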

The beauty of all, it’s free and open source!

At the simplest level, a message queue is a way for applications and discrete components to send messages between one another to reliably communicate.

A job queue is similar to a message queue and contains an ordered list of jobs to be performed by a separate subsystem. Thanks to this excellent write-up I learned how to use one in Node. Thanks, Prateek!

Here is what we’re going to do:

  1. Backup what we have done so far
  2. Install Redis server
  3. Whip up some code to queue and store our sensor data
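Before wiring in Redis, the job-queue idea itself is easy to sketch in plain Node: producers append jobs to an ordered list, and a separate worker pulls them off and processes them in order. This toy version (all names here are made up for the sketch) keeps everything in memory:

```javascript
// Minimal in-memory job queue: an ordered list of jobs plus a worker.
function JobQueue(handler) {
  this.jobs = [];         // ordered list of pending jobs
  this.handler = handler; // function that performs each job
}

// Producer side: append a job to the end of the queue.
JobQueue.prototype.push = function (job) {
  this.jobs.push(job);
};

// Worker side: drain the queue in FIFO order, returning the results.
JobQueue.prototype.drain = function () {
  var results = [];
  while (this.jobs.length > 0) {
    results.push(this.handler(this.jobs.shift()));
  }
  return results;
};

module.exports = JobQueue;
```

Redis plus Kue effectively replaces that in-memory array with a persistent, shared list, so jobs survive a crash and can be processed by a different process.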

Backup SD Card

We have made so much progress in the last three publications that it would be unfair and inhuman to lose all this work or to compromise our working configuration. To avoid that, stick that SD card in your Mac, open Disk Utility, and follow those steps.

If you feel that the UI slows you down, you can do that task through the command line, and here you will find a well-written step-by-step process for it. Thanks, Johnny!

Install Redis Server

In my workflow, I did this first on the Mac and then moved to the RPi to do the same. On the Mac, I used Homebrew to install it:

brew install redis
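On the RPi, the equivalent (assuming Raspbian or another Debian-based distro, where the package is named redis-server) should be:

```shell
sudo apt-get update
sudo apt-get install redis-server
```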

Off the bat, in my case, the config file that comes with Redis was broken after the installation, and the server wasn’t starting. However, launching the server with just

redis-server

was enough to run it with no issue whatsoever. Open another terminal window and run your Node code from there.

A note for those of you on a Mac with Anaconda installed: for some reason my installation came with an older version of Redis (v2.6) that I was unable to update, and therefore the instance of Redis that I wanted (>v3.x) wasn’t the default. I simply removed the old one so that I have only one instance.

Once you have the server running, you might want to take a look at the database and the data that you will be storing. There are plenty of options; I used RDM (Redis Desktop Manager), which is free and does the job well. It is pretty straightforward, so I am not going to document how to use it. In case you need a quick peek at how to do it, take a look at this video.
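If you prefer the terminal over RDM, redis-cli gives you a similar quick peek. Assuming a default local install, and that Kue keeps its keys under its usual `q:` prefix, something like this shows what landed in the database:

```shell
# List the keys Kue created (q: is Kue's default key prefix)
redis-cli keys 'q:*'

# Inspect one job record (Kue stores each job as a hash)
redis-cli hgetall q:job:1
```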

Storing Data

I placed the code on GitHub; I will highlight the key parts to point out where the magic is happening. The infrastructure to work with Redis and expose the necessary modules for the job-queue system is provided by Kue.

Take a look at the file simplejob.js, which has the core of how you can create your own job queue without the fuss of the events and sensors that we used in the previous posts. The function newJob is what actually generates and creates the record that will be stored in the database. Remember that Redis is a NoSQL database, and therefore you have enough freedom to build your own dataset without running into the schema-definition complexity that you normally face with schema-driven databases.
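I won’t reproduce simplejob.js here, but a rough sketch of the shape newJob might take looks like this. It is only an illustration, not the repo’s code: buildReading and the 'sensor_reading' job type are names I made up, and the Kue calls assume `npm install kue` and a running Redis on the default port.

```javascript
// buildReading is pure: it just assembles the record to store.
function buildReading(x, y, z) {
  return { x: x, y: y, z: z, ts: Date.now() };
}

// newJob enqueues a record through Kue; require() happens lazily so the
// pure part above can be used even when kue/Redis are not available.
function newJob(reading) {
  var kue = require('kue');
  var queue = kue.createQueue(); // connects to 127.0.0.1:6379 by default
  queue.create('sensor_reading', reading).save(function (err) {
    if (err) console.error('enqueue failed:', err);
  });
}

module.exports = { buildReading: buildReading, newJob: newJob };
```

The freedom mentioned above is visible here: the record is just a plain object, and nothing forces every job to carry the same fields.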

Once you have that understanding, move on to the file qlogger.js and you will find roughly the same code, with the exception that we build a data structure containing the three data points from the accelerometer.

If you have not done that yet, run

sudo npm install

from the directory where you cloned the code so that all the Node modules get installed. And then

  1. Turn on your Sensor Tag
  2. Run qlogger.js

and collect some data. After a few seconds or minutes of data collection, press the two buttons on the Sensor Tag at the same time so that it disconnects naturally.

To quickly get an idea of what data was stored, do the following:

Open RDM, then:

  1. Register your server if you haven’t done that beforehand
  2. Right click on the first database
  3. Select “reload”
  4. Navigate to your data under the jobs node

At this point, the whole picture is complete. Over the course of the last few posts, we have learned how to configure the RPi, connect to the Sensor Tag, and now how to collect all this data into various forms of storage.


Once the data is stored, the next natural step is to process it (clean up, linearize) and then plot it to extract some useful insights. I am thinking of evaluating two options: doing all that work directly in Node.js, or going bolder and leveraging RServer to kill two birds with one stone.

I will experiment and post once I settle on a valid option for this exercise. In the meanwhile, share your experience, make some noise, and enjoy your data collection. I definitely have thus far :-)



Mario Esposito

Published 7 years ago