

By jessie-ford In Uncategorized

Posted July 29th, 2015


I briefly talked about this solution in my introduction to this series, but I wanted to dive into more detail here. Let's start with my current favorite microcontroller, the Arduino Yun.

You may recall this MCU has both an Arduino and a Linux processor, and we will be using both. On the Linux side we will write a simple agent in Node. This agent will read sensor values and POST them to our MongoDB hosted on MongoLab. I am a big fan of mongolab.com: they have a more than generous free tier and their support is super fast. Next we will write some HTML that uses D3.js to create some pretty graphs. The D3.js HTML/JavaScript will live inside a MEAN.io package hosted on Heroku. All the source code is provided, and both the Heroku and MongoLab environments are well within the free tier, so you won't have to spend a penny.


  1. MCU: Arduino Yun
  2. Arduino Sketch: mybridge.ino
  3. Agent: Agent1Mongo.js
  4. Data Storage: mongolab.com (free hosted mongodb)
  5. Server: Node on Heroku (Mean.io)
  6. Data service: Node route via express.
  7. Client: Mean.io on the same Heroku instance.

MCU: Arduino Yun

Arduino Sketch

The sketch running on the Yun is my modified bridge called mybridge.ino. It simply adds the 1-wire temperature calls to the already existing analog/digital calls. You can see the original Arduino bridge library here. Once this sketch is loaded you can test the value of digital pin 13 (the onboard LED) with the following call from your browser (note my Yun's hostname is kyleyun): http://kyleyun.local/arduino/digital/13

This returns:

Pin D13 set to 0

I can confirm that the red LED on the Yun is off. I can set this pin high with the following call from my browser: http://kyleyun.local/arduino/digital/13/1

And watch the LED turn on. Next I want to see the value of my light sensor on analog pin 0, so I call http://kyleyun.local/arduino/analog/0 and get:

Pin A0 reads analog 694

The analog pins have a built-in 10-bit ADC (0-1023), so the value of 694 means that of the 5 volts I am supplying to the pin, I am reading about 3.4 volts, or 67% of my supply voltage. If I shine a bright light on the sensor the reading goes to 953, which corresponds to 4.65 V. Putting a volt meter across these pins I actually get a little less (about 4.1 V) due to the fact that my supply voltage (USB) is only 4.4 V instead of 5. If you are still following along, the math makes sense. Here is a link that describes the photoresistor circuit on Bildr. Everything thus far uses the default bridge sketch. Now let's talk about getting the temperature readings.
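The conversion above can be sketched as a small helper (the readings and the 5 V reference come from the examples above):

```javascript
// Convert a 10-bit ADC reading (0-1023) to volts, given the reference voltage.
function adcToVolts(reading, vref) {
  return (reading / 1023) * vref;
}

// Readings from the examples above, assuming a 5 V reference:
console.log(adcToVolts(694, 5).toFixed(2));   // ~3.39 V (about 67% of 5 V)
console.log(adcToVolts(953, 5).toFixed(2));   // ~4.66 V
// With the actual USB supply of ~4.4 V, the same raw reading maps lower:
console.log(adcToVolts(953, 4.4).toFixed(2)); // ~4.10 V, matching the meter
```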

You can measure temperature with an analog sensor like the TMP36 the same way we read the light sensor, converting the reading to Celsius or Fahrenheit in your sketch or agent code. But I like digital sensors like the DS18B20, since they give me back a discrete value and I can add as many as I want to a single digital pin via the 1-wire bus. However, you need to modify the sketch to add this REST call and include the temperature library (included with the Arduino IDE). In the ingredients section you can find the source code for mybridge.ino on GitHub.

Just like we measured the digital and analog pins, I added a route in this sketch for the 1-wire temperature reading by passing the sensor index into the REST call. The call looks something (actually, exactly) like this to measure the first temp sensor: http://kyleyun.local/arduino/temp/1. Since I was planning on putting this directly into my data store, I decided to format the response as JSON and also include the 64-bit device address. The response looks like this:
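The exact payload isn't reproduced here, but based on the description (a temperature value plus the 64-bit 1-wire device address) it looks roughly like this sketch; the field names are my assumption, not necessarily the ones mybridge.ino uses:

```javascript
// Illustrative response from http://kyleyun.local/arduino/temp/1
// (field names and values are assumptions for illustration)
const raw = '{"address": "28FF4A2B50160345", "temp": 23.75}';

// Because the sketch returns JSON, parsing it in the agent is one call:
const reading = JSON.parse(raw);
console.log(reading.address, reading.temp);
```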

Since the response is JSON, it is simple to parse and include both properties in my database-bound payload. Okay, enough about the sketch. Let's move to the agent that pulls data from the sketch and pushes it to Mongo.

Arduino Agent:

This simple Node script lives on the Linux side of the Yun. The full code can be found here on GitHub. I wrote it in Node because I like Node. I did have to expand the file system to install Node, so I will create an addendum to this post and write the agent in Bash. I fire this script via cron running on the Linux processor. Here is what the cron statement looks like:
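The crontab line itself isn't reproduced above; a minimal version consistent with the description would be (the script path and key are placeholders, not the originals):

```shell
# Run the agent every 5 minutes; */5 is the minutes field of the crontab
*/5 * * * * MONGOLAB_KEY=1234567 node /root/agent1mongo.js
```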

This script runs every 5 minutes (*/5 is the first parameter in the crontab).

First, define the variables. As you can see, I read the MONGOLAB_KEY from an environment variable, so you will need to execute it like this: MONGOLAB_KEY=1234567 node agent1mongo.js.
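Reading the key is a one-liner on process.env; here is a sketch with a guard added (the function name and error message are mine, not necessarily what agent1mongo.js does):

```javascript
// Pull the MongoLab API key from the environment so it never lives in source control.
function getMongoKey(env) {
  const key = env.MONGOLAB_KEY;
  if (!key) {
    throw new Error('MONGOLAB_KEY not set. Run as: MONGOLAB_KEY=1234567 node agent1mongo.js');
  }
  return key;
}

// In the agent: const MONGOLAB_KEY = getMongoKey(process.env);
console.log(getMongoKey({ MONGOLAB_KEY: '1234567' }));
```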

Simple function to upload to Mongo using request.
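The body of that function is in the repo; as a sketch, here is the kind of options object a request.post() call would take. The MongoLab URL shape, database, and collection names here are assumptions, not the originals:

```javascript
// Build the options object for a request.post() insert via MongoLab's REST API.
// The database (mydb) and collection (sensors) names are placeholders.
function buildInsertOptions(apiKey, payload) {
  return {
    url: 'https://api.mongolab.com/api/1/databases/mydb/collections/sensors' +
         '?apiKey=' + apiKey,
    json: payload, // request serializes this and sets the Content-Type header
  };
}

const opts = buildInsertOptions('1234567', { pin: 'A1', value: 427, version: 2 });
console.log(opts.url);
// In the agent: request.post(opts, function (err, res, body) { ... });
```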

This next function reads the analog or digital values (the default bridge library supports this). Note I have to parse the response to pull out just the pin and the value. Remember, the original response looks like this:

Pin A1 reads analog 427

I also use moment.js to get two formats of the current time.
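Parsing that line and stamping the payload with two time formats can be sketched like this; I use the built-in Date here instead of moment.js, and the field names are my assumption:

```javascript
// Turn a bridge response like "Pin A1 reads analog 427" into a structured payload.
function parsePinResponse(text) {
  // Handles both "Pin A1 reads analog 427" and "Pin D13 set to 0" shapes.
  const m = text.match(/^Pin (\w+) (?:reads (?:analog|digital)|set to) (\d+)/);
  if (!m) throw new Error('Unexpected bridge response: ' + text);
  const now = new Date();
  return {
    pin: m[1],
    value: Number(m[2]),
    time: now.toISOString(), // e.g. "2015-07-29T18:05:00.000Z"
    epoch: now.getTime(),    // the same instant, in milliseconds
  };
}

console.log(parsePinResponse('Pin A1 reads analog 427'));
```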

Then a function to read the 1-wire temperature. Note I return the payload in JSON format and include the device address:
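A sketch of what that function might produce for the database, merging the bridge's JSON with the agent's metadata (the field and device names are assumptions):

```javascript
// Combine the parsed 1-wire response with agent metadata into one document.
function buildTempPayload(parsed, sensorName) {
  const now = new Date();
  return {
    device: sensorName,
    address: parsed.address, // 64-bit 1-wire device address from the sketch
    value: parsed.temp,
    version: 2,              // statically set by the agent
    time: now.toISOString(),
    epoch: now.getTime(),
  };
}

console.log(buildTempPayload({ address: '28FF4A2B50160345', temp: 23.75 }, 'soilTemp'));
```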

Finally I make the calls to get the four sensor readings. I pass parameters that will be included in the data.
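As a rough sketch of that driver loop, with one entry per sensor carrying the parameters that end up in the stored data (the sensor names and readSensor stub are hypothetical so the loop is runnable standalone):

```javascript
// Hypothetical sensor table: the extra parameters (device name, pin/index)
// are the values that get included in each payload.
const sensors = [
  { kind: 'analog', pin: 'A0', device: 'light' },
  { kind: 'analog', pin: 'A1', device: 'moisture' },
  { kind: 'temp', index: 1, device: 'airTemp' },
  { kind: 'temp', index: 2, device: 'soilTemp' },
];

// In the real agent this would hit the bridge URL and POST the result to Mongo;
// here it is stubbed to keep the sketch self-contained.
function readSensor(spec) {
  return { device: spec.device, pin: spec.pin || 'T' + spec.index, version: 2 };
}

const payloads = sensors.map(readSensor);
console.log(payloads.length); // 4
```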

Data Storage: mongolab.com

I chose to store the data in mongodb for three reasons:

  1. With Mongo I do not need to define my schema. This is important because I may want to add a new type of sensor with some new metadata. For example, I may decide to store the actual voltage on my pins. In the agent I can just add a new property to the payload that calculates the actual pin voltage; I don't have to touch anything in Mongo.
  2. MongoLab has a built-in REST API, so I can use a simple curl command to insert, delete, or query data. This is fine for the agent because no one will see this call, which includes my API key. However, I would never use this on a web client, because the user could inspect the source and see my API key. To get around this I will use an Express route so the API key is confined to my server.
  3. Mongo stores data in collections (think tables) natively in JSON format. This means I can use JavaScript libraries to render graphs or tables of my data without having to parse it. If you have ever tried to use a JavaScript library on data from a relational database, you will appreciate this paradigm shift.

NoSQL stores JSON — Javascript Consumes JSON

There is actually not too much to say about the Mongo piece; it is by far the easiest. All I need to do is set up a database and a collection and get the API key. When we jump to the server next, we will need to build some routes that aggregate the daily average, and that is where we will need to understand more about Mongo.

Server: MEAN.io hosted on Heroku.

I am using MEAN.io for my client/server since it is easy to use and I am comfortable with it, but you could replace it with the web server of your choice.

MEAN.io is serving two purposes: (1) it's a container for my HTML that includes the D3.js JavaScript; and (2) it provides custom routes to get data from Mongo. If you wanted to use the MongoLab REST API instead of a custom route you could, despite exposing your API key, but you would not be able to get any aggregate functions like daily averages. For those we need to take advantage of Mongoose, which is included in the MEAN stack. It should be noted that for this example I am not using any Angular; besides the Express routes for Mongo, this is just a dumb web server.

Now that we have our data pumping into Mongo, let's look at an Express route that will fetch that data for us. We will look at a simple raw moisture-value route and another that takes daily averages, which is a bit more complex. All this code can be found in the mean branch of kbowerma/kubli on GitHub.

First we set up our route for the moisture sensor, which is on analog pin 1.
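The route registration isn't reproduced above; wired up in Express it would look roughly like this. The controller name is an assumption, and app.get is stubbed here so the sketch runs without Express installed:

```javascript
// Stand-in for an Express app: records handlers the way app.get() would.
const routes = {};
const app = { get: (path, handler) => { routes[path] = handler; } };

// Hypothetical controller for the raw moisture data on analog pin 1.
const sensors = {
  a1: function (req, res) { res.json({ pin: 'A1' }); },
};

// With real Express this would be app.get('/a1', sensors.a1),
// served at e.g. https://kubli.herokuapp.com/a1
app.get('/a1', sensors.a1);
console.log(Object.keys(routes)); // ['/a1']
```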

Next we look at the controller to see the sensor logic that gets this data. I won't go into too much detail on using Express; all you need to know is that I have included Mongoose and declared Sensor as my model with var Sensor = mongoose.model('Sensor'). If you use MEAN.io you will have this server controller stub already created.

You can see that I am using Mongoose to find all the records with version = 2 and pin = 'A1', sorting by the latest timestamps. My agent statically sets version: 2; my idea was that if I drastically change my schema, I can update the version in my data. Next let's look at a more complicated route to get the daily averages of the moisture values. Below is the controller method for my moisture daily averages. It would have been better to abstract this route/controller to take the device (moisture) as a parameter, but I wanted to keep it easy to read.
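In Mongoose terms that lookup is roughly Sensor.find({ version: 2, pin: 'A1' }).sort('-epoch'); since Mongoose isn't available in a standalone sketch, here is the same filter and sort applied to in-memory sample documents (field names and values are assumptions):

```javascript
// Sample documents shaped like the agent's payloads.
const docs = [
  { pin: 'A1', version: 2, value: 427, epoch: 2000 },
  { pin: 'A0', version: 2, value: 694, epoch: 2000 },
  { pin: 'A1', version: 1, value: 300, epoch: 1000 },
  { pin: 'A1', version: 2, value: 451, epoch: 3000 },
];

// Equivalent of find({ version: 2, pin: 'A1' }) sorted latest-first.
const result = docs
  .filter((d) => d.version === 2 && d.pin === 'A1')
  .sort((a, b) => b.epoch - a.epoch);

console.log(result.map((d) => d.value)); // [451, 427]
```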

Just like above we match on the version and pin, but now we add a group to feed into our MongoDB aggregate function. We group by $substr: ['$time', 0, 11], which is the date part of the GMT timestamp, and we append 00:00:000z to the end of it to represent the first timestamp of the day. We use this for the label display on our graph. We also take the minimum epoch time (similar to above, but in epoch). Since we are returning an aggregate function, we need to represent the values for the pin and the device; because we filter on these values they will all be the same within a group, so we just ask for the first value of each. This was my first time using aggregation in Mongo, so I am sure there are more efficient ways to use the aggregation pipeline, but it did the trick. This returns a payload with the average moisture value for each day, and you can see the live results of this call here: https://kubli.herokuapp.com/mdaily. To see the raw results of all the moisture data, hit this endpoint in your browser: https://kubli.herokuapp.com/a1
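The grouping step is easier to reason about when reproduced in plain JavaScript over sample documents; this sketch mirrors the $substr date-prefix trick and the $avg stage (field names are assumptions, and I pad with a standard ISO suffix here):

```javascript
// Group sensor docs by the date part of their GMT timestamp and average the
// values, mirroring the $group / $avg stage of the Mongo aggregation.
function dailyAverages(docs) {
  const groups = {};
  for (const d of docs) {
    // substr(0, 11) keeps "2015-07-29T"; pad to the first timestamp of the day.
    const day = d.time.substr(0, 11) + '00:00:00.000Z';
    if (!groups[day]) groups[day] = { time: day, pin: d.pin, sum: 0, n: 0 };
    groups[day].sum += d.value;
    groups[day].n += 1;
  }
  return Object.keys(groups).map((k) => ({
    time: groups[k].time,
    pin: groups[k].pin, // same for every doc in a group, so "first" is safe
    avg: groups[k].sum / groups[k].n,
  }));
}

const sample = [
  { time: '2015-07-29T01:00:00.000Z', pin: 'A1', value: 400 },
  { time: '2015-07-29T02:00:00.000Z', pin: 'A1', value: 420 },
  { time: '2015-07-30T01:00:00.000Z', pin: 'A1', value: 500 },
];
console.log(dailyAverages(sample));
// Two groups: 2015-07-29 averages to 410, 2015-07-30 to 500.
```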

Pro tip: if you want to see well-formatted (pretty-printed) JSON data in your browser, install the Chrome extension JSONView.

I am not going to go into much detail on using MEAN.io or D3.js since they are both well documented. Here is the live demo of my D3.js graphs; you can hit them and see that they update every five minutes. In lieu of a code discussion, the code can be found on GitHub.


This article represents a very simple, clean microservices approach. Each of the components can be swapped out and replaced without disrupting the application. The solution offers the flexibility of both MongoDB and D3.js, which are a great combination. If you have not played with Heroku, it is definitely worth investing your time in; once you get past the learning curve, the ease of deployment and environment cloning is unsurpassed. For recipe 2 we will use the same agent but post the data to Treasure Data, a hosted Hadoop solution, and aggregate it up to Salesforce.com.