Get sensor readings in CSV format and upload to Azure

Change the C code that does the sensor readings to write CSV files

  1. The original code from https://github.com/alexh-name/bsec_bme680_linux prints the sensor readings to stdout (the screen). Replace the file from that repo with my version here. The tweaks I made were:
  • modified the output_ready() function to generate files named YYYYMMDD-HHMMSS.csv in a data folder, each with a header row and a row with the sensor readings;
  • at the top a temp_offset constant is defined with value 5.0f. This offset is applied to the temperature measurements (which are in Celsius) to compensate for the heat generated by the Pi/power supply/etc. I changed this value to 4.0f as I believe my sensor is better isolated -- do adjust it to your own situation;
  • also at the top of the file there's an int variable named i2c_address, with default value BME680_I2C_ADDR_PRIMARY (corresponding to I2C address 0x76). In my case the sensor is at address 0x77, so I needed to change this to BME680_I2C_ADDR_SECONDARY;
  • changed the call to bsec_iot_loop() near the end to checkpoint the internal state of the sensor every 4 hours (i.e., changed the last parameter from 10000 to 4800 samples; at one reading every 3 seconds, 4800 samples is 4 hours).
  2. After this, compile the program again by calling ./make.sh as before.

  3. Create a data folder as a subfolder of the directory from which you'll be running the compiled bsec_bme680 (or you'll get a friendly Segmentation fault when you run it).

  4. Now type ./bsec_bme680 & to run the application in the background. If you list the contents of the data folder, you'll start seeing new CSV files being generated, one every 3 seconds. You'll have to run this manually every time you reboot/restart your Raspberry Pi, as I haven't configured auto-start yet (TBD). Here's an example of the content of one of these files (a quick sanity check of the generated files is sketched below the example):

Example reading
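
If you want to confirm that the capture loop behaves as described above, here's a minimal sketch (not part of the original repo) that scans the data folder and checks that each CSV file contains exactly a header row and one reading row. The folder path and file-name pattern match the description above; everything else is illustrative.

```python
#!/usr/bin/env python3
"""Sanity-check the CSV files written by bsec_bme680 (illustrative sketch)."""
import csv
import glob
import os

DATA_DIR = "./data"  # the data subfolder created earlier

# Each file is named YYYYMMDD-HHMMSS.csv and should hold a header row plus one reading row.
for path in sorted(glob.glob(os.path.join(DATA_DIR, "[0-9]*.csv"))):
    with open(path, newline="") as f:
        rows = list(csv.reader(f))
    if len(rows) == 2 and len(rows[0]) == len(rows[1]):
        print("{0}: OK ({1} columns)".format(os.path.basename(path), len(rows[0])))
    else:
        print("{0}: unexpected layout ({1} rows)".format(os.path.basename(path), len(rows)))
```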

Upload the CSV files to an Azure IoT Hub

I decided to separate the capture of the sensor readings from the uploading to Azure for a couple of reasons:

  • If there's any networking issue, you don't run the risk of losing readings. Additionally, while the time to push a reading to the cloud is non-deterministic, writing to a local file is much more predictable, so you can keep a regular reading every 3 seconds;
  • It's much simpler to install/use the IoT Hub Client SDK for Python than it is to figure out how the CMake-based compilation process of the IoT Hub Client SDK for C works. Hence, I'll be using Python for the upload stage.

The steps to follow are:

  1. Create a file on the Pi called scoop_up_data.py with the content you find here. This code uses version 1.4.4 of the Azure IoT Hub Device Client SDK (which you installed previously); a rough sketch of what the script does is shown after the example output below.
  2. Edit the file to change the value of the iothub_connstring variable. This is a string looking like "HostName=NAME_OF_YOUR_IOTHUB.azure-devices.net;DeviceId=NAME_OF_YOUR_DEVICE_IN_THE_IOTHUB;SharedAccessKey=LOTS_OF_ALPHANUMERIC_CHARACTERS", which you can obtain from the Azure portal.
  3. To do a test run, call python3 scoop_up_data.py ./data/. This will upload all your already-captured CSV files to the Azure IoT Hub, in chronological order, and print out something like the following as it uploads them:
pi@rpi0:~/bsec_bme680_linux $ python3 scoop_up_data.py ./data/
Starting iothub client
Reading files from /home/pi/bsec_bme680_linux/data:
1 - /home/pi/bsec_bme680_linux/data/20200131-233324.csv
2 - /home/pi/bsec_bme680_linux/data/20200131-233327.csv
...
378 - /home/pi/bsec_bme680_linux/data/20200131-235215.csv
Files uploaded: 378
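
For reference, here's a minimal sketch of what such an upload script could look like with version 1.4.4 of the SDK, using the upload_blob_async call from the iothub_client module. This is not the actual scoop_up_data.py: the variable names, the HTTP transport, and the choice of blob upload (which requires a storage account associated with your IoT Hub) are illustrative assumptions.

```python
#!/usr/bin/env python3
"""Illustrative sketch of a CSV upload script -- not the actual scoop_up_data.py."""
import glob
import os
import sys

from iothub_client import IoTHubClient, IoTHubTransportProvider

# Placeholder -- use your own device connection string from the Azure portal.
iothub_connstring = "HostName=NAME_OF_YOUR_IOTHUB.azure-devices.net;DeviceId=YOUR_DEVICE;SharedAccessKey=..."


def upload_confirmation(result, user_context):
    # Called by the SDK when the blob upload finishes.
    print("Upload of {0} finished with result {1}".format(user_context, result))


def main(data_dir):
    print("Starting iothub client")
    client = IoTHubClient(iothub_connstring, IoTHubTransportProvider.HTTP)

    data_dir = os.path.abspath(data_dir)
    print("Reading files from {0}:".format(data_dir))
    # Pick up only files that haven't been uploaded (and renamed) yet, in chronological order.
    files = sorted(glob.glob(os.path.join(data_dir, "[0-9]*.csv")))
    for i, path in enumerate(files, start=1):
        print("{0} - {1}".format(i, path))
        with open(path) as f:
            content = f.read()
        # Upload the file contents to the storage account linked to the IoT Hub.
        client.upload_blob_async(os.path.basename(path), content, len(content),
                                 upload_confirmation, path)
        # Mark the file as done so the cleanup cron job can remove it later.
        os.rename(path, os.path.join(data_dir, "uploaded_" + os.path.basename(path)))
    print("Files uploaded: {0}".format(len(files)))


if __name__ == "__main__":
    main(sys.argv[1] if len(sys.argv) > 1 else "./data/")
```

If your IoT Hub doesn't have a storage account linked for file uploads, sending each file's contents as a device-to-cloud message instead (IoTHubMessage plus send_event_async) would be the alternative route.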

  4. The files in the data folder are renamed one by one immediately after being sent up, with an uploaded_ prefix added. E.g., 20200131-235215.csv becomes uploaded_20200131-235215.csv. You'll need to clean up these files later.
  5. Now that this process has been tested, you need to run the Python script on a schedule with a cron job. To do this, run crontab -e in your Linux shell, pick a text editor if asked (nano may be the simplest one), and add the following at the end of the file:

* * * * * /usr/bin/python3 /home/pi/bsec_bme680_linux/scoop_up_data.py /home/pi/bsec_bme680_linux/data/

When you save and exit, the command above will be executed every minute and will upload the new readings (typically 20 files at a time, since a reading is recorded every 3 seconds and 60/3 = 20).

  6. Clean up the uploaded files. As above, run crontab -e and add this to the end:

*/2 * * * * rm /home/pi/bsec_bme680_linux/data/uploaded*.csv

This will remove the uploaded files every two minutes. See here for more detail on the crontab string: https://crontab.guru/every-2-minutes .

Your crontab will thus contain the following:
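
* * * * * /usr/bin/python3 /home/pi/bsec_bme680_linux/scoop_up_data.py /home/pi/bsec_bme680_linux/data/
*/2 * * * * rm /home/pi/bsec_bme680_linux/data/uploaded*.csv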

To be done

Figure out how to avoid race conditions in the cron jobs (e.g., a new run of scoop_up_data.py starting while the previous one is still uploading), as per here: https://www.cyberciti.biz/faq/how-to-run-cron-job-every-minute-on-linuxunix/ . This is a detail.
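
A likely fix, not yet tested here, is to wrap the command in flock so that a new run is skipped while the previous one still holds the lock (the lock-file path is arbitrary):

* * * * * /usr/bin/flock -n /tmp/scoop_up_data.lock /usr/bin/python3 /home/pi/bsec_bme680_linux/scoop_up_data.py /home/pi/bsec_bme680_linux/data/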