I’ve lived in 100-plus-year-old brick homes for most of my life. They look good, they’re comfortable, and often they aren’t too expensive. However, humidity is high in the winter in my climate, and mold is a recurring problem. A desktop thermometer that displays relative humidity is useful for measuring it, but it doesn’t provide continuous monitoring.
In comes the Raspberry Pi: it’s small, cheap, and has many sensor options, including temperature and relative humidity. It can collect data around the clock, do some alerting, and forward data for analysis.
Recently, I participated in an experiment by miniNodes to collect and process environmental data on an all-Arm network of computers. One of the network’s nodes was a Raspberry Pi that collected environmental data above my desk. Once the project was over, I was allowed to keep the hardware and play with it. This became my winter vacation project. Learning Python or Elasticsearch just to know more about them is boring. Having a practical project that uses these technologies is not only useful but also makes learning fun.
Originally, I planned to use only these two technologies. Unfortunately, my good old Arm “server,” an OverDrive 1000 developer machine, and my Xeon server are too loud for continuous use above my desk. I turn them on only when I need them, which means some kind of buffering is necessary while the servers are offline. Implementing buffering for Elasticsearch as a beginner Python coder looked a bit difficult. Luckily, I know a tool that can buffer data and send it to Elasticsearch: syslog-ng.
A note about licensing
Elastic, the maintainer of Elasticsearch, has recently changed the project’s license from the Apache License, an extremely permissive license approved by the Open Source Initiative, to a more restrictive license “to protect our products and brand from abuse.” The term “abuse” in this context refers to the tendency of companies to take Elasticsearch and Kibana and provide them to customers directly as a service without collaborating with Elastic or the Elastic community (a common critique of permissive licenses). It’s still unclear how this affects users, but it’s an important discussion for the open source community to have, especially as cloud services become more and more common.
To keep your project open source, use Elasticsearch version 7.10 under the Apache License.
Configure data collection
For data collection, I have a Raspberry Pi Model 3B+ with the latest Raspberry Pi OS version and a set of sensors from SparkFun connected to a Qwiic pHAT add-on board (this board has been discontinued, but newer boards provide the same functionality). Since monitoring GPS doesn’t make much sense with a fixed location and there’s no lightning to detect during the winter, I connected only the environmental sensor. You can collect data from the sensor using Python scripts available on GitHub.
Install the Python module locally as a user:
pip3 install sparkfun-qwiic-bme280
There are three example scripts you can use to check data collection. You can download them using your browser or Git:
git clone https://github.com/sparkfun/Qwiic_BME280_Py/
When you start the script, it prints data in a nice, human-readable format:
pi@raspberrypi:~/Documents/Qwiic_BME280_Py/examples $ python3 qwiic_bme280_ex1.py
SparkFun BME280 Sensor Example 1
Ending Example 1
I’m from Europe, so the default temperature data didn’t make much sense to me. Luckily, you can easily rewrite the code to use the metric system: just replace temperature_fahrenheit with temperature_celsius. Pressure and altitude showed some crazy values, even when I changed to the metric system, but I didn’t debug them. The humidity and temperature values were quite close to what I expected (based on my desktop thermometer).
Once I verified that the connected sensors work as expected, I started to develop my own code. It is quite simple. First, I made sure that it printed values every second to the terminal, then I added syslog support:
import qwiic_bme280
import sys
import syslog
import time

# initialize sensor
sensor = qwiic_bme280.QwiicBme280()
if sensor.connected == False:
    print("Sensor not connected. Exiting")
    sys.exit(1)
sensor.begin()

while True:
    # collect and log time, humidity, and temperature
    t = time.localtime()
    current_time = time.strftime("%H:%M:%S", t)
    current_humidity = sensor.humidity
    current_temperature = sensor.temperature_celsius
    print("time={} humidity={} temperature={}".format(current_time, current_humidity, current_temperature))
    message = "humidity=" + str(current_humidity) + " temperature=" + str(current_temperature)
    syslog.syslog(message)
    time.sleep(1)
As I start the Python script using the screen utility, I also print data to the terminal. Check that the collected data arrives in syslog-ng using the tail command:
pi@raspberrypi:~ $ tail -3 /var/log/messages
Jan 5 12:11:24 raspberrypi sensor2syslog_v2.py: humidity=58.294921875 temperature=21.4
Jan 5 12:11:25 raspberrypi sensor2syslog_v2.py: humidity=58.294921875 temperature=21.4
Jan 5 12:11:26 raspberrypi sensor2syslog_v2.py: humidity=58.294921875 temperature=21.39
The 1GB of RAM in my Pi 3B+ is way too little to run Elasticsearch and Kibana, so I host them on a second machine. Installing Elasticsearch and Kibana is different on every platform, so I can’t cover that here. What I will cover is mapping. By default, syslog-ng sends all data as text. If you want to prepare nice graphs in Kibana, you need the temperature and humidity values as floating-point numbers.
You have to set up the mapping before sending data from syslog-ng. The syslog-ng configuration expects that the sensors index uses this mapping:
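A minimal sketch of such a mapping, assuming the index is named sensors and the parsed fields arrive as sensors.temperature and sensors.humidity (adjust both to match your setup), declares the two values as floats:

```json
{
  "mappings": {
    "properties": {
      "sensors.temperature": { "type": "float" },
      "sensors.humidity": { "type": "float" }
    }
  }
}
```

You can create the index with this mapping before the first log arrives, for example by sending it with curl -XPUT to the Elasticsearch server.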
Elasticsearch is now ready to collect data from syslog-ng.
Install and configure syslog-ng
Version 3.19 of syslog-ng is included in Raspberry Pi OS, but it doesn’t yet have Elasticsearch support. Therefore, I installed the latest version of syslog-ng from an unofficial repository. First, I added the repository key:
wget -qO - https://download.opensuse.org/repositories/home:/laszlo_budai:/syslog-ng/Raspbian_10/Release.key | sudo apt-key add -
Then I added the following line to /etc/apt/sources.list.d/sng.list:
deb https://download.opensuse.org/repositories/home:/laszlo_budai:/syslog-ng/Raspbian_10/ ./
Finally, I updated the repositories and installed the required syslog-ng packages (which also removed rsyslog from the system):
apt-get update
apt-get install syslog-ng-mod-json syslog-ng-mod-http
There are many other syslog-ng subpackages, but only these two are needed to forward the sensor logs to Elasticsearch.
Syslog-ng’s main configuration file is /etc/syslog-ng/syslog-ng.conf, and you do not need to modify it. You can extend the configuration by creating new text files with a .conf extension under the /etc/syslog-ng/conf.d directory.
I created a file called sens2elastic.conf with the following content:
filter f_sensors { program("sensor2syslog_v2.py"); };
parser p_kv { kv-parser(prefix("sensors.")); };
destination d_sensors {
    file("/var/log/sensors" template("$(format-json --scope dot-nv-pairs)\n"));
    # adjust the URL and index name to your environment
    elasticsearch-http(url("http://localhost:9200/_bulk") index("sensors") type("")
        disk-buffer(disk-buf-size(1073741824) reliable(yes) dir("/tmp/disk-buffer")));
};
log { source(s_src); filter(f_sensors); parser(p_kv); destination(d_sensors); };
If you are new to syslog-ng, read my article about syslog-ng’s building blocks to learn about syslog-ng’s configuration. The configuration snippet above shows some of the possible building blocks, apart from the source, as you can use the local log source defined in syslog-ng.conf.
The first line is a filter: it matches the program name. Mine is sensor2syslog_v2.py. Make sure this value is the same as the name of your Python script.
The second line is a key-value parser. By default, syslog-ng treats the message part of incoming log messages as plain text. Using this parser, you can create name-value pairs inside syslog-ng from data in the log messages, which you can use later when sending logs to Elasticsearch.
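To get a feel for what the kv-parser produces, here is a rough Python sketch of the idea (only an illustration of the parsing behavior, not how syslog-ng implements it):

```python
def parse_kv(message, prefix="sensors."):
    """Split a "key=value key=value" message into name-value pairs,
    roughly mimicking syslog-ng's kv-parser with a prefix() option."""
    pairs = {}
    for token in message.split():
        if "=" in token:
            key, value = token.split("=", 1)
            pairs[prefix + key] = value
    return pairs

print(parse_kv("humidity=58.294921875 temperature=21.4"))
# {'sensors.humidity': '58.294921875', 'sensors.temperature': '21.4'}
```

The real parser also handles quoting and custom separators; see the syslog-ng documentation for the details.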
The next block is a bit larger. It is a destination containing two different destination drivers. The first driver saves logs to a local file in JSON format; I use this for debugging. The second driver is the Elasticsearch destination. Make sure that the index name and the URL match your environment. Using a large disk buffer, you can ensure you don’t lose any data even when your Elasticsearch server is offline for days.
The last block is a bit different. It is the log statement, the part of the configuration that connects the above building blocks together. The name of the source comes from the main configuration.
Save the configuration and create the /tmp/disk-buffer/ directory. Restart syslog-ng to make the configuration live:
systemctl restart syslog-ng
Test the system
The next step is to test the system. Elasticsearch is already running and waiting to receive data, and syslog-ng is configured to forward data to Elasticsearch. So, start the script to make sure data is actually collected.
For a quick test, you can start it in a terminal window. For continuous data collection, I recommend starting it from the screen utility so that it keeps running even after you disconnect from the machine. Of course, this is not fail-safe, as it will not start “automagically” on a reboot. If you want to collect data 24/7, create an init script or a systemd service file for it.
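A minimal systemd service file is enough for that. This is only a sketch: the script path and user name are assumptions, so adjust them to your setup. Save it as /etc/systemd/system/sensor2syslog.service:

```ini
[Unit]
Description=BME280 sensor-to-syslog data collector
After=network.target

[Service]
ExecStart=/usr/bin/python3 /home/pi/sensor2syslog_v2.py
User=pi
Restart=on-failure

[Install]
WantedBy=multi-user.target
```

Enable and start it with systemctl enable --now sensor2syslog.service.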
Check that logs arrive in the /var/log/sensors file. If it’s not empty, the filter is working as expected. Next, open Kibana. I can’t give exact directions here, as the menu structure seems to change with every release. Create an index pattern for Kibana from the sensors index, then switch to Kibana’s Discover mode and select the freshly defined index. You should already see incoming temperature and humidity data on the screen.
You are now ready to visualize the data. I used Kibana’s new Lens mode to visualize the temperature and humidity values. While it’s not very flexible, it’s definitely easier to handle than the other visualization tools in Kibana. This diagram shows the data I collected, including how the values change when I ventilate my room with fresh, cold air by opening my windows.
What have I learned?
My original goal was to monitor my home’s relative humidity while brushing up on my Python and Elasticsearch skills. Even staying at a basic level, I now feel more comfortable working with Python and Elasticsearch.
Best of all: not only did I practice these tools, but I also learned about relative humidity from the graphs. Previously, I usually ventilated my home by opening the windows for only one or two minutes. The Kibana graphs showed that humidity went back to the original levels quite quickly after I shut the windows. When I opened the windows for five to ten minutes instead, humidity stayed low for many hours.
The more adventurous can use a Raspberry Pi and sensors not just to monitor but also to control their homes. I configured everything from the ground up, but there are ready-to-use tools available, such as Home Assistant. You can also configure alerting in syslog-ng to do things like send an alert to your Slack channel if the temperature drops below a set level. There are many sensors available for the Raspberry Pi, so there are plenty of possibilities on both the software and hardware side.