PowerTXT alerts via Twilio and Node-RED into Zabbix (Part 2)
The project (see here) has moved from proof of concept to being live. The original Node-RED flow has been updated to delete the original Twilio message after it has been passed on to Zabbix. A new flow has been added to accept requests from Zabbix to send PowerTXT commands via Twilio, and a small flow has been created to allow Zabbix to check that Node-RED is up and running.
The original inbound SMS Node-RED flow now has extra steps (at the bottom) which check whether the message was sent to Zabbix successfully and, if so, build the command to send via the Twilio API to delete the original message from Twilio.
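As a rough sketch of that delete step (written here as a plain function rather than a live Function node; the credential and field names are assumptions, not the actual flow's config), the request for Twilio's message-delete endpoint could be built like this:

```javascript
// Build a DELETE request for the Twilio Messages resource.
// In Node-RED this would sit in a Function node feeding an http request
// node; accountSid/authToken are placeholders for real credentials.
function buildDeleteRequest(msg, accountSid, authToken) {
    const messageSid = msg.payload.MessageSid; // SID of the inbound SMS
    msg.method = "DELETE";
    msg.url = "https://api.twilio.com/2010-04-01/Accounts/" +
              accountSid + "/Messages/" + messageSid + ".json";
    // The Twilio REST API uses HTTP Basic auth (AccountSid:AuthToken)
    msg.headers = {
        Authorization: "Basic " +
            Buffer.from(accountSid + ":" + authToken).toString("base64")
    };
    return msg;
}
```

A successful delete returns HTTP 204 with no body, so the flow only needs to check the status code of the reply.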
A new Node-RED flow has been added to accept requests from Zabbix and then send PowerTXT commands. In Zabbix, the requests are created by a Zabbix script that uses wget to send the request, which includes the Zabbix hostname (in the form sms_44nnnnnnnnn, where nnnnnnnnn is the mobile number without the leading zero) and a command to perform (register, on, off or query). The flow verifies that the request originated from localhost and checks that the hostname and command parameters have been supplied. The symbolic command names are then translated into PowerTXT commands (e.g. #07# for query) and passed into a step that builds the request for Twilio. The request is sent and the reply from Twilio is examined to see whether the message was sent; the result of this test is used to pass a status back to the Zabbix script that originally requested the message.
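The hostname-to-number and command translation could be sketched as follows. Only the #07# query code comes from the write-up above; the other PowerTXT codes here are placeholders and would need checking against the device manual:

```javascript
// Symbolic command -> PowerTXT SMS command translation.
// Only "#07#" (query) is from the article; the rest are placeholders.
const COMMANDS = {
    query:    "#07#",  // status query (from the article)
    register: "#00#",  // placeholder code
    on:       "#01#",  // placeholder code
    off:      "#02#"   // placeholder code
};

function translateCommand(name) {
    const code = COMMANDS[name];
    if (!code) throw new Error("Unknown command: " + name);
    return code;
}

// Derive the destination mobile number from the Zabbix hostname
// convention used above: sms_44nnnnnnnnn -> +44nnnnnnnnn
function hostnameToNumber(hostname) {
    const m = /^sms_(\d+)$/.exec(hostname);
    if (!m) throw new Error("Bad hostname: " + hostname);
    return "+" + m[1];
}
```

The number and translated command can then be dropped straight into the Twilio "send message" request that the next step builds.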
Finally, a short Node-RED flow was added so that Zabbix can send a web request that is answered with "OK", allowing Zabbix to check that Node-RED is running. This flow also checks that the request originated from localhost.
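That health check is small enough to show in full as a sketch (field names assumed; in Node-RED this logic sits in a Function node between an http in node and an http response node):

```javascript
// Answer Zabbix's "is Node-RED up?" probe, but only from localhost.
function healthCheck(msg) {
    const remote = msg.req.connection.remoteAddress;
    const local = ["127.0.0.1", "::1", "::ffff:127.0.0.1"];
    if (local.indexOf(remote) === -1) {
        msg.statusCode = 403;     // not localhost: refuse
        msg.payload = "Forbidden";
    } else {
        msg.statusCode = 200;     // Zabbix just looks for "OK"
        msg.payload = "OK";
    }
    return msg;
}
```

On the Zabbix side this pairs with a simple web-scenario or HTTP-agent item that expects the string "OK" in the response body.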
PowerTXT alerts via Twilio and Node-RED into Zabbix
A proof of concept for a client operating 56-60 networks around the UK. Each network has its own ADSL/FTTC/Cable/Fibre backhaul Internet connection and network manager server. The power to the backhaul equipment is monitored and controlled via a TekView PowerTXT device, which sends SMS messages for power/temperature events and accepts commands via SMS to control the power and configure the device. Their overall network is monitored via Zabbix, which needed to be aware of power outage events.
The approach used was to set the PowerTXT device to send its status text messages via the mobile network to a mobile number created on Twilio. The Twilio number is configured to send the SMS, wrapped in an HTTP request, to Node-RED. A Node-RED flow takes the HTTP request, extracts the "From" mobile number and the text of the message, and constructs a zabbix_sender command to pass the message to Zabbix, where it is stored as "Zabbix trapper" data. The Node-RED flow also creates the response to the original HTTPS request from Twilio, so Twilio knows the request has been processed and there is no reply SMS to send. There's also a bit of debug logging to catch what's going on.
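The extract-and-forward step might look something like this as a sketch (the Zabbix server address and item key here are illustrative assumptions, not the live configuration):

```javascript
// Sketch of the inbound-SMS Function node: pull the From number and
// Body out of the Twilio webhook payload and build a zabbix_sender
// command line for an exec node to run.
function buildZabbixSender(msg) {
    const from = msg.payload.From;  // e.g. "+447700900123"
    const body = msg.payload.Body;  // the SMS text from the PowerTXT
    // Zabbix hostname convention from the article: sms_44nnnnnnnnn
    const host = "sms_" + from.replace(/^\+/, "");
    // "powertxt.message" is an assumed trapper item key; quoting here
    // is minimal and for illustration only.
    msg.payload = 'zabbix_sender -z 127.0.0.1 -s "' + host +
                  '" -k powertxt.message -o "' +
                  body.replace(/"/g, '\\"') + '"';
    return msg;
}
```

The reply to Twilio is then just an empty TwiML document (`<Response></Response>`), which tells Twilio the webhook was handled and no reply SMS is wanted.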
The next stage will be to extend the Node-RED flow so that it deletes the original SMS from Twilio after it has been processed, and to parse the body of the SMS from the PowerTXT to detect power OFF and ON events to pass to Zabbix.
(Part 2 is here)
Parsing TFL Tube Line Status with Pentaho
In a previous article I wrote about using TfL Tube line status via the old API (rather than the new Unified API), formatting and storing the data in Zabbix to be displayed on a Grafana dashboard. In this article, I'm using the new TfL Unified API to grab TfL Tube line status as a JSON structure and parsing that data using Pentaho to produce a simple CSV file with a date/time, a Tube line name and the status of that line.
Having signed up for access to the new TfL Unified API and had a look at the example RESTful web calls for data, I fired up an example query to get the TfL Tube line statuses.
The first element of the JSON array returned looks like this (most fields omitted here);

    {
        "$type": "Tfl.Api.Presentation.Entities.Line, Tfl.Api.Presentation.Entities",
        "name": "Bakerloo",
        ...
        "lineStatuses": [
            {
                "$type": "Tfl.Api.Presentation.Entities.LineStatus, Tfl.Api.Presentation.Entities",
                "statusSeverityDescription": "Good Service",
                ...
            }
        ],
        "serviceTypes": [
            {
                "$type": "Tfl.Api.Presentation.Entities.LineServiceTypeInfo, Tfl.Api.Presentation.Entities",
                ...
            }
        ],
        "crowding": {
            "$type": "Tfl.Api.Presentation.Entities.Crowding, Tfl.Api.Presentation.Entities"
        }
    }
However, all I wanted was the TfL Tube line name ("name": "Bakerloo") and the status ("statusSeverityDescription": "Good Service") shown above. The following Pentaho transformation does the work required.
The step "Generate Rows" is just used to set the API URL to be used by the "HTTP Client" step which send the RESTful API request to the TRL Servers. The returned JSON data is passed to the "JSON Input" step which parses out the Line Name and Line Status data. The "Get System Info" step just added the current Date/Time that is sent with the parsed data to the "Text file output" step that write the CSV file.
The only "tricky" bit is parsing the returned JSON structure. This uses JSONPath definitions to tell the JSON parser step where to find the data in the JSON structure.
For the line name the JSONPath was "$.[*].name", which instructs the JSON parser to get the "name" from each top-level array element. For the line status the JSONPath was "$.[*].lineStatuses.statusSeverityDescription", which instructs the JSON parser to get the "statusSeverityDescription" from the first element of the "lineStatuses" array within each top-level array element.
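For anyone without Pentaho to hand, the same extraction can be sketched in plain JavaScript against the trimmed response shape shown earlier (assuming each top-level element has a `name` and at least one `lineStatuses` entry):

```javascript
// Equivalent of the two JSONPath expressions, done directly on the
// parsed TfL response array.
function extractLineStatuses(lines) {
    return lines.map(line => ({
        name: line.name,                                       // $.[*].name
        status: line.lineStatuses[0].statusSeverityDescription // first lineStatuses element
    }));
}
```

Each resulting object maps one-to-one onto a row of the CSV output below, with the date/time column added separately.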
The results being...
2018/06/22 15:12:41,"Bakerloo","Good Service"
2018/06/22 15:12:41,"Central","Good Service"
2018/06/22 15:12:41,"Circle","Good Service"
2018/06/22 15:12:41,"District","Good Service"
2018/06/22 15:12:41,"Hammersmith & City","Good Service"
2018/06/22 15:12:41,"Jubilee","Good Service"
2018/06/22 15:12:41,"Metropolitan","Good Service"
2018/06/22 15:12:41,"Northern","Good Service"
2018/06/22 15:12:41,"Piccadilly","Good Service"
2018/06/22 15:12:41,"Victoria","Good Service"
2018/06/22 15:12:41,"Waterloo & City","Good Service"