sigfox-gcloud adapter for writing Sigfox messages into databases
npm install sigfox-gcloud-data

sigfox-gcloud-data is a sigfox-gcloud
adapter for writing Sigfox messages into SQL databases like MySQL and Postgres.
You may read and update Sigfox messages with other modules (such as sigfox-gcloud-ubidots)
before passing to sigfox-gcloud-data for writing to the database.
sigfox-gcloud-data works with most SQL databases supported by
Knex.js,
like MySQL, Postgres, MSSQL, MariaDB and Oracle.
sigfox-gcloud-data was built with sigfox-gcloud, an open-source software framework for building a
Sigfox server with Google Cloud Functions and Google Cloud PubSub
message queues. Check out sigfox-gcloud for details.
_sigfox-gcloud-data with MySQL:_
_sigfox-gcloud-data with Postgres:_
- Version 1.0.1 (14 Oct 2017): Supports multiple instances
For development we support Linux, MacOS and Ubuntu on Windows 10.
Open a command prompt and enter these commands to download the sigfox-gcloud-data source folder to your computer.
```bash
git clone https://github.com/UnaBiz/sigfox-gcloud-data.git
cd sigfox-gcloud-data
```
If you're using Ubuntu on Windows 10, we recommend that you launch "Bash on Ubuntu on Windows" and enter the following
commands to download the source files into the folder /mnt/c/sigfox-gcloud-data:
```bash
cd /mnt/c
git clone https://github.com/UnaBiz/sigfox-gcloud-data.git
cd sigfox-gcloud-data
```
That's because /mnt/c/sigfox-gcloud-data under bash is a shortcut to c:\sigfox-gcloud-data under Windows.
So you could use Windows Explorer and other Windows tools to browse and edit files in the folder.
Remember to use a text editor like Visual Studio Code that can save files using
the Linux line-ending convention (linefeed only: \n)
instead of the Windows convention (carriage return + linefeed: \r\n).
1. Install sigfox-gcloud with the base modules (exclude optional modules):
https://github.com/UnaBiz/sigfox-gcloud/blob/master/README.md
1. Open a bash command prompt. For Windows, open "Bash on Ubuntu on Windows."
Create a file named .env in the sigfox-gcloud-data folder
and populate the GCLOUD_PROJECT variable with your project ID.
To do that, you may use this command (change myproject to your project ID):
```bash
cd sigfox-gcloud-data
echo GCLOUD_PROJECT=myproject >.env
```
1. Add the following sigfox-route setting to the Google Cloud Project Metadata store.
This route says that all received Sigfox messages will be processed by the
two steps decodeStructuredMessage and sendToDatabase.
```bash
gcloud compute project-info add-metadata --metadata=^:^sigfox-route=decodeStructuredMessage,sendToDatabase
```
If you're using sigfox-gcloud-ubidots, the sendToDatabase step should appear
last so that the updates from sendToUbidots will be recorded in the database.
```bash
gcloud compute project-info add-metadata --metadata=^:^sigfox-route=decodeStructuredMessage,sendToUbidots,sendToDatabase
```
1. Create the Google PubSub message queue that we will use to route the
Sigfox messages between the Cloud Functions:
```bash
gcloud beta pubsub topics create sigfox.types.sendToDatabase
```
sigfox.types.sendToDatabase
is the queue that will receive decoded Sigfox messages
to be sent to the database via the database API
1. Go to the Google Cloud Metadata screen to define your database settings:
https://console.cloud.google.com/compute/metadata

- sigfox-dbclient: Database client library to be used, e.g. mysql, pg.
Check this page for the library: http://knexjs.org/#Installation-node
- sigfox-dbhost: Address of the database server, e.g. 127.127.127.127
- sigfox-dbuser: User ID for accessing the database, e.g. user
- sigfox-dbpassword: Password for accessing the database.
- sigfox-dbname: Name of the database that will store the sensor data. Defaults to sigfox
- sigfox-dbtable: Name of the table to store sensor data. Defaults to sensordata
- sigfox-dbversion: Version number of the database, used only by Postgres, e.g. 7.2
If the sigfox-dbtable table above does not exist, it will be created automatically.
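To see how these settings fit together, here is a hedged sketch of mapping the metadata values to a Knex.js configuration object. The function name and the handling of defaults are illustrative assumptions, not the actual sigfox-gcloud-data source:

```javascript
// Illustrative mapping from the Google Cloud Metadata settings above
// to a Knex.js configuration object (sketch only).
function knexConfigFromMetadata(meta) {
  return {
    client: meta['sigfox-dbclient'],            // e.g. 'mysql' or 'pg'
    connection: {
      host: meta['sigfox-dbhost'],              // e.g. '127.127.127.127'
      user: meta['sigfox-dbuser'],
      password: meta['sigfox-dbpassword'],
      database: meta['sigfox-dbname'] || 'sigfox',  // default per the list above
      version: meta['sigfox-dbversion'],        // used only by Postgres
    },
  };
}

const config = knexConfigFromMetadata({
  'sigfox-dbclient': 'pg',
  'sigfox-dbhost': '127.127.127.127',
  'sigfox-dbuser': 'user',
  'sigfox-dbpassword': 'secret',
});
console.log(config.connection.database); // → 'sigfox' (the default)
```

The resulting object is what you would pass to `require('knex')(config)` to open the connection.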
1. Install the database library if you are NOT using MySQL or Postgres.
Check this page for the library to be used:
http://knexjs.org/#Installation-node
Then run the command npm install LIBRARYNAME --save. For example, if you're using MSSQL, you would
run this command:
```bash
npm install mssql --save
```
1. Deploy the sendToDatabase Cloud Function with the deployall.sh script:
```bash
chmod +x scripts/deployall.sh
scripts/deployall.sh
```
1. Sigfox messages are pushed by the Sigfox Cloud to the Google Cloud Function
sigfoxCallback
1. Cloud Function sigfoxCallback delivers the message to PubSub message queue
sigfox.devices.all, as well as to the device ID and device type queues
1. Cloud Function routeMessage
listens to PubSub message queue sigfox.devices.all
and picks up the new message
1. Cloud Function routeMessage assigns a route to the
Sigfox message by reading the sigfox-route from the Google Compute Metadata Store.
The route looks like this:
```
decodeStructuredMessage, sendToDatabase
```
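Since the route is just a comma-separated string, splitting it into an ordered list of steps can be sketched as follows (illustrative only; the actual routeMessage implementation may differ):

```javascript
// Split the sigfox-route metadata value into an ordered list of
// processing steps, trimming any whitespace around the commas.
const route = 'decodeStructuredMessage, sendToDatabase';
const steps = route.split(',').map(step => step.trim());
console.log(steps); // → [ 'decodeStructuredMessage', 'sendToDatabase' ]
```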
1. This route first sends the message to function decodeStructuredMessage
via the queue sigfox.types.decodeStructuredMessage
1. decodeStructuredMessage contains the logic to decode a compressed message format that we call
Structured Message Format. Within a 12-byte Sigfox message, the Structured Message Format
can efficiently encode 3 sensor field values and their sensor field names.
For example, the encoded 12-byte message b0513801a421f0019405a500
contains 3 sensor values (temperature, humidity, altitude) and their field names:
tmp = 31.2, hmd = 49.6, alt = 16.5
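The decoding rules can be sketched in a few lines. This is a simplified reconstruction based only on the example above (each 4-byte field holds a 2-byte little-endian name of three 5-bit characters, 1 = 'a' ... 26 = 'z', followed by a 2-byte little-endian value scaled by 10), not the actual decodeStructuredMessage source:

```javascript
// Sketch of Structured Message decoding: 12-byte hex string in,
// three named sensor values out.
function decodeStructuredMessage(hex) {
  const result = {};
  for (let i = 0; i < hex.length; i += 8) {  // 8 hex chars = one 4-byte field
    // 2-byte little-endian field name, then 2-byte little-endian value.
    const name = parseInt(hex.slice(i + 2, i + 4) + hex.slice(i, i + 2), 16);
    const value = parseInt(hex.slice(i + 6, i + 8) + hex.slice(i + 4, i + 6), 16);
    // Unpack three 5-bit characters from the 16-bit name.
    const chars = [(name >> 10) & 31, (name >> 5) & 31, name & 31];
    const field = chars.map(c => String.fromCharCode(96 + c)).join('');
    result[field] = value / 10;  // values are stored scaled by 10
  }
  return result;
}

console.log(decodeStructuredMessage('b0513801a421f0019405a500'));
// → { tmp: 31.2, hmd: 49.6, alt: 16.5 }
```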
1. According to the sigfox-route above, the resulting decoded message is sent next to function sendToDatabase
via the queue sigfox.types.sendToDatabase
1. sendToDatabase appends the received Sigfox message to the sensordata table
that you have defined in the Google Cloud Metadata settings. It calls the
Knex.js library to update the database.
1. sendToDatabase automatically matches the received Sigfox message fields with the sensordata table fields.
So if your Sigfox message includes a new field (perhaps by decoding a Structured Message)
and the sensordata table also contains a field by that name, sendToDatabase
will write the new field into the table.
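The field-matching behaviour can be sketched as a pure function (illustrative only; the real sendToDatabase would obtain the column names from the database via Knex.js, e.g. with columnInfo(), before inserting):

```javascript
// Keep only the message fields that have a same-named column in the
// sensordata table; fields without a matching column are dropped.
function matchFields(msg, columnNames) {
  const record = {};
  for (const key of Object.keys(msg)) {
    if (columnNames.includes(key)) record[key] = msg[key];
  }
  return record;
}

// A message with a decoded field 'lig' and an unknown field 'xyz':
const record = matchFields(
  { device: '2C30EB', tmp: 29, lig: 760, xyz: 1 },
  ['device', 'tmp', 'lig', 'ctr']
);
console.log(record); // → { device: '2C30EB', tmp: 29, lig: 760 }
```

The resulting record would then be passed to a Knex insert such as `knex('sensordata').insert(record)`.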
1. See this doc for the definition of Structured Messages:
https://unabiz.github.io/unashield/
To understand how Structured Messages may be used with the Ubidots IoT platform, check the UnaShield Tutorial for Ubidots:
https://unabiz.github.io/unashield/ubidots
You may view the logs through the
Google Cloud Logging Console
Select "Cloud Function" as the "Resource"

From the screen above you can see the logs generated as each Sigfox message is processed in stages by sigfox-gcloud:
- Sigfox Device IDs are shown in square brackets e.g. [ 2C30EB ]
- Completed Steps are denoted by _<<_
- sigfoxCallback is the Google Cloud Function that listens for incoming HTTPS messages delivered by Sigfox
- routeMessage passes the Sigfox message to various Google Cloud Functions to decode and process the message
- decodeStructuredMessage decodes a compressed Sigfox message that contains multiple field names and field values
- sendToDatabase would appear after decodeStructuredMessage. sendToDatabase
writes the decoded sensor data to the database via the
Knex.js library.
The
Google Cloud Trace Console
shows you the time taken by each step of the Sigfox message processing pipeline, tracing the message through every Google Cloud Function.
Each message delivered by Sigfox appears as a separate trace timeline. Messages are shown like 2C30EB seq:1913,
where 2C30EB is the Sigfox Device ID and 1913 is the Sigfox Message Sequence Number (seqNumber)
The Google Stackdriver Trace API needs to be enabled manually.
Custom reports may be created in Google Cloud Trace Control to benchmark the performance of each processing step over time.
To understand each processing step in the sigfox-gcloud-data server, you may use the
Google Cloud Debug Console
to set breakpoints and capture in-memory variable values for each Google Cloud Function, without stopping or reconfiguring the server.
In the example below, we have set a breakpoint in the sigfoxCallback Google Cloud Function. The captured in-memory
values are displayed at right - you can see the Sigfox message that was received by the callback.
The Callback Stack appears at the lower right.
Google Cloud Debug is also useful for troubleshooting your custom message processing code without having to insert the
debugging code yourself.
1. Send some Sigfox messages from the Sigfox devices. Monitor the progress
of the processing through the
Google Cloud Logging Console.
Select "Cloud Function" as the "Resource"

1. Processing errors will be reported to the
Google Cloud Error Reporting Console.

1. We may configure
Google Cloud Stackdriver Monitoring
to create incident
reports upon detecting any errors. Stackdriver may also be used to
generate dashboards for monitoring the PubSub message processing queues.

1. To send messages from a Sigfox device into your database, you may use this Arduino sketch:
https://github.com/UnaBiz/unabiz-arduino/blob/master/examples/send-light-level/send-light-level.ino
The sketch sends 3 field names and field values, packed into a Structured Message:
```
ctr - message counter
lig - light level, based on the Grove analog light sensor
tmp - module temperature, based on the Sigfox module's embedded temperature sensor
```
1. Alternatively, you may test by sending a Sigfox message
from your Sigfox device with the data field set to:
```
920e82002731b01db0512201
```
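For reference, here is a sketch of how three fields could be packed into such a 12-byte message, mirroring the Structured Message decoding rules described earlier (an illustrative reconstruction based on this README's examples, not the library's actual encoder):

```javascript
// Pack up to 3 named sensor values into a Structured Message:
// per field, a 2-byte little-endian name (three 5-bit characters,
// 'a' = 1 ... 'z' = 26) then a 2-byte little-endian value × 10.
function encodeStructuredMessage(fields) {
  let hex = '';
  for (const [name, value] of Object.entries(fields)) {
    const n = name.split('')
      .reduce((acc, ch) => (acc << 5) | (ch.charCodeAt(0) - 96), 0);
    const v = Math.round(value * 10);
    for (const word of [n, v]) {  // emit each 16-bit word little-endian
      hex += (word & 0xff).toString(16).padStart(2, '0');
      hex += ((word >> 8) & 0xff).toString(16).padStart(2, '0');
    }
  }
  return hex;
}

console.log(encodeStructuredMessage({ ctr: 13, lig: 760, tmp: 29 }));
// → '920e82002731b01db0512201'
```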
We may also use a URL testing tool like Postman to send a POST request to the sigfoxCallback URL, e.g.
https://us-central1-myproject.cloudfunctions.net/sigfoxCallback
(change myproject to your Google Cloud Project ID)
Set the Content-Type header to application/json.
If you're using Postman, click Body -> Raw -> JSON (application/json)
Set the body to:
```json
{
"device":"1A2345",
"data":"920e82002731b01db0512201",
"time":"1476980426",
"duplicate":"false",
"snr":"18.86",
"station":"0000",
"avgSnr":"15.54",
"lat":"1",
"lng":"104",
"rssi":"-123.00",
"seqNumber":"1492",
"ack":"false",
"longPolling":"false"
}
```
where device is your Sigfox device ID.
Here's the request in Postman:

We may use the curl command as well. Remember to change myproject and 1A2345
to your project ID and device ID.
```bash
curl --request POST \
--url https://us-central1-myproject.cloudfunctions.net/sigfoxCallback \
--header 'cache-control: no-cache' \
--header 'content-type: application/json' \
--data '{"device":"1A2345", "data":"920e82002731b01db0512201", "time":"1476980426", "duplicate":"false", "snr":"18.86", "station":"0000", "avgSnr":"15.54", "lat":"1", "lng":"104", "rssi":"-123.00", "seqNumber":"1492", "ack":"false", "longPolling":"false"}'
```
1. The response from the callback function should look like this:
```json
{
"1A2345": {
"noData": true
}
}
```
1. The test message sent above will be decoded and written to your sensordata
table as
```
ctr (counter): 13
lig (light level): 760
tmp (temperature): 29
```
The other fields of the Sigfox message will be written as well.
It's possible to run 2 or more Cloud Functions that will update different databases.
The Cloud Functions should be named:
```
sendToDatabase, sendToDatabase2, sendToDatabase3, ...
```
and the configuration for each function should be set in the Google Cloud Metadata
screen as
```
sigfox-dbclient, sigfox-dbclient2, sigfox-dbclient3, ...
```
For example, this metadata screen defines 2 database settings for MySQL and Postgres:
To deploy the second instance of sendToDatabase, edit the script scripts/deploy.sh
and uncomment the second functiondeploy line so it looks like:
```bash
./scripts/functiondeploy.sh ${name}2 ${localpath} ${trigger} ${topic}
```
Run scripts/deploy.sh. This will deploy a new function sendToDatabase2 that uses the second database setting
in the Google Cloud Metadata screen.
To deploy sendToDatabase3, sendToDatabase4, ... you may edit scripts/deploy.sh accordingly:
```bash
./scripts/functiondeploy.sh ${name}3 ${localpath} ${trigger} ${topic}
./scripts/functiondeploy.sh ${name}4 ${localpath} ${trigger} ${topic}
```
Note that all instances of sendToDatabase will read Sigfox messages from the sigfox.types.sendToDatabase queue simultaneously.
The database updates will run in parallel.
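One plausible way for each numbered instance to locate its own settings is to append the instance suffix to every metadata key (illustrative only; check scripts/deploy.sh and the function source for the real mechanism):

```javascript
// Derive the metadata key for a given function instance: the trailing
// digits of the function name (if any) become the key suffix.
function metadataKey(base, functionName) {
  const suffix = (functionName.match(/\d+$/) || [''])[0];
  return base + suffix;
}

console.log(metadataKey('sigfox-dbclient', 'sendToDatabase2')); // → 'sigfox-dbclient2'
console.log(metadataKey('sigfox-dbclient', 'sendToDatabase'));  // → 'sigfox-dbclient'
```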