Moving existing MQTT IoT devices to work with Google Cloud


I'm trying to adapt two previously designed IoT devices to work with Google Cloud. Currently my devices publish their telemetry using custom MQTT topics like:

cloud to device:

/devices/device1/battery/cmnd/stat (gets the status)
/devices/device1/battery/cmnd/interval (value on payload)
/devices/device1/fan/cmnd/turn (value on payload)

...

device to cloud:

/devices/device1/heartbeat/tele
/devices/device1/wifi/ip/tele
/devices/device1/wifi/rssi/tele

...

I'm new to Google Cloud IoT and trying to understand how it works and how I should design the backend to connect my devices to it.

AFAIK, with GCP I lose the flexibility of custom MQTT topics and am limited to four predefined ones:

  • /devices/my-device/events/
  • /devices/my-device/commands/#
  • /devices/my-device/config/
  • /devices/my-device/state/

I'm trying to map my 'tele' topics to 'events' and use a Cloud Function to parse the JSON and insert the data into a Cloud SQL database.
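
To make that mapping concrete, this is roughly how I picture the device side changing: instead of the old custom topics, each 'tele' value gets published to a subfolder of the single events topic with a JSON payload. This is only a sketch based on my reading of the IoT Core MQTT bridge docs; the project/registry names are placeholders and create_jwt assumes the device's private key is in rsa_private.pem:

import datetime
import json
import ssl

import jwt  # pyjwt, used to sign the connection password
import paho.mqtt.client as mqtt

project_id = 'my-project'        # placeholders
cloud_region = 'europe-west1'
registry_id = 'my-registry'
device_id = 'device1'


def create_jwt(project_id, private_key_file='rsa_private.pem'):
    # Short-lived JWT signed with the device's private key; the audience
    # must be the project id (this is what the MQTT bridge expects).
    token = {
        'iat': datetime.datetime.utcnow(),
        'exp': datetime.datetime.utcnow() + datetime.timedelta(minutes=60),
        'aud': project_id,
    }
    with open(private_key_file, 'r') as f:
        private_key = f.read()
    return jwt.encode(token, private_key, algorithm='RS256')


client = mqtt.Client(
    client_id='projects/{}/locations/{}/registries/{}/devices/{}'.format(
        project_id, cloud_region, registry_id, device_id))
# The username is ignored by the bridge; the password must be the JWT.
client.username_pw_set(username='unused', password=create_jwt(project_id))
client.tls_set(ca_certs='roots.pem', tls_version=ssl.PROTOCOL_TLSv1_2)
client.connect('mqtt.googleapis.com', 8883)

# The old topic /devices/device1/wifi/rssi/tele becomes the 'wifi' subfolder
# under the predefined events topic, with the values in a JSON payload.
payload = json.dumps({'rssi': -67, 'ip': '192.168.1.20'})
client.publish('/devices/{}/events/wifi'.format(device_id), payload, qos=1)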

I created the topics on IoT Core like this:

(screenshot: the telemetry topics configured on the IoT Core registry)

and created a Cloud Function:

import base64
import os

import sqlalchemy

# Connection settings come from environment variables set on the function.
db_user = os.environ.get("DB_USER")
db_pass = os.environ.get("DB_PASS")
db_name = os.environ.get("DB_NAME")
cloud_sql_connection_name = os.environ.get("CLOUD_SQL_CONNECTION_NAME")

# Create the engine once at module load so connections are reused across
# invocations instead of being rebuilt on every message.
db = sqlalchemy.create_engine(
    sqlalchemy.engine.url.URL(
        drivername='mysql+pymysql',
        username=db_user,
        password=db_pass,
        database=db_name,
        query={
            'unix_socket': '/cloudsql/{}'.format(cloud_sql_connection_name)
        },
    ),
)


def device_telemetry_wifi_pubsub(event, context):
    # Background functions receive the Pub/Sub payload base64-encoded.
    pubsub_message = base64.b64decode(event['data']).decode('utf-8')

    stmt = sqlalchemy.text('INSERT INTO wifistr(wifistr) VALUES (:data)')
    try:
        with db.connect() as conn:
            conn.execute(stmt, data=pubsub_message)
        # No explicit ack is needed: a background function acks automatically
        # when it returns without raising.
    except Exception as e:
        print(e)

I'm using the 'projects/{my-project}/topics/wifi' topic as the trigger to insert the received JSON-encoded wifi telemetry into the database, and my plan is to repeat this process for the rest of the data sent by my devices... it works, but I'm not sure it's the best way to do what I'm trying to do in GCP.
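
For reference, the only wiring between the Pub/Sub topic and the function is the trigger given at deploy time; the deploy command looks something like this (runtime, region and the env-var values here are placeholders):

gcloud functions deploy device_telemetry_wifi_pubsub \
    --runtime python39 \
    --region europe-west1 \
    --trigger-topic wifi \
    --set-env-vars DB_USER=...,DB_PASS=...,DB_NAME=...,CLOUD_SQL_CONNECTION_NAME=...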

I have a few questions/validations/assumptions:

  1. I have seen there is a default telemetry topic when creating an IoT Core registry that publishes all events received if there isn't a matching topic subfolder, but I can't find any way to know which MQTT topic generated the message when I'm processing it from the Cloud Function. Is that right?
  2. The only way to know which topic generated the message is to use different Pub/Sub topics and subfolders, with a matching Cloud Function triggered when a message arrives on each Pub/Sub queue.
  3. Most of the functions will be very similar, with probably a lot of common code. Is it possible to create some sort of library containing all my functions in a single file/codebase and have different Cloud Functions calling different methods of that library? (See the sketch after this list.)
  4. If that is not possible, is there any method for deploying multiple functions from the gcloud CLI in a single step?
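
Regarding (3) and (4), what I have in mind is something like a single main.py with the shared code in helpers and one entry point per Pub/Sub topic, each deployed as its own function with --entry-point. The table names and the second entry point below are just placeholders to illustrate the idea:

import base64
import os

import sqlalchemy

# Shared engine, created once per instance (same settings as the function above).
db = sqlalchemy.create_engine(
    sqlalchemy.engine.url.URL(
        drivername='mysql+pymysql',
        username=os.environ.get("DB_USER"),
        password=os.environ.get("DB_PASS"),
        database=os.environ.get("DB_NAME"),
        query={
            'unix_socket': '/cloudsql/{}'.format(
                os.environ.get("CLOUD_SQL_CONNECTION_NAME"))
        },
    ),
)


def _decode(event):
    # Pub/Sub payloads arrive base64-encoded in event['data'].
    return base64.b64decode(event['data']).decode('utf-8')


def _insert(stmt_text, value):
    # Common insert path shared by every entry point.
    with db.connect() as conn:
        conn.execute(sqlalchemy.text(stmt_text), data=value)


# One entry point per Pub/Sub topic; each deployed separately with --entry-point.
def wifi_pubsub(event, context):
    _insert('INSERT INTO wifistr(wifistr) VALUES (:data)', _decode(event))


def heartbeat_pubsub(event, context):
    _insert('INSERT INTO heartbeat(heartbeat) VALUES (:data)', _decode(event))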

Excuse me if some of my assumptions are wrong or I missed something, as I'm very new to the cloud thing.
