The ZeroSensor – a sentient space point of presence

One application for rt-ai Edge is ubiquitous sensing leading to sentient spaces – spaces that can interact with people moving through them and provide useful functionality, whether learned or programmed. A step on the road to that is the ZeroSensor, four prototypes of which are shown in the photo. Each ZeroSensor consists of a Raspberry Pi Zero W, a Pi camera module v2, an Adafruit BME680 breakout and an Adafruit TSL2561 breakout. The combination gives a video stream and a sensor stream with light, temperature, pressure, humidity and air quality values. The video stream can be used to derive motion sensing and identification while the other sensors provide a general idea of conditions in the space. Notably missing is audio: microphone support would be useful for general sensing and I may add it in production devices. A 3D printable case design is underway to allow wide-scale deployment.
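
To give a feel for the sensor side, here is a minimal sketch of reading both breakouts over the Pi's I2C bus using Adafruit's CircuitPython libraries (adafruit-circuitpython-bme680 and adafruit-circuitpython-tsl2561). This is not the actual ZeroSensor code, just an illustration, and attribute names such as relative_humidity vary slightly between library versions.

    import time
    import board
    import adafruit_bme680
    import adafruit_tsl2561

    # Both breakouts share the Pi Zero's I2C bus
    i2c = board.I2C()
    bme = adafruit_bme680.Adafruit_BME680_I2C(i2c)
    tsl = adafruit_tsl2561.TSL2561(i2c)

    while True:
        reading = {
            "light": tsl.lux,                   # lux; None if the sensor saturates
            "temperature": bme.temperature,     # degrees C
            "pressure": bme.pressure,           # hPa
            "humidity": bme.relative_humidity,  # % RH ("humidity" in older versions)
            "airQuality": bme.gas,              # gas resistance in ohms
        }
        print(reading)
        time.sleep(2.0)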

Voice-based interaction is a powerful way for users to interact with sentient spaces. However, the assumption here is that people who want to interact are wearing an AR headset of some sort which itself provides the audio I/O capabilities. Gesture input would be possible via the ZeroSensor’s camera. For privacy reasons, video would not be viewed directly or stored; it would be used only as a source of activity and interaction data.

This is the simple rt-ai design used to test the ZeroSensors. The ZeroSynth modules are rt-ai Edge synth modules containing SPEs that interface with the ZeroSensor’s hardware and generate a video stream and a sensor data stream. Instances of a video viewer and a sensor viewer are connected to each ZeroSynth module.
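
The SPE interfaces themselves belong to rt-ai Edge and aren’t reproduced here, but as a purely hypothetical stand-in, the sketch below shows the general shape of what a ZeroSynth-style sensor SPE does: package each reading as JSON and publish it on a named stream. The use of pyzmq and the topic naming are my inventions for illustration, not rt-ai’s actual API.

    import json
    import zmq

    # Hypothetical stand-in for an SPE output pipe: a ZeroMQ PUB socket
    # that emits each sensor reading as a JSON message on a topic.
    ctx = zmq.Context()
    pub = ctx.socket(zmq.PUB)
    pub.bind("tcp://*:5556")

    def publish_reading(sensor_id, reading):
        topic = "zerosensor/%d/sensor" % sensor_id
        pub.send_string(topic + " " + json.dumps(reading))

    publish_reading(1, {"temperature": 21.5, "light": 120.0})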

This is the result of running the ZeroSensor test design, showing a video and a sensor window for each ZeroSensor. The cameras are staring at the ceiling because the four sensors were lying on a table. Once the case is available, they will be deployed in the corners of rooms in the space.

Setting up Apache Kafka for use with an Apache ZooKeeper quorum on Ubuntu

There are lots of guides out there describing how to set up simple Apache Kafka configurations, but they generally stop short of explaining how to use Kafka with a three-node Apache ZooKeeper quorum so that ZooKeeper isn’t a single point of failure. The machines I am working with are running these components (the key configuration entries are sketched after the list):

  • Server1 (static 192.168.10.11) – ZooKeeper
  • Server2 (static 192.168.10.12) – ZooKeeper
  • Server3 (static 192.168.10.13) – ZooKeeper, Kafka broker
  • Desktop (static 192.168.10.14) – Kafka producer and Kafka consumer
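
As a sketch of the key settings (paths are illustrative; 2181, 2888 and 3888 are the ZooKeeper defaults for the client, peer and leader-election ports), each ZooKeeper node gets the same zoo.cfg listing all three servers, plus a myid file identifying which entry it is:

    # zoo.cfg, identical on Server1-3
    tickTime=2000
    initLimit=10
    syncLimit=5
    dataDir=/var/lib/zookeeper
    clientPort=2181
    server.1=192.168.10.11:2888:3888
    server.2=192.168.10.12:2888:3888
    server.3=192.168.10.13:2888:3888

    # dataDir/myid holds just that server's number, e.g. on Server1:
    # echo 1 > /var/lib/zookeeper/myid

The Kafka broker on Server3 then points at the whole quorum rather than a single ZooKeeper instance via zookeeper.connect in server.properties:

    # server.properties on Server3
    broker.id=0
    listeners=PLAINTEXT://192.168.10.13:9092
    zookeeper.connect=192.168.10.11:2181,192.168.10.12:2181,192.168.10.13:2181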

This setup doesn’t use multiple Kafka brokers, but that’s a relatively simple extension.
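
Roughly, that extension means running the broker on the other servers too, giving each a unique broker.id and its own listeners address, and then creating topics with a replication factor greater than one, along these lines (the --zookeeper flag applies to Kafka versions of this vintage; newer releases use --bootstrap-server):

    bin/kafka-topics.sh --create --zookeeper 192.168.10.11:2181 \
        --topic test --partitions 3 --replication-factor 3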

Using Apache NiFi with Apache Kafka

Apache NiFi is a great way of capturing and processing streams, while Apache Kafka is a great way of storing stream data. There’s an excellent description here of how to configure NiFi to pass data to Kafka, using MovieLens data as its source. Since I am not running HDFS, I modified the example to put just the movies and tags data into Kafka and save the ratings data to a local file. Trying to stash the ratings data into Kafka doesn’t work – it arrives too fast and in too great a volume, so buffers overflow. It’s easy to use the Kafka console consumer to check that the movies and tags topics are being populated, and the local ratings.dat file is generated as well.
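
For reference, the console consumer check looks something like this (--bootstrap-server is the modern flag; older Kafka releases used --zookeeper, and the topic names depend on how the NiFi flow was configured):

    bin/kafka-console-consumer.sh --bootstrap-server 192.168.10.13:9092 \
        --topic movies --from-beginning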