Skills:
Physical Computing
Python (NLP)
Mechanical Design (CAD)
Spark AR
Collaborators:
Lior Shulak-Hai
Mariona Ruiz Peris
Xianzhi Zhang
BrightSide: Communication Tool for Fibromyalgia
BrightSide is the first communication tool for parents with fibromyalgia that focuses on the social aspect of pain management, and introduces a novel approach to how families talk about chronic illness.
Fibromyalgia is a chronic condition that causes widespread pain and severe fatigue; it has no cure and no explainable onset. For a parent with fibromyalgia this is extremely difficult, especially when it comes to communicating their state to their children. This is why we created BrightSide, the first communication device for parents with fibromyalgia and their kids.
By using BrightSide, a parent can establish a communication ritual that helps them overcome the barrier of speaking about their condition with their children. The interaction builds a family conversation routine around the parent's symptoms and their efforts to get better. This reduces the stress and uncertainty associated with fibromyalgia for both parent and child, helping normalise the condition by shining a light on their daily life.
Development Journey
Playtime Exploration:
We began the project by focusing on how we might assist a parent with fibromyalgia during playtime. We saw a key opportunity in bridging high- and low-energy activities so that a mother could play a game with a high-energy child without either party having to compromise.
By talking with users we learned how important it is for the mother to interact physically while controlling how much energy she expends. Some days she may feel strong and sharp; on other days she may be experiencing brain fog and feel weighed down.
We collected information on playtime routines through a website and an Instagram filter that we built for fibro-parents. This helped us realize that our direction should not impose an activity but rather fit into the wider context of their daily lives and allow them to find moments to connect.
Physical Embodiment of Concept One: Faro
Render:
FARO's function is two-fold: to mediate and facilitate communication with children, and to encourage the parent to write, which has been shown to reduce stress levels in chronic pain patients.
FARO has two pieces: the bulb and the body. Together, they help measure the parent's fatigue level and state, and communicate it to their child.
Our goal was to detect the content of the writing, as well as the writing speed and heart rate, which would allow us to derive a fatigue level.
Credit: Xianzhi Zhang
Technical Specs: Faro
To analyse the fatigue level, the bulb has a built-in pulse sensor that captures the user's heart rate variability while writing. It also has a camera and an optical sensor that capture the speed, hesitation and position of the writing, along with its content for sentiment analysis.
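As an illustration, here is a minimal sketch of how the pulse-sensor stream could feed a fatigue estimate. The RMSSD heart-rate-variability metric and the thresholds below are assumptions for this sketch, not the final implementation:

```python
import numpy as np

def fatigue_from_pulse(beat_times_s):
    """Rough fatigue indicator from heartbeat timestamps (in seconds).

    Uses RMSSD, a common heart-rate-variability metric; lower HRV is read
    here as higher fatigue. The threshold values are placeholders.
    """
    ibi = np.diff(beat_times_s) * 1000.0          # inter-beat intervals in ms
    rmssd = np.sqrt(np.mean(np.diff(ibi) ** 2))   # root mean square of successive differences
    if rmssd > 50:
        return "rested"
    elif rmssd > 25:
        return "moderate"
    return "fatigued"

# Example: timestamps of heartbeats detected by the bulb's pulse sensor
print(fatigue_from_pulse([0.0, 0.82, 1.66, 2.47, 3.31, 4.12]))
```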
Once the bulb is placed on the base, its integrated LEDs begin to glow softly. When the child pats the bulb, a capacitive sensor detects the touch and starts a more colorful light show that conveys the parent's state.
FARO's bulb has an ergonomic grip that allows easy writing while sliding across a surface. A flexible silicone insert accommodates different pens.
The Fisherman and the Foggy Sea:
The Fisherman and the Foggy Sea is a narrative designed to help children understand complex symptoms like fatigue and brain fog, which are often associated with fibromyalgia.
The book would help introduce FARO into their lives.
Credit: Lior Shulak-Hai
Technical Prototypes with Computer Vision & Natural Language Processing
Using pre-existing computer vision and natural language processing libraries, I was able to quickly stitch together a couple of programs that scanned handwritten documents and then reacted to the processed image.
Prototype 1: Handwritten Digit Recognition
The first computer vision embodiment detects handwritten characters (in this case digits) through a live webcam and uses a model trained on the MNIST dataset to determine which digit is being read.
Here the visualization is a direct correlation: the number of LEDs that turn on matches the digit written on the post-it note.
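A minimal sketch of how this prototype could be wired together with OpenCV and a Keras model trained on MNIST; the model file name, the centre crop and the preprocessing are assumptions and may differ from the actual prototype:

```python
import cv2
import numpy as np
from tensorflow.keras.models import load_model

# Pre-trained MNIST classifier (hypothetical file name)
model = load_model("mnist_cnn.h5")

cap = cv2.VideoCapture(0)  # live webcam feed
while True:
    ok, frame = cap.read()
    if not ok:
        break

    # Crop the centre of the frame where the post-it is held up,
    # then convert to the 28x28 grayscale format MNIST expects.
    h, w, _ = frame.shape
    roi = frame[h // 4: 3 * h // 4, w // 4: 3 * w // 4]
    gray = cv2.cvtColor(roi, cv2.COLOR_BGR2GRAY)
    gray = cv2.resize(gray, (28, 28))
    gray = cv2.bitwise_not(gray)                  # MNIST digits are white on black
    x = gray.astype("float32") / 255.0
    x = x.reshape(1, 28, 28, 1)

    digit = int(np.argmax(model.predict(x, verbose=0)))
    # In the prototype this count drives the LEDs; here we just print it.
    print(f"Detected digit: {digit} -> light {digit} LEDs")

    cv2.imshow("digit", roi)
    if cv2.waitKey(1) & 0xFF == ord("q"):
        break

cap.release()
cv2.destroyAllWindows()
```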
Prototype 2: Note scanner's view from webcam
The second embodiment scanned a handwritten note, transcribed the text, and then ran a sentiment analyzer on the transcription.
Scanning was done with help from Murtaza's Workshop: Link to GitHub
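For illustration, a minimal sketch of the kind of OpenCV perspective-warp scan taught in tutorials like Murtaza's Workshop; the contour thresholds and output size here are assumptions:

```python
import cv2
import numpy as np

def scan_note(image_path, out_w=480, out_h=640):
    """Find the largest 4-corner contour (the note) and flatten it."""
    img = cv2.imread(image_path)
    gray = cv2.cvtColor(img, cv2.COLOR_BGR2GRAY)
    blur = cv2.GaussianBlur(gray, (5, 5), 1)
    edges = cv2.Canny(blur, 50, 150)

    contours, _ = cv2.findContours(edges, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    note = None
    for c in sorted(contours, key=cv2.contourArea, reverse=True):
        approx = cv2.approxPolyDP(c, 0.02 * cv2.arcLength(c, True), True)
        if len(approx) == 4:               # first quadrilateral = the note
            note = approx.reshape(4, 2).astype("float32")
            break
    if note is None:
        return img                          # fall back to the raw image

    # Order corners: top-left, top-right, bottom-right, bottom-left
    s, d = note.sum(axis=1), np.diff(note, axis=1).ravel()
    src = np.array([note[np.argmin(s)], note[np.argmin(d)],
                    note[np.argmax(s)], note[np.argmax(d)]], dtype="float32")
    dst = np.array([[0, 0], [out_w, 0], [out_w, out_h], [0, out_h]], dtype="float32")

    M = cv2.getPerspectiveTransform(src, dst)
    return cv2.warpPerspective(img, M, (out_w, out_h))

cv2.imwrite("note_scanned.jpg", scan_note("note_photo.jpg"))
```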
Prototype 2: Note scanner's view from terminal
The scanned image was then read using Google's Cloud Vision API, and a sentiment analyzer was run on the transcription to determine whether the message was positive or negative.
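A minimal sketch of that pipeline, assuming the google-cloud-vision client library and using TextBlob as a stand-in for whichever sentiment analyzer was actually used:

```python
from google.cloud import vision   # requires GOOGLE_APPLICATION_CREDENTIALS to be set
from textblob import TextBlob     # stand-in sentiment analyzer for this sketch

def read_and_rate(image_path):
    # OCR the scanned note with Google's Cloud Vision API
    client = vision.ImageAnnotatorClient()
    with open(image_path, "rb") as f:
        image = vision.Image(content=f.read())
    response = client.document_text_detection(image=image)
    text = response.full_text_annotation.text

    # Sentiment: polarity ranges from -1 (negative) to +1 (positive)
    polarity = TextBlob(text).sentiment.polarity
    mood = "positive" if polarity >= 0 else "negative"
    return text, mood

text, mood = read_and_rate("note_scanned.jpg")
print(f"{mood}: {text!r}")
```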
Final Concept: BrightSide
Exploded View:
Each device consists of two components: the bulb, which contains the speaker, microphone, LEDs, microprocessor and Wi-Fi module; and the base, which serves as the charger.
Interaction / System Map:
The system is designed to facilitate an asynchronous communication routine in which parents can record a message for their children whenever they feel physically able.
Parents record messages that are analysed through voice-based analytics and natural language processing to determine fatigue and emotional state. Once analysed, their state is conveyed through a simple colour system for the children to understand.
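As an illustration, the five states used in the demo below could map to bulb colours roughly like this; the state names and RGB values are assumptions:

```python
# Hypothetical mapping from analysed parent state to bulb colour (R, G, B)
STATE_COLOURS = {
    "rested":    (0, 255, 120),    # green
    "okay":      (120, 220, 255),  # light blue
    "tired":     (255, 200, 0),    # amber
    "foggy":     (180, 120, 255),  # violet
    "exhausted": (255, 60, 60),    # red
}
```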
Machine Learning Voice Analysis:
In the speech recognition system we parametrise the speech waveform, which enables us to classify the speech rate, tone, and content of the message.
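A minimal sketch of that parametrisation using librosa; the specific features chosen here (onset rate, median pitch, MFCCs) are assumptions about how the classification inputs could be extracted:

```python
import librosa
import numpy as np

def speech_features(wav_path):
    """Extract rough speech-rate, tone and content features from a recording."""
    y, sr = librosa.load(wav_path, sr=16000)
    duration = len(y) / sr

    # Speech rate: onset events per second as a crude syllable-rate proxy
    onsets = librosa.onset.onset_detect(y=y, sr=sr)
    rate = len(onsets) / duration

    # Tone: median fundamental frequency (pitch) over voiced frames
    f0, voiced, _ = librosa.pyin(y, fmin=65, fmax=400, sr=sr)
    pitch = float(np.nanmedian(f0))

    # Content: MFCCs summarise the spectral envelope and would feed a
    # downstream classifier or speech-to-text step for the message content
    mfcc = librosa.feature.mfcc(y=y, sr=sr, n_mfcc=13).mean(axis=1)

    return {"rate_per_s": rate, "median_pitch_hz": pitch, "mfcc": mfcc}

print(speech_features("parent_message.wav"))
```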
Technical Prototypes with Speech Analysis & Natural Language Processing
To validate the interaction, I built out the basic functionality using asynchronous HTTP servers, though ultimately this interaction is better suited to the MQTT protocol.
Functional Prototype for Demo:
1. First, an ESP32 representing the kid's bulb starts an HTTP server.
2. Then a laptop running Python records an audio message, runs a comprehensive analysis of the voice recording, determines one of five states, and sends that state to the ESP32 via an HTTP request (see the sketch after this list).
3. The ESP32 receives the request and stores the state in a variable that later drives a NeoPixel array.
4. The computer then starts its own server, through which it serves the recording.
5. An NFC reader triggers the ESP32 to close its server and stream the MP3 file.
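A minimal sketch of the laptop side of this demo in Python; the ESP32's address, the /state endpoint, the recording length, and saving the message as WAV (rather than MP3) are assumptions for illustration:

```python
import http.server

import requests
import sounddevice as sd
from scipy.io import wavfile

ESP32_URL = "http://192.168.1.42/state"   # hypothetical kid-bulb address and endpoint
SAMPLE_RATE = 16000
SECONDS = 10

# 1. Record a short voice message from the parent
audio = sd.rec(int(SECONDS * SAMPLE_RATE), samplerate=SAMPLE_RATE, channels=1)
sd.wait()
wavfile.write("message.wav", SAMPLE_RATE, audio)

# 2. Analyse the message and pick one of five states
#    (placeholder: the real analysis uses the speech features described above)
state = "tired"

# 3. Send the state to the ESP32's HTTP server so it can colour the NeoPixels
requests.post(ESP32_URL, json={"state": state}, timeout=5)

# 4. Serve the recording from the current folder so the bulb can stream it
#    once the NFC reader is triggered
server = http.server.HTTPServer(("0.0.0.0", 8000),
                                http.server.SimpleHTTPRequestHandler)
print("Serving message.wav on port 8000; waiting for the bulb to fetch it...")
server.serve_forever()
```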