OSC - Blink signal

Zague
Posts: 8
Joined: Mon Apr 10, 2017 10:25 pm

OSC - Blink signal

Post by Zague »

I'm doing various tests with Quartz Composer (QC) on macOS Sierra.

QC is no longer officially supported by Apple, but it still ships with Xcode and it lets me test a few things.

In QC, I use qcOSC -- a third-party patch -- to access the data stream.

The /muse/elements/blink and /muse/elements/jaw_clench paths seem to be stuck at a numerical 1 no matter what is happening, whether I'm blinking or clenching or not.

In MuseLab, the blink signal is consistently displayed when selected in a visualizer (flat lines at a value of 1 for a fixed duration).

Also, I can't find a jaw clench signal in either the incoming or the outgoing stream (the latter being forwarded to QC). So QC receives a "jaw clench" signal even though, according to MuseLab, none is apparently sent.

Am I missing something?

Z.
James
Site Admin
Posts: 1103
Joined: Wed Jan 02, 2013 9:06 pm

Re: OSC - Blink signal

Post by James »

It's not made clear in the Interaxon documentation, but Blink and Jaw_Clench are data markers and are only transmitted when the event occurs, so you will never see a zero.

In MuseLab, you turn on data markers in the "Markers" drop-down, then enter the paths you want to enable. Each marker is then displayed as a vertical bar scrolling across the data at the point the event occurred.
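
If you want to use the markers outside MuseLab, the simplest approach is to treat the arrival of the message itself as the event. A minimal sketch in Python using the python-osc package (the listening port 5000 is just an assumption; use whatever port you forward the stream to):

[code]
from datetime import datetime
from pythonosc import dispatcher, osc_server

def on_marker(address, *args):
    # Blink / jaw clench messages only arrive when the event happens,
    # so logging the arrival time is all the "detection" you need.
    print(f"{datetime.now().isoformat()}  {address}  {args}")

d = dispatcher.Dispatcher()
d.map("/muse/elements/blink", on_marker)
d.map("/muse/elements/jaw_clench", on_marker)

# Port 5000 is an assumption; match it to your OSC output settings.
server = osc_server.ThreadingOSCUDPServer(("0.0.0.0", 5000), d)
server.serve_forever()
[/code]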

Muse Monitor also has built-in numbered metadata markers for manually tagging data while streaming (or recording). Marker Buttons are hidden by default and can be turned on in Advanced Settings.
Zague
Posts: 8
Joined: Mon Apr 10, 2017 10:25 pm

Re: OSC - Blink signal

Post by Zague »

Thanks, James.

I do see what you mean in MuseLab, both on the graph and in the msg/sec column in the incoming messages list.

This tells me that MuseLab sends an OSC message when a blink or clench occurs, and nothing otherwise. Contrary to what I wrote yesterday, there are jaw clench entries in both the incoming and outgoing OSC lists.

In QC, trying to detect blinks or clenches is pointless, I guess. If I monitor the signal while blinking or clenching, nothing happens. The patch probably can't detect the gaps between messages; it just keeps the last value received, and that value never changes numerically. BTW, in the Muse docs blink is defined as a boolean and jaw clench as an integer, but they both behave as booleans in MuseLab.
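
Outside of QC, I suppose the workaround would be to turn each marker message into a short pulse that falls back to zero, so that a downstream patch actually sees a change. A rough Python sketch of what I mean (the half-second hold time is an arbitrary assumption):

[code]
import time

class MarkerPulse:
    """Goes to 1 when a marker message arrives, falls back to 0 after hold_s seconds."""

    def __init__(self, hold_s=0.5):
        self.hold_s = hold_s
        self.last_event = None

    def trigger(self):
        # Call this from the OSC handler whenever a blink / jaw clench message arrives.
        self.last_event = time.monotonic()

    def value(self):
        # Poll this from the render loop; it behaves like a momentary switch.
        if self.last_event is None:
            return 0
        return 1 if (time.monotonic() - self.last_event) < self.hold_s else 0
[/code]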

I'll move on for now. The QC patch can't deal with this, and it's not of the utmost importance at the moment.

I'm testing compositions that could act as a BCI to trigger visual events, and I wondered whether I could filter out blinking and clenching, since those signals appeared to be usable.

While I can manage to visualize the EEG and wave-band signals, I'm not yet at the point where I can create voluntary triggers from them. From what I see and read, the raw EEG is hard to use without elaborate processing. The wave bands seem to offer more control, although I should be cautious saying that. A training stage is probably unavoidable.

Z.
James
Site Admin
Posts: 1103
Joined: Wed Jan 02, 2013 9:06 pm

Re: OSC - Blink signal

Post by James »

I know that wave values produced by the Interaxon SDK are already processed to filter power line noise, but I'm not sure about blink and jaw clench; you'd have to confirm that with Interaxon themselves. I would recommend posing the question on their dev forum.
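
For reference, if you ever clean the raw stream yourself, power line noise is usually removed with a notch filter at the mains frequency. A rough Python/SciPy sketch (the 60 Hz mains frequency and the 256 Hz sample rate are assumptions; adjust them to your region and headband):

[code]
import numpy as np
from scipy.signal import iirnotch, filtfilt

FS = 256.0   # assumed sample rate in Hz
F0 = 60.0    # mains frequency; 50 Hz in many countries
Q = 30.0     # quality factor: higher means a narrower notch

b, a = iirnotch(F0, Q, FS)

def remove_mains(raw_window):
    # Zero-phase notch filter over one channel of raw samples.
    return filtfilt(b, a, np.asarray(raw_window, dtype=float))
[/code]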
Zague
Posts: 8
Joined: Mon Apr 10, 2017 10:25 pm

Re: OSC - Blink signal

Post by Zague »

From what I understand now, the problem lies with the OSC patch in QC, and Interaxon probably can't do much about it. Also, Apple only keeps QC around to maintain some continuity between what was and what is now (Metal, Swift, etc.).

I'm using QC myself because it works well as a prototyping tool for the little things I'm trying. I thought of filtering out blinks from the wave signals because, as you know, they artificially drive the values up for a second or so.

For instance, let's say I use the delta wave stream to see whether I can voluntarily move an object horizontally on screen from point A to point B. I do see that I have some voluntary control over the movement of the object. It is far from perfect -- it's easier to bring values down than to push them up -- but it does seem to correlate enough. As I said, blinking pushes values up, so technically I can drive the object to its destination by blinking, which is not exactly useful in the circumstances. That said, I can refrain from blinking too much and apply some smoothing so that an inevitable blink won't affect the process too much (something like the sketch below).
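
By smoothing I mean something as simple as an exponential moving average applied before mapping the value to the object's position. A rough Python sketch of the idea (the input range and the 0.05 smoothing factor are arbitrary assumptions):

[code]
class SmoothedControl:
    """Exponential moving average of a band value, mapped to a 0..1 position."""

    def __init__(self, alpha=0.05, lo=0.0, hi=1.0):
        self.alpha = alpha         # smaller = smoother, slower to react
        self.lo, self.hi = lo, hi  # expected range of the incoming values
        self.state = None

    def update(self, value):
        # Clamp and normalise the incoming delta (or any other band) value.
        x = min(max(value, self.lo), self.hi)
        x = (x - self.lo) / (self.hi - self.lo)
        # A brief blink-driven spike gets diluted by the running average.
        self.state = x if self.state is None else self.alpha * x + (1 - self.alpha) * self.state
        return self.state          # use as the object's horizontal position
[/code]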

Ultimately, I'd like an interactive display in an immersive environment that responds to a visitor's brainwaves. It could also help vary the parallax of stereoscopic objects and have them move toward and away from the observer.

As soon as I see some future in my prototype, I'll move on to Swift/Metal; the main remaining challenge is signal processing in the context of relatively precise interactions. An interactive but passive flow between the brainwaves and the display is much simpler.

Hope I'm not using too much of your time. I appreciate the help.

Z.
James
Site Admin
Posts: 1103
Joined: Wed Jan 02, 2013 9:06 pm

Re: OSC - Blink signal

Post by James »

I would recommend just working with the raw data and calculating your own FFT and subsequent wave values; then you can code your own blink removal filter.
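
A blink removal filter doesn't have to be anything fancy to start with; simply skipping raw windows that contain a large deflection already goes a long way. A minimal Python sketch (the 200 microvolt peak-to-peak threshold is just an assumed starting point, not a calibrated value):

[code]
import numpy as np

BLINK_PTP_UV = 200.0  # assumed peak-to-peak threshold in microvolts

def looks_like_blink(raw_window):
    # Crude artifact check: flag a window whose peak-to-peak amplitude
    # is far larger than a typical clean window.
    x = np.asarray(raw_window, dtype=float)
    x = x - x.mean()          # remove DC offset
    return np.ptp(x) > BLINK_PTP_UV

def clean_windows(windows):
    # Keep only the windows that don't appear to contain a blink.
    return [w for w in windows if not looks_like_blink(w)]
[/code]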
Zague
Posts: 8
Joined: Mon Apr 10, 2017 10:25 pm

Re: OSC - Blink signal

Post by Zague »

I see. The hard part is that QC has no generic patch to calculate FFTs; there's one for audio input that computes 16 frequency bins. I found some JavaScript code online that can be used inside a QC patch, but it would require some additional programming to work as intended. That's a tad heavy for this stage of my exploration.

I see in the Muse docs that they have four OSC paths for raw FFT data. Is there a reason why they're not accessible through Muse Monitor?

Z.
James
Site Admin
Posts: 1103
Joined: Wed Jan 02, 2013 9:06 pm

Re: OSC - Blink signal

Post by James »

The FFT paths mentioned in the Muse docs have not been implemented by Interaxon in their public mobile SDKs. However, they can be calculated very easily from the raw data, so they aren't necessary.

Additionally, if you calculate your own FFTs you can use whatever window and filters you like, rather than being restricted to the Interaxon defaults. Just Google "FFT code" for examples, or check out https://rosettacode.org/wiki/Fast_Fourier_transform
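
To give you an idea of how little code it takes, here's a rough Python/NumPy sketch that turns one window of raw samples into average band powers (the 256 Hz sample rate, the Hann window and the band edges are all assumptions you can change, which is the whole point):

[code]
import numpy as np

FS = 256.0                      # assumed sample rate in Hz
BANDS = {                       # assumed band edges in Hz
    "delta": (1, 4), "theta": (4, 8), "alpha": (8, 12),
    "beta": (12, 30), "gamma": (30, 44),
}

def band_powers(raw_window):
    # Hann-windowed FFT of one raw EEG window -> mean power per band.
    x = np.asarray(raw_window, dtype=float)
    x = x - x.mean()                        # remove DC offset
    x = x * np.hanning(len(x))              # window of your choice
    power = np.abs(np.fft.rfft(x)) ** 2     # power spectrum
    freqs = np.fft.rfftfreq(len(x), d=1.0 / FS)
    return {name: power[(freqs >= lo) & (freqs < hi)].mean()
            for name, (lo, hi) in BANDS.items()}
[/code]

Note that Interaxon's own band power values are on a log scale, so don't expect the numbers to match theirs exactly.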
Zague
Posts: 8
Joined: Mon Apr 10, 2017 10:25 pm

Re: OSC - Blink signal

Post by Zague »

Thanks, James. This is exactly what is needed, but I'm struggling to implement it in QC because of inconsistencies in the current state of QC within the macOS development environment, the minimal JavaScript debugging capabilities inside QC, and my own lack of competence. That problem would probably be better handled by other tools in the Mac development environment; I'll have to see. For one thing, Interaxon has not released a macOS SDK, and without it I can't tell what is feasible on the Mac at my level.

I learned a few concrete things about BCIs while trying this. What I'm trying to do calls for specific tools, but I remain more in the position of a user than that of a developer.
James
Site Admin
Posts: 1103
Joined: Wed Jan 02, 2013 9:06 pm

Re: OSC - Blink signal

Post by James »

Perhaps you would have more luck with https://processing.org?