[EEG] Notes from EEG meeting

Jonathan Foote jtfoote at ieee.org
Mon Apr 6 22:11:28 UTC 2009


Thanks, Kelly, for the excellent notes. I added some details about the
boards and enclosures.

In a development that could be exciting but probably isn't, I ran into
Kal Spelletich this morning. If I understand him right (not always a
given), he has both an Emotiv headset and a spare set of OpenEEG
boards. He said he's not using them and would contribute them to the
cause. I think the Emotiv headset may be fairly useless without the
API/driver/software, but we could at least investigate their electrode
setup. I'm still on the record as being Very Skeptical about them.
Anyway, I'll be by his studio later this week and see what I can come
away with.

Kelly suggests another ping to the main NB list, which I think is a
splendid idea.

Rachel, may we PayPal you some contributions for the electrode cap?

On Sun, Apr 5, 2009 at 6:00 PM, Kelly <hurtstotouchfire at gmail.com> wrote:
> There are notes from our second meeting on the wiki:
> https://noisebridge.net/wiki/OpenEEG#Meetup_the_Second
>
> And I also added some brief notes at the beginning of the page with some
> links.  There's a link to a video of a pretty classic ERP method of text
> selection.
>
> I think the big highlight is that Rachel said she's going to buy the cap.
> Jake sent us the link to that (it's on the wiki page).  I seem to recall an
> additional package of peripherals that we were considering buying...?  Can't
> recall.  If anyone remembers, tell Rachel.
>
> We mostly talked about further hardware troubleshooting, but also got into
> the various methods of EEG analysis a little more.  We also discussed
> Emotiv's headset some more, and I linked the video of their demo in the
> notes.  One thing that came up a few times was using EEG to control precise
> movements, like in the Pong video (linked on wiki) or in Emotiv's EPOC
> demo.  I'm assuming that this is using motor cortex data, which would be
> awesome.  That would mean we could use gestures (or possibly just imagined
> gestures) as recognizable cognitive events.
>
> Which brings me to one thing that we didn't talk about much: the passthought
> project.  Jake couldn't make it, and we didn't have too many of our
> programmers, so that whole genre of material didn't get discussed much.  Next
> time, I'd really like to brainstorm about some of the cognitive events
> we could use as the actual passthought, and about what other security features
> would be useful. I'm sure we can count on Jake to be vocal about this in the
> future, but I seem to recall some other folks were interested in that
> problem as well.  The idea of a passthought that's gesture-based is really
> exciting to me.  There's a lot of interesting neuroscience around mirror
> neurons [http://en.wikipedia.org/wiki/Mirror_neuron] that would be cool to
> leverage as well (possibly for the security angle?).  And I happen to know
> that there's been research on detecting mirror neuron activity with EEG.
>
> Anyway, I'm excited about this whole motion detection thing, and that'll be
> my research item this week.  There's actually kind of a lot that we need to
> research, so if anyone's interested in spending some quality time on the
> googles, ping me and maybe we can find something you're interested in
> amongst the vast catalog of Shit We Need To Research.
>
> And I think that the Sunday time was nice.  It's fairly mellow at 83c.  The
> turnout was definitely lower, so if any of you who couldn't make it were
> actually unable to make that time in particular, let us know.  Otherwise,
> I'm assuming we'll meet again next Sunday at teatime.
>
> -K
>
> _______________________________________________
> EEG mailing list
> EEG at lists.noisebridge.net
> https://www.noisebridge.net/mailman/listinfo/eeg
>
>


