Different Emotiv softwares for developing different use cases
Tagged: cortexAPI, Emotiv BCI, Emotiv Softwares
Hi Team,
Hope all is well. I am writing to understand which Emotiv software to use for developing different use cases with Emotiv headsets. I have trained different mental commands on the Insight 2 headset using the Emotiv Launcher and EmotivBCI. There is only a set number of commands that one can train. I am wondering how we can train our own commands.
To give you a simple example: if I want to play Mario using my brain signals, how can I train different mental commands (jump, shoot, move ahead, move back) with the Emotiv headset and then get a live feed of these commands through the Cortex API? Can anyone walk me through the high-level process of achieving this?
Thank you community. Appreciate your help.
Hi Gautam,
The commands in the Cortex API are more like slots that store a desired action. If you want to play Mario and map the game actions to the BCI, you will need to develop some kind of interface that maps the trained Emotiv commands to the actions in Mario.
What I mean by that is that you need to write a piece of software that, when it recognises the Emotiv command “PUSH”, for example, calls the function in the game that moves forward or back, or whatever.
In short, the commands in Emotiv are simply slots that store the actions you need; they are not related to the word used to name them.
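To make the “slot” idea concrete, here is a minimal Python sketch of such an interface. The command labels, the threshold, and the game functions (`jump`, `shoot`) are all hypothetical examples, not part of the Cortex API; the point is that the trained label is just a dictionary key, and the meaning lives entirely in the function you attach to it:

```python
# Minimal sketch: map trained Emotiv command labels ("slots") to game actions.
# The labels, threshold, and game functions here are hypothetical examples.

actions_log = []

def jump():
    actions_log.append("jump")

def shoot():
    actions_log.append("shoot")

# The dispatch table IS the interface described above: the Emotiv label
# is only a key; what it does is decided entirely by your own code.
COMMAND_MAP = {
    "push": jump,   # trained command "push" triggers a jump
    "pull": shoot,  # trained command "pull" fires a shot
}

def on_mental_command(label, power, threshold=0.5):
    """Call the mapped game action when the detection power is strong enough."""
    action = COMMAND_MAP.get(label)
    if action and power >= threshold:
        action()

# Simulated detections as they might arrive from the headset:
on_mental_command("push", 0.8)     # strong "push" -> jump
on_mental_command("pull", 0.3)     # too weak -> ignored
on_mental_command("neutral", 0.9)  # unmapped label -> ignored
```

Running this leaves only `"jump"` in `actions_log`: the weak and unmapped detections are filtered out before they ever reach the game.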
To give you a better idea, I have developed a kind of BCI authorisation system that maps the command PUSH to a function that “unlocks” a vault.
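On the Cortex side, the live feed arrives over a local WebSocket (the Cortex service speaks JSON-RPC 2.0 on wss://localhost:6868): after authorising and creating a session you subscribe to the "com" (mental command) stream, and each sample carries the detected label and its power. The sketch below only builds the subscribe request and parses one sample; the token and session id are placeholders, and the exact field names should be double-checked against the Cortex API documentation:

```python
import json

def make_subscribe_request(cortex_token, session_id, request_id=1):
    """Build the JSON-RPC request that subscribes to the mental-command stream.

    The token and session id come from earlier `authorize` and
    `createSession` calls; both values here are placeholders.
    """
    return json.dumps({
        "jsonrpc": "2.0",
        "id": request_id,
        "method": "subscribe",
        "params": {
            "cortexToken": cortex_token,
            "session": session_id,
            "streams": ["com"],
        },
    })

def parse_com_event(message):
    """Extract (label, power) from a 'com' stream sample, if present."""
    data = json.loads(message)
    if "com" in data:
        label, power = data["com"][:2]
        return label, power
    return None

# A "com" sample roughly as the stream delivers it (session id is made up):
sample = '{"com": ["push", 0.82], "sid": "demo-session", "time": 1234.5}'
print(parse_com_event(sample))  # -> ('push', 0.82)
```

The `(label, power)` pair this returns is exactly what you would feed into the mapping layer discussed above, so the game never needs to know anything about Emotiv at all.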