I can confirm the machine detects and transmits data about your emotional states to a server. It can also detect when you play your own music alongside the light sequence, presumably so they can use that to augment their own programs, unless we want to pretend that brain-data miners are benign. The thing is, four lesser competitors with inferior machines all have that exact same capacity, so it's quite possible this machine is sending even more information about your brain than that.
Anyone who thinks they need your brain data to improve should understand that the business model is designed so they never release a perfect program. If they declared that even these ten programs, run in sequence, would manage ADHD or insomnia, for instance, there would be no incentive to pay a monthly fee for new programs; you'd already have ones that work. More financially interesting is that it guarantees customers will connect their device to the internet, and data can be retrieved from the devices alongside the program downloads. Even if you know how to set up an emulator, they can still access this data, even when you run Linux, short of some pretty involved workarounds, through the data transfer of the new programs.
It's incredibly frustrating because the technology has real potential. But the business is set up to maximize profit and punish success, it turns the whole thing into a data-mining project for everyone involved, and it forces you to keep a cellphone, hacked by China and everyone else, plugged into a device attached to your brain. That's a black hole of a security risk before we even talk about what the company itself may try to profit from.