GasLight : Neural Networked Presets
Been working on GasLight again over the past few days. I’ve spent quite a bit of time on the autopilot mode, which randomly alters the visual settings while you watch. There’s a whole bunch of factors that combine in strange and unusual ways to produce GasLight’s effects, and predicting them well enough to avoid the ugly combinations in a randomly generated sequence is actually pretty tricky. It’s more or less there now, though, and much improved on the previous version.
Meanwhile, it’s got me thinking. It sounds like a very cool idea to hook the autopilot up to a neural network. The user could hit a key whenever he sees a combination of settings that he likes, and the neural network would slowly learn to produce new settings that it thinks might appeal to him.
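For the curious, here's a minimal sketch of how that feedback loop might work. Everything here is an assumption about GasLight's internals: I'm treating a preset as a vector of normalised settings, using a simple logistic scorer rather than a full neural network, and the names (`PreferenceModel`, `N_SETTINGS`, etc.) are all hypothetical.

```python
import math
import random

# Hypothetical sketch of a preference-learning autopilot.
# A "preset" is assumed to be a list of visual settings normalised to [0, 1].
# When the user hits the "like" key, the current preset becomes a positive
# example; presets the user lets pass without comment act as weak negatives.

N_SETTINGS = 6  # assumed number of tweakable visual parameters

class PreferenceModel:
    def __init__(self, n=N_SETTINGS, lr=0.1):
        self.w = [0.0] * n  # one weight per setting
        self.b = 0.0        # bias term
        self.lr = lr        # learning rate

    def score(self, preset):
        # estimated probability that the user likes this preset
        z = self.b + sum(wi * xi for wi, xi in zip(self.w, preset))
        return 1.0 / (1.0 + math.exp(-z))

    def update(self, preset, liked):
        # one online step of logistic-regression gradient descent
        err = (1.0 if liked else 0.0) - self.score(preset)
        self.b += self.lr * err
        self.w = [wi + self.lr * err * xi
                  for wi, xi in zip(self.w, preset)]

    def propose(self, n_candidates=32):
        # sample random presets and keep the one the model rates highest
        cands = [[random.random() for _ in range(len(self.w))]
                 for _ in range(n_candidates)]
        return max(cands, key=self.score)
```

The autopilot would call `propose()` each time it changes the settings, and `update(current_preset, True)` whenever the key is pressed. A real version would want a nonlinear model to capture the interactions between settings, which a single-layer scorer like this can't.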
Hmm… it just might work. But it would take a while – I’d need to dust off my old neural network code (and probably rewrite the entire damn thing while I’m at it). It certainly won’t be in the next release of GasLight.
So, if there’s enough interest in self-generating intelligent presets, I might add it. I’m looking for feedback on this one – do you use the autopilot mode, or do you rely on your own settings? Any interest in a self-learning system?
Commenting is closed for this article.
Try it. — Russell G Nov 17, 11:42 PM #
By the way, how would you be able to capture the screen output for use in I-Movie? — Traum Nov 19, 08:56 PM #