Metafetish project dead? Xbox controller real-time haptic scripting, when?


#1

It’s been quiet here lately. Is the Metafetish project dead?

Is there still no way to script from an Xbox controller in real time and create haptic scripts?
Blender is cool, but it takes too much time to create a single (and mostly inaccurate) script.


#2

Hardly! I just made a new blog post yesterday, and our new website launched last week: https://buttplug.io

I’m now back to working on webapps, but before I can start adding new features to syncydink, I’ve got to bring it up to date with the latest dependencies, which might take a week or two. But I’m definitely very active/busy at the moment. :slight_smile:


#3

Are there any first projects available for recording real-time gamepad motion controls against videos?
It’s great to see so many toys supported in the Buttplug project, but without a good tool to create new content for the toys quickly, the fun wears off fast…


#4

Wouldn’t scripting be better with a VR controller? The problem with a regular controller is that it would be hard to translate speed and position from button presses alone. Not to mention, if you are scripting VR, you can’t see exactly what you are doing in another program. With a VR controller you could simply simulate the action on screen by moving the controller back and forth. You only need the Z-axis information (position/depth) and the speed you are moving the controller. The controller’s X- and Y-axis information can be ignored by the script generation program, as it cannot be used.

Not only is this method incredibly natural, but it’s also quicker. Let me know what you think of this method and whether it could be improved.

Right now the Oculus SDK and OpenVR SDK provide access to this controller information, so it’s definitely feasible.
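
A minimal sketch of the capture-and-normalize step this would need, assuming a hypothetical `read_controller_z()` placeholder for whatever the OpenVR/Oculus binding actually exposes; only the normalization to the 0–100 range the toys use is shown:

```python
# Minimal sketch: turn sampled VR controller Z positions into 0-100 stroke
# positions. read_controller_z() is a hypothetical placeholder for whatever
# the OpenVR/Oculus binding actually returns; only the normalization is real.
import time

def read_controller_z() -> float:
    """Placeholder: return the controller's current Z position in meters."""
    raise NotImplementedError

def record_positions(duration_s: float, rate_hz: float = 60.0):
    """Sample Z for duration_s seconds, then normalize the trace to 0-100."""
    samples = []  # (timestamp_ms, raw_z)
    start = time.monotonic()
    while time.monotonic() - start < duration_s:
        now = time.monotonic() - start
        samples.append((int(now * 1000), read_controller_z()))
        time.sleep(1.0 / rate_hz)

    z_values = [z for _, z in samples]
    z_min, z_max = min(z_values), max(z_values)
    span = (z_max - z_min) or 1.0  # guard against a perfectly flat trace
    return [(t, round((z - z_min) / span * 100)) for t, z in samples]
```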


#5

An Xbox controller would do just fine. You only need X/Y, and really only Y for up/down; maybe X for speed, but you could also set speed by pressing a button a few times. It’s the Y up/down position that needs to be recorded while the video plays in real time. But it’s very quiet here, so I guess we’ll never see this appear, at least not this year.
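
A rough sketch of the capture side of this, using pygame’s joystick API; treating axis 1 as the left stick’s vertical axis is an assumption that holds for most Xbox pads but is worth checking on yours:

```python
# Rough sketch: sample the left stick's vertical axis (~60 Hz) and map it to
# 0-100 positions while the video plays. Axis index 1 is assumed to be the
# left stick's Y axis, which is the usual layout for Xbox pads under pygame.
import time
import pygame

pygame.init()
pygame.joystick.init()
stick = pygame.joystick.Joystick(0)
stick.init()

recording = []  # (timestamp_ms, position 0-100)
start = time.monotonic()
try:
    while True:
        pygame.event.pump()                  # refresh joystick state
        y = stick.get_axis(1)                # -1.0 (stick up) .. 1.0 (stick down)
        pos = round((1.0 - y) / 2.0 * 100)   # map to 0 (bottom) .. 100 (top)
        recording.append((int((time.monotonic() - start) * 1000), pos))
        time.sleep(1 / 60)
except KeyboardInterrupt:
    print(f"captured {len(recording)} samples")
```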


#6

With that attitude, you may never see it unless you implement it your own damn self.


#7

Unfortunately this still has a fatal flaw: how do you input position by pressing just the Y button? If you assign up/down to the Y button, what exactly is up/down referring to? You have to know the current position to go up or down from it. You’d first have to assign a couple of other buttons to set position, and then have the Y button assigned to up and X to down. Having a single button assigned to both up and down would be extremely confusing, and I don’t know how that would work. Even with those issues out of the way, it’s not a very natural way of doing it.

Maybe a throttle (like the ones used for flight simulators) would be the best option. It has set 0 and 100 positions and it only moves on one axis. All you would have to do is move the throttle to match the video while recording the USB data being sent to the computer, sanitize it, and then convert that into a .meta file the Launch can read.
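
The conversion at the end is the easy part. A sketch, assuming the recorded samples have already been sanitized into (timestamp, throttle 0.0–1.0) pairs and that the output is the funscript-style JSON mentioned in the next post (the .meta layout itself isn’t spelled out here):

```python
# Sketch of the final conversion step: sanitized throttle samples -> a script
# file. The ".meta" layout isn't spelled out here, so this assumes the
# funscript-style JSON ({"actions": [{"at": ms, "pos": 0-100}]}) instead.
import json

def throttle_to_script(samples, out_path="output.funscript"):
    """samples: list of (timestamp_ms, throttle) with throttle in 0.0-1.0."""
    actions = [{"at": int(t), "pos": round(v * 100)} for t, v in samples]
    with open(out_path, "w") as f:
        json.dump({"version": "1.0", "range": 100, "actions": actions}, f)

# Example: one slow full stroke recorded over two seconds.
throttle_to_script([(0, 0.0), (1000, 1.0), (2000, 0.0)])
```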


#8

My thought on joystick encoding has always been that we’d just read raw analog axes and look for min/max, then smooth in between (’cause there’s not much else that can be done in funscript anyways, since the Launch sucks for inner-stroke movement changes :frowning: ). I still think there are massive ergonomics issues there, ’cause moving the stick back and forth for however long the movie is could be a lot of muscle tension that’s not quite the same as game usage.
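
A rough sketch of that min/max pass (just turning-point detection on the raw trace; the smoothing and timing details are left out):

```python
# Rough sketch of the min/max pass described above: keep only the turning
# points of a raw 0-100 axis trace and drop everything in between, since
# funscript can't express inner-stroke changes anyway.
def extrema_only(samples):
    """samples: list of (timestamp_ms, pos). Returns local minima/maxima only."""
    if len(samples) < 3:
        return list(samples)
    keep = [samples[0]]
    for prev, cur, nxt in zip(samples, samples[1:], samples[2:]):
        d_in = cur[1] - prev[1]    # slope coming into this sample
        d_out = nxt[1] - cur[1]    # slope going out of it
        if d_in * d_out < 0:       # sign flip -> local min or max
            keep.append(cur)
    keep.append(samples[-1])
    return keep
```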

Really, all of these manual methods would be fine for training systems that could automatically encode videos, which is the direction I’d rather be spending my time on (if I had time to spend outside of the Buttplug core :frowning: ). I’ve been playing with Magic Leap’s SuperPoint pretrained network (https://github.com/MagicLeapResearch/SuperPointPretrainedNetwork) and have been getting really good results out of it as long as it’s filmed porn. It’s a little iffy on cartoons due to the difference in color gradients and what not, but still basically workable.
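
For anyone curious, a rough sketch of one way to get a stroke signal out of SuperPoint is to track the median vertical position of detected keypoints per frame. The `SuperPointFrontend` interface below is taken from the repo’s `demo_superpoint.py` as best I remember it, so treat the class and its arguments as assumptions and verify against the repo:

```python
# Sketch: use SuperPoint keypoints to get a crude per-frame vertical motion
# signal (median keypoint y), which could later be turned into strokes.
# SuperPointFrontend and its arguments follow the repo's demo_superpoint.py
# as remembered -- verify against the linked repository before relying on it.
import cv2
import numpy as np
from demo_superpoint import SuperPointFrontend  # from the linked repo

fe = SuperPointFrontend(weights_path="superpoint_v1.pth",
                        nms_dist=4, conf_thresh=0.015, nn_thresh=0.7,
                        cuda=False)

def vertical_motion_signal(video_path):
    """Return (frame_index, median keypoint y) pairs for a video file."""
    cap = cv2.VideoCapture(video_path)
    signal, idx = [], 0
    while True:
        ok, frame = cap.read()
        if not ok:
            break
        gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY).astype(np.float32) / 255.0
        pts, desc, heatmap = fe.run(gray)  # pts: 3xN array of (x, y, confidence)
        if pts.shape[1] > 0:
            signal.append((idx, float(np.median(pts[1]))))
        idx += 1
    cap.release()
    return signal
```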

In the end, I expect encoding interfaces will be mostly automated, with most of the human time spent in encoding cleanup afterward.


#9

Thank you for the link, I’m going to have to check that out. If anyone can tune that deep learning network so that it automates 90% of the work, that would be a huge boon to the community. Thank you for your hard work!


#10

Whoa, whoa, this was not an attack. You’re the programmer here; I’m just Mister Dumb. I don’t know any Python and can’t spend my time on it. Music is my life, not zeros and ones, or and/loop/while, do this or that.
It’s just frustrating to see all this stuff and still have no solution for scripting new videos quickly.
Onetouchscripter was a nice try, but not a quick solution; the process takes too much time.
For me, the Launch or any other toy released so far is a big failure if you compare it against a RealTouch unit.

I still prefer the joystick motion, no matter the ergonomics; it just sounds like a fast way to create motion for the videos.


#11

dancer, I don’t know if you hang out on realtouchscripts.com, but there’s someone working on a tool that does what you want there. http://www.realtouchscripts.com/viewtopic.php?f=59&t=6565

Personally, I’m not convinced that many people will be able to get good synch with that kind of setup, and making it work well for multivector devices seems unlikely at best, but it’s always nice to have more tools.