04-03-2016, 11:20 AM | #1 |
Human being with feelings
Join Date: Mar 2016
Posts: 9
360º video in Reaper?
Hi,
To make this amazing thing that is Reaper even more amazing, how about adding 360º video playback? This is a new and up-and-coming format that is expected to gain a lot of traction over the coming years... Could that be done? It would offer playback functionality similar to this app, but right in the Reaper timeline: http://www.kolor.com/kolor-eyes/download/

Maybe Cockos and Kolor can partner on this...
04-04-2016, 07:46 AM | #2 |
Administrator
Join Date: Jan 2005
Location: NYC
Posts: 15,716
Could probably write a video processor to deform the video as desired....
04-05-2016, 11:51 PM | #3 | |
Human being with feelings
Join Date: Mar 2016
Posts: 9
Quote:
Not sure if this helps, but here is some FFmpeg code for the cubemap projection that Facebook uses, a very efficient way of projecting: https://github.com/facebook/transform

Albert
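For the curious, the essence of an equirectangular-to-cubemap transform is simple: for each point on a cube face, take the 3D direction from the cube's center through that point and convert it to longitude/latitude in the source frame. A rough Python sketch for a single front-facing (+Z) face (the conventions here are my own illustration, not code from the linked repo):

```python
import math

def equirect_coords_for_front_face(a, b, width, height):
    """Map a point (a, b) in [-1, 1] on the front (+Z) cube face to
    pixel coordinates in an equirectangular source of size width x height."""
    # Direction from the cube center through the face point.
    x, y, z = a, b, 1.0
    r = math.sqrt(x * x + y * y + z * z)
    lon = math.atan2(x, z)   # -pi..pi, 0 = straight ahead
    lat = math.asin(y / r)   # -pi/2..pi/2
    # Longitude/latitude map linearly onto the equirectangular frame.
    u = (lon / (2 * math.pi) + 0.5) * width
    v = (lat / math.pi + 0.5) * height
    return u, v
```

The other five faces differ only in how (a, b) maps to a direction vector; the efficiency of the cubemap format comes from spending fewer pixels near the poles than equirectangular does.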
04-06-2016, 01:55 PM | #4 |
Human being with feelings
Join Date: May 2006
Location: Saskatoon, Canada
Posts: 2,110
This would be a great addition. Reaper is so ideally suited to Ambisonic and spatial mixing techniques already.
__________________
mymusic http://music.darylpierce.com mywork http://production.darylpierce.com mypodcast https://youtube.com/@ultimatesoundtest
06-15-2016, 10:00 PM | #5 |
Human being with feelings
Join Date: May 2006
Location: Saskatoon, Canada
Posts: 2,110
|
So I think I've found a script to help with this at the Google mathmap project. Now, among other things, I need to know what sort of transform to use to create the desired effect from this math.
I've tried with gfx_blit and I can get panning to work, but that's all: no bending or windowing of the image. I know I'm way out of my depth with all this, but I want to see it done. Code:
//@param1:FoV 'view' 150 15 320 150 1
//@param2:eye 'eye' 1 0 1.5 0.5 0.01
//@param3:pan 'pan' 0 -180 180 0.5 1
//@param4:vsh 'shape' 0 -1 1 0 0.01

img1 = 0;
img2 = input_ismaster();
input_info(src,W,H);
pi = 3.14159265;

//angular scale factors
Sppr = W / (2*pi); //source pixels/radian
d = eye + 1;
wfov = pi * min( FoV, 160 * d ) / 180; //radians
Drpp = 2*d*tan(wfov/(2*d)) / W;

W > 0 ? (
  gfx_a = W;
  //destination coordinates in radians
  xr = x * Drpp;
  yr = (y - Y * vsh) * Drpp;
  //project from dest to source
  azi = d * atan2( xr, d);
  alt = atan2( yr * (eye + cos(azi)), d );
  //source coordinates in pixels
  sx = Sppr*azi;
  sy = Sppr*alt;
  //pan & interpolate
  sx = sx + W*pan/360;
  gfx_blit(img1, paspect, sx|0, sy|0, W, H);
);
//if sx > X then sx = sx - W end;
//if sx < -X then sx = sx + W end;
//in(xy:[sx, sy])
Last edited by plush2; 06-16-2016 at 01:40 PM.
06-16-2016, 11:40 AM | #6 |
Human being with feelings
Join Date: May 2013
Location: New York
Posts: 780
|
I'd love to see this happen. We've been working on a lot of VR/360 video content lately at my studio. Right now we're using SpookSync3D to sync playback between Reaper and Kolor Eyes, but to have 360 videos working natively in Reaper would be really awesome.
06-16-2016, 01:39 PM | #7 |
Human being with feelings
Join Date: May 2006
Location: Saskatoon, Canada
Posts: 2,110
It's good to hear that you're getting in on the content creation, Robin. What tools are you using? Are you working in FOA or TOA, or something completely different?
06-16-2016, 07:26 PM | #8 |
Human being with feelings
Join Date: May 2013
Location: New York
Posts: 780
|
We're working in FOA, mostly using Matthias Kronlachner's ambiX plugins, as well as his multi-channel convolver for ambisonic reverb. For monitoring/metering we're using the Harpex-B plugin. Always looking for new tools, so I'd love to know what other folks are using for this type of stuff.
06-18-2016, 03:35 PM | #9 |
Human being with feelings
Join Date: May 2006
Location: Saskatoon, Canada
Posts: 2,110
Sadly, I've reached the edge of my current capabilities and have made no progress. Here's hoping that post #2 comes to fruition.
In the meantime, I'm using a variety of tools: Ambisonic Toolkit, the ambiX stuff, Wigware, and Blue Ripple for TOA mixing. I also have the Blue Ripple Harpex upsampler, which is nice for my TetraMic recordings.
06-18-2016, 04:03 PM | #10 | |
Human being with feelings
Join Date: Mar 2010
Location: Adelaide, South Australia (originally from Geelong)
Posts: 5,598
|
This topic has grabbed my interest. Posting to subscribe to updates.
I contributed to the Ossic Kickstarter campaign and am eagerly awaiting a pair of Ossic 3D headphones which they've just started creating the Ambisonic tools for. I have high expectations for these and am hoping they will be my primary monitoring tool for Ambisonic mixing. Quote:
Regarding Harpex, I've been demoing their plugin for monitoring, and while it is good for some things, I find it kills the sense of ambience and depth in some of the more complex soundfield recordings, especially those created with the various Ambisonic mics (I'm keenly eyeing off a Core Sound TetraMic for my next mic purchase). Location tracking seems to be a bit random with Harpex on some recordings too.
06-18-2016, 07:33 PM | #11 | |
Human being with feelings
Join Date: May 2006
Location: Saskatoon, Canada
Posts: 2,110
|
Quote:
I agree that it's possible to overdo the envelopment with Harpex, but I love the resolution it adds. Do let us know how the whole Ossic thing turns out. If it's good, I will likely follow you in that purchase.
06-19-2016, 06:33 PM | #12 |
Human being with feelings
Join Date: Mar 2016
Posts: 9
|
I agree that Blue Ripple is very expensive, especially compared to all the other (mostly free) plugins, but the plugins get updated constantly, and new ones are added at no extra charge once you buy the package. He just released a 3rd order brick wall limiter, which is very useful for getting your levels under control.
Apart from that, it is the only VST solution afaik that can render 3rd order directly to binaural, using the decoders VST package. To me, once I heard that, it was like night and day compared to 1st order: the soundfield really opens up and you get a much better sense of behind-the-head localization.

I'd be curious about your thoughts re. the upsampler plugin, which is next on my shopping list. Does it make a difference to your TetraMic recordings when you upsample them to 3rd order and then render them to FOA (I'm assuming...), as opposed to just mixing the 1st order TetraMic B-format into the TOA stream?

Hope we get this 360° video playback feature soonish... :-)
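As an aside for anyone new to the order terminology being thrown around here: a full-sphere ambisonic stream of order N carries (N + 1)² channels, which is why FOA is 4 channels and TOA is 16:

```python
def ambisonic_channels(order):
    """Channel count of a full-sphere (periphonic) ambisonic
    stream: (N + 1)^2 for order N."""
    return (order + 1) ** 2

# FOA (1st order) -> 4 channels, TOA (3rd order) -> 16 channels
```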
06-19-2016, 07:35 PM | #13 | |
Human being with feelings
Join Date: May 2006
Location: Saskatoon, Canada
Posts: 2,110
|
Quote:
06-19-2016, 11:14 PM | #14 |
Human being with feelings
Join Date: May 2013
Location: New York
Posts: 780
|
Sorry to derail this even more, but what platforms are you guys mixing for that can decode a TOA mix? So far most of the VR content I've worked on has been bound for YouTube, which only supports FOA. I'm sure TOA would be a nice step up, but I can't see a good reason to start working in it if the final delivery mechanism can't actually use it.
06-20-2016, 06:56 AM | #15 | |
Administrator
Join Date: Jan 2005
Location: NYC
Posts: 15,716
|
Quote:
06-20-2016, 09:15 AM | #16 | |
Human being with feelings
Join Date: Mar 2016
Posts: 9
|
Quote:
06-20-2016, 09:30 AM | #17 | |
Human being with feelings
Join Date: Mar 2016
Posts: 9
|
Quote:
https://www.dropbox.com/s/05zud8c759...x2048.mp4?dl=0
06-20-2016, 10:19 AM | #18 | |
Human being with feelings
Join Date: May 2006
Location: Saskatoon, Canada
Posts: 2,110
|
Quote:
The video from TumbleAndYaw should serve well as a test video.
06-20-2016, 11:49 AM | #19 |
Human being with feelings
Join Date: Jun 2011
Location: Belgium
Posts: 5,246
|
As there are several of you who also run a website, this might be interesting:
https://github.com/polarch/JSAmbisonics "A JS library for first-order ambisonic (FOA) and higher-order ambisonic (HOA) processing for browsers, using Web Audio."
__________________
In a time of deceit telling the truth is a revolutionary act. George Orwell
06-21-2016, 09:15 AM | #20 |
Administrator
Join Date: Jan 2005
Location: NYC
Posts: 15,716
|
Here's something I made using gfx_xformblit(). This needs the map() function to be implemented with the correct math, but it should allow you to deform the video back to a correct view:
Code:
removed because it is obsolete
Last edited by Justin; 06-27-2016 at 04:06 PM.
06-21-2016, 12:55 PM | #21 | ||
Human being with feelings
Join Date: Mar 2010
Location: Adelaide, South Australia (originally from Geelong)
Posts: 5,598
|
Absolutely! Couldn't agree more!
Quote:
I'm strongly leaning towards a 5.1 surround setup with the extra two Auratones as left front up and right front up, using the four others as left rear up, right rear up, left lower side and right lower side. I also need to set up more D/A on my Fireface UFX for the additional speakers. I have 17 channels of amplification (13 channels of Yamaha amps and another smaller four-channel amp), so there's plenty of room for future expansion there. Quote:
I have to say, the Harpex-B demo has left me less than convinced at this point. It is good, but the imaging issues bug me. I will most certainly be posting about the Ossics when they arrive. They're not due here until January 2017, though, and that seems like AGES when something so potentially exciting is the subject!!
06-27-2016, 12:56 PM | #22 |
Administrator
Join Date: Jan 2005
Location: NYC
Posts: 15,716
|
OK, here's a 360 viewer. It ended up quite a bit longer than I anticipated, but maybe someone better at math can simplify or improve it:
Code:
//equirectangular 360 panner
//@param1:fov_ang 'fov' 90 20 170 45 1
//@param2:x_ang 'x' 0 -180 180 0.5 1
//@param3:y_ang 'y' 0 -90 90 0.5 1
//@param4:div 'div' 100 20 150 40 1
//@param5:filter 'filter' 0 0 1 0 1

project_w<1 ? project_w=1920;
project_h<1 ? project_h=1080;
gfx_img_resize(-1,project_w,project_h);
input_info(0,srcw,srch);

function matrix_make_rotate(matrix, m, d) global() local(m2) (
  memset(matrix,0,16);
  matrix[m*5-5] = matrix[15] = 1.0;
  m2 = m==2 ? 0 : (m+1);
  matrix[m2*5]=matrix[m*5]=cos(d);
  matrix[m2*4+m]=-(matrix[m*4+m2]=sin(d));
);

function matrix_multiply(dest,src) global() local(s0,s1,s2,s3) (
  loop(4,
    s0=dest[0]; s1=dest[1]; s2=dest[2]; s3=dest[3];
    dest[0] = s0*src[(0<<2)+0]+s1*src[(1<<2)+0]+s2*src[(2<<2)+0]+s3*src[(3<<2)+0];
    dest[1] = s0*src[(0<<2)+1]+s1*src[(1<<2)+1]+s2*src[(2<<2)+1]+s3*src[(3<<2)+1];
    dest[2] = s0*src[(0<<2)+2]+s1*src[(1<<2)+2]+s2*src[(2<<2)+2]+s3*src[(3<<2)+2];
    dest[3] = s0*src[(0<<2)+3]+s1*src[(1<<2)+3]+s2*src[(2<<2)+3]+s3*src[(3<<2)+3];
    dest+=4;
  );
);

function matrix_apply(x,y,z, m, vec*) global() (
  vec.x = x*m[0] + y*m[1] + z*m[2] + m[3];
  vec.y = x*m[4] + y*m[5] + z*m[6] + m[7];
  vec.z = x*m[8] + y*m[9] + z*m[10] + m[11];
);

matrix1 = 0;
matrix2 = matrix1 + 16;
tab=matrix2 + 16;
xdiv=ydiv=div|0;
screen_z = 1/tan(fov_ang * 0.5 * $pi / 180);
y = -0.5 * (project_h/project_w);
dx = 1.0 / (xdiv-1);
dy = (project_h/project_w) / (ydiv-1);
matrix_make_rotate(matrix1,2,-x_ang * $pi / 180);
matrix_make_rotate(matrix2,1,y_ang * $pi / 180);
matrix_multiply(matrix1,matrix2);

ptr = tab;
loop(ydiv,
  x=-0.5;
  loop(xdiv,
    matrix_apply(x,y,screen_z,matrix1,vv);
    sy = 0.5 + asin(vv.y / sqrt(vv.x*vv.x+vv.y*vv.y+vv.z*vv.z)) / $pi;
    sx = 0.5 + (atan2(vv.x,vv.z)) / (2*$pi);
    sy < 0 ? ( sy = -sy; sx += 0.5; ) : sy >= 1 ? ( sy=2-sy; sx+=0.5; );
    ptr[0]=sx*srcw;
    ptr[1]=sy*srch;
    x+=dx;
    ptr+=2;
  );
  y+=dy;
);

gfx_mode=filter > 0 ? 0x100 : 0;
gfx_xformblit(0, 0,0, project_w,project_h,xdiv,ydiv, tab,0);

Preview:
Last edited by Justin; 06-27-2016 at 04:43 PM. Reason: updated processor some
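The heart of the processor above is the inner loop: each grid vertex's 3D view direction is turned into latitude/longitude and then into source-pixel coordinates, and gfx_xformblit interpolates between the grid points. That per-vertex lookup, rewritten in Python purely for readability (a sketch of the math only, not usable JSFX):

```python
import math

def equirect_lookup(vx, vy, vz, srcw, srch):
    """Map a 3D view direction to equirectangular source pixel
    coordinates, mirroring the sx/sy math in the JSFX above."""
    sy = 0.5 + math.asin(vy / math.sqrt(vx*vx + vy*vy + vz*vz)) / math.pi
    sx = 0.5 + math.atan2(vx, vz) / (2 * math.pi)
    # Vertical overshoot past a pole folds back onto the opposite
    # side of the sphere (the sy<0 / sy>=1 branches in the JSFX).
    if sy < 0:
        sy, sx = -sy, sx + 0.5
    elif sy >= 1:
        sy, sx = 2 - sy, sx + 0.5
    return sx * srcw, sy * srch
```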
06-27-2016, 02:01 PM | #23 | |
Human being with feelings
Join Date: May 2006
Location: Saskatoon, Canada
Posts: 2,110
|
Quote:
Might I suggest the following for the 'y' rotation param? Code:
//@param3:y_ang 'y' 0 -90 90 0.5 1
06-27-2016, 03:49 PM | #24 |
Administrator
Join Date: Jan 2005
Location: NYC
Posts: 15,716
|
5.22pre6 is up which should support this processor in the monitoring chain nicely...
06-27-2016, 04:03 PM | #25 |
Administrator
Join Date: Jan 2005
Location: NYC
Posts: 15,716
|
Done, also did (again) some other small fixes/simplifications, and added a filtering parameter. (edit: edited again)
Last edited by Justin; 06-27-2016 at 04:45 PM. |
06-27-2016, 06:15 PM | #26 |
Administrator
Join Date: Jan 2005
Location: NYC
Posts: 15,716
|
Another test version; it removes some stitching artifacts but introduces some others (ugh):
Code:
//Equirectangular 360 panner
//@param1:fov_ang 'fov' 90 20 170 45 1
//@param2:x_ang 'x' 0 -180 180 0.5 1
//@param3:y_ang 'y' 0 -90 90 0.5 1
//@param4:div 'div' 100 20 150 40 1
//@param5:filter 'filter' 0 0 1 0 1

project_w<1 ? project_w=1920;
project_h<1 ? project_h=1080;
gfx_img_resize(-1,project_w,project_h);
input_info(0,srcw,srch);

function matrix_make_rotate(matrix, m, d) global() local(m2) (
  memset(matrix,0,16);
  matrix[m*5-5] = matrix[15] = 1.0;
  m2 = m==2 ? 0 : (m+1);
  matrix[m2*5]=matrix[m*5]=cos(d);
  matrix[m2*4+m]=-(matrix[m*4+m2]=sin(d));
);

function matrix_multiply(dest,src) global() local(s0,s1,s2,s3) (
  loop(4,
    s0=dest[0]; s1=dest[1]; s2=dest[2]; s3=dest[3];
    dest[0] = s0*src[(0<<2)+0]+s1*src[(1<<2)+0]+s2*src[(2<<2)+0]+s3*src[(3<<2)+0];
    dest[1] = s0*src[(0<<2)+1]+s1*src[(1<<2)+1]+s2*src[(2<<2)+1]+s3*src[(3<<2)+1];
    dest[2] = s0*src[(0<<2)+2]+s1*src[(1<<2)+2]+s2*src[(2<<2)+2]+s3*src[(3<<2)+2];
    dest[3] = s0*src[(0<<2)+3]+s1*src[(1<<2)+3]+s2*src[(2<<2)+3]+s3*src[(3<<2)+3];
    dest+=4;
  );
);

function matrix_apply(x,y,z, m, vec*) global() (
  vec.x = x*m[0] + y*m[1] + z*m[2] + m[3];
  vec.y = x*m[4] + y*m[5] + z*m[6] + m[7];
  vec.z = x*m[8] + y*m[9] + z*m[10] + m[11];
);

matrix1 = 0;
matrix2 = matrix1 + 16;
tab=matrix2 + 16;
xdiv=ydiv=div|0;
screen_z = 1/tan(fov_ang * 0.5 * $pi / 180);
y = -0.5 * (project_h/project_w);
dx = 1.0 / (xdiv-1);
dy = (project_h/project_w) / (ydiv-1);
matrix_make_rotate(matrix1,2,-x_ang * $pi / 180);
matrix_make_rotate(matrix2,1,y_ang * $pi / 180);
matrix_multiply(matrix1,matrix2);

xf=0;
ptr = tab;
loop(ydiv,
  x=-0.5;
  ya_offs=xa_offs=0;
  loop(xdiv,
    matrix_apply(x,y,screen_z,matrix1,vv);
    ya = asin(vv.y / sqrt(vv.x*vv.x+vv.y*vv.y+vv.z*vv.z)) / $pi + ya_offs + 0.5;
    xa = atan2(vv.x,vv.z)/(2*$pi) + xa_offs + 0.5;
    x!=-0.5 ? (
      xa>lxa+0.5 ? ( xa-=1; xa_offs-=1; ) : xa < lxa-0.5 ? ( xa+=1; xa_offs+=1; );
      ya>lya+0.5 ? ( ya-=1; ya_offs-=1; ) : ya < lya-0.5 ? ( ya+=1; ya_offs+=1; );
    );
    lxa=xa; lya=ya;
    xa>1?xf|=1;
    xa<0?xf|=2;
    ptr[0]=xa*srcw;
    ptr[1]=ya*srch;
    x+=dx;
    ptr+=2;
  );
  y+=dy;
);

//gfx_set(1,0,1);
//gfx_fillrect(0,0,project_w,project_h);
gfx_mode=filter > 0 ? 0x100 : 0;
gfx_xformblit(0, 0,0, project_w,project_h,xdiv,ydiv, tab,0);

ptr=tab;
loop(ydiv*xdiv, ptr[0] -= srcw-0.5/*fudge*/; ptr+=2; );
(xf&1) ? gfx_xformblit(0, 0,0, project_w,project_h,xdiv,ydiv, tab,0);

ptr=tab;
loop(ydiv*xdiv, ptr[0] += srcw*2+1.0; ptr+=2; );
(xf&2) ? gfx_xformblit(0, 0,0, project_w,project_h,xdiv,ydiv, tab,0);
06-27-2016, 08:24 PM | #27 |
Human being with feelings
Join Date: Mar 2016
Posts: 9
|
Wonderful! Thanks so much for having a crack at this. Will start playing with it asap.
Can the yaw and pitch (horizontal and vertical rotation, i.e. the x and y controls in this case) be linked to parameters in a VST/JS audio plugin? Ideally you'd want to rotate the 360º video and have the soundfield counter-rotate, which creates the impression that sound sources stay in place when you visually rotate.

Another very helpful addition would be some sort of video overlay / crosshair pointer that shows the mouse position in horizontal and vertical degrees. That way, when you have, say, a car engine in the 360 video located at 90º left and 20º down, you can move the pointer over the car in the video, read those values, and enter them into the audio positioner plugin for that sound source.

Hope this is not pushing it too much... :-) Already very grateful for what we have here so far!
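The crosshair readout suggested above is, at its core, just a linear coordinate conversion: a position in the flat equirectangular frame maps directly to yaw/pitch. A hypothetical sketch (the 0..1 pointer coordinates and sign conventions are assumptions for illustration, not anything Reaper exposes):

```python
def equirect_pos_to_degrees(u, v):
    """Convert a normalized pointer position (u, v) in a flat
    equirectangular frame (0..1, origin top-left) to yaw/pitch in
    degrees; yaw positive to the right, pitch positive upward."""
    yaw = (u - 0.5) * 360.0
    pitch = (0.5 - v) * 180.0
    return yaw, pitch

# Center of frame -> (0.0, 0.0); a quarter of the way in from the
# left and top -> (-90.0, 45.0).
```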
06-28-2016, 07:22 AM | #28 | |
Human being with feelings
Join Date: May 2013
Location: New York
Posts: 780
|
Quote:
06-28-2016, 07:27 AM | #29 | |
Human being with feelings
Join Date: May 2006
Location: Saskatoon, Canada
Posts: 2,110
|
Quote:
This is working, and I hope to post a little gif tomorrow to demonstrate it.

*edit: I see that RobinGShore has a gif demonstration using MIDI CC to do basically the same thing.
06-29-2016, 03:50 PM | #30 |
Administrator
Join Date: Jan 2005
Location: NYC
Posts: 15,716
|
Here's a video processor that is cleaned up and has fewer stitching artifacts:
Code:
//equirectangular 360 panner
//@param1:fov_ang 'fov' 90 20 170 90 1
//@param2:x_ang 'yaw' 0 -180 180 0 1
//@param3:y_ang 'pitch' 0 -90 90 0 1
//@param4:z_ang 'roll' 0 -180 180 0 1

project_w=1920;
project_h=1080;
xdiv=ydiv=100; // subdivision (quality) 10-200 is usable
filter=1;  // bilinear filtering?
xscale=1;  // -1 to flip yaw
yscale=1;  // -1 to flip pitch
zscale=1;  // -1 to flip roll

function matrix_make_rotate(matrix, m, d) global() local(m2) (
  memset(matrix,0,16);
  matrix[m*5-5] = matrix[15] = 1.0;
  m2 = ((m%=3)+1)%3;
  matrix[m2*5]=matrix[m*5]=cos(d);
  matrix[m2*4+m]=-(matrix[m*4+m2]=sin(d));
);

function matrix_make_xlate(matrix, x, y, z) global() (
  memset(matrix,0,16);
  matrix[0]=matrix[5]=matrix[10]=matrix[15]=1;
  matrix[3]=x;
  matrix[7]=y;
  matrix[11]=z;
);

function matrix_multiply(dest,src) global() local(s0,s1,s2,s3) (
  loop(4,
    s0=dest[0]; s1=dest[1]; s2=dest[2]; s3=dest[3];
    loop(4,
      dest[0] = s0*src[0]+s1*src[4]+s2*src[8]+s3*src[12];
      dest+=1;
      src+=1;
    );
    src -= 4;
  );
);

function matrix_apply(x,y,z, m, vec*) global() (
  vec.x = x*m[0] + y*m[1] + z*m[2] + m[3];
  vec.y = x*m[4] + y*m[5] + z*m[6] + m[7];
  vec.z = x*m[8] + y*m[9] + z*m[10] + m[11];
);

function vector_project(vv*, s*) global(srcw,srch) (
  s.y = srch * (0.5 + asin(vv.y / sqrt(vv.x*vv.x+vv.y*vv.y+vv.z*vv.z)) * (1/$pi));
  s.x = srcw * (0.5 + atan2(vv.x,vv.z) * (1 / (2*$pi)));
);

function centerx(sm*, v*) global(srcw) (
  v.x < sm.x-srcw*.5 ? ( v.x += srcw; 2; ) :
  v.x > sm.x+srcw*.5 ? ( v.x -= srcw; 1; );
);

gfx_img_resize(-1,project_w,project_h);
input_info(0,srcw,srch);
gfx_mode=filter ? 0x100 : 0;

screen_z = project_w/tan(fov_ang * 0.5 * $pi / 180);
dxpos=project_w/(xdiv-1);
dypos=project_h/(ydiv-1);
ypos = 0;
matrix1 = 0;
matrix2 = matrix1 + 16;
matrix_make_rotate(matrix1,2,xscale * x_ang * -$pi / 180);
matrix_make_rotate(matrix2,1,yscale * y_ang * $pi / 180);
matrix_multiply(matrix1,matrix2);
matrix_make_rotate(matrix2,3,z_ang * -$pi / 180);
matrix_multiply(matrix1,matrix2);
matrix_make_xlate(matrix2,project_w*-.5,project_h*-.5,0);
matrix_multiply(matrix1,matrix2);

loop(ydiv,
  y1=(ypos)&0xffffe;
  y2=(ypos+=dypos)&0xffffe;
  idy = 1/(y2-y1);
  xpos = 0;
  loop(xdiv,
    x1=(xpos)&0xffffe;
    x2=(xpos+=dxpos)&0xffffe;
    idx = 1/(x2-x1);
    matrix_apply(x1,y1,screen_z,matrix1,vv); vector_project(vv, s1);
    matrix_apply(x2,y1,screen_z,matrix1,vv); vector_project(vv, s2);
    matrix_apply(x1,y2,screen_z,matrix1,vv); vector_project(vv, s3);
    matrix_apply(x2,y2,screen_z,matrix1,vv); vector_project(vv, s4);
    a=centerx(s1,s2)|centerx(s1,s3)|centerx(s1,s4);
    dsdx = (s2.x-s1.x) * idx;
    dtdx = (s2.y-s1.y) * idx;
    dsdy = (s3.x-s1.x) * idy;
    dtdy = (s3.y-s1.y) * idy;
    dsdx2 = (s4.x-s3.x) * idx;
    dtdx2 = (s4.y-s3.y) * idx;
    dsdxdy = (dsdx2-dsdx) * idy;
    dtdxdy = (dtdx2-dtdx) * idy;
    gfx_deltablit(0, x1,y1, x2-x1,y2-y1, s1.x,s1.y, dsdx, dtdx, dsdy, dtdy, dsdxdy, dtdxdy);
    (a&1) ? gfx_deltablit(0, x1,y1, x2-x1,y2-y1, s1.x+srcw,s1.y, dsdx, dtdx, dsdy, dtdy, dsdxdy, dtdxdy);
    (a&2) ? gfx_deltablit(0, x1,y1, x2-x1,y2-y1, s1.x-srcw,s1.y, dsdx, dtdx, dsdy, dtdy, dsdxdy, dtdxdy);
  );
);

(edited: added roll support)
Last edited by Justin; 06-29-2016 at 09:10 PM.
08-19-2016, 12:57 AM | #31 |
Human being with feelings
Join Date: Aug 2016
Posts: 2
|
Been following this thread since I started playing with ambisonics for 360 videos. I think what would be useful for sound designers is a view that presents the 360 video as is (flat equirectangular), where markers can be added on the actual picture as it plays back. You could then animate the position of these markers (using automation curves) to track, say, a moving object within the video, and link the values coming from these marker position curves (converted to degrees) to an ambisonic panner. This, in my opinion, is a much more efficient workflow than having to pan around the 360 video to see/hear where the sound is supposed to be coming from and track it around.
Is it possible, using the video processor, to draw UI elements (a crosshair, a square, a circle... whatever) that can then be controlled with the mouse and linked to a parameter in the code? I'm having trouble finding any documentation on the JSFX code specific to the video processor, as it seems to differ from the published language reference. I would be happy to write this plugin if I can get some guidance from the experts on this forum, as I only started with Reaper a couple of weeks ago.
08-19-2016, 12:21 PM | #32 | |
Human being with feelings
Join Date: May 2006
Location: Saskatoon, Canada
Posts: 2,110
|
Quote:
What about a cubic projection as well, where the mixer could choose which panel they want to view? It is theoretically possible to do mouse-move capture with the gfx code, as can be seen in several JSFX; I have not yet seen it done with the video FX though. That gfx section in the JSFX code guide is probably your best starting point for features. I'm not sure if it's one-to-one with the video FX, but there are many similarities.
08-19-2016, 12:27 PM | #33 | |
Human being with feelings
Join Date: Aug 2016
Posts: 2
|
Quote:
Do you know where I can find more documentation about the JSFX standard used by the video processor? I tried a few of the gfx functions from the website and they don't work, and I don't want to reverse engineer the existing presets to find what I'm looking for.
09-15-2016, 02:06 PM | #34 | |
Human being with feelings
Join Date: Jul 2014
Posts: 6
|
FYI, Noise Makers updated its 3D panner Ambi Pan with a transparent window doing basically what you described below.
It was designed to pan multiple sources on top of an equirectangular 360 video. www.noisemakers.fr/ambi-pan Quote:
09-15-2016, 02:22 PM | #35 |
Human being with feelings
Join Date: Jul 2014
Posts: 6
|
It looks like this
Last edited by charlifiiiii; 09-15-2016 at 02:27 PM. Reason: image not displayed
11-19-2016, 08:58 AM | #36 |
Human being with feelings
Join Date: Jul 2016
Posts: 45
|
This is awesome! Is there any chance the video could be rotated by clicking and dragging the mouse?
09-23-2018, 09:06 AM | #37 |
Human being with feelings
Join Date: Nov 2008
Location: Stockholm Sweden
Posts: 23
|
In this context, I have a small project on GitHub named OHTI (Open HeadTracking Initiative): hardware and firmware for a Bluetooth head tracker that uses Omnitone to convert ambisonics to head-tracked binaural. The head tracker currently sends quaternion data for rotation control, but it can easily be converted to use OSC (with an OSC npm module).

The goal is to create the impression that you are really listening to a stationary sound image, improving externalization when listening on headphones. Commercial products that do similar things include the Smyth Realiser, for example.

All of the code for the player and decoder is written in JavaScript. Does anyone know how to integrate JavaScript code into Reaper?
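For anyone wanting to wire tracker output like this to a soundfield rotator, the usual first step is converting the quaternion into yaw/pitch/roll angles. A sketch of the standard conversion (the z-y-x angle convention here is an assumption; OHTI's own axis conventions may differ):

```python
import math

def quaternion_to_ypr(w, x, y, z):
    """Convert a unit quaternion to (yaw, pitch, roll) in degrees,
    using the common z-y-x (yaw-pitch-roll) convention."""
    yaw = math.atan2(2 * (w*z + x*y), 1 - 2 * (y*y + z*z))
    # Clamp guards against tiny numeric overshoot outside [-1, 1].
    pitch = math.asin(max(-1.0, min(1.0, 2 * (w*y - z*x))))
    roll = math.atan2(2 * (w*x + y*z), 1 - 2 * (x*x + y*y))
    deg = 180 / math.pi
    return yaw * deg, pitch * deg, roll * deg
```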
09-23-2018, 02:37 PM | #38 |
Human being with feelings
Join Date: Jul 2017
Posts: 24
|
I guess it might be more efficient to write it in proper C++, in the form of a VST plug-in.
The usual patching would be to employ a rotator plugin (e.g. mcfx_rotator), control the angles via MIDI/OSC, and use a binaural decoder plugin afterwards (e.g. IEM's BinauralDecoder ;-) ). There are also binaural decoding plug-ins out there which already have a rotation feature implemented.

For your project, a C++ interface for your Bluetooth quaternion data would be great, maybe in the form of a library/class under a proper license, so third-party plug-in manufacturers could use it.

PS: I think this discussion should be continued in another thread, as it's off-topic.
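For first-order material, the rotator stage itself is tiny: a yaw rotation leaves W and Z untouched and applies a plain 2D rotation to X and Y. A per-sample sketch (FuMa-style W/X/Y/Z channel ordering and the rotation sign are assumptions for illustration):

```python
import math

def rotate_foa_yaw(w, x, y, z, theta_deg):
    """Rotate a first-order B-format sample (W, X, Y, Z) about the
    vertical axis by theta_deg; W and Z carry no horizontal
    direction information, so only X and Y mix."""
    t = math.radians(theta_deg)
    c, s = math.cos(t), math.sin(t)
    return w, x * c - y * s, x * s + y * c, z
```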