Old 04-03-2016, 11:20 AM   #1
TumbleAndYaw
Human being with feelings
 
Join Date: Mar 2016
Posts: 9
Default 360° video in Reaper?

Hi,

One thing that would make this amazing thing that is Reaper even more amazing would be the addition of 360° video playback. This is a new, up-and-coming format that is expected to get a lot of traction over the coming years...

Could that be done? It would offer playback functionality similar to this app, but now right in the Reaper timeline:

http://www.kolor.com/kolor-eyes/download/

Maybe Cockos and Kolor can partner on this...
Old 04-04-2016, 07:46 AM   #2
Justin
Administrator
 
Join Date: Jan 2005
Location: NYC
Posts: 15,716
Default

Could probably write a video processor to deform the video as desired....
Old 04-05-2016, 11:51 PM   #3
TumbleAndYaw
Human being with feelings
 
Join Date: Mar 2016
Posts: 9
Default

Quote:
Originally Posted by Justin View Post
Could probably write a video processor to deform the video as desired....
Thanks Justin. That would be such a help for many of us spatial audio engineers, who currently have no reliable way of locking audio to picture.

Not sure if this helps, but here is some FFmpeg code for the cubemap projection that Facebook uses, a very efficient way of projecting:

https://github.com/facebook/transform
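If I remember right, once the fork is built you invoke the filter along the lines of ffmpeg -i input.mp4 -vf "transform=input_stereo_format=MONO" output.mp4 to remap equirectangular frames into the cubemap layout; do check the repo README for the exact option names and the cube-size/subdivision settings, I'm going from memory here.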

Albert
Old 04-06-2016, 01:55 PM   #4
plush2
Human being with feelings
 
Join Date: May 2006
Location: Saskatoon, Canada
Posts: 2,110
Default

This would be a great addition. Reaper is so ideally suited to Ambisonic and spatial mixing techniques already.
Old 06-15-2016, 10:00 PM   #5
plush2
Human being with feelings
 
Join Date: May 2006
Location: Saskatoon, Canada
Posts: 2,110
Default

So I think I've found a script to help with this in the MathMap project on Google Code. Now, among other things, I need to know what sort of transform to use to create the desired effect from this math.

I've tried with gfx_blit and I can get panning to work, but that's all: no bending or windowing of the image. I know I'm way out of my depth with all this, but I want to see it done.

Code:
//@param1:FoV 'view' 150 15 320 150 1
//@param2:eye 'eye' 1 0 1.5 0.5 0.01
//@param3:pan 'pan' 0 -180 180 0.5 1
//@param4:vsh 'shape' 0 -1 1 0 0.01
img1 = 0; //first video input
img2 = input_ismaster(); //unused
input_info(img1,W,H);
pi = 3.14159265;
//angular scale factors
Sppr = W / (2*pi); //source pixels/radian
d = eye + 1;
wfov = pi * min( FoV, 160 * d ) / 180; //radians
Drpp = 2*d*tan(wfov/(2*d)) / W;
W > 0 ? (
  gfx_a = 1; //full opacity (was gfx_a = W, presumably a typo)
  //NOTE: x, y, X and Y below come from the per-pixel MathMap original and
  //are never set here (they read as 0), which is why only the pan offset
  //has any visible effect; a video processor works on whole-frame blits,
  //not per-pixel expressions.
  //destination coordinates in radians
  xr = x * Drpp; yr = (y - Y * vsh) * Drpp;
  //project from dest to source
  azi = d * atan2( xr, d);
  alt = atan2( yr * (eye + cos(azi)), d );
  //source coordinates in pixels
  sx = Sppr*azi; sy = Sppr*alt;
  //pan
  sx = sx + W*pan/360;
  gfx_blit(img1, paspect, sx|0, sy|0, W, H);
);
//leftover MathMap wrap-around logic, not valid EEL2:
//if sx > X then sx = sx - W end;
//if sx < -X then sx = sx + W end;
//in(xy:[sx, sy])
The basic desire is to have a Panini-type viewer for equirectangular panoramic video.

Last edited by plush2; 06-16-2016 at 01:40 PM.
Old 06-16-2016, 11:40 AM   #6
RobinGShore
Human being with feelings
 
Join Date: May 2013
Location: New York
Posts: 780
Default

I'd love to see this happen. We've been working on a lot of VR/360 video content lately at my studio. Right now we're using SpookSync3D to sync playback between Reaper and Kolor Eyes, but to have 360 videos working natively in Reaper would be really awesome.
Old 06-16-2016, 01:39 PM   #7
plush2
Human being with feelings
 
Join Date: May 2006
Location: Saskatoon, Canada
Posts: 2,110
Default

It's good to hear that you are getting in on the content creation, Robin. What tools are you using? Are you working in FOA or TOA (first- or third-order Ambisonics), or something completely different?
Old 06-16-2016, 07:26 PM   #8
RobinGShore
Human being with feelings
 
Join Date: May 2013
Location: New York
Posts: 780
Default

We're working in FOA. Mostly using Matthias Kronlachner's ambix plugins, as well as his multi-channel convolver for ambisonic reverb. For monitoring/metering we're using the Harpex-B plugin. Always looking for new tools, so I'd love to know what other folks are using for this type of stuff.
Old 06-18-2016, 03:35 PM   #9
plush2
Human being with feelings
 
Join Date: May 2006
Location: Saskatoon, Canada
Posts: 2,110
Default

Sadly, I've reached the edge of my current capabilities and have made no progress. Here's hoping that post #2 comes to fruition.

In the meantime, I'm using a variety of tools: Ambisonic Toolkit, the AmbiX stuff, Wigware, and Blue Ripple for TOA mixing. I also have the Blue Ripple Harpex upsampler, which is nice for my TetraMic recordings.
Old 06-18-2016, 04:03 PM   #10
Dannii
Human being with feelings
 
Join Date: Mar 2010
Location: Adelaide, South Australia (originally from Geelong)
Posts: 5,598
Default

This topic has grabbed my interest. Posting to subscribe to updates.

I contributed to the Ossic Kickstarter campaign and am eagerly awaiting a pair of Ossic 3D headphones, which they've just started creating the Ambisonic tools for. I have high expectations for these and am hoping they will be my primary monitoring tool for Ambisonic mixing.

Quote:
Originally Posted by plush2 View Post
In the meantime, I'm using a variety of tools, Ambisonic Toolkit, the AmbiX stuff, Wigware, and BlueRipple for TOA mixing. I also have the BlueRipple Harpex upsampler which is nice for my tetramic recordings.
Which Blue Ripple packages do you own? I'm looking at a few options, but they are VERY expensive here in Australia with the current exchange rates. How do you find the requirement for a periodic internet connection to keep them running?

Regarding Harpex, I've been demoing their plugin for monitoring, and while it is good for some things, I find it kills the sense of ambience and depth in some of the more complex soundfield recordings, especially those created with the various Ambisonic mics (I'm keenly eyeing off a Core Sound TetraMic for my next mic purchase).
Location tracking seems to be a bit random with Harpex on some recordings too.
Old 06-18-2016, 07:33 PM   #11
plush2
Human being with feelings
 
Join Date: May 2006
Location: Saskatoon, Canada
Posts: 2,110
Default

Quote:
Originally Posted by ReaDave View Post
Which Blue Ripple packages do you own? I'm looking at a few options, but they are VERY expensive here in Australia with the current exchange rates. How do you find the requirement for a periodic internet connection to keep them running?

Regarding Harpex, I've been demoing their plugin for monitoring, and while it is good for some things, I find it kills the sense of ambience and depth in some of the more complex soundfield recordings, especially those created with the various Ambisonic mics (I'm keenly eyeing off a Core Sound TetraMic for my next mic purchase).
Location tracking seems to be a bit random with Harpex on some recordings too.
It's an exciting time to be involved in spatial audio. I only own the Harpex upsampler (which is a little different from Harpex-B, although based on the exact same technology), but I use the free package as well, which comes with some very useful tools. My experience with the licensing has been: license it once and never think about it again. I have had absolutely no issues with it. If I were to buy another of his toolsets it would be Rapture3D Advanced. As you said, they are expensive, but if you need what they do (decode to Auro-3D and such, for example) then they are absolutely worth it.

I agree that it's possible to overdo the envelopment on Harpex but I love the resolution it adds.

Do let us know how the whole Ossic thing turns out. If it's good I will likely follow you in that purchase.
Old 06-19-2016, 06:33 PM   #12
TumbleAndYaw
Human being with feelings
 
Join Date: Mar 2016
Posts: 9
Default

I agree that Blue Ripple is very expensive, especially compared to all the other (mostly free) plugins, but they do get updated constantly, and new plugins are added at no extra charge once you buy the package. He just released a 3rd-order brickwall limiter, which is very useful for getting your levels under control.
Apart from that, it is the only VST solution afaik that can render 3rd order directly to binaural, using the decoders VST package.

Once I heard that, it was night and day for me compared to 1st order. The soundfield really opens up and you get a much better sense of behind-the-head localization.

I'd be curious about your thoughts re. the upsampler plugin, which is next on my shopping list. Does it make a difference to your TetraMic recordings when you upsample them to 3rd order and then render them to FOA (I'm assuming..), as opposed to just mixing the 1st-order TetraMic B-format into the TOA stream?

Hope we get this 360° video playback feature soonish..:-)
Old 06-19-2016, 07:35 PM   #13
plush2
Human being with feelings
 
Join Date: May 2006
Location: Saskatoon, Canada
Posts: 2,110
Default

Quote:
Originally Posted by TumbleAndYaw View Post
I'd be curious about your thoughts re. the upsampler plugin, which is next on my shopping list. Does it make a difference to your TetraMic recordings when you upsample them to 3rd order and then render them to FOA (I'm assuming..), as opposed to just mixing the 1st-order TetraMic B-format into the TOA stream?

Hope we get this 360° video playback feature soonish..:-)
The Harpex upsampler mainly makes a huge difference when upsampling and monitoring in TOA. The difference it makes to a TetraMic recording in that case is quite noticeable. Going back to FOA afterwards it's less noticeable, although if you turn envelopment down to its lower settings you can get some interesting advantages in the way of cleaning up the immediate signals (closest to the mic).
Old 06-19-2016, 11:14 PM   #14
RobinGShore
Human being with feelings
 
Join Date: May 2013
Location: New York
Posts: 780
Default

Sorry to derail this even more, but what platforms are you guys mixing for that can decode a TOA mix? So far most of the VR content I've worked on has been bound for YouTube, which only supports FOA. I'm sure TOA would be a nice step up, but I can't see a good reason to start working in it if the final delivery mechanism can't actually use it.
Old 06-20-2016, 06:56 AM   #15
Justin
Administrator
 
Join Date: Jan 2005
Location: NYC
Posts: 15,716
Default

Quote:
Originally Posted by plush2 View Post
So I think I've found a script to help with this in the MathMap project on Google Code. [...]

I've tried with gfx_blit and I can get panning to work, but that's all: no bending or windowing of the image. [code snipped; see post #5]

The basic desire is to have a Panini-type viewer for equirectangular panoramic video.
Have a video to test with? I think you could use gfx_xformblit() or whatever it is called to undeform it...
Old 06-20-2016, 09:15 AM   #16
TumbleAndYaw
Human being with feelings
 
Join Date: Mar 2016
Posts: 9
Default

Quote:
Originally Posted by RobinGShore View Post
Sorry to derail this even more, but what platforms are you guys mixing for that can decode a TOA mix? So far most of the VR content I've worked on has been bound for YouTube, which only supports FOA. I'm sure TOA would be a nice step up, but I can't see a good reason to start working in it if the final delivery mechanism can't actually use it.
You can work in TOA with all its advantages and, once you're done, just use the first 4 channels (which will be your 1st-order B-format) for your YouTube render. Once the tech is more widely available (right now that's mostly through custom playback apps) you can replace your FOA mix with your TOA mix. Currently the only readily available solution is the free Jump Inspector app for Android, which will play back 3rd-order ACN/SN3D.
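As a concrete sketch of that truncation in JSFX (hypothetical and untested; it assumes a 16-channel ACN/SN3D track where ACN 0-3 carry the 1st-order components, and with SN3D weighting no rescaling is needed, you just drop the higher orders):

Code:
desc:truncate 3rd-order AmbiX (ACN/SN3D) to 1st order

@sample
//pass ACN 0-3 (W,Y,Z,X) through untouched, silence ACN 4-15
i = 4;
loop(12,
  spl(i) = 0;
  i += 1;
);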
Old 06-20-2016, 09:30 AM   #17
TumbleAndYaw
Human being with feelings
 
Join Date: Mar 2016
Posts: 9
Default

Quote:
Originally Posted by Justin View Post
Have a video to test with? I think you could use gfx_xformblit() or whatever it is called to undeform it...
This is a good video to test with (visual only, no sound):

https://www.dropbox.com/s/05zud8c759...x2048.mp4?dl=0
Old 06-20-2016, 10:19 AM   #18
plush2
Human being with feelings
 
Join Date: May 2006
Location: Saskatoon, Canada
Posts: 2,110
Default

Quote:
Originally Posted by RobinGShore View Post
Sorry to derail this even more, but what platforms are you guys mixing for that can decode a TOA mix? So far most of the VR content I've worked on has been bound for YouTube, which only supports FOA. I'm sure TOA would be a nice step up, but I can't see a good reason to start working in it if the final delivery mechanism can't actually use it.
A good point, certainly. YouTube will support TOA eventually, as it is in their published spec. I am attempting to code some of my own web-based and app-based delivery methods, so TOA is a very likely future for me. For web-based delivery, a frequent contributor on the sursound mailing list just posted this Web Audio TOA library.

The video from TumbleAndYaw should serve well as a test video.
Old 06-20-2016, 11:49 AM   #19
cyrano
Human being with feelings
 
Join Date: Jun 2011
Location: Belgium
Posts: 5,246
Default

As there are several of you who also run a website, this might be interesting:

https://github.com/polarch/JSAmbisonics

"A JS library for first-order ambisonic (FOA) and higher-order ambisonic (HOA) processing for browsers, using Web Audio."
Old 06-21-2016, 09:15 AM   #20
Justin
Administrator
 
Join Date: Jan 2005
Location: NYC
Posts: 15,716
Default

Here's something I made using gfx_xformblit(); it needs the map() function to be implemented with the correct math, but it should allow you to deform it back to correct:

Code:
removed because it is obsolete

Last edited by Justin; 06-27-2016 at 04:06 PM.
Old 06-21-2016, 12:55 PM   #21
Dannii
Human being with feelings
 
Join Date: Mar 2010
Location: Adelaide, South Australia (originally from Geelong)
Posts: 5,598
Default

Quote:
Originally Posted by plush2 View Post
It's an exciting time to be involved in spatial audio.
Absolutely! Couldn't agree more!
Quote:
Originally Posted by plush2 View Post
I only own the Harpex upsampler (which is a little different from Harpex-B, although based on the exact same technology), but I use the free package as well, which comes with some very useful tools. My experience with the licensing has been: license it once and never think about it again. I have had absolutely no issues with it. If I were to buy another of his toolsets it would be Rapture3D Advanced. As you said, they are expensive, but if you need what they do (decode to Auro-3D and such, for example) then they are absolutely worth it.
Yeah. Rapture 3D has my attention, particularly because it includes a plugin for decoding to custom speaker setups. I have seven Auratones and four smaller speakers that I'm planning to incorporate into a setup compatible with regular 7.1 monitoring, which would also give me Ambisonics with height decoding using the extra four speakers. I'm not totally sure how I'm going to do the Ambisonic part just yet though. I'm debating whether to maintain a 5.1 setup for regular surround and have six speakers dedicated to height information (two Auratones and the four others), or to go with a 7.1 regular surround setup. That would limit the height decoding for Ambisonics though.
I'm strongly leaning towards a 5.1 surround setup with the extra two Auratones as left front up and right front up, using the four others as left rear up, right rear up, left lower side and right lower side.

I also need to set up more D/A on my Fireface UFX for the additional speakers. I have 17 channels of amplification (13 channels of Yamaha amps and another smaller four-channel amp), so there's plenty of room for future expansion there.
Quote:
Originally Posted by plush2 View Post
I agree that it's possible to overdo the envelopment on Harpex but I love the resolution it adds.
I'm with you on the extra resolution but only in some situations. I've found the image tends to drift sometimes even when playing back the same audio multiple times. Things get positioned differently. Perhaps I need to experiment more but my demo of the Harpex B plugin has expired. I'd need to experiment with the free player but it is somewhat more limited and fiddly.
I have to say, the Harpex B demo has left me less than convinced at this point. It is good but the imaging issues bug me.
Quote:
Originally Posted by plush2 View Post
Do let us know how the whole Ossic thing turns out. If it's good I will likely follow you in that purchase.
I will most certainly be posting about them when they arrive. They're not due here until January 2017 though and that seems like AGES when something so potentially exciting is the subject!!
Old 06-27-2016, 12:56 PM   #22
Justin
Administrator
 
Join Date: Jan 2005
Location: NYC
Posts: 15,716
Default

OK, here's a 360 viewer. It ended up quite a bit longer than I anticipated, but maybe someone better at math can simplify/improve it:
Code:
//equirectangular 360 panner
//@param1:fov_ang 'fov' 90 20 170 45 1
//@param2:x_ang 'x' 0 -180 180 0.5 1
//@param3:y_ang 'y' 0 -90 90 0.5 1
//@param4:div 'div' 100 20 150 40 1
//@param5:filter 'filter' 0 0 1 0 1
project_w<1 ? project_w=1920;
project_h<1 ? project_h=1080;
gfx_img_resize(-1,project_w,project_h);
input_info(0,srcw,srch);

function matrix_make_rotate(matrix, m, d) global() local(m2)
(
  memset(matrix,0,16);
  matrix[m*5-5] = matrix[15] = 1.0;
  m2 = m==2 ? 0 : (m+1);
  matrix[m2*5]=matrix[m*5]=cos(d);
  matrix[m2*4+m]=-(matrix[m*4+m2]=sin(d));
);

function matrix_multiply(dest,src) global() local(s0,s1,s2,s3)
(
  loop(4,
    s0=dest[0]; s1=dest[1]; s2=dest[2]; s3=dest[3];
    dest[0] = s0*src[(0<<2)+0]+s1*src[(1<<2)+0]+s2*src[(2<<2)+0]+s3*src[(3<<2)+0];
    dest[1] = s0*src[(0<<2)+1]+s1*src[(1<<2)+1]+s2*src[(2<<2)+1]+s3*src[(3<<2)+1];
    dest[2] = s0*src[(0<<2)+2]+s1*src[(1<<2)+2]+s2*src[(2<<2)+2]+s3*src[(3<<2)+2];
    dest[3] = s0*src[(0<<2)+3]+s1*src[(1<<2)+3]+s2*src[(2<<2)+3]+s3*src[(3<<2)+3];
    dest+=4;
  );
);

function matrix_apply(x,y,z, m, vec*) global()
(
  vec.x = x*m[0] + y*m[1] + z*m[2] + m[3];
  vec.y = x*m[4] + y*m[5] + z*m[6] + m[7];
  vec.z = x*m[8] + y*m[9] + z*m[10] + m[11];
);

matrix1 = 0;
matrix2 = matrix1 + 16;
tab=matrix2 + 16;

xdiv=ydiv=div|0;

screen_z = 1/tan(fov_ang * 0.5 * $pi / 180);

y = -0.5 * (project_h/project_w);
dx = 1.0 / (xdiv-1);
dy = (project_h/project_w) / (ydiv-1);

matrix_make_rotate(matrix1,2,-x_ang * $pi / 180);
matrix_make_rotate(matrix2,1,y_ang * $pi / 180);
matrix_multiply(matrix1,matrix2);

ptr = tab;
loop(ydiv,
  x=-0.5;
  loop(xdiv,
    matrix_apply(x,y,screen_z,matrix1,vv);
    sy = 0.5 + asin(vv.y / sqrt(vv.x*vv.x+vv.y*vv.y+vv.z*vv.z)) / $pi;
    sx = 0.5 + (atan2(vv.x,vv.z)) / (2*$pi);

    sy < 0 ? (
      sy = -sy;
      sx += 0.5;
    ) : sy>= 1 ? (
      sy=2-sy;
      sx+=0.5;
    );
    ptr[0]=sx*srcw;
    ptr[1]=sy*srch;
    x+=dx;
    ptr+=2;
  );
  y+=dy;
);
gfx_mode=filter > 0 ? 0x100 : 0;
gfx_xformblit(0, 0,0, project_w,project_h,xdiv,ydiv, tab,0);
In the next 5.22 pre, I'm going to improve video processors in the monitoring FX chain -- particularly, it will apply them after the main (cached) video rendering pipeline, so you can look around and it won't need to reprocess any project video. Right now (in 5.21) this viewer is pretty useless unless you're automating it (e.g. if you have some 360deg 4k video and you tweak the x/y parameters, it takes forever to update). So try it in the next 5.22pre, I suppose...

Preview: [animated GIF]
Last edited by Justin; 06-27-2016 at 04:43 PM. Reason: updated processor some
Old 06-27-2016, 02:01 PM   #23
plush2
Human being with feelings
 
Join Date: May 2006
Location: Saskatoon, Canada
Posts: 2,110
Default

Quote:
Originally Posted by Justin View Post
OK, here's a 360 viewer.


In the next 5.22 pre, I'm going to improve video processors in the monitoring FX chain -- particularly, it will apply them after the main (cached) video rendering pipeline, so you can look around and it won't need to reprocess any project video. Right now (in 5.21) this viewer is pretty useless unless you're automating it (e.g. if you have some 360deg 4k video and you tweak the x/y parameters, it takes forever to update). So try it in the next 5.22pre, I suppose...
Just awesome! It's a rather complicated little world, doing these spherical mapping calculations, but it looks good on everything I've thrown at it so far. The speed issue being fixed so quickly is a big bonus as well.

Might I suggest that the 'y' rotation param be

Code:
//@param3:y_ang 'y' 0 -90 90 0.5 1
as there is rarely a need to be upside down and looking backwards when working on an edited video.
Old 06-27-2016, 03:49 PM   #24
Justin
Administrator
 
Join Date: Jan 2005
Location: NYC
Posts: 15,716
Default

5.22pre6 is up which should support this processor in the monitoring chain nicely...
Old 06-27-2016, 04:03 PM   #25
Justin
Administrator
 
Join Date: Jan 2005
Location: NYC
Posts: 15,716
Default

Quote:
Originally Posted by plush2 View Post
Might I suggest that the 'y' rotation param be

Code:
//@param3:y_ang 'y' 0 -90 90 0.5 1
Done, also did (again) some other small fixes/simplifications, and added a filtering parameter. (edit: edited again)

Last edited by Justin; 06-27-2016 at 04:45 PM.
Old 06-27-2016, 06:15 PM   #26
Justin
Administrator
 
Join Date: Jan 2005
Location: NYC
Posts: 15,716
Default

Another test version, removes some stitching artifacts but makes some others (ugh):
Code:
//Equirectangular 360 panner
//@param1:fov_ang 'fov' 90 20 170 45 1
//@param2:x_ang 'x' 0 -180 180 0.5 1
//@param3:y_ang 'y' 0 -90 90 0.5 1
//@param4:div 'div' 100 20 150 40 1
//@param5:filter 'filter' 0 0 1 0 1
project_w<1 ? project_w=1920;
project_h<1 ? project_h=1080;
gfx_img_resize(-1,project_w,project_h);
input_info(0,srcw,srch);

function matrix_make_rotate(matrix, m, d) global() local(m2)
(
  memset(matrix,0,16);
  matrix[m*5-5] = matrix[15] = 1.0;
  m2 = m==2 ? 0 : (m+1);
  matrix[m2*5]=matrix[m*5]=cos(d);
  matrix[m2*4+m]=-(matrix[m*4+m2]=sin(d));
);

function matrix_multiply(dest,src) global() local(s0,s1,s2,s3)
(
  loop(4,
    s0=dest[0]; s1=dest[1]; s2=dest[2]; s3=dest[3];
    dest[0] = s0*src[(0<<2)+0]+s1*src[(1<<2)+0]+s2*src[(2<<2)+0]+s3*src[(3<<2)+0];
    dest[1] = s0*src[(0<<2)+1]+s1*src[(1<<2)+1]+s2*src[(2<<2)+1]+s3*src[(3<<2)+1];
    dest[2] = s0*src[(0<<2)+2]+s1*src[(1<<2)+2]+s2*src[(2<<2)+2]+s3*src[(3<<2)+2];
    dest[3] = s0*src[(0<<2)+3]+s1*src[(1<<2)+3]+s2*src[(2<<2)+3]+s3*src[(3<<2)+3];
    dest+=4;
  );
);

function matrix_apply(x,y,z, m, vec*) global()
(
  vec.x = x*m[0] + y*m[1] + z*m[2] + m[3];
  vec.y = x*m[4] + y*m[5] + z*m[6] + m[7];
  vec.z = x*m[8] + y*m[9] + z*m[10] + m[11];
);

matrix1 = 0;
matrix2 = matrix1 + 16;
tab=matrix2 + 16;

xdiv=ydiv=div|0;

screen_z = 1/tan(fov_ang * 0.5 * $pi / 180);

y = -0.5 * (project_h/project_w);
dx = 1.0 / (xdiv-1);
dy = (project_h/project_w) / (ydiv-1);

matrix_make_rotate(matrix1,2,-x_ang * $pi / 180);
matrix_make_rotate(matrix2,1,y_ang * $pi / 180);
matrix_multiply(matrix1,matrix2);

xf=0;
ptr = tab;
loop(ydiv,
  x=-0.5;
  ya_offs=xa_offs=0;
  loop(xdiv,
    matrix_apply(x,y,screen_z,matrix1,vv);
    ya = asin(vv.y / sqrt(vv.x*vv.x+vv.y*vv.y+vv.z*vv.z)) / $pi + ya_offs + 0.5;
    
    xa = atan2(vv.x,vv.z)/(2*$pi) + xa_offs + 0.5;
    
    x!=-0.5 ? (
      xa>lxa+0.5 ? (
        xa-=1;
        xa_offs-=1;
      ) : xa < lxa-0.5 ? (
        xa+=1;
        xa_offs+=1;
      );
      ya>lya+ 0.5 ? (
        ya-=1;
        ya_offs-=1;
      ) : ya < lya-0.5 ? (
        ya+=1;
        ya_offs+=1;
      );        
    );
    lxa=xa;
    lya=ya;
    
    xa>1?xf|=1;
    xa<0?xf|=2;
    
    ptr[0]=xa*srcw;
    ptr[1]=ya*srch;
    x+=dx;
    ptr+=2;
  );
  y+=dy;
);
//gfx_set(1,0,1);
//gfx_fillrect(0,0,project_w,project_h);
gfx_mode=filter > 0 ? 0x100 : 0;
gfx_xformblit(0, 0,0, project_w,project_h,xdiv,ydiv, tab,0);


ptr=tab;
loop(ydiv*xdiv,
  ptr[0] -= srcw-0.5/*fudge*/;
  ptr+=2;
  );
(xf&1) ? gfx_xformblit(0, 0,0, project_w,project_h,xdiv,ydiv, tab,0);
ptr=tab;
loop(ydiv*xdiv,
  ptr[0] += srcw*2+1.0;
  ptr+=2;
  );
(xf&2) ? gfx_xformblit(0, 0,0, project_w,project_h,xdiv,ydiv, tab,0);
Old 06-27-2016, 08:24 PM   #27
TumbleAndYaw
Human being with feelings
 
Join Date: Mar 2016
Posts: 9
Default

Wonderful! Thanks so much for having a crack at this. Will start playing with it asap.

Can the yaw and pitch (horizontal and vertical rotation, i.e. the x and y controls in this case) be linked to parameters in a VST/JS audio plugin?

Ideally you'd rotate the 360° video and have the soundfield counter-rotate, which creates the impression that sound sources stay in place when you visually rotate.

Another very helpful addition would be some sort of video overlay / crosshair pointer that shows the pointer position in horizontal and vertical degrees. Then, when you have, say, a car engine in the 360 video located at 90° left and 20° down, you can move the pointer over the car in the video, read off those values, and enter them into the audio positioner plugin for that sound source.

Hope this is not pushing it too much...:-) Already very grateful for what we have here so far!
Old 06-28-2016, 07:22 AM   #28
RobinGShore
Human being with feelings
 
Join Date: May 2013
Location: New York
Posts: 780
Default

Quote:
Originally Posted by TumbleAndYaw View Post
Wonderful! Thanks so much for having a crack at this. Will start playing with it asap.

Can the yaw and pitch (horizontal and vertical rotation, i.e. the x and y controls in this case) be linked to parameters in a VST/JS audio plugin?

Ideally you'd rotate the 360° video and have the soundfield counter-rotate, which creates the impression that sound sources stay in place when you visually rotate.
I don't think parameters in the monitoring FX chain can be linked, so you can't do it exactly the way you're saying, but what you can do is have an OSC or MIDI controller set up to control both plugins at the same time. I've been playing around with it and it works really well. Here's me using Lemur on my phone to rotate both the video and the sound field with OSC (unfortunately LICEcap seems to have trouble capturing Reaper's video window, so you'll have to just watch the GUI controllers moving in the video processor window):

[animated GIF]
Old 06-28-2016, 07:27 AM   #29
plush2
Human being with feelings
 
Join Date: May 2006
Location: Saskatoon, Canada
Posts: 2,110
Default

Quote:
Originally Posted by TumbleAndYaw View Post
Wonderful! Thanks so much for having a crack at this. Will start playing with it asap.

Can the yaw and pitch (horizontal and vertical rotation, i.e. the x and y controls in this case) be linked to parameters in a VST/JS audio plugin?

Ideally you'd rotate the 360° video and have the soundfield counter-rotate, which creates the impression that sound sources stay in place when you visually rotate.
It used to be that this was not possible, but it seems that video processor parameters can now be modulated and linked to other effects in the FX chain of the same track. One little catch is that both of the panners I linked to (ATK and Blue Ripple's Look) output their link values in radians, so it takes some fiddling to get things working right (scaling by 36000 and offsetting the parameter link).
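(For reference: degrees = radians × 180/π, i.e. roughly 57.2958 degrees per radian, so a ±π link range corresponds to the panner's ±180° range; the exact scale and offset numbers depend on how the host normalizes each parameter for linking.)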

This is working and I hope to post a little gif tomorrow to demonstrate it.

*edit: I see that RobinGShore has posted a GIF demonstration of basically the same thing.
Old 06-29-2016, 03:50 PM   #30
Justin
Administrator
 
Join Date: Jan 2005
Location: NYC
Posts: 15,716
Default

Here's a video processor that is cleaned up and has fewer stitching artifacts:
Code:
//equirectangular 360 panner
//@param1:fov_ang 'fov' 90 20 170 90 1
//@param2:x_ang 'yaw' 0 -180 180 0 1
//@param3:y_ang 'pitch' 0 -90 90 0 1
//@param4:z_ang 'roll' 0 -180 180 0 1

project_w=1920;
project_h=1080;
xdiv=ydiv=100; // subdivision (quality) 10-200 is usable
filter=1; // bilinear filtering?
xscale=1; // -1 to flip yaw
yscale=1; // -1 to flip pitch
zscale=1; // -1 to flip roll

function matrix_make_rotate(matrix, m, d) global() local(m2) (
  memset(matrix,0,16);
  matrix[m*5-5] = matrix[15] = 1.0;
  m2 = ((m%=3)+1)%3;
  matrix[m2*5]=matrix[m*5]=cos(d);
  matrix[m2*4+m]=-(matrix[m*4+m2]=sin(d));
);

function matrix_make_xlate(matrix, x, y, z) global() (
  memset(matrix,0,16);
  matrix[0]=matrix[5]=matrix[10]=matrix[15]=1;
  matrix[3]=x; matrix[7]=y; matrix[11]=z;
);

function matrix_multiply(dest,src) global() local(s0,s1,s2,s3) (
  loop(4,
    s0=dest[0]; s1=dest[1]; s2=dest[2]; s3=dest[3];
    loop(4, dest[0] = s0*src[0]+s1*src[4]+s2*src[8]+s3*src[12]; dest+=1; src+=1; );
    src -= 4;
  );
);

function matrix_apply(x,y,z, m, vec*) global() (
  vec.x = x*m[0] + y*m[1] + z*m[2] + m[3];
  vec.y = x*m[4] + y*m[5] + z*m[6] + m[7];
  vec.z = x*m[8] + y*m[9] + z*m[10] + m[11];
);

function vector_project(vv*, s*) global(srcw,srch) (
  s.y = srch * (0.5 + asin(vv.y / sqrt(vv.x*vv.x+vv.y*vv.y+vv.z*vv.z)) * (1/$pi));
  s.x = srcw * (0.5 + atan2(vv.x,vv.z) * (1 / (2*$pi)));
);

function centerx(sm*, v*) global(srcw) (
  v.x < sm.x-srcw*.5 ? ( v.x += srcw; 2; ) : v.x > sm.x+srcw*.5 ? ( v.x -= srcw; 1; );
);

gfx_img_resize(-1,project_w,project_h);
input_info(0,srcw,srch);

gfx_mode=filter ? 0x100 : 0;

screen_z = project_w/tan(fov_ang * 0.5 * $pi / 180);
dxpos=project_w/(xdiv-1);
dypos=project_h/(ydiv-1);
ypos = 0;

matrix1 = 0; matrix2 = matrix1 + 16;
matrix_make_rotate(matrix1,2,xscale * x_ang * -$pi / 180);
matrix_make_rotate(matrix2,1,yscale * y_ang * $pi / 180); matrix_multiply(matrix1,matrix2); 
matrix_make_rotate(matrix2,3,z_ang * -$pi / 180);         matrix_multiply(matrix1,matrix2);
matrix_make_xlate(matrix2,project_w*-.5,project_h*-.5,0); matrix_multiply(matrix1,matrix2);

loop(ydiv,
  y1=(ypos)&0xffffe;
  y2=(ypos+=dypos)&0xffffe;
  idy = 1/(y2-y1);
  xpos = 0;

  loop(xdiv,
    x1=(xpos)&0xffffe;
    x2=(xpos+=dxpos)&0xffffe;
    idx = 1/(x2-x1);

    matrix_apply(x1,y1,screen_z,matrix1,vv); vector_project(vv, s1);
    matrix_apply(x2,y1,screen_z,matrix1,vv); vector_project(vv, s2);
    matrix_apply(x1,y2,screen_z,matrix1,vv); vector_project(vv, s3);
    matrix_apply(x2,y2,screen_z,matrix1,vv); vector_project(vv, s4);

    a=centerx(s1,s2)|centerx(s1,s3)|centerx(s1,s4);

    dsdx = (s2.x-s1.x) * idx;
    dtdx = (s2.y-s1.y) * idx;
    dsdy = (s3.x-s1.x) * idy;
    dtdy = (s3.y-s1.y) * idy;
    dsdx2 = (s4.x-s3.x) * idx;
    dtdx2 = (s4.y-s3.y) * idx;
    dsdxdy = (dsdx2-dsdx) * idy;
    dtdxdy =  (dtdx2-dtdx) * idy;

    gfx_deltablit(0, x1,y1, x2-x1,y2-y1, s1.x,s1.y, dsdx, dtdx, dsdy, dtdy, dsdxdy, dtdxdy);
    (a&1) ? gfx_deltablit(0, x1,y1, x2-x1,y2-y1, s1.x+srcw,s1.y, dsdx, dtdx, dsdy, dtdy, dsdxdy, dtdxdy);
    (a&2) ? gfx_deltablit(0, x1,y1, x2-x1,y2-y1, s1.x-srcw,s1.y, dsdx, dtdx, dsdy, dtdy, dsdxdy, dtdxdy);
  );
);
(edited with simplifications -- you can now configure the output dimensions in the code easily, and flip the x/y axes, etc)
(edited: added roll support)

Last edited by Justin; 06-29-2016 at 09:10 PM.
Old 08-19-2016, 12:57 AM   #31
ihabali
Human being with feelings
 
Join Date: Aug 2016
Posts: 2
Default

Been following this thread since I started playing with Ambisonics for 360 videos. I think what would be useful for sound designers would be a view that presents the 360 video as is (flat equirectangular), where markers can be added on the actual picture as it plays back. You can then animate the position of these markers (using automation curves) to track, say, a moving object within the video. You would then link the values coming from these marker position curves (converted to degrees) to an Ambisonic panner. This, in my opinion, is a much more efficient workflow than having to pan around the 360 video to see/hear where the sound is supposed to be coming from and track it around.

Is it possible, using the video processor, to draw UI elements (a crosshair, a square, a circle... whatever) that can then be controlled with the mouse and linked to a parameter in the code? I'm having trouble finding any documentation on the JSFX code specific to the video processor, as it seems to differ from the published language reference. I would be happy to write this plugin if I can get some guidance from the experts on this forum, as I just started with Reaper a couple of weeks ago.
Old 08-19-2016, 12:21 PM   #32
plush2
Human being with feelings
 
Join Date: May 2006
Location: Saskatoon, Canada
Posts: 2,110
Default

Quote:
Originally Posted by ihabali View Post
Been following this thread since I started playing with Ambisonics for 360 videos. I think what would be useful for sound designers would be a view that presents the 360 video as is (flat equirectangular), where markers can be added on the actual picture as it plays back. You can then animate the position of these markers (using automation curves) to track, say, a moving object within the video. You would then link the values coming from these marker position curves (converted to degrees) to an Ambisonic panner. This, in my opinion, is a much more efficient workflow than having to pan around the 360 video to see/hear where the sound is supposed to be coming from and track it around.
As an option this would be great. As long as the action in the video stays along the equator this would work pretty well. Anything at the top or the bottom of the viewing field would be pretty wildly distorted and possibly difficult to track or pinpoint.

What about a cubic projection as well, where the mixer could choose which panel they want to view?

It is theoretically possible to do mouse-move capture with the gfx code, as can be seen in several JSFX. I have not yet seen it done with the video FX though.

The gfx section in the JSFX code guide is probably your best starting point for features. I'm not sure if it's one-to-one with the video FX, but there are many similarities.
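To illustrate the drawing part, here is a minimal video processor sketch (untested; the az/el parameter names and the equirectangular mapping are assumptions on my part, and since mouse capture doesn't seem to be available the crosshair is placed via automatable parameters instead of the mouse):

Code:
//equirectangular marker overlay (sketch)
//@param1:az 'marker azimuth' 0 -180 180 0 1
//@param2:el 'marker elevation' 0 -90 90 0 1
input_info(0,srcw,srch);
gfx_blit(0,1); //pass the video through unchanged
//map azimuth/elevation onto the equirectangular frame
//(assumes the output keeps the source dimensions)
mx = (0.5 + az/360) * srcw;
my = (0.5 - el/180) * srch;
gfx_set(1,0,0,1); //opaque red
gfx_fillrect(mx-12, my-1, 24, 2);
gfx_fillrect(mx-1, my-12, 2, 24);

Linking az/el envelopes to an Ambisonic panner's azimuth/elevation would then give the marker-follows-sound workflow described above.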
Old 08-19-2016, 12:27 PM   #33
ihabali
Human being with feelings
 
Join Date: Aug 2016
Posts: 2
Default

Quote:
Originally Posted by plush2 View Post
As an option this would be great. As long as the action in the video stays along the equator this would work pretty well. Anything at the top or the bottom of the viewing field would be pretty wildly distorted and possibly difficult to track or pinpoint.

What about a cubic projection as well, where the mixer could choose which panel they want to view?

It is theoretically possible to do mouse-move capture with the gfx code, as can be seen in several JSFX. I have not yet seen it done with the video FX though.
Cubic would be even better: you can just do the map so you can track the motion across the different faces, without having to flip between views or provide an option to switch as you've described.

Do you know where I can find more documentation about the JSFX dialect used by the video processor? I tried a few of the gfx functions from the website and they don't work, and I don't want to reverse-engineer the existing presets to find what I'm looking for.
Old 09-15-2016, 02:06 PM   #34
charlifiiiii
Human being with feelings
 
Join Date: Jul 2014
Posts: 6
Default

FYI, Noise Makers updated its 3D panner Ambi Pan with a transparent window that does basically what you described below.
It was designed to pan multiple sources on top of an equirectangular 360 video.

www.noisemakers.fr/ambi-pan

Quote:
Originally Posted by ihabali View Post
Been following this thread since I started playing with Ambisonics for 360 videos. I think what would be useful for sound designers would be a view that presents the 360 video as is (flat equirectangular), where markers can be added on the actual picture as it plays back. You can then animate the position of these markers (using automation curves) to track, say, a moving object within the video. You would then link the values coming from these marker position curves (converted to degrees) to an Ambisonic panner. This, in my opinion, is a much more efficient workflow than having to pan around the 360 video to see/hear where the sound is supposed to be coming from and track it around.

Is it possible, using the video processor, to draw UI elements (a crosshair, a square, a circle... whatever) that can then be controlled with the mouse and linked to a parameter in the code? I'm having trouble finding any documentation on the JSFX code specific to the video processor, as it seems to differ from the published language reference. I would be happy to write this plugin if I can get some guidance from the experts on this forum, as I just started with Reaper a couple of weeks ago.
Old 09-15-2016, 02:22 PM   #35
charlifiiiii
Human being with feelings
 
Join Date: Jul 2014
Posts: 6
Default

It looks like this: [image not displayed]
Last edited by charlifiiiii; 09-15-2016 at 02:27 PM. Reason: image not displayed
Old 11-19-2016, 08:58 AM   #36
fsz
Human being with feelings
 
Join Date: Jul 2016
Posts: 45
Default

This is awesome! Is there any chance you could rotate the video by click-dragging the mouse?
Old 09-23-2018, 09:06 AM   #37
bommaren
Human being with feelings
 
Join Date: Nov 2008
Location: Stockholm Sweden
Posts: 23
Default

In this context, I have a small project on GitHub named OHTI (Open HeadTracking Initiative).

It's hardware and firmware for a Bluetooth headtracker that uses Omnitone to convert Ambisonics to head-tracked binaural.

The headtracker currently sends quaternion data for rotation control, but it can easily be converted to use OSC (with an OSC npm module).
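If someone wants to drive the rotation parameters discussed earlier in the thread from that data, the quaternion-to-Euler conversion is the key step. Here is a sketch in JSFX/EEL2, assuming the common ZYX (yaw-pitch-roll) convention with (w,x,y,z) component order; the axis conventions of an actual tracker may differ:

Code:
//quaternion (w,x,y,z) -> yaw/pitch/roll in radians, ZYX convention
function quat_to_ypr(qw,qx,qy,qz) instance(yaw,pitch,roll) (
  yaw   = atan2(2*(qw*qz + qx*qy), 1 - 2*(qy*qy + qz*qz));
  pitch = asin(max(-1, min(1, 2*(qw*qy - qz*qx)))); //clamped for safety
  roll  = atan2(2*(qw*qx + qy*qz), 1 - 2*(qx*qx + qy*qy));
);
//usage: rot.quat_to_ypr(w,x,y,z); then rot.yaw*180/$pi etc. feed the
//degree-based yaw/pitch/roll parameters of the 360 panner above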

The goal is to create the impression that you are listening to a stationary sound image, to improve externalization when listening on headphones.

Commercial products that do similar things include the Smyth Realiser, for example.

All of the code for the player and decoder is written in JavaScript.

Does anyone know how to integrate JavaScript code into Reaper?
Old 09-23-2018, 02:37 PM   #38
RDan
Human being with feelings
 
Join Date: Jul 2017
Posts: 24
Default

I guess it might be more efficient to write it in proper C++, in the form of a VST plug-in.

The normal patching would be to employ a rotator plugin (e.g. mcfx_rotator), control the angles via MIDI/OSC, and use a binaural decoder plugin afterwards (e.g. IEM's BinauralDecoder ;-) )

There are also binaural decoding plug-ins out there which already have a rotation feature implemented.

For your project, a C++ interface for your Bluetooth quaternion data would be great, maybe in the form of a library/class under a proper license, so third-party plug-in manufacturers could use it.


PS: I think this discussion should be continued in another thread, as it's off-topic.