frank26080115 / alpha-fairy

Wi-Fi Remote for Sony Cameras
https://eleccelerator.com/alpha-fairy-wireless-camera-remote/
MIT License

[Feature Request] Setting Time Code Preset #33

Open somethingp opened 1 year ago

somethingp commented 1 year ago

Is your feature request related to a problem? Please describe. I'm trying to create a cost-effective timecode syncing solution for my A7IV and my GoPro.

Describe the solution you'd like Most of the Sony Alpha cameras offer adding timecode to video metadata. You can set the "Time Code Preset" setting and have the timecode run on "Free Run", which lets you use the built-in timecode functionality. GoPro offers timecode encoding as well, using the QR code functionality in the GoPro Labs firmware. The new Sony Camera Remote SDK does offer a way to set the timecode preset, and they offer the SDK for ARM. Would that ARM SDK work on ESP32? Once the timecode functionality works, it's relatively trivial to port the GoPro QR generation code, which is available on their GitHub. Essentially we'd generate a timecode on the alpha-fairy remote, which sets the "Time Code Preset" on the Sony camera, and show a QR code for the GoPro at the same time. The QR code will keep updating as the timecode runs on the alpha-fairy, and thus both cameras (or any number of Sony and GoPro cameras) can be synced together.

Describe alternatives you've considered The only real alternatives are the paid $500 timecode solutions, and the limitation with them is that they don't use the Sony camera's metadata functionality. Instead they store the timecode data in one of the audio tracks. I think this solution may not be perfect (it might be off by a couple of frames due to latency), but it would be a really cool solution if it's possible to implement.

I started digging into the SDK compiled files, and found this function which might be relevant:

/* PTPProxy3_0::SetTimeCodePreset(sony_lajiao::LjDeviceProperty*) */

undefined4 __thiscall PTPProxy3_0::SetTimeCodePreset(PTPProxy3_0 *this,LjDeviceProperty *param_1)

{
  char cVar1;
  undefined4 uVar2;
  uint uVar3;
  long in_FS_OFFSET;
  uint local_2c;
  uint local_28;
  uint local_24;
  long local_20;

  uVar2 = 0x8401;
  local_20 = *(long *)(in_FS_OFFSET + 0x28);
  if (param_1 != (LjDeviceProperty *)0x0) {
    cVar1 = PTPProxy::GetPropertiesSetValues
                      ((PTPProxy *)this,*(LjDevicePropertyCode *)param_1,&local_2c,0xc);
    if (cVar1 == '\0') {
      uVar2 = 0x8402;
    }
    else {
      uVar3 = (int)*(ulong *)(param_1 + 0x10) -
              (int)((*(ulong *)(param_1 + 0x10) & 0xffffffff) % (ulong)local_24);
      if ((local_2c <= uVar3) && (uVar3 <= local_28)) {
        uVar2 = (**(code **)(*(long *)this + 0x58))(this,0xd0d3,3);
      }
    }
  }
  if (local_20 == *(long *)(in_FS_OFFSET + 0x28)) {
    return uVar2;
  }
                    /* WARNING: Subroutine does not return */
  __stack_chk_fail();
}

And this is a screen grab from their SDK documentation (screenshot attached).
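Reading the decompile, the logic seems to be: GetPropertiesSetValues fills a 12-byte block (three uint32 values, apparently min/max/step, written into local_2c/local_28/local_24), the requested value is rounded down to a multiple of the step, and the set command for property 0xd0d3 only goes out if the rounded value lands inside [min, max]. A host-side sketch of that check, with guessed names:

```cpp
#include <cassert>
#include <cstdint>

// Guessed interpretation of the decompiled SetTimeCodePreset range check.
// min/max/step correspond to local_2c / local_28 / local_24, which
// GetPropertiesSetValues appears to fill as a 12-byte (3 x uint32) block.
struct PropRange {
    uint32_t min;
    uint32_t max;
    uint32_t step;
};

// Returns true if the camera would accept `value` for property 0xd0d3:
// the value is rounded down to a multiple of `step`, then bounds-checked.
bool timecodePresetAccepted(uint64_t value, const PropRange& r,
                            uint32_t* rounded = nullptr) {
    uint32_t v = static_cast<uint32_t>(value - (value % r.step));
    if (rounded) *rounded = v;
    return r.min <= v && v <= r.max;
}
```

If that reading is right, the preset value probably has to be quantized to whatever step the camera reports, or the camera rejects it (that would explain the 0x8402 error path).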

frank26080115 commented 1 year ago

Would that ARM SDK work on ESP32?

No, but if it exists in the SDK, and it can work over Wi-Fi, I can run the SDK demo and use Wireshark to figure out what packet it is sending. That's how I developed all of the existing code: by examining Wireshark sniffing logs.
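For reference while sniffing: PTP/IP wraps every operation in a small little-endian container (length, packet type 6 for an operation request, a data-phase flag, the opcode, a transaction ID, then 32-bit parameters). A sketch of building one — the opcode used below is the standard PTP SetDevicePropValue (0x1016) as a placeholder; whatever Sony actually sends for 0xd0d3 would have to come from the Wireshark capture:

```cpp
#include <cassert>
#include <cstdint>
#include <cstring>
#include <vector>

// Minimal PTP/IP Operation Request container, per the PTP/IP spec:
// Length(4) | PacketType(4)=6 | DataPhaseInfo(4) | OperationCode(2) |
// TransactionID(4) | Parameters(4 each), all little-endian.
// The memcpy trick assumes a little-endian host (true on ESP32/x86).
std::vector<uint8_t> buildPtpIpRequest(uint16_t opcode, uint32_t transactionId,
                                       const std::vector<uint32_t>& params,
                                       uint32_t dataPhase = 1) {
    std::vector<uint8_t> pkt(4 + 4 + 4 + 2 + 4 + 4 * params.size());
    auto put32 = [&](size_t off, uint32_t v) { std::memcpy(&pkt[off], &v, 4); };
    auto put16 = [&](size_t off, uint16_t v) { std::memcpy(&pkt[off], &v, 2); };
    put32(0, static_cast<uint32_t>(pkt.size())); // Length
    put32(4, 6);                                 // PacketType: Operation Request
    put32(8, dataPhase);                         // DataPhaseInfo
    put16(12, opcode);                           // OperationCode
    put32(14, transactionId);                    // TransactionID
    for (size_t i = 0; i < params.size(); ++i)
        put32(18 + 4 * i, params[i]);            // e.g. the property code
    return pkt;
}
```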

The QR code will keep updating as the timecode runs on the alpha-fairy, and thus both cameras (or any number of sony and gopro cameras) can be synced together.

I am not understanding the big picture because of this. First, it takes time to generate the QR code, and the screen refresh rate on the M5StickC is not super fast. Second, I expect any camera parsing a QR code to have some latency. So how exactly does synchronization work with so much latency in the system?

I am not that familiar with videography workflows, so I need some help here. Otherwise, sending a command and generating a QR code is "easy"; making it actually do something useful is hard for me.

somethingp commented 1 year ago

This video shows how Tentacle Sync does QR-based timecode sync for GoPro around the 6:00 mark: https://www.youtube.com/watch?v=c1oa3ZBJYsU

It's possible GoPro does the sync with some sort of assumption for how long processing generally takes, and compensates the timecode for it. A similar thing might be possible with the alpha-fairy: maybe it does a couple of cycles of setting and getting the Time Code Preset from the camera, correcting for latency each time until it's reasonably in sync.
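That correction loop could look something like this. setTc/getTc/nowMs here are hypothetical hooks, not real SDK calls; the idea is just to nudge an offset until the camera's readback matches the remote's own clock:

```cpp
#include <cassert>
#include <cstdint>
#include <functional>

// Hypothetical camera hooks -- placeholders, not real SDK calls.
// setTc(tcMs) writes a timecode preset, getTc() reads the camera's
// running timecode, nowMs() is the remote's own clock.
struct CameraHooks {
    std::function<void(int64_t)> setTc;
    std::function<int64_t()> getTc;
    std::function<int64_t()> nowMs;
};

// Set the preset ahead of the remote's clock by `offset`, read the
// residual error back, and adjust; repeat until within `toleranceMs`.
int64_t syncWithLatencyCompensation(CameraHooks& cam, int iterations,
                                    int64_t toleranceMs) {
    int64_t offset = 0;
    for (int i = 0; i < iterations; ++i) {
        cam.setTc(cam.nowMs() + offset);
        int64_t err = cam.getTc() - cam.nowMs(); // camera ahead(+)/behind(-)
        if (err >= -toleranceMs && err <= toleranceMs) break;
        offset -= err; // camera behind -> push the next preset further ahead
    }
    return offset;
}
```

With a roughly constant one-way latency this converges in a couple of rounds, because the first error observed is just the negative of that latency.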

Edit - I added some reverse-engineered SDK code above, and I think it's still over PTP, so I feel like it should work over wireless even though the SDK only officially supports USB/LAN connections.

frank26080115 commented 1 year ago

how does the ESP32 know that it's actually in sync with the camera, once the time code is sent? With modern oscillators it should be pretty close, but I am not sure if it's close enough.

also, how did you reverse engineer the SDK code? that looks useful. you've shown me the numbers 0x8401 and 0xd0d3, and I think those are the time code mode (or status) and the property code. I can sort of understand the code, but I can't place videography terms to the variable names.

if I can use this information to find the current time code inside the device properties, then I can actually display the time code from the camera instead of the assumed timecode tracked by ESP32

As for user interface, should it just always start from zero? or should the time be configurable? which one of those 4 fields needs to be configurable?

I scanned the one in the YouTube video and it says oMTCAL=1100oTxxxxxxxxxxxxxxxx where the x's are a date and time. The M5StickC-Plus does have an RTC but I am not using it right now, so I need a way to set the RTC. Do I need that to be in perfect sync with the camera? I don't think so... it'd be nice if it could read the date from the camera though.

somethingp commented 1 year ago

So the ESP32 doesn't actually know it's in sync, but there's likely a way to get the current timecode from the camera, because the camera displays it on the screen (instead of the record time). I'm guessing the GetLiveViewProperties function from the SDK would do it, but the documentation doesn't actually list what properties are returned. You can't just poll the preset again, because the preset is always going to be the starting point you defined.

For reversing the SDK code, I used Ghidra, and basically followed these steps on libCr_Core.so (the file extension might be different on Windows or Mac, but it's inside the external/crsdk folder in the Sony SDK download). Since this is reverse engineering from the binaries, the variable names are lost, and it will require some guessing to understand what's happening.

The ESP32 will have to have some way to reliably keep track of a "master" timecode that everything can be set to. Generally the devices on the market just use the current time for that. So it doesn't have to start at 0; it can start with anything, and the video editing software just puts clips in order from lowest to highest. I think using the RTC is perfect for this; that is essentially what the cameras do too. Once you sync the cameras, they maintain the timecode using their internal clocks even if the device is turned off. That way they stay in sync throughout your shoot even if you're turning cameras on/off, changing batteries, etc.

The only thing that has to be configurable is the frame rate you're recording in, so that the frame-number value can divide the second correctly. For example, if you're shooting in 30 fps, the frame number would progress every 1/30 of a second. If you're in 24, it progresses every 1/24th of a second and only goes up to 24. For higher frame rates (60, 120) you still use 1-30 as the range and progress every 1/30 of a second. And for the broadcast frame rates (29.97 and 23.976) you divide based on those numbers. Although I'd say the broadcast thing might be a limited use case nowadays, and unnecessarily complicated for a first run.

P.S. - Thank you for even taking a look into this. I know you recently added the IR timecode reset functionality, which is awesome, but the A7IV doesn't have a stupid IR sensor :( So this timecode functionality might help other users as well, even if they're not using Sony and GoPro cameras.

somethingp commented 1 year ago

Just wanted to add that I got the frame numbers wrong in my examples. They're 0-indexed, i.e. 0-29 for 30 fps. Also the docs seem to indicate that for 24 fps you report in multiples of 4 between 0-23, so I guess that means 0, 4, 8, 12, etc., while for 30, 50, 60 you report every number, i.e. 0, 1, 2, 3, etc.
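Putting those corrected numbers together, the frame field could be computed from the millisecond within the current second. This sketch encodes the rules literally as described above (0-indexed; 24 fps rounded down to a multiple of 4 within 0-23; 60/120 fps folded onto the 30 fps range), and my reading of the 24 fps rule is a guess that should be double-checked against the SDK docs:

```cpp
#include <cassert>
#include <cstdint>

// Frame field for the timecode, per the rules discussed in this thread
// (to be verified against the SDK docs): frames are 0-indexed; 24 fps
// reports the frame number rounded down to a multiple of 4 (0-23 range);
// 60 and 120 fps reuse the 0-29 range of 30 fps.
uint32_t frameField(uint32_t fps, uint32_t msWithinSecond) {
    switch (fps) {
        case 24:
            // frame 0..23, rounded down to a multiple of 4 -> 0,4,8,...,20
            return (msWithinSecond * 24 / 1000) / 4 * 4;
        case 60:
        case 120:
            fps = 30;  // fold high frame rates onto the 30 fps range
            [[fallthrough]];
        default:
            return msWithinSecond * fps / 1000;  // 0 .. fps-1
    }
}
```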

Edit - Did some more reading on the RTC clock, and the idea is to use it to pick a starting point on the ESP32, but I think I may need to use the CPU clock to progress the timecode, as it is much more accurate than the RTC. Also, since the alpha-fairy is just the generator, it is reasonable that you can't power off the device while you're syncing your devices. So from a usability standpoint, I don't think it would make that much of a difference.
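The split described here (RTC read once for the starting point, a steadier monotonic clock for progression) can be kept as a pure function, so the same math works whether the elapsed time comes from esp_timer_get_time() on the ESP32 or from anything else; only the seed comes from the RTC:

```cpp
#include <cassert>
#include <cstdint>

// Timecode = RTC-derived start point + monotonically measured elapsed time.
// On the ESP32, elapsedUs would come from esp_timer_get_time(); the RTC is
// read only once, at sync time, to seed startMs.
struct Timecode { uint32_t hh, mm, ss, ff; };

Timecode timecodeNow(uint64_t startMs, uint64_t elapsedUs, uint32_t fps) {
    uint64_t ms = startMs + elapsedUs / 1000;
    Timecode tc;
    tc.ff = static_cast<uint32_t>(ms % 1000) * fps / 1000; // frame in second
    uint64_t s = ms / 1000;
    tc.ss = s % 60;
    tc.mm = (s / 60) % 60;
    tc.hh = (s / 3600) % 24;
    return tc;
}
```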

frank26080115 commented 1 year ago

let me get this clarified, the list of things to work on would be:

I have a robotics competition I am preparing for though so I'm not going to accomplish any of this really fast

somethingp commented 1 year ago

I think this is more or less correct. The clock doesn't necessarily have to stay running once the alpha-fairy is turned off. It just needs to keep track of the timecode until all the cameras you're using have been synchronized.

Honestly take your time. The fact that you're even willing to work on something amazing like alpha-fairy is incredibly generous!

PS - good luck in the competition!