bcrowe306 / MPC-Studio-Mk2-Midi-Sysex-Charts

The repository is a breakdown of the MIDI note, CC, and SysEx data for the MPC Studio Mk2

LED Display #1

Open gstepniewski opened 1 year ago

gstepniewski commented 1 year ago

I was able to figure out parts of how the LCD display is used.

Long story short, it's just as dumb as the rest of the device - the MPC software actually renders the entire content of the display and then sends PNG files encoded as SysEx messages; each frame seems to be cut up into 6 parts (2 rows, 3 columns).

Here's what the SysEx message looks like:

F0 (1 byte) SysEx Message Start
47 7F 4A (3 bytes) Manufacturer / Device / Model ID
04 (1 byte) Message ID
?? ?? (2 bytes, different each time) No idea - assuming payload size because it seems roughly correlated with it, but I can't find the exact formula
00 20 / 20 20 (2 bytes) No idea, see below.
XX 00 YY 00 (4 bytes) X / Y coordinates of the chunk, see below
?? ?? (2 bytes different each time) No idea, possible checksum? See below.
PAYLOAD Encoded PNG file, see below
F7 (1 byte) SysEx Message End
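
To make that layout concrete, here's a rough parser for these fields (just a sketch of the structure above; the field names and the size/checksum interpretations are my own guesses):

```js
// Rough parser for the layout above; field names are mine.
function parseDisplaySysex(bytes) {
  if (bytes[0] !== 0xf0 || bytes[bytes.length - 1] !== 0xf7) {
    throw new Error('not a complete SysEx message');
  }
  return {
    idBytes: bytes.slice(1, 4),                  // 47 7F 4A (manufacturer / device / model)
    messageId: bytes[4],                         // 04
    sizeBytes: bytes.slice(5, 7),                // roughly tracks payload size, formula unknown
    unknownBytes: bytes.slice(7, 9),             // 00 20 or 20 20
    x: bytes[9],                                 // chunk X coordinate (next byte is always 00)
    y: bytes[11],                                // chunk Y coordinate (next byte is always 00)
    checksumBytes: bytes.slice(13, 15),          // possibly a checksum, formula unknown
    payload: bytes.slice(15, bytes.length - 1),  // encoded PNG data
  };
}
```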

The SysEx header is quite obvious and works the same as everywhere else. Then after the message type we have some fields that are a little more complicated:

1. Payload size?

The first two bytes following the message type usually mean message size. There's definitely a correlation between this value and the length of the message, but I haven't been able to find a formula that would give me the number of bytes in the entire message after this point, the number of bytes in the encoded PNG, or the number of bytes in the decoded PNG. I might need to capture a few more messages so I have a bigger data set to look at.

2. 00 20 / 20 20

Of these two bytes, the first one is always 00 or 20 and the second is always 20 (both in hex). Not really sure what the first one is doing - I couldn't find a correlation with anything else about the image. The second one is possibly a separator?

3. Image coordinates

These 4 bytes encode the image coordinates. As mentioned before, a frame on the display is made up of 6 separate images. They have different sizes and are laid out like this:

60x60 60x60 40x60
60x20 60x20 40x20

This works out to a total display size of 160 x 80 (the bottom row starts at y=60 and is 20 pixels tall), which looks accurate. The coordinates are stored in bytes 1 and 3, with bytes 2 and 4 always being 00. The coordinates for each image in the frame are:

00 00 (0,0) 3C 00 (60, 0) 78 00 (120, 0)
00 3C (0,60) 3C 3C (60,60) 78 3C (120,60)
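
In code, building the coordinate field for a chunk is then just (a small sketch of the encoding above):

```js
// Encode a chunk's top-left pixel position as the 4 coordinate bytes: X, 00, Y, 00.
const coordBytes = (x, y) => [x, 0x00, y, 0x00];

coordBytes(60, 0);   // -> [0x3C, 0x00, 0x00, 0x00]
coordBytes(120, 60); // -> [0x78, 0x00, 0x3C, 0x00]
```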

4. Last 2 bytes of header

Not really sure. No correlation with any other message parameters. Possibly a checksum of the header - that's often where checksums are placed - but no idea how to calculate it.

5. Actual image

The rest of the message, until the terminating F7 byte, is a PNG image. However, there is a twist to it. SysEx data bytes are not allowed to be larger than 127 - the format is built on 7-bit MIDI data bytes, which only have 128 possible values. In order to send a file (which can use all 256 byte values) through SysEx, we need to bring the exceeding values down and also have a way to reconstruct them later. The way they do it here (possibly a standard approach in this scenario, I couldn't tell) is to take groups of 7 bytes and prepend an extra byte which marks the places where truncation happened and needs to be reversed. As an example, this is the first group of each message (because it's a PNG file header):

01 09 50 4E 47 0D 0A 1A

If you compare this to a correct PNG file header, which begins with 89 50 4E 47 0D ..., you'll notice that the first value is off. In fact, it's off by exactly 128 (0x89 - 0x09 = 137 - 9 = 128). Now look at the first byte of the group in its binary form:

00000001

This is telling us that the first byte in the group (because we're counting from the least significant bit, i.e. backwards) needs to have 128 added to it. After that, the control byte is dropped, the remaining 7 bytes (now fixed) are stored, and we move on to the next group. Using this logic I was able to reconstruct 12 different images for a total of 2 frames (all linked below).
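
Here's a small decoder for that scheme as I understand it (a sketch; it assumes the payload is made of 8-byte groups, with a possibly shorter final group):

```js
// Unpack the 7-bit-safe payload back into raw bytes: each group is one control
// byte followed by up to 7 data bytes, and bit N of the control byte (counting
// from the least significant bit) means "add 128 back to data byte N".
function unpackPayload(packed) {
  const out = [];
  for (let i = 0; i < packed.length; i += 8) {
    const control = packed[i];
    const group = packed.slice(i + 1, i + 8);
    group.forEach((byte, n) => {
      out.push(control & (1 << n) ? byte + 0x80 : byte);
    });
  }
  return out;
}

// The first group of every captured message decodes to the start of the PNG signature:
unpackPayload([0x01, 0x09, 0x50, 0x4e, 0x47, 0x0d, 0x0a, 0x1a]);
// -> [0x89, 0x50, 0x4e, 0x47, 0x0d, 0x0a, 0x1a]
```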

Test Data

| Message ID | SysEx Message Header | Message Metadata (until the payload) |
| --- | --- | --- |
| 1 | F0 47 7F 4A 04 | 063D 20 20 00000000 4E02 |
| 2 | F0 47 7F 4A 04 | 0337 00 20 00003C00 7901 |
| 3 | F0 47 7F 4A 04 | 1267 00 20 3C000000 3308 |
| 4 | F0 47 7F 4A 04 | 0515 00 20 3C003C00 3B02 |
| 5 | F0 47 7F 4A 04 | 0278 00 20 78000000 4201 |
| 6 | F0 47 7F 4A 04 | 0173 20 20 78003C00 4D00 |
| 7 | F0 47 7F 4A 04 | 1225 20 20 00000000 7907 |
| 8 | F0 47 7F 4A 04 | 025B 00 20 00003C00 2801 |
| 9 | F0 47 7F 4A 04 | 1A22 00 20 3C000000 760B |
| 10 | F0 47 7F 4A 04 | 0624 20 20 3C003C00 3802 |
| 11 | F0 47 7F 4A 04 | 0865 20 20 78000000 5103 |
| 12 | F0 47 7F 4A 04 | 0128 20 20 78003C00 0C00 |

Before going any further with this, I will probably check whether we can send images to the MPC Studio using the public port as well - otherwise this entire exercise is mostly pointless. It will take a while since I first need to learn how to send SysEx messages, but I will post back when done. In the meantime, any insight into this would be very much appreciated - I'm clearly a little out of my depth here.

gstepniewski commented 1 year ago

Turns out you can't embed images inside tables... Here they are again, and here's the ordering to match the reconstructed images with the message IDs in the table above:

Image 1:
1 3 5
2 4 6

Image 2:
7 9 11
8 10 12
gstepniewski commented 1 year ago

I've also added the decoded PNGs and the full, untouched SysEx messages in my fork of this repo: https://github.com/gstepniewski/MPC-Studio-Mk2-Midi-Sysex-Charts

BlackBinary commented 1 year ago

Hey! Great stuff going on here. Another frustrated user trying to gain control over my device. I don't know if you have figured out how to send the sysex messages, but I got it working in Node.js, as a test to see if I can port that functionality to Python for FL Studio later.

You can check it out here: https://github.com/BlackBinary/MPC-Studio-MKII

Not sure how you are with Node.js, but `npm install` and `node index.js` should do the trick. You might need to change the port number in index.js - mine is 0 for the public port.

Greetings!

BlackBinary commented 1 year ago

I'm able to send other PNG parts to all coordinates using the metadata and header from original image 1. The only thing I have to update is the coordinates - no other changes.

The code I use to generate the sysex message metadata is quite simple:

```js
function generateMessageMetadata(x, y) {
  // X / Y chunk coordinates, each followed by 0x00
  const imageCoordinates = [x, 0x00, y, 0x00];

  return [
    ...[0x06, 0x3d], // size(?) bytes copied from original image 1
    ...[0x20, 0x20], // unknown bytes copied from original image 1
    ...imageCoordinates,
    ...[0x4e, 0x02], // checksum(?) bytes copied from original image 1
  ];
}
```
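
For reference, the rest of a message can be assembled around that metadata roughly like this (a sketch based on the header layout from the first post; packPayload is just the reverse of the unpacking scheme described there, and whether a short final group needs padding is untested):

```js
// Pack raw PNG bytes into 7-bit-safe groups: for every 7 input bytes, emit a
// control byte whose bit N flags data bytes >= 0x80, then those bytes with the
// high bit cleared.
function packPayload(raw) {
  const out = [];
  for (let i = 0; i < raw.length; i += 7) {
    const group = raw.slice(i, i + 7);
    let control = 0;
    group.forEach((byte, n) => {
      if (byte >= 0x80) control |= 1 << n;
    });
    out.push(control, ...group.map((byte) => byte & 0x7f));
  }
  return out;
}

// Full SysEx message for one display chunk. Note that generateMessageMetadata
// hard-codes the size/checksum bytes from original image 1, so this presumably
// only works for payloads of a similar size.
function buildDisplayMessage(x, y, pngBytes) {
  return [
    0xf0,             // SysEx start
    0x47, 0x7f, 0x4a, // manufacturer / device / model ID
    0x04,             // message ID
    ...generateMessageMetadata(x, y),
    ...packPayload(pngBytes),
    0xf7,             // SysEx end
  ];
}
```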

Other findings: the images seem to be limited to PNG8 with a color depth between 8 and 10. I'm no PNG expert in any way, but that's what sharp.js is able to generate as a recognised image.

BlackBinary commented 1 year ago

POC:

(animated GIF attachment: ezgif-2-580a5f213d)

gstepniewski commented 1 year ago

This is going to be long...

So first of all - amazing work getting that POC done! The JS code for encoding the images and sending SysEx was extremely useful in pushing my research further. I actually spent the weekend tinkering around, trying to do the same thing in Python (within the context of an Ableton Live Remote Script), and found a number of interesting things.

I'm going to skip the full story of what I did, since a lot of mistakes, wrong assumptions and backtracking happened along the way and it would be very confusing to read. Instead, here's a summary of what I discovered - some of it may be less relevant to you, but I'm including it in case someone else joins the effort.

  1. Libraries in Live Remote Scripts. It is possible to use external libraries in Live Remote Scripts. You can't use pip to install them, since Live has its own virtual environment and won't see them, but you can either drop the library into your project or into the /Applications/Ableton Live 11.app/Contents/App-Resources/Python/lib folder. However, it seems like libraries that use native C bindings don't actually work (I tried Pillow and a few others that would let me generate a PNG in memory), and they all fail when importing a compiled file. This means that unless we have a way of generating PNG files in Python without calling C, we can't do a dynamic display - which would have been super useful for parameter navigation.

  2. ControlElement _send_midi function. I tried using this function to send the images over to the device (by generating a full payload externally, encoding it as Base64 and storing it as a variable in code) and it sort of works. It seems to be extremely slow (it takes about a second to fully render a single frame and you can see the chunks appear one by one) and sometimes a message gets corrupted in transit (usually the first one in a group, no idea why). If we want to keep trying to use the LCD screen from Ableton, this is another hurdle we'll need to get over.

  3. The two unknown chunks of metadata. These are definitely important and are almost certainly related to image size. When I tried to send larger images using your code (i.e. images with more information on them, which can't be compressed to less than 1KB), the chunks that were too big would not display. Increasing the value of the first metadata chunk would sometimes get some of the images across, but you can't just blindly put high values in there, because apparently the second metadata chunk has to somehow match the first one. I went back to analyzing both of them and found one more small detail - when you write the second metadata chunk backwards (e.g. 4E02 -> 20E4), the values also seem to correlate with the image size. But just like with the first chunk, not in a way that would suggest how to calculate them for new images.

  4. The device breaks and needs to be restarted when bad messages are sent. When testing, I would try to send different images and different metadata. I noticed that if I have a message A that displays correctly and then send a message B that doesn't, A will not work anymore until I unplug the device and restart it. Be aware of that when testing, as you may be sending a proper message to a device in a broken state and completely mess up your tests.

  5. MPC software tries to hide its communication with the device, but there is a hack. I decided to try and capture some more messages from the software to the device to have more data to analyze, but when I plugged it in and started MIDI Monitor, nothing would show up. I eventually figured out that the MPC driver you install to make the device work is somehow able to hide the Private port from the system (and also mute all communication on other ports while the MPC software is open). To work around this, you need to:

    1. Uninstall the driver (by removing files from /Library/Extensions/AkaiProfessionalMPCStudioMk2.kext and /Library/Audio/MIDI Drivers/Akai Professional MPC Studio mk2.plugin)
    2. Reboot your computer
    3. Install the driver again. DO NOT REBOOT! The dialog at the end will try to force you to reboot, just keep it open and do not reboot.
    4. Start the MPC software and MIDI Monitor; you can now see traffic on the private port.
    5. If you reboot at any point after this, you need to run the procedure again if you want to spy on the sysex messages.

I have captured some more sysex messages with different screens shown on the device, but ran out of time before I could look into them. I will post back here when I've had a chance to analyze them a little more.

gstepniewski commented 1 year ago

Added these new sysex messages I captured and decoded pngs to my repo: https://github.com/gstepniewski/MPC-Studio-Mk2-Midi-Sysex-Charts

BlackBinary commented 1 year ago

Very cool! Will read it thoroughly later.

What I forgot to mention is that I opened the device up and dumped the UART data I got from the two onboard modes. The chip numbers are exposed at least, so maybe a custom firmware could even be a thing if I'm able to dump an actual ROM.

There are lots of possibilities down that route, it seems - even the touch strip is connected to the main onboard processor by UART.

gstepniewski commented 1 year ago

Here's a spreadsheet with some of the Sysex data analyzed: https://docs.google.com/spreadsheets/d/1iZmnS6n7Qm-5FdF6Mg7Plnk7Mgt3fkqM76K-ckFGPhs/

There is a definite correlation, but I can't figure out the exact formulas. I've also noticed that images of the same size use the exact same metadata - which means each one is encoded independently of the others and only the coordinate bytes determine its position (I've also verified that it doesn't matter what order you send the images in, and you can even do partial updates).

A second, fresh pair of eyes on this would be very helpful.

BlackBinary commented 1 year ago

I can't open the sheet, is it public?

gstepniewski commented 1 year ago

It is now, sorry about that!

Unkleskunky commented 8 months ago

Y'all are wizards!

I stumbled onto this looking for a controller editor for the Studio mk2 - imagine the world that has opened up to me.

Is it possible an editor could be created from the work you're doing here?

A question or two: are you familiar with BlueCat Audio's "plugnscript"? Also of possible interest is a program called "driven by moss", a controller editor for Bitwig, Reaper and perhaps others. It might be that your data would assist their efforts as well.

Would it be possible to create an editor/plugin that would allow peeps to MIDI-learn a controller as a VST?

Wouldn't that be something? To have plugin-specific controller functions? I'm imagining being able to switch from VST-level to global or DAW/host-level control, perhaps through scenes?

If I haven't fully exposed my epic-level noobness, please understand that I am less than a novice at all these disciplines, but I really dig what's happening here.

I would love to contribute in some way to this project and perhaps a more general project that would allow peeps to edit and learn all aspects of a software on this controller and perhaps others.

I'm not financially well off but I would certainly donate from time to time.

Keep up the epic work here please!

BlackBinary commented 6 months ago

@Unkleskunky Thanks! I haven't seen that plugin before. I was actually looking into the MIDI maps to map them to VSTs - is that what you're proposing here? That should be doable.

Unkleskunky commented 6 months ago

> @Unkleskunky Thanks! I haven't seen that plugin before. I was actually looking into the MIDI maps to map them to VSTs - is that what you're proposing here? That should be doable.

I was hoping for a controller editor like the ones for other Akai products, but it could perhaps exist as a VST. I then started thinking: Beats and MPC can both be installed as a VST, which allows for MIDI mapping, so perhaps all the coding is in those programs. MPC as a VST doesn't work well with the BlueCat software (yet); if I'm recalling correctly it's mostly an issue with clock control, but as I said, I'm pre-noob status.

But generally, yes: a controller editor that could exist as a VST and perhaps even allow us to affect what is displayed on the tiny screen and, of course, map those many buttons to whatever DAW controls we might desire. In the "driven by moss" app, which works with Reaper and Bitwig, the Studio mk2 isn't listed as compatible, and I haven't ever tried the generic "flexi", which is for controllers other than those listed. I assume this wouldn't work for the same reasons we can't use it elsewhere (it's too "dumb"), but if you could code for that handshake or whatever and then build the editor, we would be in good shape. How cool would it be if you could just install a VST or AUv3 and have functionality on any OS?

Looking forward to hearing what's possible.