Closed — Joshuaek closed this issue 5 years ago
It's easy to miss, but the documentation says:
"... it has methods for sending and receiving MIDI events (to be used only from within the process callback ..."
You'll need to create a callback function and register it as process callback with client.set_process_callback().
Within the process callback, the time parameter makes more sense: it is the number of samples after the beginning of the current block at which your MIDI event should be created. If you want it to be created at the beginning of the current block, just use 0.
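To make that offset concrete, here is a small sketch in plain Python (no JACK client needed, and the frame numbers are made up for illustration) of how an absolute frame time maps to the time value you would pass to write_midi_event():

```python
def event_offset(event_frame, block_start, blocksize):
    """Return the 'time' offset (in samples) for an event scheduled
    at absolute frame event_frame, given that the current block
    starts at block_start.  Returns None if the event does not fall
    inside this block."""
    offset = event_frame - block_start
    if 0 <= offset < blocksize:
        return offset
    return None

# Hypothetical numbers: 1024-sample blocks, event at frame 2300
print(event_offset(2300, 2048, 1024))  # 252: 252 samples into this block
print(event_offset(2300, 1024, 1024))  # None: belongs to a later block
```

In a real client, block_start would be client.last_frame_time and blocksize the frames argument of the process callback.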
Ah - thanks, I did miss that.
Do you have any examples for generating a midi sequence? The examples all seem to be reacting to incoming midi events - it's still not clear how I would use the callback to sequence a series of notes - e.g. if I wanted to play a C major scale.
Sorry, I don't have an example (yet). Do you have an idea what a very simplified example could look like?
Sorry for coming back after so long, but I just recently re-started my project and have run into the same issue. The example that I'd love to see is something as simple as playing a C major scale. The part I can't fathom is the timing for the notes. All the examples are reacting to incoming events, rather than simply generating new events.
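One way to think about the timing: compute an absolute frame number for each note-on, then let each process callback emit the events that fall inside its block. A rough sketch of the scheduling arithmetic in plain Python (the sample rate and note length here are arbitrary assumptions, not values from the thread):

```python
SAMPLERATE = 48000        # assumed sample rate
NOTE_LENGTH = 0.5         # assumed seconds per note
C_MAJOR = [60, 62, 64, 65, 67, 69, 71, 72]  # MIDI note numbers

def scale_schedule(start_frame=0):
    """Return (frame, note) pairs: the absolute frame at which each
    note-on of the C major scale should sound."""
    frames_per_note = int(SAMPLERATE * NOTE_LENGTH)
    return [(start_frame + i * frames_per_note, note)
            for i, note in enumerate(C_MAJOR)]

# A process callback would then write each event whose frame lies in
# the current block, at offset = frame - client.last_frame_time.
print(scale_schedule()[:2])  # [(0, 60), (24000, 62)]
```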
@Joshuaek I've finally created a new MIDI example: #61
I don't know if that helps, though ...
I'd love to hear your comments on that!
I ran into the same problem of sending real-time MIDI while working on a custom MIDI controller project. A solution that worked for me was to use a thread-safe queue to pass the messages from the main thread into the JACK client's process thread:

```python
import jack
import queue

# Open the connection to JACK
client = jack.Client('MIDI-Controller')
outport = client.midi_outports.register('output')
midi_msg_q = queue.Queue()

@client.set_process_callback
def process(frames):
    outport.clear_buffer()
    try:
        while True:
            # Drain all pending messages without blocking
            midi_msg = midi_msg_q.get(block=False)
            outport.write_midi_event(0, midi_msg)
    except queue.Empty:
        pass

client.activate()
client.connect(outport, 'system:playback_1')

def midi_note_on(channel, midi_note):
    '''Transmit a MIDI "note on" message'''
    midi_msg_q.put((0x90 | channel, midi_note, 127))

def midi_note_off(channel, midi_note):
    '''Transmit a MIDI "note off" message'''
    midi_msg_q.put((0x80 | channel, midi_note, 0))

def midi_control_change(channel, control, value):
    '''Transmit a MIDI control change message'''
    midi_msg_q.put((0xb0 | channel, control, value))
```
@pbshilli Thanks for the suggestion! Using queue.Queue is often a good solution for this kind of situation. Please note, however, that writing each MIDI event with offset 0 might introduce audible jitter, especially with very long JACK block sizes. In such a situation, I would probably try to obtain client.frame_time in each of the functions, add a fixed delay (to stay causal), and transmit this time to the process callback together with the MIDI data. The process callback can then use client.last_frame_time to calculate an appropriate offset value for write_midi_event().
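That timestamping idea could be sketched roughly as follows. To keep the example runnable without a JACK server, plain integer arguments stand in for client.frame_time and client.last_frame_time, the stand-in process callback returns the (offset, message) pairs it would pass to outport.write_midi_event(), and the DELAY value is an arbitrary assumption:

```python
import queue

DELAY = 1024  # fixed delay in frames to stay causal (arbitrary choice)

midi_msg_q = queue.Queue()

def send_note_on(frame_time, channel, note, velocity=127):
    """Sender side: stamp the message with frame_time + DELAY.
    In a real client, frame_time would be client.frame_time."""
    midi_msg_q.put((frame_time + DELAY,
                    bytes([0x90 | channel, note, velocity])))

def process_block(last_frame_time, frames):
    """Stand-in for the process callback; last_frame_time plays the
    role of client.last_frame_time.  Returns the (offset, message)
    pairs that would go to outport.write_midi_event()."""
    events = []
    while True:
        try:
            timestamp, msg = midi_msg_q.get(block=False)
        except queue.Empty:
            break
        offset = timestamp - last_frame_time
        if offset >= frames:
            # Belongs to a future block: put it back and stop for now
            midi_msg_q.put((timestamp, msg))
            break
        events.append((max(offset, 0), msg))  # negative = late, play now
    return events

send_note_on(0, 0, 60)            # stamped for frame 1024
print(process_block(0, 256))      # []: event is in a future block
print(process_block(1024, 256))   # [(0, b'\x90<\x7f')]: due now
```

Note that putting a deferred message back on a queue.Queue appends it at the end, so with several pending future events this simple sketch could reorder them; a real implementation might peek instead.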
I think this issue is resolved by the suggestions above and by the new example examples/midi_file_player.py.
I'm trying to write a program to send MIDI events to a soft synth. I can't tell from the docs what the 'time' parameter should represent.
A real simple example which I thought would work:
Can you let me know if I'm missing something obvious?