MIDI 2.0 Support #625
It looks like the more important new things are:
Thanks for this summary. We will have an SOVERSION bump next release. As a first step, I think it's worth going through the API to verify that our data types are wide enough. I'll do so.
I was watching this issue for a while until I could finally start working on MIDI 2.0 support, for fun, in my Android FluidsynthMidiDeviceService project: atsushieno/fluidsynth-midi-service-j@74459a4 . I'm now at the stage where I can invoke the relevant synth function, but it crashes. What would be a good solution from here? I can think of two (there are probably more solutions).
Good news, thanks for the update.
My suggestion: when creating the synth, the desired MIDI protocol (MIDI 1 or MIDI 2) would be specified once. Internally, we can do whatever it takes to support MIDI 1 and MIDI 2 via the same API that we already have. Example: we could switch to 16-bit velocities in fluidsynth internally. If the synth has been set up with legacy MIDI 1.0, the velocities would be converted appropriately when passed via the existing API functions. P.S.: I'm still not really familiar with MIDI 2 as I'm lacking a use-case myself, so I welcome anybody to contribute to get this implemented.
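A minimal sketch of what that suggestion could look like from the user's side. The setting name `"synth.midi-protocol"` is hypothetical and does not exist in fluidsynth today; only the idea of fixing the protocol at creation time is taken from the comment above.

```c
#include <fluidsynth.h>

int main(void)
{
    fluid_settings_t *settings = new_fluid_settings();

    /* Hypothetical setting, for illustration only -- no such setting exists
     * in fluidsynth today. The idea is that the protocol is chosen once, at
     * synth creation time, and the rest of the public API stays unchanged. */
    fluid_settings_setstr(settings, "synth.midi-protocol", "midi2");

    fluid_synth_t *synth = new_fluid_synth(settings);

    /* With a MIDI 2-configured synth, the existing call could (hypothetically)
     * accept a 16-bit velocity; with a MIDI 1-configured synth it would keep
     * the 0..127 range and convert internally. */
    fluid_synth_noteon(synth, 0, 60, 52000);

    delete_fluid_synth(synth);
    delete_fluid_settings(settings);
    return 0;
}
```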
I thought more about the actual implementation. In the MIDI 2.0 UMP specification there are various "message types", among them a MIDI 1.0 channel voice message type (0x2g) and a MIDI 2.0 channel voice message type (0x4g). That is, when we are dealing with MIDI 2.0 UMP streams, mixing MIDI 1 and MIDI 2 messages necessarily happens anyway, regardless of whether the application or fluidsynth is responsible for converting the MIDI 1 messages. Since this will happen as soon as we implement MIDI 2-based input, I'd try to hack up some proof-of-concept implementation while I'm interested in it.
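For illustration, a small standalone sketch (not fluidsynth code) of how a UMP stream interleaves the two channel-voice message types. The field layout follows the published UMP spec: message type in the top nibble of the first 32-bit word, group in the next nibble, and, for a MIDI 2.0 note-on, the 16-bit velocity in the upper half of the second word.

```c
#include <stdint.h>
#include <stdio.h>

/* Message type 0x2 = MIDI 1.0 Channel Voice (one 32-bit word),
 * message type 0x4 = MIDI 2.0 Channel Voice (two 32-bit words). */
static void dispatch_ump(const uint32_t *words, size_t nwords)
{
    size_t i = 0;
    while (i < nwords)
    {
        uint32_t w0 = words[i];
        uint8_t mt = (uint8_t)(w0 >> 28);              /* message type */
        uint8_t group = (uint8_t)((w0 >> 24) & 0x0F);
        uint8_t status = (uint8_t)((w0 >> 16) & 0xFF); /* e.g. 0x90 | channel */

        if (mt == 0x2) /* MIDI 1.0 channel voice: 7-bit data bytes */
        {
            uint8_t data1 = (uint8_t)((w0 >> 8) & 0x7F);
            uint8_t data2 = (uint8_t)(w0 & 0x7F);
            printf("group %u MIDI1 status %02X %u %u\n", group, status, data1, data2);
            i += 1;
        }
        else if (mt == 0x4 && i + 1 < nwords) /* MIDI 2.0 channel voice: 64 bits */
        {
            uint32_t w1 = words[i + 1];
            uint8_t note = (uint8_t)((w0 >> 8) & 0x7F);
            uint16_t velocity = (uint16_t)(w1 >> 16);  /* 16-bit velocity */
            printf("group %u MIDI2 status %02X note %u vel %u\n", group, status, note, velocity);
            i += 2;
        }
        else
        {
            i += 1; /* other message types skipped in this sketch */
        }
    }
}

int main(void)
{
    /* A MIDI 1.0 note-on (ch 0, note 60, vel 100) followed by a
     * MIDI 2.0 note-on (ch 0, note 60, 16-bit vel 0xC800). */
    const uint32_t stream[] = { 0x20903C64u, 0x40903C00u, 0xC8000000u };
    dispatch_ump(stream, 3);
    return 0;
}
```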
Sure, go ahead. But please keep in mind that those "MIDI messages" you're referring to are based on the MIDI protocol, i.e. the lowest-level representation of MIDI to be sent around between hardware devices. As such it probably contains many quirks and workarounds to remain backward compatible with MIDI 1. The synth, on the other hand, is a high-level API, and I would really appreciate keeping this API as simple as possible (esp. without having to duplicate every API function for MIDI 2).
What does the sequencer have to do with MIDI 2? It's just a class that receives and sends events around; it doesn't care about their contents. If there is a need to introduce new event types, we can talk about that. But again: just duplicating the existing event types for MIDI 2 does not seem like a viable way to go.
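For reference, a minimal sketch of how events flow through the sequencer today, using the existing C API (assuming fluidsynth 2.x). It only illustrates that the sequencer routes `fluid_event_t` objects to their destination without interpreting MIDI protocol details; no audio driver is created, so nothing is audible.

```c
#include <fluidsynth.h>

int main(void)
{
    fluid_settings_t *settings = new_fluid_settings();
    fluid_synth_t *synth = new_fluid_synth(settings);
    fluid_sequencer_t *seq = new_fluid_sequencer2(0);
    fluid_seq_id_t synth_dest = fluid_sequencer_register_fluidsynth(seq, synth);

    /* The event carries the note parameters; the sequencer just delivers it. */
    fluid_event_t *evt = new_fluid_event();
    fluid_event_set_dest(evt, synth_dest);
    fluid_event_noteon(evt, 0, 60, 100);     /* 7-bit velocity today */
    fluid_sequencer_send_at(seq, evt, 0, 0); /* relative time 0 = now */

    delete_fluid_event(evt);
    delete_fluid_sequencer(seq);
    delete_fluid_synth(synth);
    delete_fluid_settings(settings);
    return 0;
}
```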
Considering that, I would rather discuss with actual code to understand what a MIDI 2 sequencer that calls synth functions would look like (excuse my Kotlin code here!):
(This implementation is clearly wrong at the moment.) Another point I noticed afterwards: since MIDI-CI "Set New Protocol" messages are sent as Universal SysEx messages, the specification assumes that the protocol can be changed dynamically at run time while the device is connected. We do not have to become compatible with MIDI-CI, but to become compatible with OS-provided MIDI 2.0 APIs (such as CoreMIDI) it will either be required or at least more straightforward to follow their presumed design. Fixing the internal MIDI protocol at instantiation time, however, makes that impossible, even if we don't have distinct API entry points for MIDI 1 and MIDI 2.
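To make the design tension concrete, here is a purely hypothetical sketch; neither the enum nor `fluid_synth_set_midi_protocol()` exists in fluidsynth. It only contrasts run-time protocol switching (as assumed by MIDI-CI "Set New Protocol") with fixing the protocol once at `new_fluid_synth()` time.

```c
#include <fluidsynth.h>

/* Hypothetical names, for illustration only. */
typedef enum
{
    FLUID_MIDI_PROTOCOL_1_0,  /* classic 7-bit value semantics */
    FLUID_MIDI_PROTOCOL_2_0   /* UMP / 16-bit value semantics  */
} fluid_midi_protocol_t;

/* A MIDI driver could call this when it sees a MIDI-CI protocol negotiation
 * (or a CoreMIDI protocol change); here it is only a stub. */
static int fluid_synth_set_midi_protocol(fluid_synth_t *synth,
                                         fluid_midi_protocol_t protocol)
{
    (void)synth;
    (void)protocol;
    return FLUID_OK; /* a real implementation would reconfigure value scaling */
}
```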
This only partially applies.
The sequencer processes events, not raw MIDI messages.
fluidsynth could expose helper functions to mitigate this particular conversion trouble.
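A sketch of what such helpers could look like; no such functions exist in fluidsynth today. Downscaling a 16-bit MIDI 2.0 value to 7 bits is a plain right shift, and the upscaling shown here uses simple bit repetition so that 0x00 maps to 0x0000 and 0x7F to 0xFFFF. Note that the MIDI Association's bit-scaling recommendation treats the centre value specially, so a real implementation should follow the published spec rather than this sketch.

```c
#include <stdint.h>

/* Hypothetical conversion helpers for velocity-like values. */
static uint8_t fluid_midi2_vel_to_midi1(uint16_t vel16)
{
    return (uint8_t)(vel16 >> 9);              /* keep the top 7 bits */
}

static uint16_t fluid_midi1_vel_to_midi2(uint8_t vel7)
{
    uint16_t v = (uint16_t)(vel7 & 0x7F);
    return (uint16_t)((v << 9) | (v << 2) | (v >> 5)); /* bit-repeat fill */
}
```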
Fluidsynth doesn't know about per-note controllers. It's one of the open points in the first comment, so of course we need a dedicated way to support this new functionality; I was talking about existing functionality. It could very well be that we'll have to add a behaviour switch enum or something, but the conversion argument alone is not enough to justify this, IMO. Perhaps give me some time to read about MIDI-CI and the Universal SysEx messages you mentioned and we'll see...
Alright. Things should become clearer with some working implementation, and then we'd benefit from these discussions. So far, all the new bits could be hidden in the non-public API, so we could discuss how useful or useless the API signatures are once some implementation gets ready.
I am going to make what are likely to be naïve comments as someone who does not fully understand MIDI 2.0, and probably not even MIDI 1.0. It is not my intent to diminish anything being discussed by those who have taken the time to develop a more thorough understanding of MIDI 2.0. I am focusing on what is in:

- MIDI 2.0 Specification Overview with Minimum Requirements, MIDI Association Document: M2-100-U, Document Version 1.1, Draft Date May 11, 2023, Published June 15, 2023
- MIDI Capability Inquiry (MIDI-CI) Bidirectional Negotiations for MIDI Devices, MIDI Association Document: M2-101-UM, Document Version 1.2, Draft Date May 11, 2023, Published June 15, 2023
- Universal MIDI Packet (UMP) Format and MIDI 2.0 Protocol With MIDI 1.0 Protocol in UMP Format, MIDI Association Document: M2-104-UM, Document Version 1.1.1, Draft Date 2023-07-19, Published 2023-07-19

To be totally transparent, I am ambivalent about MIDI 2.0. I have a nagging feeling that it is a solution in search of a problem, concocted by a committee of corporate software hotshots, and that it is too convoluted to ever capture the interest of the vast majority of working musicians who use MIDI 1.0. I sense that there are those within the MMA who have the same feeling. The Executive Summary in MIDI Capability Inquiry begins:
Recognizing that the MIDI 2.0 specifications are still drafts, I would encourage the FluidSynth developers to move slowly with regard to adding MIDI 2.0 capabilities. It might be helpful to state the goals for FluidSynth with reference to the MIDI 2.0 Specification Overview, Section 5, Minimum Compatibility Requirements of MIDI 2.0. At least one of the following is needed to claim MIDI 2.0 capability: A. MIDI-CI to at least its minimum requirements. Depending on whether A or B is supported there are additional things required.

Bidirectional communication is an important element of MIDI 2.0. I don't think this will pose any particular difficulty for FluidSynth, but it is worth keeping in mind. What might be more important is the classification as a Receiver and/or Sender. I believe FluidSynth is primarily a Receiver; AFAIK only the MIDI File Player, when used to send MIDI messages, causes FluidSynth to act as a Sender. I think it might be worthwhile to consider at least conceptually separating the MIDI File Player as a separate MIDI Device, so there are no complications from having to treat FluidSynth as a MIDI Device that is capable of being both a Receiver and a Sender. Perhaps Function Blocks, as described in Universal MIDI Packet Section 6, p. 28, address this issue?

I think Universal MIDI Packet Section 6.2, MIDI 1.0 Byte Stream Ports, p. 30, deserves careful study. MIDI 1.0 compatibility is something the MMA has given a lot of thought, and any work done adding MIDI 2.0 capabilities to FluidSynth should follow the guidance given in the MIDI 2.0 specs for maintaining MIDI 1.0 compatibility.

Again, I apologize for rehashing things that most people participating in this thread have probably thought about and moved beyond long ago. But hopefully having to explain these things to someone slow will provide additional clarity and focus to the work you are doing.
The MIDI 2.0 specification has been released:
https://www.midi.org/articles-old/details-about-midi-2-0-midi-ci-profiles-and-property-exchange
This issue serves as a placeholder to figure out what's new and whether / how it can be adopted for fluidsynth.