26 thoughts on “SampleWiz Sneak Preview”

  1. It's time for you to RETIRE the thought of MIDI functionality for the iPad.
    Give all the developers another year or two to get through all the trials and errors.

  2. I think it's fair to hold their feet to the fire on this – but we shouldn't let inconsistent MIDI support keep us from appreciating and using what's possible now.

    Two of my favorite synths – an MOTM modular and a Pro-One – have no native MIDI support, and they are no less useful.

  3. can i control this ipod app with another ipod app? please update front page with more ipod apps that i can look at with my app. also, don't send anyone to musical equipment trade shows, just pull the worst shit off an aggregator.

  4. lol. Pfft.

    There are no trials and errors. They should just fucking follow what Finger did with MoDrum and BassLine.

    1. Perfect MIDI sync – across no less than 3 different connection methods.
    2. Apps that can communicate via sync on the SAME iDevice.
    3. Apps that remain open and functioning while you switch to other apps or shut off the screen.
    4. Apps that can communicate across multiple iOS devices.

    That was all accomplished by ONE guy – not even 6 months after CoreMIDI was implemented.

    AND he's generously offered his knowledge to help any other developer do the same thing.

    Therefore: It's fucking LUDICROUS that any other developers couldn't implement the same features immediately.

    ffs.

  5. I agree. But consider this: It would be UNTHINKABLE to release a hardware synth without fully functioning MIDI. Un..fucking..thinkable.

    I utterly don't understand – when it's patently obvious that it can be done – why that wouldn't also apply to any iPad instrument app?

    Rudess is gonna do what Rudess is gonna do. But it's particularly shameful that he, of all people, wouldn't insist that his programmers include MIDI in apps that feature his name.

    lol.

    Jebus. It's shameful that I have to actually keep preaching this crap. lol.

  6. I actually got the chance to talk to JR himself, and he said that originally he explicitly didn’t want MIDI in MorphWiz because he thought it detracted from a complete “in the box” experience that didn’t rely on extra gear to make music. He also said that after talking to a number of (sane) people, he realized that was a stupid way of thinking, and that MIDI actually made it much more useful.

  7. Much more useful, but still no MIDI update in MorphWiz. lol.

    meh. I'm going to make this easy on devs: I'm not buying any more apps that don't give me full MIDI functionality.

    There. Simple. lol.

  8. “It would be UNTHINKABLE to release a hardware synth without fully functioning MIDI. Un..fucking..thinkable.”

    @loopstationzebra: Not if you’re Korg, apparently. However, I kind of like the Monotron, and I’m tempted by the Monotribe, neither of which seems to have much in terms of external control interfaces. And maybe synthtopia has ruined my brain, but I am beginning to think that MIDI may be overrated and that CV/Gate/Sync has some advantages.

    1. guest – agreed!

      I'm a big fan of both iApps and MIDI – but nothing is tighter than sticking with CVs and gates.

      Also – old computer & hardware MIDI sequencers are much tighter than MIDI sequenced from a modern computer DAW, because the old sequencers weren't juggling 50 background tasks.

      loopstationzebra – it's as simple as don't buy apps you don't like. There are more music apps popping up on iOS than on any other platform, so just vote with your money. I wish there was a "MIDI Seal Of Approval" on apps, though!
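
      For anyone curious, the CV mapping itself is dead simple. A minimal Swift sketch, assuming the common Moog-style 1 V/octave standard (the note that sits at 0 V varies by converter, so it's a parameter here, not gospel):

        // Map a MIDI note number to a 1 V/octave control voltage.
        // zeroVoltNote is an assumption - converters differ on the 0 V reference.
        func controlVoltage(midiNote: Int, zeroVoltNote: Int = 24) -> Double {
            return Double(midiNote - zeroVoltNote) / 12.0   // 12 semitones per volt
        }

        let cv = controlVoltage(midiNote: 60)   // middle C -> 3.0 V with C1 at 0 V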

  9. MorphWiz with MIDI does appear to be under development, although it isn’t out yet – he demonstrated it at NAMM and Macworld.

  10. Well… check out my various diatribes about Korg in any number of forums and blogs, lol.

    Korg has truly forgotten what the word MIDI means…

    Shameful.

  11. There are multiple options for integrating CVs into DAW workflows.

    Most people using CVs, though, sync their gear and then record multiple tracks of tightly synced audio.

    That's a different discussion, though!

  12. iOS musicians: you're in luck! Flush from his success with MorphWiz, legendary keyboardist and app developer Jordan Rudess brings us SampleWiz, a new iOS app with clear, liquid tones that bring instant relief for those who have trouble getting their creative juices flowing freely.

  13. In general: MIDI is a support headache on iPad. Everybody wants a $5 app, but you spend all day teaching everybody how to hook up MIDI to your app, which means that in the end, the app should have been double the price everybody thinks they should pay. Self-containment is a beautiful thing: no stuck notes, no WiFi latency, no combinations of hardware you don't own to test against.

    And it cannot be said enough that the iPad is NOT a box of buttons and a pitch wheel, which is the only context in which MIDI excels. Unfortunately, the glass-surface/fretless nature of the interface means that the limitations of MIDI will simply cripple some instruments (i.e., 10 fingers of pitches with a 6-octave bend range, microtonality. Fretless microtonality is the last straw, because there are no tuning tables in this context and the design of MIDI bends is inappropriate for the task).
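
    A back-of-envelope Swift sketch of that resolution problem, using nothing but the standard 14-bit bend message – the numbers fall out directly:

      // Effective pitch resolution of a 14-bit bend spread over +/- range semitones.
      func centsPerBendStep(bendRangeSemitones: Double) -> Double {
          let totalCents = bendRangeSemitones * 2.0 * 100.0   // full sweep, in cents
          return totalCents / 16384.0                         // 14 bits = 16384 steps
      }

      let defaultRange = centsPerBendStep(bendRangeSemitones: 2)    // ~0.024 cents/step
      let sixOctaves   = centsPerBendStep(bendRangeSemitones: 72)   // ~0.88 cents/step

    The wider you set the (non-portable) bend range, the coarser every step gets.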

    MIDI is being ignored on this platform because it isn't really relevant. You need an internal sound engine no matter what else you do, for starters. What should probably happen *instead* of MIDI is a standard similar to Pd or MaxMSP to transport patches and fx *between* instruments, and disconnect them after the transfer is done. There is this assumption that you must connect gesture hardware to an external brain.
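
    To make that patch-transport idea concrete, a hypothetical Swift sketch – every name here is made up, and real apps would have to agree on a schema:

      import Foundation

      // Hypothetical patch format: a patch is just data, so it can be encoded,
      // handed to another app (pasteboard, file, network), and the link dropped.
      struct Patch: Codable {
          let name: String
          let oscillatorRatios: [Double]   // harmonic ratios, not note numbers
          let filterCutoffHz: Double
          let adsr: [Double]               // attack, decay, sustain, release
      }

      let patch = Patch(name: "glass pad", oscillatorRatios: [1.0, 1.5, 2.0],
                        filterCutoffHz: 1200, adsr: [0.01, 0.2, 0.7, 0.5])
      let blob = try! JSONEncoder().encode(patch)   // try! only for sketch brevity
      let restored = try! JSONDecoder().decode(Patch.self, from: blob)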

  14. (and of course I know and work with these guys. i am curious to ask them the MIDI question myself. i'm just giving you my viewpoint. i don't think the dearth of MIDI support in iPad apps is mysterious at all. it is probably not as wrong or as lazy as it sounds, either. as an iPad developer, the combination of a raw AudioUnit sample buffer, raw touches continuous across the screen, and OpenGL is quite awesome. This combination puts you in a completely different mindset from any MIDI instrument.)

  15. lol. now that's the spirit. and that's how it should be. 😉 it mostly depends upon the type of music app you are writing.

    i didn't put MIDI into Mugician. it would have been a distraction at the time, and would have instantly killed off all of the microtonality experimentation that made it different. my new app might have it, to satisfy the small number of (very loud! but sometimes high profile) people who will even try it once. i hear claims that won't turn out to be true, like "it would be a $100 app" with MIDI. but it definitely will attract some very loud and high maintenance users. 😉

    writing your own sound engine is 100x harder than just sticking in MIDI support. ipad apps aren't all forgoing it out of laziness. i think in time, you will appreciate what this platform is doing: slowly killing off a standard that is killing actual creativity. when you finally twisted your original idea to fit into MIDI's very limited view of gestures, you end up with an ordinary instrument that might be a novel shape…

    my expectation is for Pd, or maybe MaxMSP, ChucK, or SuperCollider… SOMETHING that's actually frequency-oriented from the ground up… to get created for the iPad, and to cause you to wake up one day and realize that the new 'protocol' is portable patches moving between instruments that only connect to beat-sync, if at all. the fact that you are starting with a smooth continuous surface kind of dictates that this is what happens on this platform in the long term.

    think of a violin-like interface for new instruments. i know that some people see the world in terms of step sequencers and rather normal gestural interfaces… some of that stuff is cool, but i don't think that's where any more big changes will come from.

  16. "i think in time, you will appreciate what this platform is doing: slowly killing off a standard that is killing actual creativity. when you finally twisted your original idea to fit into MIDI's very limited view of gestures, you end up with an ordinary instrument that might be a novel shape."

    As much as I think MIDI is a requirement for standard music apps – I'd have to agree that it imposes limitations on what you can do.

    It's great to see that we can have both worlds on a new platform like the iPad.

  17. The main use cases for MIDI:

    1) MIDI sync: some way to beat sync. this matters greatly if you have a sequencer app.

    2) MIDI in: somebody made yet another iOS app with an interface that looks like a keyboard, so you know… you must have MIDI INPUT to drive the brain, because a keyboard on iPad is just unplayable. the actual mistake is the keyboard interface on the iPad. It's just ergonomically wrong in every way IMHO. Check out what I am doing with Mugician and Pythagoras, and what Roger Linn is doing with Linnstrument. There are other layout alternatives that work. My Mugician and Pythagoras playing is far better than my guitar playing, in spite of the fact that I am a lefty guitarist playing what amounts to a right-handed layout. The best keyboardists are a shadow of their keyboardist selves on iPad keyboard layouts. It's useful to plug in your favorite keyboard for MIDI IN, but the problem has more to do with people consistently choosing to put a keyboard interface on a piece of hardware that just doesn't support those dimensions in any ergonomically reasonable way.

    3) MIDI Out: I am getting to the point where it will be easier to just implement MIDI out rather than continue to tweak the sound engine. For the simple case in mono mode, I am sure it will be pretty fantastic. But my instrument explicitly supports 12-note-per-octave, 24-note-per-octave, 53-note-per-octave, and 665-note-per-octave scales. (These are meaningful numbers that have special properties with respect to the harmonic series… I digress.) The MIDI protocol is just BOBO when polyphony becomes involved. The whole point of microtonality is to get the wavelengths of notes in a chord to lock so that they land exactly on zero, like a couple of drummers playing quarter and triplet notes landing on 1 together. The sound difference is astonishing, and it is something that you can't fix with good patches in the engine, because it's an issue with the pitches going in right. MIDI bends assume a box of buttons and a wheel: the whole channel goes up and down with the bend message, so this is useless for microtonality. So you put every note on a different channel, but a lot of hardware doesn't like your instrument to be spread out across multiple channels like this. You have 10 fingers, and a lot of stuff thinks that CH10 should be drums. Then you get to the issue of bends in general… 14 bits of resolution, and by default it's only across a whole tone up or down. If you increase the range (this is not portable), then you smear this resolution across wider ranges. (A sketch of the channel-per-note mapping follows below.)
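
    Here is roughly what that channel-per-note workaround looks like – a Swift sketch, assuming the receiver is left at the default +/-2 semitone bend range (the non-portable part):

      import Foundation

      // Sketch: map an arbitrary frequency to (channel, note, 14-bit bend).
      // Each finger gets its own channel so its bend can't drag other notes.
      func noteAndBend(frequency: Double, channel: Int,
                       bendRangeSemitones: Double = 2.0) -> (channel: Int, note: Int, bend14: Int) {
          let exact = 69.0 + 12.0 * log2(frequency / 440.0)   // fractional MIDI note
          let note = Int(exact.rounded())
          let offset = exact - Double(note)                   // remainder, in semitones
          let bend = 8192 + Int((offset / bendRangeSemitones) * 8192.0)
          return (channel, note, min(16383, max(0, bend)))
      }

      let finger1 = noteAndBend(frequency: 441.3, channel: 0)   // note 69, bend ~8401

    Ten fingers means ten channels, which is exactly where the CH10-is-drums convention starts fighting you.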

    FFS!!! It's so dumb! If you start a tone at 1 Hz and slowly double the frequency every few seconds up to the highest pitch, then that's a perfectly reasonable thing to do with frequencies. If you do that in different directions with multiple strings, this is perfectly valid. MIDI is just too high level. The sound rendering brain needs to be FREQUENCY oriented, while all the NOTE ON / NOTE OFF junk is a filter in between the keyboard and the engine.
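
    A minimal Swift sketch of what a frequency-oriented engine core looks like (illustrative names, not from any shipping app) – the voice only ever sees Hz, and any NOTE ON / NOTE OFF layer would sit outside it as a filter:

      import Foundation

      final class SineVoice {
          var frequencyHz: Double = 440.0    // continuously variable, no note grid
          private var phase: Double = 0.0

          func render(into buffer: inout [Float], sampleRate: Double) {
              let increment = frequencyHz / sampleRate
              for i in 0..<buffer.count {
                  buffer[i] = Float(sin(2.0 * .pi * phase))
                  phase += increment
                  if phase >= 1.0 { phase -= 1.0 }
              }
          }
      }

      var samples = [Float](repeating: 0, count: 256)
      let voice = SineVoice()
      voice.frequencyHz = 1.0                // a 1 Hz start is perfectly legal here
      voice.render(into: &samples, sampleRate: 44100)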

    Fretless instruments cannot use tuning tables. If an octave spans 1024 pixels, then you have a 1024-frequencies-per-octave instrument. And these probably aren't even tempered, because you will do something to make sure that simultaneous pitches either snap to a standard set of pitches or chord in harmonic ratios close to the current pixels. This is what Pythagoras' fretless mode is, btw… 665 equal temperament, which is the very deep Just circle-of-fifths approximation beyond the next closest approximation, 53 equal temperament; 12 ET is somewhat weak by comparison but simplifies the music theory enough to wield it.
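
    A Swift sketch of that fretless mapping – pixelsPerOctave and baseHz here are assumptions, not Pythagoras' actual values:

      import Foundation

      // Touch position -> frequency; optionally snap to the nearest 665-ET step.
      func fretlessFrequency(x: Double, pixelsPerOctave: Double = 1024,
                             baseHz: Double = 110, snapToEDO steps: Double? = 665) -> Double {
          var octaves = x / pixelsPerOctave                    // continuous pitch
          if let steps = steps {
              octaves = (octaves * steps).rounded() / steps    // nearest 1/665 octave
          }
          return baseHz * pow(2.0, octaves)
      }

      let raw     = fretlessFrequency(x: 512, snapToEDO: nil)  // truly fretless
      let snapped = fretlessFrequency(x: 512)                  // 665-ET mode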

    If this all sounds weird or esoteric, it's because from a MIDI point of view this kind of thing just never happens. MIDI itself is strangling experimentation and progress. The Fourier transform is microtonal; the natural world's waves are microtonal. Electronic music should have been… It's MIDI's fault. I'm glad that lone developers are playing around with raw sound buffers again.

    That being said… I AM looking at MIDI OUT for Pythagoras. I can probably make it kick-ass in 12ET mode, and be OK in 24ET mode. My experience writing a MIDI keyboard instrument, Xstrument (a Samchillian variant), left me irritated about MIDI though… My main problem will probably be latency. I can play Pythagoras so fast that WiFi may make it unusable. There's an enormous difference between getting beat sync within low tolerance and getting finger-to-ear latency right. More than about 50 ms of latency in an instrument will dash the hope of anybody using it on stage with any real skill, because you can't keep a straight rhythm while you chase the jittered and delayed version of what you feel in your fingers.
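
    The latency budget math is sobering. A Swift sketch – the WiFi figure is an assumed placeholder, not a measurement:

      // Audio buffering alone is cheap; network hops are what blow the budget.
      func bufferLatencyMs(frames: Double, sampleRate: Double = 44100) -> Double {
          return frames / sampleRate * 1000.0
      }

      let engineMs = bufferLatencyMs(frames: 256)   // ~5.8 ms per 256-frame buffer
      let wifiMs   = 20.0                           // assumed one-way WiFi + jitter
      let totalMs  = engineMs * 2 + wifiMs          // in + out buffers + network
      // ~31.6 ms already eats most of the ~50 ms finger-to-ear budget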
