PPG WaveGenerator Updated With AudioBus, Improved MIDI Support

Wolfgang Palm has released a free update to PPG WaveGenerator, his ‘next generation synthesizer, building on the heritage of the PPG Wave keyboards.’

Here’s what’s new in PPG WaveGenerator 1.3:

  • Audiobus support
  • MIDI Voice Per Channel mode
  • MIDI mono mode
  • MIDI Sustain pedal
  • Improved Keyboard Octave Shifting
  • Optimization of the audio engine – more voices

PPG WaveGenerator is $19.99 in the App Store.

If you’ve used PPG WaveGenerator, let us know what you think about it!

21 thoughts on “PPG WaveGenerator Updated With AudioBus, Improved MIDI Support”

  1. The UI looks like a prototype that went into production… I really wish they’d make use of sliders and knobs in the right places and make the entire screen usable as a keyboard; the keys are really awful right now.
    A good UX person could definitely give it a nice makeover.
    But the sounds are just killer: great sound and awesome controls on every parameter. And the rate at which the devs are adding the latest features is great; they are quick to address issues and add features. Kudos!!!

    1. I agree the UI could look newer, but I like that it doesn’t have knobs and faders. Why do we need faders in a soft synth? PPG provides fader-like visual feedback while you are adjusting a parameter, then collapses it when you aren’t. I think that’s awesome! I wish more soft synths worked like that. Don’t get me wrong, I love hardware faders, but do we really need them taking up real estate on our screens? I say no. I do think PPG would benefit from a full-screen keyboard, though.

      Tip: while adjusting something in PPG, drag to the right for fine adjustment and to the left for coarse. It can get REALLY precise because you can use the whole screen to adjust one setting.

      1. I personally think it needs a performance screen, with a keyboard, two XY pads, and a few other little things, like something to change presets, and so on. That would make it more of an instrument.

        1. Well, I think of it, and use it, as an instrument, but I wouldn’t mind seeing some of those changes you mention.

          And I’d like to see a hold *button* on the keyboard, as I want to be able to do it from the front of the iPad, not via a MIDI send.

    1. It means that you can properly set up a fretless synth (you need to hop across channels; it behaves like multi-timbral, but there is only one voice… Normal Omni behavior is wrong because you can’t apply the pitch wheel to channels independently. The MIDI spec doesn’t say to do it the way that most people do, i.e. just rewrite all channels to zero, but that’s what most people do, so Omni usually only works correctly on piano controllers as a result, because they don’t have a pitch wheel per finger.)

      1. But… it looks like you need to set the pitch bend range to 4 to get it to work right. The MIDI spec is ambiguous here too, and I had it wrong for a few weeks in this exact same way. But I set my synth to a pitch bend range of 4 and it works correctly.

        1. Rob – based on what you know, is this what’s needed for iOS synths to work correctly with things like the QuNexus, which allows per-note expression?

      2. Thanks!

        So, it’s a way to get poly aftertouch and beyond (if you’re willing to take the time to map it to your controller). But it seems suited for monophonic use? Until iPads are powerful enough to use 128 voices. 🙂 Or there is a device that dynamically assigns a MIDI channel at every note-on.

        I hope this is paving the way for a percussion mode where we can load multiple monophonic voices.

        Now I have to figure out the MIDI CCs for X and Y. I’ll start sending it random CCs in SunVox.
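[The “voice per channel” scheme discussed in this thread can be sketched in a few lines. This is a hypothetical illustration only, not WaveGenerator’s actual implementation: each note-on claims a free MIDI channel so pitch bend and CCs can be applied per note. It builds raw MIDI status bytes; no real MIDI library or device is assumed.]

```python
# Hypothetical "voice per channel" MIDI sender: each held note gets its
# own channel, so pitch bend can be applied to that note independently.
class VoicePerChannelSender:
    def __init__(self, channels=range(16)):
        self.free = list(channels)   # channels available for new notes
        self.active = {}             # note number -> assigned channel

    def note_on(self, note, velocity=100):
        ch = self.free.pop(0)        # claim a fresh channel for this note
        self.active[note] = ch
        return bytes([0x90 | ch, note, velocity])

    def pitch_bend(self, note, bend):
        # bend is a 14-bit value; 0x2000 is center (no bend)
        ch = self.active[note]
        return bytes([0xE0 | ch, bend & 0x7F, (bend >> 7) & 0x7F])

    def note_off(self, note):
        ch = self.active.pop(note)
        self.free.append(ch)         # recycle the channel for later notes
        return bytes([0x80 | ch, note, 0])
```

[With this scheme, two held notes land on channels 0 and 1, and bending one leaves the other untouched; this is also roughly why a fixed, agreed pitch bend range matters, since sender and receiver must interpret the 14-bit bend value with the same semitone width.]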

    1. Many of the patches provide similar expression control to Animoog (effectively polyphonic pressure control) via sliding up and down the keys. The simplest way to capture that kind of performance is from the keyboard itself (even if it is nowhere near as elegant as Magellan’s or Animoog’s ribbon-style keyboard interfaces), so MIDI out is pretty essential.

      1. It is more difficult to implement MIDI out because WG uses per-note MIDI: every note you hit is put on a new MIDI channel. It does this so it can capture XY movement on the keyboard as MIDI CC values. For instance, if you hold down three notes and move vertically on the keyboard, WG would have to send three separate channels of MIDI CC #1.

        Has anyone tried to capture this type of MIDI signal? Do you just record 16 channels of MIDI at the same time?
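[One way to picture the recording question: “16 channels at once” is essentially just grouping incoming messages by their channel nibble. Here is a minimal, hypothetical sketch (plain Python, no MIDI library assumed) that demultiplexes such a per-note stream into one CC log per channel.]

```python
# Hypothetical demultiplexer for a per-note MIDI stream (one channel per
# held note, as described above): collects CC #1 values per channel.
from collections import defaultdict

def split_cc_by_channel(messages, cc_number=1):
    """Group controller values by MIDI channel.

    messages: iterable of (status, data1, data2) byte triples.
    Returns {channel: [value, ...]} for the given CC number.
    """
    logs = defaultdict(list)
    for status, data1, data2 in messages:
        if status & 0xF0 == 0xB0 and data1 == cc_number:  # control change
            logs[status & 0x0F].append(data2)
        # note on/off and other messages are ignored in this sketch
    return dict(logs)
```

[A sequencer that records all 16 channels into one track is doing something equivalent internally; the per-note expression survives as long as the channel information is preserved on playback.]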

    2. MIDI in and out should be part of any serious software synth – just like they’ve been on hardware synths for 30 years.

      In defense of PPG & Animoog, though, they both offer types of polyphonic control that have never been very practical with hardware synths and MIDI. Based on what several developers have said here on previous posts, there really is no set way to do some of these things with MIDI.

      This stuff will come – but we need to let Moog and PPG and other developers know that we want polyphonic control over sounds, and they will need to work together to implement it.

  2. For a synth, MIDI IN is an absolute must (and now, really, so is Audiobus). MIDI out is a lot less interesting IMO; there are already a ton of good controller apps out there (TouchOSC, etc.), and I’m not sure what making this send note on/off messages via MIDI OUT would add to the party.

    1. Read the post above (December 18, 2012, 11:14 am):
      It’s not just about “send note on/off messages”. Have you actually used WaveGenerator and its keyboard?
