Byuu and co.’s “Ruby” abstraction paradigm

Ruby

Ruby abstracts a single feature (audio, video, or input) over several different drivers, so the driver can be chosen by name at runtime while only the backends enabled at build time are compiled in.

Shouts to the original contributors of bsnes/bsnes-classic. My hope is that by explaining Ruby, I will understand how to apply the same idea in my own project, where I aim to produce a similar self-contained audio API abstraction over Qt/SDL.

I will only explain the Audio section. I sometimes combine .hpp and .cpp files into one code listing; where I do, I mark the beginning of the .cpp portion with //IMPL.

Disclaimer

I created this document while studying the Ruby hardware abstraction layer created by various authors – BearOso, byuu, Nach, RedDwarf, wertigon, _willow_, OV2, in no particular order. This library is not to be confused with the Ruby language; it is in fact C++. You should be able to find it at http://byuu.org, but since he's updating his website it is not currently up, so you will have to use this page as a reference for now. This document explains Ruby as it is in bsnes-classic. Yes, I linked my fork of bsnes-classic; the original can be found at BSNES-Classic.

Be forewarned that I did not care much to keep perfecting this document as my understanding grew; I used it as a starting point. I too often grow wings from my basis and forget, or am too lazy, to come back to the ground and document all that I've learned. It's arduous. So here's to another imperfect document 🙂

PreReq

Ruby uses nall, another invention in bsnes and a rather obscure one. Understanding it will help a lot.

ruby.hpp

Here is where the high-level __Interface classes are defined (__ is my substitute for [Audio|Video|Input]). But before these class definitions, this file includes __.cpp, so let's look at those first and come back to this.

audio.cpp

    class Audio {
    public:
      static const char *Volume;
      static const char *Resample;
      static const char *ResampleRatio;

      static const char *Handle;
      static const char *Synchronize;
      static const char *Frequency;
      static const char *Latency;

      virtual bool cap(const nall::string& name) { return false; }
      virtual nall::any get(const nall::string& name) { return false; }
      virtual bool set(const nall::string& name, const nall::any& value) { return false; }

      virtual void sample(uint16_t left, uint16_t right) {}
      virtual void clear() {}
      virtual bool init() { return true; }
      virtual void term() {}

      Audio() {}
      virtual ~Audio() {}
    };

I guess cap() means is_capable. We see a bunch of static members and virtual functions with default, mostly no-op implementations. As for the virtual destructor: AudioInterface holds the driver through a base Audio* pointer and deletes it through that pointer in term(), so the destructor must be virtual for the derived driver's cleanup to run.
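
A quick illustration of why that matters (my own example, not from the source):

    //AudioInterface::term() effectively does this:
    Audio *p = new AudioOpenAL();  //p actually points at a derived driver wrapper
    delete p;                      //runs ~AudioOpenAL() (which frees pAudioOpenAL)
                                   //only because ~Audio() is declared virtual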

Notice the small use of nall::any. I understand what it's used for here, but not much about how it works.
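
For what it's worth, nall::any is a type-erased value container in the same spirit as boost::any: it stores a value of arbitrary type behind a small polymorphic holder, and any_cast<T>() pulls the value back out. A minimal sketch of the idea (illustrative only; the real nall implementation also handles copying, assignment, and type checking, which I omit here):

    //minimal sketch of the type-erasure idea behind nall::any (not the actual nall code)
    struct any {
      struct placeholder {
        virtual ~placeholder() {}
      };
      template<typename T> struct holder : placeholder {
        T value;
        holder(const T& v) : value(v) {}
      };

      placeholder *content;
      any() : content(0) {}
      template<typename T> any(const T& v) : content(new holder<T>(v)) {}
      ~any() { delete content; }
    };

    //any_cast recovers the value by downcasting to the concrete holder<T>
    template<typename T> T any_cast(const any& a) {
      return static_cast<any::holder<T>*>(a.content)->value;
    }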

Now, regarding the static variables: these are later defined as what I consider to be "string labels" for the kinds of settings you can interact with.

    const char *Audio::Volume = "Volume";
    const char *Audio::Resample = "Resample";
    const char *Audio::ResampleRatio = "ResampleRatio";

    const char *Audio::Handle = "Handle";
    const char *Audio::Synchronize = "Synchronize";
    const char *Audio::Frequency = "Frequency";
    const char *Audio::Latency = "Latency";

AudioInterface is the class that actually interacts with these static Audio string labels.

ruby.hpp

    class AudioInterface {
    public:
      void driver(const char *driver = "");
      const char* default_driver();
      const char* driver_list();
      bool init();
      void term();

      bool cap(const nall::string& name);
      nall::any get(const nall::string& name);
      bool set(const nall::string& name, const nall::any& value);

      void sample(uint16_t left, uint16_t right);
      void clear();
      AudioInterface();
      ~AudioInterface();

    private:
      Audio *p;

      unsigned volume;

      //resample unit
      double hermite(double mu, double a, double b, double c, double d);
      bool   resample_enabled;
      double r_step, r_frac;
      int    r_left[4], r_right[4];
    };
    
    extern AudioInterface audio;

Notice the Audio *p; this is the pointer to whichever driver is currently selected.
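
To make the label-based API concrete, here is roughly how a frontend drives this interface. The driver name and values are hypothetical, but ui-qt does essentially this in Application::init(), listed near the end of this page:

    audio.driver("OpenAL");                        //instantiate a driver; "" picks the default
    audio.set(Audio::Frequency, (unsigned)32000);  //forwarded to the driver through p
    audio.set(Audio::Volume, (unsigned)100);       //handled by AudioInterface itself
    if(audio.init() == false) {
      audio.driver("None");                        //fall back to the do-nothing base driver
      audio.init();
    }

    if(audio.cap(Audio::Latency)) {
      unsigned latency = any_cast<unsigned>(audio.get(Audio::Latency));
    }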

ruby.cpp

The __Interface classes are implemented here, except that the AudioInterface implementation has been split out into its own ruby_audio.cpp. I imagine it was planned to do the same with the other Interface implementations. For that reason, the listing below is ruby_audio.cpp.

Notice that we do not yet see how p is given an address; that happens in the driver() method, which is not part of this listing (I sketch what it presumably looks like after the listing). I ignore most implementation details for now. The general idea is: when p is set, forward to the driver; otherwise fall back to default behavior. Near the end are some audio routines (volume scaling, hermite resampling) that do not require p at all.

    const char *Audio::Volume = "Volume";
    const char *Audio::Resample = "Resample";
    const char *Audio::ResampleRatio = "ResampleRatio";

    const char *Audio::Handle = "Handle";
    const char *Audio::Synchronize = "Synchronize";
    const char *Audio::Frequency = "Frequency";
    const char *Audio::Latency = "Latency";

    bool AudioInterface::init() {
      if(!p) driver();
      return p->init();
    }

    void AudioInterface::term() {
      if(p) {
        delete p;
        p = 0;
      }
    }

    bool AudioInterface::cap(const string& name) {
      if(name == Audio::Volume) return true;
      if(name == Audio::Resample) return true;
      if(name == Audio::ResampleRatio) return true;

      return p ? p->cap(name) : false;
    }

    any AudioInterface::get(const string& name) {
      if(name == Audio::Volume) return volume;
      if(name == Audio::Resample) return resample_enabled;
      if(name == Audio::ResampleRatio) return r_step;

      return p ? p->get(name) : false;
    }

    bool AudioInterface::set(const string& name, const any& value) {
      if(name == Audio::Volume) {
        volume = any_cast<unsigned>(value);
        return true;
      }

      if(name == Audio::Resample) {
        resample_enabled = any_cast<bool>(value);
        return true;
      }

      if(name == Audio::ResampleRatio) {
        r_step = any_cast<double>(value);
        r_frac = 0;
        return true;
      }

      return p ? p->set(name, value) : false;
    }

    //4-tap hermite interpolation
    double AudioInterface::hermite(double mu1, double a, double b, double c, double d) {
      const double tension = 0.0; //-1 = low, 0 = normal, 1 = high
      const double bias    = 0.0; //-1 = left, 0 = even, 1 = right

      double mu2, mu3, m0, m1, a0, a1, a2, a3;

      mu2 = mu1 * mu1;
      mu3 = mu2 * mu1;

      m0  = (b - a) * (1 + bias) * (1 - tension) / 2;
      m0 += (c - b) * (1 - bias) * (1 - tension) / 2;
      m1  = (c - b) * (1 + bias) * (1 - tension) / 2;
      m1 += (d - c) * (1 - bias) * (1 - tension) / 2;

      a0 = +2 * mu3 - 3 * mu2 + 1;
      a1 =      mu3 - 2 * mu2 + mu1;
      a2 =      mu3 -     mu2;
      a3 = -2 * mu3 + 3 * mu2;

      return (a0 * b) + (a1 * m0) + (a2 * m1) + (a3 * c);
    }

    void AudioInterface::sample(uint16_t left, uint16_t right) {
      int s_left  = (int16_t)left;
      int s_right = (int16_t)right;

      if(volume != 100) {
        s_left  = sclamp<16>((double)s_left  * (double)volume / 100.0);
        s_right = sclamp<16>((double)s_right * (double)volume / 100.0);
      }

      r_left [0] = r_left [1];
      r_left [1] = r_left [2];
      r_left [2] = r_left [3];
      r_left [3] = s_left;

      r_right[0] = r_right[1];
      r_right[1] = r_right[2];
      r_right[2] = r_right[3];
      r_right[3] = s_right;

      if(resample_enabled == false) {
        if(p) p->sample(left, right);
        return;
      }

      while(r_frac <= 1.0) {
        int output_left  = sclamp<16>(hermite(r_frac, r_left [0], r_left [1], r_left [2], r_left [3]));
        int output_right = sclamp<16>(hermite(r_frac, r_right[0], r_right[1], r_right[2], r_right[3]));
        r_frac += r_step;
        if(p) p->sample(output_left, output_right);
      }

      r_frac -= 1.0;
    }

    void AudioInterface::clear() {
      r_frac = 0;
      r_left [0] = r_left [1] = r_left [2] = r_left [3] = 0;
      r_right[0] = r_right[1] = r_right[2] = r_right[3] = 0;
      if(p) p->clear();
    }

    AudioInterface::AudioInterface() {
      p = 0;
      volume = 100;
      resample_enabled = false;
      r_step = r_frac = 0;
      r_left [0] = r_left [1] = r_left [2] = r_left [3] = 0;
      r_right[0] = r_right[1] = r_right[2] = r_right[3] = 0;
    }

    AudioInterface::~AudioInterface() {
      term();
    }
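
As mentioned above, the driver() method that actually points p at a driver is not in this listing (presumably it sits alongside the driver-list handling in ruby.cpp). Based on its declaration and on how it is used, it should boil down to something like this; a sketch, not the verbatim bsnes code:

    void AudioInterface::driver(const char *driver) {
      if(p) term();                                 //drop any existing driver first

      if(!driver || !*driver) driver = default_driver();

      if(0);                                        //lets every real branch be an "else if"
      #ifdef AUDIO_OPENAL
      else if(!strcmp(driver, "OpenAL")) p = new AudioOpenAL();
      #endif
      #ifdef AUDIO_ALSA
      else if(!strcmp(driver, "ALSA")) p = new AudioALSA();
      #endif
      //...one branch per compiled-in driver...
      else p = new Audio();                         //"None" or unknown: the no-op base class
    }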

ruby_impl.cpp

Here is the audio section:

    /* Audio */

    #define DeclareAudio(Name) \
      class Audio##Name : public Audio { \
      public: \
        bool cap(const string& name) { return p.cap(name); } \
        any get(const string& name) { return p.get(name); } \
        bool set(const string& name, const any& value) { return p.set(name, value); } \
        \
        void sample(uint16_t left, uint16_t right) { p.sample(left, right); } \
        void clear() { p.clear(); } \
        bool init() { return p.init(); } \
        void term() { p.term(); } \
        \
        Audio##Name() : p(*new pAudio##Name) {} \
        ~Audio##Name() { delete &p; } \
      \
      private: \
        pAudio##Name &p; \
      };

    #ifdef AUDIO_ALSA
      #include <ruby/audio/alsa.cpp>
    #endif

    #ifdef AUDIO_AO
      #include <ruby/audio/ao.cpp>
    #endif

    #ifdef AUDIO_DIRECTSOUND
      #include <ruby/audio/directsound.cpp>
    #endif

    #ifdef AUDIO_OPENAL
      #include <ruby/audio/openal.cpp>
    #endif

    #ifdef AUDIO_OSS
      #include <ruby/audio/oss.cpp>
    #endif

    #ifdef AUDIO_PULSEAUDIO
      #include <ruby/audio/pulseaudio.cpp>
    #endif

    #ifdef AUDIO_PULSEAUDIOSIMPLE
      #include <ruby/audio/pulseaudiosimple.cpp>
    #endif

    #ifdef AUDIO_XAUDIO2
      #include <ruby/audio/xaudio2.cpp>
    #endif

This section declares a macro that automatically generates a thin wrapper class for each driver. I originally misunderstood its point (and suspected it might be pointless), but as best I can tell the idea is this: the pAudio##Name implementation classes (such as pAudioOpenAL below) do not inherit from Audio at all, so the generated Audio##Name class is the adapter that derives from Audio and forwards every virtual call to a heap-allocated pAudio##Name held by reference. That way AudioInterface only ever deals with the Audio base class, while the driver files stay plain classes. The macro is invoked at the end of each driver file, and the #ifdef guards ensure that only the drivers enabled at build time get compiled in.
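
For example, the DeclareAudio(OpenAL) invocation at the bottom of openal.cpp (shown below) expands to essentially this:

    //expansion of DeclareAudio(OpenAL), whitespace added
    class AudioOpenAL : public Audio {
    public:
      bool cap(const string& name) { return p.cap(name); }
      any get(const string& name) { return p.get(name); }
      bool set(const string& name, const any& value) { return p.set(name, value); }

      void sample(uint16_t left, uint16_t right) { p.sample(left, right); }
      void clear() { p.clear(); }
      bool init() { return p.init(); }
      void term() { p.term(); }

      AudioOpenAL() : p(*new pAudioOpenAL) {}
      ~AudioOpenAL() { delete &p; }

    private:
      pAudioOpenAL &p;
    };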

Let’s take a look at

audio/openal.cpp

    /*
      audio.openal (2007-12-26)
      author: Nach
      contributors: byuu, wertigon, _willow_
    */

    #if defined(PLATFORM_OSX)
      #include <OpenAL/al.h>
      #include <OpenAL/alc.h>
    #else
      #include <AL/al.h>
      #include <AL/alc.h>
    #endif

There is more to this file, but first, the reason for the OSX difference: OSX ships a native OpenAL framework in /System/Library/Frameworks/OpenAL.framework/, and framework headers are included by the framework's name (unless the framework's Headers directory is added directly as an include dir), hence <OpenAL/al.h> instead of <AL/al.h>.

    namespace ruby {

    class pAudioOpenAL {
    public:
      struct {
        ALCdevice *handle;
        ALCcontext *context;
        ALuint source;
        ALenum format;
        unsigned latency;
        unsigned queue_length;
      } device;

      struct {
        uint32_t *data;
        unsigned length;
        unsigned size;
      } buffer;

      struct {
        bool synchronize;
        unsigned frequency;
        unsigned latency;
      } settings;

      bool cap(const string& name) {
        if(name == Audio::Synchronize) return true;
        if(name == Audio::Frequency) return true;
        if(name == Audio::Latency) return true;
        return false;
      }

      any get(const string& name) {
        if(name == Audio::Synchronize) return settings.synchronize;
        if(name == Audio::Frequency) return settings.frequency;
        if(name == Audio::Latency) return settings.latency;
        return false;
      }

      bool set(const string& name, const any& value) {
        if(name == Audio::Synchronize) {
          settings.synchronize = any_cast<bool>(value);
          return true;
        }

        if(name == Audio::Frequency) {
          settings.frequency = any_cast<unsigned>(value);
          return true;
        }

        if(name == Audio::Latency) {
          if(settings.latency != any_cast<unsigned>(value)) {
            settings.latency = any_cast<unsigned>(value);
            update_latency();
          }
          return true;
        }

        return false;
      }

      void sample(uint16_t sl, uint16_t sr) {
        buffer.data[buffer.length++] = sl + (sr << 16);
        if(buffer.length < buffer.size) return;

        ALuint albuffer = 0;
        int processed = 0;
        while(true) {
          alGetSourcei(device.source, AL_BUFFERS_PROCESSED, &processed);
          while(processed--) {
            alSourceUnqueueBuffers(device.source, 1, &albuffer);
            alDeleteBuffers(1, &albuffer);
            device.queue_length--;
          }
          //wait for buffer playback to catch up to sample generation if not synchronizing
          if(settings.synchronize == false || device.queue_length < 3) break;
        }

        if(device.queue_length < 3) {
          alGenBuffers(1, &albuffer);
          alBufferData(albuffer, device.format, buffer.data, buffer.size * 4, settings.frequency);
          alSourceQueueBuffers(device.source, 1, &albuffer);
          device.queue_length++;
        }

        ALint playing;
        alGetSourcei(device.source, AL_SOURCE_STATE, &playing);
        if(playing != AL_PLAYING) alSourcePlay(device.source);
        buffer.length = 0;
      }

      void clear() {
      }

      void update_latency() {
        if(buffer.data) delete[] buffer.data;
        buffer.size = settings.frequency * settings.latency / 1000.0 + 0.5;
        buffer.data = new uint32_t[buffer.size];
      }

      bool init() {
        update_latency();
        device.queue_length = 0;

        bool success = false;
        if(device.handle = alcOpenDevice(NULL)) {
          if(device.context = alcCreateContext(device.handle, NULL)) {
            alcMakeContextCurrent(device.context);
            alGenSources(1, &device.source);

            //alSourcef (device.source, AL_PITCH, 1.0);
            //alSourcef (device.source, AL_GAIN, 1.0);
            //alSource3f(device.source, AL_POSITION, 0.0, 0.0, 0.0);
            //alSource3f(device.source, AL_VELOCITY, 0.0, 0.0, 0.0);
            //alSource3f(device.source, AL_DIRECTION, 0.0, 0.0, 0.0);
            //alSourcef (device.source, AL_ROLLOFF_FACTOR, 0.0);
            //alSourcei (device.source, AL_SOURCE_RELATIVE, AL_TRUE);

            alListener3f(AL_POSITION, 0.0, 0.0, 0.0);
            alListener3f(AL_VELOCITY, 0.0, 0.0, 0.0);
            ALfloat listener_orientation[] = { 0.0, 0.0, 0.0, 0.0, 0.0, 0.0 };
            alListenerfv(AL_ORIENTATION, listener_orientation);

            success = true;
          }
        }

        if(success == false) {
          term();
          return false;
        }

        return true;
      }

      void term() {
        if(alIsSource(device.source) == AL_TRUE) {
          int playing = 0;
          alGetSourcei(device.source, AL_SOURCE_STATE, &playing);
          if(playing == AL_PLAYING) {
            alSourceStop(device.source);
            int queued = 0;
            alGetSourcei(device.source, AL_BUFFERS_QUEUED, &queued);
            while(queued--) {
              ALuint albuffer = 0;
              alSourceUnqueueBuffers(device.source, 1, &albuffer);
              alDeleteBuffers(1, &albuffer);
              device.queue_length--;
            }
          }

          alDeleteSources(1, &device.source);
          device.source = 0;
        }

        if(device.context) {
          alcMakeContextCurrent(NULL);
          alcDestroyContext(device.context);
          device.context = 0;
        }

        if(device.handle) {
          alcCloseDevice(device.handle);
          device.handle = 0;
        }

        if(buffer.data) {
          delete[] buffer.data;
          buffer.data = 0;
        }
      }

      pAudioOpenAL() {
        device.source = 0;
        device.handle = 0;
        device.context = 0;
        device.format = AL_FORMAT_STEREO16;
        device.queue_length = 0;

        buffer.data = 0;
        buffer.length = 0;
        buffer.size = 0;

        settings.synchronize = true;
        settings.frequency = 22050;
        settings.latency = 40;
      }

      ~pAudioOpenAL() {
        term();
      }
    };

    DeclareAudio(OpenAL)

    };
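
Two details in the listing worth calling out: sample() packs each stereo pair into a single uint32_t, left in the low 16 bits and right in the high 16 bits, which is the interleaving AL_FORMAT_STEREO16 expects, and update_latency() converts the latency setting from milliseconds into a buffer size in samples. In my own words:

    //how sample() interleaves a stereo pair into one 32-bit word
    static inline uint32_t pack_stereo(uint16_t left, uint16_t right) {
      return (uint32_t)left | ((uint32_t)right << 16);
    }

    //buffer sizing with the constructor defaults (22050 Hz, 40 ms latency):
    //  buffer.size = 22050 * 40 / 1000.0 + 0.5 = 882 samples
    //  each packed sample is 4 bytes, so each queued OpenAL buffer is 3528 bytes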

To bring this all full circle with the SNES core .. and Qt ..

The SNES DSP (snes/dsp/echo.cpp:109) calls the SNES::Audio class's sample() function. SNES::Audio:

  class Audio {
  public:
    void coprocessor_enable(bool state);
    void coprocessor_frequency(double frequency);
    void sample(int16 left, int16 right);
    void coprocessor_sample(int16 left, int16 right);
    void init();

  private:
    bool coprocessor;
    uint32 dsp_buffer[32768], cop_buffer[32768];
    unsigned dsp_rdoffset, cop_rdoffset;
    unsigned dsp_wroffset, cop_wroffset;
    unsigned dsp_length, cop_length;

    double r_step, r_frac;
    int r_sum_l, r_sum_r;

    void flush();
  };

  extern Audio audio;
  //IMPL
  #ifdef SYSTEM_CPP

  Audio audio;

  // …

  void Audio::sample(int16 left, int16 right) {
    if(coprocessor == false) {
      system.interface->audio_sample(left, right);
    } else {
      dsp_buffer[dsp_wroffset] = ((uint16)left << 0) + ((uint16)right << 16);
      dsp_wroffset = (dsp_wroffset + 1) & 32767;
      dsp_length = (dsp_length + 1) & 32767;
      flush();
    }
  }

  // …

  void Audio::flush() {
    while(dsp_length > 0 && cop_length > 0) {
      uint32 dsp_sample = dsp_buffer[dsp_rdoffset];
      uint32 cop_sample = cop_buffer[cop_rdoffset];

      dsp_rdoffset = (dsp_rdoffset + 1) & 32767;
      cop_rdoffset = (cop_rdoffset + 1) & 32767;

      dsp_length--;
      cop_length--;

      int dsp_left  = (int16)(dsp_sample >>  0);
      int dsp_right = (int16)(dsp_sample >> 16);

      int cop_left  = (int16)(cop_sample >>  0);
      int cop_right = (int16)(cop_sample >> 16);

      system.interface->audio_sample(
        sclamp<16>((dsp_left  + cop_left ) / 2),
        sclamp<16>((dsp_right + cop_right) / 2)
      );
    }
  }

  #endif

At some point (directly from sample(), or through flush() when a coprocessor is enabled), system.interface->audio_sample(l, r) is called. I did my homework, and this actually points to ui-qt's Interface class.

There is an SNES::Interface class that ui-qt's Interface subclasses. Its audio_sample() routine is what SNES::Audio calls to push samples out, and the ui-qt override in turn calls the ruby AudioInterface's sample() method (confusingly, the AudioInterface instance is also named audio).
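
Putting it all together, my summary of the path a single sample takes:

  //SNES DSP (snes/dsp/echo.cpp:109)
  //  -> SNES::Audio::sample() / flush()
  //  -> Interface::audio_sample()       (ui-qt's override of SNES::Interface)
  //  -> ruby AudioInterface::sample()   (volume scaling + optional hermite resample)
  //  -> pAudioOpenAL::sample()          (or whichever driver p points to)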

First, let's look at the SNES::Interface abstract class:

  class Interface {
  public:
    virtual void video_refresh(const uint16_t *data, unsigned width, unsigned height) {}
    virtual void audio_sample(uint16_t l_sample, uint16_t r_sample) {}
    virtual void input_poll() {}
    virtual int16_t input_poll(bool port, Input::Device device, unsigned index, unsigned id) { return 0; }

    virtual void message(const string &text) { print(text, "\n"); }
  };

and now ui-qt's Interface class:

  class Interface : public SNES::Interface {
  public:
    void video_refresh(const uint16_t *data, unsigned width, unsigned height);
    void audio_sample(uint16_t left, uint16_t right);
    void input_poll();
    int16_t input_poll(bool port, SNES::Input::Device device, unsigned index, unsigned id);
    void message(const string &text);

    Interface();
    void captureScreenshot(uint32_t*, unsigned, unsigned, unsigned);
    bool saveScreenshot;
    bool framesUpdated;
    unsigned framesExecuted;
  };

  extern Interface interface;
  // IMPL
  Interface interface;
  
  // … 
  
  void Interface::audio_sample(uint16_t left, uint16_t right) {
    if(config().audio.mute) left = right = 0;
    audio.sample(left, right);
  }

  // …
  Interface::Interface() {
    saveScreenshot = false;
  }

SNES::System is provided an interface by way of SNES::System::init()

  void System::init(Interface *interface_) {
    interface = interface_;
    assert(interface != 0);

    supergameboy.init();
    superfx.init();
    sa1.init();
    necdsp.init();
    bsxbase.init();
    bsxcart.init();
    bsxflash.init();
    srtc.init();
    sdd1.init();
    spc7110.init();
    cx4.init();
    obc1.init();
    st0018.init();
    msu1.init();
    serial.init();

    video.init();
    audio.init();
    input.init();

    input.port_set_device(0, config.controller_port1);
    input.port_set_device(1, config.controller_port2);
  }

This interface is passed in from Application::main() in ui-qt/application/application.cpp:

  int Application::main(int &argc, char **argv) {
    app = new App(argc, argv);
    #if !defined(PLATFORM_WIN)
    app->setWindowIcon(QIcon(":/bsnes.png"));
    #else
    //Windows port uses 256x256 icon from resource file
    CoInitialize(0);
    utf8_args(argc, argv);
    #endif

    initPaths(argv[0]);
    locateFile(configFilename = "bsnes-qt.cfg", true);
    locateFile(styleSheetFilename = "style.qss", false);

    string customStylesheet;
    if(customStylesheet.readfile(styleSheetFilename) == true) {
      app->setStyleSheet((const char*)customStylesheet);
    } else {
      app->setStyleSheet(defaultStylesheet);
    }

    config().load(configFilename);
    mapper().bind();
    init();
    SNES::system.init(&interface);
    mainWindow->system_loadSpecial_superGameBoy->setVisible(SNES::supergameboy.opened());

    if(argc == 2) {
      //if valid file was specified on the command-line, attempt to load it now
      cartridge.loadNormal(argv[1]);
    }

    timer = new QTimer(this);
    connect(timer, SIGNAL(timeout()), this, SLOT(run()));
    timer->start(0);
    app->exec();

    //QbWindow::close() saves window geometry for next run
    for(unsigned i = 0; i < windowList.size(); i++) {
      windowList[i]->close();
    }

    cartridge.unload();
    config().save(configFilename);
    return 0;
  }

The ruby drivers are selected from the config file (or fall back to defaults) in Application::init() in ui-qt/application/init.cpp:

  void Application::init() {
    if(config().system.crashedOnLastRun == true) {
      //emulator crashed on last run, disable all drivers
      QMessageBox::warning(0, "bsnes Crash Notification", string() <<
      "<p><b>Warning:</b><br>bsnes crashed while attempting to initialize device "
      "drivers the last time it was run.</p>"
      "<p>To prevent this from occurring again, all drivers have been disabled. Please "
      "go to Settings->Configuration->Advanced and choose new driver settings, and then "
      "restart the emulator for the changes to take effect. <i>Video, audio and input "
      "will not work until you do this!</i></p>"
      "<p><b>Settings that caused failure on last run:</b><br>"
      << "Video driver: " << config().system.video << "<br>"
      << "Audio driver: " << config().system.audio << "<br>"
      << "Input driver: " << config().system.input << "<br></p>"
      );

      config().system.video = "None";
      config().system.audio = "None";
      config().system.input = "None";
    }

    if(config().system.video == "") config().system.video = video.default_driver();
    if(config().system.audio == "") config().system.audio = audio.default_driver();
    if(config().system.input == "") config().system.input = input.default_driver();

    mainWindow = new MainWindow;
    loaderWindow = new LoaderWindow;
    htmlViewerWindow = new HtmlViewerWindow;
    aboutWindow = new AboutWindow;
    fileBrowser = new FileBrowser;
    stateSelectWindow = new StateSelectWindow;

    //window must be onscreen and visible before initializing video interface
    utility.updateSystemState();
    utility.resizeMainWindow();
    utility.updateFullscreenState();
    QApplication::processEvents();

    #if defined(DEBUGGER)
    debugger = new Debugger;
    #endif
    settingsWindow = new SettingsWindow;
    toolsWindow = new ToolsWindow;

    //if emulator crashes while initializing drivers, next run will disable them all.
    //this will allow user to choose different driver settings.
    config().system.crashedOnLastRun = true;
    config().save(configFilename);

    video.driver(config().system.video);
    video.set(Video::Handle, (uintptr_t)mainWindow->canvas->winId());
    video.set("QWidget", (QWidget*)mainWindow->canvas);
    if(video.init() == false) {
      QMessageBox::warning(0, "bsnes", string() <<
        "<p><b>Warning:</b> " << config().system.video << " video driver failed to initialize. "
        "Video driver has been disabled.</p>"
        "<p>Please go to Settings->Configuration->Advanced and choose a different driver, and "
        "then restart the emulator for the changes to take effect.</p>"
      );
      video.driver("None");
      video.init();
    }

    audio.driver(config().system.audio);
    audio.set(Audio::Handle, (uintptr_t)mainWindow->canvas->winId());
    audio.set(Audio::Frequency, config().audio.outputFrequency);
    audio.set(Audio::Latency, config().audio.latency);
    audio.set(Audio::Volume, config().audio.volume);
    if(audio.init() == false) {
      QMessageBox::warning(0, "bsnes", string() <<
        "<p><b>Warning:</b> " << config().system.audio << " audio driver failed to initialize. "
        "Audio driver has been disabled.</p>"
        "<p>Please go to Settings->Configuration->Advanced and choose a different driver, and "
        "then restart the emulator for the changes to take effect.</p>"
      );
      audio.driver("None");
      audio.init();
    }

    input.driver(config().system.input);
    input.set("Handle", (uintptr_t)mainWindow->canvas->winId());
    if(input.init() == false) {
      QMessageBox::warning(0, "bsnes", string() <<
        "<p><b>Warning:</b> " << config().system.input << " input driver failed to initialize. "
        "Input driver has been disabled.</p>"
        "<p>Please go to Settings->Configuration->Advanced and choose a different driver, and "
        "then restart the emulator for the changes to take effect.</p>"
      );
      input.driver("None");
      input.init();
    }

    //didn't crash, note this in the config file now in case a different kind of crash occurs later
    config().system.crashedOnLastRun = false;
    config().save(configFilename);

    //hide pixel shader settings if driver does not support them
    videoSettingsWindow->synchronizePixelShaderSettings();

    utility.resizeMainWindow();
    utility.updateAvSync();
    utility.updateColorFilter();
    utility.updatePixelShader();
    utility.updateHardwareFilter();
    utility.updateSoftwareFilter();
    utility.updateEmulationSpeed();
    utility.updateControllers();
  }
