I'm new to the group. I've been searching the net for a solution to get Reaper, Soundflower, and MainStage all working together. I do see that some of you have had success, but I'm wondering if it's the same success I'm looking for. What I'm trying to achieve is recording the audio and MIDI output from MainStage into Reaper via Soundflower. All three pieces of software are up to date, and so is my OS.

I have a grasp on how the routing is supposed to go, but I'm obviously missing something. I've seen a couple of videos on YouTube touching on this, but nothing specific to Reaper; the videos I've seen are people doing this with Ableton Live, Soundflower, and MainStage. Yes, I understand I'm not using Live here, but the main purpose of watching those videos was to get a general understanding of the routing scheme. While there have been people having success with Live, has anybody had success with what I'm proposing? I'm especially looking to record the actual MIDI data into Reaper from MainStage.

I've also seen people building aggregate devices within OS X to combine all of the useful elements together, but mainly with a soundcard of some sort. It's hard to believe that I would need an external soundcard to achieve this; in theory, this routing should work with my current setup. It should be noted that when I'm in the Mac's Audio MIDI Setup, I can see the MIDI signal, and I can see it within Reaper as well, but there's no sound whatsoever. I can also see a MIDI response in MainStage when I trigger from the M-Audio. Any help and/or suggestions are very much welcome, and thank you for viewing this thread.

On the subject of keeping Logic and MainStage in sync: the issue is that many of us use MainStage for live work because of channel-strip compatibility and the ease of going from Logic to MainStage for live use. It has been the norm that MainStage releases came out a couple of weeks after Logic updates, but we've had a couple of times where it took months, and I believe once it was almost three quarters of a year before Apple synced these up again. When Apple doesn't keep them in sync, I can't imagine it's anything less than a lack of resources or a lack of priority to keep the two products synced up. I doubt (as a guy who has led software dev projects for more than 30 years) that it's particularly hard to add the new reverb to MS, and they'd need a couple-of-hours design session to figure out how to implement the articulation IDs for the new strings and horn instruments. I'm trying not to be harsh here, just real.

Oh, another thing that's incompatible between versions: exporting stems to load into MS's Playback. If these were imported into Logic and Logic interpreted the tempo as it does in 10.4, MS pops up an incompatibility warning for those files. I'm checking what the further consequences are and what the workaround would need to be, but "it's a thing" we need to look at. All creative solutions/ideas are welcome, but the user-friendly point of view would be that Apple keep Logic and MainStage reasonably in sync in terms of sound-making capabilities.
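Since the goal above is capturing the raw MIDI data in Reaper, it may help to remember what that data actually is: each note arrives as a short sequence of status and data bytes defined by the MIDI 1.0 spec. Here is a minimal, stdlib-only Python sketch (purely illustrative, not tied to MainStage, Reaper, Soundflower, or any particular driver) that decodes a 3-byte Note On/Off channel message:

```python
def decode_midi_message(data: bytes) -> dict:
    """Decode a simple 3-byte channel-voice MIDI message (Note On/Off)."""
    status, note, velocity = data
    msg_type = status & 0xF0        # upper nibble: message type
    channel = (status & 0x0F) + 1   # lower nibble: channel, shown as 1-16
    if msg_type == 0x90 and velocity > 0:
        kind = "note_on"
    elif msg_type == 0x80 or (msg_type == 0x90 and velocity == 0):
        # Note On with velocity 0 is treated as Note Off per the MIDI spec
        kind = "note_off"
    else:
        kind = "other"
    return {"kind": kind, "channel": channel, "note": note, "velocity": velocity}

# Middle C (note 60) played at velocity 100 on channel 1
print(decode_midi_message(bytes([0x90, 60, 100])))
# → {'kind': 'note_on', 'channel': 1, 'note': 60, 'velocity': 100}
```

If MIDI shows up in Audio MIDI Setup and in Reaper but produces no sound, messages like the above are almost certainly flowing; the missing piece is usually on the audio side of the routing rather than the MIDI side.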