Dr. M's Developmental Blog of Theories



Dr. M's Developmental Blog of Theories, Structure Tests, & Rants about Optimization through Hybridization of Setup & Mobility, Routing & Control.



2.08.2014

Use iPhone to Control Motu Hardware/CueMix w/ Touch OSC Template from Dr. M

After meticulous reconfiguring & editing, I have finally completed the TouchOSC CueMix template for the iPhone and iPod Touch.
The template contains all controls for inputs, outputs, mixes, channel control, the compressors & limiters, EQ, & reverb. The only buttons I had to remove (because of space) are the talkback and listen buttons. The monitor select buttons are still intact.
I took on this project basically for myself; I don't understand why complete TouchOSC CueMix templates for the iPhone don't exist. I could not find any complete mappings from anyone online, so I undertook the process myself. I have used, tested, altered, edited, & then reconfigured these, so hopefully they are of some use for anyone else. Feedback, ideas, etc. are totally welcome. Comment below if you have any problems w/ the link.

https://drive.google.com/file/d/0B54dMJ4hAIF3eHRaSXVUdlQ4bTg/edit?usp=sharing
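For anyone curious what a TouchOSC control actually sends: each fader or button emits a small binary OSC message over UDP. Here is a minimal sketch of that wire format using only the Python standard library (the address `/1/fader1` is a made-up illustration, not an actual CueMix mapping):

```python
import struct

def osc_message(address: str, value: float) -> bytes:
    """Encode a minimal OSC message carrying one float argument.

    OSC strings are NUL-terminated and padded to a 4-byte boundary;
    float arguments are 32-bit big-endian, tagged ",f".
    """
    def pad(raw: bytes) -> bytes:
        return raw + b"\x00" * (4 - len(raw) % 4)  # always at least one NUL

    return pad(address.encode()) + pad(b",f") + struct.pack(">f", value)

# A fader move like the ones a TouchOSC template produces:
msg = osc_message("/1/fader1", 0.75)
```

In practice TouchOSC handles all of this for you; this is only to demystify what travels between the phone and the host.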

12.08.2013

[HOW TO] Build the Ultimate Snap Drum Rack.

Press a button to select a drum cell, turn a knob to change the sample, press another button to swap 128-sample banks, live!
{BUTTONS TO SELECT INDIVIDUAL DRUM CHAINS FOR CHANGING THE SAMPLE, SOUND, ETC. IN NESTED 128's}


(This is how I Blue-Hand select nested devices in a drum rack for manipulation, achieved through ClyphX & a dedicated User Remote Script's Blue Hand function.)
1. Press a button to bring up view of drum sample and macros on rack for control of sample sound manipulation.
2. Press the other button to change to a different group of samples
3. Turn a knob to change sample.

[HOW TO]
1. First, assign the user controls listed below to buttons on your MIDI controller. 15 buttons total: 14 for selecting specific drum cells, one for selecting the top rack.
--A. Suggested controller: Push, Nocturn, etc. A controller with both knobs and buttons is best.
2. Create a User Remote Script(URS) here->C:\Users\owner\AppData\Roaming\Ableton\Live 9.0.4\Preferences
3. Assign the knobs from your midi controller to the Device Controller Encoders listed in the URS.
--A. Select this midi controller as the input & output for this URS and Clyphx in Ableton's midi preferences.
-----1. NOCTURN : Create a new page via automap for the channel you specify in your XControls. Replicate the control cc's of the encoders on each page. 
--------Ex: 'KICK_A BUTTON' 'KICK_A CHAIN STEP BUTTON(info @3A.)' 'SNAREs BUTTON' 'SNARE CHAIN STEP BUTTON' etc.

RACK MAP = (MAIN DRUM RACK->INSTRUMENT RACK->DRUM RACK->SAMPLER)

Now, in Ableton -
4. Create a Track, name it 'DR.M8TRX TRAK'
5. Create a Drum Rack on this Track. Name it 'DR.M8TRX RAK' (SAY - DRUMATRIX RACK)
--A. Create the 14 chains listed below, in this Drum Rack.
--B. Place Instrument Racks in each chain.
------(NOTE: This Instrument Rack's purpose in the chain is to allow multiple 128's to be selected w/o hot swap. If you only need space for 128 samples, skip this STEP B only, and leave out the Instrument Rack. Start again @ C1. You're not racking multiple instruments, so you don't need it.)
-----1. Except for the Snare, Clap, & Hit chains. On these chains place the Drum Rack 1st, THEN the Instrument Rack.
-------A. This is to gain the ability to route the first Drum Rack in these chains (snares, claps, hits, etc.) to sends, both within the individual rack per chain and without.
---------1. Create Returns in this Drum Rack and place reverb and/or delay, or change the routing output to verb or delay on Return Tracks.
--C. NOW, inside this Instrument Rack, create a Chain. 
-----1. Place a Drum Rack on this Chain, repeat on each of the 13 Chains.
-----2. On the TOMS Chain create 3 Chains in this Rack. -> 1.Hi TOMS, 2. MED TOMS, 3. LOW TOMS
-------A. Place a Sampler in each of these Drum Racks.
---------1. Place 128 samples in this Sampler.
-----------A. Open Zone Tab on Sampler and click 'SEL' chicklet.
-----------B. Select all 128 samples and right click->distribute ranges equally
---------2. Assign Sampler Selector to Macro, as well as parameters such as filter, ADSR, etc.
---------3. Repeat STEP C if you have more than 128 samples, and need another '128'.
-----------A. If you do place multiple 128's in this Instrument Rack, create 14 more buttons (on your controller), 1 per XControl, to step thru these different 128's via the Chain Selector on this Instrument Rack.
--------------(Nocturn has step mode for buttons. set number of steps the button has to = match the number of chains in the rack)
NOTE!!! DO NOT PLACE EFFECTS AFTER THESE SAMPLERS IF YOU WANT TO ClyphX 'SNAP' THIS RACK. YOU CAN, HOWEVER, PLACE MIDI EFFECTS BEFORE THE SAMPLER. ex: arpeggiator
Now pressing the 'KICK_A' XControl that you've assigned to your controller will select the appropriate KICK_A chain, placing the Blue Hand from the URS on the Drum Rack holding the Sampler, for instant control. ex: Press a button & turn a knob to change a drum sample. Press a neighboring button to change groups of samples, per drum type.
Simple Ultimate Drum Rack Control.
This allows you to utilize the SAME 8 KNOBS TO CHANGE THOUSANDS OF SOUNDS, ACCESSED BY PRESSING 14 DIFFERENT DEDICATED BUTTONS. (This is only 4 pages on a Nocturn, leaving 2 buttons free for Hot Swap buttons, delay or reverb rack select buttons, etc.)
This is the most elegant method I've achieved that accesses this large a matrix of sound selection and control, with 8 knobs & 13 or 32 buttons.
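To visualize the nesting, here is the rack map above expressed as a plain Python dict. This is a hypothetical sketch of one chain only; the names mirror the steps, not any actual Ableton API:

```python
# Hypothetical sketch of the nested rack layout built in the steps above,
# shown for one chain (KICK_A) with two 128-sample banks:
rack_map = {
    "DR.M8TRX RAK": {                     # main Drum Rack (step 5)
        "KICK_A": {                       # one of the 14 chains (step 5A)
            "Instrument Rack": {          # lets you hold multiple 128's (step 5B)
                "Chain 1": {"Drum Rack": {"Sampler": "128 samples"}},
                "Chain 2": {"Drum Rack": {"Sampler": "128 more samples"}},
            },
        },
        # ...13 more chains: SNARES, KICK_B, CLAPS, etc.
    },
}

# The XControls below select a chain; the knobs then reach the Sampler macros.
kick_a_bank_1 = rack_map["DR.M8TRX RAK"]["KICK_A"]["Instrument Rack"]["Chain 1"]
```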

# DR.M8TRX Cell Selector XControls

MAIN.DRUM_RACK = cc, 15, 17, "DR.M8TRX TRAK"/DEV1 SEL

KICK_A = cc, 15, 0, "DR.M8TRX TRAK"/DEV1.1.2 SEL ; DEVRIGHT

SNARES = cc, 15, 1, "DR.M8TRX TRAK"/DEV1.2 SEL ; DEVRIGHT ; DEVRIGHT

KICK_B = cc, 15, 2, "DR.M8TRX TRAK"/DEV1.3.2 SEL ; DEVRIGHT

CLAPS = cc, 15, 3, "DR.M8TRX TRAK"/DEV1.4.2 SEL ; DEVRIGHT ; DEVRIGHT

CLOSED_HATS = cc, 15, 4, "DR.M8TRX TRAK"/DEV1.5.2 SEL ; DEVRIGHT

OPEN_HATS = cc, 15, 5, "DR.M8TRX TRAK"/DEV1.6 SEL ; DEVRIGHT

CRASH = cc, 15, 6, "DR.M8TRX TRAK"/DEV1.7 SEL ; DEVRIGHT

RIDE = cc, 15, 7, "DR.M8TRX TRAK"/DEV1.8 SEL ; DEVRIGHT

PERC = cc, 15, 8, "DR.M8TRX TRAK"/DEV1.9.2 SEL ; DEVRIGHT ; DEVRIGHT

RIM_CLAVE = cc, 15, 9, "DR.M8TRX TRAK"/DEV1.10 SEL ; DEVRIGHT

SHAKERS = cc, 15, 10, "DR.M8TRX TRAK"/DEV1.11 SEL ; DEVRIGHT

COWBELL = cc, 15, 11, "DR.M8TRX TRAK"/DEV1.12 SEL ; DEVRIGHT

TOMS = cc, 15, 12, "DR.M8TRX TRAK"/DEV1.13 SEL

HITS = cc, 15, 16, "DR.M8TRX TRAK"/DEV1.14 SEL ; DEVRIGHT
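If you want to sanity-check your own variations on these lines, the format is regular enough to parse. Here is a hypothetical helper (not part of ClyphX itself) that splits a definition into its parts:

```python
def parse_xcontrol(line: str) -> dict:
    """Split 'NAME = type, channel, value, ACTION ; ACTION ...' into parts."""
    name, rest = line.split("=", 1)
    msg_type, channel, value, actions = [p.strip() for p in rest.split(",", 3)]
    return {
        "name": name.strip(),
        "type": msg_type,                    # "cc" or "note"
        "channel": int(channel),
        "value": int(value),                 # CC or note number
        "actions": [a.strip() for a in actions.split(";")],
    }

kick = parse_xcontrol('KICK_A = cc, 15, 0, "DR.M8TRX TRAK"/DEV1.1.2 SEL ; DEVRIGHT')
```

Running every line of the list through this will quickly flag a missing comma or a duplicated CC number.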

_________________
Music https://soundcloud.com/dr-mysterium
Ableton Nerdery http://drmysterium.blogspot.com
Shwag http://drmysterium.bandcamp.com

11.24.2013

DR. M'S MASTER CONTROL CLIP TECHNIQUE


CREATE CLIPS TO SEQUENCE REMOTE CONTROL OF ABLETON, SCRIPTS, AND MASTER TRACK FX(!) 

HOW I RECORD CLYPHX XTRIGGERS LIVE INTO CLIPS IN ARRANGEMENT VIEW


THIS TECHNIQUE WILL ALLOW YOU TO
A. USE MIDI CLIPS IN ARRANGEMENT VIEW THAT WILL TRIGGER GLOBAL CONTROLS, SELECTED TRACK CONTROLS, & XCONTROLS.
B. RECORD/STEP SEQUENCE MIDI CLIPS, LIVE! IN SESSION VIEW (OR ARRANGEMENT!) OF YOUR GLOBAL CONTROLS, SELECTED TRACK CONTROLS, ClyphX GESTURES AND XCONTROLS (MIDI DATA). THESE CLIPS CAN THEN BE RECORDED INTO THE ARRANGEMENT VIEW, DRAGGED & DROPPED, OR COPIED AND PASTED OVER.
C. CREATE CLIPS LIVE WITH MIDI TO SEQUENCE REMOTE CONTROL OF ABLETON, SCRIPTS, AND MASTER TRACK FX.
BETTER THAN DRAWING DUMMY CLIPS, WE'RE AUTOMATING MASTER CONTROLS, LIVE!

1. CREATE A ClyphX XCONTROL TO OPEN & CLOSE THE BROWSER W/ MIDI. EXAMPLE: BROWSER_BUTTON = note, 9, 74, TGLBRWSR. 
A. I SET THE STC SCRIPT ON MIDI CH. 9, AS WELL AS MY ClyphX XCONTROLS (AS SHOWN ABOVE).
NOW LETS OPEN AND CLOSE THE BROWSER BY RECORDING THIS MIDI NOTE INTO A CLIP. 
2. DUPLICATE YOUR CONTROLLERS MIDI OUT SIGNAL INTO TWO STREAMS BEFORE IT ENTERS ABLETON. 
A. I OPEN COPPERLAN & SELECT MY CONTROLLER [NOCTURN CH9] (IT IS LABELED AUTOMAP IN COPPERLAN) AS THE INPUT FOR TWO VIRTUAL PORTS, vMIDI_1 & vMIDI_2.
3. IN ABLETONS MIDI PREF.'S MAKE THESE CHANGES
A. SELECT vMIDI_1 AS THE ClyphX INPUT (or whatever control script you're using).
B. ENABLE vMIDI_1 AS A REMOTE INPUT.
C. ENABLE vMIDI_2 AS A TRACK INPUT.
D. DESELECT YOUR CONTROLLER(S) AS AN INPUT BECAUSE YOU ARE NOW ROUTING YOUR CONTROLLER INTO ABLETON THRU COPPERLAN (vMIDI_1 AND vMIDI_2).
(E. I PERSONALLY ALSO SELECT vMIDI_1 AS INPUT FOR THE SELECTED TRACK CONTROL SCRIPT. 
1. I CONNECT MULTIPLE OTHER CONTROLLERS TO THESE SCRIPTS BY ROUTING THEM ALL INTO vMIDI_1 & vMIDI_2. 
A. MORE INFO ON SECTION E. & E.1 IN MY HYBRID SCRIPTING TUTORIAL.)
4. CREATE A MIDI TRACK TO RECORD MIDI CLIPS INTO (SESSION VIEW) 
5. SELECT vMIDI_2, CH9 (or whatever MIDI channel your control messages are using) AS THE INPUT FOR THIS TRACK. SET MONITOR TO OFF TO PREVENT A MIDI FEEDBACK LOOP.
6. WE WILL NOW ROUTE THE MIDI OUTPUT OF THIS TRACK, BACK TO THE vMIDI_1 REMOTE INPUT. HERE'S HOW.
A. OPEN COPPERLAN, CREATE A MIDI LOOP BACK TO vMIDI_1 
1. SELECT vMIDI_3@CH9, CONNECT IT TO vMIDI_1@CH9. 
B. BACK IN ABLETON, IN MIDI PREF's, ENABLE THE 'Track Out' SWITCH FOR vMIDI_3.
C. NOW CHANGE THE OUTPUT ROUTING OF THIS MIDI TRACK WE CREATED, TO vMIDI_3. 
7. CREATE A SNAPSHOT OF THE ROUTE CONFIGURATION IN COPPERLAN, AND SET IT TO RUN AT BOOT.

BAM. NOW YOU'VE GOT A MIDI CONTROL LOOP WHEREIN YOUR MIDI MESSAGES REACH ABLETON FOR SCRIPT AND REMOTE CONTROL, BUT ALSO ARRIVE AT TRACK INPUTS FOR RECORDING THESE MESSAGES INTO CLIPS FOR SEQUENCING(WITHOUT BEING EATEN BY ABLETON!), WHICH IN TURN IS ROUTED BACK INTO THE SAME REMOTE MIDI INPUT! THIS BASICALLY TRICKS ABLETON INTO PROCESSING THE MIDI DATA FROM THE CLIPS AS IF IT WERE YOUR LIVE MOVEMENTS. NOW ONE CAN SEQUENCE GLOBAL CONTROLS, MASTER TRACK FX, & SCRIPT CONTROLS, ALL INTO CLIPS IN SESSION VIEW! AND THEN, LAUNCH THESE CLIPS WITH A LAUNCHPAD! &/OR USE THEM IN YOUR ARRANGEMENT...
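Conceptually, steps 2 through 6 just fan one MIDI stream out to two destinations. Here is a toy sketch of that duplication, where plain Python lists stand in for CopperLan's virtual ports (a real setup uses CopperLan, not code):

```python
def duplicate_midi(message: bytes, ports: list) -> None:
    """Forward one raw MIDI message to every virtual port (step 2)."""
    for port in ports:
        port.append(message)  # stand-in for an actual port.send()

vmidi_1: list = []  # goes to ClyphX / the remote input (steps 3A-3B)
vmidi_2: list = []  # goes to the recording track's input (step 3C)

# CC 74 at value 127 on MIDI channel 9 (status byte 0xB0 + channel 9 - 1 = 0xB8)
cc_on_ch9 = bytes([0xB8, 74, 127])
duplicate_midi(cc_on_ch9, [vmidi_1, vmidi_2])
```

The key point the sketch makes: both destinations receive identical bytes, so the clip you record on vMIDI_2 is a perfect replayable copy of the gesture that drove the scripts on vMIDI_1.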

NOTES:
YOU WILL ONLY NEED TO CREATE ONE TRACK FOR EVERY MIDI CHANNEL USED TO CONTROL LIVE. RECORD THE MIDI DATA FROM YOUR GESTURES INTO CLIPS ON THE TRACK. IF YOU USE MULTIPLE MIDI CHANNELS, GROUP THE TRACKS AND JUST LAUNCH THE WHOLE GROUP WHEN RECORDING. I USE CHANNEL 16 FOR REMOTE CONTROL OF MASTER FX, GLOBAL CONTROLS, THE SELECTED TRACK SCRIPT, MANIPULATING AUDIO ROUTING TO TRICK BUSSES, & MOST OF MY CLYPHX XCONTROLS. THIS IS HOW I LOOP SEQUENCE MULTIPLE CLYPHX & STC CONTROLS ON THE FLY, LIVE. I SET CLYPHX AND STC TO SHARE THE SAME MIDI CHANNEL. I CREATE ONE TRACK TO RECORD CONTROL CLIPS FOR BOTH, OR JUST ONE OR THE OTHER. I USE THIS TRACK TO RECORD MY GESTURES INTO CLIPS AND TO OVERDUB MULTIPLE PASSES FOR DIFFERENT EFFECTS. I THEN LAUNCH THESE CLIPS WITH A LAUNCHPAD TO GET TEMPO-SYNCED, COMPLEX AUTOMATIONS THAT I RECORDED LIVE WITH MIDI GESTURES.
ONE CAN NOW AUTOMATE VOLUME, PAN, ETC OF A TRACK WITHOUT CREATING AUTOMATIONS IN THE CLIPS ON THAT TRACK. 
AUTOMATE UNUSUAL TEMPO CHANGES, OR OTHER WONKY USES OF GLOBAL CONTROLS W/ CLIPS - CHECK
USE ClyphX'S NEW ENVELOPE DRAW CONTROLS TO CREATE AUTOMATIONS TO CONTROL SCRIPTS SUCH AS STC, OR USER REMOTE SCRIPTS. 
THIS TECHNIQUE ALSO SPLITS THE BURDEN OF PROCESSING THIS MIDI ROUTING BETWEEN THE TWO PROGRAMS, LOWERING THE OVERALL PROCESSING PERCENTAGE OF ABLETON.
COPPERLAN CAN CREATE PRESET SNAPSHOTS, AND YOU CAN SET THESE TO RUN @ BOOT. NO RECONFIGURING, SO JUST SET IT AND FORGET IT. 
MERGING AND DUPLICATING MIDI STREAMS SEEMS TO BE ANOTHER WAY TO GAIN ANOTHER LEVEL OF CONTROL OF ABLETON LIVE.

!WARNING!
BE SURE TO SET MONITOR TO OFF FOR ALL MASTER CONTROL MIDI TRACKS THAT YOU CREATE. THIS PREVENTS THE DANGER OF A MIDI FEEDBACK LOOP.

10.29.2013

Manipulating Audio Routing in Ableton w/ MIDI or Clyphx [How To]

The intention of this technique is to instantly route audio tracks through an effect on a different track momentarily when the effect is needed, without placing the effect on the track or the master track which can cause unwanted processing of your audio and un-needed strain on your processor(s). 

There are many ways to use your Korg Kaoss Pad, Pioneer RMX, or other external FX units. The Kaoss and RMX series differ from a lot of other effect units because their looping, sampling, and glitching abilities literally demand that they be placed directly in the audio signal path.
Example:
Ableton->Audio Interface output->Ext. fx->speakers

This technique can be used to record your Kaoss Pad effects (audio &/or MIDI), while others simply use it as a live effect.

I would like to take a second to explain how I use an external effect with Ableton, specifically the Korg Kaoss Pad, the KP3.

Digital-to-analog conversion of audio happens at the audio outputs of a computer, whether the headphone out or an audio interface is used. The Kaoss Pad also converts incoming audio into digital data at its inputs and from digital back into analog audio at its outputs. Each conversion degrades the audio signal a little; this is a necessary evil, and it basically also happens when audio is routed thru a VST.
But let me ask you this: for your audio to benefit from the Kaoss Pad, does it ALWAYS have to pass thru the Kaoss Pad (i.e., even when you're not touching it)?

Hard wiring, or hooking up an external audio effect unit to mixer outputs, audio interface outputs, or send/returns, is close to permanent, especially on stage. Re-routing your FX audio cables on stage, per song, is a nightmare. I'll leave it at that. Try it sometime.
It's quite convenient to simply place the Kaoss Pad after your audio interface/mixer in your signal path. You can then add effects to everything coming out of your computer/mixer as you DJ, perform a live P.A., or simply watch Adult Swim, adding glitches and delay for fun.
What about when you want to record the Kaoss Pad effects you're adding to a DJ mix? Do you just route the outs back in? How do you monitor the effected signal/recorded signal? What if you want to record the output on stage for looping?
If you add up how many analog to digital and digital to analog conversions are happening in most situations with a Kaoss Pad you might pull your hair out. 
So then, this is where we sit with this problem: degradation of the recorded signal. If you're reading this from a search, you probably know this situation.
Here's the key: the Kaoss Pad sends a configurable MIDI message out its MIDI out port (and/or its USB port) at the moment you touch it, and the moment you let go. This, my friend, is all you need for triumph.

I have cobbled together a method that, when utilized, causes all selected audio to be routed thru the Kaoss Pad - ONLY WHEN TOUCHED! - and the routing returns to normal when you lift your finger!

In order to reach a complete understanding, let's build an example.
1. Let's create two tracks, one MIDI and one audio. In Session View, place a drum loop in scene 1 on the audio track, and on the MIDI track a drum or synth VST playing a pattern, so that you have a drum pattern and a synth melody playing audibly. Name them appropriately: Drum and Melody.
2. Now create a third track named KP3 and place an effect rack there and create two chains inside this rack. Name them the same, Drum and Melody. Turn the volume on this third track all the way down.
3. Here's the next trick: place a Gate plugin on each chain, open the SIDECHAIN tab on the Gate, and select Drum (Pre Mixer) as the input on the first chain. Click the monitor button and shazam! You've routed the audio from the drum track into this third track using the Gate plugin! Repeat this step for the Gate plugin on the second chain to route the audio from the melody track here as well, and bam! You're mixing signals from multiple tracks into a single track!
    A. You're selecting the Pre Mixer option for the drum track routing because you do not want the audio to cut out on the third track when the fader turns down on the first track.
4. So let's place the External Audio Effect plugin on this KP3 track to route this mixed audio to the KP3 and back!

    Hopefully at this point I should not have to explain how to use the KP3 with Ableton's External Audio Effect plugin; it's pretty self-explanatory. Just don't use the main outs on your interface to route the audio thru the KP3; use your auxiliary outs. Leave your main outs free for the Master track out. Also, leave the mix knob at 100% so the signal is actually always being fed to the KP3; when you're touching the KP3 you're simply 'opening the audio valve via MIDI' to monitor the return mix from the FX loop.
Here's a mapping example to see where we stand.
1. Open your MIDI preferences tab in Ableton and activate the Remote input box for the KP3 MIDI input.
2. Now enter Ableton's MIDI mapping mode. Click the volume fader for track one and touch your KP3 pad to map the volume fader of the Drum track to this KP3 touch MIDI message. Set Max to -inf and Min to 0 dB.
    A. Repeat this step for the Melody channel as well. After exiting mapping mode, both these tracks' volume faders should zip to minimum (-inf) and back (0.0 dB) when the KP3 screen is touched and released.
3. Now re-enter MIDI mapping mode and map the volume fader for track 3 as well, except while you are in mapping mode, right click this mapping in the left browser listing and click the 'Invert Range' option. Exit mapping mode.
Now the volume for the first 2 tracks should turn down, and this third track should turn up to full 0 dB, when the KP3 is pressed, allowing the effected signal from the KP3 to be heard (3rd track routed to the Master track). When the KP3 pad is released, the volume on the third track should return to minimum and the first two tracks' volume should return to full, routing the clean, un-effected signal to the Master track.
Voila! Albeit crudely (this mapping only works in this project), your audio is now being routed directly to your outs, and when you touch your Kaoss Pad, your audio is instantly routed thru it!
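The whole touch-to-route trick boils down to one inverted mapping. Here is a sketch of the fader logic with a hypothetical function name (in Ableton this is done by the MIDI mappings above, not by code):

```python
def on_kp3_touch(touched: bool) -> tuple:
    """Return (drum_dB, melody_dB, kp3_return_dB) for a KP3 touch state.

    The source faders drop to -inf on touch; the FX-return fader carries
    the 'Invert Range' mapping, so it does the exact opposite.
    """
    dry = float("-inf") if touched else 0.0
    wet = 0.0 if touched else float("-inf")
    return dry, dry, wet

# Touch: sources mute, effected return opens. Release: back to the clean signal.
touched_state = on_kp3_touch(True)
released_state = on_kp3_touch(False)
```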
This MIDI-mapped audio routing trick is also excellent for dblue Glitch: just map a MIDI button to the volume swap trigger and to the mix knob on Glitch, and assign a MIDI fader or knob to the preset changer, and you're good to go! Press down the button and the audio is routed to the glitch track, turn the knob to change glitch presets, release the button in time for the drop! QuNeo faders are great for this, since the faders are also buttons.

Eventually I'd like to explain how to clean this situation up with ClyphX: preserving the delay effect trails inherent in KP3 usage, syncing these MIDI commands to the master clock (on touch and release!), allowing multiple buttons for multiple audio configurations, and finally, ClyphX would make this mapping permanent across all projects.
So, Check back soon, &
Thanks for reading!

10.25.2013

Hybrid Scripting of ClyphX and STC; an Example

STC has a control for 'detail view toggle': Note 77.
ClyphX has commands for:
FOCDETAIL - moves focus to Detail View
SHOWDETAIL - Detail View toggle (on and off)
TGLDETAIL - changes Detail View from Clip View to Device View
A clever combination of all of these will triumph over any single one.

I set the STC script to ch. 9.
I comment out the line in the STC script for the Detail View toggle, Note 77, by adding # at the beginning of the line.

I create a ClyphX X-Control to replace this Note 77 on ch. 9 (like the example button below):
SHOWDETAIL_TOGGLE = note, 9, 77, FOCDETAIL ; TGLDETAIL , SHOWDETAIL

This is done using the user control section of the User Settings text file inside the ClyphX script folder.

This button will shift focus to Detail View and bring CLIP VIEW into view; when pressed again it will close Detail View.
When pressed again it will alternately show DEVICE VIEW, and another press closes Device View.
This is far superior to simply toggling Detail View with one button and switching Device to Clip View with another. As a plus, when the focus is shifted to Detail View you can use the arrow keys to navigate there.

I then create a page for MIDI channel 9 on the Nocturn: open the Automap server; under the mixer tab there is an advanced menu; check ch. 9. I name it STC/CLYPHX under the crossfader.
I create a button for Note 77 named 'Detail' on this ch. 9 page on the Nocturn, and set it to toggle.

Set Automap as the input for both ClyphX and STC. STC will respond when appropriate;
ClyphX will respond to these new buttons you create to replace simpler functions in STC.
This way you get the best of both, accessed from one channel on one controller.
You have now connected MIDI channel 9 from your Nocturn controller to both scripts (which are set to the same MIDI channel), accessing a particular function you created within the ClyphX script to replace a particular function in the STC script with a more configurable, intuitive control, while still maintaining access to all the best of both scripts, with a single page of your Nocturn or any MIDI controller.

Side note - using ClyphX, one can create controls that operate on several different MIDI channels at once; it's only the STC script that operates on a single MIDI channel, as it is a script for Selected Track Control in Ableton Live.

10.08.2013

Rave Reminiscent

It was loosely based on PLUR ethics; then the law came round and disassembled the whole operation, charging venue owners with murder if there was a death by ecstasy. We had whole wall collages of awesome flyers that showed 'we were there'.
Now, it's been ten years, or more, and the new youth culture attempts to rebuild the 'dance scene' under the presumption they started it.
'Do you see what god did for us, man?'
'Na bro, check what we did for ourselves.'
We RAVED about it. They RAGE at it.
I remember people hangin from the rafters, as far as the eye can see, in the biggest warehouse in the dark, dancing till they drug the weak out to the ambulances. After the squash down the 'parties' moved into homes, into dorm rooms, into suburbia to hide. We downloaded free music from Napster to dance to, got even freakier in close quarters: Vicks, whippets, LEDs, however you could get back on the rollercoaster, and forget you were at home.
How do you afford your rock n roll lifestyle? How bout affording the festival circuit? Let's go VIP so we can laugh and point...
So Whats happened to the professionalism? To the press packet?
Rock exploded in the 90's and every youngster got a $100 guitar. We all sucked. We begged for gigs. All of a sudden there were a million bands and we all loved Metallica riffs. We accepted low-to-no-paying gigs to gain the experience. Venue owners were down to let us play for free. We were crappy but we brought in the youth. We had mom's money. The parties had begun to invade real venues, years ago, but by now it was almost forgotten.
"What am I gonna pay u kids for? There's a million guys in a billion bands chompin at the bit to get up here on this stage...."
Let's not even talk about the crappy job the sound guy gives everybody that's not the main act. 
"wait, this main act sounds clean and loud, but I handed him my cables?"
So unprofessionalism entered rock. Before, we would jam closed up behind doors, gettin it right, workin on that sound, developing the complete package, gettin the press kit just right, everything in the right font. 
Now you can set up the myspace and let those counters roll. It doesn't matter that mom clicks listen all day everyday.
"I'm so close y'all, let's work them counters! I'm One click away..." 
The counter is accurate and it matters.
Let's not play around here. Pretty soon everything in Wal-Mart will say dubstep. After all, it's just a word. Oh, and that's after it's on everything at the mall, btw. Then we'll probably call it something else; it already seems to be something else.
Rock used to be the umbrella everyone hid under to get paying gigs. 
'What's rock? I like electronica. You hear that badass guitar riff?"
"Electronica/trance/house is old and wore out, I like anything downtempo..."
Dubstep umbrellas for sale.
Seems like it used to be called the Funk.
"2 dollar drink specials, dubstep all nite! Come right in! Pay over here..." 

Understanding Balanced Audio Cables & Connections

Balanced audio cables/connections use a number of techniques to reduce noise.
A typical balanced cable contains two identical wires, which are twisted together and then wrapped with a third conductor (foil or braid) that acts as a shield. The term "balanced" comes from the method of connecting each wire to identical impedances at source and load. This means that much of the electromagnetic interference will induce an equal noise voltage in each wire. Since the amplifier at the far end measures the difference in voltage between the two signal lines, noise that is identical on both wires is rejected. The noise received in the second, inverted line is applied against the first, upright signal, and cancels it out when the two signals are subtracted.
The separate shield of a balanced audio connection also yields a noise rejection advantage over an unbalanced two-conductor arrangement (such as used in typical home stereos) where the shield must also act as the signal return wire. Any noise currents induced into a balanced audio shield will not therefore be directly modulated onto the signal, whereas in a two-conductor system they will be. This also prevents ground loop problems, by separating the shield/chassis from signal ground.
Signals are often transmitted over balanced connections using the differential mode, meaning the wires carry signals of opposite polarity to each other (for instance, in an XLR connector, pin 2 carries the signal with normal polarity, and pin 3 carries an inverted version of the same signal).
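The cancellation is easy to demonstrate numerically. In this toy Python sketch (with made-up sample values), the same noise lands on both wires, the signal rides them in opposite polarity, and the subtraction at the receiver recovers the signal while the noise vanishes:

```python
signal = [0.0, 0.5, 1.0, 0.5, 0.0, -0.5, -1.0, -0.5]  # the wanted audio
noise  = [0.3, -0.2, 0.1, 0.4, -0.3, 0.2, -0.1, 0.0]  # interference hitting the cable

hot  = [s + n for s, n in zip(signal, noise)]    # pin 2: signal + noise
cold = [-s + n for s, n in zip(signal, noise)]   # pin 3: inverted signal + same noise

# Differential receiver: subtract the wires; the common-mode noise cancels.
# (hot - cold = 2 * signal, which is where the 6 dB headroom figure comes from.)
received = [(h - c) / 2 for h, c in zip(hot, cold)]
```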
Despite popular belief, this is not necessary for noise rejection. As long as the impedances are balanced, noise will couple equally into the two wires (and be rejected by a differential amplifier), regardless of the signal that is present on them.[1]
A simple method of driving a balanced line is to inject the signal into the "hot" wire through a known source impedance, and connect the "cold" wire to ground through an identical impedance. Due to common misconceptions about differential signalling, this is often referred to as a quasi-balanced or impedance-balanced output, though it is, in fact, fully balanced and will reject common-mode interference.
However, there are some benefits to driving the line with a fully differential output:
The electromagnetic field around a differential line is ideally zero, which reduces crosstalk into adjacent cables.
Though the signal level would not be changed due to nominal level standardization, the maximum output from the differential drivers is twice as much, giving 6 dB extra headroom[2] (if the amplifiers are identical, though, their output noise sums to 3 dB more than a single amplifier, decreasing dynamic range).
Noise that is correlated between the two amps (from imperfect power supply rejection, for instance), would be cancelled out.
At higher frequencies, the output impedance of the output amplifier can change, resulting in a small imbalance. When driven in differential mode by two identical amplifiers, this impedance change will be the same for both lines, and thus cancelled out.[3]
Differential drivers are also more forgiving of incorrectly wired adapters or equipment that unbalances the signal by shorting pin 2.[4]
Professional audio products (recording, public address, etc.) provide differential balanced inputs and outputs, typically via XLR or TRS connectors. However, in most cases, a differential balanced input signal is internally converted to a single-ended signal via transformer or electronic amplifier. After internal processing, the single-ended signal is converted back to a differential balanced signal and fed to an output. A small number of professional audio products have been designed with an entirely differential balanced signal path from input to output; the audio signal never unbalances. This design is achieved by providing identical (mirrored) internal signal paths for both pin 2 and pin 3 signals (AKA "hot" and "cold" audio signals). In critical applications, a 100% differential balanced circuit design can offer better signal integrity by avoiding the extra amplifier stages or transformers required for front-end unbalancing and back-end rebalancing. Fully balanced internal circuitry has been promoted as yielding 3 dB better dynamic range.[5]
On TRS plugs, the tip is "hot" (positive), the ring is "cold" (negative), and the sleeve is ground (earthed or chassis). If a stereophonic or other binaural signal is plugged into such a jack, one channel (usually the right) will be subtracted from the other (usually the left), leaving an unlistenable L − R (left minus right) signal instead of normal monophonic L + R. Reversing the polarity at any other point in a balanced audio system will also result in this effect at some point when it is later mixed down with its other channel.
Unbalanced signals can be converted to balanced signals by the use of a balun, often through a DI unit.
If balanced audio must be fed into an unbalanced connection the electronic design used for the balanced output stage must be known. In most cases the negative output can be tied to ground but in certain cases the negative output should be left disconnected.
In telecommunications and professional audio, a balanced line or balanced signal pair is a transmission line consisting of two conductors of the same type, each of which has equal impedance along its length and equal impedance to ground and to other circuits.[6]
The chief advantage of the balanced line format is good rejection of external noise. Common forms of balanced line are twin-lead, used for radio frequency signals, and twisted pair, used for lower frequencies. They are to be contrasted with unbalanced lines, such as coaxial cable, which is designed to have its return conductor connected to ground, or circuits whose return conductor actually is ground. Balanced and unbalanced circuits can be interconnected using a transformer called a balun.
Circuits driving balanced lines must themselves be balanced to maintain the benefits of balance. This may be achieved by differential signaling, transformer coupling or by merely balancing the impedance in each conductor.
An example of balanced lines is the connection of microphones to a mixer in professional systems. Classically, both dynamic and condenser microphones used transformers to provide a differential-mode signal. While transformers are still used in the large majority of modern dynamic microphones, more recent condenser microphones are more likely to use electronic drive circuitry. Each leg, irrespective of any signal, should have an identical impedance to ground. Pair cable (or a pair-derivative such as star quad) is used to maintain the balanced impedances and close twisting of the cores ensures that any interference is common to both conductors. Providing that the receiving end (usually a mixing console) does not disturb the line balance, and is able to ignore common-mode (noise) signals, and can extract differential ones, then the system will have excellent immunity to induced interference.
Typical professional audio sources, such as microphones, have three-pin XLR connectors. One is the shield or chassis ground, while the other two are signal connections. These signal wires carry two copies of the same signal, but with opposite polarity. (They are often termed "hot" and "cold," and the AES14-1992(r2004) Standard [and EIA Standard RS-297-A] suggest that the pin that carries the positive signal that results from a positive air pressure on a transducer will be deemed 'hot'. Pin 2 has been designated as the 'hot' pin, and that designation serves useful for keeping a consistent polarity in the rest of the system.) Since these conductors travel the same path from source to destination, the assumption is that any interference is induced upon both conductors equally. The appliance receiving the signals compares the difference between the two signals (often with disregard to electrical ground) allowing the appliance to ignore any induced electrical noise. Any induced noise would be present in equal amounts and in identical polarity on each of the balanced signal conductors, so the two signals’ difference from each other would be unchanged. The successful rejection of induced noise from the desired signal depends in part on the balanced signal conductors receiving the same amount and type of interference. This typically leads to twisted, braided, or co-jacketed cables for use in balanced signal transmission.
Converting a signal between balanced and unbalanced forms requires a balun. For example, baluns can be used to send line-level audio or E-carrier level 1 signals (normally carried over unbalanced coaxial cable) through 300 feet (91 m) of Category 5 cable by using baluns at each end of the CAT5 run. The balun takes the unbalanced signal and creates an inverted copy of that signal. It then sends these two signals across the CAT5 cable as a balanced pair. Upon reception at the other end, the balun takes the difference of the two signals, thus removing any noise picked up along the way and recreating the unbalanced signal.
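The difference-taking that both the mixer input and the receiving balun perform can be sketched numerically. This is a plain-Python illustration with made-up integer sample values, not audio code:

```python
def balanced_receive(hot, cold):
    """Differential receiver: output the difference of the two legs,
    cancelling any noise that was induced equally on both."""
    return [h - c for h, c in zip(hot, cold)]

signal = [1, -2, 3, -4]                           # arbitrary sample values
noise  = [5,  5, 5,  5]                           # interference hits both legs equally
hot  = [ s + n for s, n in zip(signal, noise)]    # +signal plus noise
cold = [-s + n for s, n in zip(signal, noise)]    # -signal plus the same noise

recovered = balanced_receive(hot, cold)           # twice the signal, noise gone
```

Note that the recovered output is twice the original signal amplitude while the common-mode noise cancels exactly, which is why balance (equal induced noise on both legs) matters more than shielding alone.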

[Dr. M's Hybrid Script Technique] or {How I Learned to Love and Combine ClyphX and the Selected_Track_Control Script}

(Tools needed = more than one MIDI controller, CopperLan MIDI routing software, the Selected Track Control script, and the ClyphX control script.)

1. Select the same MIDI channel for the Selected Track Control script AND your ClyphX X-Controls. (Example = Ch. 9)
   A. Avoid conflicting CCs & notes between the two scripts.
        (Tip) - 'Comment out' any unneeded STC functions to free up control CCs and MIDI notes (as needed).
2. Set up CopperLan MIDI routing software (similar to MIDI Yoke).
    A. Download and install CopperLan.
    B. Merge the selected MIDI channel from EACH MIDI CONTROLLER (MIDI Ch. 9) into ONE virtual MIDI cable. (Example = VMIDI 1)
3. Open the Preferences menu in Ableton and select the MIDI tab.
     A. Click to enable the REMOTE switch for the VMIDI 1 INPUT.
4. -The Magic Moment-
     A. Select VMIDI 1 as the INPUT for Selected Track Control Script.
     B. Select VMIDI 1 as the INPUT for Clyphx Control Script. DONE.

Now one can use all their controllers (AS ONE CONTROLLER!) for STC functions, STC blue-hand control, global controls, ClyphX X-Controls and X-Clips, etc.

This routing scheme merges the MIDI output from all of your controllers into one MIDI data stream, or 'virtual MIDI cable,' and directs it to your control scripts. The user can then think of all their controllers as 'one controller' and program them as such, and think of all their control scripts as 'one script' and program those as such. It's a stream-of-consciousness trick that makes for a quicker, more intuitive programming workflow.
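Conceptually, the channel-merge step can be sketched in plain Python. CopperLan itself is configured graphically, so this is only an illustration using raw MIDI status-byte tuples, not its actual interface:

```python
def merge_channel(controller_streams, channel=9):
    """Merge the messages on one MIDI channel from several controllers into
    a single stream, as the virtual cable 'VMIDI 1' would carry them.
    Messages are (status, data1, data2) tuples; channel is 1-16."""
    merged = []
    for stream in controller_streams:
        for status, data1, data2 in stream:
            # The low nibble of a channel-voice status byte is the
            # zero-based MIDI channel (0x_8 == channel 9).
            if (status & 0x0F) == channel - 1:
                merged.append((status, data1, data2))
    return merged

# Two controllers, both assigned to channel 9 (low nibble 0x8):
quneo   = [(0x98, 36, 127)]                   # note-on on ch. 9
nocturn = [(0xB8, 20, 64), (0xB0, 21, 64)]    # CC on ch. 9, CC on ch. 1 (filtered out)
stream  = merge_channel([quneo, nocturn])
```

A real merger would also interleave messages in arrival order rather than per-controller, but the filtering-by-channel idea is the same.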

Continuation of theory (homework)
1. Combine ClyphX controls and STC controls on one pad/slider on the QuNeo.
2. Use the same MIDI channel page in Automap to control ClyphX and STC (Novation controllers).
3. Add more controllers.
4. Invite your favorite User Remote Script to the party.

Drastically Cut DAW CPU Meter Usage by 50% - or Ableton mixbus who?

What if I told you I could drastically reduce your CPU usage, realistically?

1. It takes processing power to sum digitally. Zero, if summed externally.
2. Some audio interfaces have mix/summing buses and/or onboard DSP (Focusrite).
  A. Some of these interfaces have a near-zero-latency digital summing/return bus (MOTU).
3. Ableton has a native External Audio Effect plug-in that can be placed on the Master track to route audio from an external MIXER directly into the Master track (for processing, automation, & recording).
4. Every track in Ableton can be routed externally instead of to Ableton's 64-bit summing bus, which then sits idle and does no processing.
5. Some people will not need this explained in any further detail.

By default Ableton Live sets the output from every track to send to the master track.  Here is a simple diagram...

TRACK 1 ->\
            -> Ableton's 64-bit Summing Bus -> Master
TRACK 2 ->/

1. Set the output of each track to send to the mixbus in your interface.
2. Drop Ableton's External Audio Effect plug-in on the Master track.
3. For the input (on this ext. FX plug-in), select the output from your interface's mixbus. Voilà, external summing via audio interface.
    Generally your CPU usage meter should drop by about half after this re-routing, because you have successfully disabled Ableton's CPU-hungry 64-bit summing bus.

Track 1 ->\
            -> audio interface summing bus -> ext. FX plug-in @ Master
Track 2 ->/
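For a sense of what the internal bus actually does, here is digital summing in miniature. This is a plain-Python sketch with integer samples for clarity; real audio engines work on floating-point frames:

```python
def sum_bus(tracks):
    """Digital summing: one add per track, per sample, every sample,
    all the time -- the work you offload when summing in the interface."""
    return [sum(frame) for frame in zip(*tracks)]

# Two tracks, three samples each, mixed sample-by-sample:
mix = sum_bus([[1, 2, -3],
               [4, -2, 3]])
```

Each extra track adds another addition per sample at the session's sample rate, which is why moving the sum out of the DAW shows up directly on the CPU meter.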

9.12.2012

Hanz's Modified Ableton Remote Script for Arturia Spark Midi Controller

I will get a link posted ASAP; comment and let me know if you need it. I've basically modded the script for all my MIDI controllers.
As the name 'Mobile Hybrid Rig' implies, everything involved connects two systems or has two uses. My Korg KP3 FX becomes a MIDI controller when you press Shift+8. When you shift it into this mode, it interacts with an Ableton MIDI remote script: in this case, a modified version of Hanz's script, which is itself a hacked/modified version of the APC40 script.
The Arturia Spark drum machine controller does the same thing when you press all 3 FX buttons at the same time. It becomes a MIDI controller, and I use it to control one track in Live, 16 scenes long (down). This way I can use the Spark to create patterns in production mode, record or drag them into clip slots in Ableton, then switch to MIDI mode and launch these clips (Launchpad style) live while I play my synth and glitch everything with the KP3, launching trippy samples to thicken the mix.

My process was this:
I loaded Hanz's script into the Ableton MIDI Remote Scripts folder, started Ableton, and picked the new script from the MIDI control surface list. (I believe you can load five different scripts.) Ableton translated the script and created a new Python file. After installing Python 3.something, I loaded the script's Python file into a text editor and modified it to specify the size of the red box, then modified the Python file called 'Midi' to specify which MIDI control and note values controlled which parameters. It's much like creating a MIDI control chart for Ableton (similar in concept to the MIDI chart that comes with every synth or drum machine). Then I edited the CC numbers and notes on the Spark MIDI controller to match the values I entered into the Python script and BAM, you've got instant 'red box' control of Ableton with any controller. Find Hanz's tutorial on blogspot.com for more.

This is how I gain extra control of Ableton without Ableton controllers. I use the new Novation Nocturn as an Automap mixer/VSTi/Ableton controller, and my other controllers for launching clips and creating clips on the fly. The KP3 screen also becomes 8 faders in MIDI mode, which I use to control hidden parameters on my synth. So it's hooked to my computer via USB at all times (through an installed hub), with MIDI out into the interface, which is hooked to the synth and everything else (at all times).

The script I use for my Nocturn is a modified one as well; you can find it on the Novation forum and on the Ableton forum. I can't seem to remember who created it, but the Nocturn is useless without it. It takes some time to master, but it is a Swiss Army knife for Ableton and an excellently made script, especially alongside some other controllers for clip launching and additional control.
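The kind of mapping file described above looks roughly like this sketch. The constant names are hypothetical (chosen to match the 1-track-by-16-scene Spark layout described earlier), not taken from Hanz's actual 'Midi' file:

```python
# Hypothetical MIDI-map constants in the style of an old Ableton remote
# script's config file -- names are illustrative, not Hanz's script.
SESSION_WIDTH  = 1     # red box: one track wide...
SESSION_HEIGHT = 16    # ...and sixteen scenes tall
CLIP_LAUNCH_NOTES = list(range(36, 36 + 16))  # one MIDI note per clip slot
TRACK_VOLUME_CC   = 7                         # CC mapped to track volume
```

The matching step is then done on the hardware side: program the controller's pads and knobs to emit exactly these note and CC numbers, and the script's red box follows along.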

[10.08.13 edit] I no longer have the Spark, but if I did, I would now program its MIDI output to utilize the Selected Track Control script.

9.08.2012

Top Deck - HUD & Stanton Scs.3d Dj Controller

Laptop and Dj Controller

The top shelf on this rack case slides back to expose the work surface underneath, and was designed as a laptop shelf. It's carpeted for Velcro use.
I have chosen a 15" HP with a quad-core AMD Vision processor, 8 GB of hi-speed RAM, Windows 7, a 500 GB HD, and an LED screen. I actually chose 15" to maximize space for something else to ride on the top shelf. This is where the Tweakalizer used to reside, to the right of the HP, where the Scs.3d now sits.
I foresee (not that it takes a lot of foresight for this one) iPads taking this top position, usurping the laptop as DAW/sequencer, in the near future (when they and their apps can actually replace multi-core processing at higher than 16-bit, integrate more ports, and gain a more stable clock).
As of now I use an iPod Touch in the rig, but in an 'instrument/sampler' approach by way of the iConnectMIDI interface. I trigger the iPod with MIDI or touch and route the audio into my audio interface. The iConnectMIDI integrates my laptop, iPhone/iPod, synth, KP3, MIDI and USB controllers(!) all into one MIDI stream, every device accessible to all other devices via MIDI of any kind. This MIDI interface is integral to this setup and I'll go more in-depth on it in a later section.