Automation refers to the ability to dynamically control mixing parameters such as volume, panning, etc., throughout the course of a song or performance. The word ‘automation’ harks back to the days of analogue mixing desks. Mixing engineers, in a bid to add interest to a track, would manually alter the fader levels of individual tracks. Since this was limited by the fact that a human being has only two hands, mixing desk manufacturers came up with a solution: they incorporated motorized faders into their desks, which allowed the positions of the faders to change ‘automatically’; hence, automation.
Automation in the era of the DAW has taken on a broader meaning: essentially, it involves altering any sound-influencing parameter (including VST/AU effect plugin parameters). The parameter can be altered in two ways: it can either be programmed beforehand using so-called automation curves, or it can be performed using e.g. a MIDI controller (such as a fader, rotary controller, or pedal). zenAud.io ALK has direct support for both types of automation, in the form of Scripted Automation and Captured Automation.
In ALK – unlike other DAWs – all automation, whether scripted or captured, lives in Automation Tracks. Automation tracks look like normal tracks, and just like with normal tracks, you can draw “loops” inside them, but the correct terminology in ALK for this kind of “loop” is an Automation Region, because it doesn’t loop. We’ll discuss automation regions in greater detail below; for now, let’s discuss the “why” of zenAud.io’s automation, rather than the “how”.
The Philosophy Of Automation Tracks¶
The idea that automation should live in its own track type is somewhat different from other DAWs. Typically, captured automation is treated differently from scripted automation: usually, only the latter is visible in the arrangement view, and even then it usually appears as a sub-lane of the track that it affects. Captured automation, on the other hand, is usually accessible separately from some global view.
zenAud.io’s philosophy is that all relevant information about what you intend to do when you perform should be available directly in the arrangement. With the increasing importance of pattern-based music – particularly in the domain of electronic dance music – DJ-style automation of parameters such as filter cutoff now lies at the very heart of many performances, so it makes sense to treat automation as a first-class citizen.
As such, the right way to think of an automation track is simply as another instrument in the performance. You can imagine each fader, or each rotary, as a separate instrument in your rig, in much the same way that you may allocate a track to your guitar, vocals, synths, and drum machines. From that point, it’s up to you whether you want to perform the automation live – i.e. to use captured automation – or to have it play back automatically using scripted automation.
The Fader Sweep Is The Performance¶
As alluded to above, once you take the view that automation is a first-class citizen, and that the fader itself is the instrument, it makes sense that a single fader can control different things at different times. This is accomplished with automation regions, which allow the destination plugin and parameter to be set for that specific region, independently of all others.
This allows a single fader to control different parameters, and even different plugins on different tracks, at different times during the performance. At that point, a fader acquires a higher-level meaning than in other DAWs. For example, rather than buying an expensive controller with a large number of faders or rotaries, you can instead use a controller with, say, four faders. The first fader might be used for, say, glitch-style effects – and in each song (or part of a song) in your set, you can draw a region assigning that fader to a specific parameter on a specific plugin. The second fader could be used as a “breakdown fader”. In the first song it may control a low-pass filter cutoff parameter to give you the classic filter-sweep effect. In the second song, it could control the decay of a reverb, giving an effect with a similar purpose, but not exactly the same.
Automation tracks come in two flavors:
- Command Track: this represents an on/off parameter or a trigger. For example, you would use a command track to automate the power (or mute) track parameter. Command tracks have track type icons that resemble a foot pedal.
- Control Track: this represents a parameter which takes on continuous values. If you wanted to automate, say, the volume of a track, you would use a control track. Control tracks have track type icons that represent a rotary control.
Both automation track types can be duplicated and deleted via the track menu.
Automation Track Panel¶
Just like other track types, automation tracks have track panels that give you access to functionality that impacts the track as a whole. Automation track panels look a little different than normal track panels, and this reflects the fact that, by itself, an automation track does not generate its own audio output; instead, it alters another track.
The following user interface elements are available in an automation track’s panel.
- Learn Button: clicking this button opens the Learn dialog. Learning associates a given input – say, the top-right fader – with the current automation track. Once a fader or rotary has been learned, ALK will forward messages of that type to any destinations for the duration of any captured automation regions.
- Control/Command Sources: this is a list containing any input sources (i.e. faders, rotaries, or keys) that the current automation track listens to. For command tracks, this list is labeled Command Sources; for control tracks, it is labeled Control Sources. Sources can be removed by clicking the delete button on the right-hand side of the source, after which the input in question no longer triggers the destination during a captured automation region. Note that multiple sources can trigger the same automation regions (and thus destinations). This allows, for example, several members of a band to control the same automation destination. A typical use-case for this is to allow several band members to trigger the global panic mode.
To use a MIDI controller’s physical controls to manipulate functions in ALK, an input source must be assigned within the automation track’s track panel. Once a MIDI controller is connected, MIDI Control Change messages (CCs) can be assigned to the current track by clicking the Learn button. When the Learn dialog opens, manipulate a physical MIDI control (fader, rotary, pedal, etc.); its CC number is learned by the automation track, allowing that control to be routed to Captured Automation Regions.
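Conceptually, learning boils down to watching the incoming MIDI stream and remembering the controller number of the first Control Change message seen. The sketch below illustrates this idea only; the message layout is standard MIDI, but the `CCLearner` class and its API are hypothetical and not part of ALK.

```python
# Illustrative sketch of "learning" a MIDI CC. Standard MIDI Control Change
# messages have a status byte of 0xB0-0xBF (0xB0 | channel), followed by the
# controller number and the value. The CCLearner class is hypothetical.

def parse_cc(message: bytes):
    """Return (channel, cc_number, value) if the message is a Control Change,
    else None."""
    status, data1, data2 = message[0], message[1], message[2]
    if status & 0xF0 == 0xB0:
        return (status & 0x0F, data1, data2)
    return None

class CCLearner:
    """Remembers the CC number of the first Control Change it sees."""
    def __init__(self):
        self.learned_cc = None

    def feed(self, message: bytes):
        cc = parse_cc(message)
        if cc is not None and self.learned_cc is None:
            self.learned_cc = cc[1]  # remember the controller number

# Wiggling a fader that transmits CC 25 "learns" that number:
learner = CCLearner()
learner.feed(bytes([0xB0, 25, 64]))  # CC 25, value 64, channel 1
print(learner.learned_cc)  # 25
```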
Similarly to automation track types, automation regions also have two variants.
- Captured Automation Region: this is a region in which you intend to perform some automation. For example, you may use the learn function to associate a particular fader (say, the left-most fader on your MIDI controller, transmitting MIDI CC 25) with a particular track. This is analogous to setting the input to, say, channel 1 (guitar) on your Rhythm Guitar track. Next, you draw a captured automation region, which lets you determine what that fader controls for the duration of the region. Captured automation regions always have a cyan border, while their fill color takes on the hue of the track they reside in.
- Scripted Automation Region: this region represents automation which is pre-planned. For command tracks, scripted automation regions let you type in the value taken on by the destination when the region is entered. For control tracks, scripted automation regions let you draw a curve using control points, which determines the values taken on by the destination over the course of the region. Scripted automation regions always have a yellow border, regardless of the track they reside in.
Since there are two types of automation tracks, and two types of automation regions, there are in effect four variants of automation regions; these are summarized in the table below.
|Track Type|Region Type|Name|Purpose|
|---|---|---|---|
|Control|Scripted|Scripted Control Region|Used to draw automation curves.|
|Control|Captured|Captured Control Region|Control ranges can be set through min/max sliders.|
|Command|Scripted|Scripted Command Region|A typed-in value gives precise control of the destination value.|
|Command|Captured|Captured Command Region|Used to trigger effects. Allows delayed triggering.|
All automation regions have the concept of a destination. This is essentially the output of the automation track. Note the difference with other track types: in, say, an instrument track, the output is controlled by adding cables on the output side (i.e. at the bottom) of that track’s track panel. Assuming you don’t add or remove cables during your performance, the output does not change. On the other hand, each captured automation region can output to a different destination. This destination is always the first text field that appears on the automation region (unless the region is so small that all of its controls are hidden, in which case zooming in will reveal them). Clicking on the destination field reveals a popup-menu which allows you to choose where to send the learned inputs for the duration of the region.
For example, in the second image down there are two captured command regions side by side. During a performance (i.e. when recording) or during playback, when the current position in the song is inside the first region, triggering the learned input sets the Bass track’s ENV1 Trigger to Gate.
Latch to Last Value vs. Revert to Initial Value¶
Upon exiting an automation region, the current value of the region’s target can either latch to the last given value while inside this region, or revert to the initial value prior to entering the region.
Both modes have their uses. For example, temporary control of an effect may be required in the chorus of a song; once the next verse is entered, the effect should jump back to the value it had before being manipulated in the chorus. This is achieved by setting the automation region to Revert to Initial Value.
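The difference between the two modes can be stated in a few lines. The sketch below is purely conceptual (the function name and mode strings are hypothetical, not ALK identifiers):

```python
# Conceptual sketch of what happens to the destination value when the playhead
# exits an automation region. Names and mode strings are hypothetical.

def exit_region(value_before_region, last_value_in_region, mode):
    if mode == "latch":
        return last_value_in_region   # Latch to Last Value
    return value_before_region        # Revert to Initial Value

# Chorus example: a filter cutoff was 0.8 before the chorus region, and the
# performer left it at 0.3 inside the region.
print(exit_region(0.8, 0.3, "latch"))   # 0.3 persists into the verse
print(exit_region(0.8, 0.3, "revert"))  # 0.8 is restored for the verse
```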
Captured Automation Regions¶
Captured Automation Regions are the analogue of Record Loops for automation tracks, in that their contents will be filled out during your performance, rather than when you create your arrangement. Similar to ALK’s handling of record loops during a performance, any input (from control sources that are learned) is routed to the appropriate automation destination during the region in question.
Furthermore, you can create any number of regions in a single automation track, and each can send its output to a different destination – even if the destination is on a different track or plugin. This allows for much more efficient use of a limited number of physical MIDI controls during a song or set.
Maximize your hardware! A controller may have only four rotary knobs but each one can be assigned to a different target at different points in time, allowing complex performance control with even the most limited of hardware controllers.
Captured Control Regions¶
Captured Control Regions are used for live control of continuous targets, such as track volume or plugin parameters. They take input from the learned MIDI inputs of their control tracks (e.g. a MIDI controller fader sending CC 12) and output the data to the target assigned within the region (e.g. volume, synth filter cutoff, etc.). The minimum and maximum values of the control can be set inside the region, which allows a sweet spot of control to be homed in on. These values can also be reversed, which flips the direction of the physical control: for example, pulling a physical fader down could then turn up the volume of a track.
The min/max sliders only become visible when zoomed in to a reasonable degree. If they are not visible, increase the level of zoom.
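The min/max mapping amounts to simple linear scaling of the incoming 7-bit CC value, and reversal just swaps the endpoints. The sketch below illustrates the arithmetic; it is an assumption about the mapping, not ALK’s actual code:

```python
# Hypothetical sketch of how a captured control region might map an incoming
# MIDI CC value (0-127) onto the region's min/max range. Swapping lo and hi
# reverses the physical control's direction.

def map_cc(value: int, lo: float, hi: float) -> float:
    """Scale a MIDI CC value (0-127) linearly into [lo, hi].
    If lo > hi, the mapping is reversed."""
    return lo + (hi - lo) * (value / 127.0)

print(map_cc(127, 0.0, 1.0))  # fader fully up -> 1.0
print(map_cc(64, 0.2, 0.8))   # a "sweet spot" range: roughly 0.5
print(map_cc(127, 1.0, 0.0))  # reversed: fader fully up -> 0.0
```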
Captured Command Regions¶
Captured Command Regions are used for live triggering of fixed values, such as switching something on or off, or delivering set values such as a tempo (BPM) change. They are triggered by the learned MIDI inputs of their command tracks (e.g. a MIDI controller button or pedal sending CC 9) and pass that trigger on to the target assigned within the region (e.g. power, effect on/off, etc.).
- A target destination with a value of on or off can be triggered in the ways seen in the above image.
- Off: Triggering within the region sends an off message to the target.
- On: Triggering within the region sends an on message to the target.
- Toggle Off/On: Triggering within the region sends consecutive on and off messages to the target.
A target destination such as Volume provides a choice of values which can be triggered. They can be accessed by clicking the value box which opens a drop-down menu.
Other continuous values such as plugin controls can be triggered by entering a value between 0 and 1 into the command region. For example, entering “0.5” into a command region targeting the “Feedback” parameter of a delay plugin will set the parameter to 50%.
Command regions can also be used to send Transport Commands, allowing ALK’s core transport functions to be controlled automatically or with physical controls.
- Play: Triggers playback.
- Record: Triggers Recording (Performance).
- Panic: Triggers the Panic Mode.
- Position: Jumps to the bar number set in the region value.
- Tempo: Jumps to the tempo set in the region value.
Every Captured Command Region has its own trigger mode. When the Learned MIDI input for the Command track is triggered, the following happens based on each setting:
- Immediately: Triggering within the region is sent to the target immediately.
- Quantize: Triggering within the region is sent to the target at the next boundary of the chosen measure subdivision (1/1, 1/2, 1/4 or 1/8).
- At End Of Region: Triggering within the region is sent to the target at the end of the Captured Command Region.
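The three trigger modes differ only in when the command fires relative to when the performer presses the control. The following sketch expresses that timing logic in terms of beats; it assumes 4 beats per bar and hypothetical mode names, purely for illustration:

```python
# Hypothetical sketch of the three trigger modes of a captured command region.
# Times are in beats; region_end is the region's end position in beats.

import math

def fire_time(press_beat: float, mode: str, region_end: float,
              quantize_beats: float = 4.0) -> float:
    if mode == "immediately":
        return press_beat                     # fire right away
    if mode == "quantize":
        # defer to the next boundary of the chosen subdivision
        return math.ceil(press_beat / quantize_beats) * quantize_beats
    if mode == "at_end_of_region":
        return region_end                     # fire when the region ends
    raise ValueError(mode)

print(fire_time(5.3, "immediately", 16.0))       # 5.3
print(fire_time(5.3, "quantize", 16.0, 4.0))     # next full bar: 8.0
print(fire_time(5.3, "quantize", 16.0, 2.0))     # next half bar: 6.0
print(fire_time(5.3, "at_end_of_region", 16.0))  # 16.0
```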
Scripted Automation Regions¶
Scripted Automation Regions differ from Captured Control Regions in that they require no MIDI input source. Instead, the input is derived from the region itself. This comes in two forms: automation triggers and automation curves. These pre-programmed regions allow a vast array of functions in ALK to be automated. Let’s take a look:
Scripted Control Regions¶
As with Captured Control Regions, Scripted Control Regions deal with a continuous and changing value. They do this through the use of an automation curve which looks like this:
In the example above, the target is the Volume of our Guitar track. The height of the automation curve controls the value sent to this target across time. Clicking the Edit Curve button on the left-hand side opens up the region for curve editing:
Editing an automation curve can be done in a number of ways. Once the region is open, clicking on the area inside creates automation nodes. Nodes can be moved by clicking and dragging them with the mouse, and deleted by clicking on them. There are also a number of trackpad gestures which can be used to generate more complex curves:
cmd + scroll down: Hovering over the mid point between two nodes, holding the cmd key, and scrolling down with two fingers on the trackpad moves down the whole section between the nodes. Scrolling down nearer to one of the nodes moves only that node, enabling diagonals to be drawn.
cmd + scroll left/right: Hovering over a diagonal, holding the cmd key, and scrolling left or right with two fingers on the trackpad changes the shape of the diagonal to convex or concave.
cmd + rotate: Hovering over a diagonal, holding the cmd key, and making a rotate gesture with two fingers on the trackpad changes the shape of the diagonal between straight and S-shaped.
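Under the hood, a curve of this kind can be thought of as a sorted list of (time, value) nodes, with the value sent to the destination at any moment being an interpolation between the surrounding nodes. The sketch below handles only the straight-line case (ALK’s convex, concave, and S-shaped segments would need a different interpolation function), and the code is an illustration rather than ALK’s implementation:

```python
# Illustrative sketch: evaluating a piecewise-linear automation curve.
# nodes is a sorted list of (time, value) pairs; times are in beats.

def curve_value(nodes, t):
    """Evaluate the curve at time t, clamping outside the node range."""
    if t <= nodes[0][0]:
        return nodes[0][1]
    if t >= nodes[-1][0]:
        return nodes[-1][1]
    for (t0, v0), (t1, v1) in zip(nodes, nodes[1:]):
        if t0 <= t <= t1:
            # linear interpolation between the two surrounding nodes
            return v0 + (v1 - v0) * (t - t0) / (t1 - t0)

# A simple volume fade-out over 4 beats, then silence:
fade = [(0.0, 1.0), (4.0, 0.0), (8.0, 0.0)]
print(curve_value(fade, 2.0))  # halfway through the fade -> 0.5
print(curve_value(fade, 6.0))  # after the fade -> 0.0
```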
Scripted Command Regions¶
Scripted Command Regions deal with on/off messages or fixed trigger values. As an example, they can be useful for triggering the power function (mute) of a Metronome, allowing it to be silenced after a specific point in the performance.
As explained above, scripted automation is used to pre-program changes in your performance that you would rather not control live. In this sense, scripted command regions could be used to automatically turn on and off the pedals of a virtual pedalboard, such as what is found in most guitar effects plugins. This allows the musician to truly focus on their instrumentation.