Many people think MIDI is the synth-y sound they remember from old video games. This is not correct. MIDI is not *SOUND*. It is musical data.
MIDI stands for Musical Instrument Digital Interface. In the early 1980s, a number of music technology companies collaborated to develop a universal language for communicating musical information between computers, instruments, and other music-related hardware.
This digital music language can detect, record, or transfer many kinds of musical activity, including:
- What note you play
- How hard you hit it (velocity)
- When the note starts (attack), how long you hold it down for (sustain), and when it ends (release)
- Curve-based data like the Mod and Pitch Wheels
- (And many other functions)
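Under the hood, each item in this list travels as a short message of a few bytes. As a minimal sketch (the helper functions below are my own illustration, not part of any particular library), here is how note-on and note-off messages are built:

```python
def note_on(channel, note, velocity):
    # Status byte 0x90 = "note on"; the low nibble carries the channel (0-15).
    # note: 0-127 (middle C = 60); velocity: 0-127 (how hard the key was hit).
    return bytes([0x90 | channel, note, velocity])

def note_off(channel, note, velocity=0):
    # Status byte 0x80 = "note off"; release velocity is often just 0.
    return bytes([0x80 | channel, note, velocity])

# Middle C struck fairly hard on channel 1 (channel index 0):
msg = note_on(0, 60, 100)
```

The gap in time between a note-on and its matching note-off is what gives a note its duration.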
DAW stands for "Digital Audio Workstation". The name is a little confusing because it does not refer to hardware or your physical studio, but to your master software program for recording, editing, and manipulating audio, and it can handle dozens or even hundreds of tracks at a time. Cubase, Logic, Digital Performer, and Pro Tools are popular options.
In addition to handling advanced audio recording and editing features, most DAWs today ALSO have robust tools for receiving, manipulating, and editing MIDI information.
Your DAW is also a host software for PLUGINS--which are other, smaller pieces of software that, well… plug IN to your DAW. Plugin software includes things like virtual instruments and effects like reverb and delay.
Most DAWs today have some musical notation functionality, but notation is typically not their strength, and most users never touch it. I will say more about notation later.
Now here is where things get cool.
You can use a piano keyboard (actually, any kind of MIDI-compatible instrument) to input MIDI information and send it via a MIDI cable or USB cable into your computer, where it is recognized by your DAW. There you can APPLY virtual instrument sounds TO those incoming MIDI notes in REAL TIME (there is actually a slight delay, but it can be minimized until it is unnoticeable). The result: when you play your digital keyboard, you hear back the virtual instrument sound that you have loaded in your DAW.
Like I mentioned, MIDI can transfer not only information about the NOTES, but you can use a wheel or joystick to send CURVES of information to control the pitch, volume, or expression of an instrument WHILE you play the notes.
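To make those curves concrete: the mod wheel sends a stream of Control Change messages, and the pitch wheel sends 14-bit pitch-bend messages. A sketch (the helper names are my own; the status bytes and value ranges come from the MIDI spec):

```python
def mod_wheel(channel, amount):
    # Control Change (status 0xB0); controller number 1 is the modulation wheel.
    # amount: 0-127.
    return bytes([0xB0 | channel, 1, amount])

def pitch_bend(channel, value):
    # Pitch bend (status 0xE0) is a 14-bit value, 0-16383; 8192 = no bend.
    # It is sent as two 7-bit bytes, least-significant first.
    return bytes([0xE0 | channel, value & 0x7F, (value >> 7) & 0x7F])

# A simple rising mod-wheel curve: one message per step while the notes play.
curve = [mod_wheel(0, v) for v in range(0, 128, 16)]
```

A real performance sends dozens of these messages per second, which is what makes the recorded curve feel continuous.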
The result of this is that you can create compelling performances with ANY virtual instruments you have on your computer.
Furthermore, you can record performances into your DAW and then edit every aspect of the recorded MIDI performance using advanced editing tools that are pretty much universal across the main DAWs these days.
If you want, you can copy and paste this performance to other tracks or apply a different instrument to the MIDI you recorded.
The broad term "Virtual Instruments" refers to any instruments that are run or performed virtually… on computers. Sub-sets of Virtual Instruments would include sampled instruments, software synthesizers (as opposed to hardware synths), and modeled instruments.
SAMPLED INSTRUMENTS/SAMPLE LIBRARIES
The sampling approach consists of recording (with microphones) isolated musical moments or elements that can be accessed later as a component of your composition.
Usually what happens is that an accomplished musician (let's say, a violinist) is brought into a room for recording and is mic'd as if this were going to be a regular recorded performance. But instead of recording an entire performance, the individual notes are recorded as fragments. G will be recorded individually. Then G#. Then A, and so on until the entire range of the instrument has been captured on a note-by-note basis. After recording, these notes can be mapped to their respective pitches on the keyboard, so when you play C-D-E on the keyboard, what you hear back in real time is those REAL violin notes recorded in a real room with real microphones.
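That note-by-note mapping amounts to a lookup table from MIDI note numbers to recorded samples. A toy sketch (the sample file names are hypothetical, not from any real library):

```python
NOTE_NAMES = ['C', 'C#', 'D', 'D#', 'E', 'F', 'F#', 'G', 'G#', 'A', 'A#', 'B']

def note_name(midi_note):
    # Convert a MIDI note number to a name, e.g. 60 -> "C4" (middle C).
    return NOTE_NAMES[midi_note % 12] + str(midi_note // 12 - 1)

# Map each note in the violin's range (open G string, G3 = MIDI 55, and up)
# to its hypothetical recorded sample file.
keymap = {n: f"violin_{note_name(n)}.wav" for n in range(55, 104)}

# When note 60 arrives from the keyboard, the sampler plays keymap[60].
```

Real samplers layer many such tables, one per dynamic level and articulation, but the core idea is this lookup.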
The ADVANTAGE to this approach is that you can use the sound of REAL INSTRUMENTS without having to record them live!
The CHALLENGE with this approach is that while the sound is real, the virtual performance can sound artificial and stale, because the way you create the performance does not always mimic how the real instrument behaves.
Thankfully, my explanation of the sampling process above is primitive and a little simplistic. Over the last 20 years, sampling techniques have become very thorough and advanced, capturing many levels of nuance in the articulations, dynamics, and behavior of the instruments, in order to offer the composer the ability to perform these instruments as expressively as possible.
So if you know what you are doing as a virtual instrument performer, you CAN create some very expressive and realistic-sounding music with virtual instruments. In fact, using sample-based instruments within a MIDI-based approach is currently the BEST way to create realistic-sounding orchestral music without recording a live orchestra performance.
Later in the course I will be covering performance techniques for getting the most realism out of your sampled instruments.
MOST ORCHESTRAL virtual instruments are created using this approach, because it captures the most realism in the sound.
SOFTWARE SYNTHESIZERS
Software synthesizers like Omnisphere, Zebra, and Massive generate synthetic, artificial sounds using oscillators and a number of controllable parameters for customizing the sound. While there are synthesizers (like Omnisphere) that can use an audio sample as a sound source, the strength of synths is not in accurately replicating acoustic instruments but in creating NEW sounds, textures, and soundscapes.
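At its simplest, the oscillator at the heart of a synthesizer is just a repeating waveform generated by a formula. A bare-bones sketch (the sample rate and helper name are my own choices, not from any particular synth):

```python
import math

SAMPLE_RATE = 44100  # samples per second (CD quality)

def sine_osc(freq, seconds):
    # Generate a pure sine tone at the given frequency in Hz.
    n = int(SAMPLE_RATE * seconds)
    return [math.sin(2 * math.pi * freq * i / SAMPLE_RATE) for i in range(n)]

# Concert A (440 Hz) for a tenth of a second:
samples = sine_osc(440.0, 0.1)
```

Real synths layer several oscillators (sine, saw, square, noise) and run them through filters and envelopes; those are the "controllable parameters" mentioned above.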
There are SOME orchestral virtual instruments that use a technique called sample modeling. This is really a type of synthesis, because the audio waveforms are not RECORDED but carefully modeled to artificially generate sound the same way the real instrument does.
I do not use any of these instruments, but they can sound very compelling, and some people swear by them for certain things. I think that sample modeling technology has a bright future, but the instruments created with this approach are still used by a minority of orchestral composers who create on the computer.
SUMMARY OF THE MIDI-DAW APPROACH
As you can see, composing on a DAW with MIDI and Virtual Instruments offers a huge amount of potential for creating expressive music. You can control many fine details of the performance with relative ease.
SUMMARY OF THE MUSICAL NOTATION APPROACH
There are pieces of software that feature a notation-based approach, like Finale, Sibelius, and Dorico. These options are better if you are comfortable working with notation and are creating music to print out and give to live musicians. But if you plan to create finished, expressive orchestral music yourself, you will find that a MIDI-based approach in a DAW offers you more controls for shaping the performance, thus allowing for greater realism in the final product. It is also more accessible and intuitive for those who are less comfortable with reading notes.
So I will not be discussing the notation route during this course.
However, if you are working inside a notation-based option, I would recommend that you check out Note Performer, an artificial intelligence-based playback engine for musical notation. It comes with a full orchestral library of sounds, and automatically phrases and performs the music in a remarkably realistic fashion. It currently retails for $129. https://www.noteperformer.com/