I’ve got to say, the new features in Reason 7 are killer!  I’m having a whole lot of fun working with everything from the new Spectrum EQ Window, to the buses and parallel channels on the Main Mixer.  And of course all the rocking Rack Extension devices, especially the Korg Polysix and Propellerhead PX7 instruments, the Peff Buffre and FXpansion Etch Red filter effects, and iZotope Ozone for mastering.  Way too much fun and not enough time in a day to play with all these new shiny toys!

And did I mention that you can turn audio loops into REX loops directly in Reason now?  No more need for the ReCycle program.  This is so cool!

About the new bus paths in Reason’s Main Mixer.

And it’s so easy to create parallel channels now.


I’ve worked in a lot of DAW programs: Digital Performer, Cubase, Pro Tools, Logic, Live, and Reason.  I keep coming back to Reason for its amazing sonic palette (which has grown immensely with the release of Rack Extensions) and its inspiring interface for developing custom sounds.  I’m a big believer in taking the time to design sounds that fit each production, and then recycling these sounds for future productions.  Over time, I believe this helps you to develop a voice as a producer.  Your productions will be recognizable not only from your arrangements and writing, but also from your personal bank of synth and sampler patches.

Reason makes developing such sounds as easy as plug and play, like simple object-oriented programming.  You don’t need to be a technically minded sound-design whiz to cook up great-sounding patches using just a Combinator and some simple layering techniques.  This is how I make a lot of the sounds you can hear in my productions.  In this video, Easy Sound Design with Reason (or, Building a Cool Electro Bass), I demonstrate how you can quickly and easily build your own custom sounds without needing to understand anything too technical.

AES Report 2012

Nov 07 2012

The Audio Engineering Society (AES) held its annual convention in the beautiful city of San Francisco a couple of weekends ago.  It was the 133rd convention!  That’s a lot of shows.  I had the opportunity to visit the showroom floor, where manufacturers were hawking their wares.  It’s always a fun atmosphere in which to see, touch, and learn about the newest and coolest music production gear.

As a rule, when I hit the showroom floor I keep my eyes open for specific products, as well as anything groundbreaking that might make my job easier and inspire my music production work.  This year I was looking out for studio monitor control devices, mid-sized studio monitors with great bass response, MIDI controllers with finger pads, and innovative work surfaces.

The monitor controller that caught my eye was the Oculus by Shadow Hills.  Its fat, ergonomic level knob felt great, and its toggle switches for selecting input and monitor sources were a pleasant change from the usual push buttons.  Most impressive of all, it was wireless!  The company’s demo guy handed me the Oculus controller, sat me in front of an array of Barefoot Sound monitors, and asked me what I wanted to hear.  Of course I asked, “Got some dance music with good bass?”  He happily obliged my musical preference and the next thing you know I’m banging an EDM track while fluidly switching between three sets of speakers.  Selecting speakers with the Oculus was a real pleasure, and the Barefoot Monitors sounded amazing.  I was especially impressed by their smooth, consistent mid and high frequency response across three different sized speakers: MicroMain35, MicroMain27, and MiniMain12.  The large MiniMain12 and mid-sized MicroMain27 speakers both had excellent, tight bass response.  I was impressed.

Oculus by Shadow Hills

Barefoot Monitors

While fantasizing that I could somehow fit the MiniMain12 speakers into my home studio, never mind afford them ($19,950 a pair, ouch!), I heard two people behind me say, “We’ve got these speakers in a room at Berklee.”  I turned around to see Mark Wessel and Leanne Ungar, both Associate Professors in Berklee College of Music’s Music Production and Engineering department.  Pretty cool!  It’s always a lot of fun to meet people at these shows, especially fellow Berklee folk.

Akai also had a booth, at which the new Akai MPC Renaissance and its little brother, the Akai MPC Studio, were on display.  Of course I had to try out some finger drumming on the pads to see if they felt at all similar to the classic MPC pads.  I was not disappointed.  The MPC Renaissance felt especially good, with a solid build, responsive pads that are velocity- and aftertouch-sensitive, and a bank of sixteen very grab-and-turn-friendly rotary knobs.  The Renaissance is a surefire hit for folks wanting that classic MPC feel in a fully integrated MIDI controller and beat-making platform.  The MPC Studio’s pads felt identical to the Renaissance’s, but its dials felt decidedly inferior to the Renaissance’s rotary knobs.  I found myself wondering why you would design a controller with dials that feel like mini plastic plates rather than knobs you can grab between your thumb and forefinger.  Maybe they’re meant for spinning rather than turning and I’m missing the point?  In any case, some knobs on the Studio would be nice.

Akai MPC Renaissance

 

Raven Multitouch Audio Production Console by Slate Pro Audio

One booth that always had a crowd was Slate Pro Audio’s, where they were demonstrating the Raven Multitouch Audio Production Console.  It’s a truly innovative work surface: a giant touch screen from which to control your DAW.  I’m always wanting a bigger screen, and this definitely fits the bill!  But wait, didn’t I just say I like knobs to turn?  This is more like a really giant iPad.  In any case, it’s an amazing concept and it will be interesting to see how the platform develops.

It doesn’t matter if you’ve got the best gear money can buy if your studio isn’t properly set up.  I can’t tell you how many home studios I’ve seen with improperly positioned monitors, uncomfortable workstations, and poorly tuned rooms.  What you end up with is a sound that might be fine in your home studio but doesn’t translate at all to the outside world.  And you’re left scratching your head, wondering why the best gear you could afford isn’t sounding right.  Well, you’ve got to set it up correctly in order to truly hear what you’re doing.  This doesn’t mean you have to build your own room from scratch or spend a ton on acoustic material; you just need to understand basic acoustic principles and apply some common sense.

When I did consulting, I used to go into home studios and help clients set up their gear for the best results.  When I saw Grammy Award-winning audio engineer Francis Buckley’s Studio Rescue series (sponsored by Rode Microphones) on YouTube, I said to myself, “Wow, that’s exactly what I would have recommended. I’ve got to tell my students about these videos.”  They’re really excellent.  Buckley knows what he’s talking about and offers practical advice on working with the space you have and on tuning it with furniture placement and a few strategically placed Vicoustic foam panels.  Watch this series if you’re not sure how to position all the gear in your home studio.  I guarantee you’ll learn a ton.

There are twelve episodes posted so far.  Here are a few direct links:

Studio Rescue – Episode 1

http://youtu.be/02qpJt0hsL0

Studio Rescue – Episode 9

http://youtu.be/pf_7sC9wV8Q

Studio Rescue – Episode 12

http://youtu.be/K0iuj56c_eg

It’s super easy to sidechain compress in Reason.  And this is the key to producing that classic, pulsing synth pad sound you hear in dance music.  You know, the synth pad that throbs in time with the kick drum.  Here’s a video on how to set this type of sound up in Reason.  Plus, I show you how to keep it going even when your song’s main kick drum drops out, so you can produce inspirational breaks in your arrangement without ever losing the pulse of the kick.

Here’s the completed combinator patch that I demonstrate in the video so you can explore how it’s put together right in your own Reason Rack.

Combinator Patch [COMING SOON]
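If you’re curious about the mechanics behind the pumping effect, here’s a rough conceptual sketch of sidechain ducking written in Python, completely outside of Reason: the kick’s level drives an envelope that yanks the pad’s gain down on every hit and lets it swell back up in between.  All of the names and numbers below are my own illustrative choices, not anything taken from the video or the Combinator patch.

```python
import numpy as np

def sidechain_duck(pad, kick, depth=0.8, release_samples=4_000):
    """Toy sidechain ducker: follow the kick's level and use it to duck the pad.
    depth sets how far the pad dips on each hit; release_samples sets how
    quickly the pad swells back up between hits (the 'pumping' feel)."""
    env = 0.0
    out = np.empty_like(pad)
    for i in range(len(pad)):
        # fast attack: jump up to the kick's level; slow linear release after it
        env = max(abs(kick[i]), env - 1.0 / release_samples)
        out[i] = pad[i] * (1.0 - depth * env)
    return out

# Example: a one-second pad tone ducked by four crude kick "hits" (44.1 kHz)
sr = 44_100
t = np.arange(sr) / sr
pad = 0.5 * np.sin(2 * np.pi * 220 * t)
kick = np.zeros(sr)
for beat in range(4):
    start = beat * sr // 4
    kick[start:start + 2_000] = 1.0
ducked = sidechain_duck(pad, kick)
```

That also hints at how you can keep the pulse going through a break: as long as something keeps feeding the ducker’s kick input, even a kick you never send to the speakers, the pad keeps pumping.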

 

Lucky Date Interview

Jan 15 2012

I recently had the opportunity to chat with up-and-coming electronic music producer and DJ Lucky Date (Jordan Atkins-Loria). He uses Reason to produce fantastically phat dance tracks and remixes, and he regularly shares his production secrets on his YouTube channel, luckydatevideos. The music he pumps out of Reason is truly inspirational, so I wanted to ask him how he gets such a huge sound and what other software besides Reason is part of his production and DJ arsenal. He gave a great interview with a lot of wonderful insight and advice. Watch out for Lucky Date; I predict he’ll be producing many mega-dance-floor hits in the coming years.

 

Here’s a common mistake I see over and over: producing and mixing a track with a gain maximizer on your mixer’s main output. Examples of gain maximizers are the Waves L2 Ultramaximizer, Avid Maxim, and, if you’re working in Reason or Record, the MClass Maximizer. These devices are designed to limit a signal’s peaks and then automatically optimize the output, in relation to a given threshold, to a specific level you’ve set (such as 0 dB or –0.2 dB); this is called automatic gain makeup. Or, to put it in simpler terms, they make your audio sound as loud as possible.

If you don’t realize that you’re working through a gain maximizer, you’re probably thinking that your production sounds wonderfully loud and full. Indeed, some software programs (such as Reason) actually feature default templates with a maximizer on the mixer’s main output. This is a great sales pitch, since it makes your track sound totally bombastic, but it’s not reality. The reality is that if you bypass the maximizer you’ll most likely discover that you’re clipping (exceeding 0 dB) your main output, badly. It’s only because of the maximizer that you can’t see or hear your woefully out-of-balance gain structure. Instead, the clipping is being rounded out by the maximizer’s brick-wall limiter algorithm in order to sound more palatable to your ear. But the clipping is still there, your overdriven mixer channels are still there, and your poor gain structure is still there.
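To make that concrete, here’s a toy sketch of what a gain maximizer does, written in Python purely for illustration (it’s my own simplification, not any plugin’s actual algorithm): boost everything so that peaks at the threshold land at the ceiling, then hard-clip whatever still pokes over.  The meters downstream only ever see the clipped, “safe” output, which is exactly why the damage stays hidden.

```python
import numpy as np

def maximize(signal, threshold_db=-6.0, ceiling_db=-0.2):
    """Toy gain maximizer: apply automatic gain makeup so peaks at the
    threshold land at the ceiling, then brick-wall clip anything above it.
    Real maximizers use look-ahead and smooth gain curves; this only shows
    why the clipping disappears from your meters."""
    threshold = 10 ** (threshold_db / 20)   # convert dB to linear amplitude
    ceiling = 10 ** (ceiling_db / 20)
    makeup = ceiling / threshold            # the automatic gain makeup
    return np.clip(signal * makeup, -ceiling, ceiling)

# A mix that is already clipping (samples above 1.0 are over 0 dBFS)
mix = np.array([0.3, 0.9, 1.4, -1.2, 0.5])
print(maximize(mix))  # everything comes out at or below the ceiling
```

Run the numbers and the over-level samples are still flattened; they just never exceed the ceiling, so nothing lights up.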

As a dramatic example, I often demonstrate this mistake in Reason. With the default Mastering Combinator inserted on the mixer’s main output, I create a Dr. Octo Rex, press Play, and then turn up all the levels to the max: Dr. Octo Rex’s Master Level, the mixer channel’s level, and the mixer’s Master Fader. It sounds great, and there’s no clipping indicated on the Audio Output Clipping Indicator on the Transport Bar. But then I bypass the Mastering Combinator, and the Clip Indicator immediately illuminates and stays on nonstop. In fact, in this extreme example you can actually hear the clipping, which is difficult to do in Reason because its main outputs seem to be pretty forgiving even when the Clip Indicator is lit.

So, now, the obvious question is: if everything sounds fine with a maximizer on my main output, why should I care? There are a few reasons why it’s not a good idea to produce and mix with a gain maximizer on your main output:

Just because you can’t see or hear the clipping doesn’t mean it’s not there. And, if it’s there, then when you master your mix all you’re doing is trying to smooth out the clips. You’re gain maximizing your entire mix, clips included, and this can only lead to an inferior sounding master.

If the maximizer is trying to turn your signals up, or down, for optimum loudness, then whenever you adjust a level or EQ a signal in your mix, the maximizer is automatically countering your move. Consequently, you won’t have the full dynamic range to work in; you’ll be limited to the dynamic range that the maximizer is setting for you. With the maximizer countering every move you make in the mix, you aren’t really hearing your work. Talk about counterproductive.

As a rule, maximizers are serious processor hogs. Think about it: they have to look ahead at the digital signal and adjust every upcoming peak according to their Threshold and Output Gain settings. So, with a maximizer inserted on your main output, can you imagine how much latency you’re introducing? The answer is: a lot, in the thousands of samples. Just try monitoring a live signal (such as a vocal or guitar) through a maximizer inserted on your main output and you’ll immediately hear what I’m talking about. It’s a disturbing amount of latency, and there’s no reason to be fighting for processor resources when all you have to do is delete the maximizer from your signal path. (When you’re mastering, this sort of latency on your main output isn’t an issue.)
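For a rough feel for the numbers, here’s a back-of-the-envelope calculation. The 4,096-sample look-ahead buffer is just an assumed figure for illustration, not the spec of any particular maximizer:

```python
sample_rate = 44_100           # samples per second (CD-quality audio)
lookahead_samples = 4_096      # hypothetical look-ahead buffer size
latency_ms = lookahead_samples / sample_rate * 1_000
print(f"{latency_ms:.1f} ms")  # about 93 ms of added delay
```

Anything much past 10 ms or so is already noticeable when you’re monitoring a live performance, so a buffer in the thousands of samples is a deal breaker.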

If you’re already slamming your mix through a maximizer, you’ve pegged 0 dB, and everything is as loud as it can possibly be with hardly any dynamics left in your mix, what’s left for a mastering engineer to do? The answer is, not much. If you’re serious about releasing your music, leave some dynamics in your mix for a mastering engineer to work with.

Having said all this, I think it’s a great idea to fine-tune your mix through a maximizer when you’re mastering directly in your multitrack mix session. I do this all the time after my mix is complete, especially for reference mixes (tracks that need to impress clients) and background music for film and TV. However, if it’s for an album cut that I plan to send out for mastering, the maximizer effect is out of the signal path altogether. (Some maximizers, such as the L2 and Maxim, have dithering and bit-reduction features that can be used independently of their gain maximizing functions.)

I’ve received many requests for tutorials on writing/producing a hip-hop or dance beat. In theory, this is a nice idea. In reality, there’s just no way you can encapsulate all of the creative and technical know-how that goes into writing and producing a great sounding beat in a single tutorial. Fortunately, that hasn’t stopped me from trying, because even if I can’t pack all of the relevant information into one tutorial, it’s still worth doing for the information that I can share in about a ten-minute video.

So, I threw on some clothes, my Remix Miami T-shirt, didn’t bother to shave, set up the camera (top view down so you could see my hands on the control surfaces), and wrote a hip-hop style beat off the top of my head. It took me around 40 minutes, but I edited the whole process down to about a 12-minute video. Obviously, there are some parts missing, such as playing with MPC backdrops for Kong, or running the hi-hats through a compressor. But, if you watch carefully, it’s all there, because in addition to the techniques I describe as I’m working, you can also see all the device settings and the connections when I flip Reason’s rack over. The video is in HD so you can totally see all the details. I used Reason 5, Kong for all the drum sounds, and Thor for the bass line. Enjoy!

My New CD is Out!

Aug 09 2010

I’m very excited to announce that my new CD is out, Erik Hawk & The 12-Bit Justice League. If you like electronic dance music fused with orchestral elements, I think you’ll enjoy this CD. Plus, the physical CD contains remix stems (WAV, REX, and MIDI files) for your remixing and DJ-ing pleasure. The physical CD can be purchased through CD Baby, and digital only downloads are widely available, from iTunes to Amazon.

Here’s the official press blurb:

The new album by Erik Hawk, Erik Hawk & The 12-Bit Justice League, plays like the soundtrack to an action movie. Every song could underscore a scene, from the opening action of “Introductions”, to the heroics of “On a Mission”, and the closing images of “Into the Sunset”. So, it comes as no surprise to learn that Hawk’s alter ego is composer/producer/remixer Erik Hawkins. His music has been featured in countless films and television shows (from The Informers, to Ugly Betty, and CSI: Miami).

Joining Hawk on his musical adventures are several critically acclaimed musicians, including Gilli Moon (vocals), Christine Wu (violin), Lygeia Ferra (vocals), and Craig Seganti (trumpet). The album was mastered by pioneering hip-hop producer/engineer Michael Denten. Hawk wrote/co-wrote, arranged, and produced all of the tracks, and he plays guitar and keyboards and sings throughout the album.

To keep up with announcements, shows, placements, and contests, join me on Facebook.

And, here’s my official YouTube announcement:

Peter Gabriel Remix

Jul 02 2010

Please vote for my remix at indabamusic by clicking on the widget below, and ask all of your friends to vote for me too. Voting goes until July 15, 2010. And, if you send me to London I promise to bug Gabriel for all of his best production tricks so that I can bring them back here and share them with all of you.

I don’t generally have time to enter the many amazing remix contests offered on indabamusic.com. But this time, I couldn’t pass up a chance to remix a classic Peter Gabriel song, “Games Without Frontiers”.  And, more importantly, a chance to have Gabriel listen to my work and maybe even meet him in London!  That’s just too cool an opportunity to pass up.  With everything that Gabriel has done in his life, both musically and as a philanthropist, he’s definitely a hero of mine.  So, I went for it.

Remixing is a form of music production. Just like producing a song for an artist, the object shouldn’t be about imposing your musical ideas on another musician’s composition and performance. Instead, it’s about helping the artist and their material to be the best that they can be. To this end, I think it’s important to respect the songwriter’s original message and the vocalist’s performance when remixing, especially when the vocalist is the songwriter. Ideally, the recognizable elements of the song, such as vocal phrases and guitar lines, should be audible in your remix. With this in mind, I felt “Games Without Frontiers” could benefit from a more guitar-driven, pop rock arrangement, with a full kit played over an updated Roland CR-78 drum loop, and a touch of orchestral elements for added texture and movement.

In these videos, I take you on a mini tour of my “Games Without Frontiers” remix session using Pro Tools and Reason.  There’s a lot to explain in this session, so I broke the tour into two videos.  The first focuses on drums and rhythm-section instruments (bass, guitars, piano, etc.), and the second focuses on orchestral elements, voice parts (lead and backing vocals), and mastering.  I’m also attaching the Pro Tools session file, without its audio files, to this blog so that you can look through the session and see how it’s all set up.

Peter Gabriel Remix Session Video Tour (Part 1)

Peter Gabriel Remix Session Video Tour (Part 2)

*Remember, you can double-click on these videos and watch them directly on YouTube to see them in HD.

The session file as a Zip.
PG REMIX VIDEO TOUR
Download directions:
PC: right-click and choose “Save Link As…” from the pop-up menu.
Mac: Control-click and choose “Save Link As…” from the pop-up menu.