
DunedinDragon

Members
  • Posts

    3,527
  • Joined

  • Last visited

  • Days Won

    100

DunedinDragon last won the day on September 10 2024

DunedinDragon had the most liked content!

1 Follower

Contact Methods

  • Website URL
    https://www.facebook.com/SalvationSaloonPosseBand/

Profile Information

  • Gender
    Not Telling
  • Location
    Dunedin, FL
  • Interests
    Gear: Helix, Yamaha DXR12, Les Paul Standard, American Strat with Lace Sensor pickups, Gretsch Silver Falcon, Epiphone Sheraton II Pro
  • Registered Products
    3

Recent Profile Visitors

4,736 profile views

DunedinDragon's Achievements

Grand Master (14/14)

Recent Badges

  • Dedicated Rare
  • One Year In
  • One Month Later
  • Week One Done
  • Reacting Well Rare

Reputation: 1.5k

Community Answers: 11

  1. What I've found over the years is that IF you have a proper cable that fits correctly, and IF it's plugged into a REAL USB port rather than an internal hub port (which depends on the computer), there's never a problem with either of my Helix Floor units. I do know those types of USB-B hub connections can be particularly picky about when you plug into them, but mine have always worked fine as long as I'm careful when I plug into my Helix. It's important to understand that not all ports on a computer are direct USB ports. Often they're defined as part of an internal USB port collection that acts as a hub rather than a pure connection. Typically the only way to be sure is to examine how the port is defined in your Device Manager (there's a quick command-line version of that check sketched after this list). As long as it shows up as a standalone USB entry it has always worked for me, even through an external hub. Currently, on both my PC and my laptop, I plug my hubs into a powered USB 3.0 Type-A port and have no problem using any of those ports to connect to my Helix.
  2. The VAST majority of folks doing PC studio work tend to use 48000Hz because the math for conversion and processing is lighter, which helps avoid latency (some rough numbers are sketched after this list). There's generally no audible advantage to higher rates, at least as far as humans are concerned. Your dog may disagree.....
  3. The problem is that the Helix, as good as it is, isn't a mixing board. I use a similar setup with my band and have been running it for several years now. The main issue is that you have to be able to gain stage the backing tracks as well as the Helix. That's why I send the backing tracks from Ableton (gain staging the individual tracks to the right mix) as a stereo pair of XLR outputs into the mixing board, along with my Helix guitar signal and vocals on their own channels, so I can gain stage all of those things to achieve the appropriate mix for the audience as well as for the stage monitors. I send the MIDI control outputs to a MIDI foot controller that converts the MIDI internally to the right combination of commands for the Helix or other devices. That way, if I trigger pedal 1 on the foot controller, it converts that into any number of MIDI sequences and isolates the Ableton track from those intricacies. For example, I send a stock MIDI command to trigger the A footswitch on the MIDI foot controller, which then might send two or three specific commands to the Helix and a separate set of commands to the stage lighting controller (there's a rough sketch of that one-to-many translation after this list). I realize that's a LOT more complex than what you're wanting to do right now, but as a first step a mixer is certainly vital for consistently getting the right balance between the backing tracks and the live Helix guitar and vocals.
  4. The only true way to achieve the amp-in-the-room feel is to do what's done with an orchestra or ensemble: capture the performance with three different mic setups, near, wide and far, and blend them appropriately to recreate the room's ambience. Fortunately there are convolution plugins like Spitfire Audio Essentials Reverb that can simulate the effect accurately, at least as it would happen in a world-class studio such as AIR Studios in London (a bare-bones sketch of that kind of blend is after this list). Seems a bit obsessive to me for just an electric guitar.
  5. Would you support charging a fee for a native, Linux-based HX Edit to offset the development costs of this FREE software? Or would you just prefer to spend other people's money for your personal benefit?
  6. That may be the case in non-professional, non-commercial deliveries, and I did refer to that in the second part of my statement. That's why I spent a good portion of my career teaching organizations like those how to organize teams and the necessary cycle and critical guidelines of iterative development and delivery, of which actual code work is about a third of the process. Organizations that don't follow those process and team disciplines generally have about an 80% chance of failure or rejection by end users. Sorry you seem to have been limited to working in one of those.....
  7. I can only say I will be eternally grateful you were never a developer on any of my projects, given all your statements here. You're allowed your opinion, but certifying it with a claim of being a software developer, given your obvious lack of technical knowledge regarding real-time DSP programming and the process used in professional, team-based, versioned commercial delivery of products, only serves to embarrass you further.....
  8. The MIDI required to change to a different preset and/or snapshot, as well as to enable different stomp buttons and such, is pretty well documented. A great source is the Helix Help web site, which has a section dedicated to the various MIDI actions available and the ways you use them. I will say, as someone who operates my Helix solely in conjunction with backing tracks, you probably want to separate the MIDI interaction from the playing of the backing track because of timing constraints. It takes a moment or two for the Helix to change presets and be ready to play, whereas starting a backing track is pretty much immediate. Your backing track player may provide a delay between the commands being sent and the track starting, and if so that would probably work fine. In my case I use a dedicated MIDI footswitch controller where one switch sets up both the Helix and the backing track selection, and another footswitch starts the backing track (a stripped-down sketch of that sequence is after this list). That way I know the Helix (and all other stage automation components) are ready before we start the song, and it's worked flawlessly for many years now.
  9. Absolutely, compared to other instrument plugins, which are MIDI based. Consider drum plugins. The entire world of drum rhythms is available to you, in addition to different types and sizes of drums for the kit you're using. The drum sizes, placement and mic positions are similar to what Helix Native gives you; the built-in drum rhythms that simplify creating the beat for the track are what something like Native doesn't have. In fact, I can change the BPM without affecting any MIDI plugin tracks (there's a small illustration of that tempo independence after this list). That's where things like Native require much more attention to get the track recorded. I can make a finished acoustic guitar track in no more than 20 minutes with something like the Native Instruments Session Guitarist libraries. I could always route that into Helix Native in the DAW, but it would be nice if it were all one thing. As far as basses go, I used to record those through my Helix with my Precision Bass, but these days I just pick a bass from any number of bass plugins, or even an upright or orchestral bowed bass, and play them on the keyboard. It's MUCH faster. As for effects, pretty much all DAWs come with plenty of options, and there's more out there in plugin land. I just purchased a convolution reverb plugin that lets you apply various mic'ing and environmental setups captured in some of the most famous locations in the world, like AIR Studios at Lyndhurst Hall in London, to any type of instrument.
  10. I think what limits the appeal of Helix Native, as well as many of its competitors, is that it's so guitar-centric, which limits its usefulness in today's plugin world. Even though I'm a guitar player, I predominantly use a keyboard as my input tool when recording, and I don't think I'm unusual in that respect given the number of people who have MIDI keyboards as part of their recording setup. When you compare the workflow in Native to the workflow in more keyboard-oriented guitar plugins, there's easily three times the effort involved in laying down a finished track as audio rather than MIDI. Plus, in those types of plugins you have access to a wide variety of strumming patterns and other prepared riffs. That significantly limits Native's appeal for people who predominantly use keyboards in their workflow.
  11. As you can tell from the responses above, since day one of the Helix there have always been ways to approximate the 'amp in the room' effect. The reason a lot of us don't really worry about it is that it's never important to our audiences, whether it's a live performance or a recorded one. We can get close enough with the tools in the Helix not to worry about it, and we gain the simplified benefits it brings to both performance and recording. If it inspires your performance, then do it. But there is no magic formula to reproduce it, because the 'amp in the room' effect changes dramatically with the room, the position your amp is in, and your position relative to the amp. That's just physics.
  12. Not everything needs to be a snapshot, and this sounds like a classic example. Just use a simple footswitch assignment so you can engage the boost momentarily.
  13. Personally I think most people overestimate what generative AI can do compared to actual human intellectual capabilities. Ultimately it depends on how well it's been trained to consider "out of the box" solutions to problems. Someone like Ben Adrian is unique in that regard, so a system trained by Ben would likely exhibit more of those traits than one trained by a very strict, "rules based" individual. I think we're quite a few years away from understanding how creativity works in humans, let alone figuring out how to design it into AI systems.
  14. First and foremost, the vast majority of my EQ comes from selecting and carefully configuring the cabinet, mic and mic position in the new Helix cabs. The only EQ I ever apply is a final EQ block at the end of the signal chain if I need to take a little off the top of the frequency spectrum, and that's usually just a very slight high cut (a rough offline equivalent is sketched after this list). EQ'ing too much, especially when done in isolation from the rest of the instruments in the band, buries the guitar and forces you to overuse volume to be heard in the mix. If you want a real eye-opener on how to EQ appropriately, get yourself a copy of iZotope's Neutron 4 Elements and let it automatically adjust the EQ on each of the individual instruments in your song. You'll be shocked at how clearly each instrument can be heard within the arrangement, even at a lower volume than some of the other instruments, creating a much more professional-sounding mix.
  15. I haven't been on a stage in years with a bass amp. If you have a Helix you already have several choices for a bass amp and speakers. All you need to do is plug it into a mixing board. That should save you all the room you need in your van. See how easy that was?
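
For answer 1 above: if you'd rather not click through Device Manager by hand, here's a minimal sketch of the same check from the command line. It assumes Windows with PowerShell's Get-PnpDevice cmdlet and Python installed, and the hub detection is just a name match, so treat it as a starting point rather than a definitive tool.

```python
# Rough sketch: list USB-class devices the way Device Manager groups them,
# flagging anything that reports itself as a hub. Assumes Windows with
# PowerShell available; on other platforms use lsusb or system_profiler.
import subprocess

ps_command = (
    "Get-PnpDevice -Class USB -PresentOnly | "
    "Select-Object -ExpandProperty FriendlyName"
)
output = subprocess.run(
    ["powershell", "-NoProfile", "-Command", ps_command],
    capture_output=True, text=True, check=True,
).stdout

for name in (line.strip() for line in output.splitlines() if line.strip()):
    marker = "  <-- hub" if "hub" in name.lower() else ""
    print(name + marker)
```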
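
For answer 2 above, some back-of-the-envelope numbers (my illustration, not from the post): at a fixed buffer size, raising the sample rate shrinks the time the computer has to fill each buffer while increasing how many samples it must process per second, which is why higher rates tend to push people toward bigger buffers and therefore more latency.

```python
# Per-buffer time budget and per-second workload at common sample rates.
# The 256-sample buffer is just an illustrative setting.
BUFFER_SAMPLES = 256

for rate_hz in (44_100, 48_000, 96_000, 192_000):
    budget_ms = BUFFER_SAMPLES / rate_hz * 1_000
    print(f"{rate_hz:>7} Hz: {budget_ms:5.2f} ms to fill each "
          f"{BUFFER_SAMPLES}-sample buffer, {rate_hz:,} samples/sec to process")
```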
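
For answer 3 above: the foot controller is essentially doing a one-to-many MIDI translation. Here's a rough sketch of that idea in Python using the mido library. The port names, channels, CC and program numbers are all placeholders rather than my actual rig; CC#69 is what the Helix documentation lists for snapshot select, but check your own MIDI chart.

```python
# Sketch of the one-to-many translation a MIDI foot controller performs:
# one stock command in, a bundle of device-specific messages out.
# Port names and channel/CC/program numbers below are placeholders.
import mido

# Footswitch "A" arrives as a single Program Change; fan it out to the
# Helix (preset + snapshot) and a lighting controller (scene select).
FOOTSWITCH_MAP = {
    0: [  # program 0 = footswitch A
        mido.Message("program_change", channel=0, program=12),            # Helix preset
        mido.Message("control_change", channel=0, control=69, value=1),   # Helix snapshot
        mido.Message("control_change", channel=2, control=20, value=64),  # lighting scene
    ],
}

with mido.open_input("Foot Controller In") as inport, \
     mido.open_output("Stage Out") as outport:
    for msg in inport:
        if msg.type == "program_change" and msg.program in FOOTSWITCH_MAP:
            for out_msg in FOOTSWITCH_MAP[msg.program]:
                outport.send(out_msg)
```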
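
For answer 4 above: the near/wide/far blend is, at its core, just convolution with a few different room impulse responses mixed together. A bare-bones sketch with numpy/scipy, assuming mono WAV files whose names and blend levels are placeholders:

```python
# Minimal convolution-reverb blend: convolve the dry track with near,
# wide and far room impulse responses and mix to taste.
import numpy as np
from scipy.io import wavfile
from scipy.signal import fftconvolve

rate, dry = wavfile.read("guitar_dry.wav")        # placeholder mono dry track
dry = dry.astype(np.float64)

blend = {"ir_near.wav": 0.6, "ir_wide.wav": 0.3, "ir_far.wav": 0.1}

tails = []
for ir_file, level in blend.items():
    _, ir = wavfile.read(ir_file)                 # placeholder mono impulse responses
    tails.append(level * fftconvolve(dry, ir.astype(np.float64)))

out_len = max(len(t) for t in tails)
mix = sum(np.pad(t, (0, out_len - len(t))) for t in tails)
mix /= np.max(np.abs(mix))                        # normalize to avoid clipping
wavfile.write("guitar_room_blend.wav", rate, (mix * 32767).astype(np.int16))
```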
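
For answer 8 above, a stripped-down sketch of the "set up the Helix first, start the track a moment later" sequence, again with Python and mido. The port names, program number and one-second wait are placeholders; CC#69 for snapshot select comes from the Helix MIDI documentation, so verify it against your firmware, and your backing track player may expect MMC or a note message instead of MIDI Start.

```python
# Sketch: select a Helix preset and snapshot over MIDI, give the unit a
# moment to finish loading, then fire the "start backing track" message.
import time
import mido

HELIX_PORT = "Helix MIDI In"            # placeholder port name
PLAYER_PORT = "Backing Track Player"    # placeholder port name

with mido.open_output(HELIX_PORT) as helix, mido.open_output(PLAYER_PORT) as player:
    helix.send(mido.Message("program_change", channel=0, program=5))             # preset
    helix.send(mido.Message("control_change", channel=0, control=69, value=0))   # snapshot 1
    time.sleep(1.0)   # let the Helix finish the preset change before the count-in
    player.send(mido.Message("start"))  # MIDI real-time Start; adapt to your player
```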
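
For answer 9 above, a small illustration of why MIDI tracks shrug off BPM changes: note timing is stored in ticks relative to the beat, so the tempo is just a meta-message you can change without touching a single note. This sketch uses Python and mido and writes a one-bar kick/snare pattern to a hypothetical file.

```python
# A one-bar kick/snare pattern stored in ticks (480 per quarter note).
# Change BPM and re-run: the notes are untouched; only the tempo
# meta-message (and therefore playback speed) changes.
import mido
from mido import MidiFile, MidiTrack, Message, MetaMessage

BPM = 120                                   # the only thing a tempo change touches
mid = MidiFile(ticks_per_beat=480)
track = MidiTrack()
mid.tracks.append(track)
track.append(MetaMessage("set_tempo", tempo=mido.bpm2tempo(BPM)))

for beat in range(4):
    note = 36 if beat % 2 == 0 else 38      # GM drums: 36 = kick, 38 = snare
    track.append(Message("note_on", channel=9, note=note, velocity=100, time=0))
    track.append(Message("note_off", channel=9, note=note, velocity=0, time=480))

mid.save("one_bar_beat.mid")                # placeholder file name
```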
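
For answer 14 above, just to show how slight that end-of-chain high cut really is, here's a rough offline equivalent in Python with scipy. The 10 kHz corner, the filter order and the file names are arbitrary illustration values, not a recommendation.

```python
# A gentle end-of-chain high cut: a low-order low-pass that only shaves
# the very top of the spectrum. Corner, order and file names are placeholders.
import numpy as np
from scipy.io import wavfile
from scipy.signal import butter, sosfilt

rate, audio = wavfile.read("guitar_patch_out.wav")    # 16-bit mono assumed
x = audio.astype(np.float64) / 32768.0                # normalize to roughly -1..1

sos = butter(2, 10_000, btype="lowpass", fs=rate, output="sos")
y = sosfilt(sos, x)

wavfile.write("guitar_patch_hicut.wav", rate,
              (np.clip(y, -1.0, 1.0) * 32767).astype(np.int16))
```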