Feature Requests

Enable Dynamic Audio Destination Switching via Actions and MIDI, Analogous to MIDI Routing
Description
This feature request proposes dynamic audio destination switching in Loopy Pro, mirroring the flexibility currently available for MIDI routing. While Loopy Pro allows users to enable or disable MIDI destinations dynamically through actions or MIDI bindings, no similar functionality exists for audio routing. Introducing this capability would enhance live performance adaptability and streamline complex audio setups.

Problems
- Static Audio Routing: Audio destinations are fixed once set; altering a routing path requires manual reconfiguration, which is impractical during live performance.
- Inconsistent Control Paradigm: The disparity between MIDI and audio routing flexibility can cause confusion and limits the potential for cohesive control schemes.
- Limited Live Adaptability: Without dynamic audio routing, performers cannot easily adjust their signal paths in response to real-time needs or creative decisions.

Proposed Solution
- Action-Based Audio Routing: Introduce actions that switch audio destinations on the fly, similar to existing MIDI destination controls.
- MIDI Binding for Audio Destinations: Let MIDI controllers toggle or select audio destinations, facilitating hands-free control during performances.
- Widget Integration: Allow widgets to control audio routing paths, providing visual feedback and touch-based control within the user interface.

Examples
- Live Instrument Switching: A guitarist uses a footswitch to reroute their signal from a clean amp setup to an effects-heavy chain during a solo.
- Dynamic Vocal Processing: A vocalist toggles between different vocal effect chains for verse and chorus sections using on-screen buttons.
- Studio Workflow Optimization: A producer quickly switches audio outputs between monitoring systems and recording devices without navigating menus.
Benefits
- Enhanced Performance Flexibility: Performers can adapt their audio routing in real time, responding to the dynamics of a live show.
- Streamlined Workflow: Reduces the need for manual adjustments, leaving more focus for creative work in both live and studio sessions.
- Consistent Control Experience: Aligns audio routing capabilities with existing MIDI routing features, providing a unified and intuitive control environment.

This summary was automatically generated by ChatGPT-4 on 2025-05-25.

Original Post: A setting to switch destinations for audio inputs on the fly, available to a gesture or widget. This setting is currently available for MIDI destinations but not audio. Being able to solo a destination on the fly would avoid the current workaround of creating multiple inputs for the same mic and soloing the input channel.

Problem: Having a mic input routed to multiple destinations means it is monitored through every destination, which is problematic. Being able to solo a destination would solve this.
2 · under review

Smart Text Fields & Placeholders for Dynamic Widget Displays (Text, Buttons, Faders, etc.)
Problem: Text in Loopy Pro widgets is static: there's no way to display live values such as AUv3 or MIDI data, or to show dynamic status feedback. Users can't show the value of a fader or knob, nor adapt text visibility, size, or content dynamically.

Proposed Solution:
- Text binding to controls: let text widgets display live MIDI values from bound faders or knobs
- Placeholder system with support for:
  - $project.name$, $bar.current$, $channel_name_1$, $AUv3_post_3$
  - Current/total clip layers, named layers
  - Clip colors, groups, AUv3 plugin parameters
- User-defined labels for value ranges, e.g. 30–155 Hz = "Rumble", 1000–3000 Hz = "Medium"
- Conditional formatting & visibility:
  - Show the exact value only while a knob/fader is touched
  - After a delay, return to the label display (e.g. "Range name")
- Text resizing logic:
  - Auto-scale font size with widget/canvas changes
  - Manual ± font size per widget
- Formula-based expressions, like SWITCH(...)
- Text changes via actions, e.g. "Set label to: VERSE"

Benefit:
- Real-time, intelligent feedback in the UI itself
- No need for extra screens or MIDI displays
- Clear overview of dynamic values, actions and playback states
- Great for live performance, section navigation, parameter feedback and more
- Customizable and responsive interface that adapts to user interaction

Extension: Define Dynamic Behavior for Each Placeholder
Ideally, each placeholder should have its own display logic, controlling when and how it appears. Examples:
- Always visible (default)
- Only visible during interaction, e.g. while a fader or knob is being moved
- Visible for a short time after interaction, e.g. 1 second after release
- Conditionally visible, e.g. only when a clip is playing or a value is in a certain range
- Toggle visibility via actions or logic

This would allow highly adaptive, context-aware UIs: showing the right info at the right moment without visual overload. It would give Loopy Pro a unique edge as a flexible, user-definable performance environment.
This summary was automatically generated by ChatGPT-4 on April 30, 2025.
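To make the idea concrete, here is a minimal sketch of how such a placeholder system could resolve tokens and range labels. The token names ($project.name$, $bar.current$) come from the request above; the lookup dictionary, function names, and regex are hypothetical illustrations, not Loopy Pro's actual API.

```python
import re

def resolve_placeholders(template, values):
    """Replace $name$ tokens with live values; unknown tokens stay as-is."""
    return re.sub(r"\$([A-Za-z0-9_.]+)\$",
                  lambda m: str(values.get(m.group(1), m.group(0))),
                  template)

def range_label(value, bands):
    """User-defined labels for value ranges, e.g. 30-155 Hz = 'Rumble'."""
    for lo, hi, label in bands:
        if lo <= value <= hi:
            return label
    return str(value)  # fall back to the raw value outside all bands

state = {"project.name": "Live Set A", "bar.current": 17}
print(resolve_placeholders("Now playing $project.name$, bar $bar.current$", state))
# → Now playing Live Set A, bar 17
print(range_label(100, [(30, 155, "Rumble"), (1000, 3000, "Medium")]))
# → Rumble
```

The conditional-visibility extension would then just gate whether the resolved text is shown, without changing this substitution step.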
58 · planned

Proper Support for Compound Time Signatures and Musical Metronome Accentuation
Description: Currently, Loopy Pro treats time signatures with a denominator of 8 (e.g. 6/8, 7/8, 12/8) the same as their /4 counterparts (6/4, 7/4, 12/4), resulting in incorrect bar lengths and non-musical behavior. For example, a 7/8 bar at 120 BPM should last 1.75 seconds, not 3.5 seconds.

Problem:
- Compound time signatures (like 6/8 or 12/8) are not interpreted correctly in terms of duration and rhythmic feel.
- The metronome accenting does not reflect the musical pulse of compound meters.
- Common groove-based time signatures used in blues, soul, and R&B (e.g. 12/8) are rendered inappropriately, making live performance or composition less intuitive.

Proposed Solution:
- Implement proper internal handling of /8 time signatures based on eighth-note timing, not quarter-note equivalence.
- Add automatic or customizable metronome accent patterns based on typical groupings (e.g. 12/8 = four dotted-quarter pulses, 7/8 = 2+2+3, etc.).
- Possibly allow users to configure beat grouping/accentuation manually for advanced rhythms.

Benefits:
- Aligns Loopy Pro behavior with standard musical conventions.
- Improves metronome usability for groove-based and non-standard time signatures.
- Makes live looping and composition in compound meters far more intuitive and musical.

Examples:
- A 6/8 loop plays as two dotted-quarter pulses per bar, not six individual eighths.
- A 12/8 groove (e.g. triplet feel) sounds natural with correct metronome accenting.
- A 7/8 bar at 120 BPM correctly lasts 1.75 seconds.

This summary was automatically generated by ChatGPT-4 on 2025-06-05.

Original Post: The Loopy Pro manual says "Loopy does not make use of the denominator (the bottom number). Time signatures displayed as 6/8 and 7/8 are the same to Loopy as 6/4 and 7/4." but I feel there's an opportunity to improve this behavior and make it more musical.
Problem: Eighth notes are half a beat, so a 7/8 measure at 120 BPM shouldn't be 7 beats * (60 seconds per minute / 120 BPM) = 3.5 seconds long; it should be 1.75 seconds long, because each eighth note is half a beat in length, not a full beat. Contrast that with Pink Floyd's "Money", which is in 7/4: there each count is a full beat, so a 7/4 measure at 120 BPM really should take 3.5 seconds.

In compound time signatures, where the top number is divisible by 3 and the bottom number is 8 (e.g. 6/8, 9/8, 12/8), each eighth note is often one third of a beat in length (blues and R&B music often have this feel). So a 6/8 measure at 120 BPM wouldn't be 3 seconds long; it would be 1 second long. In this feel, a 12/8 measure takes the same amount of time as a 4/4 measure at the same BPM, just as 9/8 and 3/4 measures are the same length at the same BPM.

Note that the Tick Metronome plugin (https://apps.apple.com/app/tick-metronome/id1573209073), when used as an AU inside Loopy, inherits the time signature (e.g. 4/4, 6/8, 7/8) from Loopy as one would expect but behaves very differently: it plays each eighth note as half a beat (which is great!). However, Loopy itself still doesn't, so what you'd hear as one Tick measure is only half a Loopy measure. Tick also doesn't seem to have any way to set the eighth-note pulse in compound meters to an eighth-note-triplet pulse.

Proposed solution:
- Play all N/8 time signatures such that an eighth note is 1/2 the length of a single beat.
- Add an option so that N/8 time signatures where N is a multiple of 3 are played such that an eighth note is 1/3 the length of a single beat.

Benefits: Better conformity with the general understanding of how time signatures work, e.g. https://en.wikipedia.org/wiki/Time_signature and https://en.wikipedia.org/wiki/Time_signature#Simple_versus_compound

Other notes: To properly support compound time signatures, the metronome should output an additional tone to differentiate the main pulse from the triplet subdivisions. For example, right now for 12/8 we hear LHHHHHHHHHHH (L = low, H = high pitches), but to accentuate the rhythmic pulse we should hear LHHMHHMHHMHH (L = low, M = mid, H = high pitches).
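The bar-length arithmetic and the accent pattern above can be sketched as two small functions. This is an illustration of the proposal, assuming BPM counts quarter notes; the function names and the uniform-grouping accent helper are my own, not Loopy's internals.

```python
def bar_duration_seconds(numerator, denominator, bpm, compound=False):
    """Length of one bar in seconds.
    Simple /8 meters: each eighth note is half a beat.
    Compound /8 meters (compound=True): each eighth note is a third of a
    beat, i.e. the felt pulse is a dotted quarter."""
    beat = 60.0 / bpm  # one quarter-note beat
    if denominator == 8:
        eighth = beat / 3 if compound else beat / 2
        return numerator * eighth
    return numerator * beat  # /4 meters: every count is a full beat

def accent_pattern(numerator, group=3):
    """One metronome tick per eighth: first tick Low, each later group
    start Mid, the rest High. accent_pattern(12) yields the LHHMHHMHHMHH
    pattern from the post; irregular meters like 7/8 (2+2+3) would need
    an explicit grouping list instead of a fixed group size."""
    return "".join("L" if i == 0 else ("M" if i % group == 0 else "H")
                   for i in range(numerator))

print(bar_duration_seconds(7, 8, 120))                 # → 1.75, not 3.5
print(bar_duration_seconds(6, 8, 120, compound=True))  # → 1.0
print(accent_pattern(12))                              # → LHHMHHMHHMHH
```

Note how the compound option makes 12/8 at 120 BPM exactly as long as 4/4 at 120 BPM, matching the groove argument in the post.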
1 · under review

Musical Typing Variations
Description: Enhance the current musical typing feature to allow more flexibility in key mapping and scale selection. Users should be able to define which computer keys trigger which notes, and select from different musical scales (e.g. minor, pentatonic).

Problem: Musical typing is currently limited to major scales mapped to fixed keys, which restricts creative expression and usability for musicians who prefer different scales or custom key layouts (e.g. using a numpad). This limits musical variety and playability without external hardware.

Proposed Solution:
- Allow users to customize which keys trigger which notes.
- Support a variety of scales (minor, modal, custom).
- Optionally support note duration via long presses or key combinations (e.g. numpad keys = durations).

Benefits:
- Makes Loopy Pro more accessible to users without MIDI keyboards.
- Encourages experimentation and musical input via unconventional interfaces (e.g. laptop keyboards, numpads).
- Empowers users to compose in different tonalities with ease.

Examples:
- A user assigns the numpad keys to minor-scale notes and uses 0, . and Enter for note durations.
- A performer maps QWERTY keys to a pentatonic scale for live improvisation.

This summary was automatically generated by ChatGPT-4 on 2025-06-05.

Original Post: Musical typing is great, but limited to a major scale on specific keys/buttons. Would it be possible to define which keys you want to use and for what scales? (Speaking as a "minor" musician toying with a numpad as a controller: it's a silly interface, but with a more adjustable version of musical typing you would get notes with durations!)
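A configurable key-to-scale mapping like the one requested could look like this sketch. The semitone step tables are standard music theory; the key strings, root note (60 = middle C in MIDI), and function name are illustrative assumptions, not Loopy Pro's actual layout.

```python
# Scale degrees as semitone offsets from the root.
SCALES = {
    "major":      [0, 2, 4, 5, 7, 9, 11],
    "minor":      [0, 2, 3, 5, 7, 8, 10],
    "pentatonic": [0, 3, 5, 7, 10],  # minor pentatonic
}

def build_key_map(keys, scale, root=60):
    """Assign successive scale degrees to each key, wrapping up an octave
    when the scale runs out, so any row of keys (QWERTY, numpad, ...) can
    play any scale."""
    steps = SCALES[scale]
    mapping = {}
    for i, key in enumerate(keys):
        octave, degree = divmod(i, len(steps))
        mapping[key] = root + 12 * octave + steps[degree]
    return mapping

print(build_key_map("asdfg", "minor"))
# → {'a': 60, 's': 62, 'd': 63, 'f': 65, 'g': 67}
```

Note durations (the numpad idea in the post) would simply be a second mapping from keys to lengths, consulted alongside this one.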
1 · under review

Option to Enable True Solo Mode Instead of Pseudo-Solo
Description: Introduce a toggleable "True Solo" mode in Loopy Pro that fully mutes all non-soloed channels, unlike the current behavior, which only pseudo-solos clips within active groups.

Problem: The current "solo" implementation in Loopy Pro doesn't behave as expected for users coming from traditional DAW workflows. Soloing a clip or channel does not mute all other audio sources; it instead interacts with Loopy Pro's play group system and can allow other, unintended audio to remain audible.

Proposed Solution:
- Add an optional "True Solo" mode toggle in the app or per project.
- In True Solo mode, soloing a clip or channel automatically mutes all other clips, play groups, or channels that are not soloed.
- Ensure the solo logic applies consistently across all types of channels and clips, including one-shots, loopers, and MIDI clips.
- Allow MIDI/OSC actions to toggle this setting.

Benefits:
- Provides expected and consistent behavior for users with DAW experience.
- Enables more precise control during live performance or recording.
- Reduces confusion and increases trust in soloing behavior.
- Helps in isolating elements for troubleshooting, mix checks, and creative control.

This summary was automatically generated by ChatGPT-4 on 2025-06-07.

Original Post: An option to have actual Solo in 2.0. It is by design that when you solo a track in 2.0, it only mutes other tracks of that track's type. So if, for example, a project has two audio units, two color groups, two busses, two MIDI inputs, two hardware inputs, and two inter-app inputs, soloing any track would mute only one other track: the one of the same channel type. This is obviously not solo, it's counterintuitive, and it requires significant workarounds when you need actual solo. Options for this behaviour would therefore be very welcome.
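The difference between the two behaviors can be sketched as set logic. The per-type behavior below is my reading of the original post, and the channel names and types are hypothetical.

```python
def audible_channels(channels, soloed, true_solo):
    """channels: {name: channel_type}; soloed: set of soloed names.
    True Solo keeps only soloed channels audible; the current per-type
    behavior (as described in the post) mutes only non-soloed channels
    that share a type with a soloed one."""
    if not soloed:
        return set(channels)       # nothing soloed: everything plays
    if true_solo:
        return set(soloed)         # proposed: only soloed channels play
    solo_types = {channels[name] for name in soloed}
    return {name for name, ctype in channels.items()
            if name in soloed or ctype not in solo_types}

project = {"au1": "audio unit", "au2": "audio unit",
           "bus1": "bus", "bus2": "bus"}
print(sorted(audible_channels(project, {"au1"}, False)))  # → ['au1', 'bus1', 'bus2']
print(sorted(audible_channels(project, {"au1"}, True)))   # → ['au1']
```

With the current behavior, soloing `au1` silences only `au2`; True Solo silences everything except `au1`, which is what DAW users expect.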
1 · under review

Direct Parameter Targeting for XY Pad via Button or Stepper Dial Widgets
Description
Currently, Button and Stepper Dial Widgets can trigger specific values on an XY Pad widget (e.g. X: 55%, Y: 80%). This is useful for recalling XY positions but lacks the precision required when the XY Pad controls specific AUv3 parameters or internal Loopy Pro parameters.

Problem
The current percentage-based control is not ideal when an XY Pad axis is mapped to multiple parameters. For example, if both "Reverb Decay" and "Delay Time" are mapped to the Y axis, setting "Y = 80%" affects both, without the ability to precisely target one of them. Furthermore, users can't define explicit parameter values like "1.2 s Delay Time" or "50% Reverb".

Proposed Solution
- Allow both Button Widgets and Stepper Dial Widgets to target specific parameters controlled by the XY Pad.
- Instead of assigning "X = 55%" or "Y = 80%", users should be able to select a concrete parameter (e.g. "Delay Time") and set its precise value.
- If multiple parameters are mapped to the same axis, users should choose the exact parameter to target.
- Setting one parameter will internally adjust the XY Pad position accordingly, and co-mapped parameters will update as a result, which is expected behavior.

Benefits
- Enables precise, intentional parameter control via Button and Stepper Dial Widgets
- Better support for performance setups involving AUv3 plugins
- More predictable and accurate interaction between widgets and XY Pad mappings
- Unlocks the full potential of parameter-based modulation without relying on percentage estimates

Examples
- Button A: set "Delay Time" (mapped to the X axis of XY Pad 1) to 0.5 s
- Stepper Dial: adjust "Reverb Decay" (mapped to the Y axis of XY Pad 1) between 0.3–1.2 s in steps

→ Loopy Pro internally calculates the XY Pad position and applies the new values, including updating any linked parameters.

This summary was automatically generated by ChatGPT-4 on 2025-06-05.
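The inverse mapping the proposal relies on can be sketched in a few lines, assuming each parameter has a linear range mapped onto one XY Pad axis. The parameter names and ranges are illustrative, taken from the examples above; the function is a sketch, not Loopy Pro's implementation.

```python
def set_parameter(mappings, axis_positions, target, value):
    """mappings: {param: (axis, lo, hi)}. Setting one parameter moves its
    axis to the matching normalized position; every co-mapped parameter
    on that axis then updates from the new position (the expected
    behavior described above). Returns the resulting parameter values."""
    axis, lo, hi = mappings[target]
    axis_positions[axis] = (value - lo) / (hi - lo)  # normalize to 0..1
    return {p: plo + axis_positions[a] * (phi - plo)
            for p, (a, plo, phi) in mappings.items() if a == axis}

# Both parameters share the Y axis, as in the Problem section.
maps = {"Delay Time": ("y", 0.0, 2.0), "Reverb Decay": ("y", 0.3, 1.2)}
pos = {"x": 0.0, "y": 0.0}
print(set_parameter(maps, pos, "Delay Time", 1.2))
# y moves to 0.6; Reverb Decay follows to ~0.84
```

This shows why targeting a parameter is well defined even with co-mapped parameters: the axis position is solved from the one you set, and the others follow deterministically.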
1 · under review

Zoom Factor Control for Sequencer Timeline
Description: Introduce new actions to control the horizontal zoom factor (e.g. 0.5x, 1x, 2x, 4x, 8x) of the sequencer timeline using widgets such as buttons, dials, faders, or knobs.

Problem: Currently, there's no way to programmatically control the sequencer timeline's horizontal scale; users are limited to pinch gestures or manual UI interaction. In live performance or controller-based setups, there's a strong need to control the zoom factor hands-free or via external hardware.

Proposed Solution: Add the following widget actions to control the sequencer's horizontal zoom factor:
- Set Specific Zoom Factor: use buttons or stepper dials to jump directly to a defined zoom factor (e.g. 1x, 2x, 8x).
- Control Zoom Factor via Range Mapping: use knobs or faders to adjust the zoom factor smoothly across a defined range, for example mapping 0%–100% of a fader or knob to zoom factors between 0.5x and 8x.

These values should correspond to the existing visual scaling behavior of the sequencer timeline and allow intuitive, musically aligned navigation.

Benefits:
- Enables precise control of timeline scale during live performance or editing.
- Makes Loopy Pro more accessible in touchless or controller-driven environments.
- Improves efficiency when navigating complex sequences.
- Enhances integration with external MIDI controllers and custom UIs.

Examples:
- A knob widget ranges from 0.5x (overview) to 8x (detailed view), giving smooth control over the sequencer's horizontal scale.
- A button instantly resets the view to 1x for a centered editing workflow.
- A foot-controller fader lets a performer zoom in on the fly without touching the screen.

This summary was automatically generated by ChatGPT-4 on 2025-06-05.
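The two proposed actions can be sketched as follows. The exponential response curve is my own assumption (it makes equal fader travel multiply the zoom evenly, where a linear map would crowd the low factors); the function names are illustrative, and the step list mirrors the factors named above.

```python
def fader_to_zoom(t, z_min=0.5, z_max=8.0):
    """Map a normalized fader position t in 0..1 to a zoom factor,
    exponentially between z_min and z_max."""
    return z_min * (z_max / z_min) ** t

ZOOM_STEPS = [0.5, 1.0, 2.0, 4.0, 8.0]

def snap_zoom(zoom):
    """Snap a continuous zoom factor to the nearest discrete step, for
    button-style 'Set Specific Zoom Factor' behavior."""
    return min(ZOOM_STEPS, key=lambda s: abs(s - zoom))

print(fader_to_zoom(0.0), fader_to_zoom(0.5), fader_to_zoom(1.0))
# → 0.5 2.0 8.0
print(snap_zoom(fader_to_zoom(0.8)))  # ~4.59x snaps to the 4.0 step
```

With this curve, the midpoint of the fader lands on 2x rather than the visually lopsided 4.25x a linear map would give.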
1 · under review
