Feature Requests

Action to Record to Next Clip in a Color
Description: This feature request proposes an action within Loopy Pro that lets users record directly into the next available clip of a specified color. The goal is to streamline the recording process, especially during live performances, by allowing efficient, organized clip management based on color coding.

Problems:
– Limited Visual Feedback: On certain MIDI controllers, such as the APC40, it can be hard to distinguish clips that contain content from empty ones, because the brightness levels are similar. This ambiguity can lead to unintentionally recording over existing clips.
– Workflow Inefficiency: Currently, users must manually select the next available clip for recording, which is time-consuming and error-prone, particularly in complex projects with multiple colors and pages.
– Limited MIDI Controller Integration: Users with MIDI foot controllers that have only a few buttons cannot assign a dedicated control to every clip, restricting their ability to manage recordings efficiently.

Proposed Solution:
– "Record to Next Clip in Color" Action: Introduce an action that, when triggered, records into the next available (empty) clip of a specified color. The action can be assigned to on-screen buttons or MIDI controller inputs (see the sketch below).
– Color-Based Clip Selection: Allow users to define the color context for the action, ensuring that recordings are directed to the appropriate set of clips.
– Integration with Existing Systems: Ensure compatibility with Loopy Pro's current action and MIDI control systems, so the feature fits into existing workflows.

Examples:
– APC40 Integration: A user assigns the "Record to Next Clip in Color" action to the buttons beneath each column on the APC40. Pressing a button records into the next available clip of the corresponding color, simplifying recording during live performances.
– Multi-Page Template Management: A user has a template with multiple pages, each containing clips of different colors. By assigning the action to color-coded buttons on a dedicated page, the user can record into the appropriate clips across pages without navigating through each one manually.
– MIDI Foot Controller Usage: A performer with a MIDI foot controller assigns the action to a limited number of footswitches, allowing hands-free recording into designated clip colors during live performance.

Benefits:
– Enhanced Live Performance Flexibility: Performers can manage recordings efficiently, reducing the risk of overwriting existing clips and minimizing manual navigation.
– Improved Workflow Efficiency: Automating clip selection based on color streamlines recording, saving time and reducing errors.
– Expanded MIDI Controller Integration: Users with limited-button MIDI controllers can manage recordings effectively, maximizing the utility of their hardware.

This summary was automatically generated by ChatGPT-4 on 2025-05-09.
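
To illustrate the selection logic this action implies, here is a minimal Swift sketch. The `Clip` type and its `color`/`isEmpty` properties are hypothetical stand-ins, not Loopy Pro's actual data model or API:

```swift
import Foundation

// Hypothetical clip model; Loopy Pro's real data model is not public.
struct Clip {
    let id: UUID
    let color: String      // e.g. "red", "blue" (or a color-group identifier)
    var isEmpty: Bool      // true if nothing has been recorded into the clip yet
}

// Return the first empty clip of the requested color, scanning clips in
// their on-screen order (the order of the array).
func nextEmptyClip(ofColor color: String, in clips: [Clip]) -> Clip? {
    clips.first { $0.color == color && $0.isEmpty }
}

// Illustrative action handler: pick the next empty clip of a color and
// start recording into it, or do nothing if every clip of that color is full.
func recordToNextClip(ofColor color: String, in clips: [Clip]) {
    guard let target = nextEmptyClip(ofColor: color, in: clips) else {
        print("No empty \(color) clip available")
        return
    }
    print("Recording into clip \(target.id)")
    // ...start recording on `target` here...
}
```

The key point is that the action needs only a color and the current clip layout to decide where to record, which is why it maps well onto a single button or footswitch.
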
7 · planned

Smart Text Fields & Placeholders for Dynamic Widget Displays (Text, Buttons, Faders, etc.)
Problem: Text in Loopy Pro widgets is static – there's no way to display live values such as AUv3 or MIDI data, or to show dynamic status feedback. Users can't show the value of a fader or knob, nor adapt text visibility, size, or content dynamically.

Proposed Solution:
– Text binding to controls: let text widgets display live MIDI values from bound faders or knobs
– Placeholder system (a minimal substitution sketch follows below) with support for:
  - $project.name$, $bar.current$, $channel_name_1$, $AUv3_post_3$
  - Current/total clip layers, named layers
  - Clip colors, groups, AUv3 plugin parameters
– User-defined labels for value ranges, e.g. 30–155 Hz = "Rumble", 1000–3000 Hz = "Medium"
– Conditional formatting & visibility:
  - Show the exact value only while a knob/fader is touched
  - After a delay, return to the label display (e.g. the range name)
– Text resizing logic:
  - Auto-scale font size with widget/canvas changes
  - Manual ± font size per widget
– Formula-based expressions, like SWITCH(...)
– Text changes via actions, e.g. "Set label to: VERSE"

Benefit:
– Real-time, intelligent feedback in the UI itself
– No need for extra screens or MIDI displays
– Clear overview of dynamic values, actions, and playback states
– Great for live performance, section navigation, parameter feedback, and more
– Customizable, responsive interface that adapts to user interaction

Extension: Define Dynamic Behavior for Each Placeholder
Ideally, each placeholder should have its own display logic – controlling when and how it appears. Examples:
– Always visible (default)
– Only visible during interaction, e.g. while a fader or knob is being moved
– Visible for a short time after interaction, e.g. 1 second after release
– Conditionally visible, e.g. only when a clip is playing or a value is within a certain range
– Toggle visibility via actions or logic

This would allow highly adaptive, context-aware UIs – showing the right info at the right moment without visual overload. It would give Loopy Pro a unique edge as a flexible, user-definable performance environment.

This summary was automatically generated by ChatGPT-4 on April 30, 2025.
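
As a rough illustration of how such a placeholder system could substitute live values into widget text, here is a small Swift sketch. The placeholder names come from the request above; the `renderLabel` function and the resolver closure are hypothetical, not part of Loopy Pro:

```swift
// Hypothetical resolver: maps a placeholder key (e.g. "project.name",
// "bar.current") to its current live value, or nil if unknown.
typealias PlaceholderResolver = (String) -> String?

// Replace every `$key$` token in `template` with the resolver's value,
// leaving unknown or unmatched tokens untouched.
func renderLabel(_ template: String, using resolve: PlaceholderResolver) -> String {
    var output = ""
    var remainder = Substring(template)
    while let open = remainder.firstIndex(of: "$") {
        output += remainder[..<open]                    // plain text before the token
        let keyStart = remainder.index(after: open)
        guard let close = remainder[keyStart...].firstIndex(of: "$") else {
            return output + remainder[open...]          // unmatched "$": keep as-is
        }
        let key = String(remainder[keyStart..<close])
        output += resolve(key) ?? "$\(key)$"            // unknown key: keep the token
        remainder = remainder[remainder.index(after: close)...]
    }
    return output + remainder
}

// Usage with made-up live values:
let label = renderLabel("Bar $bar.current$ of $project.name$") { key in
    ["bar.current": "17", "project.name": "Live Set"][key]
}
// label == "Bar 17 of Live Set"
```

The same resolver idea would extend naturally to the conditional-visibility rules in the extension: the widget simply re-renders its template whenever a bound value changes or an interaction starts or ends.
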
59 · planned

Loop Scratching with External Gear via MTC or Other Sync Protocols
Description: Enable external gear (e.g. turntables, jog wheels, timecode-based controllers) to control clip or loop playback position in Loopy Pro in a "scratching" or scrubbing style, using MTC (MIDI Time Code) or other suitable sync protocols.

Problem: Currently, Loopy Pro does not allow real-time, granular manipulation of a clip's playback position via external hardware. For DJs, turntablists, or experimental performers, this prevents techniques like scratching, scrubbing, or fast cue jumps from being integrated into a Loopy Pro workflow. MTC or other sync-capable hardware sends position data, but Loopy Pro doesn't interpret it in a way that enables this type of hands-on control.

Proposed Solution: Implement support for external playback position control via:
– MTC (MIDI Time Code) or other transport/location protocols (see the decoding sketch below)
– Possibly HID jog wheels or other pitch/position-sensitive input devices
– Binding to loop/cue position, allowing external manipulation of timeline playback in real time
– Smooth integration with Loopy's timebase and clip grid system

Benefits:
– Enables DJ-style live manipulation of loops
– Supports hybrid setups combining Loopy Pro with turntables or media players
– Expands creative use cases like live remixing, scratching, or spontaneous rearrangement
– Bridges the gap between traditional DJ workflows and live looping environments

Examples:
– Using a DVS setup to scratch a vocal loop in Loopy Pro
– Jog wheel control of a loop's playhead for beat juggling
– Real-time repositioning of a clip's timeline from external hardware during performance

This summary was automatically generated by ChatGPT-4 on 2025-05-17.
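
For context on the MTC side, here is a minimal Swift sketch that decodes an MTC "full message" (the SysEx form F0 7F dev 01 01 hh mm ss ff F7) into a position in seconds. It only shows how incoming position data could be interpreted; how that position would be mapped onto a clip's playhead is left open:

```swift
// Decode an MTC full-frame SysEx message into a playback position in seconds.
// Expected layout: F0 7F <device> 01 01 hh mm ss ff F7, where the top two
// bits of the hours byte encode the frame rate.
func mtcFullMessageToSeconds(_ bytes: [UInt8]) -> Double? {
    guard bytes.count == 10,
          bytes[0] == 0xF0, bytes[1] == 0x7F,
          bytes[3] == 0x01, bytes[4] == 0x01,
          bytes[9] == 0xF7 else { return nil }

    let hoursByte = bytes[5]
    let rateCode  = (hoursByte >> 5) & 0x03            // 0 = 24, 1 = 25, 2 = 29.97 drop, 3 = 30 fps
    let fps: Double = [24.0, 25.0, 29.97, 30.0][Int(rateCode)]

    let hours   = Double(hoursByte & 0x1F)
    let minutes = Double(bytes[6])
    let seconds = Double(bytes[7])
    let frames  = Double(bytes[8])

    return hours * 3600 + minutes * 60 + seconds + frames / fps
}

// Usage: 00:00:02 and 12 frames at 25 fps -> 2.48 s
let position = mtcFullMessageToSeconds([0xF0, 0x7F, 0x7F, 0x01, 0x01,
                                        0x20, 0x00, 0x02, 0x0C, 0xF7])
```

A real implementation would also need to assemble MTC's quarter-frame messages, which deliver the same timecode a nibble at a time during continuous playback.
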
6 · planned

Automation Tracks in Sequencer – Including MIDI CC Recording, Drawing, and Timed Actions
Description: Introduce automation tracks in the Loopy Pro sequencer that allow both manual drawing and real-time recording of MIDI CC data, including from external MIDI controllers or hardware synthesizers. Additionally, enable time-based automation triggers that can be set to occur after a certain number of bars in the arrangement.

Problem: Currently, there is no way to record MIDI CC data over time or draw automation lines directly inside the sequencer. This makes it hard to automate effect changes, synth parameter modulations, or scene transitions with precision. Users must rely on dummy clips or static widgets to emulate automation behavior.

Proposed Solution:
– Add automation lanes/tracks to the sequencer view
– Record incoming MIDI CC data in real time, including from hardware synths
– Option to automatically record any incoming CC without manual mapping (like Logic or Ableton)
– Allow drawing and editing of automation curves and values, e.g. volume, FX wet/dry, AUv3 parameters (see the sketch below)
– Visual representation of automation values over time
– Automations can target AUv3, internal effects, or external MIDI destinations
– Add timed Follow Actions:
  • Trigger any action after X bars have passed
  • Examples: unmute audio source X after 8 bars, disable effect Y after 16 bars
– Allow assigning actions based on sequencer bar position (e.g. "do X on bar 129")

Community Input:
– Users want to "tweak knobs" live and have the automation recorded and played back
– Drawing value lines would allow better control than dummy clips
– Automation should work like in Ableton or Logic on iPad
– BeatMaker was mentioned negatively due to its manual mapping requirement
– Ideal for both clip automation and long-form arrangement control

Benefits:
– Complete automation workflow for effects, plugins, and MIDI
– Captures live synth or controller tweaks without setup hassle
– Enables dynamic, evolving arrangements and transitions
– Great for hybrid setups (hardware + iPad)
– Significantly improves sequencing, especially for live performances and structured song-building

This summary was automatically generated by ChatGPT-4 on April 30, 2025.
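
To make the recording/drawing idea concrete, here is a small Swift sketch of an automation lane that stores timestamped CC points and evaluates them by linear interpolation. The types and names are hypothetical, not Loopy Pro internals:

```swift
// One recorded (or drawn) automation point: a CC value at a position in beats.
struct AutomationPoint {
    let beat: Double    // position in the sequence, in beats
    let value: Int      // MIDI CC value, 0...127
}

// A single automation lane targeting one CC number.
struct AutomationLane {
    let ccNumber: UInt8
    var points: [AutomationPoint] = []

    // Real-time recording: append an incoming CC value at the current beat.
    mutating func record(value: Int, atBeat beat: Double) {
        points.append(AutomationPoint(beat: beat, value: value))
        points.sort { $0.beat < $1.beat }
    }

    // Playback: linearly interpolate the value at an arbitrary beat position,
    // which is also what a hand-drawn line segment would evaluate to.
    func value(atBeat beat: Double) -> Int? {
        guard let first = points.first, let last = points.last else { return nil }
        if beat <= first.beat { return first.value }
        if beat >= last.beat { return last.value }
        for (a, b) in zip(points, points.dropFirst()) where beat >= a.beat && beat <= b.beat {
            let span = b.beat - a.beat
            let t = span > 0 ? (beat - a.beat) / span : 1
            return Int((Double(a.value) + t * Double(b.value - a.value)).rounded())
        }
        return last.value
    }
}

// Usage: record a filter sweep and read it back midway.
var lane = AutomationLane(ccNumber: 74)
lane.record(value: 0, atBeat: 0)
lane.record(value: 127, atBeat: 32)     // bar 9 in 4/4
let midway = lane.value(atBeat: 16)     // ≈ 64, halfway through the sweep
```

The timed Follow Actions described above could reuse the same beat-based addressing: an action is effectively a point on a lane that fires when the playhead crosses its bar position.
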
9 · planned

AUv3 Clips (Dedicated Plugin-Based Clip Type)
Description: Introduce a new type of clip: AUv3 Clips. These would function similarly to Audio and MIDI Clips, but be directly tied to AUv3 plugins. Each clip could instantiate and control its own AUv3 plugin instance with a specific state, allowing highly modular, per-clip plugin behavior.

Problem: Currently, AUv3 plugins must be assigned at the channel level, limiting flexibility. If a user wants a specific plugin to behave differently across different clips, they must use multiple channels, which becomes inefficient and cluttered. There is no way to encapsulate an AUv3 plugin's state within the clip itself.

Proposed Solution:
– Add a third clip type: AUv3 Clip (a minimal data-model sketch follows below)
– Each AUv3 Clip can load and store its own AUv3 plugin instance and state
– Clips can load different plugins, presets, or plugin configurations
– Include per-clip plugin automation and recall
– Allow AUv3 Clips to be used in the sequencer and timeline like any other clip
– Support seamless switching between clips with different AUv3s, without audio dropouts
– Provide options to pre-load plugins or load on activation to optimize performance

Benefits:
✅ Enables highly modular setups with clip-specific sound sources
✅ Allows radically different processing chains per clip without extra channels
✅ Makes Loopy Pro more DAW-like while staying loop-oriented
✅ Ideal for creative sound design, plugin chaining, and live performance setups
✅ Simplifies session organization and expands the creative potential of the clip model

This summary was automatically generated by ChatGPT-4 on April 30, 2025.
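
As a rough sketch of what such a clip could store, here is a Swift example built on Apple's AUv3 APIs (AVAudioUnit.instantiate and AUAudioUnit.fullState). The `AUv3Clip` type itself is purely illustrative, not Loopy Pro's data model:

```swift
import AVFoundation

// Hypothetical "AUv3 Clip": alongside normal clip data it stores which plugin
// to load and that plugin's saved state, so each clip recalls its own instance.
struct AUv3Clip {
    var name: String
    var component: AudioComponentDescription   // which AUv3 plugin to instantiate
    var savedState: [String: Any]?             // per-clip preset / parameter state
}

// Instantiate the clip's plugin and restore its stored state. Doing this ahead
// of time (pre-loading) is one of the options the request mentions for
// switching between clips without audio dropouts.
func loadPlugin(for clip: AUv3Clip, completion: @escaping (AVAudioUnit?) -> Void) {
    AVAudioUnit.instantiate(with: clip.component, options: []) { unit, error in
        guard let unit = unit, error == nil else {
            completion(nil)
            return
        }
        if let state = clip.savedState {
            unit.auAudioUnit.fullState = state     // restore the clip-specific state
        }
        completion(unit)
    }
}
```

Saving would be the reverse: read `auAudioUnit.fullState` back into the clip's `savedState` before the plugin instance is released.
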
5 · planned

Rename Busses and Add Optional Descriptive Notes per Bus
Description: Allow users to rename audio busses and optionally add visible notes or hints describing their function or purpose.

Problem: Currently, busses in Loopy Pro are automatically named with letters (A, B, C...) upon creation, and users are unable to rename them. This makes it difficult to organize and recall bus functions, especially in complex projects with many busses. The default visual representation is also hard to distinguish from color groups, making the UI harder to parse quickly.

Proposed Solution:
– Allow users to rename busses at any time (a small naming sketch follows below)
– Display the name prominently (e.g. at the top of the bus or next to the letter)
– Optional: restore the default letter-based name (e.g. "A") if the custom label is deleted
– Add support for a "hint" or "notes" field for each bus (shown e.g. on hover or toggle)
– Consider visual improvements to further differentiate busses from color groups
– Optional: provide a UI dialog showing an overview of a bus's routing and settings, including name, balance, etc.

Benefits:
– Greatly improves clarity in complex routing setups
– Reduces the need for external "cheat sheets"
– Helps identify and manage 24+ busses more efficiently
– Enhances UI and UX, especially for live performance or sound design
– Makes Loopy Pro more scalable and user-friendly for professional workflows

This summary was automatically generated by ChatGPT-4 on April 30, 2025.
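
A tiny Swift sketch of the naming rule described above (custom label if set, otherwise the automatic letter). The `Bus` type and its fields are purely illustrative:

```swift
// Illustrative bus model: a custom label plus an optional notes/hint field.
struct Bus {
    let index: Int           // 0-based creation order
    var customName: String?  // nil or empty means "no custom label set"
    var notes: String?       // optional hint describing the bus's purpose

    // Display rule proposed above: show the custom label when present,
    // otherwise fall back to the default letter name ("A", "B", "C", ...).
    var displayName: String {
        if let name = customName, !name.isEmpty { return name }
        return String(Character(UnicodeScalar(UInt8(65 + index % 26))))
    }
}

// Deleting the custom label restores the default letter.
var bus = Bus(index: 2, customName: "Drums FX", notes: "Parallel reverb for drums")
print(bus.displayName)   // "Drums FX"
bus.customName = nil
print(bus.displayName)   // "C"
```
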
9 · planned
