
Atom | Piano Roll by Victor Porof

Victor Porof released Atom | Piano Roll, a new AUv3 MIDI piano roll sequencer. It arrives with a lot of tools for detailed sequencing control, such as non-destructive quantization and dynamic stretching. Reader stub will be pleased to know that it includes support for weird time signatures. Atom | Piano Roll is the first in a series of Atom apps that Victor plans to bring to iOS!

Atom | Piano Roll iTunes Description:

Piano Roll is a sequencer for iOS, and the first in our Atom series of modular sequencer plugins. Record, replay, loop and manipulate MIDI notes anywhere.

The MIDI audio unit can be loaded in compatible hosts (AUM, ApeMatrix, Audiobus, BeatMaker 3, Cubasis 2, etc.). Make sure your host is capable of loading, transmitting to, and receiving MIDI from audio unit MIDI plugins.

It’s intended to augment existing MIDI processing capabilities, or to provide sequencing in hosts that don’t offer it out of the box.

Features:

1) Loop record
Grow your ideas over any number of bars. Overdub drum loops, layer complex melodies, or even record an entire track. Arm recording in multiple instances, in the background, receiving from hardware or even other audio units. Piano Roll is always sample accurately synced with the host.

2) Step input
Quickly create sequences. Insert notes respecting grid length, and resize them to any duration. Layer chords in one operation. Just send the notes, Piano Roll takes care of the rest.

3) Extensive editing tools
All the standard editing tools you are used to are implemented: a pen for entering notes at any length, a brush for painting continuous notes at grid size, and scissors for slicing.

Advanced tools allow you to be productive. Grow notes backwards or forwards. Group notes with additive selection. Select rows or chords in one operation. Time and pitch flip. Velocity editing is precise and unobtrusive. With Piano Roll, anything you need is a tap, hold, or swipe away.

4) Non-destructive quantize
Quantised or not? You decide when, and how much! Enable full quantisation beforehand with grid snapping. Turn it back off, or adjust the strength afterwards on an already recorded sequence. Choose the quantisation percentage for note starts, note ends and even note durations. In Piano Roll, you have full control over timing precision.

5) Dynamic stretching
Stretch a clip from 4 bars to 8. Fix the note lengths or extend them in harmony. In Piano Roll, your sequences are elastic.

6) Advanced time signatures
Does 4/4 sound boring to you? Create complex polyrhythms by choosing any timing. From 3/8 to 11/8. From 2/1 to 17/16. Piano Roll lets you invent your own rhythm.

7) Advanced tempo settings
Negative BPM? No problem. Pick a tempo for every clip and play backwards or forwards. Multiply host tempo or choose something arbitrary, all with extreme accuracy. Piano Roll gives you control over tempo like never before.

8) Scales
Never go outside a scale, or go chromatic and add spice to the melody. With scale overlays, your music isn’t forced to sound a predetermined way, and you aren’t forced to be a music theory genius to sound the way you want. Piano Roll gives you the ability to decide which key to be in, and when.

9) Visibility layers
Aid composition by overlaying clips from other plugin instances over the one you’re editing. Make sure your chords are in harmony with the melody, visually. Piano Roll helps you focus where you need, when you need it.

10) State saving
Your sequences are saved alongside the project you already work with, and they’re loaded back automatically when you open the project. Clips can be saved as host presets. Piano Roll is in perfect harmony with the host you already use.
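The strength-based quantize described in the feature list boils down to interpolating between a note's recorded position and the nearest grid line. Here is a minimal sketch of that general technique; this is generic MIDI math, not Atom | Piano Roll's actual implementation:

```python
# Illustrative sketch of strength-based (partial) quantization.
# Generic MIDI math; not Atom | Piano Roll's actual code.

def quantize(position_beats, grid=0.25, strength=1.0):
    """Move a note start toward the nearest grid line.
    strength=1.0 snaps fully; 0.0 leaves the note untouched."""
    nearest = round(position_beats / grid) * grid
    return position_beats + strength * (nearest - position_beats)

# A note played slightly late, at beat 1.07, on a 16th-note grid:
print(round(quantize(1.07), 6))                # 1.0   (100% strength)
print(round(quantize(1.07, strength=0.5), 6))  # 1.035 (50% strength)
```

The same interpolation applies to note ends and durations; only the value being nudged changes.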

Reader echo opera has already put Atom | Piano Roll to work on a track. There are a whole lot of tutorials on the AudioVeek YouTube channel. Embedded here is an overview.

Reader Comments 17

Can this be used as a standalone instrument (like IAA) with independent routing, or is it just another "needs a host to run" app?
Thx:)
April 09, 2019  | person_outline Spaced invader
Needs a host.
Bit over hyped imho.
April 09, 2019  | person_outline Bianca
It isn't an instrument. It is a sequencer.
@Tim Webb knows me well enough by now. :D

@Bianca That description didn't seem very hype-y to me.

It is nice to see the scale tempo function, which is a nice way to deal with the relationship between the BPM of the host's project and the kind of time signature you really want to create in Atom. It could mean that you could have something like an odd tuplet and still use a standard 8th or 16th snap grid. Since it only goes to two decimal places of precision, I don't know how long it would go before drifting out of sync with the master tempo. For example, if I want it scaled to 133.33%, how many bars before it's off?

Also nice to see negative tempo. I never thought of it that way, but that is a clever way to initiate backwards playback.
April 09, 2019  | person stub
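stub's drift question above can be estimated with plain arithmetic. Assuming the app stores the scale as exactly 133.33% while the musically intended ratio is 4/3 (an editorial sketch; Atom's actual internals may differ), the offset grows linearly with elapsed host time:

```python
# Editorial sketch: estimate drift when a tempo-scale ratio is rounded
# to two decimal places of a percentage (133.33% instead of 4/3).
# Plain arithmetic; this does not reflect Atom's actual internals.

def drift_ms_after(bars, bpm=120.0, beats_per_bar=4,
                   true_ratio=4/3, stored_ratio=1.3333):
    """Milliseconds of offset between the stored-ratio clip and the
    ideally scaled clip after `bars` bars of host time."""
    beat_sec = 60.0 / bpm                      # host beat duration
    elapsed = bars * beats_per_bar * beat_sec  # host seconds elapsed
    # Positions under the two ratios diverge by the ratio error.
    return abs(true_ratio - stored_ratio) * elapsed * 1000.0

# At 120 BPM in 4/4, the rounding error (1/30000 per host second)
# accumulates slowly: roughly 0.067 ms per bar.
print(round(drift_ms_after(1), 3))    # 0.067 ms after one bar
print(round(drift_ms_after(150), 1))  # 10.0 ms after 150 bars
```

Under these assumptions it takes on the order of 150 bars to drift 10 ms, so the two-decimal limit matters mostly for very long loops.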
Yeah, i know what a sequencer is after 20+ years of audio work, thx;)
would be nice as a stand-alone with midi routing capabilities... Would have gotten it right away. - and will if the dev adds stand-alone capability...
"Needs a host" doesn't turn me on for this at this time;)
April 09, 2019  | person_outline Spaced invader
That overlay is very nice and very helpful in composition. Nothing else on iOS I know of like that. That is very appealing. There is velocity control but no other CC control? I would need that to really want this. This is a good addition though towards the full modular iOS DAW concept.
Taking a broader perspective, this is a key piece in an important puzzle, as iOS musicking is shaping up into something quite interesting.
Sure, this is a way to add DAW-like features to AUv3 hosts like AUM. But the workflow is actually pretty different from traditional DAWs and gets a bit closer to some of the things people do in modular synths or in Pure Data and Max.
Some of these could become crazy while remaining quite manageable. For instance, you could feed Riffer into Rozeta Scaler, send that to Atom Piano Roll, match that with a Rozeta Bassline sent to another Atom Piano Roll instance, trigger full MPE sequences from Photon, modulate diverse effects through CC from FAC Envolver…

In AUM (or apeMatrix, or Audiobus…), it’s not really about maintaining a single timeline or a consistent signal chain going to a “Master” bus. It’s not necessarily about triggering clips either. It’s about chaining instruments with audio and MIDI effects in ways which make sense in a certain context and having direct control of several things at once through multitouch gestures. Again, closer to a “patcher” than to Reaper or Live or Pro Tools or Logic Pro X or Reason…

Not to say that it’s unique. But there’s something emerging which is different from “Golden Age of the DAW”. The differences might sound subtle, now. But we have a tendency to take a lot of things for granted once we’ve made them part of our routines. Which is partly why it’s so fascinating to notice the little things which amount to a significant, qualitative difference.
On April 09, 2019 - @Enkerli said:
For instance, you could feed Riffer into Rozeta Scaler, send that to Atom Piano Roll, match that with a Rozeta Bassline sent to another Atom Piano Roll instance, trigger full MPE sequences from Photon, modulate diverse effects through CC from FAC Envolver…
I recall a video of someone using Senode and Sonogrid together featured on this blog that was fascinating in its combination of non-linear interactions, similar to what you allude to here. Unfortunately I can't see what running Riffer or Bassline into Piano Roll would give beyond a way to capture their output in a static, non-interactive way. I like the possibilities implied, and perhaps a demo of those inspirational app-combining ideas would benefit us all.
On April 09, 2019 - @Enkerli said:
Not to say that it’s unique. But there’s something emerging which is different from “Golden Age of the DAW”. The differences might sound subtle, now. But we have a tendency to take a lot of things for granted once we’ve made them part of our routines. Which is partly why it’s so fascinating to notice the little things which amount to a significant, qualitative difference.

Fully agree here! It's something which I keep on highlighting as it is indeed subtly special, but often dismissed with a casual "Oh but I can already do that with Ableton on my desktop" (which is missing the point).
April 10, 2019  | person_outline Bram Bos
@Enkerli @Bram Bos @Laarz

I've re-read this thread a few times. I'm a hard-core traditional DAW fan (MOTU DP all-the-way), and have appreciated MultiTrackStudio on iOS for its strong DAW functionality.

I'm trying to wrap my head around this subtle difference that is "emerging". Is it as Laarz and Enkerli mentioned that it is a more non-linear, modular approach?

Like the traditional DAW, the iOS world has two main aspects of creation: 1. a content production aspect (i.e., composition and user-influenced content generation), and 2. a sound production aspect (samplers, synths, noise, effects). Traditional DAWs can do automated content generation but don't focus on it (several plugins do that kind of thing; Numerology is a great example), yet they make integrated composition and sound production very streamlined.

I'm getting that in iOS you can assemble various AUM-interlinked apps, and/or host/AUv3 elements and have them routed in various ways. But I'm not clear on what can be done in iOS that can't be done in a traditional DAW. What does seem unique is that there are just different tools in iOS that we don't see for computer-based work. But the interconnectedness seems to revolve around a host DAW, AUM, IAA, and combinations thereof.

On a Mac, NI Reaktor does a bunch of this stuff and is perhaps a comparable model.

I'd really like to hear more of your thoughts on this!
April 10, 2019  | person stub
On April 10, 2019 - @stub said:
@Enkerli @Bram Bos @Laarz

I'm getting that in iOS you can assemble various AUM-interlinked apps, and/or host/AUv3 elements and have them routed in various ways. But I'm not clear on what can be done in iOS that can't be done in a traditional DAW. What does seem unique is that there are just different tools in iOS that we don't see for computer-based work. But the interconnectedness seems to revolve around a host DAW, AUM, IAA, and combinations thereof.
I think the subtle difference between iOS and Windows/Mac audio environments comes down to a music making experience that specifically includes other senses in the music making process. Apps specifically designed for iOS have a visual & tactile experience built in via the touch screen that enhances the music making process in a way you don't currently get in a Windows or Mac environment. Even with a touchscreen MacBook or Windows tablet, the current music making apps we have in those environments aren't designed with touch in mind. They are keyboard & mouse-centric, possibly with a touch element added as an afterthought.
On April 10, 2019 - @stub said:
@Enkerli @Bram Bos @Laarz
But I'm not clear on what can be done in iOS that can't be done in a traditional DAW. What does seem unique is that there are just different tools in iOS that we don't see for computer-based work. But the interconnectedness seems to revolve around a host DAW, AUM, IAA, and combinations thereof.
I can't really address what can or cannot be done in a computer environment, but one thing about modular iOS is that one can build up their workspace outside of BM3 or Cubasis, and not be tied to those apps' architecture. Many small, efficient apps with a variety of behaviors could give better choice of workflow and behavior, and better stability.

The interaction and manipulation of MIDI information with generative capabilities yields more of a performance application of a workspace, rather than building a track with a static end in mind. Bram's roll-your-own MIDI AUv3 project will be interesting for how we will be able to introduce scripting possibilities into MIDI streams, increasing the ways generative can manipulate generative. What interests me is the possibility of personal interaction with a certain amount of the generative architecture of apps, played among a group of similarly situated musickers, where there is still live interplay among the non-machines. As with any performance, this is more interactive and less static. So I am thinking of all this as a digital audio workspace rather than a digital audio workstation, which implies a difference in approach.
Thanks. Those are fascinating responses. I try to keep an open mind about new approaches and workflows, while at the same time I don't want to lose the power, flexibility, and versatility that CAN be available in apps with depth.
April 10, 2019  | person stub
@stub Long time DP user here too! (2.7 actually!)
I think the attraction is indeed the touch interface, but also the weirdness and creativity of iOS apps. Look at GeoShred! NOTHING compares on desktop. Or “Oval Synth”. So many ways to come up with content that the desktop environment cannot replace. Suddenly, music making and production got FUN and ENJOYABLE again for me, on an iPad. In recent years, it was all beginning to feel like “work.” That has changed now.
April 11, 2019  | person burnalot
@burnalot

My first dabblings with DM1 were fun because of the speed, but the relationship between tempo & time signatures was messed up, so I lost interest: all the cool tuplet stuff had to cheat outside the actual system, which just made me feel like I had to fight the GUI.

But otherwise, I did get a very good sense of the immediacy of a touch screen for that kind of work. And though I know there are far more colorful examples, ultimately, the touchscreen didn't add much more fun for me than that. And I have TONS of apps; I am really trying to enjoy glass rubbing, but I don't.

It does serve a purpose and I can get some things done. But I'm much faster and more productive with a mouse/keyboard/laptop.

It is hard for me to get past having to touch a screen 3-5 times before it acknowledges that I exist. It's always been that way. Touchscreens feel like unresponsive beta-test devices to me.

But all that notwithstanding, I do get how this is revolutionizing music making for various types of learners.
April 12, 2019  | person stub
@stub
The iPad came ALIVE once I started buying midi controllers for it. Like you, I don’t like touching glass instead of knobs and pads. I’m old skul, I like actual faders too. But for everything else? Wow. Especially GeoShred.
April 15, 2019  | person burnalot
@burnalot

Thanks for that validation. I was also annoyed at how picky the iPad was about connecting devices, but haQ's excellent video has helped me in that regard.

In my very rare spare time, I'm hoping to one day finish a hardware MIDI controller and use MIDI Designer as a GUI and data processor/router for it.
April 15, 2019  | person stub

discchord.com is a service of Gnubesoft