Discussion:
[linux-audio-dev] processing plugin standard wrapper
Stefano D'Angelo
2007-02-12 12:23:28 UTC
Hi all,
who would be interested in writing a processing plugin standard
wrapper (LADSPA, DSSI, LV2, VST, etc.)?
Malte Steiner
2007-02-12 15:13:37 UTC
Post by Stefano D'Angelo
Hi all,
who would be interested in writing a processing plugin standard
wrapper (LADSPA, DSSI, LV2, VST, etc.)?
As far as I know, DSSI accepts OSC for control, so you could use the OSC library for Processing to create, for instance, a GUI for DSSI instruments without the need for any wrappers.

Cheers,

Malte
--
Malte Steiner
media art + development
-www.block4.com-
Stefano D'Angelo
2007-02-12 16:36:50 UTC
Post by Malte Steiner
Post by Stefano D'Angelo
Hi all,
who would be interested in writing a processing plugin standard
wrapper (LADSPA, DSSI, LV2, VST, etc.)?
As far as I know, DSSI accepts OSC for control, so you could use the OSC library for Processing to create, for instance, a GUI for DSSI instruments without the need for any wrappers.
Cheers,
Malte
--
Malte Steiner
media art + development
-www.block4.com-
I was addressing a different matter: a compatibility layer for
different plugin standards.
Malte Steiner
2007-02-12 16:52:46 UTC
It's not clear whether you meant audio processing in general or Processing as in proce55ing.org. A wrapper would be difficult, and against the platform-independent nature of the latter.
If you meant a wrapper to handle the plugin formats on Linux in a general and transparent way, that's a tough goal too. One tricky part is the different UI handling. I remember being annoyed by Apple and their constant changes to the UI handling of their AudioUnits, so I gave up in the end. I expect they will change everything again for OS X 10.5. It was a lot of trouble within one (so-called) standard/format, so I guess it quadruples when you throw in several.

Cheers,

Malte
--
Malte Steiner
media art + development
-www.block4.com-
Pieter Palmers
2007-02-12 16:52:09 UTC
Post by Stefano D'Angelo
Post by Malte Steiner
Post by Stefano D'Angelo
Hi all,
who would be interested in writing a processing plugin standard
wrapper (LADSPA, DSSI, LV2, VST, etc.)?
As far as I know, DSSI accepts OSC for control, so you could use the OSC library for Processing to create, for instance, a GUI for DSSI instruments without the need for any wrappers.
Cheers,
Malte
--
Malte Steiner
media art + development
-www.block4.com-
I was addressing a different matter: a compatibility layer for
different plugin standards.
Why don't you just choose one of them (e.g. LV2) and write a 'bridge'
plugin to convert the others into that one format?
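Pieter's suggestion can be sketched roughly like this. Both plugin shapes below are hypothetical stand-ins, not the real LADSPA or LV2 C APIs:

```python
# Sketch of the "bridge" idea: pick one target format and adapt plugins
# of another format into it. Both plugin shapes are hypothetical
# stand-ins, NOT the real LADSPA or LV2 C APIs.

class LadspaLikePlugin:
    """Stand-in for a source-format plugin: processes named port buffers."""
    def run(self, ports):
        # A toy amplifier: doubles every input sample.
        ports["out"][:] = [x * 2.0 for x in ports["in"]]

class Lv2LikeBridge:
    """Exposes the wrapped source plugin through the target interface."""
    def __init__(self, wrapped):
        self.wrapped = wrapped

    def connect_and_run(self, samples):
        ports = {"in": list(samples), "out": [0.0] * len(samples)}
        self.wrapped.run(ports)
        return ports["out"]

bridge = Lv2LikeBridge(LadspaLikePlugin())
doubled = bridge.connect_and_run([1.0, -2.0])   # [2.0, -4.0]
```

The host only ever sees the target format; the bridge does the port bookkeeping for each wrapped plugin.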

Pieter
David García Garzón
2007-02-12 19:03:12 UTC
Post by Stefano D'Angelo
Post by Malte Steiner
Post by Stefano D'Angelo
Hi all,
who would be interested in writing a processing plugin standard
wrapper (LADSPA, DSSI, LV2, VST, etc.)?
As far as I know, DSSI accepts OSC for control, so you could use the OSC library for Processing to create, for instance, a GUI for DSSI instruments without the need for any wrappers.
Cheers,
Malte
--
Malte Steiner
media art + development
-www.block4.com-
I was addressing a different matter: a compatibility layer for
different plugin standards.
Well, that's our intent in CLAM[1]. The goal is that CLAM should be able to run a given processing algorithm transparently under several backends. Currently we support, to some extent, PortAudio, JACK, ALSA, and VST. The first three backends can be used with a Qt Designer interface. We still have several fronts to face: unifying the interface to fit all the backends, incorporating more backends (some work on LADSPA has been done), and enabling Qt GUIs for more backends (notably VST).

This work in progress should come together over the next months, so any requirements or feedback are very welcome.

David.

[1] http://clam.iua.upf.edu
Stefano D'Angelo
2007-02-12 20:34:08 UTC
Post by Malte Steiner
If you meant a wrapper to handle the plugin formats on Linux in a general and transparent way, that's a tough goal too. One tricky part is the different UI handling. I remember being annoyed by Apple and their constant changes to the UI handling of their AudioUnits, so I gave up in the end. I expect they will change everything again for OS X 10.5. It was a lot of trouble within one (so-called) standard/format, so I guess it quadruples when you throw in several.
I think that's their fault, since other formats (LADSPA, DSSI, VST) tend to change very slowly over time. In any case, UI handling would be tackled only after the actual processing part.
Post by Pieter Palmers
Why don't you just choose one of them (e.g. LV2) and write a 'bridge'
plugin to convert the others into that one format?
Well, this is in fact something that could easily be done (and has also been done), but I don't think that all of these standards have "logical compatibility", as I call it... that is, the metaphors they present are not always compatible, even if their functionality is the same.
A format wrapper would also mean that you could use all of your plugins with all applications, that you could create a new standard and all your applications could use it, and that a format version update would be much less painful.
Post by David García Garzón
Well, that's our intent in CLAM[1]. The goal is that CLAM should be able to run a given processing algorithm transparently under several backends. Currently we support, to some extent, PortAudio, JACK, ALSA, and VST. The first three backends can be used with a Qt Designer interface. We still have several fronts to face: unifying the interface to fit all the backends, incorporating more backends (some work on LADSPA has been done), and enabling Qt GUIs for more backends (notably VST).
Well, I don't think I've understood what you mean: JACK, ALSA and PortAudio are not sound processing plugin formats... can you explain it more simply, please? (I'm sorry, I'm not a native English speaker)
David García Garzón
2007-02-13 03:48:35 UTC
Post by Stefano D'Angelo
Post by David García Garzón
Well, that's our intent in CLAM[1]. The goal is that CLAM should be able to run a given processing algorithm transparently under several backends. Currently we support, to some extent, PortAudio, JACK, ALSA, and VST. The first three backends can be used with a Qt Designer interface. We still have several fronts to face: unifying the interface to fit all the backends, incorporating more backends (some work on LADSPA has been done), and enabling Qt GUIs for more backends (notably VST).
Well, I don't think I've understood what you mean: JACK, ALSA and PortAudio are not sound processing plugin formats... can you explain it more simply, please? (I'm sorry, I'm not a native English speaker)
They are not plugin systems, you're right. But if you have a processing algorithm encapsulated in a way that it describes itself (number of ports, controls...), then you can build wrappers (what we call backends) that map this algorithm to a given plugin system (LADSPA, VST...) but also to a given underlying audio system API (PortAudio, ALSA, DirectX...).

For example, if I want my algorithm to be a LADSPA plugin, I could use a LADSPA backend that compiles as a library, maps the connection topology of the processing algorithm to a LADSPA descriptor and, when called, just feeds data to and from the ports and executes the algorithm. A JACK backend would be very similar, but it compiles as an application and publishes the ports as JACK ports to the server. And so on. Of course, there is a lot of work in handling each backend's singularities, such as JACK server availability, API/device enumeration in PortAudio...

In short, the intended result is that the developer designs the algorithm once and then it can be used as any kind of plugin (what you asked for, didn't you?), and not just as plugins but also as a standalone application.
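The idea can be sketched roughly like this. All class and attribute names here are illustrative stand-ins, not the actual CLAM API:

```python
# Rough sketch: a self-describing processing algorithm plus an
# interchangeable backend. Names are illustrative, NOT the CLAM API.

class Gain:
    """A processing algorithm that describes itself: ports and controls."""
    audio_inputs = ("in",)
    audio_outputs = ("out",)
    default_controls = {"gain": 2.0}

    def __init__(self):
        self.controls = dict(self.default_controls)

    def process(self, buffers):
        g = self.controls["gain"]
        return {"out": [x * g for x in buffers["in"]]}

class PluginBackend:
    """Maps the algorithm's self-description onto a plugin-style interface,
    the way a LADSPA backend would build a descriptor from the ports."""
    def __init__(self, algorithm):
        self.algorithm = algorithm
        self.ports = list(algorithm.audio_inputs) + list(algorithm.audio_outputs)

    def run(self, buffers):
        return self.algorithm.process(buffers)

backend = PluginBackend(Gain())
result = backend.run({"in": [0.5, -0.5]})
```

A JACK-style backend would use the same self-description but register the ports with a server instead of exporting a descriptor.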

We also provide a way of binding a Qt interface to certain parts of the algorithm. This is already available for standalone application backends[1] (JACK, PortAudio, ALSA...) but we want to provide that for plugin systems such as VST as well.

[1] http://iua-share.upf.es/wikis/clam/index.php/Network_Editor_tutorial

I hope I have explained myself better; feel free to ask for more information if you are interested.

David.
Stefano D'Angelo
2007-02-13 10:31:21 UTC
Post by David García Garzón
Post by Stefano D'Angelo
Post by David García Garzón
Well, that's our intent in CLAM[1]. The goal is that CLAM should be able to run a given processing algorithm transparently under several backends. Currently we support, to some extent, PortAudio, JACK, ALSA, and VST. The first three backends can be used with a Qt Designer interface. We still have several fronts to face: unifying the interface to fit all the backends, incorporating more backends (some work on LADSPA has been done), and enabling Qt GUIs for more backends (notably VST).
Well, I don't think I've understood what you mean: JACK, ALSA and PortAudio are not sound processing plugin formats... can you explain it more simply, please? (I'm sorry, I'm not a native English speaker)
They are not plugin systems, you're right. But if you have a processing algorithm encapsulated in a way that it describes itself (number of ports, controls...), then you can build wrappers (what we call backends) that map this algorithm to a given plugin system (LADSPA, VST...) but also to a given underlying audio system API (PortAudio, ALSA, DirectX...).
For example, if I want my algorithm to be a LADSPA plugin, I could use a LADSPA backend that compiles as a library, maps the connection topology of the processing algorithm to a LADSPA descriptor and, when called, just feeds data to and from the ports and executes the algorithm. A JACK backend would be very similar, but it compiles as an application and publishes the ports as JACK ports to the server. And so on. Of course, there is a lot of work in handling each backend's singularities, such as JACK server availability, API/device enumeration in PortAudio...
In short, the intended result is that the developer designs the algorithm once and then it can be used as any kind of plugin (what you asked for, didn't you?), and not just as plugins but also as a standalone application.
We also provide a way of binding a Qt interface to certain parts of the algorithm. This is already available for standalone application backends[1] (JACK, PortAudio, ALSA...) but we want to provide that for plugin systems such as VST as well.
[1] http://iua-share.upf.es/wikis/clam/index.php/Network_Editor_tutorial
I hope I have explained myself better; feel free to ask for more information if you are interested.
David.
Mmmm... I think we are interested in two opposite things: I want a host to be able to use any kind of plugin without having to know which kind it is.
For example, I have some LV2, some VST and some LADSPA plugins. The wrapper I'm talking about would be able to interface with them and let my host use any of them (as if it were a GStreamer for audio plugins).

Instead, you want a plugin/application writer to describe their algorithm so that it can be "translated" into a LADSPA plugin, as well as a JACK application, etc. Am I right?

But, anyway, maybe combining the two things could be of some interest: imagine that you want to develop a plugin system capable of using the z-transform and use it immediately in all supporting applications. This way you could build a module for this wrapper and soon start programming your plugins and using them, without having to wait for the adoption of "your standard".

Also, noticeable performance improvements could be made this way if the wrapper were able to represent processing networks that can be "simplified", for example a network of LTI systems with known transfer functions (Fourier transforms).

What do you think about it?

Regards,

Stefano D'Angelo
David García Garzón
2007-02-13 13:18:29 UTC
Post by Stefano D'Angelo
Post by David García Garzón
Post by Stefano D'Angelo
Post by David García Garzón
Well, that's our intent in CLAM[1]. The goal is that CLAM should be able to run a given processing algorithm transparently under several backends. Currently we support, to some extent, PortAudio, JACK, ALSA, and VST. The first three backends can be used with a Qt Designer interface. We still have several fronts to face: unifying the interface to fit all the backends, incorporating more backends (some work on LADSPA has been done), and enabling Qt GUIs for more backends (notably VST).
Well, I don't think I've understood what you mean: JACK, ALSA and PortAudio are not sound processing plugin formats... can you explain it more simply, please? (I'm sorry, I'm not a native English speaker)
They are not plugin systems, you're right. But if you have a processing algorithm encapsulated in a way that it describes itself (number of ports, controls...), then you can build wrappers (what we call backends) that map this algorithm to a given plugin system (LADSPA, VST...) but also to a given underlying audio system API (PortAudio, ALSA, DirectX...).
For example, if I want my algorithm to be a LADSPA plugin, I could use a LADSPA backend that compiles as a library, maps the connection topology of the processing algorithm to a LADSPA descriptor and, when called, just feeds data to and from the ports and executes the algorithm. A JACK backend would be very similar, but it compiles as an application and publishes the ports as JACK ports to the server. And so on. Of course, there is a lot of work in handling each backend's singularities, such as JACK server availability, API/device enumeration in PortAudio...
In short, the intended result is that the developer designs the algorithm once and then it can be used as any kind of plugin (what you asked for, didn't you?), and not just as plugins but also as a standalone application.
We also provide a way of binding a Qt interface to certain parts of the algorithm. This is already available for standalone application backends[1] (JACK, PortAudio, ALSA...) but we want to provide that for plugin systems such as VST as well.
[1] http://iua-share.upf.es/wikis/clam/index.php/Network_Editor_tutorial
I hope I have explained myself better; feel free to ask for more information if you are interested.
David.
Mmmm... I think we are interested in two opposite things: I want a host to be able to use any kind of plugin without having to know which kind it is.
For example, I have some LV2, some VST and some LADSPA plugins. The wrapper I'm talking about would be able to interface with them and let my host use any of them (as if it were a GStreamer for audio plugins).
Instead, you want a plugin/application writer to describe their algorithm so that it can be "translated" into a LADSPA plugin, as well as a JACK application, etc. Am I right?
OK, you are right; we are talking about different things, but related anyway. :-)

Yes, it's very interesting, and it is a path we want to walk. Currently, apart from building LADSPA plugins, CLAM can also be a LADSPA host, and we should extend that to other plugin systems. We have two students in our lab working on plugin and hosting aspects, but they will need some time before there is any outcome.
Post by Stefano D'Angelo
imagine that you want to develop a plugin system capable of using the z-transform and use it immediately in all supporting applications. This way you could build a module for this wrapper and soon start programming your plugins and using them, without having to wait for the adoption of "your standard".
CLAM is not a standard to be adopted; ALSA, JACK and so on are the standards. CLAM should be a convenient implementation tool. Migration is something that cannot be expected, and we have a lot of experience with that. I am for providing inter-framework wrappers, so everyone could develop on the framework they are used to (Matlab, Marsyas, Pd...) and still reuse what is done in other frameworks.
Post by Stefano D'Angelo
Also, noticeable performance improvements could be made this way if the wrapper were able to represent processing networks that can be "simplified", for example a network of LTI systems with known transfer functions (Fourier transforms).
Sorry, I don't understand you here.

David.
Stefano D'Angelo
2007-02-13 14:06:27 UTC
Post by David García Garzón
Yes, it's very interesting, and it is a path we want to walk. Currently, apart from building LADSPA plugins, CLAM can also be a LADSPA host, and we should extend that to other plugin systems. We have two students in our lab working on plugin and hosting aspects, but they will need some time before there is any outcome.
Well, CLAM is a big and important project, and I am just an unknown student from Italy who is trying to develop a replacement for pedal boards and stomp boxes, and trying to let people easily reuse the code I'm going to write.
Anyway, if you think that I and/or my project (http://freeadsp.sourceforge.net - the site hasn't been updated since we're working on a new one) can contribute, I/we'll be pleased to.
Post by David García Garzón
Post by Stefano D'Angelo
imagine that you want to develop a plugin system capable of using the z-transform and use it immediately in all supporting applications. This way you could build a module for this wrapper and soon start programming your plugins and using them, without having to wait for the adoption of "your standard".
CLAM is not a standard to be adopted; ALSA, JACK and so on are the standards. CLAM should be a convenient implementation tool. Migration is something that cannot be expected, and we have a lot of experience with that. I am for providing inter-framework wrappers, so everyone could develop on the framework they are used to (Matlab, Marsyas, Pd...) and still reuse what is done in other frameworks.
I think it's clear that I'm not talking about a new standard or anything like that. I'm talking mainly about a wrapper. The possibility to develop new formats and have them work with any app that uses such a wrapper comes directly from the nature of the wrapper itself... In other words, I was just wondering how things could go once such a thing was ready and working.
Then, to be honest, I think that if I/we succeed in implementing a clean way to make VST, LADSPA, LV2, DSSI, etc. work well together, some already-started projects would at least consider using such a framework.

To be even clearer, the "integration" I was talking about could work like this:

Host -> Wrapper -> Wrapper module (plugin loader - one per standard)
-> Processing object (plugin)
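The layering above can be sketched like this: the host talks only to the generic wrapper, and one loader module per standard adapts the real plugins behind it. All names and the pass-through behaviour here are hypothetical:

```python
# Sketch of the layering: Host -> Wrapper -> per-standard module ->
# processing object. Names and behaviour are hypothetical.

class ProcessingObject:
    """What the host sees, whatever format the plugin actually is."""
    def __init__(self, name, run):
        self.name = name
        self.run = run

def ladspa_module(path):
    # A real module would dlopen() the library and read its descriptor;
    # here a pass-through effect stands in for the loaded plugin.
    return ProcessingObject("ladspa:" + path, lambda samples: list(samples))

def lv2_module(uri):
    return ProcessingObject("lv2:" + uri, lambda samples: list(samples))

# The 'Wrapper': dispatches to one module per standard.
MODULES = {".so": ladspa_module, ".lv2": lv2_module}

def load(path):
    """The host calls this without knowing which standard the plugin uses."""
    for suffix, module in MODULES.items():
        if path.endswith(suffix):
            return module(path)
    raise ValueError("no wrapper module for " + path)
```

Adding support for a new standard then only means registering one more module; hosts built against `load()` need no changes.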

In this case the host needs information on how to use a processing object. But if you also put into such a wrapper module information on how to build a processing object from an algorithm, then CLAM could use the same interface to do that other thing.
Post by David García Garzón
Post by Stefano D'Angelo
Also, this way some noticeable improvements can be made on performance
if this wrapper would be able to represent processing networks which
can be "simplified", as for example a net of LTI systems with known
transfer function (fourier transform).
Sorry, I don't understand you here.
It's quite simple: if you have a processing standard which represents processing objects as LTI (linear time-invariant) systems using the Fourier transform of their transfer function (books often call this H(f)), and you arrange such objects in a network, then, instead of calculating outputs for each object, you can just multiply all the H(f)s along a given path and use the result as the H(f) of the whole network. This would allow network-level optimization (but obviously the wrapper would have to know how the network is made).
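For what it's worth, the equivalence is easy to demonstrate for FIR filters: chaining two LTI systems equals applying one system whose impulse response is the convolution of the two, which is exactly the product H1(f)·H2(f) in the frequency domain. A minimal sketch with toy filters:

```python
# Collapsing a chain of LTI systems into one: convolving impulse
# responses in time corresponds to multiplying transfer functions H(f).

def convolve(a, b):
    out = [0.0] * (len(a) + len(b) - 1)
    for i, x in enumerate(a):
        for j, y in enumerate(b):
            out[i + j] += x * y
    return out

def apply_fir(h, signal):
    """Run a signal through an FIR system with impulse response h."""
    return convolve(h, signal)

h1 = [0.5, 0.5]    # two-tap averager
h2 = [1.0, -1.0]   # first difference
signal = [1.0, 2.0, 3.0, 4.0]

# Evaluating the network object by object...
two_passes = apply_fir(h2, apply_fir(h1, signal))
# ...versus the pre-simplified single system (H1 * H2 in frequency).
h_combined = convolve(h1, h2)
one_pass = apply_fir(h_combined, signal)

assert two_passes == one_pass
```

The pre-combined system does the same work in a single pass, which is the kind of network-level saving described above.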
Well, it seems like you're a teacher or a researcher, so you probably know more than me about this stuff.
This, however, is just a thought.
In case I wasn't clear enough, just tell me.

Regards,

Stefano D'Angelo
David García Garzón
2007-02-14 15:50:16 UTC
Post by Stefano D'Angelo
Post by David García Garzón
Yes, it's very interesting, and it is a path we want to walk. Currently, apart from building LADSPA plugins, CLAM can also be a LADSPA host, and we should extend that to other plugin systems. We have two students in our lab working on plugin and hosting aspects, but they will need some time before there is any outcome.
Well, CLAM is a big and important project, and I am just an unknown student from Italy who is trying to develop a replacement for pedal boards and stomp boxes, and trying to let people easily reuse the code I'm going to write.
Anyway, if you think that I and/or my project (http://freeadsp.sourceforge.net - the site hasn't been updated since we're working on a new one) can contribute, I/we'll be pleased to.
Nice. It is one of the promising new projects Dave Phillips reported in his blog, so it deserves a close look. :-)
Post by Stefano D'Angelo
Post by David García Garzón
Post by Stefano D'Angelo
imagine that you want to develop a plugin system capable of using the z-transform and use it immediately in all supporting applications. This way you could build a module for this wrapper and soon start programming your plugins and using them, without having to wait for the adoption of "your standard".
CLAM is not a standard to be adopted; ALSA, JACK and so on are the standards. CLAM should be a convenient implementation tool. Migration is something that cannot be expected, and we have a lot of experience with that. I am for providing inter-framework wrappers, so everyone could develop on the framework they are used to (Matlab, Marsyas, Pd...) and still reuse what is done in other frameworks.
I think it's clear that I'm not talking about a new standard or anything like that. I'm talking mainly about a wrapper. The possibility to develop new formats and have them work with any app that uses such a wrapper comes directly from the nature of the wrapper itself... In other words, I was just wondering how things could go once such a thing was ready and working.
Then, to be honest, I think that if I/we succeed in implementing a clean way to make VST, LADSPA, LV2, DSSI, etc. work well together, some already-started projects would at least consider using such a framework.
Host -> Wrapper -> Wrapper module (plugin loader - one per standard)
-> Processing object (plugin)
In this case the host needs information on how to use a processing object. But if you also put into such a wrapper module information on how to build a processing object from an algorithm, then CLAM could use the same interface to do that other thing.
That would be great. Our main motivation for looking into hosting plugins (or other implementation platforms such as Pd) in CLAM is to give reuse options to people developing with CLAM. Providing an adapter layer among plugin systems is a side effect that may come some time later. But note that so many layers are not the best option for performance, so wrapping the wrapper should not be the first option.

Splitting with an interface between what you call the 'Wrapper' and the 'Wrapper module' ('Audio System Backend' and 'Plugin adapter' in our vocabulary) is a good idea, as it reduces the effort from NxN adapters to 2xN. But it is still a hard task to define such an intermediate interface, which we are designing incrementally. My advice to you is not to generalize so much at the beginning.
Post by Stefano D'Angelo
Post by David García Garzón
Post by Stefano D'Angelo
Also, noticeable performance improvements could be made this way if the wrapper were able to represent processing networks that can be "simplified", for example a network of LTI systems with known transfer functions (Fourier transforms).
Sorry, I don't understand you here.
It's quite simple: if you have a processing standard which represents processing objects as LTI (linear time-invariant) systems using the Fourier transform of their transfer function (books often call this H(f)), and you arrange such objects in a network, then, instead of calculating outputs for each object, you can just multiply all the H(f)s along a given path and use the result as the H(f) of the whole network. This would allow network-level optimization (but obviously the wrapper would have to know how the network is made).
Well, it seems like you're a teacher or a researcher, so you probably know more than me about this stuff.
This, however, is just a thought.
In case I wasn't clear enough, just tell me.
OK, I get the context now. As you say, I am both a teacher and a researcher, but my field is software engineering and my knowledge of theoretical DSP is not that mature, so don't take my DSP-related statements too seriously. To my level of knowledge, I could say that most plugins are not just LTI, so the kind of optimization you suggest would not be general, just applicable to consecutive adjacent LTI systems. At the same time, it is something that may make a lot of sense in FreeADSP.

David.
Stefano D'Angelo
2007-02-14 17:29:15 UTC
Post by David García Garzón
Post by Stefano D'Angelo
Post by David García Garzón
Yes, it's very interesting, and it is a path we want to walk. Currently, apart from building LADSPA plugins, CLAM can also be a LADSPA host, and we should extend that to other plugin systems. We have two students in our lab working on plugin and hosting aspects, but they will need some time before there is any outcome.
Well, CLAM is a big and important project, and I am just an unknown student from Italy who is trying to develop a replacement for pedal boards and stomp boxes, and trying to let people easily reuse the code I'm going to write.
Anyway, if you think that I and/or my project (http://freeadsp.sourceforge.net - the site hasn't been updated since we're working on a new one) can contribute, I/we'll be pleased to.
Nice. It is one of the promising new projects Dave Phillips reported in his blog, so it deserves a close look. :-)
Well, I hope that Dave knows what he said :-) Currently we are two part-time developers (both university students here in Italy), and my friend is not very skilled at programming at this level, but he wishes to learn.
Personally, I do what I can.
Post by David García Garzón
Post by Stefano D'Angelo
Post by David García Garzón
Post by Stefano D'Angelo
imagine that you want to develop a plugin system capable of using the z-transform and use it immediately in all supporting applications. This way you could build a module for this wrapper and soon start programming your plugins and using them, without having to wait for the adoption of "your standard".
CLAM is not a standard to be adopted; ALSA, JACK and so on are the standards. CLAM should be a convenient implementation tool. Migration is something that cannot be expected, and we have a lot of experience with that. I am for providing inter-framework wrappers, so everyone could develop on the framework they are used to (Matlab, Marsyas, Pd...) and still reuse what is done in other frameworks.
I think it's clear that I'm not talking about a new standard or anything like that. I'm talking mainly about a wrapper. The possibility to develop new formats and have them work with any app that uses such a wrapper comes directly from the nature of the wrapper itself... In other words, I was just wondering how things could go once such a thing was ready and working.
Then, to be honest, I think that if I/we succeed in implementing a clean way to make VST, LADSPA, LV2, DSSI, etc. work well together, some already-started projects would at least consider using such a framework.
Host -> Wrapper -> Wrapper module (plugin loader - one per standard)
-> Processing object (plugin)
In this case the host needs information on how to use a processing object. But if you also put into such a wrapper module information on how to build a processing object from an algorithm, then CLAM could use the same interface to do that other thing.
That would be great. Our main motivation for looking into hosting plugins (or other implementation platforms such as Pd) in CLAM is to give reuse options to people developing with CLAM. Providing an adapter layer among plugin systems is a side effect that may come some time later. But note that so many layers are not the best option for performance, so wrapping the wrapper should not be the first option.
Splitting with an interface between what you call the 'Wrapper' and the 'Wrapper module' ('Audio System Backend' and 'Plugin adapter' in our vocabulary) is a good idea, as it reduces the effort from NxN adapters to 2xN. But it is still a hard task to define such an intermediate interface, which we are designing incrementally. My advice to you is not to generalize so much at the beginning.
I know that's hard. However, the actual wrapper, in my case, is the 'Wrapper module'. The 'Wrapper' is just the generic library for hosts to use.
Anyway, I'm taking a quite direct approach to the matter: first wrap LADSPA, then extend to the processing side of other systems, then work on the UI side, and in the end experiment with new possibilities... few words for a lot of work, really :-)
Post by David García Garzón
Post by Stefano D'Angelo
Post by David García Garzón
Post by Stefano D'Angelo
Also, noticeable performance improvements could be made this way if the wrapper were able to represent processing networks that can be "simplified", for example a network of LTI systems with known transfer functions (Fourier transforms).
Sorry, I don't understand you here.
It's quite simple: if you have a processing standard which represents processing objects as LTI (linear time-invariant) systems using the Fourier transform of their transfer function (books often call this H(f)), and you arrange such objects in a network, then, instead of calculating outputs for each object, you can just multiply all the H(f)s along a given path and use the result as the H(f) of the whole network. This would allow network-level optimization (but obviously the wrapper would have to know how the network is made).
Well, it seems like you're a teacher or a researcher, so you probably know more than me about this stuff.
This, however, is just a thought.
In case I wasn't clear enough, just tell me.
OK, I get the context now. As you say, I am both a teacher and a researcher, but my field is software engineering and my knowledge of theoretical DSP is not that mature, so don't take my DSP-related statements too seriously. To my level of knowledge, I could say that most plugins are not just LTI, so the kind of optimization you suggest would not be general, just applicable to consecutive adjacent LTI systems. At the same time, it is something that may make a lot of sense in FreeADSP.
I actually don't know how many plugins are LTI but, for example, a lot of delays, reverbs, choruses, EQ filters, compressors, modulators and "sound mixers" should be, and that's quite enough after all. Consider that, in the case of commonly used guitar effects, the only effects that are non-linear by nature should be hardware simulators (valves, amps, etc.), distortions and synths.

Anyway (did I say this already?), I'm also considering that if a "module" contains info on how to generate plugins following the standard it wraps, well, I think that CLAM developers could be interested in this too. In other words, I'm going my own way, but I'm looking for people who want to contribute, have some nice ideas or can give useful suggestions, and that's why I'm here...

Stefano
David García Garzón
2007-02-14 17:56:22 UTC
Post by Stefano D'Angelo
Post by David García Garzón
Post by Stefano D'Angelo
Post by David García Garzón
Yes, it's very interesting, and it is a path we want to walk. Currently, apart from building LADSPA plugins, CLAM can also be a LADSPA host, and we should extend that to other plugin systems. We have two students in our lab working on plugin and hosting aspects, but they will need some time before there is any outcome.
Well, CLAM is a big and important project, and I am just an unknown student from Italy who is trying to develop a replacement for pedal boards and stomp boxes, and trying to let people easily reuse the code I'm going to write.
Anyway, if you think that I and/or my project (http://freeadsp.sourceforge.net - the site hasn't been updated since we're working on a new one) can contribute, I/we'll be pleased to.
Nice. It is one of the promising new projects Dave Phillips reported on
his blog, so it deserves a close look. :-)
Well, I hope that Dave knows what he said :-) Currently we are two
part-time developers (both university students here in Italy), and my
friend is not very skilled at programming at this level, but he
wishes to learn.
Personally, I do what I can.
Post by David García Garzón
Post by Stefano D'Angelo
Post by David García Garzón
Post by Stefano D'Angelo
But, anyway, maybe combining the two things could be of some
interest: imagine that you want to develop a plugin system capable
of using the z-transform and immediately use it in all supporting
applications.
In this way you could build a module for this wrapper and
soon start programming your plugins and using them, without having to
wait for the adoption of "your standard".
CLAM is not a standard to be adopted. Alsa, Jack and so on are the
standards. CLAM should be a convenience implementation tool.
Migration is something that cannot be expected, and we have a lot of
experience with that. I am for providing inter-framework wrappers so
everyone could develop on the framework he is used to (Matlab,
Marsyas, Pd...) and still reuse what is done in other frameworks.
I think it's clear that I'm not talking about a new standard or
things like that. I'm talking mainly about a wrapper. The possibility
to develop new formats and have them work with any app that uses
such a wrapper comes directly from the nature of the wrapper itself...
In other words, I was just wondering how things could go once such a
thing was ready and working.
Then, to be honest, I think that if I/we succeed in implementing a
clean way to make VST, LADSPA, LV2, DSSI, etc. work well together,
some already-started projects would at least consider the chance of
using such a framework.
Host -> Wrapper -> Wrapper module (plugin loader - one per standard)
-> Processing object (plugin)
In this case the host needs information on how to use a processing
object. But if you also put inside such a wrapper module information on
how to build a processing object from an algorithm, then CLAM could use
the same interface to do that other thing.
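A minimal sketch of that layering in Python, for concreteness (every name here is hypothetical, invented for illustration; a real implementation would be C code against the LADSPA/LV2 headers):

```python
# Hypothetical sketch of the Host -> Wrapper -> Wrapper-module layering
# discussed above. None of these names come from a real API; the LADSPA
# loader is stubbed out as a trivial gain "plugin".

class ProcessingObject:
    """Uniform view of one plugin instance, whatever its native standard."""
    def __init__(self, run_fn, ports):
        self.run = run_fn      # callable(list[float]) -> list[float]
        self.ports = ports     # e.g. ["in", "out", "gain"]

class WrapperModule:
    """One per standard (LADSPA, LV2, VST...): knows how to load plugins."""
    def load(self, identifier):
        raise NotImplementedError

class FakeLadspaModule(WrapperModule):
    """Stand-in for a real LADSPA loader: returns a fixed gain plugin."""
    def load(self, identifier):
        gain = 0.5
        return ProcessingObject(lambda buf: [s * gain for s in buf],
                                ["in", "out", "gain"])

class Wrapper:
    """The generic library a host links against."""
    def __init__(self):
        self.modules = {}
    def register(self, standard, module):
        self.modules[standard] = module
    def load(self, standard, identifier):
        return self.modules[standard].load(identifier)

# Host code only ever sees Wrapper and ProcessingObject:
w = Wrapper()
w.register("ladspa", FakeLadspaModule())
plugin = w.load("ladspa", "gain.so")
print(plugin.run([1.0, -2.0]))   # [0.5, -1.0]
```

The point of the split is visible in the last four lines: the host never touches a standard-specific API, so adding VST or DSSI support means registering one more module, not changing the host.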
That would be great. Our main motivation for looking into hosting plugins
(or other implementation platforms, such as Pd) in CLAM is to give reuse
options to people developing with CLAM. Providing an adapter layer among
plugin systems is a side effect that may come some time later. But note
that so many layers are not the best option for performance, so wrapping
the wrapper should not be the first option.
Splitting with an interface between what you call 'Wrapper' and 'Wrapper
module' ('Audio System Backend' and 'Plugin adapter' in our vocabulary)
is a good idea, as it eases the effort from NxN adapters to 2xN. But it is
still a hard task to define such an intermediate interface, which we are
designing incrementally. My advice to you is not to generalize so much at
the beginning.
I know that's hard. However, the actual wrapper, in my case, is the
'Wrapper module'. The 'Wrapper' is just the generic library to use in
hosts.
Anyway, I'm taking a quite direct approach to the matter: first wrap
LADSPA, then extend to the processing side of other systems, then work
on the UI side, and at the end experiment with new possibilities...
few words for a lot of work, really :-)
Post by David García Garzón
Post by Stefano D'Angelo
Post by David García Garzón
Post by Stefano D'Angelo
Also, this way some noticeable improvements can be made on
performance if this wrapper were able to represent processing
networks which can be "simplified", as for example a net of LTI
systems with known transfer functions (Fourier transforms).
Sorry, I don't understand you here.
It's quite simple: if you have a processing standard which represents
processing objects as LTI (linear time-invariant) systems using the
Fourier transform of their transfer function (books often call this
H(f)), and you arrange such objects in a network, then, instead of
calculating outputs for each object, you can just multiply all the H(f)s
along a certain path and use the result as the H(f) of the whole
network. This would allow network-based optimization (but obviously
the wrapper would have to know how the net is made).
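As a small sanity check of the idea (pure Python, made-up filter coefficients): cascading two FIR "plugins" gives the same output as a single filter whose impulse response is the convolution of the two, which is exactly the time-domain counterpart of multiplying their H(f)s.

```python
# Two toy FIR filters; running them in series equals running one filter
# whose impulse response is the convolution of the two (i.e. whose H(f)
# is the product of the individual H(f)s). Coefficients are made up.

def fir(h, x):
    """Causal FIR filter: y[n] = sum_k h[k] * x[n-k]."""
    return [sum(h[k] * x[n - k] for k in range(len(h)) if 0 <= n - k < len(x))
            for n in range(len(x))]

def convolve(a, b):
    """Discrete convolution of two coefficient lists."""
    out = [0.0] * (len(a) + len(b) - 1)
    for i, ai in enumerate(a):
        for j, bj in enumerate(b):
            out[i + j] += ai * bj
    return out

h1 = [0.5, 0.5]          # a 2-tap averager
h2 = [1.0, -1.0]         # a first differencer
x  = [1.0, 2.0, 3.0, 4.0]

cascaded = fir(h2, fir(h1, x))       # run the two "plugins" in series
combined = fir(convolve(h1, h2), x)  # run the single "simplified" network

print(cascaded)
print(combined)   # same values: the network collapsed into one filter
```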
Well, it seems like you're a teacher or a researcher, so you probably
know more than me about this stuff.
This, however, is just a thought.
In case I wasn't clear enough, just tell me.
Ok, I get the context now. As you say, I'm both a teacher and a researcher,
but my field is software engineering and my knowledge of theoretical DSP
is not that mature, so don't take my DSP-related statements too seriously.
To my level of knowledge, I'd say that most plugins are not purely LTI,
so the kind of optimization you suggest would not be general: it would
only apply to consecutive adjacent LTI systems. At the same time, it is
something that may have a lot of sense in FreeADSP.
I actually don't know how many plugins are LTI, but, for example, a
lot of delays, reverbs, choruses, eq. filters, compressors, modulators
and "sound mixers" should be, and that's quite enough after all.
Consider that in the case of commonly used guitar effects the only
non-linear effects by their nature should be hardware-simulators
(valves, amps, etc.), distortions and synths.
I am not sure at all, but I thought a compressor was an example of a
non-linear transformation, and compressors normally have an adaptive
behavior which makes them not time-invariant, so you cannot model one as
a single H(f). It is also very easy to find a plugin that introduces
non-linearity or time-variation even though its main function (a filter,
chorus, reverb...) is LTI. We have also used other kinds of
transformations, such as sine shifting, that are far from linear.
Post by Stefano D'Angelo
Anyway (did I say this?) I'm considering also that if a "module"
contains info on how to generate plugins following the standard they
wrap, well, I think that CLAM developers could be interested in this
too. In other words, I'm going my way but I'm seeking for people who
want to contribute or have some nice ideas or can give useful
suggestions, and that's why I'm here...
Yes we are, so keep us informed on your progress about plugin system
integration. We'll do so. ;-)
Camilo Polyméris
2007-02-17 03:45:37 UTC
Permalink
Post by David García Garzón
Post by Stefano D'Angelo
Post by David García Garzón
Ok, I get the context now. As you say, I'm both a teacher and a researcher,
but my field is software engineering and my knowledge of theoretical DSP
is not that mature, so don't take my DSP-related statements too seriously.
To my level of knowledge, I'd say that most plugins are not purely LTI,
so the kind of optimization you suggest would not be general: it would
only apply to consecutive adjacent LTI systems. At the same time, it is
something that may have a lot of sense in FreeADSP.
I actually don't know how many plugins are LTI, but, for example, a
lot of delays, reverbs, choruses, eq. filters, compressors, modulators
and "sound mixers" should be, and that's quite enough after all.
Consider that in the case of commonly used guitar effects the only
non-linear effects by their nature should be hardware-simulators
(valves, amps, etc.), distortions and synths.
I am not sure at all, but I thought a compressor was an example of a
non-linear transformation, and compressors normally have an adaptive
behavior which makes them not time-invariant, so you cannot model one as
a single H(f). It is also very easy to find a plugin that introduces
non-linearity or time-variation even though its main function (a filter,
chorus, reverb...) is LTI. We have also used other kinds of
transformations, such as sine shifting, that are far from linear.
You are right. Compressors are the archetypical example of non-linear
filters. Delays, reverbs, chori, etc. are non-TI (they depend on past
input).
Still, there may be other properties which could be exploited for
optimization, as Stefano suggests. For example, you could collapse
several equalizers to one filter. Or if two consecutive plugins do FFT,
the first could pass the information to the second in the frequency domain.
Neither of these would work with current plugin architectures, though.
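The second idea can be sketched with a plain DFT in pure Python (the two "plugins" here are invented spectral gain stages): handing the spectrum straight from one to the next saves a full inverse/forward transform pair while producing the same result.

```python
# Two "plugins" that both work on the spectrum of a block. If the first
# could hand its spectrum straight to the second, one DFT/IDFT pair is
# saved. The spectral operations are made-up gain masks.
import cmath

def dft(x):
    N = len(x)
    return [sum(x[n] * cmath.exp(-2j * cmath.pi * k * n / N)
                for n in range(N)) for k in range(N)]

def idft(X):
    N = len(X)
    return [sum(X[k] * cmath.exp(2j * cmath.pi * k * n / N)
                for k in range(N)).real / N for n in range(N)]

def plugin_a(spectrum):      # e.g. a crude DC cut
    return [0 if k == 0 else c for k, c in enumerate(spectrum)]

def plugin_b(spectrum):      # e.g. halve everything
    return [0.5 * c for c in spectrum]

x = [1.0, 2.0, 0.0, -1.0, 0.5, 0.0, 1.5, -0.5]

# Naive chaining: each plugin does its own round-trip (2 DFTs, 2 IDFTs).
naive = idft(plugin_b(dft(idft(plugin_a(dft(x))))))

# Smart chaining: stay in the frequency domain (1 DFT, 1 IDFT).
smart = idft(plugin_b(plugin_a(dft(x))))

print(all(abs(a - b) < 1e-9 for a, b in zip(naive, smart)))   # True
```

The outputs agree to rounding error, but the smart chain does half the transform work, which is the optimization a spectrum-aware host (or LV2 extension) could exploit.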
While I didn't think of plugins, but rather of more low-level components
(constructs of an audio-oriented programming language, or similar), I
have also been playing with a similar idea, but have not come to any
conclusions yet. I hope the linux-audio-dev community does.

Camilo
Stefano D'Angelo
2007-02-17 06:19:05 UTC
Permalink
Post by Camilo Polyméris
Post by David García Garzón
Post by Stefano D'Angelo
Post by David García Garzón
Ok, I get the context now. As you say, I'm both a teacher and a researcher,
but my field is software engineering and my knowledge of theoretical DSP
is not that mature, so don't take my DSP-related statements too seriously.
To my level of knowledge, I'd say that most plugins are not purely LTI,
so the kind of optimization you suggest would not be general: it would
only apply to consecutive adjacent LTI systems. At the same time, it is
something that may have a lot of sense in FreeADSP.
I actually don't know how many plugins are LTI, but, for example, a
lot of delays, reverbs, choruses, eq. filters, compressors, modulators
and "sound mixers" should be, and that's quite enough after all.
Consider that in the case of commonly used guitar effects the only
non-linear effects by their nature should be hardware-simulators
(valves, amps, etc.), distortions and synths.
I am not sure at all, but I thought a compressor was an example of a
non-linear transformation, and compressors normally have an adaptive
behavior which makes them not time-invariant, so you cannot model one as
a single H(f). It is also very easy to find a plugin that introduces
non-linearity or time-variation even though its main function (a filter,
chorus, reverb...) is LTI. We have also used other kinds of
transformations, such as sine shifting, that are far from linear.
You are right. Compressors are the archetypical example of non-linear
filters. Delays, reverbs, chori, etc. are non-TI (they depend on past
input).
Still, there may be other properties which could be exploited for
optimization, as Stefano suggests. For example, you could collapse
several equalizers to one filter. Or if two consecutive plugins do FFT,
the first could pass the information to the second in the frequency domain.
Neither of these would work with current plugin architectures, though.
While I didn't think of plugins, but rather of more low-level components
(constructs of an audio-oriented programming language, or similar), I
have also been playing with a similar idea, but have not come to any
conclusions yet. I hope the linux-audio-dev community does.
Yes, compressors are non-linear because their output depends on the
value of the input; however, reverbs, choruses, etc. are linear, no
matter that they depend on past input (low-pass and high-pass
filters also depend on past input, really :-)
If you are not convinced, a simple delay could have this transformation equation:

y(t) = sum ( alpha_k * delta (x - kT) )

with k >= 0. If you do

T(a*x1(t) + b*x2(t)) = a*y1(t) + b*y2(t) = a*T(x1(t)) + b*T(x2(t))

y1 and y2 differ only because of the alpha_k coefficients, so the
filter is definitely linear :-)
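The same superposition test can be run numerically; a feedforward delay line with made-up tap gains passes it exactly:

```python
# Numeric check of the superposition argument above: a feedforward delay
# line is linear. The tap positions k and gains alpha_k are made up.

def delay(x, taps=((0, 1.0), (3, 0.5))):
    """y[n] = sum over (k, alpha_k) of alpha_k * x[n - k]."""
    return [sum(a * x[n - k] for k, a in taps if n - k >= 0)
            for n in range(len(x))]

x1 = [1.0, 0.0, -2.0, 3.0, 0.5]
x2 = [0.0, 1.0, 1.0, -1.0, 2.0]
a, b = 2.0, -0.5

lhs = delay([a * s1 + b * s2 for s1, s2 in zip(x1, x2)])         # T(a*x1 + b*x2)
rhs = [a * y1 + b * y2 for y1, y2 in zip(delay(x1), delay(x2))]  # a*T(x1) + b*T(x2)

print(lhs == rhs)   # True: the delay passes the superposition test
```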

Stefano
Stefano D'Angelo
2007-02-17 06:23:34 UTC
Permalink
Post by Stefano D'Angelo
Post by Camilo Polyméris
Post by David García Garzón
Post by Stefano D'Angelo
Post by David García Garzón
Ok, I get the context now. As you say, I'm both a teacher and a researcher,
but my field is software engineering and my knowledge of theoretical DSP
is not that mature, so don't take my DSP-related statements too seriously.
To my level of knowledge, I'd say that most plugins are not purely LTI,
so the kind of optimization you suggest would not be general: it would
only apply to consecutive adjacent LTI systems. At the same time, it is
something that may have a lot of sense in FreeADSP.
I actually don't know how many plugins are LTI, but, for example, a
lot of delays, reverbs, choruses, eq. filters, compressors, modulators
and "sound mixers" should be, and that's quite enough after all.
Consider that in the case of commonly used guitar effects the only
non-linear effects by their nature should be hardware-simulators
(valves, amps, etc.), distortions and synths.
I am not sure at all, but I thought a compressor was an example of a
non-linear transformation, and compressors normally have an adaptive
behavior which makes them not time-invariant, so you cannot model one as
a single H(f). It is also very easy to find a plugin that introduces
non-linearity or time-variation even though its main function (a filter,
chorus, reverb...) is LTI. We have also used other kinds of
transformations, such as sine shifting, that are far from linear.
You are right. Compressors are the archetypical example of non-linear
filters. Delays, reverbs, chori, etc. are non-TI (they depend on past
input).
Still, there may be other properties which could be exploited for
optimization, as Stefano suggests. For example, you could collapse
several equalizers to one filter. Or if two consecutive plugins do FFT,
the first could pass the information to the second in the frequency domain.
Neither of these would work with current plugin architectures, though.
While I didn't think of plugins, but rather of more low-level components
(constructs of an audio-oriented programming language, or similar), I
have also been playing with a similar idea, but have not come to any
conclusions yet. I hope the linux-audio-dev community does.
Yes, compressors are non-linear because their output depends on the
value of the input; however, reverbs, choruses, etc. are linear, no
matter that they depend on past input (low-pass and high-pass
filters also depend on past input, really :-)
y(t) = sum ( alpha_k * delta (x - kT) )
with k >= 0. If you do
T(a*x1(t) + b*x2(t)) = a*y1(t) + b*y2(t) = a*T(x1(t)) + b*T(x2(t))
y1 and y2 differ only because of the alpha_k coefficients, so the
filter is definitely linear :-)
Stefano
Ooops... what did I say...
I just woke up and I'm a bit confused.
The y(t) I was referring to is actually the h(t) of a delay :-)

Stefano
Thorsten Wilms
2007-02-17 10:47:59 UTC
Permalink
Post by Camilo Polyméris
Or if two consecutive plugins do FFT,
the first could pass the information to the second in the frequency domain.
Neither of these would work with current plugin architectures, though.
Well, there has been that idea floating around, about having converter
plugins to and from frequency domain and other plugins that work in
frequency domain exclusively. Using LV2 with whatever extension(s)
would be necessary.
--
Thorsten Wilms

Thorwil's Creature Illustrations:
http://www.printfection.com/thorwil
Steve Harris
2007-02-17 12:16:40 UTC
Permalink
Post by Thorsten Wilms
Post by Camilo Polyméris
Or if two consecutive plugins do FFT,
the first could pass the information to the second in the
frequency domain.
Neither of these would work with current plugin architectures, though.
Well, there has been that idea floating around, about having converter
plugins to and from frequency domain and other plugins that work in
frequency domain exclusively. Using LV2 with whatever extension(s)
would be necessary.
Yeah, I still plan to do this if I ever get the time.

- Steve
Leonard Ritter
2007-02-14 04:18:38 UTC
Permalink
Post by Stefano D'Angelo
Hi all,
who would be interested in writing a processing plugin standard
wrapper (LADSPA, DSSI, LV2, VST, etc.)?
i read the follow-up posts by other contributors, and i believe that
your idea will fail, because of the paradoxical nature of what you want.

the following example is based on experience and regards any kind of
wrapper problem.

say you are confronted with supporting 4 different interfaces. although
they all more or less target the same goal, you believe that this is 3
interfaces too many, and you would like to have only one interface to
talk to, so you don't have to learn 4 different interfaces.

because writing your own stuff yourself (problem solving) is of course
more fun than learning the interface of someone else (understanding a
solved problem), and also because you perhaps believe that you can play
an important role in the history of interface development, you decide to
write a new interface, the one to rule them all.

so you have now 4+1 = 5 interfaces. theoretically, the issue should be
solved, since you have now one central interface to talk to the other 4.

however you disregard that in the process of writing your own interface,
you learned all 4 other interfaces (so you could translate between
them), which is exactly what you wanted to avoid. of course other people
would profit from your work, ideally.

the second issue is that your 5th interface accumulates all features of
the other 4, which means that it will be the hardest to comprehend.

and this is where we enter a vicious cycle, which hints a bit at the
past of those other 4 interfaces. say a new person comes along, as
ambitious and imaginative as you are. looking at these 5 different
interfaces, and also seeing certain superficial issues with your
interface, that person decides that those 5 interfaces need to be
wrapped up into a 6th interface.

from my experience, the solution to reducing the amount of interfaces
one has to talk to is not to add another one, but to comprehend those
that exist, and just use the most popular one, with the best licence.

the route i took for aldrin was to support none of these (since they
didn't match the problem that i had), and, if the need for a certain
piece of dsp code arises, port that code over to my private interface.

my experience is that programming by contract is something you need in a
closed source environment. in an open source environment, you usually do
fine just moving around the code that you want, e.g. i ported
freeverb3 back from ladspa to lunar in half an hour. this way, you
exercise full control over the code executed in your process.

KISSes,
--
Leonard Ritter

-- Freelance Art & Logic
-- http://www.leonard-ritter.com
Steve Harris
2007-02-14 08:10:50 UTC
Permalink
Post by Leonard Ritter
Post by Stefano D'Angelo
Hi all,
who would be interested in writing a processing plugin standard
wrapper (LADSPA, DSSI, LV2, VST, etc.)?
...
Post by Leonard Ritter
so you have now 4+1 = 5 interfaces. theoretically, the issue should be
solved, since you have now one central interface to talk to the other 4.
however you disregard that in the process of writing your own
interface,
you learned all other 4 interfaces (so you could translate between
them), which is exactly what you wanted to avoid. of course other people
would profit from your work, ideally.
the second issue is that your 5th interface accumulates all
features of
the other 4, which means that it will be the hardest to comprehend.
...
Post by Leonard Ritter
the route i took for aldrin was to support none of these (since they
didn't match the problem that i had), and, if the need for a certain
piece of dsp code arises, port that code over to my private interface.
So... you're inviting him to learn from your mistake? ;)

I agree with the bulk of what you wrote: the solution to the problem
of too many APIs is not to add another. But I don't think your own
code is a good example of that.

I am also not in a position to throw any stones.

- Steve
Leonard Ritter
2007-02-14 17:19:10 UTC
Permalink
Post by Steve Harris
So... you're inviting him to learn from your mistake? ;)
what mistake?
Post by Steve Harris
I agree with the bulk of what you wrote, the solution to the problem
of too many APIs is not to add another, but I don't think your own
code is a good example of that.
my own interface code is not meant for public consumption.
--
Leonard Ritter

-- Freelance Art & Logic
-- http://www.leonard-ritter.com
Stefano D'Angelo
2007-02-14 09:21:26 UTC
Permalink
Hi,
Leonard, I know what you mean; for me it would be easier too to just
learn one, pick it up, let go of the others, and not create much more
confusion with a wrapper.
I perfectly understand also that wrappers usually add unneeded
complexity, which is obviously the worst thing to do, especially with
pieces of code that should run in real time.

I don't think I'm a great programmer or a genius with incredible ideas
in mind; I'm quite young and inexperienced, and that's why I'm here.
However, I'm just trying to solve a problem that is still open, since
no satisfying solution has been developed yet (yes, I know about fst,
dssi-vst-bridge, etc., but I don't think they can be considered much
more than well-done proofs of concept).

On the other hand, I'm willing to work on different levels and with an
open mind, so that such a wrapper could not only provide a robust
implementation for header-defined standards like LADSPA and solve
some of the issues which would arise with API bridges (I think
that the only API which could wrap VST decently, for example, is LV2,
but it will take a bunch of extensions), but also introduce new
standard-independent features (e.g. LASH support), as well as provide
a tool to test new solutions more easily (again, LV2 could possibly do
this, but you need a very extension-aware program to do, for example,
per-network optimizations).

I understand that most of you don't feel the need for such a thing,
because LADSPA support is everywhere and lots of LADSPA plugins are
good, but from my point of view there are thousands of VST plugins
around and thousands of hardware machines that use VST, and that little
program I'm going to write simply can't ignore this situation.
I'm absolutely pro-LADSPA/LV2, and I particularly dislike the VST license,
but that's not a reason to exclude support right now. And it is not a
reason to stop experimenting with new possible solutions.
In the end, freedom is choice, isn't it?

Stefano
Leonard Ritter
2007-02-14 17:22:41 UTC
Permalink
Post by Stefano D'Angelo
I understand that most of you don't feel the need to have such thing,
because LADSPA support is everywhere and lots of LADSPA plugins are
good, but from my point of view there are thousands of VST plugins
around and thousands of hardware machines that use VST and that little
program I'm going to write simply can't ignore this situation.
actually i'd rather like to see all audio plugin interfaces vanish.
audio plugins are a concept from the closed source world. i've worked
long enough with them to be able to say that the concept is severely
flawed in multiple ways.
--
Leonard Ritter

-- Freelance Art & Logic
-- http://www.leonard-ritter.com
Stefano D'Angelo
2007-02-14 17:30:17 UTC
Permalink
Post by Leonard Ritter
Post by Stefano D'Angelo
I understand that most of you don't feel the need to have such thing,
because LADSPA support is everywhere and lots of LADSPA plugins are
good, but from my point of view there are thousands of VST plugins
around and thousands of hardware machines that use VST and that little
program I'm going to write simply can't ignore this situation.
actually i'd rather like to see all audio plugin interfaces vanish.
audio plugins is a concept from the closed source world. i've worked
long enough with it to be able to say that this concept is severely
flawed in multiple ways.
This is a good point and truly interests me.
Why do you think that? And what would you suggest, instead?

Stefano
Leonard Ritter
2007-02-14 18:11:09 UTC
Permalink
Post by Stefano D'Angelo
Post by Leonard Ritter
actually i'd rather like to see all audio plugin interfaces vanish.
audio plugins is a concept from the closed source world. i've worked
long enough with it to be able to say that this concept is severely
flawed in multiple ways.
This is a good point and truly interests me.
Why do you think that? And what would you suggest, instead?
this is a quite raw opinion that needs to be fleshed out and turned into
some kind of blog article. i have this thought boiling for quite some
time, and i suppose it's about time i elaborate on it.

i'll let you know when it's ready.
--
Leonard Ritter

-- Freelance Art & Logic
-- http://www.leonard-ritter.com
Lars Luthman
2007-02-14 17:47:53 UTC
Permalink
Post by Leonard Ritter
Post by Stefano D'Angelo
I understand that most of you don't feel the need to have such thing,
because LADSPA support is everywhere and lots of LADSPA plugins are
good, but from my point of view there are thousands of VST plugins
around and thousands of hardware machines that use VST and that little
program I'm going to write simply can't ignore this situation.
actually i'd rather like to see all audio plugin interfaces vanish.
audio plugins is a concept from the closed source world. i've worked
long enough with it to be able to say that this concept is severely
flawed in multiple ways.
There is definitely something to be said for a user just having to
install a single plugin package of, say, a cool new filter, and
immediately have it work in his modular synth, his sample editor, his
harddisk recorder etc. Having the source doesn't help someone who
doesn't know how to program, and even if it was trivial to take the
filter code and wrap it in new interfaces for the modular synth, the
sample editor, and the harddisk recorder, that is a lot of unnecessary
work. Sure, one single plugin interface is usually not perfect for all
uses, but having to write multiple interfaces is even less ideal.


--ll
Leonard Ritter
2007-02-14 18:15:07 UTC
Permalink
Post by Lars Luthman
There is definitely something to be said for a user just having to
install a single plugin package of, say, a cool new filter, and
immediately have it work in his modular synth, his sample editor, his
harddisk recorder etc. Having the source doesn't help someone who
doesn't know how to program,
you are switching from one radical to the other. there are possibilities
and advantages between those poles, but as said, this needs to be
fleshed out and turned into some kind of blog article. i have this
thought boiling for quite some time, and i suppose it's about time i
elaborate on it.
--
Leonard Ritter

-- Freelance Art & Logic
-- http://www.leonard-ritter.com
Lars Luthman
2007-02-14 18:29:04 UTC
Permalink
Post by Leonard Ritter
Post by Lars Luthman
There is definitely something to be said for a user just having to
install a single plugin package of, say, a cool new filter, and
immediately have it work in his modular synth, his sample editor, his
harddisk recorder etc. Having the source doesn't help someone who
doesn't know how to program,
you are switching from one radical to the other. there are possibilities
and advantages between those poles, but as said, this needs to be
fleshed out and turned into some kind of blog article. i have this
thought boiling for quite some time, and i suppose it's about time i
elaborate on it.
Well, if you want a piece of code to work in multiple environments you
either need to write multiple interfaces or make sure that those
different environments are equivalent from the new code's point of view
(i.e. uses a common plugin interface). I don't see how there can be
anything in between.


--ll
Paul Coccoli
2007-02-14 19:54:31 UTC
Permalink
Post by Stefano D'Angelo
I understand that most of you don't feel the need to have such thing,
because LADSPA support is everywhere and lots of LADSPA plugins are
good, but from my point of view there are thousands of VST plugins
around and thousands of hardware machines that use VST and that little
program I'm going to write simply can't ignore this situation.
I'm absolutely pro-LADSPA/LV2, and I particularly dislike VST license,
but it's not a reason to exclude support right now. And it is not a
reason to stop experimenting new possible solutions.
At the end freedom is choice, isn't it?
Stefano
Aren't you kind of glossing over the fact that those thousands of VST
plugins around were all written for a different OS, and therefore
"wrapping" them involves a whole mess of technical and philosophical
problems?

Personally, I'd rather see the effort go towards making LV2 a real,
workable standard with all the important features (presets, host tempo
sync, MIDI handling/processing) that some of the other standards have.
Stefano D'Angelo
2007-02-14 20:18:19 UTC
Permalink
Post by Paul Coccoli
Post by Stefano D'Angelo
I understand that most of you don't feel the need to have such thing,
because LADSPA support is everywhere and lots of LADSPA plugins are
good, but from my point of view there are thousands of VST plugins
around and thousands of hardware machines that use VST and that little
program I'm going to write simply can't ignore this situation.
I'm absolutely pro-LADSPA/LV2, and I particularly dislike VST license,
but it's not a reason to exclude support right now. And it is not a
reason to stop experimenting new possible solutions.
At the end freedom is choice, isn't it?
Stefano
Aren't you kind of glossing over the fact that those thousands of VST
plugins around were all written for a different OS, and therefore
"wrapping" them involves a whole mess of technical and philosophical
problems?
Personally, I'd rather see the effort go towards making LV2 a real,
workable standard with all the important features (presets, host tempo
sync, MIDI handling/processing) that some of the other standards have.
The wrapper would be cross-platform (meaning the VST implementations
for Linux, Mac and Windows would be different, of course).
However, if I work on A it doesn't mean I won't work on B, just
that I'll have less time to spend on A. And since I'm not an LV2 guru,
well, maybe I'm not even capable of that.
Then again, LV2 hasn't been released yet, so maybe it's better to wait right now.
Post by Lars Luthman
Well, if you want a piece of code to work in multiple environments you
either need to write multiple interfaces or make sure that those
different environments are equivalent from the new code's point of view
(i.e. uses a common plugin interface). I don't see how there can be
anything in between.
What about a text file with a math formula within it to be used as a
"processing object"?
Ok, it's not a piece of code, but...
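For what it's worth, a toy version of that idea fits in a few lines of Python (the one-formula "file format" and the loader name are invented here; a real system would want a proper parser, as FAUST has, rather than eval):

```python
# Toy version of "a text file with a math formula as a processing object":
# the formula is compiled once and applied per sample. The one-variable
# convention (the sample is named "x") is invented for this sketch; using
# eval on untrusted files would be unsafe in a real plugin host.
import math

def load_formula_plugin(formula_text):
    """Compile e.g. 'math.tanh(3 * x)' into a per-buffer function."""
    code = compile(formula_text, "<plugin>", "eval")
    return lambda buf: [eval(code, {"math": math, "x": s}) for s in buf]

distort = load_formula_plugin("math.tanh(3 * x)")   # a soft clipper
print(distort([0.0, 0.1, 1.0]))
```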

Stefano
Stephen Sinclair
2007-02-14 20:40:20 UTC
Permalink
Post by Stefano D'Angelo
What about a text file with a math formula within it to be used as a
"processing object"?
Ok, it's not a piece of code, but...
sounds a bit like FAUST.
http://faust.grame.fr/

steve
Stefano D'Angelo
2007-02-14 20:56:24 UTC
Permalink
Post by Stephen Sinclair
Post by Stefano D'Angelo
What about a text file with a math formula within it to be used as a
"processing object"?
Ok, it's not a piece of code, but...
sounds a bit like FAUST.
http://faust.grame.fr/
Well, FAUST developers could take advantage of such a wrapper too, if it
were implemented the way I previously suggested: with code to build
plugins within the wrapping modules.

Stefano
Xavier Amatriain
2007-02-14 21:06:03 UTC
Permalink
Hard to jump in the thread at this point, but let me try...

I agree that the plugin approach is severely flawed in many senses
(including the false sense of "freedom" that it gives to developers, who
are then caught in proprietary specifications). Having a better open
standard like LV2 is a good thing, but I see Stefano's point, because this
is not going to help promote interoperability, is it?

My humble opinion is that the solution is already sketched out by
existing tools. Imagine an open standard that combines audio streaming
connections a la Jack with control interchange a la OSC. Make it 100%
cross-platform, add a little dataflow theory into it to guarantee
proper scheduling and avoid deadlocks and... voila!
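The scheduling part of that, at least, is well-trodden ground; a minimal sketch in Python (the node names are hypothetical) orders a processing graph so each unit runs after its inputs, and reports cycles instead of deadlocking:

```python
# Minimal static dataflow scheduling: topologically sort the processing
# graph so every node runs after its inputs, and detect cycles (which
# would deadlock a naive pull-based network). Node names are made up.
from graphlib import TopologicalSorter  # Python 3.9+

graph = {                 # node -> set of nodes it reads from
    "reverb":  {"eq"},
    "eq":      {"synth", "sampler"},
    "mixer":   {"reverb", "sampler"},
    "synth":   set(),
    "sampler": set(),
}

order = list(TopologicalSorter(graph).static_order())
print(order)   # sources first, "mixer" last

# A feedback edge would raise graphlib.CycleError instead of deadlocking:
# TopologicalSorter({"a": {"b"}, "b": {"a"}}).static_order()
```

Real dataflow schedulers (synchronous dataflow and friends) also account for differing block rates, but the cycle check and execution ordering above are the core of "proper scheduling and no deadlocks".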

Curiously enough this has a lot to do with another email sent by another
CLAM developer (Pau) to the list ("Data-flow systems integration").

I think an effort in integrating data + control inter-application
communication would be much more worthwhile than following the plugin
route.

Stefano, have you thought about what your wrapper will suffer when, say,
VST 3.0 is published and you have to rewrite all your stuff simply
because Steinberg has decided to rename a few functions or change the
behavior of many others? Are you going to support a subset of the
specification (like most hosts do)? A subset of all of them? A superset?

Also, your talk of LTI makes me think that you are not familiar with
models of computation (e.g. Dataflow Networks). Contact me off-list if
you need good pointers.

Xavier
Post by Stephen Sinclair
Post by Stefano D'Angelo
What about a text file with a math formula within it to be used as a
"processing object"?
Ok, it's not a piece of code, but...
sounds a bit like FAUST.
http://faust.grame.fr/
steve
--
Stefano D'Angelo
2007-02-14 21:18:21 UTC
Permalink
Post by Xavier Amatriain
Hard to jump in the thread at this point, but let me try...
I agree that the plugin approach is severely flawed in many senses
(including the false sense of "freedom" that gives to developers that
are then caught in proprietary specifications). Having a better open
standard like LV2 is a good thing but I see Stefano's point because this
is not going to help promote interoperability, is it?
Right.
Post by Xavier Amatriain
My humble opinion is that the solution is already sketched out by
existing tools. Imagine an open standard that combines audio streaming
connection a la Jack with control interchange a la OSC. Make it 100%
cross-platform, add a little of Dataflow theory into it to guarantee
proper scheduling and avoiding deadlocks and... voila!
... just a dream? :-)
Post by Xavier Amatriain
Curiously enough this has a lot to do with another email sent by another
CLAM developer (Pau) to the list ("Data-flow systems integration").
I think an effort in integrating data + control inter-application
communication would be much more worthwhile than following the plugin
route.
Stefano, have you though what your wrapper will suffer when let's say
VST 3.0 is published and you have to rewrite all your stuff simply
because Steinberg has decided to rename a few functions or change the
behavior of many others. Are you going to support a subset of the
specification (like most hosts do)? a subset of all of them? a superset?
I mean a superset, and I know that's a tremendous amount of work; I do
hope that someone will join in.
Post by Xavier Amatriain
Also your talk on LTI makes me think that you are not familiar with
models of computation (e.g. Dataflow Networks). Contact me off-list if
you need good pointers).
I'm going to do it right now. I'm just a beginner in DSP-related stuff
(I have my first DSP exam the day after tomorrow), but it interests me
a lot.
I have some texts on the topic, but knowledge is always welcome :-)

Stefano
Loki Davison
2007-02-16 04:24:19 UTC
Permalink
Post by Stefano D'Angelo
Post by Xavier Amatriain
Hard to jump in the thread at this point, but let me try...
I agree that the plugin approach is severely flawed in many senses
(including the false sense of "freedom" that gives to developers that
are then caught in proprietary specifications). Having a better open
standard like LV2 is a good thing but I see Stefano's point because this
is not going to help promote interoperability, is it?
Right.
Post by Xavier Amatriain
My humble opinion is that the solution is already sketched out by
existing tools. Imagine an open standard that combines audio streaming
connection a la Jack with control interchange a la OSC. Make it 100%
cross-platform, add a little of Dataflow theory into it to guarantee
proper scheduling and avoiding deadlocks and... voila!
... just a dream? :-)
Post by Xavier Amatriain
Curiously enough this has a lot to do with another email sent by another
CLAM developer (Pau) to the list ("Data-flow systems integration").
I think an effort in integrating data + control inter-application
communication would be much more worthwhile than following the plugin
route.
Stefano, have you though what your wrapper will suffer when let's say
VST 3.0 is published and you have to rewrite all your stuff simply
because Steinberg has decided to rename a few functions or change the
behavior of many others. Are you going to support a subset of the
specification (like most hosts do)? a subset of all of them? a superset?
I mean a superset and I know that's a tremendous amount of work, I do
hope that someone will join in.
Post by Xavier Amatriain
Also your talk on LTI makes me think that you are not familiar with
models of computation (e.g. Dataflow Networks). Contact me off-list if
you need good pointers).
I'm going to do it right now. I'm just a beginner in DSP-related stuff
(I have my first DSP exam the day after tomorrow), but it interests me
a lot.
I have some texts on the topic, but knowledge is always welcome :-)
Stefano
As a beginner, maybe it is a much more solid option to work on writing
some simple LV2 plugins instead of a new plugin wrapper/standard. Some
new plugins would probably benefit everyone much more than even more
hot air. Helping out with the LV2 Zyn efforts would be pretty awesome,
and I'm sure working on an existing project is great for having some
direction and assistance in getting better.

Loki
Stefano D'Angelo
2007-02-16 09:43:12 UTC
Permalink
Post by Loki Davison
Post by Stefano D'Angelo
Post by Xavier Amatriain
Hard to jump in the thread at this point, but let me try...
I agree that the plugin approach is severely flawed in many senses
(including the false sense of "freedom" that gives to developers that
are then caught in proprietary specifications). Having a better open
standard like LV2 is a good thing but I see Stefano's point because this
is not going to help promote interoperability, is it?
Right.
Post by Xavier Amatriain
My humble opinion is that the solution is already sketched out by
existing tools. Imagine an open standard that combines audio streaming
connection a la Jack with control interchange a la OSC. Make it 100%
cross-platform, add a little of Dataflow theory into it to guarantee
proper scheduling and avoiding deadlocks and... voila!
... just a dream? :-)
Post by Xavier Amatriain
Curiously enough this has a lot to do with another email sent by another
CLAM developer (Pau) to the list ("Data-flow systems integration").
I think an effort in integrating data + control inter-application
communication would be much more worthwhile than following the plugin
route.
Stefano, have you though what your wrapper will suffer when let's say
VST 3.0 is published and you have to rewrite all your stuff simply
because Steinberg has decided to rename a few functions or change the
behavior of many others. Are you going to support a subset of the
specification (like most hosts do)? a subset of all of them? a superset?
I mean a superset and I know that's a tremendous amount of work, I do
hope that someone will join in.
Post by Xavier Amatriain
Also your talk on LTI makes me think that you are not familiar with
models of computation (e.g. Dataflow Networks). Contact me off-list if
you need good pointers).
I'm going to do it right now. I'm just a beginner in DSP-related stuff
(I have my first DSP exam the day after tomorrow), but it interests me
a lot.
I have some texts on the topic, but knowledge is always welcome :-)
Stefano
As a beginner maybe it is a much more solid option to work on writing
some simple lv2 plugins instead of a new plugin wrapper/standard. Some
new plugins would probably benifit everyone much more than even more
hot air. Helping out with the lv2 zyn efforts would be pretty awesome
and I'm sure working on an existing project is great for having some
direction and assistance at getting better.
I know you're right, but it is not something I need. And when I don't
need a thing, I just get bored :-)

Stefano
Malte Steiner
2007-02-14 20:36:57 UTC
Permalink
Post by Paul Coccoli
Personally, I'd rather see the effort go towards making LV2 a real,
workable standard with all the important features (presets, host tempo
sync, MIDI handling/processing) that some of the other standards have.
Count me in; I prefer putting developer time into the creation and
communication of native formats like LADSPA/DSSI and LV2 over supporting
proprietary closed formats.

Communication means telling the world that there are some plugins for
Linux too. When I look at KVRAudio I see nearly nothing in the Linux
entries, so I immediately submitted Ardour as a host, which was missing
too; a shame, because it's an important plus for Linux as a music platform.

The problem is that the kids watch that site to see what's going on and
what is available, and they perceive Linux as kind of lame for audio,
missing the fact that there are several standalone synths, so plugins
are not so necessary. Of course there are the Linux audio pages, and
they are good for us, but the kids live elsewhere, although I think a
slight interest is there, especially with all the inherent problems of
Windows Vista.

Why bother with the kids? To get a critical mass so the driver situation
gets improved; those are my selfish motives.

I am about to create some DSSI synths. I wanted to share some Pd patches
which act and sound very well as multipurpose synths (as opposed to very
specialized patches which are good for only one song), but I find them
hard to distribute because they depend on some externals which might not
be available; too complicated for instant gratification. So I now prefer
a self-contained format. More soon here or on the LAU list...

Cheers,

Malte
--
Malte Steiner
media art + development
-www.block4.com-
Jeff McClintock
2007-02-18 21:32:17 UTC
Permalink
Post by Stefano D'Angelo
I actually don't know how many plugins are LTI, but, for example, a
lot of delays, reverbs, choruses, eq. filters, compressors, modulators
and "sound mixers" should be, and that's quite enough after all.
Yeah, it's a good optimization. The SynthEdit plugin API supports
inputs being flagged as 'linear'; if several such plugins are used in
parallel, they are automatically collapsed into a single instance which
is fed the summed signals of the original plugins. Plugins are collapsed
only when their control inputs are the same.

BEFORE optimization:

[plugin]-->[delay1]------>
[plugin]-->[delay2]-/

AFTER:

[plugin]--->[delay1]--->
[plugin]-/

e.g. two parallel 100ms delays are combined. Two different length
delays aren't.

This is most useful in synth patches where each voice is an identical
parallel sub-patch.
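The collapse described above relies only on linearity: feeding one linear, time-invariant process the sum of two signals gives the same result as summing the outputs of two identical instances. A minimal sketch in Python (not SynthEdit's actual API; all names here are invented for illustration):

```python
# Two identical linear FIR "delay" plugins in parallel with summed outputs,
# versus ONE instance fed the summed inputs. For an LTI process the results
# are identical, which is what justifies the collapse.

def fir(signal, impulse_response):
    """Direct-form FIR convolution (linear and time-invariant)."""
    out = [0.0] * (len(signal) + len(impulse_response) - 1)
    for i, x in enumerate(signal):
        for j, h in enumerate(impulse_response):
            out[i + j] += x * h
    return out

h = [0.0, 0.0, 1.0]          # toy 2-sample delay
a = [1.0, 2.0, 3.0]
b = [0.5, -1.0, 4.0]

# BEFORE: two instances, outputs summed
before = [x + y for x, y in zip(fir(a, h), fir(b, h))]

# AFTER: one instance fed the summed inputs
after = fir([x + y for x, y in zip(a, b)], h)

assert before == after
```

Note that the collapse is only safe when the instances have identical control inputs (here, the same impulse response h), exactly as stated above.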


Jeff McClintock
Stefano D'Angelo
2007-02-18 22:41:06 UTC
Permalink
Post by Jeff McClintock
Post by Stefano D'Angelo
I actually don't know how many plugins are LTI, but, for example, a
lot of delays, reverbs, choruses, eq. filters, compressors, modulators
and "sound mixers" should be, and that's quite enough after all.
Yeah, It's a good optimization. The SynthEdit plugin API supports
inputs being flagged as 'linear', if several such plugins are used in
parallel they are automatically collapsed into a single instance which
is fed the summed signals of the original plugins. Plugin are collapsed
only when their control inputs are the same.
[plugin]-->[delay1]------>
[plugin]-->[delay2]-/
[plugin]--->[delay1]--->
[plugin]-/
e.g. two parallel 100ms delays are combined. Two different length
delays aren't.
This is most useful in synth patches where each voice is an identical
parallel sub-patch.
Jeff McClintock
After the first 40 mails ( :-P ) I still have the impression that this
is a good thing after all, and that it is more time-consuming than hard
to do, so I think that if someone joins me we can make all of this
become reality. So here I'm asking once again: will someone work on
this with me?

Anyway, I'm continuously uploading the code I'm working on to the
FreeADSP SVN repository (naspro folder), so that everyone can see how
things are going and can give suggestions and/or make arguments and/or
anything else.

Right now I've only set up an autotoolized tree (+ Doxygen and Texinfo
support), written some header files and a good piece of the module
loading part (which obviously still has to be refined).

I say it again in case it is not clear: this will NOT be a new plugin
API, since its aim is completely different: it's all about
interoperability among existing standards and, at some point in the
future, about optimization and experimentation/research into new
processing paradigms.

Sincerely,

Stefano
Camilo Polyméris
2007-02-19 00:22:56 UTC
Permalink
Post by Jeff McClintock
Post by Stefano D'Angelo
I actually don't know how many plugins are LTI, but, for example, a
lot of delays, reverbs, choruses, eq. filters, compressors, modulators
and "sound mixers" should be, and that's quite enough after all.
Yeah, It's a good optimization. The SynthEdit plugin API supports
inputs being flagged as 'linear', if several such plugins are used in
parallel they are automatically collapsed into a single instance which
is fed the summed signals of the original plugins. Plugin are
collapsed only when their control inputs are the same.
[plugin]-->[delay1]------>
[plugin]-->[delay2]-/
[plugin]--->[delay1]--->
[plugin]-/
e.g. two parallel 100ms delays are combined. Two different length
delays aren't.
This is most useful in synth patches where each voice is an
identical parallel sub-patch.
Jeff McClintock
How often is more than one plugin with the same control inputs used in
parallel? I was rather thinking of collapsing (or swapping) plugins in
series. They'd have to be linear and time-invariant, of course.
Or maybe plugins could 'know' how to collapse themselves, sort of like
overriding Plugin::operator+(const Plugin&), to use a C++ metaphor.

Camilo
Stefano D'Angelo
2007-02-19 13:18:36 UTC
Permalink
Post by Camilo Polyméris
Post by Jeff McClintock
Post by Stefano D'Angelo
I actually don't know how many plugins are LTI, but, for example, a
lot of delays, reverbs, choruses, eq. filters, compressors, modulators
and "sound mixers" should be, and that's quite enough after all.
Yeah, It's a good optimization. The SynthEdit plugin API supports
inputs being flagged as 'linear', if several such plugins are used in
parallel they are automatically collapsed into a single instance which
is fed the summed signals of the original plugins. Plugin are
collapsed only when their control inputs are the same.
[plugin]-->[delay1]------>
[plugin]-->[delay2]-/
[plugin]--->[delay1]--->
[plugin]-/
e.g. two parallel 100ms delays are combined. Two different length
delays aren't.
This is most useful in synth patches where each voice is an
identical parallel sub-patch.
Jeff McClintock
How often are more than one plugin with the same control inputs used in
paralel? I was rather thinking of colapsing (or swapping) plugins in
series. They'd have to be linear and time invariant, of course.
Or maybe plugins could 'know' how to colapse themselves, sort of like
overriding Plugin::operator+(const Plugin&), to use a C++ metaphor.
Well, stereo sound passing through mono plugins is one case.
However, as Jeff describes this optimization, it is applicable when
output signals are summed, and I don't know how often that happens.
Anyway, it is another idea for optimizing processing for linear plugins,
definitely not something to discard.
This makes me think that some common basic "pieces" like mixers and
delay filters could have special properties which allow even more
aggressive optimization. Maybe it's worth considering how these special
blocks could be developed and used.

Stefano
Paul Davis
2007-02-19 15:36:09 UTC
Permalink
Post by Stefano D'Angelo
Post by Camilo Polyméris
How often are more than one plugin with the same control inputs used in
paralel? I was rather thinking of colapsing (or swapping) plugins in
series. They'd have to be linear and time invariant, of course.
Or maybe plugins could 'know' how to colapse themselves, sort of like
overriding Plugin::operator+(const Plugin&), to use a C++ metaphor.
Well, stereo sounds passing through mono plugins is one case.
nope. that's not a linear arrangement of the two mono plugins, but a
parallel arrangement. the signal going to each instance of the mono
plugin is different.
Post by Stefano D'Angelo
However as Jeff describes this optimization, it is applicable when
output signals are summed, and I don't know how often it happens.
Anyway it is another idea to optimize processing for linear plugins,
definitively not something to discard.
This makes me think that some common basic "pieces" like mixers and
delay filters can have special properties which involve even more
aggressive optimization. Maybe it's worth considering how this special
blocks could be developed and used.
you can think all you want. unless there is a plugin->host callback that
allows the plugin to determine its operating environment in huge detail,
this kind of idea is pretty impossible to make use of.

--p
Stefano D'Angelo
2007-02-19 17:10:18 UTC
Permalink
Post by Paul Davis
Post by Stefano D'Angelo
Post by Camilo Polyméris
How often are more than one plugin with the same control inputs used in
paralel? I was rather thinking of colapsing (or swapping) plugins in
series. They'd have to be linear and time invariant, of course.
Or maybe plugins could 'know' how to colapse themselves, sort of like
overriding Plugin::operator+(const Plugin&), to use a C++ metaphor.
Well, stereo sounds passing through mono plugins is one case.
nope. thats not a linear arrangement of the two mono plugins, but a
parallel arrangement. the signal going to each instance of the mono
plugin is different.
I'm obscure even in Italian; I can just imagine how it sounds in
English :-)
I was not talking about that specific thing; I was talking about a
case which could benefit from some kind of parallel processing
merging.
Post by Paul Davis
Post by Stefano D'Angelo
However as Jeff describes this optimization, it is applicable when
output signals are summed, and I don't know how often it happens.
Anyway it is another idea to optimize processing for linear plugins,
definitively not something to discard.
This makes me think that some common basic "pieces" like mixers and
delay filters can have special properties which involve even more
aggressive optimization. Maybe it's worth considering how this special
blocks could be developed and used.
you can think all you want. unless there a plugin->host callback that
allows the plugin to determine its operating environment in huge detail,
this kind of idea is pretty impossible to make use of.
What?
Once again: misunderstood! These optimizations require that the
"wrapper" (I should stop calling it that) know about the network
of processing objects (read: plugins) and that the latter contain
"generic" information on their functionality (e.g. STFT for LTI proc.
objects).
Then the wrapper takes care of optimizing the network.

Stefano
Paul Davis
2007-02-19 17:20:54 UTC
Permalink
Post by Stefano D'Angelo
Post by Paul Davis
nope. thats not a linear arrangement of the two mono plugins, but a
parallel arrangement. the signal going to each instance of the mono
plugin is different.
I'm obscure even in Italian, I can just imagine how it can sound like
in English :-)
I was not talking about that specific thing, I was talking about a
case which could take benefit of some kind of parallel processing
merging.
you don't merge or gain anything with a parallel graph. only serial
ordering is amenable to "optimization", and such arrangements are very
rare.
Post by Stefano D'Angelo
Post by Paul Davis
you can think all you want. unless there a plugin->host callback that
allows the plugin to determine its operating environment in huge detail,
this kind of idea is pretty impossible to make use of.
What?
Once again: misunderstood! These optimizations involve that the
"wrapper" (I should stop calling it this way) knows about the network
of processing objects (read: plugins) and that these last ones contain
"generic" information on their functionality (ex. STFT for LTI proc.
objects).
Then the wrapper takes care of optimizing the net.
find me a host author who would want to use such a thing... managing
plugins is a central task of a host, and handing that over to some
"wrapper" that hides information from the host doesn't make the host's
life easier, it makes it more complex.

--p
Stefano D'Angelo
2007-02-20 14:03:51 UTC
Permalink
Post by Paul Davis
Post by Stefano D'Angelo
Post by Paul Davis
you can think all you want. unless there a plugin->host callback that
allows the plugin to determine its operating environment in huge detail,
this kind of idea is pretty impossible to make use of.
What?
Once again: misunderstood! These optimizations involve that the
"wrapper" (I should stop calling it this way) knows about the network
of processing objects (read: plugins) and that these last ones contain
"generic" information on their functionality (ex. STFT for LTI proc.
objects).
Then the wrapper takes care of optimizing the net.
find me a host author who would want to use such a thing... managing
plugins is a central task of a host, and handing that over to some
"wrapper" that hides information from the host doesn't make the host's
life easier, it makes it more complex.
In fact you wouldn't have to.
You could just use it as a plugin wrapper; network representation and
optimization would be of use only to some (experimental? advanced?
strange?) hosts.
Personally, I would use such a thing for my project.
Besides, who said that it has to hide such information from the host?...
It is definitely not just a plugin architecture wrapper, but a kind of
"inter-application Jack-like connectivity tool for processing objects,
with wrapping of already existing plugin architectures and network
optimization capabilities".
Just as when you build a GTK app you don't have to use every library
function, here you could use it as just a plugin wrapper.
Maybe the two things can be split, but optimization might depend
heavily on each processing object interface (aka plugin format).
Well, I thought about it last night, and maybe the whole thing could
be split into three parts:

1. A modular processing object format wrapper (one module for each format)
2. A GUI generation and handling (and maybe embedding? XEMBED?)
wrapper with format-specific modules and toolkit-specific modules
3. A library with network representation, optimization and processing
capabilities.

What do you think about it?

Stefano

Camilo Polyméris
2007-02-19 16:36:31 UTC
Permalink
Post by Stefano D'Angelo
Post by Stefano D'Angelo
Post by Jeff McClintock
Post by Stefano D'Angelo
I actually don't know how many plugins are LTI, but, for example, a
lot of delays, reverbs, choruses, eq. filters, compressors,
modulators
Post by Jeff McClintock
Post by Stefano D'Angelo
and "sound mixers" should be, and that's quite enough after all.
Yeah, It's a good optimization. The SynthEdit plugin API supports
inputs being flagged as 'linear', if several such plugins are used in
parallel they are automatically collapsed into a single instance which
is fed the summed signals of the original plugins. Plugin are
collapsed only when their control inputs are the same.
[plugin]-->[delay1]------>
[plugin]-->[delay2]-/
[plugin]--->[delay1]--->
[plugin]-/
e.g. two parallel 100ms delays are combined. Two different length
delays aren't.
This is most useful in synth patches where each voice is an
identical parallel sub-patch.
Jeff McClintock
How often are more than one plugin with the same control inputs used in
paralel? I was rather thinking of colapsing (or swapping) plugins in
series. They'd have to be linear and time invariant, of course.
Or maybe plugins could 'know' how to colapse themselves, sort of like
overriding Plugin::operator+(const Plugin&), to use a C++ metaphor.
Well, stereo sounds passing through mono plugins is one case.
However as Jeff describes this optimization, it is applicable when
output signals are summed, and I don't know how often it happens.
Anyway it is another idea to optimize processing for linear plugins,
definitively not something to discard.
This makes me think that some common basic "pieces" like mixers and
delay filters can have special properties which involve even more
aggressive optimization. Maybe it's worth considering how this special
blocks could be developed and used.
Stefano
Yes, I agree. I think if one comes up with a couple of "rules" like that,
it could be possible to design a system which automatically simplifies
processing networks.
To recap:
* If two parallel filters have equal control inputs and their outputs
are summed, replace them with one filter and feed it the summed inputs.
* If two serial filters are LTI, their impulse responses can be added into
one filter.
* If two serial filters are LTI and their impulse response is unknown,
they can be swapped.
* Filter "classes" could know how to merge two instances into one. Those
instances may even cancel each other out.
* Remove filter chains which have no outputs.
etc... With a little thinking and some formal work, one could come up
with more ideas like those.
Software like Pure Data and jMax (which use such "common basic pieces" in
many different configurations) could benefit from such a system. I looked
at their websites, but could not find any references to similar ideas.
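The serial-LTI merge rule above can be sketched as a tiny graph-rewrite pass. This is a hypothetical illustration, not code from any existing host or wrapper; it uses the fact that a series connection of two LTI filters is itself LTI, with an impulse response equal to the convolution of the two:

```python
# Hedged sketch: a chain is a list of filters; LTI ones carry an impulse
# response, others are opaque. Adjacent LTI filters collapse into one whose
# impulse response is the convolution of the two. All names are invented.

def convolve(h1, h2):
    out = [0.0] * (len(h1) + len(h2) - 1)
    for i, a in enumerate(h1):
        for j, b in enumerate(h2):
            out[i + j] += a * b
    return out

def simplify(chain):
    """chain: list of ('lti', impulse_response) or ('opaque', name)."""
    result = []
    for node in chain:
        if result and node[0] == 'lti' and result[-1][0] == 'lti':
            result[-1] = ('lti', convolve(result[-1][1], node[1]))
        else:
            result.append(node)
    return result

chain = [('lti', [1.0, 0.5]), ('lti', [0.0, 1.0]), ('opaque', 'distortion')]
simplified = simplify(chain)
assert simplified == [('lti', [0.0, 1.0, 0.5]), ('opaque', 'distortion')]
```

A real system would need more rules (and a real graph, not just a chain), but the shape of the pass is the same: scan for a pattern, rewrite in place.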

Camilo
Stefano D'Angelo
2007-02-19 17:16:27 UTC
Permalink
Post by Camilo Polyméris
Post by Stefano D'Angelo
Post by Stefano D'Angelo
Post by Jeff McClintock
Post by Stefano D'Angelo
I actually don't know how many plugins are LTI, but, for example, a
lot of delays, reverbs, choruses, eq. filters, compressors,
modulators
Post by Jeff McClintock
Post by Stefano D'Angelo
and "sound mixers" should be, and that's quite enough after all.
Yeah, It's a good optimization. The SynthEdit plugin API supports
inputs being flagged as 'linear', if several such plugins are used in
parallel they are automatically collapsed into a single instance which
is fed the summed signals of the original plugins. Plugin are
collapsed only when their control inputs are the same.
[plugin]-->[delay1]------>
[plugin]-->[delay2]-/
[plugin]--->[delay1]--->
[plugin]-/
e.g. two parallel 100ms delays are combined. Two different length
delays aren't.
This is most useful in synth patches where each voice is an
identical parallel sub-patch.
Jeff McClintock
How often are more than one plugin with the same control inputs used in
paralel? I was rather thinking of colapsing (or swapping) plugins in
series. They'd have to be linear and time invariant, of course.
Or maybe plugins could 'know' how to colapse themselves, sort of like
overriding Plugin::operator+(const Plugin&), to use a C++ metaphor.
Well, stereo sounds passing through mono plugins is one case.
However as Jeff describes this optimization, it is applicable when
output signals are summed, and I don't know how often it happens.
Anyway it is another idea to optimize processing for linear plugins,
definitively not something to discard.
This makes me think that some common basic "pieces" like mixers and
delay filters can have special properties which involve even more
aggressive optimization. Maybe it's worth considering how this special
blocks could be developed and used.
Stefano
Yes, I agree I think if one comes up with a couple of "rules" like that,
it could be possible to design a system which automatically simplifies
processing networks.
* If two parallel filters have equal control inputs and their outputs
are summed, replace with one filter and feed with summed inputs.
Maybe too specific... maybe plugins with different control inputs
can also be merged; I'll have to look into this.
Post by Camilo Polyméris
* If two serial filters are LTI, they impulse response can be added to
one filter.
Added = multiplied :-)
Post by Camilo Polyméris
* If two serial filters are LTI, and their impulse response is unknown,
they can be swapped.
Yes, but why?
Post by Camilo Polyméris
* Filter "classes" could know how to merge to instances into one. Those
instances may even cancel each other out.
Yes
Post by Camilo Polyméris
* Remove filter chains which have no outputs.
Absolutely not: what about a GUI oscillator?
Post by Camilo Polyméris
etc... With a little thinking and some formal work, one could come up
with more ideas like those.
I too think this is an interesting path to follow: NASPRO (the
wrapper) will absolutely go this way, just after wrapping LADSPA,
DSSI, LV2 (without extensions) and similar.
When LV2 extensions are implemented, work on this stuff will begin.
Post by Camilo Polyméris
Software like puredata and jMax (which use such "common basic pieces" in
many diferent configurations) could benefit from such a system. I looked
at their websites, but could not find any references to similar ideas.
Good, another use for it :-)

Stefano
Camilo Polyméris
2007-02-19 17:54:50 UTC
Permalink
Post by Stefano D'Angelo
Post by Camilo Polyméris
[...]
Yes, I agree I think if one comes up with a couple of "rules" like that,
it could be possible to design a system which automatically simplifies
processing networks.
* If two parallel filters have equal control inputs and their outputs
are summed, replace with one filter and feed with summed inputs.
Maybe too specific... maybe also plugins with different control inputs
can be merged, I must see this.
I meant Jeff's idea: the simplification of parallel filters. He
mentioned the SynthEdit API using it.
Post by Stefano D'Angelo
Post by Stefano D'Angelo
* If two serial filters are LTI, they impulse response can be added to
one filter.
Added = multiplied :-)
Actually, * (convolution)
Post by Stefano D'Angelo
Post by Stefano D'Angelo
* If two serial filters are LTI, and their impulse response is unknown,
they can be swapped.
Yes, but why?
That, per se, is no optimization, but moving stuff around can help
make the other rules apply.
Like, if you have:
eq -> LTIfilter -> eq ->
you can first swap the first two:
LTIfilter -> eq -> eq ->
and then reduce the second and third:
LTIfilter -> sum_of_eqs ->
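The swap in that example is safe because LTI processing is convolution, and convolution commutes. A minimal numeric check, with invented filter taps (not taken from any real eq or plugin):

```python
# Sketch: two LTI filters produce the same output in either order,
# which is what lets a graph optimizer move them past each other to
# bring mergeable filters next to one another.

def conv(a, b):
    """Discrete convolution of two sequences."""
    out = [0.0] * (len(a) + len(b) - 1)
    for i, ai in enumerate(a):
        for j, bj in enumerate(b):
            out[i + j] += ai * bj
    return out

lti = [1.0, 0.3, 0.1]     # some LTI filter's taps
eq = [0.8, -0.2]          # some eq's taps
x = [1.0, 0.0, 2.0, -1.0]

order1 = conv(eq, conv(lti, x))   # x -> LTIfilter -> eq
order2 = conv(lti, conv(eq, x))   # x -> eq -> LTIfilter

assert all(abs(p - q) < 1e-9 for p, q in zip(order1, order2))
```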
Post by Stefano D'Angelo
Post by Stefano D'Angelo
* Filter "classes" could know how to merge two instances into one. Those
instances may even cancel each other out.
Yes
Post by Stefano D'Angelo
* Remove filter chains which have no outputs.
Absolutely not: what about a GUI oscillator?
Ok: filter chains with neither outputs nor side effects. (Like
optimizing away calls to pure functions.)
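The dead-chain rule can be sketched as a reverse reachability walk: keep only nodes with a path to an output or with a side effect (a GUI oscilloscope would count as a side effect). The graph encoding, node names, and flag function below are all invented for illustration:

```python
# Sketch: prune processing-graph nodes that feed neither an output
# nor a side-effecting node.

def prune(edges, outputs, has_side_effect):
    """edges: dict node -> list of downstream nodes. Returns live set."""
    # Build the reverse (upstream) adjacency.
    upstream = {n: set() for n in edges}
    for src, dsts in edges.items():
        for d in dsts:
            upstream.setdefault(d, set()).add(src)
    # Walk upstream from every output / side-effecting node.
    live, stack = set(), [n for n in upstream
                          if n in outputs or has_side_effect(n)]
    while stack:
        n = stack.pop()
        if n in live:
            continue
        live.add(n)
        stack.extend(upstream.get(n, ()))
    return live

edges = {"osc": ["gain"], "gain": ["out"], "lfo": ["deadfx"], "deadfx": []}
live = prune(edges, outputs={"out"}, has_side_effect=lambda n: False)
print(live)  # {'out', 'gain', 'osc'}  -- the lfo -> deadfx chain is dropped
```

Marking the oscilloscope node as side-effecting would keep its whole upstream chain alive, matching the objection above.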
Post by Stefano D'Angelo
Post by Stefano D'Angelo
etc... With a little thinking and some formal work, one could come up
with more ideas like those.
I too think this is an interesting path to follow: NASPRO (the
wrapper) will definitely go this way, right after wrapping LADSPA,
DSSI, LV2 (without extensions) and the like.
Once LV2 extensions are implemented, work on this stuff will begin.
What's "naspro"?
Post by Stefano D'Angelo
Post by Stefano D'Angelo
Software like puredata and jMax (which use such "common basic pieces" in
many different configurations) could benefit from such a system. I looked
at their websites, but could not find any references to similar ideas.
Good, another use for it :-)
Stefano
Stefano D'Angelo
2007-02-19 18:08:36 UTC
Permalink
Post by Camilo Polyméris
Post by Stefano D'Angelo
Post by Stefano D'Angelo
Post by Stefano D'Angelo
Post by Stefano D'Angelo
Post by Jeff McClintock
Post by Stefano D'Angelo
I actually don't know how many plugins are LTI, but, for example, a
lot of delays, reverbs, choruses, eq. filters, compressors, modulators
and "sound mixers" should be, and that's quite enough after all.
Yeah, it's a good optimization. The SynthEdit plugin API supports
inputs being flagged as 'linear'; if several such plugins are used in
parallel, they are automatically collapsed into a single instance
which is fed the summed signals of the original plugins. Plugins are
collapsed only when their control inputs are the same.
[plugin]-->[delay1]------>
[plugin]-->[delay2]-/
[plugin]--->[delay1]--->
[plugin]-/
e.g. two parallel 100ms delays are combined. Two different-length
delays aren't.
This is most useful in synth patches where each voice is an
identical parallel sub-patch.
Jeff McClintock
How often is more than one plugin with the same control inputs used
in parallel? I was rather thinking of collapsing (or swapping)
plugins in series. They'd have to be linear and time-invariant, of
course.
Or maybe plugins could 'know' how to collapse themselves, sort of like
overriding Plugin::operator+(const Plugin&), to use a C++ metaphor.
Well, stereo sounds passing through mono plugins is one case.
However, as Jeff describes this optimization, it is applicable when
output signals are summed, and I don't know how often that happens.
Anyway, it is another idea for optimizing processing for linear
plugins, definitely not something to discard.
This makes me think that some common basic "pieces" like mixers and
delay filters can have special properties which allow even more
aggressive optimization. Maybe it's worth considering how these
special blocks could be developed and used.
Stefano
Yes, I agree. I think that if one comes up with a couple of "rules"
like that, it could be possible to design a system which automatically
simplifies processing networks.
* If two parallel filters have equal control inputs and their outputs
are summed, replace with one filter and feed with summed inputs.
Maybe too specific... maybe plugins with different control inputs can
also be merged; I'll have to look into this.
I meant Jeff's idea: the simplification of parallel filters. He
mentioned the SynthEdit API using it.
Post by Stefano D'Angelo
Post by Stefano D'Angelo
* If two serial filters are LTI, their impulse responses can be added
together into one filter.
Added = multiplied :-)
Actually, *
:-)
Post by Camilo Polyméris
Post by Stefano D'Angelo
Post by Stefano D'Angelo
* If two serial filters are LTI, and their impulse response is unknown,
they can be swapped.
Yes, but why?
That, per se, is no optimization, but moving stuff around can help
make the other rules apply.
eq -> LTIfilter -> eq ->
LTIfilter -> eq -> eq ->
LTIfilter -> sum_of_eqs ->
But how do you know they are two eqs if you don't know their impulse
response? Maybe you mean this happens when they are two instances of
the same object?
Post by Camilo Polyméris
Post by Stefano D'Angelo
Post by Stefano D'Angelo
* Filter "classes" could know how to merge two instances into one. Those
instances may even cancel each other out.
Yes
Post by Stefano D'Angelo
* Remove filter chains which have no outputs.
Absolutely not: what about a GUI oscillator?
Ok: filter chains with neither outputs nor side effects. (Like
optimizing away calls to pure functions.)
Ok.
Post by Camilo Polyméris
Post by Stefano D'Angelo
Post by Stefano D'Angelo
etc... With a little thinking and some formal work, one could come up
with more ideas like those.
I too think this is an interesting path to follow: NASPRO (the
wrapper) will definitely go this way, right after wrapping LADSPA,
DSSI, LV2 (without extensions) and the like.
Once LV2 extensions are implemented, work on this stuff will begin.
What's "naspro"?
It's what I'm working on and what we're talking about!
I called this thing NASPRO, which is a recursive acronym for NASPRO
Architecture for Sound PRocessing Objects. The real "naspro"
(pronounced like 'nnashpro') is a typical southern Italian icing used
for sweets :-)
Post by Camilo Polyméris
Post by Stefano D'Angelo
Post by Stefano D'Angelo
Software like puredata and jMax (which use such "common basic pieces" in
many different configurations) could benefit from such a system. I looked
at their websites, but could not find any references to similar ideas.
Good, another use for it :-)
Stefano