Discussion:
[asio-users] [boost][asio]Extension for audio device service
adrien courdavault
2013-04-23 19:54:16 UTC
Hello.

I'm starting this new thread to keep things clearer.
There is currently no way to manage audio endpoint connections easily.
It looks like some people might find this useful (as I do), and it was
suggested to me on the Boost dev list that I try to detail this as an
extension to Boost.ASIO.

That is why I'm creating this thread.

I'm trying to make a very basic first draft of the concepts and see if
this may be a good idea.

I have attached the first things I've written. It is very short and general.
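To give a flavour, the concepts in the draft look roughly like this (purely
a sketch; the exact names and shapes will certainly change):

    // Sketch of the concepts only; all names are provisional.
    struct audio_device_id;   // identifies a capture or playback device
    struct audio_format;      // how samples are encoded (rate, bits, channels)
    class  audio_port;        // the stream-like object opened on an io_service
                              // for a given device and format, and then
                              // read/written with asynchronous operations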

I would like to know:
* do you think I'm going in the right direction by seeing this as a
Boost.ASIO extension?
* do you have suggestions?
* would someone like to participate?

Thank you
Gruenke, Matt
2013-04-23 20:56:06 UTC
This is interesting. I'm not exactly a domain expert, but I know a bit
about the subject, and when I think about the vast number of existing
libraries and APIs, I'm not aware of a single one (much less a
cross-platform one) with a broad consensus that it got it right.

At the level of your approach, it seems simple enough. But I think your
view is too simplistic. You cannot just gloss over mixers, sample rate
& format negotiation, and the various layers that sit between the
library and the electrical (or optical) audio signal coming in/out of
the computer.

It might be interesting to look at possibly mapping the audio signal
routing problem to the resolver concept that boost uses for IP
networking. The analogy isn't perfect, but it might shed some insights
and provide concepts that can be borrowed.
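For comparison, this is roughly how the existing resolver is used, next to
what an audio analog might look like (the audio types below are invented,
purely to illustrate the analogy):

    #include <boost/asio.hpp>

    int main()
    {
        // The existing Boost.Asio resolver: a query is turned into a list
        // of endpoints that you can then try to connect to.
        boost::asio::io_service io;
        boost::asio::ip::tcp::resolver resolver(io);
        boost::asio::ip::tcp::resolver::query query("example.com", "http");
        boost::asio::ip::tcp::resolver::iterator it = resolver.resolve(query);
        (void)it;

        // A hypothetical audio analog (these types do not exist): a query
        // describing what you want resolves to a list of concrete audio
        // endpoints that you can then open.
        //
        //   audio::resolver aresolver(io);
        //   audio::resolver::query aquery(audio::playback, "default");
        //   audio::resolver::iterator ait = aresolver.resolve(aquery);
    }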

Also, keep in mind that one library often cannot be all things to all
people. The way audio APIs look often derives from which side the
author is coming from (i.e. device side, windowing system, or end-user
application developers), and whether they're more concerned with things
like A/V synchronization, ease of processing/mixing/routing,
low-latency, etc. Some of this might be addressed by decomposing the
problem into layers.

One option would be to write your extension as a separate library that's
designed to be used with Boost.Asio. Once you've gained some experience
and had a chance to try different approaches towards addressing some of
these issues, then it might make sense to take what you've done and
integrate that into Boost.Asio. Or not - it can be just as useful on
its own as it can as part of Boost. Granted, having it in Boost is a
great way to gain publicity, contributions, and simplify distribution,
but I recommend waiting until it's at least half-baked before worrying
too much about taking it there.


Matt


adrien courdavault
2013-04-23 21:37:34 UTC
Hi,

Thank you for the feedback.
I agree; I really wanted to focus on the audio connection side of the
existing driver interfaces.
I don't want to map all the possibilities of all drivers, but to stay
focused on the audio streaming and push interfaces.

Regarding the Boost.ASIO integration: this is currently one of my big
questions. The design at the core of Boost.ASIO seems good and very
generic, which is good.
At the same time, I don't know how the audio device service would fit
into that, because it would already be a pretty high-level service (very
different from the existing ones).

The idea is that addressing endpoints may look like IPs.
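Something like this, for example (purely a sketch; nothing is settled):

    #include <string>

    // Illustrative only: an audio endpoint addressed much like an IP
    // endpoint (address + port), here as a device id plus a direction
    // and a channel range.
    struct audio_endpoint {
        std::string device_id;       // e.g. "hw:0,0" on ALSA (just an example)
        bool        is_capture;      // capture or playback side
        unsigned    first_channel;   // first channel on that device
        unsigned    channel_count;   // how many channels from there
    };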

You talk about the optical or electrical signal; that is a concern of the
hardware interface itself, but from the driver interface you already
have access to a much higher abstraction.

About the library being all things to all people: currently I want to
focus on low-latency streaming APIs.
These are really basic and very common across the different OSs; they are
also low level and usable for professional audio applications. The
mixer/push and shared modes, on the other hand, are more complex, because
there are many more kinds of mixers.

Thank you again for the feedback. It is really appreciated.
Gruenke, Matt
2013-04-23 22:04:58 UTC
I see your points. Like I said, I don't claim to know enough to say
exactly how I think it *should* look.

I'd still like to see more about how sample rates and bit depth are
negotiated. I have certain audio devices that allow me to open them at
different sample rates, but there's an underlying native rate, and it
would be nice if the API exposed that as a preferred rate in some way. I
believe channel count might also be involved in this process, in some
cases.

I'm also curious what can be done about multi-stream synchronization, at
both capture and playback. Do any of the APIs you're planning to wrap
provide "capture timestamps" or details about I/O buffer
sizes/latencies?

How do you plan to signal buffer overrun (on capture) and underrun (on
playback)?

Finally, how do you plan to wrap multiple lower-level APIs on the same
system (e.g. ALSA, PulseAudio, and OSS, to use your examples)? Can
there be some way to enumerate these and select one? Since different
APIs might support different features, such as the timestamping and
latency-control features I mentioned above, it might be a good idea to
make them programmatically selectable. You mentioned audio devices
being identified by an audio_device_id, so I'm assuming you already
planned on being able to enumerate and select one of those, such as when
a PC has multiple sound cards or a sound card has multiple channels.
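In other words, something along these lines (hypothetical names, just to
make the question concrete):

    #include <string>
    #include <vector>

    // Hypothetical enumeration interface (names invented): list the
    // backends available on this system, then the devices each exposes.
    // The strings returned by list_devices() stand in for whatever an
    // audio_device_id ends up being.
    std::vector<std::string> list_backends();   // "alsa", "pulse", "oss", ...
    std::vector<std::string> list_devices(const std::string& backend);

A per-backend capability query would then let the caller pick the one that
supports, say, capture timestamps or explicit latency control.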


Matt

adrien courdavault
2013-04-23 23:24:26 UTC
Hello

It is true that the draft is not clear about sample rates and sample
encoding (e.g. 16-bit integer at a 44100 Hz sample rate); that is the
audio_format structure. In fact, I think I should separate the format
and the sampling rate (the format can also be 32-bit floating point, for
example).
Most APIs (AFAIK) take a preferred format, which is then passed to an
is_format_supported function. This results in the requested format being
accepted or not by the device. The device can often answer with the
closest supported format that it accepts.
Depending on the way you open the device, you may have different options.
If you open it in streaming mode, you will connect using a format directly
supported by the driver, and no other. If, however, you go through a
mixer (like what MS calls shared mode on Windows), the mixer will accept
pretty much any format you can describe.
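In code, the negotiation would look something like this (only a sketch; the
real signatures are not fixed yet):

    // Provisional shapes for the format negotiation described above.
    struct audio_device_id;        // however devices end up being identified

    struct audio_format {
        bool     floating_point;   // 32-bit float vs integer samples
        unsigned bits;             // 16, 24, 32, ...
        unsigned sample_rate;      // 44100, 48000, ...
        unsigned channels;
    };

    // Ask the device whether it can stream 'requested' directly.  If not,
    // 'closest' is filled in with the nearest format the device supports.
    bool is_format_supported(const audio_device_id& device,
                             const audio_format&    requested,
                             audio_format&          closest);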

The big difference is whether you are streaming or not.
Obviously, even in streaming mode you can add an audio format conversion
layer, but as this mode is used for low latency, you have direct access
to a format supported by the device. If in your implementation (your
program) you then want to convert the format, you can use the best
conversion available, but the idea of this opening mode is to give you
low-level and efficient access to the device.
The other mode (with a mixer onto which you push the audio data) has a
different goal: there you look more for portability and shared access to
the audio device. In that case the OS does not let you access the device
directly, because all applications (and the OS itself) play on the same
device, and for this reason the OS implements the mixing layer in this
second mode. The consequence is that this second mode, where you share
the device through the OS mixer with the other applications, is not the
right one for high-performance audio.

Is that clearer?
Gruenke, Matt
2013-05-06 20:42:43 UTC
Apologies for the delayed response.

If anyone is interested in this exchange, please speak up. As there
seems to be a general lack of interest, I plan to move the exchange
off-list.

I think sample rate negotiation is a big issue. So far, you seem to have
focused on the preference of the caller. However, some devices have a
global clock - failing to query that might mean you're forcing
resampling, which could incur a latency penalty and quality degradation.
Similarly, there might be a native sample format supported by the
hardware, and it might save some work if there's a way to query the
preferred format. But format and rate can probably be treated
independently of each other.
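Concretely, I'd hope for queries along these lines (hypothetical, of
course; the names are made up):

    struct audio_device_id;   // however devices are identified
    struct audio_format;      // sample encoding (rate, bits, channels)

    // Hypothetical queries: expose the device's native rate and format so
    // a caller can avoid forcing a resample it didn't ask for.
    unsigned     native_sample_rate(const audio_device_id& device);
    audio_format preferred_format(const audio_device_id& device);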

It still feels like a pretty big expansion of Boost.Asio's scope to
include this functionality, and the option remains to publish this as a
separate library that merely extends Boost.Asio. Also, consider that
through choices about how to handle some of the issues we're discussing,
you're imposing limitations on the library user. For most or all of
what Boost.Asio currently does, it merely provides a convenience layer
atop the underlying functionality.


Matt


arvid
2013-05-08 17:05:04 UTC
Post by adrien courdavault
[...]
* do you think I'm going in the right direction by seeing this as a
Boost.ASIO extension?
In my mind it makes a lot of sense to build this on top of boost.asio's
message queue. This might even create some incentive to make sure the
queue stays generic enough to not just be useful for networking.
It's not obvious to me that it needs to be part of boost.asio itself,
though. If anything, maybe the generic facilities of boost.asio should
be moved out into their own library.
Post by adrien courdavault
* do you have suggestions?
In my experience with using boost.asio for networking, having an
interface where you pass in a function object which is then stored
internally and called repeatedly is error prone. You may notice that
boost.asio always makes you pass in your function object for each new
asynchronous operation.

The reason for this is that a common pattern is to have your function
object contain a smart pointer owning some object, typically the object
that the socket (or, in this case, the audio_port) belongs to. If the
socket or audio_port stores the function object, you have an ownership
cycle, and tearing it down becomes significantly more complicated.

You may want to consider an API where you instead push buffers, with a
completion callback associated with each push.
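Roughly like this (just a sketch; the audio_port type is invented, the
point is only the shape of the call):

    #include <boost/asio.hpp>
    #include <memory>
    #include <vector>

    // Hypothetical port: each push takes its own completion handler,
    // exactly like async_write does for sockets, so the port never has
    // to store a long-lived callback.
    class audio_port {
    public:
        template <class ConstBufferSequence, class Handler>
        void async_push(const ConstBufferSequence& buffers, Handler handler);
    };

    struct session : std::enable_shared_from_this<session> {
        audio_port port;
        std::vector<float> block;

        void push_next() {
            auto self = shared_from_this();   // the handler owns the session;
                                              // the port never stores it, so
                                              // there is no ownership cycle
            port.async_push(boost::asio::buffer(block),
                [self](const boost::system::error_code& ec) {
                    if (!ec)
                        self->push_next();    // refill 'block' and push again
                });
        }
    };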
--
Arvid Norberg