Web #103
Conversation
This reverts commit 218c800.
SoLoudController().captureFFI.getCaptureAudioTexture;

void allocSamples() {
  _samplesPtr = wasmMalloc(512 * 256 * 4);
}
Here I allocate the maximum number of bytes needed by `texture2D()` and use that memory for all the kinds of audio samples we want to acquire. I found it inconvenient to allocate the memory depending on the type we want to acquire (e.g. `getAudioTexture()` just needs 256 * 4 bytes).
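As a sketch of how this allocation could document itself, here is one possible shape. The constant names (and my reading of what 512 and 256 mean) are illustrative, not the plugin's actual code:

```dart
// Illustrative constants; the names are mine, not the plugin's.
const int maxRows = 512;        // worst-case rows used by getAudioTexture2D()
const int samplesPerRow = 256;  // floats per row
const int bytesPerFloat = 4;    // size of a 32-bit float

/// Allocate once for the worst case needed by texture2D (512 * 256 * 4
/// bytes) and reuse the buffer for smaller acquisitions as well, e.g.
/// getAudioTexture(), which only needs 256 * 4 bytes.
void allocSamples() {
  _samplesPtr = wasmMalloc(maxRows * samplesPerRow * bytesPerFloat);
}
```

Named constants would also address the review comment below about the bare `512 * 256 * 4` being opaque.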
I think this should be commented in the code. If I saw 512 * 256 * 4
without any other context, I wouldn't know what this means, and would be afraid to touch it.
Also, as an aside, I suggest we find ways to remove things from the package's API surface. There's a lot of work here that you need to do just in order to maintain `AudioData`, and I don't think it's a feature that many people use. Of course, as soon as you have a feature, some people will start using it, and then it's really hard to remove it because you don't want to disappoint / anger folks.
I would like to think that this plugin is not only useful for game developers but could also be used, for example, in a music player. In 2019, when I released my music app, my greatest enjoyment was providing users with an audio visualizer in OpenGL!
IMHO giving developers a way to capture audio data could result in something niche but also something unique. In your game, for example, you could think of making the vertices of the widget at the bottom left move based on system sounds (shots, movements, etc.).
Not much more can be added beyond what is already implemented, and as in the case of waveforms, we could leave the state of this feature frozen.
No yeah, I get it. I'm just comparing this with the massive use case of "simply" playing audio. If you're fine with supporting this, then of course I'm fine with it, too. Just please tell us long in advance if you ever feel you're starting to burn out on maintaining this project. It's "only" been 1 year. There are many years to come. :)
@@ -0,0 +1,438 @@

// ignore_for_file: public_member_api_docs

import 'dart:js_interop';
Here I don't know if I chose the right solution for interop with JS. All the functions compiled to WASM are exposed globally. I think there is a way to put them inside an extension type, but I didn't find a solution.
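For comparison, one possible shape using an extension type over the emscripten `Module` object. This is only a sketch: the type name, the member names, and the assumption that the exports live on a global `Module` object prefixed with `_` are mine, and I have not verified them against the generated JS:

```dart
import 'dart:js_interop';

// Wrap the global emscripten Module object in an extension type so the
// exported C functions are grouped in one place instead of being bound
// as top-level globals. All names here are illustrative.
extension type SoLoudWasm(JSObject _) implements JSObject {
  // Emscripten conventionally exposes C exports with a leading underscore.
  @JS('_malloc')
  external int malloc(int bytes);

  @JS('_free')
  external void free(int ptr);
}

@JS('Module')
external SoLoudWasm get soLoudWasm;

void example() {
  final ptr = soLoudWasm.malloc(256 * 4); // allocate on the WASM heap
  soLoudWasm.free(ptr);
}
```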
@@ -57,3 +58,8 @@ flutter:
      ffiPlugin: true
    windows:
      ffiPlugin: true

  assets:
Here I am not sure what to do. These assets are only needed on the web. Is there a way to not include them on the other platforms?
I'm not sure, but I think for that we would need a federated plugin (https://docs.flutter.dev/packages-and-plugins/developing-packages#federated-plugins). That's quite a lot of additional work, though.
(There are open issues on conditional bundling of assets flutter/flutter#65065 and platform specific assets flutter/flutter#8230. I suggest we put a big 👍 on those issues.)
For now I think it's okay to leave everything here? We can tally the size of the additional files and ask people: is web support worth the additional X megabytes on all other platforms, or do you want to wait for us to do it "properly" first?
These assets take 626 KB uncompressed. It's OK for me, too, to leave everything here. I'll just add a comment pointing at those 2 issues so we remember this.
Perfect. I feel that 626 KB is fine for most applications that ship audio, so I don't think it's a high priority.
It could be one of those "help needed" issues that others take on themselves if they feel like it.
Thanks, @alnitak, this is excellent! I'm going to have a look this evening or tomorrow. I love the deep context that you've provided. I also asked the community (on Mastodon and Twitter) to have a look because, well, the more the merrier in this case. I could easily overlook something, and I'm also a complete beginner when it comes to WASM or emscripten.
This is clearly not something we'll address in this PR, but I wonder why this happens. I thought the audio decompression happens in C, and therefore on a non-UI thread, correct?
LGTM! Comments are minor.
Let's wait a few days in case there are others who'd like to chip in. But otherwise, let's merge and go from there.
When you call a C function from Dart, it is executed in the main Flutter UI thread, but if that function creates another thread to do stuff, it doesn't affect the UI thread. I need more time to address this and don't know if I'll be able to.
Description
The web platform is now supported.
In the `web` directory there is a `compile_wasm.sh` script that generates the `.js` and `.wasm` files for the native C code located in the `src` dir. Run it after installing emscripten. There is also a `compile_web.sh` script to compile the web worker needed by the native code to communicate with Dart. The generated files are already provided, but if the C/C++ code or the `web/worker.dart` code is modified, the scripts must be run again to reflect the changes.

The `compile_wasm.sh` script uses the `-O3` code optimization flag. To get more detailed error logs, use `-O0 -g -s ASSERTIONS=1` in `compile_wasm.sh` and rerun the script.

To add the plugin to a web app, add the following line to the `<body>` section of `web/index.html`:

`<script src="assets/packages/flutter_soloud/web/libflutter_soloud_plugin.js" defer></script>`
The problems
The AudioIsolate was causing many problems related to web workers when trying to port to web. It was used to monitor all the sound states and to send some operations to the native code. These operations were not working inside a JS Worker because web audio is not supported.
The `AudioIsolate` has been removed and all the logic has been implemented natively. Events like `voice ended` are sent from C back to Dart. However, since it is not possible to call Dart from a native thread (the audio thread), a new web worker is created using the WASM `EM_ASM` directive. This allows sending the `audio finished` event back to Dart via the worker. Here is an example, just so I can explain it and receive suggestions/critiques:
The same problem happens using `dart:ffi`: it is also not possible to call Dart directly from a native thread (the audio thread) without using a send port. Here, `NativeCallable` helps. With `NativeCallable`, it's not necessary to wire `SendPort`/`ReceivePort` into the native code, which are part of `dart:ffi` and thus not compatible with the web.

Notes
Acquiring audio data is (was?) experimental; the following methods are now deprecated, and this functionality has moved to the `AudioData` class:

`@experimental SoLoud.getAudioTexture2D()`

`@experimental SoLoudCapture.getCaptureAudioTexture2D()`
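As a rough illustration of the migration, a usage sketch of the new class. Every identifier below (`GetSamplesKind`, `updateSamples`, `getLinear`, `SampleLinear`) is my guess at the shape of the API and should be checked against the actual `AudioData` class:

```dart
// Hypothetical usage sketch; names are illustrative, not the real API.
final audioData = AudioData(GetSamplesKind.linear); // FFT + wave vector

void onTick() {
  audioData.updateSamples();                // refresh the sample buffer
  final v = audioData.getLinear(SampleLinear(10)); // read one sample
}

// Release the native buffer when no longer needed:
// audioData.dispose();
```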
It is not possible to read a local audio file directly on the web. For this reason, `loadMem()` has been added, which takes the `Uint8List` byte buffer of the audio file.

IMPORTANT: on the web platform, `loadMem()` with mode `LoadMode.memory` will freeze the UI for the time needed to decompress the audio file. Please use it with mode `LoadMode.disk`, or load your sound when the app starts.

In addition to acquiring data as a 2D texture (`getAudioTexture2D`), with the `AudioData` class it is now possible to acquire audio as `linear`, which represents the FFT + wave vector, or as just the `wave` data vector for better performance. With this class, it is also possible to choose whether to acquire data from the player or from the microphone.

`loadUrl()`
produces an error when the app is run. This is due to the default behavior of HTTP servers, which don't allow requests from outside their domain (CORS). Refer here to learn how to enable your server to handle this situation.

If you run the app locally instead, you can launch it with something like the following command:
flutter run -d chrome --web-renderer canvaskit --web-browser-flag '--disable-web-security' -t lib/main.dart --release
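To tie the loading notes above together, a sketch of playing a sound from a byte buffer with `loadMem()`. The exact signature, the `mode` parameter, and the `SoLoud.instance` accessor are my assumptions and should be checked against the package's API:

```dart
import 'package:flutter/services.dart' show rootBundle;

Future<void> playFromBytes() async {
  // On the web a local file can't be read directly, so obtain the bytes
  // some other way, e.g. from a bundled asset or an HTTP fetch.
  final data = await rootBundle.load('assets/sound.mp3');
  final bytes = data.buffer.asUint8List();

  // Hypothetical call shape. LoadMode.disk avoids freezing the web UI
  // while the audio is decompressed.
  final source = await SoLoud.instance.loadMem(
    'sound.mp3',
    bytes,
    mode: LoadMode.disk,
  );
  await SoLoud.instance.play(source);
}
```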
test/test.dart has been changed and a GUI has been added to simplify the logs. It was using `dart:io`, which is not compatible with the web; also, `stdout`/`stderr` were not outputting logs to the Android console.

Type of Change