
Web #103

Merged
merged 34 commits into main from web
Jul 15, 2024

Conversation

alnitak
Owner

@alnitak alnitak commented Jul 8, 2024

Description

The web platform is now supported.

In the web directory, there is a compile_wasm.sh script that generates the .js and .wasm files for the native C code located in the src dir. Run it after installing emscripten. There is also a compile_web.sh to compile the web worker needed by the native code to communicate with Dart. The generated files are already provided, but if you modify the C/C++ code or web/worker.dart, the scripts must be rerun to reflect the changes.

The compile_wasm.sh script uses the -O3 code optimization flag.

To get better error logs, use -O0 -g -s ASSERTIONS=1 in compile_wasm.sh and rerun the script.

To add the plugin to a web app, add the following line to the <body> section of web/index.html:
<script src="assets/packages/flutter_soloud/web/libflutter_soloud_plugin.js" defer></script>

The problems

The AudioIsolate was causing many problems related to web workers when trying to port to the web. It was used to monitor all the sound states and to send some operations to the native code. These operations did not work inside a JS Worker because web audio is not supported there.

The AudioIsolate has been removed and all the logic has been implemented natively. Events like "voice ended" are sent from C back to Dart. However, since it is not possible to call Dart from a native thread (the audio thread), a new web worker is created using the WASM EM_ASM directive. This allows sending the "audio finished" event back to Dart via the worker. Here is an example, just to explain the approach and gather suggestions/critiques:

// On C side define an inline function to create the Web Worker and another
// to send messages to the listening worker in Dart.

void createWorkerInWasm()
{
	EM_ASM({
		...
		// Create a new Worker
		var workerUri = "assets/packages/flutter_soloud/web/worker.dart.js";
		Module.wasmWorker = new Worker(workerUri);
	});
}

void sendToWorker(const char *message, int value)
{
	EM_ASM({
			// Send the message
			Module.wasmWorker.postMessage(JSON.stringify({
				"message" : UTF8ToString($0),
				"value" : $1
			}));
		}, message, value);
}
// On Dart side listen to the messages.
@JS('Module.wasmWorker')
external web.Worker wasmWorker;

void setDartEventCallbacks() {
	final web.Worker worker = wasmWorker;
	worker.onmessage = ((web.MessageEvent event) {
		_outputController?.add(event.data.dartify());
	}).toJS;
}

The same problem occurs with dart:ffi: it is likewise not possible to call Dart directly from a native thread (the audio thread) without going through a send port. Here, NativeCallable helps. With NativeCallable, it's not necessary to bring sendPort and receivePort into the native code; those are part of ffi and thus not compatible with the web.

Notes

Acquiring audio data is (was?) experimental; the following methods are now deprecated, and this functionality has moved to the AudioData class.

  • @experimental SoLoud.getAudioTexture2D()
  • @experimental SoLoudCapture.getCaptureAudioTexture2D()

It is not possible to read a local audio file directly on the web. For this reason, loadMem() has been added, which takes the Uint8List byte buffer of the audio file.
IMPORTANT: loadMem() with mode LoadMode.memory, when used on the web platform, will freeze the UI for the time needed to decompress the audio file. Please use it with mode LoadMode.disk or load your sounds when the app starts.

In addition to acquiring data as a 2D texture (getAudioTexture2D), with the AudioData class it is now possible to acquire audio as linear data, which represents the FFT + wave vector, or just the wave data vector for better performance. With this class, it is also possible to choose whether to acquire data from the player or from the microphone.

loadUrl() produces the following error when the app is run:

Cross-Origin Request Blocked: The Same Origin Policy disallows reading the remote resource at https://www.learningcontainer.com/wp-content/uploads/2020/02/Kalimba.mp3. (Reason: CORS header ‘Access-Control-Allow-Origin’ missing). Status code: 200.

This is due to the default behavior of HTTP servers, which don't allow requests from outside their domain. Refer here to learn how to enable your server to handle this situation.
If instead you run the app locally, you can use something like the following command:
flutter run -d chrome --web-renderer canvaskit --web-browser-flag '--disable-web-security' -t lib/main.dart --release

test/test.dart has been changed and a GUI has been added to simplify the logs. It was using dart:io, which is not compatible with the web; also, stdout()/stderr were not outputting logs to the Android console.

Type of Change

  • ✨ New feature (non-breaking change which adds functionality)
  • 🛠️ Bug fix (non-breaking change which fixes an issue)
  • ❌ Breaking change (fix or feature that would cause existing functionality to change)
  • 🧹 Code refactor
  • ✅ Build configuration change
  • 📝 Documentation
  • 🗑️ Chore

SoLoudController().captureFFI.getCaptureAudioTexture;

void allocSamples() {
_samplesPtr = wasmMalloc(512 * 256 * 4);
Owner Author

Here I allocate the maximum number of bytes needed by *texture2D*() and use that memory for all the kinds of audio samples we want to acquire. I found it inconvenient to allocate the memory depending on the type we want to acquire (i.e. getAudioTexture() just needs 256*4 bytes).

Collaborator

I think this should be commented in the code. If I saw 512 * 256 * 4 without any other context, I wouldn't know what this means, and would be afraid to touch it.

Also, as an aside, I suggest we find ways to remove things from the package's API surface. There's a lot of work here that you need to do just in order to maintain AudioData, and I don't think it's a feature that many people use. Of course, as soon as you have a feature, some people will start using it, and then it's really hard to remove it because you don't want to disappoint / anger folks.

Owner Author

I would like to think that this plugin is not only useful for game developers but can also be used, for example, for a music player. In 2019, when I released my music app, my greatest enjoyment was providing users with an audio visualizer in OpenGL!
IMHO, giving developers a way to capture audio data could result in something niche but also something unique. In your game, for example, you could make the vertices of the widget at the bottom left move based on system sounds (shots, movements, etc.).
Not much more than what is already implemented can be added, and as in the case of waveforms, we could leave this feature's state frozen.

filip

Collaborator

No yeah, I get it. I'm just comparing this with the massive use case of "simply" playing audio. If you're fine with supporting this, of course I'm fine with it, too. Just please tell us long in advance if you ever feel like you're starting to burn out on maintaining this project. It's "only" been 1 year. There are many years to come. :)

@@ -0,0 +1,438 @@
// ignore_for_file: public_member_api_docs

import 'dart:js_interop';
Owner Author

Here I'm not sure whether I chose the right solution for interop with JS. All the functions compiled to WASM are exposed globally. I think there is a way to put them inside an extension type, but I didn't find a solution.

@@ -57,3 +58,8 @@ flutter:
ffiPlugin: true
windows:
ffiPlugin: true

assets:
Owner Author

Here I am not sure what to do. These assets are only needed on the web. Is there a way to not include them on other platforms?

Collaborator

I'm not sure, but I think for that we would need a federated plugin (https://docs.flutter.dev/packages-and-plugins/developing-packages#federated-plugins). That's quite a lot of additional work, though.

(There are open issues on conditional bundling of assets flutter/flutter#65065 and platform specific assets flutter/flutter#8230. I suggest we put a big 👍 on those issues.)

For now I think it's okay to leave everything here? We can tally the size of the additional files and ask people: is web support worth the additional X megabytes on all other platforms, or do you want to wait for us to do it "properly" first?

Owner Author

These assets take 626 KB uncompressed. It's OK for me, too, to leave everything here. I'll just add a comment pointing to those 2 issues so we remember this.

Collaborator

Perfect. I feel that 626 KB is fine for most applications that ship audio, so I don't think it's a high priority.

It could be one of those "help needed" issues that others take on themselves if they feel like it.

@alnitak alnitak marked this pull request as ready for review July 8, 2024 14:22
@alnitak alnitak requested review from filiph July 8, 2024 14:23
@filiph
Collaborator

filiph commented Jul 8, 2024

Thanks, @alnitak, this is excellent! I'm going to have a look this evening or tomorrow. I love the deep context that you've provided.

I also asked the community (on Mastodon and Twitter) to have a look because, well, the more the merrier in this case. I could easily overlook something, and I'm also a complete beginner when it comes to WASM or emscripten.

@filiph
Collaborator

filiph commented Jul 9, 2024

IMPORTANT: loadMem() with mode LoadMode.memory, when used on the web platform, will freeze the UI for the time needed to decompress the audio file. Please use it with mode LoadMode.disk or load your sound when the app starts.

This is clearly not something we'll address in this PR, but I wonder why this happens. I thought the audio decompression happens in C, and therefore on a non-UI thread, correct?

Collaborator

@filiph filiph left a comment


LGTM! Comments are minor.

Let's wait a few days in case there are others who'd like to chip in. But otherwise, let's merge and go from there.

Resolved review threads:
  • lib/src/soloud.dart
  • lib/src/soloud.dart (outdated)
  • lib/src/soloud.dart
  • lib/src/soloud.dart (outdated)
  • lib/src/soloud.dart (outdated)
  • wasm.sh (outdated)
  • web.sh (outdated)
  • web/compile_worker.bat (outdated)
  • web/compile_worker.sh (outdated)
  • web/worker.dart.js.deps (outdated)
@alnitak
Owner Author

alnitak commented Jul 9, 2024

IMPORTANT: loadMem() with mode LoadMode.memory, when used on the web platform, will freeze the UI for the time needed to decompress the audio file. Please use it with mode LoadMode.disk or load your sound when the app starts.

This is clearly not something we'll address in this PR, but I wonder why this happens. I thought the audio decompression happens in C, and therefore on a non-UI thread, correct?

When you call a C function from Dart, it is executed on the main Flutter UI thread; but if that function creates another thread to do the work, it doesn't affect the UI thread.
I think there are two problems here. The first is that the decoding is done on the main UI thread. The second is that the decoding also blocks the audio Worker (or Worklet)!
In fact, on the other platforms, even if you voluntarily block/freeze the UI (i.e. stop at a breakpoint), the sound still plays.

I need more time to address this and don't know if I'll be able to.

@alnitak alnitak merged commit 7f6c134 into main Jul 15, 2024
1 check passed
@alnitak alnitak deleted the web branch July 22, 2024 16:24