
Trigger Prod from Feature #70

Closed

wants to merge 253 commits into from
Conversation

@ks0m1c (Contributor) commented Jul 31, 2024

No description provided.

rtshkmr and others added 30 commits December 19, 2023 13:58
Just playing around with things...
This gets validated by just observing the html source code.

The next step would be to add in the image generation. For a first pass,
the image generation shall be done just-in-time; some caching will be
added thereafter as a v1.
Things added:
1. chapter, verse number
2. both text and transliteration
3. url dump
syncs commits in the prod branch with this feature branch

Co-authored-by: ks0m1c_dharma <sakiyamuni@sams.ara>
Co-authored-by: ks0m1c_dharma <johndoe@dharma.in>
Things done:
1. Added a LiveComponent which has a div. The div calls a JS hook.
2. The JS hook creates a new script tag which inserts itself as the
first script of the HTML document (see the sketch after this commit message).
  * This is the most probable source of error, because I think the
  callback functions are not being passed to the correct HTML element. I
  should explore this when I get back to this problem. The ideal case
  should be how it is shown in the iframe documentation that we see.
3. The CORS issues have been resolved by adding the CORS plug that we
saw. This CORS plug allows youtube.com/iframe_api to be a valid source.
This works for all browsers.
4. If I embed this script tag directly in the heex template, it works.
However, it won't work when I attempt to do it via a JS hook. There's a
high likelihood that it's just me being a noob at Elixir/LiveView, and
that's why I can't get it to work yet.

REF: https://developers.google.com/youtube/iframe_api_reference

Hunch(es):
1. I might have to just insert things into the correct script tag
properly. I had already tried inserting innerHTML, but it just got
parsed as a string. Maybe I should relook at that.
2. Kinda related to point 2 above.
Co-authored-by: ks0m1c_dharma <sakiyamuni@sams.ara>
Co-authored-by: ks0m1c_dharma <johndoe@dharma.in>
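For reference, a minimal sketch of the script-injection approach described above, following the iframe API reference. The hook name, element id, and `data-video-id` attribute are illustrative, not the repo's actual ones:

```js
// assets/js/hooks/youtube_player.js (illustrative path)
export const YouTubePlayer = {
  mounted() {
    // Create the iframe_api script tag and insert it as the first script
    // of the document, as shown in the iframe API docs.
    const tag = document.createElement("script");
    tag.src = "https://www.youtube.com/iframe_api";
    const firstScriptTag = document.getElementsByTagName("script")[0];
    firstScriptTag.parentNode.insertBefore(tag, firstScriptTag);

    // The API invokes this global once it has loaded; the player must be
    // constructed here, after the YT namespace exists.
    window.onYouTubeIframeAPIReady = () => {
      this.player = new YT.Player(this.el.id, {
        videoId: this.el.dataset.videoId,
        events: { onReady: () => console.log("player ready") },
      });
    };
  },
};
```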
Conflicts: (accepted both, HEAD for outdated areas)
assets/js/hooks/index.js
lib/vyasa_web/live/gita_live/show_verse.html.heex
mix.exs
mix.lock
As long as this reference exists, we will be able to invoke JS API calls
that the YouTube player exposes.
Currently, I've just supported "seekTo" and "loadVideoById" callbacks.
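A hedged sketch of how those two callbacks can be invoked on the retained player reference from inside the hook's `mounted()`; the server event names here are made up:

```js
// Made-up server event names; this.player is the reference kept above.
this.handleEvent("seekTo", ({ seconds }) => {
  // allowSeekAhead = true lets the player seek past currently buffered data
  this.player.seekTo(seconds, true);
});
this.handleEvent("loadVideoById", ({ videoId }) => {
  this.player.loadVideoById(videoId);
});
```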
Got a bunch of video stats that get spit out on hover of a button.
Built a custom tooltip here.

Key Takeaways:
1. added the import to the root template. UMD import done as described here [1], other imports won't work well.
2. followed their tutorial on how to set up a custom tooltip, used
show_verse to demonstrate it. Didn't create a generic template /
component for it, but I think this is good enough since it shows a
LiveView-integrated version of the floating-ui tutorial as seen here [2]
3. the middleware is just magic.

[1]: https://floating-ui.com/docs/getting-started#umd
[2]: https://floating-ui.com/docs/tutorial
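A minimal sketch of the tutorial's positioning logic [2] adapted to a LiveView hook, assuming the UMD import [1] exposes the `FloatingUIDOM` global; the hook name and `data-tooltip-id` attribute are illustrative:

```js
const { computePosition, offset, flip, shift } = window.FloatingUIDOM;

export const Tooltip = {
  mounted() {
    const button = this.el;
    const tooltip = document.getElementById(button.dataset.tooltipId);

    const update = () =>
      computePosition(button, tooltip, {
        placement: "top",
        // The "magic" middleware: nudge away from the button, flip to the
        // other side when out of room, shift to stay inside the viewport.
        middleware: [offset(6), flip(), shift({ padding: 5 })],
      }).then(({ x, y }) => {
        Object.assign(tooltip.style, { left: `${x}px`, top: `${y}px` });
      });

    button.addEventListener("mouseenter", () => {
      tooltip.style.display = "block";
      update();
    });
    button.addEventListener("mouseleave", () => {
      tooltip.style.display = "none";
    });
  },
};
```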
rtshkmr and others added 27 commits August 1, 2024 11:16
Init functions don't have action handlers yet
mainly consolidates the functions used for updating playback and
notifying the audio player
Separates the concerns b/w the MediaBridge (generic) and the
AudioPlayer (concrete)
Change list:

1. reorganise bridges into hooks/mediaEventBridges; within it there's a
new playbackMetaBridge. The intent is to communicate playback info over
this bridge and keep action events separate from playback-info events.

2. now, when the server does a push_event for the audio events
registration, we also send a separate event using this new bridge
("media_bridge:registerPlayback"), which dispatches the necessary
client-side events to register the playback, load the audio and the
mediaSession as early as possible. Previously, this happened JIT, when
the user pressed the play/pause button. This seems to have removed the
initial delay we used to have between the user clicking "play" and
actual playback starting.

3. naturally, we also prevent redundant re-loads of audio now by
guarding the source-setting for the html5 audio player (see the sketch
after this commit message).

Broad TODOs:
1. consider pushing to the new playbackMetaBridge at the same intervals
as the existing heartbeat.

2. this commit only added message-passing from the MediaBridgeHook to
the AudioPlayerHook, but not in the other direction.
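A minimal sketch of points 2 and 3 above, assuming it sits inside the AudioPlayer hook's `mounted()`; the payload shape and how the hook holds its `<audio>` element are assumptions:

```js
this.handleEvent("media_bridge:registerPlayback", ({ url, title }) => {
  const audio = this.el.querySelector("audio"); // however the hook holds it

  // Guard the source-setting so repeated registrations don't re-load:
  // compare the raw attribute, since audio.src reflects the resolved URL.
  if (audio.getAttribute("src") !== url) {
    audio.src = url;
    audio.load(); // start buffering before the user ever presses play
  }

  // Register mediaSession metadata as early as possible too:
  if ("mediaSession" in navigator) {
    navigator.mediaSession.metadata = new MediaMetadata({ title });
  }
});
```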
Playback state is init @ point of events registration, via a new bridge now.
Automagically works == pat on the back for good design?
Refactor the event-handling patterns


# Table of Contents

1.  [Refactor to separate concerns MediaBridge vs others](#org25f2ca5)
    1.  [Overview](#orged49cc5)
    2.  [Initial Context & Reason to refactor:](#org8692118)
    3.  [More Backlogged Tasks](#org0aa6c21)
        1.  [handshake comms failure -- init handshake @ playback time as a fallback](#orgf5a1bcc)
        2.  [MediaSession: handle phone lock screen, which appears to kill the websocket](#orgb0027c7)
        3.  [MediaSession: use the playbackMetaBridge to push metadata updates at the same interval as the existing heartbeats.](#org9f8c929)
        4.  [MediaSession: add event handlers for scrubbing / seekTo / fastSeek](#orgfd02646)
        5.  [MediaBridge --> MediaLibrary + voice fetching to happen @ MediaBridge.](#org4f37b9e)
    4.  [Some Implementation Notes:](#org48a16ee)


<a id="org25f2ca5"></a>

# Refactor to separate concerns MediaBridge vs others


<a id="orged49cc5"></a>

## Overview

This PR cleans up the current patterns for how events flow through the server-side live components and the event-bridge pattern that we've used. Additionally, the following things have been done:

1.  Improved ACK-ing of handshakes b/w the MediaBridge and the Written contexts. Duplicate ACKs are now ignored, thereby preventing redundant triggers of client-side hooks.
2.  Added a new client-side event bridge (`playbackMetaBridge`) which allows message passing just for playback metadata. This bridge is now used to load the audio at the earliest possible time; in particular, at the time the written context is loaded, which pubs to the MediaBridge ([ref commit](d627c53)).
3.  Fixed the AudioPlayer hook crashing ([ref commit](26a1ae9)):
    -   the actual mechanics of this weren't investigated deeply because it initially presented as flaky behaviour primarily affecting Chrome browsers.
    -   the current hypothesis is that passing props via `audio_player.ex` was not a good idea, since when the playback state gets updated, that component remounts and creates unnecessary chaos.
    -   this got fixed as a side-effect of registering playback preemptively.
4.  Added action handlers for play/pause events that are triggered externally via the MediaSession API (a sketch follows this list).
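A minimal sketch of item 4, assuming the handlers live in a client hook and push back to the server; the pushed event name and payload are assumptions:

```js
// External play/pause triggers (lock screen, headset buttons, etc.) are
// forwarded to the server, where they get handled like user events.
if ("mediaSession" in navigator) {
  navigator.mediaSession.setActionHandler("play", () =>
    this.pushEvent("play_pause", { origin: "MediaSession" }) // assumed name
  );
  navigator.mediaSession.setActionHandler("pause", () =>
    this.pushEvent("play_pause", { origin: "MediaSession" })
  );
}
```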


<a id="org8692118"></a>

## Initial Context & Reason to refactor:

We made the media bridge to coordinate the playback of mediums.

Its intent was to be a Bridge pattern, which separates the abstraction from the implementation and acts as the bridge to the implementation.

Let's standardise some terms -- for coordinating events, we can classify the types of events into:

1.  [USER EVENTS] events due to user interaction directly on the webapp
2.  [EXT EVENTS] events due to external interactions (e.g. mediaSessions API receives an event trigger from the user-agent)

In almost-CQRS fashion, we should separate event emission and consumption such that:

-   USER EVENTS flow in a single direction, from the server-side live component to the MediaBridge, followed by the MediaBridge hook, which then uses the EventBridges to dispatch messages to listeners on the event bridge.
    -   the MediaBridge hook is the broker for events on the client side. No one shall bypass the broker (e.g. by handling an event directly in one of the other hooks, although that's absolutely feasible since events on the client side can be captured by any DOM node, as they are dispatched at the scope of the document). See the sketch after this list.

-   EXT EVENTS flow in a single direction as well: they emit messages to the server side, upon which the event handling follows the same pattern as a USER EVENT, and the flow of events follows the same standardised direction.
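A minimal sketch of that broker flow under the above constraints; event names and how the hooks hold their elements are illustrative:

```js
// MediaBridge hook: the sole broker. It receives the server event and
// re-dispatches it at document scope for whoever is listening.
this.handleEvent("media_bridge:play_pause", (payload) => {
  document.dispatchEvent(
    new CustomEvent("media_bridge:play_pause", { detail: payload })
  );
});

// AudioPlayer hook: consumes from the bridge rather than handling the
// server event itself, so nothing bypasses the broker.
document.addEventListener("media_bridge:play_pause", ({ detail }) => {
  const audio = document.querySelector("audio"); // however the hook holds it
  detail.cmd === "play" ? audio.play() : audio.pause();
});
```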

By doing this refactor, the event handling will be cleaner.

-   currently we don't have that behaviour,
    -   we have some duplication of effort and weird flows, e.g.:
        -   `audio_player.ex` [sends](file:///Users/rtshkmr/Projects/vyasa/lib/vyasa_web/components/audio_player.ex) the playback info directly to the audio player hook so that it can read the playback info (e.g. in handlePlayableState)
        
        -   the playback state sent via the playPause bridge is the exact same source of info that gets read by [handleMediaPlayPause](file:///Users/rtshkmr/Projects/vyasa/assets/js/hooks/audio_player.js)
-   here's the proposed FIX:
    -   user actions should flow through the bridge first and then go to the other hooks


<a id="org0aa6c21"></a>

## More Backlogged Tasks


<a id="orgf5a1bcc"></a>

### [ ] handshake comms failure -- init handshake @ playback time as a fallback

Basically, what happens when the current handshake methodology fails? When should we re-init it?


<a id="orgb0027c7"></a>

### [ ] MediaSession: handle phone lock screen, which appears to kill the websocket

The current pattern is that play/pause events are captured by the AudioPlayer hook and then emitted to the MediaBridge server livecomponent, which handles them similarly to a user event. This breaks down when a user locks their phone and the websocket is dead / inactive in the background.

A pedestrian solution would be: at heartbeats, let there be a listener module which handles:

-   listening and log-stream creation
-   possibly: teardown of a hook -- save last known states to localStorage and re-init when the user's screen is unlocked again (see the sketch below). This could be part of an overall push for better network-robustness as well.
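A minimal sketch of that localStorage fallback; the key name, state shape, and how the audio element is obtained are assumptions:

```js
const KEY = "media_bridge:last_known_state"; // assumed key

function saveState(audio) {
  localStorage.setItem(
    KEY,
    JSON.stringify({
      src: audio.getAttribute("src"),
      position: audio.currentTime,
      paused: audio.paused,
    })
  );
}

function restoreState(audio) {
  const saved = JSON.parse(localStorage.getItem(KEY) || "null");
  if (!saved) return;
  if (audio.getAttribute("src") !== saved.src) audio.src = saved.src;
  audio.currentTime = saved.position;
}

// Save on teardown; restore when the screen is unlocked / tab is visible again.
document.addEventListener("visibilitychange", () => {
  const audio = document.querySelector("audio"); // however the hook holds it
  if (document.visibilityState === "hidden") saveState(audio);
  else restoreState(audio);
});
```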


<a id="org9f8c929"></a>

### [ ] MediaSession: use the playbackMetaBridge to push metadata updates at the same interval as the existing heartbeats.


<a id="orgfd02646"></a>

### [ ] MediaSession: add event handlers for scrubbing / seekTo / fastSeek

So that the MediaSession API, which can already be used to perform those operations on the playback, stays synced to the states that we are managing in our live components.


<a id="org4f37b9e"></a>

### [ ] MediaBridge --> MediaLibrary + voice fetching to happen @ MediaBridge.

This means that others may request to play voices by voice ID, but the DB transactions shall happen on the MediaLibrary side. This better aligns with the design seen in the LiveBeats repo as well.


<a id="org48a16ee"></a>

## Some Implementation Notes:

1.  the initSession() for the audio player hook won't work anymore; we want to rm the playback payload that is passed
2.  ACK:
    -   the audio player has some buffering to do. We want the player to send a broadcast to the others (esp. the MediaBridge hook) that it's ready to play.
        2 ways to do this:
        1.  [mediated by the server] audio player hook ==> audio player server live component ==> media bridge live component ==> media bridge hook
        2.  [no server mediation] audio player hook ==> media bridge hook
    -   my champion solution is (2). For now we don't need to support group singing yet, so a client-side broadcast of buffer-state should be sufficient.
3.  audio player ready-state event (see the sketch after this list):
    -   using the `canplaythrough` event emitted by the player for this. It fires when the player thinks that playback can happen from start to end without buffering. ([ref](https://developer.mozilla.org/en-US/docs/Web/API/HTMLMediaElement/canplaythrough_event))
4.  livebeats has a separate hook just for pings ([ref](file:///Users/rtshkmr/Projects/live_beats/assets/js/app.js)). This acts as the heartbeat. It seems to be used only to broadcast the active users that are currently tuned in ([ref broadcast fn](file:///Users/rtshkmr/Projects/live_beats/lib/live_beats_web/live/nav.ex)).
5.  [difference] livebeats gets its song via song id from the [medialibrary](file:///Users/rtshkmr/Projects/live_beats/lib/live_beats/media_library.ex) (our media bridge); in our case, we're getting it from the Medium, called by the [chapter::index.ex](file:///Users/rtshkmr/Projects/vyasa/lib/vyasa_web/live/source_live/chapter/index.ex)
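A minimal sketch combining notes 2 and 3 under the champion option (2), no server mediation; the broadcast event name is illustrative:

```js
// AudioPlayer hook side: broadcast buffer-readiness client-side once the
// player believes playback can run start-to-end without re-buffering.
const audio = document.querySelector("audio"); // however the hook holds it
audio.addEventListener("canplaythrough", () => {
  document.dispatchEvent(new CustomEvent("audio_player:ready")); // assumed name
});

// MediaBridge hook side: gate playback on that readiness broadcast.
document.addEventListener("audio_player:ready", () => {
  // safe to start / resume playback here
});
```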
@rtshkmr changed the title from "Testing Media Sessions on Prod" to "Trigger Prod from Feature" on Aug 10, 2024
@rtshkmr (Member) commented Aug 10, 2024

Closing this because it's no longer necessary; will open up another one to trigger prod from the hanuman feature branch.

@rtshkmr closed this on Aug 10, 2024