How to stream and buffer a video in React using the OSDK

Hi there! I am building out a React application in Foundry using the OSDK. I have an object type in my ontology called “[REDACTED] Media” with a backing dataset containing the media references of some videos I would like to display in my application.

I’ve figured out how to display a video, but the user has to wait for the entire video to load before they can start watching. With file sizes as high as 800 MB, this is a problem. To improve the UX, I’d like to instead stream chunks of the video into a buffer so the user can begin watching immediately while the rest loads in the background.

The solution seems to require the use of the MediaSource API. I’ve attempted a few different solutions but reached a dead end every time. I could really use someone’s expertise here.

My code currently looks like this:

const handleLoadVideo = () => {
    const videoElement = document.getElementById("test-vid") as HTMLVideoElement;
    const mediaSource = new MediaSource();
    const url = URL.createObjectURL(mediaSource);
    videoElement.src = url;

    mediaSource.addEventListener("sourceopen", async () => {
      const mime = 'video/mp4; codecs="avc1.42E01E"';

      if (!MediaSource.isTypeSupported(mime)) {
        console.error("Video.tsx: Unsupported MIME type/codec: ", mime);
        mediaSource.endOfStream("decode");
        return;
      }

      const sourceBuffer = mediaSource.addSourceBuffer(mime);

      try {
        const result = await client(PrasaMediaObject).fetchOneWithErrors(currentMediaItemRID!);

        if (result.error) {
          console.error("Video.tsx: Fetch error: ", result.error);
          mediaSource.endOfStream("network");
          return;
        }

        const videoContents = await result.value.mediaReference?.fetchContents();

        if (!videoContents?.body) {
          console.error("Video.tsx: No body in mediaReference");
          mediaSource.endOfStream("network");
          return;
        }

        const reader = videoContents.body.getReader();
        const appendChunk = async (chunk: Uint8Array) => {
          return new Promise<void>((resolve, reject) => {
            // Remove both listeners once either event fires, so stale
            // handlers from a previous append can't leak across chunks.
            const cleanup = () => {
              sourceBuffer.removeEventListener("updateend", onUpdateEnd);
              sourceBuffer.removeEventListener("error", onError);
            };
            const onUpdateEnd = () => {
              cleanup();
              resolve();
            };
            const onError = (e: Event) => {
              cleanup();
              reject(e);
            };
            sourceBuffer.addEventListener("updateend", onUpdateEnd);
            sourceBuffer.addEventListener("error", onError);

            try {
              sourceBuffer.appendBuffer(chunk);
            } catch (err) {
              cleanup();
              reject(err);
            }
          });
        };

        while (true) {
          const { done, value } = await reader.read();

          if (done) break;

          if (sourceBuffer.updating) {
            await new Promise(r => sourceBuffer.addEventListener("updateend", r, { once: true }));
          }
          await appendChunk(value);
        }
        mediaSource.endOfStream();
      } catch (e) {
        console.error("Video.tsx: Append/fetch error: ", e);
        try { mediaSource.endOfStream("decode"); } catch {}
      }
    });
  };

The <video> element looks like this:

(screenshot of the <video> markup)

In the console, I see that the byte stream is coming in fine right up until an error occurs:

(screenshot of the console output)

…but I’m unsure what the error is. I don’t see any descriptive error message, and I’m not sure how to decipher it otherwise.
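For anyone else debugging this: the bare Event that SourceBuffer’s "error" listener receives carries no details, but the <video> element’s own `error` property (a standard MediaError with `code` and `message` fields) usually does. A sketch of how that could be surfaced — `describeMediaError` is a hypothetical helper of mine, not an OSDK API:

```typescript
// Sketch only: map the standard HTMLMediaElement MediaError codes to
// readable names. describeMediaError is my own helper, not an OSDK API.
function describeMediaError(code: number): string {
  switch (code) {
    case 1: return "MEDIA_ERR_ABORTED: fetching was aborted by the user agent";
    case 2: return "MEDIA_ERR_NETWORK: a network error interrupted the fetch";
    case 3: return "MEDIA_ERR_DECODE: the byte stream could not be decoded";
    case 4: return "MEDIA_ERR_SRC_NOT_SUPPORTED: the source format is not supported";
    default: return `unknown MediaError code ${code}`;
  }
}

// Usage in the component (videoElement is the same element handleLoadVideo grabs):
// videoElement.addEventListener("error", () => {
//   const err = videoElement.error; // MediaError | null
//   if (err) {
//     console.error("Video.tsx:", describeMediaError(err.code), err.message);
//   }
// });
```

A decode error (code 3) right after the first append would point at the container format rather than the network path.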

Has anyone implemented a similar feature before? If so, how did you handle feeding the byte stream into your <video> element? Has someone perhaps found a solution that doesn’t use MediaSource (if that’s even possible)? Happy to provide more code/context if required!

I would really appreciate some guidance 🙂 Thank you!

Hey! I’m looking into whether this error is on our end and, if so, why. In the meantime, I believe a lot of the complexity here stems from trying to use MediaSource natively with MP4 – MediaSource’s "video/mp4" byte stream format only accepts fragmented MP4 (fMP4), not a plain progressive MP4, which makes what you’re describing (streaming chunks into a buffer for immediate playback) hard to build directly.
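For context on why plain MP4s are painful here: a fragmented MP4 (which MediaSource’s "video/mp4" byte stream format expects) contains top-level `moof` boxes, while a progressive MP4 does not. A quick way to check what you’re actually serving is to scan the first chunk’s top-level boxes — this is a sketch of my own, not an OSDK API, and it assumes a `moof` box shows up within the bytes you pass in:

```typescript
// Sketch only: scan the top-level MP4 boxes in a chunk for a given box type.
// A fragmented MP4 (what MediaSource's "video/mp4" format expects) contains
// "moof" boxes; a plain progressive MP4 does not.
function hasTopLevelBox(bytes: Uint8Array, boxType: string): boolean {
  const view = new DataView(bytes.buffer, bytes.byteOffset, bytes.byteLength);
  let offset = 0;
  while (offset + 8 <= bytes.byteLength) {
    let size = view.getUint32(offset); // 32-bit big-endian box size
    const type = String.fromCharCode(
      bytes[offset + 4], bytes[offset + 5], bytes[offset + 6], bytes[offset + 7],
    );
    if (type === boxType) return true;
    if (size === 1) {
      // A size of 1 means a 64-bit "largesize" follows the type field.
      if (offset + 16 > bytes.byteLength) break;
      size = Number(view.getBigUint64(offset + 8));
    } else if (size === 0) {
      break; // box extends to the end of the file
    }
    if (size < 8) break; // malformed box; avoid an infinite loop
    offset += size;
  }
  return false;
}

function looksFragmented(firstChunk: Uint8Array): boolean {
  return hasTopLevelBox(firstChunk, "moof");
}
```

If `looksFragmented` returns false for the first chunk, the file needs remuxing into fMP4 before any MediaSource-based approach will accept it, no matter how the append loop is written.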

There are many third-party libraries that play nicer with MP4s. This isn’t an endorsement of any of them, as I haven’t tested them with the OSDK, but I’d recommend looking into mp4box.js, mux.js, or MediaBunny. I’m sure there are other great ones as well, but those should make what you’re trying to do easier!