Streaming data in Oak #490

@Fractal-Tess

Description

Hey guys,

First of all, I'd like to thank you for creating and maintaining oak. For the time I've been using it, it has been nothing short of amazing.
This issue is aimed primarily at getting some answers about how Oak, Deno, streams, and readers/writers fit together.

Here is a sample of my code.

First, the client:

import { readableStreamFromReader } from '@deps'

const inputFile = await Deno.open('./testing/in.mkv')
const inputStream = readableStreamFromReader(inputFile)

const req = new Request('http://localhost:1234', {
  method: 'POST',
  body: inputStream
})

await fetch(req)

And second, the server:

import { Application, copy, readerFromStreamReader } from '@deps'

const app = new Application()

const file = await Deno.open('./out.mkv', {
  write: true,
  read: true,
  create: true
})

app.use(async ctx => {
  console.log('Started writing')
  // Get a web-stream reader for the request body, wrap it as a Deno.Reader,
  // and copy it into the output file.
  const r = ctx.request.body({ type: 'stream' }).value.getReader()
  const reader = readerFromStreamReader(r)
  await copy(reader, file)

  console.log('Done writing')
})

const serverListen = app.listen({ port: 1234 })
console.log('Running')
await serverListen

Using this code, I'm able to transfer files between the two processes without any problems, which was my initial goal. But then I looked at my RAM usage and was surprised, to say the least: this method eats through RAM without any remorse.
But why, though? Wasn't the whole point of streams to keep RAM usage low while waiting on disk I/O?
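For context, here is roughly how I've been watching memory from inside the server process. This is just my own sketch: `toMiB` and `logMemory` are helper names I made up, and only `Deno.memoryUsage()` is an actual Deno API (the guard makes the snippet a no-op outside Deno):

```typescript
// Hypothetical helper: bytes -> MiB, rounded to two decimals (my own naming).
function toMiB(bytes: number): number {
  return Math.round((bytes / (1024 * 1024)) * 100) / 100;
}

// One-shot memory log. Deno.memoryUsage() is a real Deno API; the guard
// lets the snippet run (doing nothing) in other runtimes as well.
function logMemory(label: string): void {
  const deno = (globalThis as any).Deno;
  if (deno?.memoryUsage) {
    const { rss, heapUsed } = deno.memoryUsage();
    console.log(`${label}: rss=${toMiB(rss)} MiB, heap=${toMiB(heapUsed)} MiB`);
  }
}

logMemory("before transfer");
```

Calling `logMemory` before and after `copy` is how I noticed the resident set size ballooning during the transfer.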

So I tried changing some things around and looked online for solutions, but this topic isn't well documented, and I didn't spend a vast amount of time hunting for answers since it was confusing to me anyway.

I tried a different setup - something like this:

import { Application, copy, readerFromStreamReader } from '../deps.ts'

const app = new Application()

const file = await Deno.open('./out.mkv', {
  write: true,
  read: true,
  create: true
})

app.use(async ctx => {
  console.log('Started writing')
  const reader = ctx.request.body({ type: 'reader' }).value
  await copy(reader, file)

  console.log('Done writing')
})

const serverListen = app.listen({ port: 1234 })
console.log('Running')
await serverListen

The only change is that I'm now doing:

  const reader = ctx.request.body({ type: 'reader' }).value
  await copy(reader, file)

Instead of:

  const reader = ctx.request.body({ type: 'stream' }).value.getReader()
  const denoReader = readerFromStreamReader(reader)
  await copy(denoReader, file)

And with this solution, my RAM is not being murdered 10 seconds into the transfer.
Why? What is different? Also, could you please give a brief explanation of the difference in body.ts, lines 320 to 347?
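For what it's worth, my current understanding (which may well be wrong) is that `pipeTo` on web streams propagates backpressure, so the producer only generates data as fast as the consumer drains it. A standalone illustration with in-memory web streams — no Oak or Deno APIs, and the chunk counts/sizes are made up:

```typescript
// A readable stream that produces `chunks` zero-filled chunks of `size` bytes.
// pull() is only invoked when the consumer wants more data (backpressure).
function makeSource(chunks: number, size: number): ReadableStream<Uint8Array> {
  let sent = 0;
  return new ReadableStream({
    pull(controller) {
      if (sent < chunks) {
        controller.enqueue(new Uint8Array(size));
        sent++;
      } else {
        controller.close();
      }
    },
  });
}

// Pipe the source into a counting sink; highWaterMark: 1 keeps at most one
// chunk queued, so the source is pulled only as the sink drains.
async function countBytes(source: ReadableStream<Uint8Array>): Promise<number> {
  let total = 0;
  await source.pipeTo(
    new WritableStream(
      {
        write(chunk) {
          total += chunk.byteLength;
        },
      },
      { highWaterMark: 1 },
    ),
  );
  return total;
}

const total = await countBytes(makeSource(4, 1024));
console.log(total); // 4096
```

If that understanding is right, I'd guess something like `await ctx.request.body({ type: 'stream' }).value.pipeTo(file.writable)` might also avoid the buffering, assuming the file handle exposes a writable stream in my Deno version — but that's exactly the kind of thing I'm hoping you can confirm.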

Side question: is it possible to speed this up? I'm running these tests between two LAN PCs on a 10 Gb network with NVMe cache, and the transfer speed is no more than about 100 MB/s.
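To sanity-check my own numbers: a 10 Gbit/s link tops out around 1.25 GB/s in theory, so 100 MB/s is under a tenth of that. The arithmetic I'm using (plain TypeScript, nothing Oak-specific; the function name is my own):

```typescript
// Throughput in MB/s given bytes transferred and elapsed milliseconds.
function throughputMBps(bytes: number, elapsedMs: number): number {
  return bytes / 1e6 / (elapsedMs / 1000);
}

// 10 Gbit/s link: theoretical ceiling in MB/s, ignoring protocol overhead.
const linkCeilingMBps = 10e9 / 8 / 1e6; // 1250

// e.g. a 1 GB file transferred in 10 s:
console.log(throughputMBps(1e9, 10_000)); // 100
console.log(linkCeilingMBps); // 1250
```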

Thank you very much!
