Asciinema: upload failed: <urlopen error [Errno 32] Broken pipe>

Hello,

First, 1001 thanks for this really nice piece of software and web service, it’s great :slight_smile:

… but I’m having some trouble uploading content.
I messed things up by creating 2 accounts on asciinema.org, stopped an upload because I thought it was not working, and then revoked the tokens to see if that would help — but I still can’t upload any cast to the website.

I tried to follow the advice in Why `upload failed: Invalid or revoked install ID`?

but when I try to upload a recording, I still get

asciinema upload aquicktest
asciinema: upload failed: <urlopen error [Errno 32] Broken pipe>
asciinema: retry later by running: asciinema upload aquicktest

I’m using asciinema 2.0.2 installed with pip on Debian 9

best++
b


Can you try running pip and see if upgrading fixes things?

pip install asciinema --upgrade

or similar, depending on your setup. Once done, try recording a new terminal session and go with the flow, uploading when asked to. See if that helps.

Also, make sure the recorder token in the config file is valid (i.e. not revoked).

You can verify the tokens for your account in the user section of the asciinema website.

Locally, your recorder token should be in the following file:

~/.config/asciinema/install-id

Make sure it is added to your account and not revoked
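In case the local ID does turn out to be revoked, here’s a minimal sketch of the reset, done in a throwaway HOME directory so nothing real is touched (the path is the default for asciinema 2.x; the all-zero ID below is a made-up example):

```shell
# Illustrative only: use a throwaway HOME so no real config is modified
export HOME="$(mktemp -d)"
mkdir -p "$HOME/.config/asciinema"

# Pretend this is an install ID that was revoked on asciinema.org
echo "00000000-0000-0000-0000-000000000000" > "$HOME/.config/asciinema/install-id"
cat "$HOME/.config/asciinema/install-id"

# Deleting the file makes asciinema generate a fresh ID on the next run;
# `asciinema auth` then prints a URL to link the new ID to your account
rm "$HOME/.config/asciinema/install-id"
```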

Here’s a cast showing how one would go about resetting the recorder token:

[embedded asciicast]


The broken pipe error is likely related to this: https://github.com/asciinema/asciinema/issues/335

hello,

OK, my asciicast was 88 MB (very spectacular :slight_smile:), so that’s why.

thanks
b

@ku1ik Considering the cast was this big, and assuming that compressing it with gzip would likely bring it down to a more than acceptable size for storage: can I ask how the platform actually stores and serves the casts? Would it be possible — or even something you think could live up to expectations — to have the app pre-compress the cast before upload (or on the fly, whatever works best)? The casts could then be stored gzipped (web-delivery ready on most modern browsers, I think) and served as-is, with decompression before delivery only for outdated clients.
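To illustrate how well the format compresses, here’s a quick sketch with a synthetic asciicast-like file standing in for a real cast — the 5000 identical event lines are an artificial best case, so real ratios will be less dramatic:

```shell
# Build a synthetic asciicast-like file: a JSON header line followed by
# repeated newline-delimited JSON event lines (highly compressible text)
echo '{"version":2,"width":80,"height":24}' > cast.txt
for i in $(seq 5000); do
  printf '%s\n' '[0.01,"o","hello world\r\n"]'
done >> cast.txt

gzip -c cast.txt > cast.txt.gz   # compress, keeping the original
wc -c cast.txt cast.txt.gz       # the gzipped copy is a small fraction
```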

You could then make it clear that, unless featured, and considering the size, a cast would be kept only for X (I’m thinking 3 for registered accounts, and the normal — 1, I believe — for anonymous/public uploads).

Then, and only maybe, you could specify that within an acceptable range of sizes the service remains free as it is, but accounts that need to deliver or keep bigger casts (or for longer time ranges) could acquire some kind of paid subscription — even a small one, enough to cover hosting/processing costs.

It could open the platform to additional revenue, even as a non-profit: paying for actual hosting costs and then returning any extras to the community via ‘INSERT GENIAL AND GROUNDBREAKING IDEA HERE’… or simply becoming a for-profit company (no idea of the current status)…

Then more specialized services could follow, like in-house gif/webp/apng/avi/mp4 encoding… you know, a black hole’s the limit from there…

I already have some ideas of how I could help… Let me know where this message would be most relevant, if you think it is. Thanks for reading.


@ku1ik Hi, please post this in the FAQ:

Q: What’s the upload limit on asciinema.org?
A: 5 MiB currently.

Answering the question about the storage on asciinema.org:

The recordings are stored in an S3 bucket; there’s also a local cache directory on disk which nginx uses to keep the number of requests to S3 to a minimum (to keep costs low).

The recordings are not gzipped at rest - they’re saved on S3 as is. This is fine because a) asciicasts are small in general (compared to video files) and b) the local nginx cache limits the S3 requests as described above.

Then, the web server serves the recordings compressed on the fly to browsers (when Accept-Encoding: gzip is present in the request headers), so the server-to-client transfer is small and fast.
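For context, a cache-in-front-of-S3 setup like the one described could look roughly like this in nginx — a sketch only, with every name, path, and value invented here, not the actual asciinema.org configuration:

```nginx
# http-level: a 10 GB disk cache for responses fetched from S3
proxy_cache_path /var/cache/nginx/casts keys_zone=casts:10m max_size=10g;

server {
    location /casts/ {
        proxy_pass        https://example-bucket.s3.amazonaws.com/;
        proxy_cache       casts;
        proxy_cache_valid 200 7d;     # serve cached copies, keeping S3 requests rare

        gzip       on;                # compress per-response, on the fly, when the
        gzip_types application/json;  # client sends Accept-Encoding: gzip
    }
}
```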

I believe I could raise the upload limit significantly without increasing the hosting costs much, however…

Another reason for the limit is:

The web player needs to fetch the whole recording file and parse it in order to play it. Even if it is served gzipped to the browser, after parsing it sits in memory. I don’t have exact numbers, but let’s say a 1 MB recording file, parsed into the player’s data structures, uses 10 MB of RAM. If we let users upload 50 MB files (or 5 MB gzipped files which decompress to 50 MB), then a recording loaded in the player would use 500 MB of RAM. Now, if you embed such a recording on your site, every visitor is affected by that memory usage.
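The arithmetic, spelled out with the post’s illustrative 10x parse-overhead figure (an assumption for the sake of the example, not a measurement):

```shell
overhead=10   # assumed file-size -> in-memory expansion factor from the post
for size_mb in 1 50; do
  echo "${size_mb} MB file -> ~$((size_mb * overhead)) MB in the player"
done
```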

So I put the low limit also to prevent situations like above.

One possible solution would be to stream the recording in chunks to the player, which is possible with the new v3 player (via the WebSocket or EventSource driver); this, however, disables the pause and seek features, as these drivers perform playback in a live-broadcast fashion.

One more thing about the storage - see my comment here about BYOS (bring your own storage) idea: Workaround the upload limit? - #5 by sickill