# Media (General)
## Audio
### Split file with cue sheet and use filename from cue sheet
:::shell
shnsplit -D -f file.cue -t "%n %t" -o "flac flac -8 -o %f -" file.flac
To directly encode to Opus:
:::shell
shnsplit -D -f file.cue -t "%n %t" -o "cust ext=opus opusenc --bitrate=160k - %f" file.flac
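To preview the track breakpoints before splitting, `cuebreakpoints` from
cuetools can be used (assuming it is installed):
:::shell
cuebreakpoints file.cue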
### Remove all tags except MusicBrainz tags from flac files
:::shell
for i in *.flac; do tags=$(metaflac --export-tags-to=- $i | grep -E '^MUSICBRAINZ_'); metaflac --remove-all-tags $i; metaflac --import-tags-from=- $i <<< $tags; done
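To verify that only the MusicBrainz tags remain, dump the tags again (same
`metaflac` invocation as above):
:::shell
metaflac --export-tags-to=- file.flac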
### Downmix 5.1/7.1 to 2.0
:::shell
mpv --oac=flac --audio-channels=stereo --oacopts=compression_level=0 --o=outfile.flac infile.flac
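To confirm the channel layout of the input first, a quick `ffprobe` check can
be used (a sketch, in the same style as the duration query further below):
:::shell
ffprobe -v error -select_streams a:0 -show_entries stream=channels,channel_layout -of default=noprint_wrappers=1 infile.flac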
### Record pulseaudio device to flac
:::shell
parec [ -d DEVICE ] | flac --endian=little --channels=2 --bps=16 --sample-rate=48000 --sign=signed -o foo.flac -
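To find the `DEVICE` name, list the available PulseAudio sources (assuming
`pactl` is installed):
:::shell
pactl list short sources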
### Copy Matroska Chapters to Opus
Requires the metadata to fit in one Ogg page (so no cover art :().
:::shell
mkvextract file.mka chapters | xq -r 'def pad: tostring | (3 - length) as $l | ("0" * $l)[:$l] + .; [[.Chapters.EditionEntry.ChapterAtom[] | {start: .ChapterTimeStart, name: .ChapterDisplay.ChapterString}] | to_entries[] | "CHAPTER\(.key|pad)=\(.value.start)\nCHAPTER\(.key|pad)NAME=\(.value.name)"] | join("\n")' | opustags -i -S file.opus
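To check that the chapters were written, running `opustags` without any edit
options prints the current comments:
:::shell
opustags file.opus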
## Video
### Copy a DVD title stream to a file [with the DVD optionally already copied to a local directory]
:::shell
mpv --stream-dump=1.mkv dvd://1 [--dvd-device path/to/dvd]
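To see which title numbers exist on the disc, `lsdvd` can be used (assuming it
is installed; pass the directory if the DVD was copied locally):
:::shell
lsdvd [path/to/dvd]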
## Images
### Apply EXIF rotation to JPEG images
Requires `jhead` and `jpegtran`.
:::shell
nix shell nixpkgs#jhead nixpkgs#libjpeg.bin
jhead -ft -autorot *.JPG
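To check which images carry an orientation flag before or after running,
`exiftool` (already used further below) can list it:
:::shell
exiftool -Orientation *.JPG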
## MKV
### Fix mimetype of font attachments
Some Matroska files have the MIME type of font attachments set to
`application/octet-stream` instead of a proper font MIME type.
:::shell
mkvpropedit --attachment-mime-type font/sfnt --update-attachment mime-type:application/octet-stream file.mkv
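To list the attachments that currently have the wrong MIME type first (a
sketch, assuming `mkvmerge` and `jq` are available):
:::shell
mkvmerge -J file.mkv | jq -r '.attachments[] | select(.content_type == "application/octet-stream") | "\(.id) \(.file_name)"'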
## FFmpeg
### Create color timeline image from video
:::shell
(infile=in.mkv; outfile=out.png; rows=320; width=1920; height=1080; ffmpeg -i $infile -vf tblend=all_mode=average,fps=${rows}/$(ffprobe -v error -show_entries format=duration -of default=noprint_wrappers=1:nokey=1 $infile),scale=1:1,scale=${width}/${rows}:${height},setsar=1,tile=${rows}x1 -frames:v 1 $outfile)
### Show duration of file in seconds
:::shell
ffprobe -v error -show_entries format=duration -of default=noprint_wrappers=1:nokey=1 file.mkv
### Remove EIA-608 subtitles from video bitstream
([source](https://stackoverflow.com/a/51439554))
:::shell
ffmpeg -i infile.mkv -c copy -bsf:v "filter_units=remove_types=6" outfile.mkv
## QP file
Generate a QP file from Matroska chapters, forcing I-frames at chapter starts when encoding. Replace `24/1.001` with the video's framerate.
:::shell
ffprobe -i infile.mkv -print_format json -show_chapters -loglevel error | jq -r '.chapters[].start / 1000000000 * 24/1.001 | round | tostring + " I"' >> foo.qp
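The resulting QP file can then be passed to the encoder, e.g. to x264 via
`--qpfile` (a sketch; assumes an x264 build with lavf input support):
:::shell
x264 --qpfile foo.qp --preset slow --crf 18 -o outfile.mkv infile.mkv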
## Manga
### Convert greyscale images to actual greyscale
For some reason, many releases encode greyscale manga pages as yuv420. Sadly,
the chroma planes are not completely empty but contain some (almost invisible)
noise. Converting the pages to actual greyscale lowers battery consumption,
saves a little file size, and simply seems like the right thing to do.
This only works when all pages have the same dimensions (an image2/ffmpeg
limitation), but releases suffering from this mostly fulfill that requirement.
**WARNING**: This uses a heuristic (SSIM ≥ 0.98 between the original and its
extracted luma plane) to decide whether a page is greyscale. This may not work
every time (it did for me, though); please verify that all converted images
actually are greyscale.
:::shell
ffmpeg -loglevel error -f lavfi -i "movie=%03d.jpg:f=image2,split=2[orig][in2];[in2]extractplanes=y[grey];[orig][grey]ssim=-" -f null - >> ssim
while read frame; do (( $(cut -d' ' -f5 <<< $frame | cut -c 7-8) < 98 )) || { file=$(printf "%03d.jpg\n" $(cut -d' ' -f1 <<< $frame | cut -d: -f2)); echo $file; jpegtran -copy none -optimize -grayscale -outfile $file $file; }; done < ssim
jpegoptim -s *.jpg
exiftool -overwrite_original -all= *.jpg
# print all converted images for verification
grep -E 'All:0.(9[0-8]|[0-8][0-9])' ssim
### Merge page spreads to single page
Use the function `merge_pages right-page left-page` (without `.jpg`). The
result will be written to `right-page-left-page.jpg`.
:::shell
function merge_pages() {
convert ${2}.jpg ${1}.jpg +append ${1}-${2}.jpg
exiftool -overwrite_original -all= ${1}-${2}.jpg
}
# remove single pages
mkdir single_pages
for i in ???-???.jpg;do mv $(cut -d- -f1 <<< $i).jpg $(cut -d- -f2 <<< $i) single_pages;done
## mpv
### View thumbnails generated by [mpv-gallery-view](https://github.com/occivink/mpv-gallery-view)
:::shell
mpv --pause --demuxer=rawvideo --demuxer-rawvideo-mp-format=bgra --demuxer-rawvideo-w=288 --demuxer-rawvideo-h=162 FILE
To convert the thumbnails to tiled overview images:
:::shell
ffmpeg -codec:v rawvideo -pixel_format bgra -video_size 288x162 -f image2 -pattern_type glob -i '*' -vf tile=layout=10x10 tile-%04d.png
## Download
### Bilibili live recording
:::shell
curl 'https://api.live.bilibili.com/xlive/web-room/v1/record/getLiveRecordUrl?rid=R1sx411c7Xn&platform=html5'|jq -r '.data.list | map(.url) | to_entries[] | .value + "\n out=" + (.key|tostring) + ".flv"' | aria2c --auto-file-renaming=false -x 16 -j 10 -i -
mkvmerge '[' $(find . -name '*.flv'|sort -V) ']' -o merge.mkv
## PDF
### Downsample bitmap PDF
Useful for sending large 300/600 dpi scans via e-mail. Change `pdfimage24` to
`pdfimage8` for greyscale, `300` to the input DPI and `DownScaleFactor` to the
desired downscaling. For some reason this fails when setting the compression to JPEG.
:::shell
gs -sDEVICE=pdfimage24 -r300 -dDownScaleFactor=2 -o downscaled.pdf document.pdf
ImageMagick supports JPEG compression. Set the desired output density and JPEG quality.
:::shell
convert -density 300 -compress jpeg -quality 80 document.pdf downscaled.pdf
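To check the resolution of the embedded images before and after downsampling,
`pdfimages` from poppler can be used (assuming it is installed); the
`x-ppi`/`y-ppi` columns show the effective DPI:
:::shell
pdfimages -list downscaled.pdf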
## Screen sharing
Since there is no good support for WebRTC screen sharing on Wayland, a virtual
webcam device can be used instead.
:::shell
sudo modprobe v4l2loopback exclusive_caps=1 card_label=screensharing # only has to be done once
wf-recorder --muxer=v4l2 --codec=rawvideo --file=/dev/video2 -x yuv420p # adjust /dev/video2 to the actual device
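To find which `/dev/video*` node v4l2loopback created, the devices can be
listed with `v4l2-ctl` from v4l-utils (assuming it is installed); the entry
labelled `screensharing` is the one to use:
:::shell
v4l2-ctl --list-devices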
To remove the virtual webcam device, run `sudo modprobe -r v4l2loopback`.