BASH: Watch 5 QVC YouTube live-streams at once, using MPV


For no other reason than "just because". The Bash script is generic enough to be used in other scenarios. Most of the work is done by MPV and its '--ytdl-format' option. A small delay is added before each mpv call to avoid swamping YouTube with concurrent video requests.

#!/bin/bash

# Stream 5 QVC YouTube live-streams simultaneously.
# - Requires 'mpv' - N.B. Kills all running instances of mpv when exiting.
# - See YouTube format codes, for video quality, below.
# ver. 2017.02.11.21.46.52

### YOUTUBE VIDEO IDS ##########################################################

# QVC ...... USA .......... UK ........ Italy ....... Japan ...... France
videos=('2oG7ZbZnTcA' '8pHCfXXZlts' '-9RIKfrDP2E' 'wMo3F5IouNs' 'uUwo_p57g5c')

### FUNCTIONS ##################################################################

function finish() # Kill all mpv players when exiting
{
  killall mpv
}
trap finish EXIT

function playVideo() # Takes YouTube video ID and start delay (seconds)
{
   sleep "$2" # The "be nice" delay
   mpv --quiet --ytdl-format 91 "https://www.youtube.com/watch?v=$1"
}

### BEGIN ######################################################################

for ytid in "${videos[@]}"; do ((x+=2)); (playVideo "$ytid" "$x" &); done
read -p "Press Enter key to exit"$'\n' # Hold before exiting
#zenity --warning --text="End it all?" --icon-name="" # Zenity hold alternative
exit

### FORMAT CODES ###############################################################

# format code  extension  resolution note
# 91           mp4        144p       HLS , h264, aac  @ 48k
# 92           mp4        240p       HLS , h264, aac  @ 48k
# 93           mp4        360p       HLS , h264, aac  @128k
# 94           mp4        480p       HLS , h264, aac  @128k
# 95           mp4        720p       HLS , h264, aac  @256k
# 96           mp4        1080p      HLS , h264, aac  @256k

The story of Donald Trump's presidential Inauguration, told through the medium of scat painting




Long-exposure photography compared to image-stacking video frames (ImageMagick/FFmpeg)



Pictured above: comparisons of images made from a segment on "Good Mythical Morning" involving "light painting". In the top-left, a 30-second exposure from a still camera in the studio. Below it, an image made using ImageMagick's '-evaluate-sequence' operation on all frames taken from the same 30 seconds of video; in this case the 'max' setting was used, which keeps the maximum pixel values across the stack. In the top-right, a single frame from the video, and below it, 100 frames stacked with FFmpeg using sequential 'tblend' filters.

# ImageMagick - Use with extracted frames or FFmpeg image pipe (limited to 4GB)
convert -limit memory 4GB frames/*.png -evaluate-sequence max merged-frames.png
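
The frames for the command above can be extracted from the source video with FFmpeg beforehand; a minimal sketch (the clip offset, duration, and file names are assumptions, not taken from the original):

# Extract ~30 seconds of frames (starting at 2m04s) into a 'frames' directory
mkdir -p frames
ffmpeg -ss 0:02:04 -t 30 -i video.mp4 frames/%05d.png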

# FFmpeg - Chain of tblend filters (N.B. inefficient - better ways to do this)
ffmpeg -i video.mp4 -vf tblend=all_mode=lighten,tblend=all_mode=lighten,... 
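
The long repeated filter string can be generated with a short Bash loop rather than typed out by hand; a rough sketch (input/output names are assumptions):

# Build a chain of 99 'tblend' lighten filters and render the result as a video
vf="tblend=all_mode=lighten"
for _ in {2..99}; do vf+=",tblend=all_mode=lighten"; done
ffmpeg -i video.mp4 -vf "$vf" -c:v libx264 stacked.mp4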
As a comparison, here is an image made from the same frames but using 'mean' average with ImageMagick.
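That amounts to the earlier command with 'mean' substituted for 'max' (output name assumed):

convert -limit memory 4GB frames/*.png -evaluate-sequence mean merged-frames-mean.png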



A video demo for the FFmpeg version


source video: https://www.youtube.com/watch?v=1tdKZYT4YLY&t=2m4s

FFmpeg: Generate an image of tiled results from all 'blend' filter types



Two rudimentary Bash scripts that take two file paths as input for an FFmpeg instance, which in turn uses those files as sources for 'blend' filters. Image dimensions are irrelevant, as both input images are scaled to '320x320' (for ease of formatting). Multiple for-loops generate the bulk of the filtergraph. There are more elegant ways of doing this, using multiple outputs and secondary applications, but the solutions here are based on a single instance of FFmpeg.

Note the "$format" variable in the scripts. Different pixel formats will produce different results, which is part of the reason why these scripts do not produce "all possible blend results".

There are two versions: one outputs "all_mode"-only results [image above], and the other outputs the results of all blend modes. The "all_mode"-only version is probably the more useful of the two. Even if neither script is used, the image examples included here could be useful as references, as they give a general idea of the effect of each 'blend' type.

script: ffmpeg-tiled-blend-results-all_mode-only.sh
script: ffmpeg-tiled-blend-results.sh

The scripts were written to aid in choosing the most suitable blend filter for tasks/projects, something that can be hard to judge beforehand. The images produced allow a quick assessment of the possibilities between two test frames.
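
As a rough illustration of the approach (a loop-built filtergraph run in a single FFmpeg instance), here is a minimal sketch covering just four blend modes; the script name, output name, and mode selection are assumptions, not taken from the linked scripts:

#!/bin/bash
# Minimal sketch: tile the results of four 'blend' all_mode settings
# from two input images, using a single FFmpeg instance.
# Usage: ./blend-tile-sketch.sh imageA imageB  ->  tiled.png

a="$1"
b="$2"
format="yuv420p"                       # pixel format - changing it changes results
modes=(lighten darken multiply screen)

# Scale both inputs to 320x320 and split each into one copy per blend mode
filter="[0:v]scale=320:320,setsar=1,format=${format},split=4[a0][a1][a2][a3];"
filter+="[1:v]scale=320:320,setsar=1,format=${format},split=4[b0][b1][b2][b3];"

# One blend chain per mode, generated by a for-loop
for i in "${!modes[@]}"; do
   filter+="[a${i}][b${i}]blend=all_mode=${modes[$i]}[v${i}];"
done

# Stack the four results into a 2x2 grid
filter+="[v0][v1]hstack[top];[v2][v3]hstack[bottom];[top][bottom]vstack[out]"

ffmpeg -i "$a" -i "$b" -filter_complex "$filter" -map "[out]" -frames:v 1 tiled.png

The linked scripts handle the full set of blend modes and the "$format" choice mentioned above; this sketch only demonstrates the loop-built filtergraph idea.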



Input A source: https://www.flickr.com/photos/38983646@N06/15545146285/
Input B source: https://www.flickr.com/photos/archer10/4244381931/

Painting

Digital airbrush painting of a slightly surreal-looking horse muzzle with bridle

2016


So much negativity about the year 2016, but really, every year since 2007 has been shit.
tweet: https://twitter.com/oioiiooixiii/status/815250339452567553

context: http://bgr.com/2016/12/30/was-2016-a-bad-year-yes-it-was/