AVAssetReader is fantastic, but I can only see how to use it with a local asset (a file, or I guess a composition), so:

assetReader = try AVAssetReader(asset: self.asset)
...
assetReader.addOutput(readerOutput)

and so on.
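
For context, here's a minimal sketch of what I mean, fleshed out so it runs end to end; fileURL, the 32BGRA output settings, and reading only the first video track are just choices for illustration:

import AVFoundation
import CoreVideo

func readLocalAsset(at fileURL: URL) throws {
    let asset = AVURLAsset(url: fileURL)
    let assetReader = try AVAssetReader(asset: asset)

    // Decode the first video track to BGRA pixel buffers.
    guard let videoTrack = asset.tracks(withMediaType: .video).first else { return }
    let readerOutput = AVAssetReaderTrackOutput(
        track: videoTrack,
        outputSettings: [kCVPixelBufferPixelFormatTypeKey as String: kCVPixelFormatType_32BGRA]
    )
    assetReader.addOutput(readerOutput)
    assetReader.startReading()

    // Pull decoded sample buffers until the track is exhausted.
    while let sampleBuffer = readerOutput.copyNextSampleBuffer() {
        _ = sampleBuffer // ... process the frame ...
    }
}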
Now say you have an arriving stream instead (for example, Apple's sample .m3u8 streams at https://developer.apple.com/streaming/examples/ ). Can AVAssetReader be used with streams, or only with local files? I just plain cannot find this explained anywhere. (Maybe it's obvious if you're more familiar with it. :/ )
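
For concreteness, the attempt would look something like this (the URL is just a placeholder standing in for one of the Apple example playlists linked above); whether the reader can actually be created and started for such an asset is exactly what I can't find documented:

import AVFoundation

// Placeholder URL standing in for one of Apple's example .m3u8 playlists.
let streamURL = URL(string: "https://example.com/stream/master.m3u8")!
let remoteAsset = AVURLAsset(url: streamURL)

do {
    let reader = try AVAssetReader(asset: remoteAsset)
    // ... add outputs and call startReading(), as in the local-file case ...
    _ = reader
} catch {
    print("Could not create AVAssetReader for the streamed asset: \(error)")
}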

It's not obvious. Patching together the header file comments for both AVAssetReader and AVComposition gives the strong impression of an API designed only for local assets, although the language does not explicitly rule out non-local assets.

If you're interested in video only, and don't mind doing the processing as part of playback, you can capture frames from a remote asset by adding an AVPlayerItemVideoOutput to your AVPlayerItem. If you're interested in audio, you're up a creek.
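
Here's a rough sketch of that video-only route, assuming a remote URL and a simple polling pull (in practice you'd drive the pull from a CADisplayLink or CVDisplayLink); streamURL is a placeholder:

import AVFoundation
import CoreVideo
import QuartzCore

// Placeholder for the remote stream you want frames from.
let streamURL = URL(string: "https://example.com/stream/master.m3u8")!

let playerItem = AVPlayerItem(url: streamURL)

// Attach a video output so decoded frames can be copied out during playback.
let videoOutput = AVPlayerItemVideoOutput(pixelBufferAttributes: [
    kCVPixelBufferPixelFormatTypeKey as String: kCVPixelFormatType_32BGRA
])
playerItem.add(videoOutput)

let player = AVPlayer(playerItem: playerItem)
player.play()

// Call this periodically to grab the most recent frame, if there is one.
func grabCurrentFrame() -> CVPixelBuffer? {
    let itemTime = videoOutput.itemTime(forHostTime: CACurrentMediaTime())
    guard videoOutput.hasNewPixelBuffer(forItemTime: itemTime) else { return nil }
    return videoOutput.copyPixelBuffer(forItemTime: itemTime, itemTimeForDisplay: nil)
}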