It raises an interesting issue, especially as it relates to long-form streaming audio and video, where the user isn't really uploading a binary image or document, as is typically the case with binary data in web pages.
I've been experimenting quite a bit with Flash as a container for multimedia messages and content, with user-recorded audio and video as the primary data types. But the architecture is very different from that of a document-based HTML page. The audio and video data is captured and recorded to a streaming server (Flash Communication Server) in real time from the client, and then a Flash application loads and streams it in real time from within an HTML page.
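To make the architecture concrete, here's a rough sketch of the two halves in Flash MX-era ActionScript. The server address, application name, and stream name are hypothetical; this assumes a FlashCom application set up to accept recorded streams.

```actionscript
// Capture side: record the user's camera and mic to the
// Flash Communication Server in real time.
nc = new NetConnection();
nc.connect("rtmp://flashcom.example.com/recorder"); // hypothetical server/app

ns = new NetStream(nc);
ns.attachAudio(Microphone.get());
ns.attachVideo(Camera.get());
ns.publish("userMessage", "record"); // server persists the stream as an FLV

// Playback side: a Flash movie embedded in an HTML page
// streams the same recording back over the same connection type.
playbackNS = new NetStream(nc);
myVideo.attachVideo(playbackNS); // myVideo is a Video object on the stage
playbackNS.play("userMessage");
```

The point is that the binary data only ever lives on the streaming server; the HTML page just hosts the SWF that plays it.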
The challenges here for an open API are plenty:

- With long-form video, especially, we can't assume (yet) that the weblog system is the primary storage and delivery vehicle for the streaming asset.
- Each streaming architecture for rich media is different, and weblog systems don't have a standard API to be aware of that data.
- How can weblogs account for content and data external to their system?
The solution we've come up with is very simple: don't use the weblog to store or deliver any of the binary data; just use it to encapsulate HTML fragments that do live in pages and can therefore carry category metadata and participate in RSS feeds. But this just doesn't feel right; it's sort of hacking around a system that hasn't yet been designed to handle multimedia conversations.
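In practice the workaround looks something like the following: the weblog entry's body is nothing but an embed fragment pointing at a Flash player SWF, which in turn streams from the FlashCom server. All URLs, stream names, and categories here are hypothetical placeholders.

```xml
<item>
  <title>Audio message from Jane</title>
  <category>multimedia-messages</category>
  <description><![CDATA[
    <!-- The weblog stores only this fragment; the binary
         stream never touches the weblog's own storage. -->
    <object classid="clsid:D27CDB6E-AE6D-11cf-96B8-444553540000"
            width="320" height="240">
      <param name="movie"
             value="http://www.example.com/player.swf?stream=janeMessage" />
      <embed src="http://www.example.com/player.swf?stream=janeMessage"
             width="320" height="240"
             type="application/x-shockwave-flash" />
    </object>
  ]]></description>
  <link>http://www.example.com/archive/2003/janeMessage.html</link>
</item>
```

The fragment rides along in the RSS feed and picks up category metadata like any other entry, which is exactly the encapsulation trick described above, and exactly why it feels like a hack: the feed knows nothing about the stream itself.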
10:11:28 AM