SignalK / specification

Signal K is a JSON-based format for storing and sharing marine data from different sources (e.g. NMEA 0183, NMEA 2000, SeaTalk, etc.)

RFC 0004: Entertainment #293

Open timmathews opened 7 years ago

timmathews commented 7 years ago

Summary

The goal of this RFC is to provide an entertainment group in the Signal K specification. It will be a new node directly under a vessel, encompassing all of the data necessary to display data from and control devices which provide audio or video entertainment streams.

The initial structure of the entertainment group will be focused on what data Fusion provides in their proprietary PGNs and what data is available in the stock n2k entertainment PGNs. Where these PGNs fall short, I will attempt to fill in the gaps based on what I learned doing Crestron AV installs for a year or so.

The NMEA 2000 PGNs are detailed in Technical Corrigendum TC# 2000 20160715 and Technical Corrigendum TC# 2000 20160725, available publicly on the NMEA website.

Motivation

@sbender9 has been working on adding Fusion support to n2k-signalk and it would be helpful if there were a set of paths to map to that we can all agree on. AV is also a really good area to start experimenting with two-way control, because worst-case scenarios here aren't particularly dangerous.

Detailed design

Regardless of the source of an AV stream, the data attached to it is basically the same. Some sources will have more and some will have less but these should cover 90% of all use cases. Because different sources could potentially be playing in different zones (not in a Fusion system, but certainly in larger installations), entertainment should follow the same basic structure as electrical.

Within the entertainment group there are inputs (stereos, satellite tuners, AppleTV, DVD libraries, etc.) and outputs (amplifiers, TVs, Bluetooth bridges). Because outputs have a $source parameter, a JSON path reference to their connected input, it is possible to describe arbitrarily complex AV installations and routings with this schema.
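As a sketch of how a consumer might follow those $source references (the flat path map and the resolve_source() helper below are illustrative, not part of the proposal):

```python
# Illustrative only: a flat path -> value map stands in for part of a
# vessel's Signal K model, and resolve_source() is a hypothetical helper.

def resolve_source(model, output_path):
    """Return the input path that an output is currently routed from."""
    return model.get(output_path + ".$source")

model = {
    "entertainment.device.amp0.output.zone1.$source":
        "entertainment.device.stereo0.input.stereo0",
    "entertainment.device.stereo0.input.stereo0.name": "Cockpit Stereo",
}

src = resolve_source(model, "entertainment.device.amp0.output.zone1")
name = model.get(src + ".name")  # the input currently feeding this output
```

Chaining lookups like this is what lets a single $source per output describe an arbitrarily deep routing graph.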

entertainment.device.(^[A-Za-z0-9]+$).input.(^[A-Za-z0-9]+$).name
entertainment.device.(^[A-Za-z0-9]+$).input.(^[A-Za-z0-9]+$).deviceType
entertainment.device.(^[A-Za-z0-9]+$).input.(^[A-Za-z0-9]+$).audioFormat
entertainment.device.(^[A-Za-z0-9]+$).input.(^[A-Za-z0-9]+$).videoFormat

entertainment.device.(^[A-Za-z0-9]+$).input.(^[A-Za-z0-9]+$).transport.playbackState
entertainment.device.(^[A-Za-z0-9]+$).input.(^[A-Za-z0-9]+$).transport.playbackSpeed
entertainment.device.(^[A-Za-z0-9]+$).input.(^[A-Za-z0-9]+$).transport.repeatState
entertainment.device.(^[A-Za-z0-9]+$).input.(^[A-Za-z0-9]+$).transport.shuffleState

entertainment.device.(^[A-Za-z0-9]+$).input.(^[A-Za-z0-9]+$).track.name
entertainment.device.(^[A-Za-z0-9]+$).input.(^[A-Za-z0-9]+$).track.artistName
entertainment.device.(^[A-Za-z0-9]+$).input.(^[A-Za-z0-9]+$).track.number
entertainment.device.(^[A-Za-z0-9]+$).input.(^[A-Za-z0-9]+$).track.totalTracks
entertainment.device.(^[A-Za-z0-9]+$).input.(^[A-Za-z0-9]+$).track.genre
entertainment.device.(^[A-Za-z0-9]+$).input.(^[A-Za-z0-9]+$).track.releaseDate
entertainment.device.(^[A-Za-z0-9]+$).input.(^[A-Za-z0-9]+$).track.albumName
entertainment.device.(^[A-Za-z0-9]+$).input.(^[A-Za-z0-9]+$).track.albumArt
entertainment.device.(^[A-Za-z0-9]+$).input.(^[A-Za-z0-9]+$).track.elapsedTime
entertainment.device.(^[A-Za-z0-9]+$).input.(^[A-Za-z0-9]+$).track.length

entertainment.device.(^[A-Za-z0-9]+$).input.(^[A-Za-z0-9]+$).tuner.mode
entertainment.device.(^[A-Za-z0-9]+$).input.(^[A-Za-z0-9]+$).tuner.region
entertainment.device.(^[A-Za-z0-9]+$).input.(^[A-Za-z0-9]+$).tuner.frequency
entertainment.device.(^[A-Za-z0-9]+$).input.(^[A-Za-z0-9]+$).tuner.channel
entertainment.device.(^[A-Za-z0-9]+$).input.(^[A-Za-z0-9]+$).tuner.subChannel
entertainment.device.(^[A-Za-z0-9]+$).input.(^[A-Za-z0-9]+$).tuner.stationName
entertainment.device.(^[A-Za-z0-9]+$).input.(^[A-Za-z0-9]+$).tuner.isStereo
entertainment.device.(^[A-Za-z0-9]+$).input.(^[A-Za-z0-9]+$).tuner.squelch
entertainment.device.(^[A-Za-z0-9]+$).input.(^[A-Za-z0-9]+$).tuner.signalStrength

entertainment.device.(^[A-Za-z0-9]+$).output.(^[A-Za-z0-9]+$).name
entertainment.device.(^[A-Za-z0-9]+$).output.(^[A-Za-z0-9]+$).deviceType
entertainment.device.(^[A-Za-z0-9]+$).output.(^[A-Za-z0-9]+$).$source
entertainment.device.(^[A-Za-z0-9]+$).output.(^[A-Za-z0-9]+$).audioFormat
entertainment.device.(^[A-Za-z0-9]+$).output.(^[A-Za-z0-9]+$).videoFormat
entertainment.device.(^[A-Za-z0-9]+$).output.(^[A-Za-z0-9]+$).isMuted
entertainment.device.(^[A-Za-z0-9]+$).output.(^[A-Za-z0-9]+$).fade
entertainment.device.(^[A-Za-z0-9]+$).output.(^[A-Za-z0-9]+$).balance

entertainment.device.(^[A-Za-z0-9]+$).output.(^[A-Za-z0-9]+$).volume.master
entertainment.device.(^[A-Za-z0-9]+$).output.(^[A-Za-z0-9]+$).volume.frontLeft
entertainment.device.(^[A-Za-z0-9]+$).output.(^[A-Za-z0-9]+$).volume.frontRight
entertainment.device.(^[A-Za-z0-9]+$).output.(^[A-Za-z0-9]+$).volume.frontCenter
entertainment.device.(^[A-Za-z0-9]+$).output.(^[A-Za-z0-9]+$).volume.rearLeft
entertainment.device.(^[A-Za-z0-9]+$).output.(^[A-Za-z0-9]+$).volume.rearRight
entertainment.device.(^[A-Za-z0-9]+$).output.(^[A-Za-z0-9]+$).volume.surroundLeft
entertainment.device.(^[A-Za-z0-9]+$).output.(^[A-Za-z0-9]+$).volume.surroundRight
entertainment.device.(^[A-Za-z0-9]+$).output.(^[A-Za-z0-9]+$).volume.sub

entertainment.device.(^[A-Za-z0-9]+$).output.(^[A-Za-z0-9]+$).equalizer.preset
entertainment.device.(^[A-Za-z0-9]+$).output.(^[A-Za-z0-9]+$).equalizer.bass
entertainment.device.(^[A-Za-z0-9]+$).output.(^[A-Za-z0-9]+$).equalizer.mid
entertainment.device.(^[A-Za-z0-9]+$).output.(^[A-Za-z0-9]+$).equalizer.treble
entertainment.device.(^[A-Za-z0-9]+$).output.(^[A-Za-z0-9]+$).equalizer.(^[0-9]+kHz$)
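Each (^[A-Za-z0-9]+$) segment in the paths above is the regex constraining device, input and output ids. A quick sketch of what it accepts and rejects:

```python
import re

# The id pattern used in the path templates above: alphanumeric only.
ID_PATTERN = re.compile(r"^[A-Za-z0-9]+$")

assert ID_PATTERN.match("fusion0")        # simple auto-generated name: ok
assert ID_PATTERN.match("stereo0")
assert not ID_PATTERN.match("my_stereo")  # underscores are rejected
assert not ID_PATTERN.match("zone-1")     # so are hyphens
```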

deviceType values: See NMEA definitions above

playbackState values:
 * Stopped
 * Paused
 * Playing
 * Rewinding
 * Skipping Forward
 * Skipping Backward

repeatState values:
 * None
 * Repeat Track
 * Repeat Album
 * Repeat Playlist

shuffleState values:
 * None
 * Shuffle Album
 * Shuffle Playlist
 * Shuffle All

audioFormat values:
 * Mono
 * Stereo
 * Surround

videoFormat values:
 * ATSC
 * DVB
 * NTSC
 * PAL

tuner.region values:
 * US
 * Europe
 * Asia
 * Middle East
 * Latin America
 * Australia
 * Russia
 * Japan

tuner.mode values:
 * AM
 * FM
 * Shortwave
 * VHF
 * SiriusXM
 * CATV
 * DTV
 * Satellite TV

Drawbacks

Other than being incomplete, none at this time

Alternatives

The videoFormat values come from the NMEA PGNs above, but it's not really very useful information. It would be better (probably) for the videoFormat field to convey resolution information, e.g. 480p, 720p, 1080p, 2k, 4k, etc. or encoding format such as MPEG2, H.264, etc.

Likewise, the audioFormat values are very basic and it may make more sense for these to contain bitrate and encoding information.

As an alternative to the simple fields above, maybe this information should be expanded into a contentStream section with more information about the encoding of the current content.

entertainment.device.(^[A-Za-z0-9]+$).input.(^[A-Za-z0-9]+$).contentStream.containerFormat

entertainment.device.(^[A-Za-z0-9]+$).input.(^[A-Za-z0-9]+$).contentStream.audio.bitrate
entertainment.device.(^[A-Za-z0-9]+$).input.(^[A-Za-z0-9]+$).contentStream.audio.samplingFrequency
entertainment.device.(^[A-Za-z0-9]+$).input.(^[A-Za-z0-9]+$).contentStream.audio.codec
entertainment.device.(^[A-Za-z0-9]+$).input.(^[A-Za-z0-9]+$).contentStream.audio.channelCount

entertainment.device.(^[A-Za-z0-9]+$).input.(^[A-Za-z0-9]+$).contentStream.video.bitrate
entertainment.device.(^[A-Za-z0-9]+$).input.(^[A-Za-z0-9]+$).contentStream.video.bitDepth
entertainment.device.(^[A-Za-z0-9]+$).input.(^[A-Za-z0-9]+$).contentStream.video.chromaFormat
entertainment.device.(^[A-Za-z0-9]+$).input.(^[A-Za-z0-9]+$).contentStream.video.resolution
entertainment.device.(^[A-Za-z0-9]+$).input.(^[A-Za-z0-9]+$).contentStream.video.frameRate
entertainment.device.(^[A-Za-z0-9]+$).input.(^[A-Za-z0-9]+$).contentStream.video.codec

Unresolved questions

Music / video library support

Related

Tracking Branch: rfc0004
Example Full Schema: entertainment-full.json

pod909 commented 7 years ago

Apologies for needing to open up a previous discussion but this has only really come up as part of the investigation into MQTT and support for sensors.

Including the meta in the middle of the key prevents keys from being used by non-context-aware network nodes and stops keys from different groups from being provided in a sensible manner for single pieces of equipment. The introduction of meta in this way for electrical equipment causes a big headache.

With respect for all the work done to date, the suggestion would be that meta should be carried in the context rather than the key.

e.g.

vessels.12345.engineRoom1.engine5.battery.electrical...
vessels.12345.bridge.mediaControl.electrical...
vessels.12345.bridge.mediaControl.entertainment...

You can see this issue in the keys above: my TV at home has inputs and outputs. These cannot be grouped together with this approach.

tkurki commented 7 years ago

What meta are you referring to? There is a meta key defined in the schema for all paths: http://signalk.org/specification/master/data_model_metadata.html. I assume you are not talking about this - please be more specific to minimize confusion.

A single path will never be enough to describe all the connections and arrangements onboard. In your example the path vessels.12345.engineRoom1.engine5.battery.electrical is trying to describe both spatial arrangement (engine5 is in engineRoom1) and connections (battery is connected to engine5). How about a secondary battery? What if the battery is also connected to engine4 just as well, with a switch in between?

We need to pick one unique path per something we have data about. The path needs to have an artificial id if we need to capture data that is fundamentally about different things: engine rpms for twin engines (unique paths) versus two echosounders (one path, multiple values).

Then connections and arrangements between the pieces of equipment can be modeled separately.

pod909 commented 7 years ago

I've seen a lot of things described as meta. I'm talking about the inclusion of context in the middle of an attribute key as described here.

I agree that there are many-to-many challenges here. These occur wherever this information is in the full path. I used examples with spatial context as they were what came to mind, and that's a distraction from the point I need to make about strict separation of the attribute key and its context.

Mixing information about context with the description of the attribute is a big issue for nodes that do not know about the context they exist in, which includes most nodes at which values originate.

...

On the other points mentioned...

This all ties in to discussion on relationships and equipment.

Ultimately there are network nodes (battery1, battery2, battery3, paddleWheelStarboard) that have attributes that we want to know about. These coincide with the values a source node (e.g. a sensor) may want to transmit. Those attributes may cover many groups.

These source nodes exist in many contexts. To date Signal K has chosen to focus on the vessel, but we've started to identify others such as location within the vessel or electrical circuit ... or how about a global nest of weather stations.

The need for a root-level list/manifest of source nodes/devices/equipment has been called out a few times. Maybe that could be at the vessel level? Either way, additional contexts should probably be allowed to be expressed alongside the vessels, containing a reference to the source nodes they contain rather than the node objects themselves.

A large part of the job of a Signal K Server is to understand the mapping from source node to context and to copy the values provided by the source node into the context for onward consumption. Nothing changes there.

pod909 commented 7 years ago

.. run out of time to provide a concrete suggestion but I have one, want to finish the RFC on streaming first!

joabakk commented 7 years ago

I think this discussion may end up on the critical path to v1. There are many opinions on the tree structure. Personally I do not think that mapping the connections or relations in detail in the path structure adds any value for the cost. We have talked about fault finding in complex systems, but that should be dealt with in another fashion. How about adding attributes at the end of the path where these are needed? Then you can still place your stereo on the chart table without changing the whole tree structure of the schema.

sbender9 commented 7 years ago

Can we add the device name somewhere?

sbender9 commented 7 years ago

Also add genre to track info?

timmathews commented 7 years ago

Added genre

sumps commented 7 years ago

How does a client know what an entertainment source/input is capable of, so that it does not let a user try to skip a track on, say, a satellite radio receiver?

sbender9 commented 7 years ago

What's deviceType?

pod909 commented 7 years ago

Proposal on relationships, locally defined context, values relevant to more than one context, equipment, devices etc. added to RFC0001 #264

timmathews commented 7 years ago

@sumps, trying to issue a command to a source that doesn't support it would simply be a NOOP. Assuming that the client issued the command via an HTTP POST the server should respond with 405 Method Not Allowed. Issued via WebSockets, the client should get back something similar (which we need to define as part of the two-way communication spec).
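A minimal sketch of that dispatch logic (the capability table, command names and handle_command() helper are all illustrative; the actual command set is undefined until the two-way spec lands):

```python
# Illustrative only: a per-deviceType capability table and a dispatcher that
# answers with HTTP-style status codes. The real command set and transport
# are not yet specified.

CAPABILITIES = {
    "FM": {"setFrequency", "setVolume"},
    "SACD": {"play", "pause", "skipForward", "skipBackward", "setVolume"},
}

def handle_command(device_type, command):
    """Return 200 if the device can perform the command, else 405."""
    if command in CAPABILITIES.get(device_type, set()):
        return 200  # accepted (actual dispatch elided)
    return 405      # Method Not Allowed: command is a NOOP for this device

assert handle_command("SACD", "skipForward") == 200
assert handle_command("FM", "skipForward") == 405  # can't skip on a tuner
```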

Information on which buttons to display based on the chosen input probably belongs in metadata somewhere, since (to me) it fulfills the same function as colored zones in an analog gauge. That is, it's simply a user experience affordance. Interestingly, it's metadata which we can get directly from n2k: see PGN 130573, Field 8.

@sbender9, deviceType was pulled directly from the n2k PGN definitions linked above. It's probably somewhat duplicative, but as a general rule if there is a parameter in a parameter group in n2k, I like to provide it with a home. See specifically PGN 130569, Field 2.

timmathews commented 7 years ago

I've made some updates to the proposed schema above. Currently the only property under entertainment is device. I'm doing this to leave room for potentially adding library and other items at the same level as device.

Devices are named using our standard alpha-numeric, underscore-rejecting regex. This allows for simple auto-generated names like fusion0. Devices have inputs and outputs.

The simplest AV installation would be a single output AM/FM stereo and it might look like this (timestamp, meta, and other non-value stuff elided):

{
  "entertainment": {
    "device": {
      "stereo0": {
        "input": {
          "stereo0": {
            "name": {
              "value": "Cockpit Stereo"
            },
            "deviceType": {
              "value": "FM"
            },
            "tuner": {
              "mode": {
                "value": "FM"
              },
              "frequency": {
                "value": 88100000
              },
              "stationName": {
                "value": "WTMD"
              },
              "isStereo": {
                "value": true
              }
            }
          }
        },
        "output": {
          "stereo0": {
            "name": {
              "value": "Cockpit Stereo"
            },
            "deviceType": {
              "value": "2-Ch Stereo"
            },
            "$source": {
              "value": "entertainment.device.stereo0.input.stereo0"
            },
            "isMuted": {
              "value": false
            },
            "fade": {
              "value": 0
            },
            "balance": {
              "value": 0
            },
            "volume": {
              "master": {
                "value": 23
              }
            }
          }
        }
      }
    }
  }
}
rob42 commented 7 years ago

This is a good addition, but take the time to get it right, rather than just mapping n2k or Fusion.

It may be worth looking at the https://kodi.tv/ project. I've set one up on an RPi and it's a pretty impressive way to manage audio, video and images, including lookups of online libraries etc. Basically I can watch or listen to pretty much anything/anytime :-)

I'm not suggesting incorporating the kodi project, but there are lessons on how the entertainment section is likely to develop, and ideas on integrating with external media sources.

timmathews commented 7 years ago

Just completed the first draft of the entertainment schema in the rfc0004 branch linked above.

Please ignore the failing test. It only passes once the spec has actually been merged and published on signalk.org.

pod909 commented 7 years ago

@timmathews by putting "device" in you're creating a de facto universal rule for the handling of all devices.

How are the electrical values of stereo0 represented in the full example?

sbender9 commented 7 years ago

Thanks tim! Now I can implement some tests.

sbender9 commented 7 years ago

We should probably add a subwoofer volume to zone

timmathews commented 7 years ago

It's in there. entertainment.device.(^[A-Za-z0-9]+$).output.(^[A-Za-z0-9]+$).volume.sub

I'm wondering if fade and balance belong under volume?

sbender9 commented 7 years ago

Probably

sbender9 commented 7 years ago

https://github.com/sbender9/n2k-signalk/tree/fusion-stereo-updates

Has fusion stereo conversions that comply with the current spec

sbender9 commented 7 years ago

@timmathews I think we need to specify somehow the volume and equalizer ranges. Are they 0-100 ?

timmathews commented 7 years ago

Depends on the device. Those will need to get defined in _meta.

sbender9 commented 7 years ago

Why don't we standardize? We can convert to standard values elsewhere.

sbender9 commented 7 years ago

Yeah, the more I think about it, the more I think we should do the same thing here that we do with "units" everywhere else in the spec. We don't have to look in _meta to find out if windSpeed is in m/s. We know it is. If a device reported windSpeed in knots, it would have to be converted to m/s. We should do the same with these.

tkurki commented 7 years ago

@timmathews it would be easier for me to reason about this rfc's spec changes if the tracking branch would have an open pull request, allowing commenting with Github's review workflow. What do you think?

tkurki commented 7 years ago

The very top of the hierarchy is the plural vessels, and so are /resources/routes, /resources/waypoints, /electrical/batteries etc. Propulsion is the notable exception.

To me entertainment/devices would work very well.

joabakk commented 7 years ago

I suggest volume and eq as ratio. 0.5 would mean half of what the unit is capable of. More in line with the rest of the spec
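A sketch of this ratio approach, assuming a linear mapping over whatever range the device reports (function names are illustrative):

```python
# Illustrative: normalise a device's native volume range to a 0.0-1.0 ratio,
# as proposed above, and map a ratio back to device units.

def to_ratio(value, dev_min, dev_max):
    return (value - dev_min) / (dev_max - dev_min)

def from_ratio(ratio, dev_min, dev_max):
    return dev_min + ratio * (dev_max - dev_min)

assert to_ratio(50, 0, 100) == 0.5   # halfway through a 0-100 device scale
assert from_ratio(0.5, 0, 100) == 50
```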

pod909 commented 7 years ago

so long as volume can go up to 11 it's all good ;)

If we're having entertainment/devices do we also have navigation/devices, environment/devices, steering/devices and performance/devices as somewhere to put these things?

Why are sensors in their own vessel level grouping but entertainment, electrics and propulsion things are not?

The lack of uniqueness of bottom level keys is going to be an issue I suspect...

jboynes commented 7 years ago

@pod909 I'm in the middle of writing an RFC on keys and values to address this. Gimme a day to wrap it up :)

tkurki commented 7 years ago

@timmathews did you have something specific in mind with the creation of the devices intermediate level? If there is never going to be any sibling it is a bit superfluous. Something outside the context of individual devices?

As far as I understand it entertainment is about controlling & information about entertainment devices.

Navigation is about the vessel's navigational properties, not devices. The source of those properties is separately captured under `/sources`.

Performance groups together performance related properties, no need for devices with the keys there.

Propulsion has "propulsion units" under it.

> Why are sensors in their own vessel level grouping but entertainment, electrics and propulsion things are not?

As far as I know sensors is not used anywhere and for the time being I believe it should be removed, but I don't see your point here:
http://signalk.org/specification/master/keys/#vesselsregexpsensors
http://signalk.org/specification/master/keys/#vesselsregexppropulsion
http://signalk.org/specification/master/keys/#vesselsregexpelectrical
They are all under the vessel level grouping.

tkurki commented 7 years ago

What if we are on a cruise ship and there's live entertainment, where do we put that?

pod909 commented 7 years ago

Navigation, environment and performance data all come from many places/devices.

Multiple performance devices (devices providing performance values) are pretty common on race boats. Some examples:

There could well be the raw value from a sensor, a calibrated value from one system, and a back-calculated value from another. And that's just apparent wind.

Unfortunately those MFDs also have depth and GPS chips in them these days, and some of them can control your entertainment.

Still trying to get my head around Sources. I thought it was a pretty open ended structure focused on which transport a server found a value in?

timmathews commented 7 years ago

@tkurki, I anticipate a library group as a sibling to device (or devices). I'm not 100% sure how it should work, but I'd like to be able to have a holistic view of the ship's media library across multiple devices, with a pointer to the device that hosts it (in the case of, say, multiple MP3 players), so that if you build up a playlist of music from different sources, a plugin in the server could detect that and automatically handle input switching.

tkurki commented 7 years ago

Performance data from multiple sources (in this case, two simulated NMEA 0183 sources nmea1 and nmea2, both producing the VPW sentence) looks like this:

[screenshot of the resulting Signal K data elided]

sumps commented 7 years ago

I don't feel that we will reach a good solution for entertainment if we do not get our heads out of the detail, i.e. finding a quick solution for Fusion, and consider the bigger picture.

@rob42 made a very good suggestion about looking at Kodi and seeing how other Open Source projects are handling entertainment systems and then applying it to marine networks.

Also we should look at the structures and approaches we have used in other groups, rather than define or "bolt on" another new method with devices.

jboynes commented 7 years ago

+1

I've yet to find time to dig into this in detail, but at first blush this seems very centered on traditional systems (a.k.a. a car stereo jammed in a boat) rather than today's richer environment. Some areas that strike me as problematic: multi-channel systems; more complex media streams (e.g. different video codecs, never mind complex audio codecs like Atmos, or variable bit rates); and DSP sound field processing (now coming down to car audio systems, e.g. room/cab correction).

timmathews commented 7 years ago

I'm curious why everyone thinks that this is Fusion specific? I've tried to approach this from my experience with large home AV installs (Crestron, AMX and the like) with dozens of sources and outputs all with different capabilities. What I learned doing that sort of work is that there is no 100% generic solution that will cover every type of device out there, but most AV devices have properties covered by the elements above.

Here's a quick example which uses discrete components (no Fusion or other marine-specific AV involved):

{
  "entertainment": {
    "device": {
      "cd0": {
        "input": {
          "tray0": {
            "name": "CD Player, Left Tray",
            "deviceType": "SACD",
            "audioFormat": "Surround",
            "transport": {
              "playbackState": "Playing"
            }
          },
          "tray1": {
            "name": "CD Player, Right Tray",
            "deviceType": "SACD",
            "audioFormat": "Surround",
            "transport": {
              "playbackState": "Playing"
            }
          }
        }
      },
      "receiver0": {
        "input": {
          "digital1": {
            "name": "Digital Coax Input 1",
            "deviceType": "SPDIF Input",
            "audioFormat": "Surround"
          }
        },
        "output": {
          "zone1": {
            "name": "Main output zone (Saloon)",
            "deviceType": "Multichannel Amplifier",
            "audioFormat": "Surround",
            "$source": "entertainment.device.cd0.input.tray0",
            "isMuted": false,
            "volume": {
              "master": -18.3
            }
          }
        }
      }
    }
  }
}

This is a CD player with two trays, connected to a standard AV receiver. I've simplified a bit (in a real SK system the values under each key would be objects with value, timestamp, etc). I could go on; for instance a television would be a device with (likely) just an output section. One thing that may be useful to add is a $source key to the input objects so that it is easier to see the connection between the CD player and the receiver via the digital coax input.

@jboynes, I've described a contentStream object in the Alternatives section which should cover all of the additional data related to a specific media stream, including bitrates, codecs, framerate, resolution, etc.

timmathews commented 7 years ago

Kodi is a media player (formerly XBMC). I'm not sure what lessons it has to teach us. There is certainly room for a lot more data regarding media tracks though. For instance:

I'm not committed to the current hierarchy, but I've yet to see a better suggestion. We need to support devices which are strictly input/source devices (most components in an AV system), output only (single zone amplifiers, televisions), and which have both inputs and outputs (AV receivers, matrix switches, multizone amplifiers). We also need to support all-in-one or hybrid devices such as the Fusion stereo and AV receivers which often have AirPlay, SiriusXM, AM/FM tuners, etc. built in.

sumps commented 7 years ago

@timmathews I didn't mean to denigrate your proposal, particularly as it is the only one put forward so far and does seem to be pretty broad in its scope. What I am struggling with is that we are proposing a large change to the schema with a completely new group and yet there are no "devices" that we can control/communicate with except those that use proprietary mechanisms or the new NMEA 2000 implementation.

I think we either do an experimental branch/plugin for Node Server to work with Fusion so that we can gain some experience and understand what is possible or we do a larger change with a new entertainment group but based around an open source technology/project where we immediately have access to "devices" that we can control/communicate with - if such a technology/project exists.

I fear that implementing your proposal would give us a great solution on paper, but no real life implementations, other than Fusion, which we could achieve with a plugin and a lot less effort/risk.

sbender9 commented 7 years ago

@sumps I already have node server working with Fusion and using the latest proposed spec: https://github.com/sbender9/n2k-signalk/tree/fusion-stereo-updates

jboynes commented 7 years ago

@timmathews in some ways it can teach us what not to do :) Kodi struggles in whole house configurations so is perhaps not a good model for whole vessel configuration. Maybe some of the stream processing frameworks might offer insight - e.g. the Windows sound framework, PulseAudio or JACK on Linux? Any of the network audio protocols e.g. EtherSound?

There is a model for content at rest (i.e. sitting in a library), including all the metadata needed to select content to play. A simple CD player can be viewed as a library with zero-or-one collections of content depending on whether there's a disc present; a CD changer is then a library with zero-to-N collections.

That simple CD player has the ability to source one stream at a time. A computer with multiple outputs can source multiple ones; and with a multiplexed output like Ethernet, an arbitrary number of streams. "Spotify" the service is able to source thousands of streams concurrently.

"Begin Playback" can be construed as a command that creates a stream from content, and other transport commands (like pause, rewind, skip) as ways to manipulate the point of playback in that stream.

At the other end is a sink that consumes stream data and converts to its final form e.g. sound from a speaker or an image on a monitor.

Connecting source and sink is a series of links that route and transform the stream content, including channel splitting from multiplex streams, modifying per-channel amplitude for volume, balance and fade, frequency curves for equalization and crossover, and timing for room correction. Basically DSP (or even good old ASP) on the stream content.

I realize that's not generally compatible with the limited control mechanisms provided by traditional home AV devices. It is starting to become available with more specialist hardware and software that is trickling into the consumer market. Having dissed Kodi at the start for the whole-house setup, this is how it is implemented internally in how it sets up the playback pipeline from source to sink (it's just limited to a single source and single sink).
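The source/link/sink pipeline described above could be sketched roughly like this (all class and field names are illustrative, not proposed Signal K keys):

```python
from dataclasses import dataclass

# Illustrative data model for the source -> links -> sink pipeline described
# above. None of these names are proposed spec keys.

@dataclass
class Source:
    path: str          # e.g. an input such as a CD tray

@dataclass
class Link:
    transform: str     # volume, EQ, channel split, room correction, ...

@dataclass
class Sink:
    path: str          # e.g. a speaker zone

@dataclass
class Pipeline:
    source: Source
    links: list
    sink: Sink

p = Pipeline(
    source=Source("entertainment.device.cd0.input.tray0"),
    links=[Link("volume -18.3 dB"), Link("split 5.1 channels")],
    sink=Sink("entertainment.device.receiver0.output.zone1"),
)
assert p.links[0].transform.startswith("volume")
```

Modeling the links explicitly is what makes multiplexed routing (one source feeding many sinks through different processing chains) expressible.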

timmathews commented 7 years ago

@sumps I'm good with this remaining experimental and strictly available via a plugin for a while. I agree that it could use some more (non-Fusion) real-world experience before being merged and with a stated goal of a 1.0 release before the end of the year, it's unlikely that this will make the cutoff.

@jboynes a chain of sources and sinks is essentially what I'm trying to model here. Some devices are strictly sources, some are strictly sinks, but many are both. And you're right that with streaming content over Ethernet things get more complicated. Consider IP paging systems where each speaker is a VoIP device and can be a member of one or more paging zones (or addressed directly). The IP paging system my previous employer installed for DARPA a few years ago had something like 1200 speakers and well over 100 paging station inputs. And when it wasn't being used for paging, it was generating pink noise to prevent eavesdropping (sound masking). I can imagine something like this (though certainly not as large) being installed on a commercial vessel.

My initial approach had a separate section for describing these paths. Perhaps it is worth reevaluating that idea. I also wonder (somewhat OT) if RFCs should have their own repo so that we can track changes and comment on individual parts.

sbender9 commented 7 years ago

@timmathews what do you think about standardizing the volume and equalizer values? (see comments above by me and @joabakk )

timmathews commented 7 years ago

I would be OK with 0-100 for volume and +/-100 for equalizer settings, keeping in line with what NMEA does. It's the most generic solution I can think of. However, it won't be easy. Nearly every device would need special handling to convert from its scale to the Signal K scale. We can do unit conversions easily precisely because the ratios between units are well defined and the existing protocols specify the units they use. This doesn't hold for audio devices.

My Onkyo receiver also uses a scale of 0-100 for its serial protocol, but displays in dBm. Yamaha uses a range of -80.5 to +16.5 in steps of 0.5 for its control protocol (-6 to +6 for equalizer settings). And I believe Sony uses a slightly different scale.
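A sketch of the kind of per-device adapter this implies, using the Yamaha range quoted above (the step quantisation is an assumption about how a real adapter might behave):

```python
# Illustrative adapter: map a device's native volume range onto the
# proposed 0-100 Signal K scale and back, quantising to the device's step.

def device_to_sk(value, dev_min, dev_max):
    """Map a device's native volume onto the proposed 0-100 scale."""
    return (value - dev_min) / (dev_max - dev_min) * 100

def sk_to_device(volume, dev_min, dev_max, step=None):
    """Map a 0-100 volume back to device units, rounding to its step."""
    raw = dev_min + volume / 100 * (dev_max - dev_min)
    return round(raw / step) * step if step else raw

# Yamaha: -80.5 to +16.5 in steps of 0.5 (figures from the comment above)
assert device_to_sk(-80.5, -80.5, 16.5) == 0
assert device_to_sk(16.5, -80.5, 16.5) == 100
assert sk_to_device(50, -80.5, 16.5, step=0.5) == -32.0
```

The hard part, as noted, is that each device needs its own (dev_min, dev_max, step) triple; there is no protocol-level way to discover them.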

jboynes commented 7 years ago

@timmathews I'm planning to replace the setup I inherited (car stereo & a TV) with a home-brew library-based system. I would be interested in making it SK controlled if you want a guinea pig for a more bespoke system.

sbender9 commented 7 years ago

I'm with you @timmathews, but I'd rather have the special handling on the server side instead of having every app that implements display/control via Signal K have to have special handling.

joabakk commented 7 years ago

On volume: any knob is, or mimics, a logarithmic potentiometer, so conversion to dB should be straightforward.
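For instance, treating the published value as a linear amplitude ratio, the dB figure a display might show is (a sketch, not a spec proposal):

```python
import math

# Sketch: convert a 0-1 amplitude ratio to attenuation in dB relative to
# full volume. 20*log10 is the standard amplitude-to-dB relation.

def ratio_to_db(ratio):
    if ratio <= 0:
        return float("-inf")  # silence: infinite attenuation
    return 20 * math.log10(ratio)

assert ratio_to_db(1.0) == 0.0               # full volume: no attenuation
assert round(ratio_to_db(0.5), 2) == -6.02   # half amplitude is about -6 dB
```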

pod909 commented 7 years ago

@tkurki can there be sources that don't refer to the transport?

timmathews commented 7 years ago

transport, track, tuner are all optional. Really depends on the device in question. A basic AM/FM radio might only use tuner and a boring CD player might only use transport. And something like Pandora might only use part of transport.