It is working on Linux. The "older" branch at least has a front end based on esplayer that the author of apsdk-public removed (along with fairplay) in the current version. ("older" reverted the esplayer front end deletion)
git clone https://github.com/FDH2/apsdk-public
cd apsdk-public
git checkout older
cmake . -DBUILD_APS_DEMO=ON
make
This builds (for me) the aps-demo executable in subdirectory demo. (I just checked.)
To get this to build I had to track down and install all the prerequisites for esplayer etc. Unfortunately I didn't keep a record of them; the list below may be useful. SDL2 is "Simple DirectMedia Layer", https://www.libsdl.org/
//Dependencies for the target
esplayer_LIB_DEPENDS:STATIC=general;SDL2;general;avformat;general;avcodec;general;avutil;general;swresample;general;swscale;general;fdk-aac;
Running cmake until it succeeds will help you incrementally find what you need to install. When it finally builds, you need to shut down any firewall on the host server to run it on your network.
My conclusions:
this "apsdk" codebase is not related to that which UxPlay derives from (apart from use of fairplay(playfair), which was removed from the upstream current apsdk-public, but replaced here)
It seems to contain handlers for AirPlay video streaming (as opposed to mirroring), maybe using an older AirPlay protocol. This is something UxPlay does not do.
unfortunately, the demo doesn't include code to actually handle video streaming, although it does handle mirroring. (Maybe there is something in older code history which was stripped out? I didn't look that hard for it.) When I stream a YouTube video it shows "FCUP_*" HTTP messages, but these finish with "unhandledURLResponse" because the code does not provide a handler for video streaming.
There is an Android app released by the apsdk author (who seems to be from Singapore), presumably using apsdk-public.
https://play.google.com/store/apps/details?id=com.sheentech.airdisplay
I meant to get it to see what it could do but didn't get round to it.
EDIT: I just installed the AirDisplay app on an Android phone, connected to my local network, but (disappointingly) it does not seem to successfully register with DNS_SD/Bonjour, so my iOS devices don't see it. (It does eventually indicate that it has found the local network, so this is not a simple network issue.)
Thank you for the very detailed response! I just got a chance to test out the demo. (On Ubuntu the instructions you provided are excellent; you may need to add #include <cstdint> to ap_aes.cpp in order to build.)
I am noticing much the same as you observed with YouTube (and an encrypted m3u8 file??), and am currently looking into this; hopefully it's not a DRM issue.
EDIT: It seems like a lot of the code tied to the m3u8 files is located in ap_casting_media_data_store.cpp, if someone more knowledgeable wants to investigate. I'm currently looking into where this code is referenced, but it includes a lot of references to sending "FCUP" requests and (for some reason) specific code to convert YouTube's MLHLS format into HTTP.
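I don't know exactly how that conversion is implemented, but as a rough sketch of the idea (the mlhls://localhost/ prefix and the helper name below are assumptions, not apsdk's actual code), rewriting a playlist URI so an ordinary HTTP player can fetch it from a local server might look like:
#include <string>

// Hypothetical sketch: rewrite an "mlhls://localhost/..." playlist URI so that a
// regular HTTP client can request it from a local proxy instead.
static std::string mlhls_to_http(std::string uri, int local_port) {
    const std::string from = "mlhls://localhost/";
    const std::string to = "http://127.0.0.1:" + std::to_string(local_port) + "/";
    if (uri.compare(0, from.size(), from) == 0)
        uri.replace(0, from.size(), to);
    return uri;
}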
"FCUP" requests appear to be empty? I'm not currently sure why, but this statement does not run
Upon further investigation, it appears that the code loops through sending FCUP requests until it is not empty, explaining the if statement's function, after which it appears to send something causing the client to request to start the video How would one gain access to the local m3u8 files? Maybe these could be used to "select" a stream to play?
I haven't heard of an encrypted m3u8 file, can you share? The YouTube m3u8 files I saw were in plaintext but not a standard format. I suspect you need to do some monkey business to make it work in, for example, VLC: host a localhost server that proxies certain requests to the Google servers, and maybe rewrite some of the URLs too.
This is predicted based on the localhost URLs I observed...
GitHub isn't allowing the file to upload, so a screenshot will have to suffice... My current theory is that this is locked behind some kind of TLS encryption that has to use the library somehow? This is because the master m3u8 contains a regular m3u8 that leads to a YouTube API link to the VTT subtitles without any trickery, but that still leaves the question of why the video needs that instead...
I don't understand what is needed for AirPlay video streaming, but perhaps this helps: https://air-display.github.io/airplay-internal/media_cast_service.html
On another note, it's clear that Airplay-Server1 (which @thiccaxe found to start this thread last year, and which has now disappeared from GitHub) is from the same codebase as apsdk-public.
EDIT, from looking at CMakeLists.txt, Airplay-Server1 was a fork of apsdk-public code from around January 2019. apsdk-public was then indeed called Airplay Server. The apsdk author (tishion) has been working on it since 2018, (though the initial 2018 commit may itself come from some earlier codebase, there are no indications that it does).
I suspect there is something else going on, there is also this airplay server 1 repo that is unrelated https://github.com/KqSMea8/AirplayServer
And this one https://github.com/roy2651/AirplayServer-1
This seems to be related to the m3u8 files https://github.com/xbmc/xbmc/pull/744
AirplayServer (KqSMea8) is our historical codebase (droidfang aka dsafa22, derived from shairplay) which became RPiPlay, or a fork of it
AirplayServer-1 (roy2651) is a fork of the tishion aps (now apsdk-public) codebase
I'm getting stuck with segfaults when trying to fetch the m3u8 files with cURL (interestingly, Firefox succeeds but the program internally doesn't? Probably a dumb mistake). (EDIT: make sure you have a libcurl dev package installed to build this; the code below comes from StackOverflow for temporary debugging purposes, and if rewritten properly it would use direct access.)
/* needs <curl/curl.h>, <cstdio>, <cstring> */

/* libcurl write callback: must be a static member or free function, since
   CURLOPT_WRITEFUNCTION takes a plain C function pointer (passing a
   non-static member here is a likely cause of the segfaults) */
static size_t writefunction(void *ptr, size_t size, size_t nmemb, void *stream)
{
    return fwrite(ptr, size, nmemb, (FILE *)stream);
}

virtual void on_video_play(const uint64_t session_id, const std::string &location, const float start_pos) override {
    LOGI() << "on_video_play: " << location << ", session: " << session_id << std::endl;
    session_ = session_id;
    CURL *ch;
    CURLcode rv;
    char errbuf[CURL_ERROR_SIZE] = {0};
    rv = curl_global_init(CURL_GLOBAL_ALL);
    ch = curl_easy_init();
    rv = curl_easy_setopt(ch, CURLOPT_URL, location.c_str());
    LOGI() << location << std::endl;
    /* provide a buffer to store errors in */
    curl_easy_setopt(ch, CURLOPT_ERRORBUFFER, errbuf);
    rv = curl_easy_setopt(ch, CURLOPT_WRITEFUNCTION, writefunction);
    //rv = curl_easy_setopt(ch, CURLOPT_WRITEDATA, stdout);
    /* dump the fetched playlist to a file for inspection (check for NULL in real code) */
    FILE *fp = fopen("/home/administrator/bruh.m3u8", "wb");
    rv = curl_easy_setopt(ch, CURLOPT_WRITEDATA, fp);
    rv = curl_easy_setopt(ch, CURLOPT_SSL_VERIFYPEER, 1L);
    //rv = curl_easy_setopt(ch, CURLOPT_CAPATH, "/home/administrator/");
    rv = curl_easy_perform(ch);
    printf("curl easy perform done..\r\n");
    if (rv == CURLE_OK) {
        printf("*** transfer succeeded ***\n");
    } else {
        printf("*** transfer failed ***\n");
        /* if the request did not complete correctly, show the error
           information; if no detailed error information was written to
           errbuf, show the more generic curl_easy_strerror() message instead */
        size_t len = strlen(errbuf);
        fprintf(stderr, "\nlibcurl: (%d) ", (int) rv);
        if (len)
            fprintf(stderr, "%s%s", errbuf, (errbuf[len - 1] != '\n') ? "\n" : "");
        else
            fprintf(stderr, "%s\n", curl_easy_strerror(rv));
    }
    if (fp) fclose(fp);
    curl_easy_cleanup(ch);
}
Additionally, if you want to automate this process more, I found some interesting util methods that may help:
get_best_quality_stream_uri: uses the same method as Plex to get the best-quality stream
get_youtube_url: parses an m3u8 file (probably the one from the best-quality stream) and gets the YouTube m3u8, which for now I have found to be garbled/encrypted
Right now the process seems to be…
yt-dlp may be helpful later on?
I found a better way to do this, but I'm not sure how to implement it: inside ap_media_casting_data_store.cpp there is a method that should allow direct access to the m3u8 files wherever they are stored on the server. I assume the library code needs to somehow be edited to accommodate direct access?
I FOUND SOMETHING: the YouTube app has been the one giving weird m3u8 files; however, when streaming from the website in Safari (and I don't know why) and AirPlaying, the link provided to the AirPlay client is in fact a valid mp4 link. However, the client segfaults when creating the audio stream for some reason. I have no idea what to do with this information, but I hope it helps someone.
It is fundamentally different afaik; the AirPlay feature in Safari will simply send the same HLS/mp4 URL that it is using to stream the video in the first place. From the YouTube app it is a different matter altogether.
I was able to get the m3u8 files to work from the callback method; it appears that everything (seems) to be set in terms of protocol. The on_video_play callback in the demo outputs a location, which is either an mp4 or an m3u8; however, it appears that the localhost m3u8 files can simply be played in ffplay, meaning that it is creating valid output without any additional encryption/parsing.
I'm still researching how ffmpeg converts the YouTube URLs into valid files for playback; hopefully this works with other apps as well.
Thanks for your research on this!
It would be great if UxPlay could be extended to support video streaming
I think I better understand how FFmpeg is compiling the videos: in the Mediadata.m3u8 files there are segments of mp4s that need to be stitched together in order to play a valid file (which is why none of them appeared to play); however, a rendering tool could stream these parts (not entirely sure how to do this with UxPlay).
Full findings of the casting process: the content can be played in ffplay, or directly streamed by understanding the codec and properly decoding sequentially while live-downloading. Hopefully this is useful in implementing casting!
What is missing in APSDK that is needed to make it cast? "older" has a renderer that works in mirror mode; why does it not work in casting mode? Could it be that casting-mode stuff was removed prior to "older" and could be found in the earlier merge history?
Although there is a chance it could be in the commit history, I think it's easier to build our own implementation using ffplay for a mockup (simply passing the argument provided in the on_video_play callback to ffplay, which then creates a video stream; a sketch follows below). Other parts of the demo contain callbacks to receive information about scrubbing, playback rate, etc., so that wouldn't be missing.
The only thing I haven't found out how to do (although I think it's in the SDK) is tell the client that the video is playing. If you want, I can make a quick mockup using ffplay to demonstrate that it works.
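A minimal sketch of such a mockup, assuming a POSIX system with ffplay installed (the helper name and the way it would be wired into the demo's on_video_play callback are mine, not apsdk's):
#include <string>
#include <unistd.h>

// Hypothetical helper: launch ffplay on the URL received in on_video_play.
// The parent keeps the pid so the player could be killed on a later stop request.
static pid_t play_with_ffplay(const std::string &location) {
    pid_t pid = fork();
    if (pid == 0) {
        // Child process: replace it with ffplay; -autoexit quits when playback ends.
        execlp("ffplay", "ffplay", "-autoexit", location.c_str(), (char *)nullptr);
        _exit(127); // only reached if execlp failed
    }
    return pid;
}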
So you think maybe the issue is just a missing renderer for whatever codec the cast uses?
A working model would be invaluable.
Should I .gitignore all the .h and .o output files?
Oops I didn't clone the older branch
Very rudimentary prototype: https://github.com/GenghisKhanDrip/apsdk-public/tree/older
EDIT: Make sure to install ffmpeg for this demo to work.
How it works: takes the YouTube stream from the app and puts it into ffplay to show proof of concept.
Limitations: cannot perform scrub operations, and does not allow you to tell the client that the video is playing (don't know how to do this based on the information in the protocol; has it been discovered?), i.e. the client doesn't know the video is playing.
Other limitations: the app segfaults randomly (don't know how to fix this) and will segfault when using AirPlay from YouTube in Safari due to the ALAC stream.
When casting is ported into UxPlay (hopefully), the only issue that matters is how to inform the client that the video is playing. What request needs to be made? I can try to implement it.
I think it is documented in one of the links @fduncanh or I sent earlier, either that or do a MITM proxy.
diff --git a/src/ap_config.cpp b/src/ap_config.cpp
index fb2ab15..7cc6f43 100644
--- a/src/ap_config.cpp
+++ b/src/ap_config.cpp
@@ -60,7 +60,7 @@ ap_config_ptr ap_config::default_instance() {
s_instance->flags_ = "0x04";
s_instance->macAddress_ = mac_address;
s_instance->vv_ = 2;
- s_instance->features_ = 0x527FFFF7; // 0x0E5A7FFFF7 with pv // 0x0E527FFFF7 w/o pv;
+ s_instance->features_ = 0x0E5A7FFFF7; //0x0E5A7FFFF7 with pv // 0x0E527FFFF7 w/o pv;^M
s_instance->statusFlag_ = 68;
s_instance->audioCodecs_ = "0,1,2,3";
s_instance->encryptionTypes_ = "0,3,5";
seems to only work for ads on youtube for me though! :)
I updated the prototype with dummy playback information; apparently the client is supposed to provide that and not the server. This is in the on_acquire_playback_info method.
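For what it's worth, if one did want to build such a dummy playback-info plist on the server side, a rough sketch using libplist could look like the following; the key names (duration, position, rate, readyToPlay, playbackLikelyToKeepUp) come from unofficial AirPlay protocol notes rather than from apsdk, so treat them as assumptions:
#include <plist/plist.h>

// Hypothetical sketch: build a dummy /playback-info body as a binary plist.
// In a real server these values would come from the actual player state.
static void make_dummy_playback_info(char **out, uint32_t *out_len,
                                     double duration, double position) {
    plist_t dict = plist_new_dict();
    plist_dict_set_item(dict, "duration", plist_new_real(duration));
    plist_dict_set_item(dict, "position", plist_new_real(position));
    plist_dict_set_item(dict, "rate", plist_new_real(1.0));        // 1.0 = playing
    plist_dict_set_item(dict, "readyToPlay", plist_new_bool(1));
    plist_dict_set_item(dict, "playbackLikelyToKeepUp", plist_new_bool(1));
    plist_to_bin(dict, out, out_len);   // clients expect a binary plist body
    plist_free(dict);
}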
@thiccaxe I haven't gotten ads on the videos that I've tested; could you please send me the mediadata.m3u8 file you are using so I can see if the tags are different? (Make sure to remove your IP!) Alternatively, the console logs the location of the content that your AirPlay client is sending; you can put that into ffplay yourself and see if it works. My test case is the YouTube app signed out, so if that fixes your problem you can try that?
Question: are we using the apsdk implementation and adding UxPlay features into it, or adding HTTP support into UxPlay and then adding casting to the main app? I'm wondering because it seems that we have verified that the protocol is working; however, apsdk appears to be less stable than UxPlay as-is.
I think it was determined that apsdk-public is questionably licensed, so probably not.
I think one needs to get a working apsdk demo that does at least YouTube video streaming to see what infrastructure is needed, and then create similar features in UxPlay. For example, there are HTTP handlers needed (that is easy; see the list below and the sketch after it), but I don't know what other infrastructure is needed:
{"HTTP", "GET", "/server-info", RH(get_server_info_handler)},
{"HTTP", "POST", "/fp-setup", RH(post_fp_setup_handler)},
{"HTTP", "POST", "/fp-setup2", RH(post_fp_setup2_handler)},
{"HTTP", "POST", "/reverse", RH(post_reverse_handler)},
{"HTTP", "POST", "/play", RH(post_play_handler)},
{"HTTP", "POST", "/scrub", RH(post_scrub_handler)},
{"HTTP", "POST", "/rate", RH(post_rate_handler)},
{"HTTP", "POST", "/stop", RH(post_stop_handler)},
{"HTTP", "POST", "/action", RH(post_action_handler)},
{"HTTP", "GET", "/playback-info", RH(get_playback_info_handler)},
{"HTTP", "PUT", "/setProperty", RH(put_setProperty_handler)},
{"HTTP", "POST", "/getProperty", RH(post_getProperty_handler)},
I am having some issues getting it to work today, getting this error:
*** WARNING *** The program 'aps-demo' uses the Apple Bonjour compatibility layer of Avahi.
*** WARNING *** Please fix your application to use the native API of Avahi!
*** WARNING *** For more information see <http://0pointer.de/blog/projects/avahi-compat.html>
AP Server is starting....
[DEBUG]ap_airplay_connection (0x12ab160) is being created
[DEBUG]Session (0x12ab160) is waiting
[DEBUG]AP service running on 39135
[DEBUG]Session (0x12aeee0) is waiting
[DEBUG]Media service running on 33287
AP Server started....
[DEBUG]Session (0x12ab160) accepted and started
[DEBUG]ap_airplay_connection (0x7f740c0011b0) is being created
[DEBUG]Session (0x7f740c0011b0) is waiting
[DEBUG][19575136]<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<
Request: GET /info RTSP/1.0
Header:
Active-Remote: 1783092797
CSeq: 0
Content-Length: 70
Content-Type: application/x-apple-binary-plist
DACP-ID: 3104E20A2578534F
User-Agent: AirPlay/755.3.1
X-Apple-ProtocolVersion: 1
Body:bplist00...Yqualifier..ZtxtAirPlay..................................."
[DEBUG][19575136]<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<
Request: POST /pair-setup RTSP/1.0
Header:
Active-Remote: 1783092797
CSeq: 1
Content-Length: 32
Content-Type: application/octet-stream
DACP-ID: 3104E20A2578534F
User-Agent: AirPlay/755.3.1
Body:|(....\..|x.w...K..G?.....1lDJrX
[DEBUG][19575136]<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<
Request: POST /pair-verify RTSP/1.0
Header:
Active-Remote: 1783092797
CSeq: 2
Content-Length: 68
Content-Type: application/octet-stream
DACP-ID: 3104E20A2578534F
User-Agent: AirPlay/755.3.1
X-Apple-AbsoluteTime: 732588286
X-Apple-PD: 1
Body:.........<}"...'...=.Mn.{_..........|(....\..|x.w...K..G?.....1lDJrX
[DEBUG][19575136]<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<
Request: POST /pair-verify RTSP/1.0
Header:
Active-Remote: 1783092797
CSeq: 3
Content-Length: 68
Content-Type: application/octet-stream
DACP-ID: 3104E20A2578534F
User-Agent: AirPlay/755.3.1
X-Apple-AbsoluteTime: 732588286
X-Apple-PD: 1
Body:.......k..@..^.>..<V.........N.I....N.uk.X.VX>.B.W&o..T..{(..-.3[.B.
[DEBUG][19575136]<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<
Request: POST /fp-setup RTSP/1.0
Header:
Active-Remote: 1783092797
CSeq: 4
Content-Length: 16
Content-Type: application/octet-stream
DACP-ID: 3104E20A2578534F
User-Agent: AirPlay/755.3.1
X-Apple-ET: 32
Body:FPLY............
[DEBUG][19575136]<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<
Request: POST /fp-setup RTSP/1.0
Header:
Active-Remote: 1783092797
CSeq: 5
Content-Length: 164
Content-Type: application/octet-stream
DACP-ID: 3104E20A2578534F
User-Agent: AirPlay/755.3.1
X-Apple-ET: 32
Body:FPLY..............Et..&c...vC-..6fW..".kb....J5[.Pk.~....cei!v].&..........)....}........3...I.3.@~.jw..*q..........@.[.......2.B.4....W97](.....}.^.Yx..7.)aTIe....
[DEBUG][19575136]<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<
Request: SETUP rtsp://fe80::8bf8:c237:4aaf:5b41/7452394713835457582 RTSP/1.0
Header:
Active-Remote: 1783092797
CSeq: 6
Content-Length: 615
Content-Type: application/x-apple-binary-plist
DACP-ID: 3104E20A2578534F
User-Agent: AirPlay/755.3.1
Body:bplist00.................................. Ret_..statsCollectionEnabledSeiv[sessionUUID^timingProtocolVosName^osBuildVersion]sourceVersionZtimingPortYosVersionTekey_..sessionCorrelationUUIDXdeviceIDUmodelTnameZmacAddress. .O..>.uU.c..Nj.....n_.$676C3981-FC29-4C2E-B438-999A45DFE283SNTPYiPhone OSU21D50W755.3.1...T17.3O.HFPLY.......<....Pf....8_..Dr...M....DP..$........J...j......C*A..r.&..}._.$E0E62A6E-01CF-44C3-BEA1-3811ACFB5596_..84:AB:1A:86:1A:A6ZiPhone12,8o...T.h.a.t. .w.o.n ..t. .w.o.r.k.._..BA:93:A0:DD:1B:58...+...G.K.W.f.m.|.............................'.-.5.8.=.........................!................
###################on_mirror_session_begin: 6435729900021920096
[DEBUG][19575136]<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<
Request: GET /info RTSP/1.0
Header:
Active-Remote: 1783092797
CSeq: 7
DACP-ID: 3104E20A2578534F
User-Agent: AirPlay/755.3.1
X-Apple-ProtocolVersion: 1
Body:<EMPTY>
[DEBUG][19575136]<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<
Request: GET_PARAMETER rtsp://fe80::8bf8:c237:4aaf:5b41/7452394713835457582 RTSP/1.0
Header:
Active-Remote: 1783092797
CSeq: 8
Content-Length: 8
Content-Type: text/parameters
DACP-ID: 3104E20A2578534F
User-Agent: AirPlay/755.3.1
Body:volume..
[DEBUG][19575136]<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<
Request: RECORD rtsp://fe80::8bf8:c237:4aaf:5b41/7452394713835457582 RTSP/1.0
Header:
Active-Remote: 1783092797
CSeq: 9
DACP-ID: 3104E20A2578534F
User-Agent: AirPlay/755.3.1
Body:<EMPTY>
[DEBUG]Timing query packet sent successfully
[DEBUG]Timing reply packet received successfully
[DEBUG][19575136]<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<
Request: SETUP rtsp://fe80::8bf8:c237:4aaf:5b41/7452394713835457582 RTSP/1.0
Header:
Active-Remote: 1783092797
CSeq: 10
Content-Length: 243
Content-Type: application/x-apple-binary-plist
DACP-ID: 3104E20A2578534F
User-Agent: AirPlay/755.3.1
Body:bplist00...Wstreams.........................ZlatencyMaxRsrYaudioModeZlatencyMinRctSspf[controlPort_..supportsDynamicStreamID[audioFormatWisMediaTtype...X...DWdefault.+.....`...........`....,7:DORVb|.............................................
[DEBUG]ap_audio_stream_service (0x7f740c038a00) is being created
[INFO]mirroring service audio data port: 54787, control port: 56293
on_audio_stream_started: 1
[INFO]audio CONTROL SYNC packet
on_audio_stream_data: 32, timestamp: 1664714600
[1] 28887 segmentation fault (core dumped) ./demo/aps-demo
I will try to debug. @GenghisKhanDrip, I saw the above error with regular videos and the usual Google URLs with ads.
What iOS version(s) are you trying with?
I got @GenghisKhanDrip's demo working! Great work! First time I just got the advertisement, then with a different video it worked. No coredumps for me.
What iOS version(s) are you trying with?
iOS 17.4 (or other latest)
@thiccaxe I noticed that the program segfaults on the first run; try running again and it should work. Ensure that you're using the YouTube app. This was an issue in the original app that I haven't debugged, since we are porting anyway. Glad to hear that there has been some success though!
I was doing some debugging in the UxPlay source code and noticed that, even in debug mode, the server does not report any HTTP connection attempts from the client. How do I get it to report said attempts (even if they do nothing)? Could it be that IPv6 support is turned off but APSDK uses IPv6?
I would always have wireshark running
I also see that UxPlay does not receive/record HTTP (as opposed to RTSP) requests. Right now UxPlay uses a different port for AirPlay and RAOP; the apsdk guides https://air-display.github.io/airplay-internal/overall_architecture.html and https://air-display.github.io/airplay-internal/media_cast_service.html say to use the same port. So in uxplay.cpp
if (tcp[2]) {
airplay_port = tcp[2];
} else {
/* is there a problem if this coincides with a randomly-selected tcp raop_mirror_data port?
* probably not, as the airplay port is only used for initial client contact */
airplay_port = (raop_port != HIGHEST_PORT ? raop_port + 1 : raop_port - 1);
}
This should be changed to airplay_port = raop_port (see the snippet below).
This doesn't yet fix things so that HTTP (as opposed to RAOP) requests are received, but it is probably a start.
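That is, the else branch of the snippet above would become something like this (the comment is mine; whether UxPlay's connection handler then actually parses the HTTP requests is a separate question):
if (tcp[2]) {
    airplay_port = tcp[2];
} else {
    /* share the RAOP port so AirPlay HTTP requests (e.g. GET /server-info,
       POST /play) arrive on the same port, as the apsdk guide suggests */
    airplay_port = raop_port;
}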
I made the change listed, as well as changing the feature set in dnssdint.h to 0x527FFEE7; however, I noticed that the pairing never seems to complete. (wrong) Perhaps the issue is that the other implementation does not use legacy pairing? What are the differences between standard pairing and legacy pairing, seeing as the AirPlay server is using standard pairing and appears to show all the same RTSP requests during the pairing phase?
One will probably need to add lots of extra debug output in apsdk to trace what it is doing, and make a UxPlay test version duplicate it as much as possible.
Might want to create a reference implementation first, outside of the UxPlay environment, maybe in Python or something, for easier debugging.
OK, I can confirm on my end that the playing is working. I will try to build up a reference implementation, most likely in Python.
you can switch off legacy pairing by commenting out this line in uxplay.cpp (line 1333)
}
/* bit 27 of Features determines whether the AirPlay2 client-pairing protocol will be used (1) or not (0) */
dnssd_set_airplay_features(dnssd, 27, (int) setup_legacy_pairing);
return 0;
}
or make it
dnssd_set_airplay_features(dnssd, 27, 0);
I’m still getting a mirrored connection… Are there any other things that can be tried? It’s hard to identify what UxPlay isn’t (or is) doing in the pairing process
Scroll up in the thread; @fduncanh made a comment about a slight cryptographic difference between UxPlay and AirplayServer-1.
When using AirplayServer-1 and UxPlay for screen mirroring, fairplay uses mode 1; apsdk-public uses mode 2. Is there any importance to mode 2? Because the code from UxPlay to handle fairplay is not working.
@thiccaxe is this the comment that you are referring to?
@GenghisKhanDrip https://github.com/FDH2/UxPlay/issues/134#issuecomment-1264677959
If "uses legacy pairing" (bit 27) is on. pair setup and pair verify are done before fp_setup (fair play)., If it is off, pair setup is skipped, and just not done, in mirror mode. As we found in UxPlay, pair_setup in not needed for mirror mode, and we tuned it off in UxPlay-1.65 . It is needed for pin-pairing so it was turned on again in uxplay-1.67 if pin-authentication is requested. apsk-public has bit 27 turned off, and doesn't use it for mirror mode (I will try to see what turning on bit 27 does to apsdk).
When one starts apsdk from the YouTube app, it does a pair-setup and pair-verify with RTSP, then an HTTP GET /server-info. It also seems to use the session ID, which is NOT implemented in UxPlay.
[DEBUG]ap_airplay_connection (0x7f331825fa70) is being created
[DEBUG]Session (0x7f331825fa70) is waiting
[DEBUG][139857422717344]<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<
Request: POST /pair-setup RTSP/1.0
Header:
Content-Length: 32
Content-Type: application/octet-stream
User-Agent: AirPlay/760.20.1
X-Apple-Device-ID: 0x608b0e27c4d6
X-Apple-Session-ID: a7c14871-b100-4fd6-b151-ca1b1b45e93b
Body:....06E.>R...P.)@4.k..I..H.a....
[DEBUG][139857422717344]<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<
Request: POST /pair-verify RTSP/1.0
Header:
Content-Length: 68
Content-Type: application/octet-stream
User-Agent: AirPlay/760.20.1
X-Apple-AbsoluteTime: 732860813
X-Apple-Device-ID: 0x608b0e27c4d6
X-Apple-PD: 1
X-Apple-Session-ID: a7c14871-b100-4fd6-b151-ca1b1b45e93b
Body:..........R....Z.U..j.n..}...r.l...T....06E.>R...P.)@4.k..I..H.a....
[DEBUG][139857422717344]<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<
Request: POST /pair-verify RTSP/1.0
Header:
Content-Length: 68
Content-Type: application/octet-stream
User-Agent: AirPlay/760.20.1
X-Apple-AbsoluteTime: 732860813
X-Apple-Device-ID: 0x608b0e27c4d6
X-Apple-PD: 1
X-Apple-Session-ID: a7c14871-b100-4fd6-b151-ca1b1b45e93b
Body:....A.....+J.%....X-.tL.'...X........g."..T@.!.....IX)SC........s...
[DEBUG][139857422717344]<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<
Request: GET /server-info HTTP/1.1
Header:
Content-Length: 0
User-Agent: AirPlay/760.20.1
X-Apple-Client-Name: F D’s iPad
X-Apple-Device-ID: 0x608b0e27c4d6
X-Apple-Session-ID: a7c14871-b100-4fd6-b151-ca1b1b45e93b
X-Apple-VV: 16777984
Body:<EMPTY>
+++++++++++++++++++on_video_session_begin: 7606107702585987488
[DEBUG]Session (0x7f331825fa70) accepted and started
[DEBUG]ap_airplay_connection (0x7f3318468370) is being created
[DEBUG]Session (0x7f3318468370) is waiting
[DEBUG][139857425201776]<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<
Request: POST /pair-setup RTSP/1.0
Header:
Content-Length: 32
Content-Type: application/octet-stream
User-Agent: AirPlay/760.20.1
X-Apple-Device-ID: 0x608b0e27c4d6
X-Apple-Session-ID: a7c14871-b100-4fd6-b151-ca1b1b45e93b
Body:....06E.>R...P.)@4.k..I..H.a....
[DEBUG][139857425201776]<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<
Request: POST /pair-verify RTSP/1.0
Header:
Content-Length: 68
Content-Type: application/octet-stream
User-Agent: AirPlay/760.20.1
X-Apple-AbsoluteTime: 732860813
X-Apple-Device-ID: 0x608b0e27c4d6
X-Apple-PD: 1
X-Apple-Session-ID: a7c14871-b100-4fd6-b151-ca1b1b45e93b
Body:......U...>....../BW......&.&E&....!....06E.>R...P.)@4.k..I..H.a....
[DEBUG][139857425201776]<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<
Request: POST /pair-verify RTSP/1.0
Header:
Content-Length: 68
Content-Type: application/octet-stream
User-Agent: AirPlay/760.20.1
X-Apple-AbsoluteTime: 732860813
X-Apple-Device-ID: 0x608b0e27c4d6
X-Apple-PD: 1
X-Apple-Session-ID: a7c14871-b100-4fd6-b151-ca1b1b45e93b
Body:....7. ...=.#t.Ku#.....-7O..t.5......VrK[. ...-.......9...e...>..T.=
[DEBUG][139857425201776]<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<
Request: POST /reverse HTTP/1.1
Header:
Connection: Upgrade
Content-Length: 0
Upgrade: PTTH/1.0
User-Agent: AirPlay/760.20.1
X-Apple-Client-Name: F D’s iPad
X-Apple-Device-ID: 0x608b0e27c4d6
X-Apple-Purpose: event
X-Apple-Session-ID: a7c14871-b100-4fd6-b151-ca1b1b45e93b
Body:<EMPTY>
[DEBUG]Reverse purpose: event
@GenghisKhanDrip can you do a pull request so I can merge your fixes into older?
@fduncanh it should be created!
thanks @GenghisKhanDrip merged
I was trying to compare the JSONs output by the initial RTSP GET /info request on both APSDK and UxPlay; however, converting the plist to JSON appears to be impossible (plist_to_json called in raop_handler returns -2), while APSDK can produce a proper JSON file. Could the plist simply be invalid? I don't know why else I can't get it to print.
UxPlay uses libplist to handle plists. libplist supports JSON, I believe, but I don't think JSON is used for communicating with the client. uxplay -d will print the plists that UxPlay sends to the clients. It would be good to get apsdk to do it too (I guess that's what you are trying to do?).
https://github.com/libimobiledevice/libplist
from libplist
plist_to_xml(plist, &output, &length);
plist_to_json(plist, &output, &length, ((options & PLIST_OPT_COMPACT) == 0));
UxPlay is currently using plist_to_xml to print out plists when the -d option is used (in raop.c and raop_handlers.h); you could easily switch it to plist_to_json.
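A minimal sketch of that switch, printing a received binary plist as JSON instead of XML (the helper name is mine; the libplist calls are the ones listed above):
#include <plist/plist.h>
#include <cstdio>
#include <cstdlib>

// Hypothetical debug helper: parse a binary plist body and print it as JSON,
// as an alternative to the plist_to_xml() output used by uxplay -d.
static void dump_plist_as_json(const char *data, uint32_t len) {
    plist_t plist = NULL;
    plist_from_bin(data, len, &plist);
    if (!plist) return;               // not a valid binary plist
    char *json = NULL;
    uint32_t json_len = 0;
    plist_to_json(plist, &json, &json_len, 1 /* prettify */);
    if (json) {
        printf("%s\n", json);
        free(json);
    }
    plist_free(plist);
}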
It seems that this implementation of the AirPlay protocol (https://github.com/alexfansz/AirplayServer-1) has all of the AirPlay protocol implemented (and is compatible with Windows). It just needs a front end.
EDIT (2024/06/24): this code is gone now, but can be recognized as an early version of the code now released at https://github.com/air-display/apsdk-public
Just posting this here in case anyone from the community wants to add such a front end to this. It also has cleaned-up code and whatnot. I am currently testing this software and may ultimately create a frontend implementation (as it is only a library).