Open wxtrac opened 9 years ago
If wxGetDisplaySize() returns wrong (halved) values, this would seem to be a bug in wxOSX, so probably not Python-specific.
Would be nice to have a confirmation from someone with a retina Mac that the problem can also be seen in the C++ code (e.g. the display sample).
If I'm not mistaken, this happens because the sizes on OS X are returned in points, which on Retina devices translate to more than one actual pixel (usually 4: 2 vertically by 2 horizontally) - see the OS X guidelines for high-resolution devices: https://developer.apple.com/library/mac/documentation/GraphicsAnimation/Conceptual/HighResolutionOSX/Explained/Explained.html#//apple_ref/doc/uid/TP40012302-CH4-SW1
To obtain the real pixel count one has to use the convertSizeToBacking: Cocoa method (see https://developer.apple.com/library/mac/documentation/GraphicsAnimation/Conceptual/HighResolutionOSX/APIs/APIs.html#//apple_ref/doc/uid/TP40012302-CH5-SW2). But this is not exposed by wxWidgets AFAICS.
There is a public wxWindow::GetContentScaleFactor() which returns this scale factor. But the conversion should be done internally in wxOSX anyhow, so it could just use the Cocoa method directly.
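The point-to-pixel relationship described above can be sketched numerically. This is a minimal illustration, not wxWidgets code: the helper name is hypothetical, and the scale factor of 2.0 is what Cocoa's backingScaleFactor (and wxWidgets' wxWindow::GetContentScaleFactor()) report on a Retina display.

```python
def points_to_backing_pixels(width_pts, height_pts, scale_factor):
    """Convert a size in screen points to actual device pixels.

    scale_factor is 1.0 on standard displays and 2.0 on Retina
    displays; this mirrors what Cocoa's convertSizeToBacking:
    does internally.
    """
    return (int(width_pts * scale_factor), int(height_pts * scale_factor))

# The bug reported here: wxGetDisplaySize() returns the point size
# (1280, 800) on a Retina MacBook Pro instead of the real pixel count.
print(points_to_backing_pixels(1280, 800, 2.0))  # -> (2560, 1600)

# On a non-Retina display (scale factor 1.0) points and pixels agree,
# which is why the 2013 MacBook Air result looks correct.
print(points_to_backing_pixels(1440, 900, 1.0))  # -> (1440, 900)
```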
I don't have a retina machine to test this on, but I think the fix should be relatively trivial and I'd be glad to apply any patches (see HowToSubmitPatches) implementing it.
Issue migrated from trac ticket #16893
component: wxOSX | priority: normal | keywords: GetDisplaySize incorrect
2015-03-05 13:09:37: adm746 (dan) created the issue
I recently migrated from wxPython '2.8.10.1 (mac-unicode)' to '3.0.0.0 osx-cocoa (classic)'. The 2.8 system is on a 2013 MacBook Air. wx.GetDisplaySize() returns (1440, 900), which is correct, and the output image (using a modified wx.lib.plot) looks like this:
http://imgur.com/tMkYwPv
The 3.0 system is on a 2014 MacBook Pro. wx.GetDisplaySize() returns (1280, 800), which is incorrect (it should be (2560, 1600)), and the output image (using the same wx scripts) looks like this:
http://imgur.com/2eFcMHS
Both Python environments were Anaconda with Python 2.7.