Re: Points vs pixels in a bash script
- Subject: Re: Points vs pixels in a bash script
- From: Gabriel Zachmann via Cocoa-dev <email@hidden>
- Date: Tue, 9 Jun 2020 23:31:28 +0200
> you're not taking into account the current screen resolution (a.k.a. display
> mode). The user
> [...]
> necessarily have to build a C utility for it. You can invoke the Swift
> interpreter to execute
Thanks a lot for your hints, but unfortunately I don't have the time to learn
Swift or PyObjC.
(This is not my day job.)
Lacking any other quick fix for the moment,
I would even be happy with an imperfect solution.
When my MacBook is only connected to the external monitor (lid closed),
everything is fine:
system_profiler reports a screen resolution of 2560 x 1440,
and "System Events" reports a full-screen window with exactly the same
resolution.
Incidentally, the MacBook uses the Radeon for rendering (as reported by
system_profiler).
Now, when the MacBook uses only the built-in LCD display (no external monitor),
it does not work:
system_profiler reports a screen resolution of "2880 x 1800 Retina",
while "System Events" reports a full-screen window of 1680 x 1050.
I could hard-code the values for windows in full-screen mode,
but a slightly more generic solution would be nice.
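In case it helps anyone else on the list: below is a minimal Swift sketch
(untested on this exact setup, and the file name screens.swift is just an
example) that a bash script could hand to the Swift interpreter. It uses
AppKit's NSScreen to print each screen's size in points, which should match
what "System Events" reports, together with its backing scale factor:

#!/usr/bin/swift
import AppKit

// For every attached screen, print its size in points (the unit AppKit
// and "System Events" use), its backing scale factor, and the derived
// size in backing pixels.
for screen in NSScreen.screens {
    let points = screen.frame.size
    let scale = screen.backingScaleFactor
    print("points: \(Int(points.width)) x \(Int(points.height)), " +
          "scale: \(scale), " +
          "backing pixels: \(Int(points.width * scale)) x \(Int(points.height * scale))")
}

Saved as screens.swift, it can be run from a script with "swift
screens.swift". Note that for scaled Retina modes the backing pixel size
(points times scale) can still differ from the panel's native resolution
that system_profiler reports; e.g. 1680 x 1050 points at scale 2.0 gives
3360 x 2100 backing pixels on a 2880 x 1800 panel.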
Best, G.