Points vs pixels in a bash script
- Subject: Points vs pixels in a bash script
- From: Gabriel Zachmann via Cocoa-dev <email@hidden>
- Date: Mon, 8 Jun 2020 23:43:22 +0200
I know this mailing list might not be the perfect fit for my question,
but maybe someone has an idea.
I have a problem converting points (I think) to pixels in a bash script.
I'd rather not write an extra C/Cocoa utility for that.
Using
system_profiler SPDisplaysDataType
I can retrieve the size of a Mac's display in pixels.
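For example, something like the following should extract the pixel dimensions
from its output (the field positions may differ between macOS versions, so
treat this parsing as a sketch):

system_profiler SPDisplaysDataType | awk '/Resolution:/ {print $2, $4}'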
However, the command
tell application "System Events" to get the size of every window of every
process
(which I execute from my bash script via osascript) apparently does NOT
return window sizes in pixels. I assume the values are in points, Apple's
resolution-independent coordinate units.
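Concretely, the invocation from the script looks roughly like this:

osascript -e 'tell application "System Events" to get the size of every window of every process'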
According to my experiments, on my MacBook Pro Retina, for instance, a
fullscreen app (e.g., Keynote presentation) has a window size of 1680 x 1050.
By contrast, system_profiler reports 2880 x 1800.
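Note that the ratio is the same in both dimensions: 2880 / 1680 = 1800 / 1050 ≈ 1.71, so a single conversion factor should suffice.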
So, the question is: how can I determine the screen size of a Mac from my bash
script in the same units as "System Events" uses?
Or, how can I determine the factor by which I have to convert pixels into those
other units?
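One idea I have not verified yet: Finder can supposedly report the desktop
bounds, and if those are in the same point units as "System Events", the
factor follows directly. A rough sketch, assuming a single display:

# third value of the returned "0, 0, w, h" is the desktop width in points
pt_w=$(osascript -e 'tell application "Finder" to get bounds of window of desktop' | awk -F', ' '{print $3}')
# first "Resolution:" line from system_profiler gives the width in pixels
px_w=$(system_profiler SPDisplaysDataType | awk '/Resolution:/ {print $2; exit}')
echo "scale factor: $(echo "scale=3; $px_w / $pt_w" | bc)"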
Many thanks in advance for any insights or hints.
Best regards, Gabriel