Re: Points vs pixels in a bash script
- Subject: Re: Points vs pixels in a bash script
- From: Steve Christensen via Cocoa-dev <email@hidden>
- Date: Mon, 8 Jun 2020 15:46:12 -0700
I don’t have a direct answer to your question below, but this may still provide
some information to help you move forward.
The windowing system always works in points as far as coordinates are concerned.
So window.frame and view.frame are in points, not pixels. This allows
drawing to be abstracted across screens that have different point scales
(pixel-to-point size ratio). You specify everything in points and what that
means when it’s time for the actual drawing to occur on a particular screen is
handled for you.
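As a concrete illustration (a sketch: the pixel and point sizes come from the question quoted below, and the scaled-Retina-mode explanation is an assumption on my part), the effective pixels-per-point ratio can be computed directly in the shell:

```shell
# Effective pixels-per-point ratio for the sizes reported in the question below
# (system_profiler: 2880 x 1800 pixels; System Events: 1680 x 1050 points).
# Note this works out to 12/7, roughly 1.714, not the display's
# backingScaleFactor of 2 -- presumably because a scaled ("More Space")
# Retina mode is in use, where the physical resolution is not 2x the
# point resolution.
pixels_w=2880
points_w=1680
ratio=$(awk -v px="$pixels_w" -v pt="$points_w" 'BEGIN { printf "%.4f", px / pt }')
echo "$ratio"   # 1.7143
```

So dividing the system_profiler width by the System Events width gives the factor for that one screen, in that one display mode; it is not a constant you can hard-code.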
I would suggest that you check out the NSScreen class, which provides
properties such as the frame (in points) and backingScaleFactor (pixel-to-point
ratio), as well as methods to convert between point- and pixel-based NSRects.
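Since the original question is about a bash script, one way to reach NSScreen without writing a separate C/Cocoa utility is AppleScriptObjC through osascript. A minimal sketch (macOS-only, so the osascript call is guarded; the script text itself is an assumption, not tested on every macOS version):

```shell
# Query NSScreen.mainScreen.backingScaleFactor from a shell script
# via AppleScriptObjC ("use framework" requires OS X 10.10 or later).
ASOBJC=$(cat <<'EOF'
use framework "AppKit"
set theScale to current application's NSScreen's mainScreen()'s backingScaleFactor()
return theScale as real
EOF
)
# osascript only exists on macOS, so guard the call.
if command -v osascript >/dev/null 2>&1; then
  osascript -e "$ASOBJC"
fi
```

On a Retina main display this should print 2.0; on an older 1x display, 1.0.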
It’s possible, for example, to have a 2x Retina display as the main screen,
plug in an older 1x display, and then move a window so it straddles the two
screens; in that case there is no single scale value for calculating the
number of pixels per point.
You can do things like iterate over all attached screens and find the largest
backingScaleFactor, then use that to calculate the pixel size for an image, for
example.
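That iterate-over-screens idea can be sketched the same way from a shell script (again an AppleScriptObjC-via-osascript assumption, guarded so it only runs on macOS):

```shell
# Find the largest backingScaleFactor across all attached screens,
# callable from a bash script. Sketch only: the loop body assumes the
# usual AppleScriptObjC bridging of NSScreen.screens to a list.
ASOBJC_MAX=$(cat <<'EOF'
use framework "AppKit"
set maxScale to 1.0
repeat with aScreen in current application's NSScreen's screens()
  set theFactor to (aScreen's backingScaleFactor()) as real
  if theFactor > maxScale then set maxScale to theFactor
end repeat
return maxScale
EOF
)
if command -v osascript >/dev/null 2>&1; then
  osascript -e "$ASOBJC_MAX"
fi
```

Using the largest factor found this way is a conservative choice: an image sized for the densest attached screen will look sharp everywhere, at the cost of some memory on the 1x screens.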
> On Jun 8, 2020, at 2:43 PM, Gabriel Zachmann via Cocoa-dev
> <email@hidden> wrote:
>
> I know this mailing list might not be the perfect fit for my question,
> but maybe someone has an idea.
>
> I have a problem converting points (I think) to pixels in a bash script.
> I'd rather not write an extra C/Cocoa utility for that.
>
> Using
> system_profiler SPDisplaysDataType
> I can retrieve the size of a Mac's display in pixels.
>
> However, the command
>
> tell application "System Events" to get the size of every window of every
> process
>
> (which I execute from my bash script using osascript) apparently does NOT
> return window sizes in pixels. I guess they're in those strange "Apple points" units.
> According to my experiments, on my MacBook Pro Retina, for instance, a
> fullscreen app (e.g., Keynote presentation) has a window size of 1680 x 1050.
> By contrast, system_profiler reports 2880 x 1800.
>
> So, the question is: how can I determine the screen size of a Mac from my
> bash script in the same units as "System Events" uses?
> Or, how can I determine the factor by which I have to convert pixels into
> those other units?
>
>
> Many thanks in advance for any insights or hints.
>
> Best regards, Gabriel
_______________________________________________
Cocoa-dev mailing list (email@hidden)
Please do not post admin requests or moderator comments to the list.
Contact the moderators at cocoa-dev-admins(at)lists.apple.com
Help/Unsubscribe/Update your Subscription:
This email sent to email@hidden