Re: Mixing Cpp and ObjC - Memory Leak in ARC
- Subject: Re: Mixing Cpp and ObjC - Memory Leak in ARC
- From: Hado Hein Applelists <email@hidden>
- Date: Mon, 28 Nov 2016 13:26:51 +0100
> On 27.11.2016, at 23:54, Jens Alfke <email@hidden> wrote:
>
>
>> On Nov 27, 2016, at 8:40 AM, Hado Hein Applelists <email@hidden> wrote:
>>
>> Unfortunately, when I turn on Malloc Debug Functions in Xcode the thing does not compile anymore.
>
> I’m not sure what you mean by this; there isn’t anything with that exact name. Do you mean the Address Sanitizer? That’s the only related option that affects compilation.
>
> In any case you don’t need to use the Address Sanitizer to use Instruments’ leak detector. (In fact I’m not sure the two are even compatible.)
Instruments unfortunately finds no leaks.
>
>> I assume that this might have to do something with using a Foundation constructor In cpp code which doesn’t arc correctly.
>
> I have a lot of Objective-C++ code that calls and is called by pure C++, and it works fine with ARC.
I’d suppose the same.
>
>> No. I tried some autoreleasepools and it is still leaking.
>
> Can you show the code that you tried? It’s important to wrap the autorelease pool around all the code that calls Obj-C methods, not just places where you directly allocate objects, because anything you call might indirectly allocate objects.
>
> —Jens
Yup. It’s test code for a project. It looks correct to me, and I have no idea where to look next.
These sensors are kept in an array in the AppDelegate. There are NSColorWells in the UI that are fed via a binding from these values. It does not make a difference when I cut this binding.
In a .h file:
@interface SensorField : NSObject
…
@property (retain, nonatomic) NSColor* avgColor;
- (void)setAvgColorWithR:(CGFloat)r G:(CGFloat)g B:(CGFloat)b;
@end
In a .m file, since there is no C++ inside:
@implementation SensorField
- (void)setAvgColorWithR:(CGFloat)r G:(CGFloat)g B:(CGFloat)b
{
@autoreleasepool {
self.avgColor = [NSColor colorWithRed:r green:g blue:b alpha:1];
// just for testing a call that passes three scalars instead of an Obj-C object pointer
}
}
- (void) setAvgColor:(NSColor *)avgColor
{
@autoreleasepool { //[_avgColor release];
_avgColor = avgColor;
}
}
@end
Now in a .hpp:
#include "DeckLinkAPI.hpp"
#include <pthread.h>
#include <CoreFoundation/CoreFoundation.h>
#include <CoreGraphics/CoreGraphics.h>
#include <mach/mach.h>
#import "AppDelegate.h"
#import "SensorField.h"
#define BMD_DISPLAYMODE bmdModeHD720p50
#define TCISDROPFRAME true
#define PIXEL_FMT bmdFormat8BitYUV
void InitDeckLinkAPI(void);
class DeckLinkCaptureDelegate : public IDeckLinkInputCallback
{
public:
DeckLinkCaptureDelegate();
~DeckLinkCaptureDelegate();
virtual HRESULT STDMETHODCALLTYPE QueryInterface(REFIID iid, LPVOID *ppv) { return E_NOINTERFACE; }
virtual ULONG STDMETHODCALLTYPE AddRef(void);
virtual ULONG STDMETHODCALLTYPE Release(void);
virtual HRESULT STDMETHODCALLTYPE VideoInputFormatChanged(BMDVideoInputFormatChangedEvents, IDeckLinkDisplayMode*, BMDDetectedVideoInputFormatFlags);
virtual HRESULT STDMETHODCALLTYPE VideoInputFrameArrived(IDeckLinkVideoInputFrame*, IDeckLinkAudioInputPacket*);
private:
ULONG m_refCount;
pthread_mutex_t m_mutex;
};
and in its corresponding .mm (containing only the C++ class implementation, but needing calls out to my AppDelegate/SensorField class):
HRESULT DeckLinkCaptureDelegate::VideoInputFrameArrived(IDeckLinkVideoInputFrame* videoFrame, IDeckLinkAudioInputPacket* audioFrame)
{
static unsigned long g_frameCount = 0;
static int g_videoOutputFile = -1;
static int g_audioOutputFile = -1;
static AppDelegate* appDelegate = (AppDelegate *)[[NSApplication sharedApplication] delegate];
@autoreleasepool
{
CGFloat masterDimmer = [[NSUserDefaults standardUserDefaults] floatForKey:@"masterDimmer"];
// Handle Video Frame
if (videoFrame)
{
if (videoFrame->GetFlags() & bmdFrameHasNoInputSource)
{
…
}
else
{
size_t inBytesPerRow = videoFrame->GetRowBytes();
void * frameBytes;
videoFrame->GetBytes(&frameBytes);
for (SensorField *iter in appDelegate.sensorArray)
{
CGFloat r = 0, g = 0, b = 0; // note: `CGFloat r,g,b = 0;` would only initialize b
unsigned char yIn = 0, uIn = 0, vIn = 0;
… a whole bunch of calculations with the pixels in the buffer
… until passing out a color value to the sensor class
r = r/255;
g = g/255;
b = b/255;
// iter.avgColor = [NSColor colorWithRed:r green:g blue:b alpha:1];
//[iter performSelectorOnMainThread:@selector(setAvgColor:) withObject:[NSColor colorWithRed:r green:g blue:b alpha:1] waitUntilDone:NO];
// [iter setAvgColorWithR:r G:g B:b];
dispatch_async( dispatch_get_main_queue(), ^{
@autoreleasepool
{
iter.avgColor = [NSColor colorWithRed:r green:g blue:b alpha:1];
//[iter performSelectorOnMainThread:@selector(setAvgColor:) withObject:[NSColor colorWithRed:r green:g blue:b alpha:1] waitUntilDone:NO];
}
});
searchPointer = nil;
}
frameBytes = nil;
videoFrame = nil;
}
g_frameCount++;
}
}
}
thx, Hado
_______________________________________________
Do not post admin requests to the list. They will be ignored.
Xcode-users mailing list (email@hidden)
Help/Unsubscribe/Update your Subscription:
This email sent to email@hidden