didDropSampleBuffer is called very often in iOS
I capture video and run some analysis on it in the captureOutput:didOutputSampleBuffer:fromConnection: delegate method. But after a short time this method stops being called; instead, captureOutput:didDropSampleBuffer:fromConnection: is called.
When I don't do anything in didOutputSampleBuffer, everything is fine. I run a TensorFlow model in this delegate, and that is what causes the problem.
Problem:
Once didDropSampleBuffer is called, didOutputSampleBuffer is never called again.
My solution:
My solution was stopping and restarting the AVCaptureSession, but that caused extra memory usage, which eventually crashed my app.
- (void)captureOutput:(AVCaptureOutput *)output didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer fromConnection:(AVCaptureConnection *)connection {
    // ****** do heavy work in this delegate *********
    CVPixelBufferRef pixelBuffer = CMSampleBufferGetImageBuffer(sampleBuffer);
    graph = [TensorflowGraph new];
    predictions = [graph runModelOnPixelBuffer:pixelBuffer orientation:UIDeviceOrientationPortrait CardRect:_vwRect];
}

- (void)captureOutput:(AVCaptureOutput *)output didDropSampleBuffer:(CMSampleBufferRef)sampleBuffer fromConnection:(AVCaptureConnection *)connection {
    CFTypeRef droppedFrameReason = CMGetAttachment(sampleBuffer, kCMSampleBufferAttachmentKey_DroppedFrameReason, NULL);
    NSLog(@"dropped frame, reason: %@", droppedFrameReason);
}

----> dropped frame, reason: OutOfBuffers
According to Apple's Technote TN2445 (https://developer.apple.com/library/archive/technotes/tn2445/_index.html):

"This condition is typically caused by the client holding onto buffers for too long, and can be alleviated by returning buffers to the provider."

How can I return the buffers to the provider?
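For context, "returning buffers to the provider" simply means letting each delivered CMSampleBufferRef be released before the capture pool runs dry, rather than freeing something manually. One common pattern (a sketch with illustrative names, not code from the original app; `busy` is an assumed atomic BOOL property, usually combined with videoDataOutput.alwaysDiscardsLateVideoFrames = YES) is to skip frames while the model is still busy:

```objc
// Sketch with illustrative names: skip frames while a previous frame is still
// being processed, so every sampleBuffer is released as soon as the delegate
// returns and the capture pool never runs out of buffers.
- (void)captureOutput:(AVCaptureOutput *)output
didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer
       fromConnection:(AVCaptureConnection *)connection {
    if (self.busy) {
        return; // nothing retained the buffer, so it goes straight back to the pool
    }
    self.busy = YES; // `busy` is an assumed atomic BOOL property
    CVPixelBufferRef pixelBuffer = CMSampleBufferGetImageBuffer(sampleBuffer);
    // Process synchronously; do not store sampleBuffer or pixelBuffer anywhere.
    predictions = [graph runModelOnPixelBuffer:pixelBuffer
                                   orientation:UIDeviceOrientationPortrait
                                      CardRect:_vwRect];
    self.busy = NO;
}
```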
Edited
After the line CGImageRef cgImage = [context createCGImage:resized fromRect:resized.extent]; has executed 11 times, didDropSampleBuffer is called. Commenting out CFRelease(pixelBuffer) makes no difference to the result. Does that mean pixelBuffer is not being released?
CFRetain(pixelBuffer);
CVPixelBufferLockBaseAddress(pixelBuffer, kCVPixelBufferLock_ReadOnly);
CIImage *ciImage = [[CIImage alloc] initWithCVPixelBuffer:pixelBuffer];
ciImage = [ciImage imageByCroppingToRect:cropRect];
CGAffineTransform transform = CGAffineTransformIdentity;
CGFloat angle = 0.0;
transform = CGAffineTransformRotate(transform, angle);
CIImage *resized = [ciImage imageByApplyingTransform:transform];
CIContext *context = [CIContext contextWithOptions:nil];
CGImageRef cgImage = [context createCGImage:resized fromRect:resized.extent]; // ***** drop occurs after 11 executions of this line *****
UIImage *_res = [[UIImage alloc] initWithCGImage:cgImage];
CFRelease(pixelBuffer);
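As an editorial aside: if the explicit retain is needed at all, a fully balanced version of the snippet above would pair every acquire with its matching release, including the lock and the created CGImage (a sketch using the same variable names, not the original code):

```objc
// Sketch: every acquire paired with its release, in reverse order.
CFRetain(pixelBuffer);
CVPixelBufferLockBaseAddress(pixelBuffer, kCVPixelBufferLock_ReadOnly);

CIImage *ciImage = [[CIImage alloc] initWithCVPixelBuffer:pixelBuffer];
ciImage = [ciImage imageByCroppingToRect:cropRect];
CIImage *resized = [ciImage imageByApplyingTransform:CGAffineTransformIdentity];

CIContext *context = [CIContext contextWithOptions:nil];
CGImageRef cgImage = [context createCGImage:resized fromRect:resized.extent];
UIImage *res = [[UIImage alloc] initWithCGImage:cgImage];

CGImageRelease(cgImage);                                                  // balances createCGImage
CVPixelBufferUnlockBaseAddress(pixelBuffer, kCVPixelBufferLock_ReadOnly); // balances LockBaseAddress
CFRelease(pixelBuffer);                                                   // balances CFRetain
```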
ios objective-c tensorflow
This may sound like a redundant question, but what do you expect to happen? From my point of view, this is what would happen if the didOutputSampleBuffer-method cannot keep up with the camera framerate.
– Mats
Mar 26 at 11:50
Something you are doing is likely calling CFRetain on sampleBuffer but not releasing it. When the capture pipeline runs out of buffers because they're all retained, it stops outputting. This is a memory leak. But you'll probably need to show more code to identify the problem and ensure your existing code snippets are accurate.
– allenh
Apr 2 at 12:01
@allenh Thanks. I updated the question.
– Fattaneh Talebi
Apr 4 at 9:31
You can check the retain count of your pixel buffer using CFGetRetainCount. It should be 1 when your delegate returns. [CIImage initWithCVPixelBuffer:] actually retains the pixel buffer, so if you're keeping the image around anywhere, that can cause problems. Additionally, I have no clue what [TensorflowGraph runModelOnPixelBuffer:orientation:CardRect:] does to the pixel buffer. You also need to be calling CVPixelBufferUnlockBaseAddress before CFRelease.
– allenh
Apr 4 at 13:44
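The CFGetRetainCount suggestion above can be tried with a one-line diagnostic (an editorial sketch; `pixelBuffer` refers to the buffer in the question's snippet):

```objc
// Diagnostic sketch: log the retain count just before the delegate returns.
// Apple discourages using CFGetRetainCount for program logic, but as a log
// line it can expose an unbalanced CFRetain (an ever-growing count).
NSLog(@"pixelBuffer retain count: %ld", (long)CFGetRetainCount(pixelBuffer));
```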
@allenh Thanks, that was a really helpful comment. After [CIImage initWithCVPixelBuffer:], how should I release the pixel buffer? Do I need to do something, or will it be released automatically?
– Fattaneh Talebi
Apr 5 at 10:40
edited Apr 4 at 9:37
Fattaneh Talebi
asked Mar 26 at 11:31
0 answers
Your Answer
StackExchange.ifUsing("editor", function ()
StackExchange.using("externalEditor", function ()
StackExchange.using("snippets", function ()
StackExchange.snippets.init();
);
);
, "code-snippets");
StackExchange.ready(function()
var channelOptions =
tags: "".split(" "),
id: "1"
;
initTagRenderer("".split(" "), "".split(" "), channelOptions);
StackExchange.using("externalEditor", function()
// Have to fire editor after snippets, if snippets enabled
if (StackExchange.settings.snippets.snippetsEnabled)
StackExchange.using("snippets", function()
createEditor();
);
else
createEditor();
);
function createEditor()
StackExchange.prepareEditor(
heartbeatType: 'answer',
autoActivateHeartbeat: false,
convertImagesToLinks: true,
noModals: true,
showLowRepImageUploadWarning: true,
reputationToPostImages: 10,
bindNavPrevention: true,
postfix: "",
imageUploader:
brandingHtml: "Powered by u003ca class="icon-imgur-white" href="https://imgur.com/"u003eu003c/au003e",
contentPolicyHtml: "User contributions licensed under u003ca href="https://creativecommons.org/licenses/by-sa/3.0/"u003ecc by-sa 3.0 with attribution requiredu003c/au003e u003ca href="https://stackoverflow.com/legal/content-policy"u003e(content policy)u003c/au003e",
allowUrls: true
,
onDemand: true,
discardSelector: ".discard-answer"
,immediatelyShowMarkdownHelp:true
);
);
Sign up or log in
StackExchange.ready(function ()
StackExchange.helpers.onClickDraftSave('#login-link');
);
Sign up using Google
Sign up using Facebook
Sign up using Email and Password
Post as a guest
Required, but never shown
StackExchange.ready(
function ()
StackExchange.openid.initPostLogin('.new-post-login', 'https%3a%2f%2fstackoverflow.com%2fquestions%2f55356106%2fdiddropsamplebuffer-is-called-very-often-in-ios%23new-answer', 'question_page');
);
Post as a guest
Required, but never shown
0
active
oldest
votes
0
active
oldest
votes
active
oldest
votes
active
oldest
votes