iphone - Why is scaling down a UIImage from the camera so slow?
[Update: last call for creative ideas here! My next option is to go ask Apple, I guess.]
Yes, that's a lot of pixels, but the graphics hardware on the iPhone is perfectly capable of drawing lots of 1024x1024 textured quads onto the screen in 1/60th of a second, so there really should be a way of resizing a 2048x1536 image down to 640x480 in a lot less than 1.5 seconds.
So why is it so slow? Is the underlying image data the OS returns from the picker somehow not ready to be drawn, so that it has to be swizzled in some fashion that the GPU can't help with?
My best guess is that it needs to be converted from RGBA to ABGR or something like that; can anybody think of a way to convince the system to give me the data quickly, even if it's in the wrong format, so that I can deal with it myself later?
As far as I know, the iPhone doesn't have any dedicated "graphics" memory, so there shouldn't be a question of moving the image data from one place to another.
So, the question: is there some alternative drawing method besides CGBitmapContextCreate and CGContextDrawImage that takes more advantage of the GPU?
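For concreteness, here's a minimal sketch of the kind of resize code I'm talking about (the exact bitmap parameters here are assumptions, not my production code):

    #import <UIKit/UIKit.h>

    // Sketch of the usual CGBitmapContextCreate + CGContextDrawImage resize.
    // The color space / alpha / byte-order choices are assumptions.
    static UIImage *ScaledImage(UIImage *image, CGSize newSize) {
        CGColorSpaceRef colorSpace = CGColorSpaceCreateDeviceRGB();
        CGContextRef context = CGBitmapContextCreate(
            NULL, newSize.width, newSize.height,
            8,   // bits per component
            0,   // let CG pick bytes per row
            colorSpace,
            kCGImageAlphaPremultipliedFirst | kCGBitmapByteOrder32Little);
        CGColorSpaceRelease(colorSpace);

        // This is the call that takes ~1.5 seconds on a fresh camera image.
        CGContextDrawImage(context,
                           CGRectMake(0, 0, newSize.width, newSize.height),
                           image.CGImage);

        CGImageRef scaledRef = CGBitmapContextCreateImage(context);
        CGContextRelease(context);
        UIImage *scaled = [UIImage imageWithCGImage:scaledRef];
        CGImageRelease(scaledRef);
        return scaled;
    }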
Something to check: if I start with a UIImage of the same size that isn't from the image picker, is it just as slow? Apparently not...
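One way to run that check (a sketch, reusing the hypothetical ScaledImage above; the 2048x1536 size is assumed to match the camera's output):

    // Build a synthetic 2048x1536 UIImage that never touched the picker,
    // then time the same resize on it for comparison.
    UIGraphicsBeginImageContext(CGSizeMake(2048, 1536));
    [[UIColor grayColor] setFill];
    UIRectFill(CGRectMake(0, 0, 2048, 1536));
    UIImage *synthetic = UIGraphicsGetImageFromCurrentImageContext();
    UIGraphicsEndImageContext();

    NSTimeInterval start = [NSDate timeIntervalSinceReferenceDate];
    ScaledImage(synthetic, CGSizeMake(640, 480));
    NSLog(@"synthetic resize: %f s",
          [NSDate timeIntervalSinceReferenceDate] - start);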
UPDATE: Matt Long found that it only takes 30ms to resize the image you get back from the picker in [info objectForKey:@"UIImagePickerControllerEditedImage"], if you've enabled cropping with the manual camera controls. That isn't helpful for the case I care about, where I'm using takePicture to take pictures programmatically. I see that the edited image is kCGImageAlphaPremultipliedFirst but the original image is kCGImageAlphaNoneSkipFirst.
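That's easy to check from the picker delegate (a sketch, where info is the dictionary passed to the delegate callback):

    UIImage *edited   = [info objectForKey:@"UIImagePickerControllerEditedImage"];
    UIImage *original = [info objectForKey:@"UIImagePickerControllerOriginalImage"];
    // Reports kCGImageAlphaPremultipliedFirst for the edited image,
    // kCGImageAlphaNoneSkipFirst for the original.
    CGImageAlphaInfo editedAlpha   = CGImageGetAlphaInfo(edited.CGImage);
    CGImageAlphaInfo originalAlpha = CGImageGetAlphaInfo(original.CGImage);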
Further update: Jason Crawford suggested CGContextSetInterpolationQuality(context, kCGInterpolationLow), which does in fact cut the time from about 1.5 seconds to 1.3 seconds, at a cost in image quality--but that's still far from the speed the GPU should be capable of!
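For reference, that tweak goes into the resize sketch above, just before the CGContextDrawImage call:

    // Trade interpolation quality for speed when scaling down.
    CGContextSetInterpolationQuality(context, kCGInterpolationLow);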
Last update before the week runs out: user refulgentis did some profiling, which seems to indicate that the 1.5 seconds is spent writing the captured camera image out to disk as a JPEG and then reading it back in. If true, very bizarre.
Use Shark, profile it, figure out what's taking so long.
I have to work a lot with the MediaPlayer framework, and when you get properties for songs on the iPod, the first property request is insanely slow compared to subsequent requests, because in the first request MobileMediaPlayer packages up a dictionary with all the properties and passes it to my app.
I'd be willing to bet that a similar situation is occurring here. EDIT: I was able to do a time profile in Shark of both Matt Long's UIImagePickerControllerEditedImage situation and the generic UIImagePickerControllerOriginalImage situation.
In both cases, the majority of the time is taken up by CGContextDrawImage. In Matt Long's case, the UIImagePickerController does this in between the user capturing the image and the image entering 'edit' mode.
Scaling the percentage of time taken to CGContextDrawImage to 100%: CGContextDelegateDrawImage then takes 100%, ripc_DrawImage (from libRIP.A.dylib) takes 100%, and then ripc_AcquireImage (which looks like it decompresses the JPEG, spending most of its time in _cg_jpeg_idct_islow, vec_ycc_bgrx_convert, decompress_onepass, and sep_upsample) takes 93% of the time. Only 7% of the time is spent in ripc_RenderImage, which I assume is the actual drawing.
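If decompression really dominates, one way to see it directly (a sketch; whether Core Graphics caches the decoded bits across draws is itself part of the question):

    // Pay the JPEG decode cost once with a throwaway 1x1 draw, then time
    // the real 640x480 draw separately to isolate the rendering cost.
    CGColorSpaceRef cs = CGColorSpaceCreateDeviceRGB();
    CGContextRef tiny = CGBitmapContextCreate(NULL, 1, 1, 8, 0, cs,
        kCGImageAlphaPremultipliedFirst | kCGBitmapByteOrder32Little);
    CGContextDrawImage(tiny, CGRectMake(0, 0, 1, 1), image.CGImage);
    CGContextRelease(tiny);
    CGColorSpaceRelease(cs);
    // ...then time the full-size CGContextDrawImage and compare.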