Graphics on iOS and OS X isn't just about stroking shapes and paths in Core Graphics and trying to figure out OpenGL. The Core Image framework gives you access to about 100 built-in filters, providing everything from photographic effects and color manipulation to face finding and QR Code generation. It can leverage the power of the GPU to apply complex effects to real-time video capture. But even if you're not writing the next Final Cut Pro or Photoshop, it's easy to call in Core Image for simple tasks, like blurring part of your UI for transitions or privacy reasons. In this session, we'll explore the many ways Core Image can make your app sizzle.
Core Image: The Most Fun API You're Not Using (CocoaConf Columbus 2014)
1. Core Image: The Most Fun API You're Not Using
Chris Adamson • @invalidname
CocoaConf Columbus, August 2014
4. “Core Image is an image processing and analysis technology designed to provide near real-time processing for still and video images.”
5. Agenda
• Images, Filters, and Contexts
• The Core Image Filter Gallery
• Neat Tricks with Built-In Filters
• Core Image on OS X
6. Core Image, Core Concepts
• Core Image is lazy most of the time
• A chain of filters describes a “recipe” of processing steps to be applied to one or more images
• “Stringly typed”
• You only get pixels when you render
7. Typical Workflow
• Start with a source CIImage
• Apply one or more filters
• Render resulting CIImage to a CIContext, or convert CIImage out to another type
• A few filters take or produce types other than CIImage (CIQRCodeGenerator)
8. CIImage
• An image provided to or produced by Core Image
• But no bitmap of pixel data!
• Immutable
• -imageByCroppingToRect:, -imageByApplyingTransform:
• -extent — a CGRect of the image’s size
9. CIImage sources
• NSURL
• CGImageRef
• Bitmap or JPEG/PNG/TIFF in NSData
• OpenGL texture
• Core Video image/pixel buffer
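Each of those sources maps to a CIImage factory method. A minimal sketch (the file path is hypothetical; the CGImageRef, NSData, and CVPixelBufferRef arguments are assumed to come from elsewhere in your app):

#import <CoreImage/CoreImage.h>

static void MakeCIImages(CGImageRef cgImage, NSData *imageData, CVPixelBufferRef pixelBuffer)
{
    // From a file URL (hypothetical path)
    CIImage *fromURL = [CIImage imageWithContentsOfURL:
                           [NSURL fileURLWithPath:@"/tmp/photo.jpg"]];

    // From a CGImageRef (e.g., a UIImage's backing image)
    CIImage *fromCGImage = [CIImage imageWithCGImage:cgImage];

    // From encoded JPEG/PNG/TIFF bytes in an NSData
    CIImage *fromData = [CIImage imageWithData:imageData];

    // From a Core Video pixel buffer (e.g., a video frame)
    CIImage *fromBuffer = [CIImage imageWithCVPixelBuffer:pixelBuffer];
}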
10. CIContext
• Rendering destination for a CIImage (-[CIContext drawImage:inRect:fromRect:])
• This is where you get pixels (also, this is the processor-intensive part)
• On iOS, must be created from an EAGLContext. On Mac, can be created with a CGContextRef (see the sketch below)
• Can also produce output as a CGImageRef, bitmap data, or a CVPixelBuffer (iOS only)
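A minimal sketch of both creation paths (iOS shown with a fresh EAGLContext; the OS X CGContextRef here is taken from the current NSGraphicsContext):

// iOS: create the CIContext from an EAGLContext
EAGLContext *eaglContext = [[EAGLContext alloc] initWithAPI:kEAGLRenderingAPIOpenGLES2];
CIContext *glBackedContext = [CIContext contextWithEAGLContext:eaglContext];

// OS X: create the CIContext from a CGContextRef
CGContextRef cg = (CGContextRef)[[NSGraphicsContext currentContext] graphicsPort];
CIContext *cgBackedContext = [CIContext contextWithCGContext:cg options:nil];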
12. CIFilter
• Performs an image processing operation
• Typically takes and produces a CIImage
• All parameters are provided via -[CIFilter setValue:forKey:]
• Stringly-typed!
• Output is retrieved with -[CIFilter valueForKey:]
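For example, a sepia-tone filter looks like this (the input CIImage is assumed to exist):

CIFilter *sepia = [CIFilter filterWithName:@"CISepiaTone"];
[sepia setValue:inputImage forKey:kCIInputImageKey];         // a CIImage
[sepia setValue:@0.8 forKey:kCIInputIntensityKey];           // NSNumber between 0.0 and 1.0
CIImage *sepiaImage = [sepia valueForKey:kCIOutputImageKey]; // still no pixels yet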
16. Core Image Filter Reference
• Filter Name
• Parameters: note the type & number to provide
• Categories: watch for CICategoryBuiltIn and CICategoryVideo
• Example Figure
• Availability: watch for versioning and OS X-only filters
17. Filter Categories
• Group filters by functionality: CICategoryBlur, CICategoryGenerator, CICategoryCompositeOperation, etc.
• Also group filters by availability and appropriateness: CICategoryBuiltIn, CICategoryVideo, CICategoryNonSquarePixels
18. CICategoryGenerator
• No input image, just produces an output
• CICategoryGradient is also output-only
• Example: CICheckerboardGenerator
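A minimal sketch (parameter names come from the filter's reference page; the 320x320 crop is an arbitrary choice):

CIFilter *checkerboard = [CIFilter filterWithName:@"CICheckerboardGenerator"];
[checkerboard setValue:[CIColor colorWithRed:1.0 green:1.0 blue:1.0] forKey:@"inputColor0"];
[checkerboard setValue:[CIColor colorWithRed:0.0 green:0.0 blue:0.0] forKey:@"inputColor1"];
[checkerboard setValue:@40.0 forKey:@"inputWidth"]; // size of each square
CIImage *checks = [checkerboard valueForKey:kCIOutputImageKey];

// Generator output has infinite extent, so crop before rendering
CIImage *cropped = [checks imageByCroppingToRect:CGRectMake(0.0, 0.0, 320.0, 320.0)];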
32. Other output options
• Use a CIContext
• -[CIContext drawImage:inRect:fromRect:] draws pixels to the EAGLContext (iOS) or CGContextRef (OS X) that the CIContext was created from
• CIContext can also render to a void* bitmap
• On iOS, can create a CVPixelBufferRef, typically used for writing to a file with AVAssetWriter
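For example, the CGImageRef path mentioned on the CIContext slide looks like this (ciContext and filteredImage are assumed to exist):

// createCGImage:fromRect: renders immediately; this is where the work happens
CGImageRef rendered = [ciContext createCGImage:filteredImage
                                      fromRect:[filteredImage extent]];
UIImage *displayable = [UIImage imageWithCGImage:rendered];
CGImageRelease(rendered); // "create" in the name means the caller must release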
33. Chaining filters
• Use the output of one filter as the input to the next
• This doesn’t cost anything, because the CIImages just hold state, not pixels
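A sketch of a two-filter chain, hue rotation feeding a blur (sourceImage is assumed to exist):

CIFilter *hue = [CIFilter filterWithName:@"CIHueAdjust"];
[hue setValue:sourceImage forKey:kCIInputImageKey];
[hue setValue:@(M_PI / 2.0) forKey:kCIInputAngleKey];

CIFilter *blur = [CIFilter filterWithName:@"CIGaussianBlur"];
[blur setValue:[hue valueForKey:kCIOutputImageKey] forKey:kCIInputImageKey];
[blur setValue:@8.0 forKey:kCIInputRadiusKey];

CIImage *result = [blur valueForKey:kCIOutputImageKey]; // still just a recipe, no pixels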
39. Apply filters
// Get CIImage from source image
// (sourceImage, sepiaFilter, and _blendWithMaskFilter are properties assumed
// to be set up elsewhere; names reconstructed around the surviving comments)
CGImageRef sourceCGImage = self.sourceImage.CGImage;
CIImage *loupeImage = [CIImage imageWithCGImage:sourceCGImage];

// Apply sepia filter
[self.sepiaFilter setValue:loupeImage forKey:kCIInputImageKey];
loupeImage = [self.sepiaFilter valueForKey:kCIOutputImageKey];

// Set sepia-filtered image as input to blend-with-mask
[_blendWithMaskFilter setValue:loupeImage forKey:kCIInputImageKey];
loupeImage = [_blendWithMaskFilter valueForKey:kCIOutputImageKey];
40. Render in CIContext
// (eaglContext, ciContext, and glView are properties assumed to be set up
// elsewhere; names reconstructed around the surviving comments)
if ([EAGLContext currentContext] != self.eaglContext) {
    [EAGLContext setCurrentContext:self.eaglContext];
}
[self.glView bindDrawable];

// GL-on-Retina fix: scale the view's bounds from points to pixels
CGRect drawBoundsInPoints = self.glView.bounds;
drawBoundsInPoints.size.width *= self.glView.contentScaleFactor;
drawBoundsInPoints.size.height *= self.glView.contentScaleFactor;

// drawing to CIContext draws to the EAGLContext it's based on
[self.ciContext drawImage:loupeImage
                   inRect:drawBoundsInPoints
                 fromRect:[loupeImage extent]];

// Refresh GLKView contents immediately
[self.glView display];
41. Working with Video
• AVFoundation's AVCaptureVideoDataOutput and AVAssetReader deliver CMSampleBuffers
• CMSampleBuffers have timing information and CVImageBuffers/CVPixelBuffers
• +[CIImage imageWithCVPixelBuffer:]
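Putting those together, the capture delegate callback can wrap each frame in a CIImage (a sketch; the filtering and drawing steps are elided):

- (void)captureOutput:(AVCaptureOutput *)captureOutput
didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer
       fromConnection:(AVCaptureConnection *)connection
{
    // Pull the pixel buffer out of the sample buffer and wrap it, copy-free
    CVPixelBufferRef pixelBuffer = CMSampleBufferGetImageBuffer(sampleBuffer);
    CIImage *frameImage = [CIImage imageWithCVPixelBuffer:pixelBuffer];
    // ...apply filters and draw frameImage to a GL-backed CIContext...
}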
43. Chroma Key (“green screen”) recipe
• Use a CIColorCube to map green-ish colors to transparent
• Use CISourceOverCompositing to draw this alpha’ed image over another image
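A condensed sketch of the recipe (Apple's version tests hue in HSV space; this one uses a cruder green-dominance test, and foregroundImage/backgroundImage are assumed to exist):

// Build a 64x64x64 cube that maps green-ish colors to transparent
const unsigned int size = 64;
float *cube = malloc(size * size * size * 4 * sizeof(float));
size_t i = 0;
for (unsigned int z = 0; z < size; z++) {            // blue axis
    for (unsigned int y = 0; y < size; y++) {        // green axis
        for (unsigned int x = 0; x < size; x++) {    // red axis
            float r = x / (float)(size - 1);
            float g = y / (float)(size - 1);
            float b = z / (float)(size - 1);
            // crude "green-ish" test: green dominates red and blue
            BOOL greenish = (g > 0.4f && g > 1.3f * r && g > 1.3f * b);
            float alpha = greenish ? 0.0f : 1.0f;
            cube[i++] = r * alpha;   // RGBA, premultiplied
            cube[i++] = g * alpha;
            cube[i++] = b * alpha;
            cube[i++] = alpha;
        }
    }
}
CIFilter *colorCube = [CIFilter filterWithName:@"CIColorCube"];
[colorCube setValue:@(size) forKey:@"inputCubeDimension"];
[colorCube setValue:[NSData dataWithBytesNoCopy:cube
                                         length:size * size * size * 4 * sizeof(float)
                                   freeWhenDone:YES]
             forKey:@"inputCubeData"];
[colorCube setValue:foregroundImage forKey:kCIInputImageKey];

// Draw the keyed foreground over the background
CIFilter *over = [CIFilter filterWithName:@"CISourceOverCompositing"];
[over setValue:[colorCube valueForKey:kCIOutputImageKey] forKey:kCIInputImageKey];
[over setValue:backgroundImage forKey:kCIInputBackgroundImageKey];
CIImage *keyed = [over valueForKey:kCIOutputImageKey];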
51. Other Points of Interest
• CIQRCodeGenerator filter: converts data (e.g., a string) to a QR Code
• CILenticularHaloGenerator filter: aka, lens flare
• CIDetector: a class (not a filter) that finds features in images. Currently only supports face finding (returned as an array of CIFeatures). Optionally detects smiles and eye blinks within faces (see the sketch after this list).
• CIImage has a red-eye enhancement that takes the array of face CIFeatures to tell it where to apply the effect
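A face-detection sketch (the input CIImage is assumed; the smile and blink options require iOS 7 / OS X 10.9):

CIDetector *detector = [CIDetector detectorOfType:CIDetectorTypeFace
                                          context:nil
                                          options:@{CIDetectorAccuracy : CIDetectorAccuracyHigh}];
NSArray *faces = [detector featuresInImage:image
                                   options:@{CIDetectorSmile : @YES,
                                             CIDetectorEyeBlink : @YES}];
for (CIFaceFeature *face in faces) {
    NSLog(@"face at %@, smiling? %d", NSStringFromCGRect(face.bounds), face.hasSmile);
}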
52. Core Image on OS X
• Core Image is part of QuartzCore (or Image Kit), so you don’t @import CoreImage
• Many more filters are available
• Can create your own filter with OpenGL Shading Language (plus some CI extensions). See CIKernel and the sketch after this list.
• Also available in iOS 8
• Filters can be set on CALayers
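A taste of what a kernel looks like (a brightness kernel made up for this sketch; applying it happens inside a CIFilter subclass's -outputImage):

NSString *src =
    @"kernel vec4 brighten(sampler image, float amount) {"
    @"  vec4 pix = unpremultiply(sample(image, samplerCoord(image)));"
    @"  pix.rgb = pix.rgb + amount;"
    @"  return premultiply(pix);"
    @"}";
CIKernel *kernel = [[CIKernel kernelsWithString:src] firstObject];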
53. CALayer Filters on OS X
• Views must be layer-backed (obviously)
• Must also call -[NSView setLayerUsesCoreImageFilters:] on 10.9+
• CALayer has properties: filters, compositingFilter, backgroundFilters, minificationFilter, magnificationFilter
• These exist on iOS, but do nothing
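On OS X, wiring a filter to a layer looks like this (myView is assumed to be an NSView in your window):

[myView setWantsLayer:YES];
[myView setLayerUsesCoreImageFilters:YES]; // required on 10.9+

CIFilter *blur = [CIFilter filterWithName:@"CIGaussianBlur"];
[blur setValue:@5.0 forKey:kCIInputRadiusKey];
myView.layer.filters = @[blur]; // the layer now renders through the blur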
57. Wrap Up: Stuff to Remember
• Get psyched about filters, but remember to check that they’re on your targeted platform/version
• Drawing to a CIContext on iOS must be GL-backed (e.g., with a GLKView)
58. Q&A
Slides and code will be posted to:
http://www.slideshare.net/invalidname/
@invalidname
http://subfurther.com/blog