
PHLivePhotoEditingContextMBS class

Type: class
Topic: Photos
Plugin: MBS Mac64bit Plugin
Version: 20.2
macOS: ✅ Yes
Windows: ❌ No
Linux: ❌ No
iOS: ✅ Yes
Targets: Desktop & iOS
An editing session for modifying the photo, video, and audio content of a Live Photo.

A Live Photo is a picture, captured by a supported iOS device, that includes motion and sound from the moments just before and after it was taken. Editing the content of a Live Photo works much like editing other asset types:

1. In an app using the Photos framework, fetch a PHAsset object that represents the Live Photo to edit, and use that object’s requestContentEditingInputWithOptions method to retrieve a PHContentEditingInputMBS object.
Alternatively, in a photo editing extension that runs within the Photos app, your extension’s main view controller (which adopts the PHContentEditingController protocol) receives a PHContentEditingInputMBS object when the user chooses to edit a Live Photo with your extension.
2. Create a Live Photo editing context with the initWithLivePhotoEditingInput initializer.
You can create a Live Photo editing context only from a PHContentEditingInputMBS object that represents a Live Photo. Use the livePhoto property of the editing input to verify that it has Live Photo content.
3. Use the frameProcessor property to define a block for processing the Live Photo’s visual content. Photos calls this block repeatedly to process each frame of the Live Photo’s video and still photo content.
4. Create a PHContentEditingOutputMBS object to store the results of your edit, then call the saveLivePhotoToOutput:options: method to process the Live Photo and save it to your editing output object. This method applies your frameProcessor block to each frame.
Note
You can also use the prepareLivePhotoForPlaybackWithTargetSize method to process a preview-quality version of the Live Photo to display in your app’s UI during editing.
5. To allow a user to continue working with the edit later (for example, to adjust the parameters of a filter), create a PHAdjustmentDataMBS object describing your changes, and store it in the adjustmentData property of your editing output.
6. In an app using the Photos framework, use a photo library change block to commit the edit. (For details, see PHPhotoLibrary.) In the block, create a PHAssetChangeRequestMBS object and set its contentEditingOutput property to the editing output that you created.
Alternatively, in a photo editing extension, provide the PHContentEditingOutputMBS object that you created in your main view controller’s finishContentEditing method. (A code sketch of this whole flow follows the list.)
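
The following sketch walks through these steps in Swift against the underlying native Photos framework API, which the MBS classes wrap (PHContentEditingInputMBS wraps PHContentEditingInput, and so on). The function name, the sepia filter choice, and the adjustment-data identifier are illustrative assumptions, not part of the API.

import Photos
import CoreImage

// A sketch of the complete flow; "editLivePhotoExample" and the
// adjustment-data identifier below are hypothetical names.
func editLivePhotoExample(asset: PHAsset) {
    // Step 1: fetch a content editing input for the Live Photo asset.
    let options = PHContentEditingInputRequestOptions()
    _ = asset.requestContentEditingInput(with: options) { input, _ in
        // Step 2: a Live Photo editing context can only be created from
        // an input whose livePhoto property is non-nil.
        guard let input = input, input.livePhoto != nil,
              let context = PHLivePhotoEditingContext(livePhotoEditingInput: input)
        else { return }

        // Step 3: process every frame (video frames and the still photo).
        context.frameProcessor = { frame, _ in
            let sepia = CIFilter(name: "CISepiaTone")!
            sepia.setValue(frame.image, forKey: kCIInputImageKey)
            return sepia.outputImage
        }

        // Steps 4 and 5: render into an editing output and attach
        // adjustment data so the edit can be revisited later.
        let output = PHContentEditingOutput(contentEditingInput: input)
        output.adjustmentData = PHAdjustmentData(
            formatIdentifier: "com.example.livephoto-editor",
            formatVersion: "1.0",
            data: Data("sepia".utf8))

        context.saveLivePhoto(to: output) { success, _ in
            guard success else { return }
            // Step 6: commit the edit inside a photo library change block.
            PHPhotoLibrary.shared().performChanges({
                let request = PHAssetChangeRequest(for: asset)
                request.contentEditingOutput = output
            }, completionHandler: nil)
        }
    }
}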

When you use either of the methods listed in Processing an Editing Context’s Live Photo, Photos calls your frameProcessor block repeatedly to process each frame of the Live Photo’s video and still photo content. In that block, a PHLivePhotoFrameMBS object provides the Live Photo’s existing content as a CIImageMBS object. Use Core Image to modify that image, then return a CIImageMBS object representing the processed frame.
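
As a sketch of such a block (again in Swift against the native API; the installFrameProcessorExample wrapper and the blur parameters are assumptions), this processor reads each frame’s type and timestamp, leaves the still photo sharp, and blurs video frames progressively:

import Photos
import CoreImage
import CoreMedia

func installFrameProcessorExample(on context: PHLivePhotoEditingContext) {
    context.frameProcessor = { frame, _ in
        let input = frame.image                    // existing content as a CIImage
        let seconds = CMTimeGetSeconds(frame.time) // frame's time within the clip

        // The still photo (frame.type == .photo) stays sharp; video
        // frames get a blur that grows the further they are into the clip.
        let radius = frame.type == .photo ? 0.0 : min(10.0, seconds * 2.0)

        let blur = CIFilter(name: "CIGaussianBlur")!
        blur.setValue(input, forKey: kCIInputImageKey)
        blur.setValue(radius, forKey: kCIInputRadiusKey)
        // CIGaussianBlur expands the image extent, so crop back to the original.
        return blur.outputImage?.cropped(to: input.extent)
    }
}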

Core Image provides several ways to process the Live Photo’s visual content. You can use the built-in filters listed in Core Image Filter Reference or create CIFilterMBS subclasses using custom graphics kernel code. Or, to use other image processing technologies, you can directly access and modify image content in pixel buffers, Metal textures, or IOSurfaceRef objects with a custom CIImageProcessorKernel subclass.
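
A minimal sketch of that last approach, assuming BGRA8 buffers and a hypothetical class name: the kernel inverts the color channels by writing directly into the output pixel buffer rather than going through a CIFilter.

import CoreImage

// Hypothetical CIImageProcessorKernel subclass: inverts B, G, and R
// channels via direct buffer access, leaving alpha untouched.
final class InvertProcessorExample: CIImageProcessorKernel {
    override class var outputFormat: CIFormat { return .BGRA8 }
    override class func formatForInput(at input: Int32) -> CIFormat { return .BGRA8 }

    override class func process(with inputs: [CIImageProcessorInput]?,
                                arguments: [String: Any]?,
                                output: CIImageProcessorOutput) throws {
        guard let input = inputs?.first else { return }
        // baseAddress exposes the raw pixels; metalTexture and pixelBuffer
        // are available on the same objects for other technologies.
        let src = input.baseAddress.assumingMemoryBound(to: UInt8.self)
        let dst = output.baseAddress.assumingMemoryBound(to: UInt8.self)
        // Assumes input and output regions coincide, which holds here
        // because the default region of interest equals the output rect.
        let rows = Int(output.region.height)
        let rowBytes = min(input.bytesPerRow, output.bytesPerRow)
        for row in 0..<rows {
            for i in 0..<rowBytes {
                let v = src[row * input.bytesPerRow + i]
                // Byte 3 of each BGRA pixel is alpha; leave it alone.
                dst[row * output.bytesPerRow + i] = (i % 4 == 3) ? v : 255 - v
            }
        }
    }
}

// Inside a frame processor, apply it like any other CIImage operation:
// let inverted = try? InvertProcessorExample.apply(withExtent: image.extent,
//                                                  inputs: [image], arguments: nil)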

This class has no subclasses.


The items on this page are in the following plugins: MBS Mac64bit Plugin.

