Real-time Object Recognition using Apple's CoreML 2.0 and Vision API
Updated Jul 8, 2018 · Swift
Uses iOS 11 and Apple's CoreML to add nutrition data to your food diary based on pictures. CoreML (with the Inceptionv3 model) performs the image recognition; Alamofire (installed via CocoaPods) issues REST requests to the Nutritionix API for nutrition data.
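A minimal sketch of the recognition step described above, using the Vision framework to run a bundled CoreML model on a still image. The `Inceptionv3` class is assumed to be the Xcode-generated wrapper for the project's `.mlmodel` file, and `classify(image:)` is a hypothetical helper name, not taken from the repository:

```swift
import UIKit
import CoreML
import Vision

// Classify a UIImage with a bundled CoreML model via Vision.
// Assumes an Inceptionv3.mlmodel has been added to the Xcode project,
// which generates the `Inceptionv3` class used below.
func classify(image: UIImage) {
    guard let cgImage = image.cgImage,
          let model = try? VNCoreMLModel(for: Inceptionv3().model) else { return }

    let request = VNCoreMLRequest(model: model) { request, _ in
        // Vision returns classification observations sorted by confidence.
        guard let results = request.results as? [VNClassificationObservation],
              let top = results.first else { return }
        print("\(top.identifier): \(top.confidence)")
    }

    let handler = VNImageRequestHandler(cgImage: cgImage, options: [:])
    try? handler.perform([request])
}
```

The top classification's `identifier` (e.g. a food label) could then be sent in a REST request to the Nutritionix API, as the project description outlines.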
A browser that predicts the user's gaze area on a smartphone screen.
Stable Diffusion over gRPC for Apple Platforms