Posts

The Making of Filini's Ghost Ride Trophy

A couple of years ago I came up with a brand name and logo for our MTB team: FILINI. For Christmas I wanted to build something custom for my dear friend Cristiano, who goes by the nickname GHOST RIDER within the team. You can browse the images in higher resolution by clicking on them.
This is the logo.
I used imagetostl.com to convert the image from pixels to an STL that I could print with my 3D printer. Pixelmator was involved at some point as well, along with a good list of WTFs while figuring out why the conversion wasn't producing a cleaner result.

The final print had a few smudges here and there, but nothing that clippers and acetone couldn't remove. I can't call it elbow grease; it was more like hard knuckle work.



I used Liquid Metal paint (Bronze, Silver) to create a sand-metal effect, with which I painted the parts.
I recycled a GoPro part which I painted silver. Several times, because although the paint holds well on glossy plastic, it required multiple passes (and time) to …

UIAppDelegate and SceneDelegate accessing variables

As you likely know, iOS 13 introduced the multiple-window concept. Which means that what you used to do with

(UIApplication.shared.delegate as! AppDelegate).somevariableHERE

no longer works. I am happy to tell you that you can still do it, just slightly differently, and in my opinion more clearly:

(self.window?.windowScene?.delegate as! SceneDelegate).someOtherVariableHere
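In context, a minimal sketch of where such a property lives and how a view controller reaches it. The property name and the view controller are hypothetical, and `as?` is used here instead of a force cast so a mismatch fails gracefully:

```swift
import UIKit

class SceneDelegate: UIResponder, UIWindowSceneDelegate {
    var window: UIWindow?
    // Hypothetical shared state you used to hang off the AppDelegate.
    var someOtherVariableHere = "hello"
}

class SomeViewController: UIViewController {
    func readSceneState() {
        // Walk from this view controller's window to the scene that owns it,
        // then to that scene's delegate.
        if let sceneDelegate = view.window?.windowScene?.delegate as? SceneDelegate {
            print(sceneDelegate.someOtherVariableHere)
        }
    }
}
```

Note that on a multi-window iPad each scene has its own SceneDelegate instance, so this reads the state of the scene the view controller actually belongs to.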
References
Apple's usual bare-extreme-bone docs

Coordinator Pattern for iOS 12 and 13

As the awesome Swift master Paul puts it:
"Using the coordinator pattern in iOS apps lets us remove the job of app navigation from our view controllers, helping make them more manageable and more reusable, while also letting us adjust our app's flow whenever we need."
It is a darn useful pattern, and since I learned about it I have used it in all my apps. Paul also released an advanced version of the original tutorial.

Patrick has a slightly different implementation of the same pattern, and although I prefer Paul's approach (the two differ only slightly in character), there are some nuances in his post that are definitely worth reading.

All that magic is challenged by iOS 13 because of the introduction of UISceneDelegate. I hammered my head a few times before I figured out what had to change, given that the window object is now created in the SceneDelegate rather than the AppDelegate.
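A minimal sketch of the change, assuming a coordinator modeled on Paul's tutorial (the MainCoordinator name and its contents are placeholders): the window and the root coordinator are now built in scene(_:willConnectTo:options:) instead of application(_:didFinishLaunchingWithOptions:).

```swift
import UIKit

// Hypothetical coordinator, modeled on the pattern in Paul's tutorial.
class MainCoordinator {
    let navigationController = UINavigationController()

    func start() {
        // Push the app's first screen; a real coordinator would decide which.
        navigationController.pushViewController(UIViewController(), animated: false)
    }
}

class SceneDelegate: UIResponder, UIWindowSceneDelegate {
    var window: UIWindow?
    var coordinator: MainCoordinator?

    func scene(_ scene: UIScene, willConnectTo session: UISceneSession,
               options connectionOptions: UIScene.ConnectionOptions) {
        // On iOS 13 the window is created here, not in the AppDelegate.
        guard let windowScene = scene as? UIWindowScene else { return }

        let coordinator = MainCoordinator()
        coordinator.start()
        self.coordinator = coordinator

        let window = UIWindow(windowScene: windowScene)
        window.rootViewController = coordinator.navigationController
        window.makeKeyAndVisible()
        self.window = window
    }
}
```

To keep supporting iOS 12, the AppDelegate keeps the same setup guarded by an availability check, since pre-13 devices never call the SceneDelegate.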

Mark describes the difference very well in this post. Something that I…

Convert multiple MOV files to multiple images

You want to convert your .MOV files (likely taken with your iPhone when the motion option was on) to still images. You scouted the web, and among the thousands of ffmpeg options you can't find a simple command to do it.

You are not crazy, they don't have it! Not in a humanly feasible way. So here it is (note that ffmpeg won't create the output directory for you, so run mkdir png first):

for i in *.mov; do ffmpeg -i "$i" -frames:v 1 png/"${i%.*}.png"; done

References
https://www.ostechnix.com/20-ffmpeg-commands-beginners/
https://stackoverflow.com/questions/5784661/how-do-you-convert-an-entire-directory-with-ffmpeg

Extracting Text from Image AKA OCR

Extracting text from an image (OCR) can be very convenient for automating operations on user-provided content. iOS 13 has a ton of improvements on the subject via the Vision library. To try it out, follow this recipe:

1. Create a new playground
2. Under the Resources folder, add two images that contain text
3. Update their names in the code (image1, image2) or rename the files accordingly
4. Run the playground and enjoy the console output.
As usual, the code is on GitHub.
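The heart of the recipe above can be sketched with Vision's text recognition request; this is a minimal version, not the exact playground code, and the function name is a placeholder (the CGImage would come from one of the images you added to Resources):

```swift
import Vision

// Minimal sketch of iOS 13 text recognition with Vision.
func recognizeText(in cgImage: CGImage) {
    let request = VNRecognizeTextRequest { request, error in
        guard let observations = request.results as? [VNRecognizedTextObservation] else {
            return
        }
        for observation in observations {
            // Print the most confident candidate for each detected text line.
            if let candidate = observation.topCandidates(1).first {
                print(candidate.string)
            }
        }
    }
    // .accurate is slower but much better for photos of printed text.
    request.recognitionLevel = .accurate

    let handler = VNImageRequestHandler(cgImage: cgImage, options: [:])
    try? handler.perform([request])
}
```

The request's completion handler runs synchronously with perform(_:) here, which is fine in a playground; in an app you would dispatch the perform call off the main thread.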

References
Building Custom Deep Learning Based OCR models
Building an iOS camera calculator with Core ML's Vision and Tesseract OCR
Introduction to Deep Learning
2018's Top 7 Libraries and Packages for Data Science and AI: Python & R
Embracing Machine Learning as a Mobile Developer
The Lifecycle of Mobile Machine Learning Models

iBeacons, WatchKit and Fun

As you remember, a while back we built a Watch app for a typical caveman. In this two-part series we're going to learn how to build an iPhone app that can detect iBeacons (PART I) and then communicate with the Apple Watch to render some caveman stuff on the screen (PART II).

Background Knowledge

What is an iBeacon?
- Beacons are hardware devices that intermittently send out a Bluetooth signal
- Depending on the type of hardware, you get slightly different features, like longer ranges and faster update frequencies
- Beacons identify themselves with a unique ID (UUID) and a major/minor versioning system
- You can't connect to just any beacon because you are awesome: you need to know the combo in the previous bullet
- Beacons are found in packs, like dinosaurs; if they all have the same identification qualities (called a Region), then they belong to the pack
- A beacon region can be created using just a UUID

What can you do with them?
You can place one in every room of your house, and when your app is cl…
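The identification scheme in the bullets above maps directly onto Core Location types. A minimal sketch using the iOS 13 initializers; the UUID, identifiers, and major/minor values are made-up placeholders:

```swift
import CoreLocation

// Hypothetical UUID; real beacons ship with their own.
let beaconUUID = UUID(uuidString: "E2C56DB5-DFFB-48D2-B060-D0F5A71096E0")!

// A region built from just the UUID matches every beacon in the "pack",
// regardless of major/minor values.
let packRegion = CLBeaconRegion(uuid: beaconUUID, identifier: "filini.pack")

// Adding major/minor narrows the region down to one specific beacon.
let singleBeacon = CLBeaconRegion(uuid: beaconUUID,
                                  major: 1,
                                  minor: 42,
                                  identifier: "filini.livingRoom")

// Monitoring tells you when the device enters or exits a region.
let locationManager = CLLocationManager()
locationManager.startMonitoring(for: packRegion)
```

Monitoring also requires location permission from the user, which PART I covers in the actual app setup.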