If you are having problems paying for your Apple Developer membership in 2022, try this

I have been trying to renew my Apple Developer membership for a couple of days now, and talking with Apple support provided very little info on what might be going on. The card I use is the card I have used before, and it is being used for other Apple services and purchases, so it clearly works with Apple. Still, no luck paying for the Apple Developer program. Apple support suggested that I try another card, but it would be weird to open a new bank account and order new cards just to debug their system, which works fine with my bank, Raiffeisen, and I am not very keen on switching banks. So what could be done? I could go and talk to the people at my bank, but I decided to leave that as a later step and give Revolut a try first. I quickly opened an account with my phone, used a virtual disposable card, and, surprise, Apple accepted the payment. That was the solution. Hopefully this will help other Apple developers around the world purchase an Apple Developer membership.

Practical SwiftUI stuff I learned while building a Photo Viewer for macOS

Photo Viewer for macOS, made using SwiftUI

If you just want to tinker with all the code, you can get it here.

How to make your SwiftUI macOS app open a file

You can open a file with your app by right-clicking it (Control + Click) and choosing Open With, or by changing the file association so your app becomes the default app for that file type. For your app to show up in the Open With menu, it has to declare the file types it opens in the target's Info settings.

Since my macOS app opens images, the code below handles image types:

VStack {
    // ... the rest of the view ...
}
.onOpenURL { url in
    // The system hands us the file URL of the image the user opened
    if let imageData = try? Data(contentsOf: url) {
        DispatchQueue.main.async {
            image = NSImage(data: imageData)
        }
    }
}
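For context, here is a minimal sketch of a complete view this snippet could live in. The `image` state property and the placeholder text are my assumptions about the surrounding code, not the app's actual source:

import SwiftUI
import AppKit

struct ContentView: View {
    // Assumed backing state for the opened image
    @State private var image: NSImage?

    var body: some View {
        VStack {
            if let image = image {
                Image(nsImage: image)
                    .resizable()
                    .scaledToFit()
            } else {
                Text("Open an image file with this app to display it")
            }
        }
        .onOpenURL { url in
            // Called when the system asks the app to open a file URL
            if let imageData = try? Data(contentsOf: url) {
                DispatchQueue.main.async {
                    image = NSImage(data: imageData)
                }
            }
        }
    }
}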

Don’t forget to remove the App Sandbox capability in Xcode so your app can access files anywhere on the system. Unfortunately, this makes your app very unlikely to be accepted in the App Store, but you can still distribute it on your own.

How to make your SwiftUI macOS app open a file in the same app window

By default, your macOS app will open a new window each time you open a new file with it. You will rarely want this, and I am not exactly sure why this behaviour is the default, but here is how to “fix it”:

WindowGroup {
    ContentView()
        .preferredColorScheme(.dark)
        .handlesExternalEvents(preferring: Set(arrayLiteral: "pause"), allowing: Set(arrayLiteral: "*"))
        .frame(minWidth: 300, minHeight: 300)
}
.windowStyle(.hiddenTitleBar)
.commands {
    // Remove File > New so the user can't open extra windows
    CommandGroup(replacing: .newItem, addition: { })
}
.handlesExternalEvents(matching: Set(arrayLiteral: "*"))
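As I understand it, the scene-level .handlesExternalEvents(matching:) claims every external open event ("*") for this window group, while the view-level .handlesExternalEvents(preferring:allowing:) tells SwiftUI to prefer the existing window hosting the view instead of spawning a new one. For context, here is where the scene sits in the app's entry point, a minimal sketch with a hypothetical app name:

import SwiftUI

@main
struct PhotoViewerApp: App { // hypothetical name
    var body: some Scene {
        WindowGroup {
            ContentView()
                // Keep routing file opens into this existing window
                .handlesExternalEvents(preferring: Set(arrayLiteral: "pause"),
                                       allowing: Set(arrayLiteral: "*"))
        }
        // Claim all external open events for this window group
        .handlesExternalEvents(matching: Set(arrayLiteral: "*"))
    }
}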

Download the FastImage for macOS source code here.

Testing the ML model for your app in Xcode without running it on a device

As of this writing (I am using Xcode 14.0 beta 3), testing an image detection ML model inside the simulator won’t work if you are targeting iOS or iPadOS; you have to use an actual device. But I stumbled onto the fact that if you make your app run with the “My Mac (Designed for iPad)” destination, you can actually test the ML model without installing the app on a device. So next time you need to work on integrating an ML model into your app, you can use the “My Mac (Designed for iPad)” destination and move through development without a problem. I do hope that in future versions you will be able to run ML models inside the simulator.

Extension for inverting the colors of your image in SwiftUI based on the .dark or .light theme

I am using this little extension to invert the colors of the icons in my app. The icons are black and white and were designed first for the .light theme, meaning they are drawn with black lines, so when the user switches to the dark theme, the icons should switch their lines to white.

struct DetectThemeChange: ViewModifier {
    // Tracks the current system theme
    @Environment(\.colorScheme) var colorScheme

    // @ViewBuilder is needed because the two branches return different view types
    @ViewBuilder
    func body(content: Content) -> some View {
        if colorScheme == .dark {
            // Dark theme: flip the black lines to white
            content.colorInvert()
        } else {
            content
        }
    }
}

extension View {
    func invertOnDarkTheme() -> some View {
        modifier(DetectThemeChange())
    }
}

// Usage example:

Image("iconName").resizable().scaledToFit().frame(height: 40).invertOnDarkTheme()

That’s it. Now let’s get back to WWDC 2022, which starts in a couple of hours.

Listening this week

Sadly, we have lost one of my favourite composers.
In the words of Yanis Varoufakis: “We owe you melodies-soundscapes without which life would be much, much poorer.” For me, and I am sure for others too, Vangelis was a dream enabler.

Error – The VNCoreMLTransform request failed

If you are working with the Vision framework and you are getting this error while testing your iOS app in the simulator, test it on an actual device and see if the error goes away. A very unfortunate error description.

This was very confusing for me, since the same code was working fine in a macOS app, and I did not find any warning anywhere that you are supposed to test this only on an actual device. Not sure how this makes any sense, but I am glad I figured it out; it drove me crazy for the last two days.
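For reference, here is a minimal sketch of the kind of Vision + Core ML call that can surface this error. `MyClassifier` is a hypothetical placeholder for your own generated Core ML model class, not the model from my app:

import Vision
import CoreML

// Runs a Core ML classifier on a CGImage through the Vision framework.
// On the simulator, the completion handler can fail with
// "The VNCoreMLTransform request failed"; on a real device it should work.
func classify(cgImage: CGImage) {
    guard let coreMLModel = try? MyClassifier(configuration: MLModelConfiguration()).model,
          let visionModel = try? VNCoreMLModel(for: coreMLModel) else { return }

    let request = VNCoreMLRequest(model: visionModel) { request, error in
        if let error = error {
            print("Vision error: \(error.localizedDescription)")
            return
        }
        // Print the top classification result, if any
        if let top = (request.results as? [VNClassificationObservation])?.first {
            print("\(top.identifier): \(top.confidence)")
        }
    }

    let handler = VNImageRequestHandler(cgImage: cgImage, options: [:])
    try? handler.perform([request])
}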

Today I started coding (using Xcode) on the MacBook Pro 16 2021, here are my first thoughts

Let me start with the main reason I switched from the MacBook Air to the MacBook Pro 16, which is easy to guess: screen size. After coding for a while on the Air, I realized I needed a bit more real estate for running the simulator or seeing changes in the Canvas while working on a SwiftUI component. For a while I used my old iPad Pro as an “external monitor”, but I still found I moved my head too much and broke my concentration on the current task. I also tried my desktop monitor, but it is a little too big, and again I found I moved my head a bit too much; weirdly enough, I preferred going back to my 13-inch screen. This is one of the reasons I didn’t go for a desktop iMac: while the bigger screen is good for editing a photo or doing some drawing, for actual coding I prefer something smaller.

My Air is light, very light; you can pick it up with two fingers and it feels safe in your hand. The MacBook Pro 16 is much heavier, and I can’t pick it up with two fingers at all, I have to use the whole hand. But it’s not a problem at all, and it’s something I can get used to. The reason I sometimes need to quickly hold the laptop in one hand is to protect it from the incoming tornado that is my 2.8-year-old son rushing into my office and jumping straight onto the armchair where I work. He is growing up fast, though, and with that he is becoming more careful and aware of the tech in the house, so we should be fine.

Is it speedier? Well, yes, you can notice it in Xcode, but not by a huge amount. I do think it will save a bit of time over the course of a year, and the quicker a project builds and runs, the better it is for you as a coder; if it takes too long, it can break your flow. The Air still does a really good job with the M1 chip, and for the size and the price, the Air is still remarkable.

What else? I haven’t heard the fans yet, and I suspect I won’t with just a simple Xcode project. I do expect to hear them when I work on some ML models; we will see. Obviously, if the fans aren’t spinning, there isn’t much heat either.

The screen is really nice and feels like just the right size for what I need: not too small, not too big. Like all Apple screens it’s comfortable to look at, and while I don’t necessarily work outside in daylight, I do like that I can bring it with me, and while the kids have fun outside in grandma’s yard, I can work on my projects. I suspect this will happen during the summer days.

Moving my stuff to the new machine was pretty easy. I installed everything I need to code and left the other programs on the Air, like, for example, Minecraft (Education), which I play daily with my oldest son.

The sound is really impressive. While I won’t use it much at home, since I prefer my Bose speakers, the sound coming from the MacBook Pro 16 is not something you would expect from a laptop: really deep bass, clear and well balanced in the midrange.

I don’t think I will bring this one on trips with me, because it’s heavy, and I will probably prefer to bring the Air on trips where I only use the system for photography, even if the MacBook Pro has an SD card reader and I have to carry additional adapters for the Air.

There is definitely room and use (in my case) for both. Initially the plan was to give the Air to my oldest son when he starts school in the autumn, but he prefers my older Lenovo Yoga with a touch screen, which has a bigger screen than the Air. The other question might be whether to buy the Max or the Pro, and the answer should come from your intended usage: if you mainly use the CPU and rarely the GPU cores, go for the Pro; if you need the GPUs, go for the Max.

Well, that’s it for now. If the screen size suits you, the MacBook Pro 16 2021 is definitely a good machine to code on.

Fixing people on a global scale

So it seems Elon is looking to fix social media.

One can only guess how much of this fixing will be an algorithm fix and how much is just the human nature of mental errors. I am cautiously optimistic about the world in general, so I believe we might be able to guide or nudge people into healthier behaviours if one platform stops optimising for engagement and selling ads. One of the big selling points of Apple (for me) is the fact that their products and software can help you reach higher goals, through physical and mental health; the nudge is in a better direction.

I do feel that a new social network is needed, and I believe that people should be allowed to say whatever they want. We should fix the cause of the problem when it comes to misinformation, which means better educating the people on this planet, not restricting input. Giving some people all the power to establish the truth, without any way to contest it, is very anti-science, and after a lot of iteration this “game” will output a lot of errors. Some of these errors will enable incompetent and really bad people to get to power, and these people will inflict a lot of suffering on the living things on this planet, as we have observed happen over and over again. The solution can only be good, global, continued (adults too) education.

Quick update

Kaleidoscope – by Cosmin Dolha

For the last 1.5 years I have dived deep into the land of Photography and found out that I really like it.

Another deep dive (currently in progress) is into Behavioral Science, Neuropsychology, Behavioral Economics, and Marketing, topics I have always been fascinated by.

I have left the coding part to the side for the most part, for a couple of reasons. The first is that I wanted to gain some understanding of how I can make use of the new machine learning tech in my own projects, and the second is that I wanted to make a big switch to Apple’s Swift programming language and its ecosystem. So I left myself some space to forget a bit about the way I worked for the last 15 years; unlearning things takes time, and reconfiguring yourself is not a small task.

The reason I felt compelled to write this update is that today I had a couple of encouraging results with ML and Swift, and of course it’s all related to Photography. A big help in generating the project idea came from the Behavioral Science and Marketing books I am currently studying.

There is a long road ahead (I suspect 3-6 months) until I have proof it works well enough to open it to the public. In the meantime, I will have some more fun with Photography and continue my deep dive into Neuropsychology, all while experimenting with ML and Swift.

P. S. I have my eyes on an EEG sensor 😉