dasdom.dev: A blog about Xcode, iOS development and test-driven development.

I was wrong
2025-06-19
https://dasdom.dev/i-was-wrong-kind-of

In a recent post on Mastodon I expressed my frustration that the user interfaces Apple engineers build with SwiftUI tend to be so bad that they could be from a very old Windows version.

This new user interface is garbage. It gets worse when you realize that this had to be implemented by a person working on Xcode. This poor person had to build a user interface that is clearly designed for bad toy software into software they have to use every day.

So I assumed the reason had to be that it’s easier to build bad user interfaces with SwiftUI on the Mac. To prove myself wrong, I built the Behaviors settings user interface from Xcode 16 myself using SwiftUI.

It turned out, I indeed was wrong.
Kind of.

It is very easy to build the user interface from Xcode 16 using SwiftUI. Even a person like me, who hates and avoids SwiftUI with a passion, can build a usable user interface with it.

(I ignored data handling, and there is a bug where the selection in the table on the left side isn’t shown, but apart from that, this seems to work.)

My experiment for a usable Behaviors setting, built with SwiftUI.
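For the curious: the skeleton of such a settings pane is little more than a List next to a Form. Here is a minimal sketch of the kind of layout I mean (all the names and the model are mine, not Apple’s, and data handling is ignored, just like in my experiment):

```swift
import SwiftUI

// Hypothetical model; Xcode's real behaviors are of course richer.
struct Behavior: Identifiable {
    let id = UUID()
    let name: String
}

struct BehaviorsSettingsView: View {
    let behaviors = [Behavior(name: "Build Starts"),
                     Behavior(name: "Build Succeeds"),
                     Behavior(name: "Build Fails")]
    @State private var selection: Behavior.ID?
    @State private var playSound = false
    @State private var showNavigator = true

    var body: some View {
        HSplitView {
            // The list of behaviors on the left...
            List(behaviors, selection: $selection) { behavior in
                Text(behavior.name)
            }
            .frame(minWidth: 160)

            // ...and the actions for the selected behavior on the right.
            Form {
                Toggle("Play sound", isOn: $playSound)
                Toggle("Show debug navigator", isOn: $showNavigator)
            }
            .padding()
            .frame(minWidth: 240)
        }
    }
}
```

That is essentially all the structure the Xcode 16 version needs; wiring up the per-behavior state is where the real work would be.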

So the question is, why did Apple choose to build such a horrible user interface?

The Behavior setting in Xcode 26.

What do you think?

Let me know on Mastodon.

Apple, if you need that code, my DMs are open. ;)

(No AI was used to write this post.)

Frames in the iOS simulator in landscape
2025-06-02
https://dasdom.dev/frames-in-the-ios-simulator-in-landscape

I might be building a developer tool for iOS developers. Recently I encountered yet another problem. In landscape, the frames of the user interface elements are all wrong. It looks like they are the frames of the elements in portrait.

To me it looks like landscape in the iOS Simulator (or maybe even on a real device) is some kind of hack. First I tried to rotate and shift the frames, but this didn’t work. It turned out that the elements would have to be shifted by different amounts.

Next I tried to check how Apple solved this in their demo project. They didn’t. The demo project had the same problem.

OK, next idea: Accessibility Inspector. Turns out, if the source is set to the host Mac, I see the same problem.

But if I change the source to the iOS simulator, the frames are correct.

OK…

At the moment I have no idea how I could change the host in my tool to get the same behavior. I might still release the app with this little flaw.

If you know how to fix this, hints would be highly appreciated!

Until next time, have a nice day, week and month.

(No AI was used to write this post.)

Building A Dev Tool
2025-05-25
https://dasdom.dev/building-a-dev-tool

From time to time I switch off my external trackpad to force myself to use keyboard shortcuts in Xcode. In principle I know many useful shortcuts, but I often forget to use them because I need to think too much about them. The easiest way to make using them more natural is to use them more.

But when the external trackpad is switched off, I need to use the trackpad of my MacBook to control the iOS simulator. I tried apps like Homerow or Shortcat, but these do not find all elements in the iOS simulator. In addition, as they need accessibility access to my Mac, I might not be able to use them on my work Mac (because such tools can work like keyloggers).

I’m a developer. So the natural conclusion could be that I have to build such a tool myself. And this is what I tried. I’m not done yet, and it’s still not clear if I will succeed, especially as I’m facing some strange problems right now. But these will be discussed in a later post. This post is about how I built the tool and what it can do at the moment.

Demo

SwiftUI or AppKit?

SwiftUI is a scam.
– Dominik Hauser

I don’t like SwiftUI. So I use AppKit. This comes with another advantage. I can write the code in Objective-C. This is a good thing because I like and miss Objective-C a lot.

In addition, the Accessibility API is C-based. Such APIs are easier to use in Objective-C than in Swift.

How this works

Most Mac apps have some kind of accessibility support. For example, native controls like buttons or text fields expose themselves to the accessibility system. Screen readers or other accessibility tools can find those user interface elements. Apps can hook into this system.

In the accessibility API, the user interface elements have different roles. For example, an NSButton has the role AXButton. Elements can be grouped into AXGroups. A tool can ask the accessibility system for the AXChildren of an accessibility element. And this is what I tried first.

Here is the process. First the app searches for running iOS simulators and takes the first one it finds. (I might improve that in the future and let the user select from a list of simulators.)

- (void)findSimulators {
    NSArray<NSRunningApplication *> *applications = [[NSWorkspace sharedWorkspace] runningApplications];
    NSMutableArray<NSString *> *names = [[NSMutableArray alloc] init];
    NSMutableArray<NSRunningApplication *> *simulators = [[NSMutableArray alloc] init];
    for (NSRunningApplication *application in applications) {
        if ([application.bundleIdentifier isEqualToString:@"com.apple.iphonesimulator"]) {
            [simulators addObject:application];
            // Collected for a future simulator picker.
            [names addObject:application.localizedName ?: @"Simulator"];
        }
    }

    NSRunningApplication *simulator = simulators.firstObject;
    if (nil == simulator) {
        return;
    }
    self.simulatorRef = AXUIElementCreateApplication(simulator.processIdentifier);
    self.simulator = simulator;

    [simulator addObserver:self forKeyPath:@"ownsMenuBar" options:NSKeyValueObservingOptionNew context:nil];
}

Then the app searches for the NSWindow (role AXWindow) and asks it for its children. To find the children of an AXElement (in this case an AXWindow), I use methods from a demo project provided by Apple.

+ (NSArray<NSValue *> *)childrenOfUIElement:(AXUIElementRef)element {
    // The children attribute already bridges to an NSArray, no CF round trip needed.
    return [UIElementUtilities valueOfAttribute:NSAccessibilityChildrenAttribute ofUIElement:element];
}

+ (id)valueOfAttribute:(NSString *)attribute ofUIElement:(AXUIElementRef)element {
    CFTypeRef result = NULL;
    NSArray *attributeNames = [UIElementUtilities attributeNamesOfUIElement:element];

    if ([attributeNames containsObject:attribute]) {
        AXUIElementCopyAttributeValue(element, (__bridge CFStringRef)attribute, &result);
    }
    // Copy rule: transfer ownership of the copied value to ARC.
    return CFBridgingRelease(result);
}

Unfortunately, there is a bug in the Accessibility API: if the app recursively asks the AXElements for their children, it does not find all elements. For example, in the following screenshot the navigation bar items (the back button and the button on the right) are missing.

After some digging and debugging I found out that the navigation bar is exposed as an AXGroup, but with no children. To make sure it’s not a bug in my code, I tried Shortcat, with the same result.

A different approach

Then I tried to define a grid and ask the accessibility system for the elements at the grid intersections. This worked, but took significantly longer. At first I was willing to accept the worse performance, but then I had an idea: what if I use the quick method first, and fall back to the grid method only for the AXGroups with zero children? This worked. The tool now quickly finds all user interface elements. Hooray!
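The combined approach can be sketched like this. Element and hitTest are simplified stand-ins for AXUIElement and a positional accessibility query (such as AXUIElementCopyElementAtPosition); this illustrates the idea, it is not the tool’s actual code:

```swift
import Foundation

// A hypothetical stand-in for an AXUIElement: a role, a frame, and the
// children the accessibility API reports (which can be empty for some
// AXGroups even though they contain elements).
final class Element {
    let role: String
    let frame: CGRect
    let reportedChildren: [Element]

    init(role: String, frame: CGRect, reportedChildren: [Element] = []) {
        self.role = role
        self.frame = frame
        self.reportedChildren = reportedChildren
    }
}

// Fast path: recurse over the reported AXChildren.
// Slow path: only for AXGroups that report no children, probe a grid of
// points inside the group's frame via a positional query.
func collectElements(root: Element,
                     hitTest: (CGPoint) -> Element?,
                     gridStep: CGFloat = 20) -> [Element] {
    var found: [Element] = []
    func visit(_ element: Element) {
        found.append(element)
        if element.role == "AXGroup" && element.reportedChildren.isEmpty {
            var y = element.frame.minY
            while y < element.frame.maxY {
                var x = element.frame.minX
                while x < element.frame.maxX {
                    if let hit = hitTest(CGPoint(x: x, y: y)),
                       hit !== element,
                       !found.contains(where: { $0 === hit }) {
                        found.append(hit)
                    }
                    x += gridStep
                }
                y += gridStep
            }
        } else {
            for child in element.reportedChildren {
                visit(child)
            }
        }
    }
    visit(root)
    return found
}
```

The grid step is the trade-off: a finer grid finds smaller elements but needs more queries, which is exactly why the slow path only runs for the empty AXGroups.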

More problems

The tool works well unless the simulator is in landscape. For some strange reason, the frames of the elements are all wrong in this case. I’ll discuss that in a future post.

Until then, have a nice day, week and month.

(No AI was used to write this post.)

Why I don’t use AI
2025-05-08
https://dasdom.dev/why-i-dont-use-ai

I’ve come up with a set of rules that describe our reactions to technologies:

  1. Anything that is in the world when you’re born is normal and ordinary and is just a natural part of the way the world works.
  2. Anything that’s invented between when you’re fifteen and thirty-five is new and exciting and revolutionary and you can probably get a career in it.
  3. Anything invented after you’re thirty-five is against the natural order of things.

– Douglas Adams

I try to avoid AI wherever I can. I especially don’t use it for development tasks. And here is why.

I’m older than thirty-five

So, AI is against the natural order of things.

Good solutions need time

The AI can’t come up with good solutions. It just presents one solution it figured out by adding strings to other strings. I have to figure out whether it is a good solution, and to do that, I need time. I need to think. I need to wrap my head around the problem I’m trying to solve. Figuring out whether something works is way harder when I didn’t come up with the solution myself. Most people just use the solution the AI presents and don’t care whether it’s a good one. I don’t like that.

The joy of using my brain

I like using my brain. Figuring something out gives me a dopamine kick. Finally finding a good solution for a problem is the reason why I love coding. Why should I let the AI have all the fun?

Selling cheap

Each input trains the AI. By using AI, I help it to eventually replace me in my job. I don’t like that.

Environment

AI is destroying the planet faster than my brain does.

Society

AI helps rich people to get free labor from poor people. In this scenario I’m on the side of the poor people. The rich are too rich already. I’m not willing to help make them richer. More than I do already. (Written on a MacBook…)

You

What about you? Do you use AI? Why? Why not?

How I use AI as an iOS developer
2025-04-28
https://dasdom.dev/how-i-use-ai

I don’t.

Hot Reloading In SwiftUI
2024-04-29
https://dasdom.dev/hot-reloading-in-swiftui

Previews in Xcode are kind of nice, when they work. But for me they are often slow or stop working after a few minutes.

Fortunately for us, the amazing John Holdsworth wrote a package that adds hot reloading to SwiftUI projects. Even better, it’s surprisingly easy to add to your project.

Method One: Swift Package

First, we need to add the following two Swift packages to the project:

When asked, only add the HotReloading package product to your target:

Next add -Xlinker -interposable to ‘Other Linker Flags’ in the build settings of the target:

In this method, you need to remember to remove the hot reloading package before you upload your app to App Store Connect.

Method Two: InjectionIII

You still need to add the HotSwiftUI package to your project. Again, add -Xlinker -interposable to ‘Other Linker Flags’ in the build settings of the target:

Next, download the latest release of the InjectionIII app and start it.

Set up your view

In the view file you are currently working on, add the following import:

@_exported import HotSwiftUI

Next, erase the type of the root view using .eraseToAnyView():

var body: some View {
  // Your SwiftUI code...
  .eraseToAnyView()
}

Finally add the following line to your view struct:

@ObserveInjection var redraw
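Putting the three pieces together, a minimal view looks something like this (ContentView is a placeholder name; the HotSwiftUI API is used as described above):

```swift
import SwiftUI
@_exported import HotSwiftUI

struct ContentView: View {
    // Forces a redraw whenever new code is injected.
    @ObserveInjection var redraw

    var body: some View {
        VStack {
            Text("Hello, hot reloading!")
        }
        // Erase the concrete type so injected code can change it.
        .eraseToAnyView()
    }
}
```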

Build and run your app in the Simulator, change some code, and save the file. The changes are magically compiled and injected into the running app:

Conclusion

Hot reloading does not work all the time; sometimes you need to recompile using Xcode. But most of the time it works surprisingly well. Give it a try and see how amazing this is.

And if you like it and it helps in your daily work, consider becoming a sponsor of the project on GitHub.

Thanks for reading!

Renaming a widget extension
2022-08-07
https://dasdom.dev/renaming-a-widget-extension

I’m working on an update for my formulary app. Among other improvements, this update will add widgets.

Along the way I decided to add another widget, so the name of the extension didn’t match anymore and I had to rename it.

First I feared that this could be as difficult as renaming a project in Xcode. But it turned out that I just had to replace all the occurrences of the old name.

For now I’ll keep the names of the entitlement files as they have been. This is a problem for future Dominik. :)

Minimize the number of test assertions
2022-06-11
https://dasdom.dev/minimize-number-of-test-assertions

Recently I wrote a tweet about demo code shown by an Apple engineer in a WWDC session.

Demo code doesn’t need to be good. If you are wondering: this is not how to write unit tests when you are not demonstrating. Try to have only one assertion per test method.

I got several replies saying that it’s not practical to write a test for each small detail one wants to test, and that it’s better to test several things in one test if possible. I totally disagree. It might be OK to have several assertions in one test, but this should be the exception. Especially in the case of the test code from the WWDC session, I would rather have several tests.

This is the code that was shown in the video:

func testExtractEventCount() throws {
    let providerClass = ServerBackedEventProvider.self

    // Simple cases
    XCTAssertEqual(providerClass.extractEventCount(from: "0 records"), 0)
    XCTAssertEqual(providerClass.extractEventCount(from: "1 record"), 1)
    XCTAssertEqual(providerClass.extractEventCount(from: " 1 record(s)"), 1)
    XCTAssertEqual(providerClass.extractEventCount(from: "25 records"), 25)
    XCTAssertEqual(providerClass.extractEventCount(from: "50 records"), 50)

    // Cases where we expect parsing to return nil
    XCTAssertNil(providerClass.extractEventCount(from: "NaN records"))
    XCTAssertNil(providerClass.extractEventCount(from: ""))
    XCTAssertNil(providerClass.extractEventCount(from: "jUnKdAtA"))
}
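To make the discussion concrete, the method under test could plausibly look something like this. (Apple’s actual implementation is not shown in the session; this sketch is mine.)

```swift
import Foundation

// Hypothetical sketch of the method under test: extract the leading
// integer from strings like "25 records". Not Apple's implementation.
class ServerBackedEventProvider {
    static func extractEventCount(from string: String) -> Int? {
        let trimmed = string.trimmingCharacters(in: .whitespaces)
        guard let firstToken = trimmed.split(separator: " ").first,
              let count = Int(firstToken) else {
            return nil
        }
        return count
    }
}
```

With this sketch, all five positive cases and the three nil cases from the demo test pass.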

There is a bug in the demo code, and therefore this test fails in the lines extracting from “0 records” and from “50 records”. With the information from this test, the engineer can fix the bug quickly.

But if this test failed on Xcode Cloud, the engineer would only see the following:

testExtractEventCount(): XCTAssertEqual failed: ("nil") is not equal to ("Optional(0)")

Not really helpful in my opinion. Critical information is missing:

  • What was tested?
  • What was the precondition?
  • What did we expect?

I would split the test assertions into several tests. Testing the extraction from “0 records” would then look like this:

func test_extractEventCount_whenInputIs0Records_shouldExtract0() throws {
    // given
    let sut = ServerBackedEventProvider.self

    // when
    let result = sut.extractEventCount(from: "0 records")

    // then
    XCTAssertEqual(result, 0)
}

In case of a failure of this test, we would see the following in the test result:

test_extractEventCount_whenInputIs0Records_shouldExtract0(): XCTAssertEqual failed: ("nil") is not equal to ("Optional(0)")

Without looking at the test code, I already know what exactly failed. Tests should help my future self and my coworkers find the reason for a failure as quickly as possible. This is the main feature a test should have. Personally, I find the failure message of the second test way better, and looking at the test, I better understand its purpose and why it was written.

What do you think? Which of these tests is better? Let me know on Twitter.

Xcode - Move Focus
2022-02-04
https://dasdom.dev/move-focus

Use the shortcut ⌘J to move focus between editors.

Let me know what you think about this blog post on Twitter: @dasdom.

The View Debugger in Xcode
2022-01-23
https://dasdom.dev/the-view-debugger-in-xcode

Whenever I have a problem in the user interface of an app, or when I just need to confirm that my assumptions about the user interface are correct, I start the View Debugger in Xcode. When the app is running in the Simulator, Xcode shows the debug bar. The icon with the three squares viewed from the side halts the execution of the app (like a breakpoint does) and opens the View Debugger with the current view.

For a simple to-do app, the view debugger looks like this:

On the left side, the View Debugger presents the view hierarchy of the currently visible view. It shows all the views you defined, but also the views (and view controllers) iOS created for you. See for example the UILayoutContainerView or the _UISystemBackgroundView. This representation of the view hierarchy gives you a fantastic overview of what is going on on the visible screen. You can use this presentation to confirm that the user interface elements are added to the expected super views.

Object inspector

When you select a user interface element in the editor in the middle, Xcode shows you the properties and configurations of that element in the inspectors on the right. The object inspector shows the following sections:

Object

The selected object is a UILabel, and you also get its address in memory. The address is useful if you want to change its properties or inspect it further using lldb.
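For example, with the execution paused, you can use that address in the lldb console to inspect and even modify the label. (The address below is made up; use the one the inspector shows you.)

```
# Print the label’s description
(lldb) expression -l objc -O -- 0x7fb1c4d1a2c0

# Change the text, then flush the render server so the change becomes visible
(lldb) expression -l objc -- [(UILabel *)0x7fb1c4d1a2c0 setText:@"Hello from lldb"]
(lldb) expression -l objc -- (void)[CATransaction flush]
```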

Label

In the Label section, Xcode tells you the values of all the properties of a UILabel. For example, you see the text and the text color and how many lines the label can show. In this case it is configured to show as many lines as are needed to present the set text (Lines: 0).

View

In the View section, Xcode tells you all the view properties of the label. For example, you can read there whether user interaction is enabled and whether the label registers multi-touch.

Note the last sub-section, called Description. Shown here is the description string of that UI element. Some information, like added gesture recognizers, is only shown here.

Hierarchy

The Hierarchy section tells you the inheritance tree of the UI element.

Size Inspector

The size inspector shows you the frame, the bounds, and the constraints of the selected element. This information is often my first stop when figuring out the root of a layout problem.

Finding a bug

Let’s assume there is this app you are working on.

The text is not shown as you planned. It looks like the labels in the table view cells are shifted to the left. As this is your first project without a Storyboard, you are not sure what is going on. Maybe the constraints are wrong.

You start the view debugger. To check whether the labels are in fact shifted to the left, you activate Show Clipped Content.

This setting tells the view debugger to render the parts of the views that are clipped by their super views or by the screen frame. The result looks like this:

You can see that the labels are indeed shifted to the left, but it’s hard to see. Fortunately, Xcode has you covered: you can Change canvas background color.

Better! But how is the label added to the cell? Is it put directly onto the content view or is it added to a host view? To figure that out, you can Orient to 3D by clicking the button with the cube or by dragging the view with the mouse pointer.

The view debugger then shows you the views from an angle and changes z distance between the views to make the hierarchy clearer. The slider on the left lets you change the z distance further.

Better. But you still can’t clearly see what is going on. You’d like to see the constraints for the selected view in the view debugger editor. To do that, you click the Show constraints button.

Then you select one of the labels with the problem and inspect the constraints.

With all the other elements in the way, it’s hard to see what the problem might be. So you narrow down the shown views to the important portion using the range slider on the right. After you have zoomed in and changed the perspective a bit, you can clearly see what is going on with this constraint.

OK, it looks like the constant of the leading constraint of that label is wrong. But you are not that experienced and want to ask your co-worker. So you export the view hierarchy as it is shown in the view debugger with the menu item

You send the resulting file to your co-worker and she can open it on her Mac without even having the project stored on her Mac.

This way, your co-worker can confirm your results and you can finally fix the bug.

Conclusion

The View Debugger is a valuable tool for figuring out bugs and problems in the user interfaces of your apps. Even if there is no problem, you can use it to get insight into how the view hierarchy is constructed.

Feedback

Let me know what you think about this feature and this blog post on Twitter: @dasdom.
