

As someone who has been learning SwiftUI for the past few years, it was great to hear that ChatGPT 4 is effective at helping write usable code. In this series I will be sharing the recording of each prototype along with a link to the prompts I used.
Note: I just wanted to call out that ChatGPT share links are relatively new at the time of this writing, and they are so useful!
Overview
I had a random idea: if I typed text into the app, the letters would appear at random locations and random sizes. I wanted to see how effective ChatGPT 4 would be at writing the code. I've always been fascinated by creative coding examples like the ones you see from the openFrameworks and Processing communities, so I decided to take something in that spirit and implement it in Xcode using SwiftUI.
Here is the link to the prompts I used.
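To give a rough sense of the idea, here is a minimal sketch of my own (not the code from those prompts): each typed character is drawn at a random position and size, with the text field pinned to the bottom of the screen.

```swift
import SwiftUI

struct ScatteredTextView: View {
    @State private var input = ""

    // The typed text as individual characters.
    private var characters: [Character] { Array(input) }

    var body: some View {
        GeometryReader { geo in
            ZStack {
                // Draw each character at a random position and a random size.
                ForEach(characters.indices, id: \.self) { index in
                    Text(String(characters[index]))
                        .font(.system(size: CGFloat.random(in: 16...80)))
                        .position(
                            x: CGFloat.random(in: 0...max(geo.size.width, 1)),
                            y: CGFloat.random(in: 0...max(geo.size.height, 1))
                        )
                }

                // Input field pinned to the bottom of the screen.
                VStack {
                    Spacer()
                    TextField("Type something…", text: $input)
                        .padding()
                }
            }
        }
    }
}
```

One quirk of a naive version like this is that the positions get re-rolled every time the view re-renders, so a more polished implementation would store each letter's position and size as state.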
Main Learnings
In the past I did a bit of prompt engineering before asking SwiftUI-related questions, but this time I skipped it entirely and still got great results. Something I want to experiment with more is how much prompt engineering actually affects the quality of the generated code. Perhaps a post for another time.
The code ran every time, which is impressive, but it still takes a bit of Q&A to get to a usable result. For example, the input field stopped working when I made a change to the background color.
I learned that:
“I'm sorry for the confusion. In SwiftUI, TextField is not designed to be used with a black background and white foreground color directly. It uses the system's default style that doesn't handle color inversion well. To address this, we need to customize the text field's style. In iOS 15, Apple introduced a new way to style text fields using .textFieldStyle(). We can use the PlainTextFieldStyle to customize our text field's appearance.”
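For illustration, here is roughly what that kind of fix looks like. This is my own minimal sketch rather than the exact code from the conversation:

```swift
import SwiftUI

struct DarkInputView: View {
    @State private var input = ""

    var body: some View {
        TextField("Type something…", text: $input)
            // The plain style keeps SwiftUI from applying the default chrome,
            // so the custom colors below actually show up.
            .textFieldStyle(PlainTextFieldStyle())
            .foregroundColor(.white)
            .padding()
            .background(Color.black)
    }
}
```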
It would be nice if ChatGPT picked up on the usability aspect as well, but regardless, it is still impressive that the code ran.
Another thing that caught my attention was how the generated code handled native device details. For example, the original layout clashed with the iOS status bar, so I had to ask ChatGPT to lower some of the components.
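The adjustment amounts to something like the sketch below (my own illustration, not ChatGPT's exact change): keep the controls inside the safe area and nudge them down a bit.

```swift
import SwiftUI

struct SafeAreaAwareView: View {
    @State private var input = ""

    var body: some View {
        VStack {
            // A little extra top padding keeps the field clear of the bar at the
            // top of the screen; content stays inside the safe area by default.
            TextField("Type something…", text: $input)
                .padding(.top, 20)
            Spacer()
        }
        .padding()
    }
}
```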
Another observation was around accessibility. The default colors it provided were not accessible, though perhaps that is just the native iOS component color.
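One way around that, sketched below with a made-up view name, is to pin the colors down explicitly rather than rely on whatever defaults come back, using semantic system colors that adapt to light and dark mode:

```swift
import SwiftUI
import UIKit

struct ReadableLetterView: View {
    let letter: String

    var body: some View {
        Text(letter)
            // Semantic colors follow light/dark mode and the user's contrast
            // settings, so the text stays readable without hard-coded values.
            .foregroundColor(.primary)
            .padding(8)
            .background(Color(UIColor.secondarySystemBackground))
    }
}
```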
It might be a bit advanced, but it would be impressive if ChatGPT asked some follow-up questions before providing the code, things like “What color would you like the background to be? What color would you like the text to be?”. An alternative would be a simple component with a color picker that lets you choose the colors ahead of time, instead of going back and forth to change them, since there is a short wait while each response is generated.
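Something like this plain SwiftUI ColorPicker sketch is what I have in mind (the view and property names are made up for illustration):

```swift
import SwiftUI

struct ColorSettingsView: View {
    @State private var backgroundColor: Color = .black
    @State private var textColor: Color = .white

    var body: some View {
        Form {
            // Pick the colors once, up front, instead of asking for new code.
            ColorPicker("Background color", selection: $backgroundColor)
            ColorPicker("Text color", selection: $textColor)

            // Live preview of the chosen pairing.
            Text("Preview")
                .foregroundColor(textColor)
                .padding()
                .background(backgroundColor)
        }
    }
}
```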
Another thing that stood out to me was how educational the experience was. Instead of just providing the code, ChatGPT also explained how things worked, which is a great way to learn how SwiftUI works.
Overall, this experiment was a success: the prototype I had in mind was built with the help of ChatGPT.
Thanks for reading. Please subscribe and get notified of new posts.