Overview of the Journaling Suggestions API
With iOS 17.2, Apple's official Journal app became available.
Journal allows you to record daily moments and special events in detail using photos, videos, recorded audio, locations, and more.
Alongside the app's release, the Journaling Suggestions API was also made public.
Individuals and companies developing apps for personal writing, such as diaries, can now surface the same on-device information in their own apps.
In this article, I will take a brief look at this API.
Overview
The Journaling Suggestions API provides a picker interface.
The picker displays personal events that have occurred in a person's life, such as places visited, people they have connected with, photos in their library, and songs they play repeatedly.
For example, a diary app could use these details to start a new journal entry based on the selected suggestion.
Supported OS versions are as follows:
- iOS 17.2+
- iPadOS 17.2+
- Mac Catalyst 17.2+
How to Set Up
The Journaling Suggestions framework cannot display a JournalingSuggestionsPicker in apps whose code signature lacks the corresponding entitlement.
Add it by enabling the Journaling Suggestions capability in Xcode's Signing & Capabilities tab, which adds the entry to your app's entitlements file.
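For reference, enabling the capability adds an entitlement along these lines to the `.entitlements` file (key name as documented by Apple; verify against your Xcode version):

```xml
<!-- YourApp.entitlements -->
<key>com.apple.developer.journal.allow</key>
<array>
    <string>suggestions</string>
</array>
```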

How to Use
```swift
import SwiftUI
import JournalingSuggestions

struct ContentView: View {
    var body: some View {
        JournalingSuggestionsPicker {
            Text("Show Suggestions")
        } onCompletion: { suggestion in
            print(suggestion.title) // Use the automatically suggested title
            print(suggestion.date)  // Time range of the suggested event (DateInterval)
        }
    }
}
```
From the JournalingSuggestion received in onCompletion, you can read the title and date. Depending on what was selected, you can fetch the underlying content like this:

```swift
let suggestionPhotos = await suggestion.content(forType: JournalingSuggestion.Photo.self)
```
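Putting this together, here is a sketch that displays the photos from a selected suggestion. It assumes, per the framework documentation, that JournalingSuggestion.Photo exposes a `photo` URL pointing at the image data; treat that as an assumption to verify:

```swift
import SwiftUI
import JournalingSuggestions

struct SuggestionPhotosView: View {
    @State private var photoURLs: [URL] = []

    var body: some View {
        VStack {
            JournalingSuggestionsPicker {
                Text("Show Suggestions")
            } onCompletion: { suggestion in
                // content(forType:) is async and returns every asset of the
                // requested type contained in the selected suggestion.
                let photos = await suggestion.content(forType: JournalingSuggestion.Photo.self)
                photoURLs = photos.map(\.photo)
            }
            ForEach(photoURLs, id: \.self) { url in
                AsyncImage(url: url) { image in
                    image.resizable().scaledToFit()
                } placeholder: {
                    ProgressView()
                }
            }
        }
    }
}
```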
The following asset types are available:
| Type | Description |
|---|---|
| Photo | Images from the personal library, including the date they were taken |
| Workout | Information about completed workouts, such as calories burned, distance, route, average heart rate, start and end times, and workout type icons |
| Contact | Interactions with people registered in contacts, including the person's name and contact photo |
| Location | Places the person has visited, including name, surrounding city, and geographic coordinates (if available) |
| Song | Artist name, album name, album artwork, etc. |
| Podcast | Information about podcasts played, such as show, artist, episode, and related artwork |
| Video | Videos in the person's library, including the date they were taken |
| LivePhoto | Images and short videos representing Live Photos from the library, including the date they were taken |
| MotionActivity | Number of steps walked based on iPhone motion events, including activity icons |
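As a sketch, here is a hypothetical helper that builds a one-line summary from two of the asset types above. The nested property names (`song`, `artist`, `place`, `city`) follow Apple's documentation for these types, but treat them as assumptions and check them against the current headers:

```swift
import JournalingSuggestions

// Hypothetical helper: derive a one-line summary for a draft journal entry.
func summaryLine(for suggestion: JournalingSuggestion) async -> String {
    if let song = await suggestion.content(forType: JournalingSuggestion.Song.self).first {
        return "Listened to \(song.song ?? "a song") by \(song.artist ?? "an unknown artist")"
    }
    if let location = await suggestion.content(forType: JournalingSuggestion.Location.self).first {
        return "Visited \(location.place ?? location.city ?? "somewhere")"
    }
    return suggestion.title // Fall back to the automatically suggested title
}
```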
Conclusion
After exploring various aspects, I tried to verify them in Xcode. However, in my environment (Xcode 15.2 beta with an iOS 17.2 simulator), JournalingSuggestions could not be imported, and I had to guard it as follows:
```swift
#if canImport(JournalingSuggestions)
import JournalingSuggestions
#endif
```
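Extending that guard, the view itself can degrade gracefully where the framework cannot be imported (the fallback text here is just an example):

```swift
import SwiftUI
#if canImport(JournalingSuggestions)
import JournalingSuggestions
#endif

struct SuggestionsEntryPoint: View {
    var body: some View {
        #if canImport(JournalingSuggestions)
        JournalingSuggestionsPicker {
            Text("Show Suggestions")
        } onCompletion: { suggestion in
            print(suggestion.title)
        }
        #else
        // Shown where the framework is unavailable (e.g. some simulators)
        Text("Journaling Suggestions is not available in this environment.")
        #endif
    }
}
```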
On a physical device updated to iOS 17.2, however, it worked.
Initially, several highlights from the Photos app were suggested, but once selected, they stopped appearing.
When those were exhausted, no further suggestions were offered, and I was left stuck with the screen shown in the screenshot below.

Without some accommodation for development environments, I feel it may be difficult to develop against this API efficiently.
I started writing this guide with great enthusiasm, but the API seems a bit premature for actual production use.
If anyone has any insights regarding the use of these, I would appreciate it if you could let me know.
Thank you.