# Acapella iOS Port: Feasibility Analysis & Architecture Options

## Executive Summary

Porting Acapella to iPhone is highly feasible. The codebase was built with decent separation of concerns: **14 of 27 files (~1,250 lines) port with zero changes, and three more (~1,000 lines) need only minor conditionals, together nearly 60% of the code by line count**. The main work falls into three buckets: replacing AppKit rendering with UIKit/SwiftUI equivalents, adapting the file I/O from desktop panels to iOS document pickers, and configuring the iOS audio session. None of these are high-risk — they're well-trodden paths in the Apple ecosystem.

This document covers the full picture: what ports cleanly, what needs rewriting, the architectural options for sharing code, and the iOS-specific concerns you'll need to address.

---

## 1. Current Architecture Audit

### File-by-File Portability Assessment

Here's every file in the project, categorized by how much work it needs:

#### Fully Portable — Zero Changes (14 files)

These files use only Foundation, pure Swift, and (in one case) the cross-platform `ObservableObject` protocol. They compile on iOS today:

| File | Lines | Notes |
|------|-------|-------|
| `Song.swift` | 57 | Core model |
| `Note.swift` | 53 | Core model |
| `Pitch.swift` | 150 | Core model + frequency math |
| `Duration.swift` | 92 | Core model |
| `Measure.swift` | 49 | Core model |
| `TimeSignature.swift` | 37 | Core model + parsing |
| `KeySignature.swift` | 184 | Core model + circle of fifths |
| `Selection.swift` | 79 | Pure data, no UI |
| `PlaybackState.swift` | 101 | ObservableObject, pure SwiftUI |
| `FrequencyTable.swift` | 43 | Utility math |
| `ParseError.swift` | 52 | Error definitions |
| `SongFileParser.swift` | 155 | Text parsing |
| `AcapellaBundle.swift` | 144 | Bundle model + FileManager |
| `Constants.swift` | 53 | App constants |

**Total: ~1,249 lines that port with zero changes.**

#### Nearly Portable — Minor Tweaks (3 files)

| File | Lines | What Needs Changing |
|------|-------|--------------------|
| `AudioEngine.swift` | 775 | Add `AVAudioSession` configuration (see Section 3). The AVAudioEngine API itself is identical on iOS. |
| `ToneGenerator.swift` | 126 | Nothing — `AVAudioSourceNode` and render blocks work identically on iOS. |
| `Part.swift` | 105 | Has `#if canImport(AppKit)` blocks for `NSColor`. Already has a cross-platform `paletteRGB` tuple fallback. Just needs a `#if canImport(UIKit)` block mirroring the AppKit one with `UIColor` (see the sketch below the totals). |

**Total: ~1,006 lines with minor platform conditionals.**
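
Here's roughly what that `Part.swift` addition could look like. A minimal sketch: the `PartColors` shape and the `paletteRGB` tuple labels are assumptions inferred from the description above, so check the real declarations before copying:

```swift
// Hypothetical UIKit mirror of the existing AppKit color block in Part.swift.
// `paletteRGB` and its tuple labels (r, g, b) are assumed, not confirmed.
#if canImport(UIKit)
import UIKit

extension PartColors {
    static var palette: [UIColor] {
        paletteRGB.map { rgb in
            UIColor(red: rgb.r, green: rgb.g, blue: rgb.b, alpha: 1.0)
        }
    }

    static func color(forIndex index: Int) -> UIColor {
        palette[index % palette.count]
    }
}
#endif
```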

#### Needs Rewrite — Platform-Specific UI (10 files)

| File | Lines | Porting Approach |
|------|-------|-----------------|
| `AcapellaApp.swift` | 288 | Replace `Window` scene with `WindowGroup`. Replace `NSOpenPanel`/`NSSavePanel` with iOS file pickers. Replace menu bar commands with toolbar/sheet UI. |
| `MainWindow.swift` | 188 | Moderate restructuring. Replace `nsColor` refs. Adapt layout for phone screen. Core logic (callbacks, playback coordination) stays. |
| `Toolbar.swift` | 124 | Complete redesign for phone form factor. Same data bindings, different layout. |
| `LegendBar.swift` | 42 | Trivial — replace `Color(nsColor:)` with `Color` equivalents. |
| `SheetMusicView.swift` | 349 | **Biggest piece.** Replace `NSView` with `UIView`. The `CGContext` drawing code is 95% identical — `CGContext` is cross-platform. Main changes: remove `NSGraphicsContext.current`, handle `UIGraphicsGetCurrentContext()`, swap mouse events for touch events. |
| `SheetMusicViewContainer.swift` | 56 | Replace `NSViewRepresentable` + `NSScrollView` with `UIViewRepresentable` + `UIScrollView`. |
| `NoteRenderer.swift` | 192 | Replace `NSColor` → `UIColor`, `NSFont` → `UIFont`, `NSString.draw(at:)` → same API exists on UIKit. Core drawing logic unchanged. |
| `StaffRenderer.swift` | 169 | Same as NoteRenderer — color/font swaps only. All `CGContext` calls are identical. |
| `SelectionRenderer.swift` | 64 | Replace `NSColor` → `UIColor`. Three lines change. |
| `LayoutMetrics.swift` | 147 | Already platform-independent (`CGFloat`, `CGRect`). Only has `#if canImport(AppKit)` guard that needs a UIKit equivalent. May need to adjust constants for smaller screens. |

**Total: ~1,619 lines needing platform adaptation, but much of it is mechanical find-and-replace (NSColor→UIColor, NSFont→UIFont).**

### Summary

| Category | Files | Lines | % of Codebase |
|----------|-------|-------|---------------|
| Zero changes | 14 | ~1,249 | ~32% |
| Minor tweaks | 3 | ~1,006 | ~26% |
| Platform rewrite | 10 | ~1,619 | ~42% |

The "platform rewrite" category is misleading though — most of those files need only color/font type swaps. The truly significant rewrites are `AcapellaApp.swift` (file I/O + menus), `SheetMusicView.swift` (NSView→UIView), and `Toolbar.swift` (layout redesign for phone).

---

## 2. Architecture Options

There are four realistic approaches. Here they are in order of my recommendation:

### Option A: Multiplatform Xcode Project with Shared Sources (Recommended)

**How it works:** One Xcode project, two targets (macOS app + iOS app). Shared source files are added to both targets. Platform-specific files are target-exclusive.

**Project structure:**
```
Acapella/
├── Shared/                          ← Added to BOTH targets
│   ├── Models/                      ← 9 unchanged model files + Part.swift
│   ├── Parsing/                     ← Parser + errors (unchanged)
│   ├── Audio/
│   │   ├── AudioEngine.swift        ← #if os(iOS) for AVAudioSession
│   │   └── ToneGenerator.swift      ← Unchanged
│   ├── Bundle/                      ← Bundle manager (unchanged)
│   └── Utilities/                   ← Constants, FrequencyTable
├── macOS/                           ← macOS target only
│   ├── AcapellaApp_macOS.swift
│   ├── MainWindow_macOS.swift
│   ├── Toolbar_macOS.swift
│   └── SheetMusic/                  ← NSView-based renderers
├── iOS/                             ← iOS target only
│   ├── AcapellaApp_iOS.swift
│   ├── MainView_iOS.swift
│   ├── ToolbarView_iOS.swift
│   └── SheetMusic/                  ← UIView-based renderers
└── Resources/
    └── Samples/
```

**Pros:**
- Minimal disruption to existing macOS code — it stays exactly as-is
- Shared files are literally the same file, not copies
- Xcode handles this natively with target membership checkboxes
- Easiest to maintain going forward — fix a model bug once, both platforms get it
- You already have `#if canImport(AppKit)` guards in several files, so the pattern is established

**Cons:**
- Slightly more complex Xcode project configuration
- Have to be disciplined about not putting platform code in shared files

**Estimated effort:** 2-3 weeks for an experienced Swift developer. The macOS app keeps working throughout.

### Option B: Swift Package for Shared Code

**How it works:** Extract all platform-independent code into a Swift Package (`AcapellaCore`). Both the macOS app and iOS app import it as a dependency.

**Package structure:**
```
AcapellaCore/                        ← Swift Package
├── Package.swift
└── Sources/AcapellaCore/
    ├── Models/
    ├── Parsing/
    ├── Audio/
    └── Utilities/

Acapella-macOS/                      ← macOS app, depends on AcapellaCore
Acapella-iOS/                        ← iOS app, depends on AcapellaCore
```
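
For reference, the package manifest for this layout is small. A minimal sketch; the platform versions below are assumptions, so match whatever the apps actually target:

```swift
// swift-tools-version:5.9
// Minimal Package.swift sketch for the AcapellaCore layout above.
import PackageDescription

let package = Package(
    name: "AcapellaCore",
    platforms: [.macOS(.v13), .iOS(.v16)],   // assumed deployment targets
    products: [
        .library(name: "AcapellaCore", targets: ["AcapellaCore"])
    ],
    targets: [
        .target(name: "AcapellaCore")
    ]
)
```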

**Pros:**
- Cleanest architectural separation — forces you to keep shared code truly platform-agnostic
- Could publish AcapellaCore separately (e.g., for a future watchOS complication showing the current song)
- Industry best practice for large teams

**Cons:**
- More upfront refactoring to extract the package
- AudioEngine.swift would still carry its `#if os(iOS)` platform conditionals, which work in packages but feel less clean
- Overkill for a solo project — adds indirection without much practical benefit over Option A
- Two separate Xcode projects to manage

**Estimated effort:** 3-4 weeks (extra week for the extraction refactor).

### Option C: Full SwiftUI Rewrite with Canvas

**How it works:** Replace the NSView-based `SheetMusicView` with SwiftUI `Canvas` (available since iOS 15 / macOS 12). Canvas provides a `GraphicsContext` that's very similar to `CGContext`, and it works on both platforms with zero conditional compilation (a sketch follows the change list below).

**What changes:**
- `SheetMusicView` (NSView) → `SheetMusicCanvasView` (SwiftUI View with Canvas)
- `SheetMusicViewContainer` (NSViewRepresentable) → deleted entirely
- All renderers rewritten to use `GraphicsContext` instead of `CGContext`
- Mouse/touch handling unified via SwiftUI gesture system
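
To make the `GraphicsContext` similarity concrete, here's a minimal sketch of staff-line drawing in Canvas. The view name and metrics are hypothetical, not taken from the existing renderers:

```swift
import SwiftUI

// Hypothetical Canvas-based staff renderer: five horizontal lines per staff.
struct StaffLinesCanvas: View {
    var body: some View {
        Canvas { context, size in
            let spacing: CGFloat = 8   // placeholder for LayoutMetrics.staffLineSpacing
            for line in 0..<5 {
                let y = 40 + CGFloat(line) * spacing
                var path = Path()
                path.move(to: CGPoint(x: 16, y: y))
                path.addLine(to: CGPoint(x: size.width - 16, y: y))
                context.stroke(path, with: .color(.primary), lineWidth: 1)
            }
        }
    }
}
```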

**Pros:**
- True write-once UI — no platform conditionals for rendering
- Modern SwiftUI throughout, no bridging wrappers
- SwiftUI gestures handle both mouse and touch automatically
- Future-proof (Apple is clearly pushing SwiftUI)

**Cons:**
- `GraphicsContext` API is subtly different from `CGContext` — not a mechanical port
- SwiftUI `Canvas` doesn't support text rendering as cleanly as `NSString.draw(at:)` — music symbols may need workarounds
- Performance for complex scores is unknown (Canvas redraws the whole view)
- Would need to rewrite the macOS app too, or maintain two rendering paths
- Riskiest option — you might hit Canvas limitations mid-implementation

**Estimated effort:** 4-6 weeks, with risk of unexpected blockers.

### Option D: Mac Catalyst

**How it works:** Check the "Mac Catalyst" box in Xcode to run your iOS app on the Mac. Apple provides a compatibility layer.

**Pros:**
- Least code changes
- One binary runs on both platforms

**Cons:**
- Mac Catalyst apps feel like iOS apps on Mac — not native macOS feel
- You'd have to port the macOS app to iOS FIRST, then Catalyst gives you Mac "for free" — but you already have a native Mac app that's better
- Menu bar support is limited and awkward
- NSOpenPanel/NSSavePanel don't exist — you'd lose the native file experience on Mac
- Generally considered a dead end by the Apple developer community

**Estimated effort:** Not applicable (not recommended).

### My Recommendation

**Go with Option A (Multiplatform Xcode Project)**. It's the pragmatic choice for a solo developer. You get maximum code sharing with minimum disruption to the working macOS app. The `#if canImport` pattern you're already using scales fine for this size of project. Option B is the "right" architecture for a team, but it's overengineered for one person. Option C is tempting but risky for the sheet music rendering.

---

## 3. iOS-Specific Technical Concerns

### 3.1 Audio Session Configuration

This is the single biggest iOS-specific requirement. macOS doesn't need it at all. On iOS, you must configure `AVAudioSession` before any audio work:

```swift
// Required in AudioEngine.setup() on iOS
#if os(iOS)
let session = AVAudioSession.sharedInstance()
try session.setCategory(.playAndRecord, options: [
    .defaultToSpeaker,     // Play through speaker, not earpiece
    .allowBluetooth,       // Support AirPods for monitoring
    .mixWithOthers         // Optional: don't kill other apps' audio
])
try session.setActive(true)
#endif
```

**Key differences from macOS:**
- Must request `.playAndRecord` category for simultaneous playback + recording
- Route changes (plugging in headphones, AirPods connecting) cause `AVAudioEngine` to reset — need to handle `AVAudioSession.routeChangeNotification` and restart the engine
- Interruptions (phone calls, Siri) pause audio — need to handle `AVAudioSession.interruptionNotification` and resume gracefully
- Background audio requires the "Audio" background mode capability in Xcode AND setting the category before going to background

**Estimated code:** ~50 lines of notification handling added to AudioEngine.swift, wrapped in `#if os(iOS)`.

### 3.2 Microphone Permissions

Works very similarly to macOS. iOS uses the same `NSMicrophoneUsageDescription` in Info.plist. The permission dialog appears automatically on first mic access. One nice thing: iOS doesn't have the sandbox entitlement confusion we fought with on macOS — it just works once the Info.plist key is present.
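
If you want to pre-flight the prompt (say, from a settings screen) rather than letting the first mic access trigger it, a minimal sketch:

```swift
import AVFoundation

// Optional pre-flight of the mic permission. The system still prompts
// automatically on first input access if you skip this.
func requestMicAccess(completion: @escaping (Bool) -> Void) {
    #if os(iOS)
    AVAudioSession.sharedInstance().requestRecordPermission { granted in
        DispatchQueue.main.async { completion(granted) }
    }
    #else
    completion(true)  // macOS permission flows through the existing path
    #endif
}
```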

### 3.3 File I/O — The Biggest UX Difference

iOS has no `NSOpenPanel` or `NSSavePanel`. File management works completely differently:

**Opening song files (.txt) and bundles (.acapella):**
- Use `UIDocumentPickerViewController` (or SwiftUI's `.fileImporter()` modifier)
- User navigates to the file in Files app / iCloud Drive
- Returns a security-scoped URL — must call `url.startAccessingSecurityScopedResource()` before reading

**Saving bundles and exports:**
- Use `.fileExporter()` modifier or `UIDocumentPickerViewController` in export mode
- Or: store in the app's own Documents directory and use `UIActivityViewController` (share sheet) to export

**Recommended approach for iOS:**
- Store everything in the app's own sandbox (Documents directory)
- Provide an "Import Song" button that uses `.fileImporter()`
- Provide a "Share" button that uses share sheet for exporting WAV/bundles
- Optionally support iCloud Drive sync for cross-device access

**This is a significant UX redesign** — not technically hard, but the user flow is fundamentally different from the Mac. On Mac you think in terms of "files on disk." On iPhone you think in terms of "things inside the app" that you can share out.
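
A minimal sketch of the import button, assuming a hypothetical `onImport` handler that feeds the existing `SongFileParser` (the view and handler names are illustrative):

```swift
import SwiftUI
import UniformTypeIdentifiers

// Hypothetical import button; wire onImport into the existing parsing path.
struct ImportSongButton: View {
    @State private var showingImporter = false
    let onImport: (String) -> Void

    var body: some View {
        Button("Import Song") { showingImporter = true }
            .fileImporter(isPresented: $showingImporter,
                          allowedContentTypes: [.plainText]) { result in
                guard case .success(let url) = result else { return }
                // Files picked from outside the sandbox need security-scoped access
                guard url.startAccessingSecurityScopedResource() else { return }
                defer { url.stopAccessingSecurityScopedResource() }
                if let text = try? String(contentsOf: url, encoding: .utf8) {
                    onImport(text)
                }
            }
    }
}
```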

### 3.4 Screen Size & UI Layout

The current macOS UI assumes an 800×600 minimum window. An iPhone SE is 375×667 points. The toolbar alone has 7 controls in a horizontal row — that won't fit.

**Toolbar redesign needed:**
- Part picker: could be a navigation title or segmented control
- Play All / Play Current: segmented control, maybe in a bottom bar
- Tones / Recorded: same
- BPM slider: could be a stepper or a popover with a slider
- Loop toggle: toolbar button
- Play/Stop: prominent center button
- Record: prominent button, maybe with a long-press for options

**Sheet music on small screens:**
- Current `measureMinWidth` is 120pt — you'd get about 2.5 measures per line on iPhone
- Consider pinch-to-zoom for the sheet music view
- Horizontal scrolling might work better than the current vertical-only layout
- The 4-staff SATB layout at 8pt staff spacing will be tiny — may need to let users show/hide individual parts

**Menu items → iOS equivalents:**

| macOS Menu Item | iOS Equivalent |
|----------------|----------------|
| File → Open Song | Import button + `.fileImporter()` |
| File → Close Song | Back navigation or swipe |
| Audio → Open Bundle | Import button (different UTI filter) |
| Audio → Save Bundle | Share button / auto-save |
| Audio → Export Final Mix | Share button → "Export as WAV" |
| Audio → Count-In toggle | Settings screen or toolbar toggle |
| Audio → Monitor Recording | Settings screen |
| Audio → Metronome During Recording | Settings screen |

### 3.5 Touch Interaction vs. Mouse

The current `SheetMusicView` handles `mouseDown`, `mouseDragged`, `mouseUp` for measure selection. iOS equivalents:

| macOS | iOS |
|-------|-----|
| `mouseDown(with:)` | `touchesBegan(_:with:)` or UITapGestureRecognizer |
| `mouseDragged(with:)` | `touchesMoved(_:with:)` or UIPanGestureRecognizer |
| `mouseUp(with:)` | `touchesEnded(_:with:)` |
| Shift-click to extend | Long-press to enter selection mode, then drag |
| Scroll via NSScrollView | UIScrollView with same pattern |

The mapping is straightforward. I'd recommend using gesture recognizers rather than raw touch handling — they compose better with scroll views.
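
A minimal sketch of that gesture-recognizer approach; the class and callback names are stand-ins, not the actual port:

```swift
import UIKit

// Hypothetical touch-handling view for the UIView port of SheetMusicView.
final class SheetMusicTouchView: UIView {
    var onSelect: ((CGPoint) -> Void)?       // maps to the old mouseDown hit test
    var onDragSelect: ((CGPoint) -> Void)?   // maps to mouseDragged selection extension

    override init(frame: CGRect) {
        super.init(frame: frame)
        addGestureRecognizer(UITapGestureRecognizer(target: self, action: #selector(handleTap)))
        addGestureRecognizer(UIPanGestureRecognizer(target: self, action: #selector(handlePan)))
    }

    required init?(coder: NSCoder) { fatalError("init(coder:) not supported") }

    @objc private func handleTap(_ gesture: UITapGestureRecognizer) {
        onSelect?(gesture.location(in: self))
    }

    @objc private func handlePan(_ gesture: UIPanGestureRecognizer) {
        if gesture.state == .changed {
            onDragSelect?(gesture.location(in: self))
        }
    }
}
```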

### 3.6 Performance on iPhone

The current timer-based playback (`Timer.scheduledTimer` at 60fps) works fine on Mac. On iPhone:

- `Timer` is acceptable but can be deprioritized by the system
- For more reliable timing, consider `CADisplayLink` (syncs with screen refresh); see the sketch after this list
- The tone generation via `AVAudioSourceNode` render blocks runs on the audio thread — this is the same on iOS and won't be an issue
- Sheet music rendering at 60fps redraws could be expensive on older iPhones — consider using `setNeedsDisplay()` only when the playback position actually changes measures, not every frame
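
Here's a minimal `CADisplayLink` wrapper, assuming a `tick()` closure equivalent to the current Timer callback (the class name is hypothetical):

```swift
#if os(iOS)
import UIKit

// Hypothetical display-link wrapper; drive the same tick() the Timer used to call.
final class PlaybackClock: NSObject {
    private var displayLink: CADisplayLink?
    private let tick: () -> Void

    init(tick: @escaping () -> Void) {
        self.tick = tick
        super.init()
    }

    func start() {
        let link = CADisplayLink(target: self, selector: #selector(step))
        link.add(to: .main, forMode: .common)  // keeps firing during scroll tracking
        displayLink = link
    }

    func stop() {
        displayLink?.invalidate()
        displayLink = nil
    }

    @objc private func step() { tick() }
}
#endif
```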

### 3.7 Background Audio

If the user starts recording and switches to another app (to read lyrics, for instance), you need background audio capability:

- Add "Audio, AirPlay, and Picture in Picture" background mode in Xcode
- Set audio session category to `.playAndRecord` (already needed)
- The recording tap and playback will continue in background
- Note: Apple will reject apps that claim background audio but don't actually need it — Acapella legitimately needs it for recording workflows

---

## 4. Implementation Roadmap

If you go with Option A (recommended), here's the order I'd tackle things:

### Phase 1: Project Setup & Shared Code (Day 1-2)
- Add iOS target to the existing Xcode project
- Set target membership on all shared files (models, parsing, audio, utilities)
- Add `#if canImport(UIKit)` blocks to `Part.swift` for `UIColor` palette
- Add `#if os(iOS)` audio session config to `AudioEngine.swift`
- Verify shared code compiles for iOS target

### Phase 2: Basic iOS UI Shell (Day 3-5)
- Create `AcapellaApp_iOS.swift` with `WindowGroup`
- Create minimal `MainView_iOS.swift` with placeholder views
- Create `ToolbarView_iOS.swift` redesigned for phone
- Wire up PlaybackState and AppState
- Get tone playback working (play a built-in sample song)

### Phase 3: Sheet Music Rendering (Day 6-10)
- Port `SheetMusicView` to `UIView` subclass
- Port `NoteRenderer`, `StaffRenderer`, `SelectionRenderer` (NSColor→UIColor, NSFont→UIFont)
- Create `SheetMusicViewContainer_iOS` with `UIViewRepresentable` + `UIScrollView`
- Implement tap and drag gesture recognizers for selection
- Adjust `LayoutMetrics` constants for phone screen widths

### Phase 4: File I/O (Day 11-13)
- Implement song file import via `.fileImporter()`
- Implement bundle save/load using app Documents directory
- Implement share sheet for export
- Test iCloud Drive file import flow

### Phase 5: Recording (Day 14-16)
- Test mic permissions on iOS
- Add audio session interruption and route change handlers
- Test simultaneous playback + recording on device
- Add background audio capability
- Verify WAV playback alignment with recording

### Phase 6: Polish & Platform Niceties (Day 17-20)
- Settings screen for toggles (count-in, monitor, metronome)
- Pinch-to-zoom on sheet music
- Handle device rotation
- iPad layout (larger screen, could show more controls)
- Test on various device sizes (SE, standard, Pro Max)
- App icon and launch screen

---

## 5. Risk Assessment

| Risk | Likelihood | Impact | Mitigation |
|------|-----------|--------|------------|
| AVAudioEngine route change crashes | Medium | High | Robust notification handling, engine restart logic. Apple's AVAEMixerSample shows the pattern. |
| Sheet music too small on phone | Low | Medium | Pinch-to-zoom, adjustable staff size, show/hide parts |
| Recording latency on iPhone | Low | Low | iPhone audio hardware is actually excellent. Use the `.measurement` audio session mode for lowest latency. |
| Timer-based playback jitter | Low | Medium | Switch to `CADisplayLink` if Timer proves unreliable |
| App Store review: background audio | Low | Low | Legitimate use case — recording while reading lyrics |
| File format compatibility | None | None | .txt and .wav are universal, no conversion needed |

---

## 6. iPad Considerations

Worth mentioning: the same iOS build runs on iPad with essentially no extra work. And iPad is arguably a *better* fit for Acapella than iPhone:

- Larger screen = sheet music is readable at current sizes
- Toolbar fits comfortably
- Better microphone hardware than many iPhones
- Users might prop up an iPad on a music stand — perfect for the use case
- Split View lets users have lyrics/reference material alongside the app

If you build for iPhone, iPad support is essentially free. The only extra work would be an optimized iPad layout that takes advantage of the larger screen (e.g., sidebar navigation instead of modal sheets).

---

## 7. What NOT to Port

A few things from the macOS version don't make sense on iOS:

- **`NSOpenPanel` directory browsing for .acapella bundles** — iOS doesn't expose directory picking the same way. Use the app's own document storage instead.
- **Keyboard shortcuts** (⌘O, ⌘S, ⌘E, ⌘K, Space for play) — these only work with hardware keyboards on iPad. Not worth special-casing for iPhone. iPad can get them later as a nice-to-have via `.keyboardShortcut()`.
- **`SongFileWriter`** — the write-back-to-text feature works, but iOS users are less likely to hand-edit .txt song files. Keep it for bundle saving but don't expose it as a primary feature.

---

## 8. Bottom Line

This is a very portable codebase. You made good architectural decisions early on — keeping models clean, using AVFoundation (which is cross-platform), and already having `#if canImport(AppKit)` guards in the rendering code. The port is real work, but it's predictable work with no scary unknowns.

The biggest effort is the UI redesign for phone form factor, not the platform porting itself. If I had to put a number on it: **60% of the work is UI layout decisions, 25% is mechanical NSColor→UIColor type swaps, and 15% is the iOS audio session + file I/O plumbing.**

Go with Option A (multiplatform Xcode project), tackle it in the phased order above, and you'll have Acapella running on your phone in about 3 weeks of focused work.

---

## 9. Implementation Notes for Claude Code

**This section is written for a Claude Code session that will be doing the actual implementation.** It contains hard-won lessons from building the macOS app that aren't obvious from reading the source code. These will save you hours of debugging.

### 9.1 Critical AVAudioEngine Gotchas

These were discovered through painful trial-and-error during the macOS build:

**You MUST stop the engine before installing an input tap.**
```swift
// ❌ THIS WILL KILL ALL AUDIO OUTPUT
engine.inputNode.installTap(onBus: 0, bufferSize: 1024, format: inputFormat) { ... }

// ✅ CORRECT — stop first, install, restart
engine.stop()
engine.inputNode.installTap(onBus: 0, bufferSize: 1024, format: inputFormat) { ... }
try engine.start()
```
Installing a tap while the engine is running causes AVAudioEngine to reconfigure its audio graph internally. This kills all `AVAudioSourceNode` output (the tone generators). On iOS this will be the same behavior — it's an AVAudioEngine fundamental, not a platform thing.

**Guard `removeTap` — only call it when a tap was actually installed.**
```swift
// ❌ THIS CORRUPTS ENGINE STATE
engine.inputNode.removeTap(onBus: 0) // crashes or corrupts if no tap exists

// ✅ CORRECT — guard with recording state
if recordingFile != nil {
    engine.inputNode.removeTap(onBus: 0)
}
```
Calling `removeTap` when no tap is installed doesn't crash immediately but corrupts the engine's internal state. The symptom was that playback worked once after launch but never again after stop→play. This was the single hardest bug to find in the entire project.

**Engine restart safety check.** Always verify the engine is running before starting playback:
```swift
if let engine = engine, !engine.isRunning {
    try engine.start()
}
```

### 9.2 Recording During Count-In

The count-in metronome uses negative `accumulatedBeats` (beat < 0 = count-in, beat >= 0 = content). During recording, the input tap is active the entire time, but you must NOT write mic data to the WAV file during the count-in — otherwise the recording will be one measure late (contains a measure of metronome bleed + silence before the singer's actual part).

The fix is the `isInCountIn` flag checked inside the tap callback:
```swift
engine.inputNode.installTap(...) { buffer, _ in
    if self.isInCountIn { return }  // Skip writing during count-in
    try? self.recordingFile?.write(from: buffer)
}
```

### 9.3 WAV Playback Alignment

Recordings are stamped with the measure they started from (`recordingStartMeasures[partName]`). When playing back from a cursor or selection that's past the recording start, you need `scheduleSegment` instead of `scheduleFile` to seek into the WAV:
```swift
let measuresToSkip = playbackStartMeasure - wavStartMeasure
let secondsToSkip = Double(measuresToSkip) * beatsPerMeasure * 60.0 / Double(tempo)
let framesToSkip = AVAudioFramePosition(secondsToSkip * audioFile.processingFormat.sampleRate)
let framesRemaining = AVAudioFrameCount(audioFile.length - framesToSkip)
playerNode.scheduleSegment(audioFile, startingFrame: framesToSkip, frameCount: framesRemaining, at: nil)
```

### 9.4 Sample Rate Mismatch in Export

iPhone microphones record at 48000 Hz, not 44100 Hz. The tone generator runs at 44100 Hz. When exporting the final mix, you MUST probe the first WAV file for its native sample rate and use that for the entire mix buffer. If you mix at 44100 and the WAVs are 48000, the exported audio sounds slow and pitched down (like a drunk person singing).

```swift
if let firstURL = recordedAudioURLs.values.first {
    let probe = try AVAudioFile(forReading: firstURL)
    sampleRate = probe.fileFormat.sampleRate  // Will be 48000 on most devices
}
```

### 9.5 Sandbox and Permissions (Xcode Configuration)

On macOS, we had significant trouble because Xcode ignores standalone entitlement files and Info.plist files that aren't properly configured in the build settings. On iOS this is simpler, but still:

- **Microphone permission:** Add `NSMicrophoneUsageDescription` to the iOS target's Info tab in Xcode (not a standalone file). Value: "Acapella needs microphone access to record your vocal parts."
- **Background audio:** Add the "Audio, AirPlay, and Picture in Picture" background mode in Signing & Capabilities.
- **File access:** iOS sandbox is different from macOS. The app can freely read/write its own Documents directory. For importing files from outside, `.fileImporter()` handles security-scoped URLs automatically — just call `url.startAccessingSecurityScopedResource()` and `url.stopAccessingSecurityScopedResource()` when done.

### 9.6 The Callback Architecture

The macOS app deliberately uses closures (callbacks) instead of SwiftUI `onChange` observers for play/stop/record coordination. This was to avoid feedback loops where stopping playback would trigger an onChange that would try to start playback again.

The flow is:
```
ToolbarView (button tap)
  → callback closure (onPlay/onStop/onRecord)
    → MainWindow function (startPlayback/stopPlayback/startRecording)
      → AudioEngine method
        → callback to MainWindow (onPositionUpdate/onPlaybackStopped/onRecordingComplete)
          → updates @State vars
            → SwiftUI re-renders
```

When building the iOS UI, maintain this same pattern. Don't try to simplify it with Combine publishers or onChange — we tried that and it caused infinite loops.
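
A minimal sketch of the pattern for the iOS toolbar (names follow the flow diagram; exact signatures in the app may differ):

```swift
import SwiftUI

// Transport controls take plain closures; no onChange observers, no publishers.
struct TransportBar: View {
    let onPlay: () -> Void
    let onStop: () -> Void
    let onRecord: () -> Void

    var body: some View {
        HStack {
            Button("Play", action: onPlay)
            Button("Stop", action: onStop)
            Button("Record", action: onRecord)
        }
    }
}

// In the main view, state mutation happens only inside these closures:
//   TransportBar(onPlay: { startPlayback() },
//                onStop: { stopPlayback() },
//                onRecord: { startRecording() })
```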

### 9.7 Count-In Delayed WAV Start

When count-in is enabled, WAV player nodes can't start immediately because they'd play during the count-in. The engine stores the playback parameters (`storedPartMode`, `storedCurrentPartName`, `storedExcludePartName`) and starts the WAV players only when `isInCountIn` transitions from true to false in the `tick()` method.

We had a bug where the delayed WAV start used hardcoded `.all` part mode instead of the stored values, causing the wrong parts to play after count-in. Make sure the stored values are used.
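A sketch of the transition check inside `tick()`; the `startWavPlayers` helper and the `wasInCountIn` bookkeeping are hypothetical, while the stored properties are the ones named above:

```swift
// Inside tick(): detect the count-in -> content transition.
let wasInCountIn = isInCountIn
isInCountIn = accumulatedBeats < 0
if wasInCountIn && !isInCountIn {
    // Use the STORED values, never a hardcoded .all
    startWavPlayers(partMode: storedPartMode,
                    currentPartName: storedCurrentPartName,
                    excludePartName: storedExcludePartName)
}
```
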

### 9.8 Metronome Compound Time

The metronome was initially wrong for 6/8 time — it played 3 clicks per measure (on quarter-note beats) instead of 6 (on eighth-note beats). The fix (worked numbers in the sketch after this list):
- `metronomeClicksPerMeasure` = time signature numerator (6 for 6/8)
- `metronomeClickInterval` = quarterNoteBeatsPerMeasure / clicksPerMeasure
- Strong beats at positions 0 and 3 (compound group size = 3)
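
The 6/8 arithmetic written out (variable names are illustrative, not the engine's exact properties):

```swift
// 6/8: numerator 6, denominator 8
let numerator = 6
let denominator = 8
let quarterNoteBeatsPerMeasure = Double(numerator) * 4.0 / Double(denominator)  // 3.0
let metronomeClicksPerMeasure = numerator                                       // 6 clicks
let metronomeClickInterval =
    quarterNoteBeatsPerMeasure / Double(metronomeClicksPerMeasure)              // 0.5 quarter-note beats
let compoundGroupSize = 3
let strongClicks = Array(stride(from: 0, to: metronomeClicksPerMeasure,
                                by: compoundGroupSize))                         // [0, 3]
```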

### 9.9 Existing `#if canImport` Guards

Several files already have platform guards. Look for them and extend rather than duplicate:
- `SheetMusicView.swift` — `#if canImport(AppKit)` wraps the entire file
- `NoteRenderer.swift` — same
- `StaffRenderer.swift` — same
- `SelectionRenderer.swift` — same
- `LayoutMetrics.swift` — `#if canImport(AppKit)` on the import only
- `Part.swift` — `#if canImport(AppKit)` on `PartColors.palette` and `color(forIndex:)`. Already has platform-independent `paletteRGB`.

### 9.10 During Recording, Force PartMode to .all

When the user starts recording a part, the backing tracks should always play ALL other parts (regardless of the "Play Current" / "Play All" toggle). The part being recorded is excluded via the `excludePartName` parameter. The `monitorRecordingPart` toggle controls whether the user hears their own part's tone playback while singing.
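
A sketch of that rule at the recording entry point; the function and parameter names here are assumptions standing in for the real AudioEngine API:

```swift
// Hypothetical recording entry point illustrating the forced part mode.
func beginRecording(partName: String) {
    startPlayback(partMode: .all,                 // ignore the Play Current/Play All toggle
                  excludePartName: partName,      // never double the part being recorded
                  playRecordedPartTone: monitorRecordingPart)
}
```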

### 9.11 iOS-Specific: Route Change Handling

This doesn't exist in the macOS code and needs to be added for iOS. When headphones are plugged/unplugged or AirPods connect/disconnect, `AVAudioEngine` resets itself. All player nodes stop and all scheduled audio is purged. You need:

```swift
#if os(iOS)
NotificationCenter.default.addObserver(
    forName: AVAudioSession.routeChangeNotification,
    object: nil, queue: .main
) { [weak self] notification in
    guard let reason = notification.userInfo?[AVAudioSessionRouteChangeReasonKey] as? UInt,
          reason == AVAudioSession.RouteChangeReason.oldDeviceUnavailable.rawValue else { return }
    // Headphones unplugged — stop playback gracefully
    self?.stopPlayback()
    DispatchQueue.main.async {
        self?.onPlaybackStopped?()
    }
}
#endif
```

Also handle interruptions (phone calls):
```swift
#if os(iOS)
NotificationCenter.default.addObserver(
    forName: AVAudioSession.interruptionNotification,
    object: nil, queue: .main
) { [weak self] notification in
    guard let type = notification.userInfo?[AVAudioSessionInterruptionTypeKey] as? UInt else { return }
    if type == AVAudioSession.InterruptionType.began.rawValue {
        self?.stopPlayback()
        DispatchQueue.main.async {
            self?.onPlaybackStopped?()
        }
    }
}
#endif
```

### 9.12 iOS UI Design Recommendations

**For iPhone:** Use a vertically-stacked layout. Top area = sheet music (scrollable), bottom area = transport controls. The macOS toolbar has too many controls for a phone — split them across a bottom bar (play/stop/record) and a settings gear icon (count-in, metronome, monitor toggles). The part picker and play-all/play-current toggle can be a segmented control above the sheet music.

**For iPad:** The macOS layout works almost as-is. Consider keeping the horizontal toolbar but using a sidebar or popover for settings toggles. The sheet music view can be much larger.

**Sheet music orientation:** Consider defaulting to landscape on iPhone. A phone in landscape gives you roughly 667×375 points — enough for 4-5 measures per system with the current 120pt measure width. Portrait gives only 2-3 measures. You could lock to landscape during playback/recording, or just let the user choose.

**Pinch-to-zoom:** Worth implementing for the sheet music. The `LayoutMetrics` constants (`staffLineSpacing`, `measureMinWidth`, etc.) could be multiplied by a zoom factor. Store it in a `@State` var and update via `MagnificationGesture`.
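
A minimal sketch of that zoom state, assuming a zoom factor threaded into the layout (the wrapper view and clamp range are hypothetical):

```swift
import SwiftUI

// Hypothetical zoom wrapper: hands the effective zoom factor to the content
// closure, which re-derives LayoutMetrics from it.
struct ZoomableSheetMusic<Content: View>: View {
    @State private var zoom: CGFloat = 1.0
    @GestureState private var pinch: CGFloat = 1.0
    let content: (CGFloat) -> Content

    var body: some View {
        content(zoom * pinch)
            .gesture(
                MagnificationGesture()
                    .updating($pinch) { value, state, _ in state = value }
                    .onEnded { value in zoom = (zoom * value).clamped(to: 0.5...3.0) }
            )
    }
}

private extension CGFloat {
    func clamped(to range: ClosedRange<CGFloat>) -> CGFloat {
        Swift.min(Swift.max(self, range.lowerBound), range.upperBound)
    }
}
```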

### 9.13 Project Setup Steps (for Xcode)

Here's exactly how to add the iOS target to the existing project:

1. In Xcode, File → New → Target → iOS → App
2. Product Name: "Acapella" (or "Acapella iOS" to distinguish)
3. Interface: SwiftUI, Language: Swift
4. This creates a new target with its own Info.plist and entry point

5. **Set target membership on shared files** (check BOTH targets):
   - All files in `Models/` (the 9 unchanged models plus `Part.swift` with its new UIKit color block)
   - All files in `Parsing/` (2 files)
   - `Audio/AudioEngine.swift`
   - `Audio/ToneGenerator.swift`
   - `Bundle/AcapellaBundle.swift`
   - `Utilities/Constants.swift`
   - `Utilities/FrequencyTable.swift`

6. **Keep macOS-only** (check only macOS target):
   - `AcapellaApp.swift`
   - `UI/MainWindow.swift`
   - `UI/Toolbar.swift`
   - `UI/LegendBar.swift`
   - `UI/SheetMusicViewContainer.swift`
   - `SheetMusic/SheetMusicView.swift`
   - `SheetMusic/NoteRenderer.swift`
   - `SheetMusic/StaffRenderer.swift`
   - `SheetMusic/SelectionRenderer.swift`
   - `SheetMusic/LayoutMetrics.swift` (has AppKit import, OR make it shared with conditional imports)

7. **Create new iOS-only files** (check only iOS target):
   - `iOS/AcapellaApp_iOS.swift`
   - `iOS/MainView_iOS.swift`
   - `iOS/ToolbarView_iOS.swift`
   - `iOS/LegendBar_iOS.swift`
   - `iOS/SheetMusic/SheetMusicView_iOS.swift`
   - `iOS/SheetMusic/SheetMusicViewContainer_iOS.swift`
   - `iOS/SheetMusic/NoteRenderer_iOS.swift`
   - `iOS/SheetMusic/StaffRenderer_iOS.swift`
   - `iOS/SheetMusic/SelectionRenderer_iOS.swift`

8. **iOS target capabilities** (Signing & Capabilities tab):
   - Background Modes → Audio, AirPlay, and Picture in Picture

9. **iOS target Info tab:**
   - Add `NSMicrophoneUsageDescription`: "Acapella needs microphone access to record your vocal parts."

### 9.14 File Organization Note

The existing macOS files don't need to be renamed or moved. You're just adding target membership and creating NEW files for the iOS-specific code. The macOS app continues to work exactly as before throughout this process.

### 9.15 Testing Strategy

Since you have `xcodebuild` and the iOS simulator available:
1. After each phase, build for iOS Simulator to catch compile errors
2. Run in simulator to verify UI layout (simulator can't test mic recording but can test playback and UI)
3. For recording testing, you'll need to deploy to a physical device
4. Test on both iPhone SE (smallest) and iPad (largest) simulator sizes to verify adaptive layout
