KanaShark is a SwiftPM package that implements a new Japanese input method for small-screen devices running iOS and watchOS. It uses SHARK2-based word prediction on gestures that trace consonant keys, enabling fast Japanese text input.
GitHub Repository: https://github.com/toyoshin5/KanaShark
Paper: https://www.wiss.org/WISS2025Proceedings/data/paper/22.pdf
Japanese input on small touchscreens is challenging and often relies on voice input or candidate selection. Flick keyboards also suffer from the "fat finger" problem because of their small keys. Gesture input, which enters an entire word with a single stroke, is attracting attention as an efficient alternative. This project applies a consonant-tracing approach combined with SHARK2 technology to Japanese input.
Supported Platforms
- iOS 17 or later
- watchOS 10 or later
You can add KanaShark as a dependency using Swift Package Manager in two ways:
Method 1: Using Xcode
- Go to File → Add Package Dependencies...
- Enter the following URL: `https://github.com/toyoshin5/KanaShark`
- Under Add to Targets, select the target you want to add KanaShark to (e.g. `MyApp`)
- Click Add Package
Method 2: Edit Package.swift Directly
Add the following to your `dependencies` array:

```swift
dependencies: [
    .package(url: "https://github.com/toyoshin5/KanaShark", from: "0.1.0")
]
```

You can use Japanese gesture input simply by displaying `GestureKeyboardView` in SwiftUI.
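For a hand-maintained manifest, the package also has to be listed in a target's dependencies so that `import KanaShark` resolves. A minimal sketch, assuming a hypothetical target named `MyApp` (the platform versions match the Supported Platforms section above):

```swift
// swift-tools-version: 5.9
// Hypothetical minimal manifest; the target name "MyApp" is an assumption.
import PackageDescription

let package = Package(
    name: "MyApp",
    platforms: [.iOS(.v17), .watchOS(.v10)],
    dependencies: [
        .package(url: "https://github.com/toyoshin5/KanaShark", from: "0.1.0")
    ],
    targets: [
        .target(
            name: "MyApp",
            dependencies: ["KanaShark"] // makes `import KanaShark` available
        )
    ]
)
```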
```swift
import SwiftUI
import KanaShark

struct ContentView: View {
    var body: some View {
        GestureKeyboardView(
            hiraganaPositions: .default, // Hiragana layout (default recommended)
            minConfidence: 0.001, // Confidence threshold for candidate generation
            style: GestureKeyboardStyle( // Keyboard appearance
                font: .system(size: 18, weight: .bold),
                textColor: .primary,
                traceColor: .primary.opacity(0.5),
                traceLineWidth: 8,
                loadingIndicatorColor: .primary
            ),
            scoringConfig: GestureKeyboardScoringConfig( // Scoring configuration
                sigmaShape: 0.008,
                sigmaLocation: 12
            ),
            onGestureStarted: {
                // Callback when gesture starts
            },
            onGestureEnded: { points in
                // Callback when gesture ends (receives array of trace points)
            },
            onCandidatesGenerated: { results in
                // Receives candidate results (array of GestureKeyboardResult)
                for (index, result) in results.prefix(10).enumerated() {
                    print("Result \(index): \(result.text), Confidence: \(result.confidence)")
                }
            }
        )
        .frame(width: 200, height: 200)
    }
}
```

Arguments:
- `hiraganaPositions`: Hiragana layout on the keyboard (default recommended)
- `minConfidence`: Confidence threshold for candidate generation (smaller = more candidates)
- `style`: Keyboard appearance (font, color, line width, etc.)
- `scoringConfig`: Configuration for the scoring engine (`sigmaShape`, `sigmaLocation`)
- `onGestureStarted`: Callback when the gesture starts
- `onGestureEnded`: Callback when the gesture ends (array of trace points)
- `onCandidatesGenerated`: Callback when candidates are generated (result array)
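As a usage sketch, the candidate callback can drive a simple SwiftUI candidate bar. The `text` and `confidence` properties of `GestureKeyboardResult` come from the example above; everything else here, including the assumption that the omitted parameters have defaults, is illustrative:

```swift
import SwiftUI
import KanaShark

// Illustrative sketch: assumes GestureKeyboardView's style, scoring, and
// unused callbacks have default values, which is an assumption.
struct CandidateBarDemo: View {
    // Candidates from the most recent gesture
    @State private var candidates: [GestureKeyboardResult] = []

    var body: some View {
        VStack {
            // Show the five best candidates as tappable buttons
            ScrollView(.horizontal) {
                HStack {
                    ForEach(Array(candidates.prefix(5).enumerated()), id: \.offset) { _, result in
                        Button(result.text) {
                            // Insert result.text into your text field here
                        }
                    }
                }
            }
            GestureKeyboardView(
                hiraganaPositions: .default,
                minConfidence: 0.001,
                onCandidatesGenerated: { results in
                    candidates = results // already sorted by confidence
                }
            )
            .frame(width: 200, height: 200)
        }
    }
}
```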
- After gesture input, calculate the likelihood of each word in the dictionary based on the trajectory
- Return word candidates with likelihood above a threshold, sorted by likelihood
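The SHARK2-style scoring described above can be illustrated with a simplified sketch (not KanaShark's actual formula): a candidate word's likelihood combines a shape term and a location term, each a Gaussian of the distance between the drawn trace and the word's ideal trace, with widths controlled by parameters like `sigmaShape` and `sigmaLocation`:

```swift
import Foundation

// Simplified SHARK2-style likelihood sketch: NOT the package's exact formula.
// Assumes the gesture and each word template are already resampled to the
// same number of points.
struct TracePoint { var x: Double; var y: Double }

func gaussian(_ distance: Double, sigma: Double) -> Double {
    exp(-(distance * distance) / (2 * sigma * sigma))
}

func meanDistance(_ a: [TracePoint], _ b: [TracePoint]) -> Double {
    precondition(a.count == b.count && !a.isEmpty)
    let total = zip(a, b).reduce(0.0) { sum, pair in
        sum + hypot(pair.0.x - pair.1.x, pair.0.y - pair.1.y)
    }
    return total / Double(a.count)
}

// Center at the origin and scale to unit size (shape channel: ignores
// where on the keyboard the stroke was drawn).
func normalized(_ pts: [TracePoint]) -> [TracePoint] {
    let cx = pts.map(\.x).reduce(0, +) / Double(pts.count)
    let cy = pts.map(\.y).reduce(0, +) / Double(pts.count)
    let centered = pts.map { TracePoint(x: $0.x - cx, y: $0.y - cy) }
    let maxR = centered.map { hypot($0.x, $0.y) }.max() ?? 1
    let scale = maxR > 0 ? maxR : 1
    return centered.map { TracePoint(x: $0.x / scale, y: $0.y / scale) }
}

func likelihood(trace: [TracePoint], template: [TracePoint],
                sigmaShape: Double, sigmaLocation: Double) -> Double {
    // Shape channel: compare translation/scale-invariant forms
    let shapeScore = gaussian(meanDistance(normalized(trace), normalized(template)),
                              sigma: sigmaShape)
    // Location channel: compare absolute positions on the keyboard
    let locationScore = gaussian(meanDistance(trace, template),
                                 sigma: sigmaLocation)
    return shapeScore * locationScore
}
```

In this sketch, candidates whose likelihood exceeds the threshold would then be sorted in descending order and returned, matching the behavior described above.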
- Uses Japanese Web Corpus 2010
- 100,000 entries each for 1-gram/2-gram, 20,000 each for 3-gram/4-gram, 10,000 for 5-gram, totaling 250,000 words
- Readings generated with MeCab
- 4-gram/5-gram words are for fixed phrases
- Adjusted the weight formula for scoring.
- SHARK2: Kristensson and Zhai, "SHARK2: A Large Vocabulary Shorthand Writing System for Pen-based Computers," UIST 2004 (ACM)