
Will Apple Reject Apps That Read MacBook Lid-Angle Sensor via Private APIs?
Hey, I’m working on a macOS app that detects the MacBook lid/hinge angle (i.e., how far the screen is open) by reading the internal sensor directly via HID/IOKit, which relies on private/undocumented behavior. I came across this project: LidAngleSensor (GitHub: https://github.com/samhenrigold/LidAngleSensor?tab=readme-ov-file). Before investing too much effort, I’d like to ask the community:

- Has anyone succeeded in getting an app accepted on the Mac App Store when it includes sensor-level, private API access like this?
- What was the reviewer feedback, or what were the rejection reasons (if any)?
- Are there documented cases (positive or negative) where Apple approved or rejected apps for accessing non-public hardware sensors?
- What is the risk of getting banned or permanently rejected for integrating this kind of functionality?

If you have direct experience (whether it passed or failed), I’d love to hear your stories, strategies, or pointers. Thanks in advance!
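For concreteness, here is the kind of access I mean: a minimal sketch modeled on the linked project. The usage page/usage values and the report layout are my assumptions from reading that repo, not anything Apple documents, and my understanding is that the App Sandbox may block this sort of raw HID access even before review becomes a question.

```swift
import IOKit.hid

// Sketch of the LidAngleSensor approach, as I understand it. Assumptions
// (not documented by Apple): the hinge sensor is a HID device on the Sensor
// usage page (0x20) with usage 0x8A, and feature report 1 carries the angle
// in degrees as a 16-bit little-endian value at bytes 1-2.
func readLidAngle() -> Double? {
    let manager = IOHIDManagerCreate(kCFAllocatorDefault, IOOptionBits(kIOHIDOptionsTypeNone))
    let matching: [String: Any] = [
        kIOHIDPrimaryUsagePageKey: 0x20,  // Sensor usage page (assumption)
        kIOHIDPrimaryUsageKey: 0x8A       // hinge-angle usage (assumption)
    ]
    IOHIDManagerSetDeviceMatching(manager, matching as CFDictionary)
    guard IOHIDManagerOpen(manager, IOOptionBits(kIOHIDOptionsTypeNone)) == kIOReturnSuccess,
          let devices = IOHIDManagerCopyDevices(manager) as? Set<IOHIDDevice>,
          let device = devices.first else { return nil }

    var report = [UInt8](repeating: 0, count: 8)
    var length: CFIndex = report.count
    guard IOHIDDeviceGetReport(device, kIOHIDReportTypeFeature, 1, &report, &length) == kIOReturnSuccess
    else { return nil }

    // Interpret bytes 1-2 as the lid angle in degrees (assumption).
    return Double(UInt16(report[1]) | (UInt16(report[2]) << 8))
}
```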
Replies: 1 · Boosts: 0 · Views: 69 · Sep ’25
visionOS Widget Bug
While developing a widget on the visionOS 26 beta, I found that it does not work on a physical Vision Pro: the widget fails to render and shows the error "Please adopt container background API." Notably, the problem does not occur in the visionOS Simulator. Does anyone know the reason and a solution, or whether this is a visionOS bug that needs a Feedback report? Thank you!
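For reference, this is what adopting that API typically looks like: a minimal sketch with a placeholder view, where the relevant part is the containerBackground modifier on the widget’s entry view.

```swift
import SwiftUI
import WidgetKit

// Minimal sketch of adopting the container-background API the error asks for.
// The view and its content are placeholders.
struct MyWidgetEntryView: View {
    var body: some View {
        Text("Hello, visionOS")
            .containerBackground(for: .widget) {
                Color.clear  // any ShapeStyle or view content works here
            }
    }
}
```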
Replies: 1 · Boosts: 0 · Views: 369 · Sep ’25
visionOS plane anchor rotation and wall direction are inconsistent
I have a problem with wall plane detection using visionOS/ARKit: I am using ARKitSession’s PlaneDetectionProvider to detect walls in a visionOS immersive space. I record the position and rotation of the first detected plane, but the rotation value varies with the direction the user is facing when they enter the space. In other words, even for planes on the same wall, the rotation quaternion differs between sessions. I would like to recover the wall’s true direction regardless of which way the user is facing when the scan starts, so that virtual content can be accurately aligned with the wall. I have tried using anchor.originFromAnchorTransform and Transform.rotation directly, but the rotation value is still affected by the user’s initial orientation. I would also like to know whether the user’s initial orientation affects the position values as well. If so, please provide a solution. Thank you!
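For illustration, here is the kind of normalization I have been experimenting with: a sketch that assumes the plane’s local +Y axis is its normal (ARKit’s usual plane convention, which I am treating as an assumption for visionOS PlaneAnchor) and projects it onto the horizontal plane.

```swift
import ARKit
import simd

// Extract a horizontal wall direction from a detected plane anchor.
// Assumption: the plane's local +Y axis is its normal.
func wallDirection(of anchor: PlaneAnchor) -> SIMD3<Float> {
    let transform = anchor.originFromAnchorTransform
    // Column 1 is the anchor's local Y axis expressed in world coordinates.
    let normal = SIMD3<Float>(transform.columns.1.x,
                              transform.columns.1.y,
                              transform.columns.1.z)
    // Drop the vertical component so gravity alignment is exact.
    return simd_normalize(SIMD3<Float>(normal.x, 0, normal.z))
}
```

My expectation is that within one session this compares consistently across planes on the same wall, but across sessions the yaw of the world origin still differs, which is the part I am asking about.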
Replies: 1 · Boosts: 0 · Views: 456 · Sep ’25
Look to Scroll
Hello! I’m excited to see that Look to Scroll has been included in the visionOS 26 beta. I’m aiming to implement a feature where the user’s gaze at an edge of the view automatically scrolls toward that position. However, in my experiments with ScrollView I haven’t been able to trigger this behavior. Could you advise whether additional API modifiers are necessary? Thank you!
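For reference, this is my current attempt: a minimal sketch assuming scrollInputBehavior(.enabled, for: .look) is the opt-in modifier for Look to Scroll (my reading of the WWDC25 material; please correct me if the modifier or its arguments differ).

```swift
import SwiftUI

// Opt a ScrollView into gaze-driven scrolling on visionOS 26 (assumed API).
struct GazeScrollingList: View {
    var body: some View {
        ScrollView {
            VStack(spacing: 12) {
                ForEach(0..<50) { index in
                    Text("Row \(index)")
                }
            }
        }
        .scrollInputBehavior(.enabled, for: .look)
    }
}
```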
Replies: 1 · Boosts: 0 · Views: 529 · Jul ’25
View lifecycle in TabView
In a TabView, when I open a view in one tab and then switch to another tab, the first tab’s view does not end its lifecycle: work started by some of its functions keeps running in the background. I want the previously opened tab’s view to be torn down completely when I switch tabs. How can I do that? Thank you!
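For context, the workaround I am considering looks like this: a sketch (not an official lifecycle API) that drives each tab’s content from the selection and replaces hidden tabs with a placeholder, so SwiftUI removes the old view and cancels its .task work.

```swift
import SwiftUI

// Replace non-selected tab content so SwiftUI tears it down on switch.
struct RecreatingTabs: View {
    @State private var selection = 0

    var body: some View {
        TabView(selection: $selection) {
            Group {
                if selection == 0 { FirstTabView() } else { Color.clear }
            }
            .tabItem { Label("First", systemImage: "1.circle") }
            .tag(0)

            Group {
                if selection == 1 { SecondTabView() } else { Color.clear }
            }
            .tabItem { Label("Second", systemImage: "2.circle") }
            .tag(1)
        }
    }
}

struct FirstTabView: View {
    var body: some View {
        Text("First")
            .task {
                // Long-running work here is cancelled when the view is removed.
            }
    }
}

struct SecondTabView: View {
    var body: some View { Text("Second") }
}
```

The trade-off of this approach is that a hidden tab loses its view state on every switch.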
Replies: 0 · Boosts: 0 · Views: 140 · Jul ’25
Metal (Compositor Services) or RealityKit on visionOS
I am developing a visionOS app. I am now very interested in Metal and Compositor Services, but I have not explored them in depth. I know that Metal offers a greater degree of control. I am wondering whether using Compositor Services gives me fewer AR capabilities than RealityKit (such as scene reconstruction and understanding, hover effects, etc.).
Replies: 4 · Boosts: 0 · Views: 177 · Jun ’25
Join WWDC at Apple Park
Thank you very much for choosing me to attend WWDC at Apple Park. I am looking forward to the event, and I have a few questions.

I am a young Apple Developer Program member from China and a winner of the 2024 Swift Student Challenge. I am over 13 years old, and I used my own developer account, not my parents’, both to apply for WWDC activities and for everything else I do.

Since I am under 18, my parents may need to sign the Special Event Parental Permission Statement. Where can I find it? My parents will sign it. I also noticed that the bottom of the RSVP form requires me to affirm that I am at least 18 years old, but I am not, and I applied for WWDC with my own account. How should I handle this requirement?

I need a non-immigrant visa to travel to the United States, so I must show the visa officer that I have been invited by Apple. Could Apple send me a formal invitation letter as proof? Also, as a minor I will be traveling with my mother; could the invitation letter mention my guardian (my mother)?

P.S. I am quite independent, and I am well aware that WWDC attendance is limited. My mother will not enter the venue unless it is otherwise required.
Replies: 1 · Boosts: 0 · Views: 110 · Apr ’25
CompositorServices or RealityKit
I have been concentrating on developing visionOS applications. While I am quite familiar with RealityKit, Compositor Services has also caught my attention, though I have not learned it yet. Could you please clarify whether it is essential for me to learn Compositor Services? I would also appreciate insights into the respective advantages of RealityKit and Compositor Services.
Replies: 2 · Boosts: 0 · Views: 700 · Mar ’25
ARKit hand tracking
Hello, I am developing a visionOS application and am interested in obtaining detailed data about the user’s hands through ARKit, including but not limited to transforms and rotation angles. I have reviewed the Happy Beam sample, but it appears to only cover recognizing specific gestures. Could you please advise on how to obtain the transform and rotation angle of the user’s hand? Thank you.
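Here is how far I have gotten: a minimal sketch using HandTrackingProvider, with the joint names and properties taken from my reading of the visionOS ARKit docs.

```swift
import ARKit
import simd

// Stream hand anchors and compute a world-space joint transform.
func trackHands() async throws {
    let session = ARKitSession()
    let handTracking = HandTrackingProvider()
    try await session.run([handTracking])

    for await update in handTracking.anchorUpdates {
        let anchor = update.anchor

        // World-space transform of the hand anchor (at the wrist).
        let worldFromHand = anchor.originFromAnchorTransform

        // Per-joint transforms are relative to the hand anchor.
        if let skeleton = anchor.handSkeleton {
            let indexTip = skeleton.joint(.indexFingerTip)
            let worldFromJoint = worldFromHand * indexTip.anchorFromJointTransform

            // Rotation as a quaternion, translation from the last column.
            let rotation = simd_quatf(worldFromJoint)
            let position = SIMD3<Float>(worldFromJoint.columns.3.x,
                                        worldFromJoint.columns.3.y,
                                        worldFromJoint.columns.3.z)
            print("index tip:", position, rotation)
        }
    }
}
```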
Replies: 1 · Boosts: 0 · Views: 448 · Mar ’25
Wireless debugging
The charging port of my iPhone appears to be water-damaged: it can no longer charge or transfer data over a cable, and it only charges wirelessly, which carries no data. Because Xcode supports wireless debugging, I was able to keep testing my app. However, I recently moved to a new Mac that has no prior pairing record with this iPhone, which makes wireless debugging impossible. How can I set up wireless debugging on a device with no existing pairing record?
Replies: 2 · Boosts: 0 · Views: 439 · Feb ’25
App playground longer than 3 minutes for the Swift Student Challenge
I am currently preparing my submission for the Swift Student Challenge, and my app playground is quite comprehensive. Based on my estimations, it may take approximately 4 to 5.5 minutes for the reviewers to fully experience the interactive elements of my app. Every component is integral to the overall experience, and I would prefer not to remove any content, as each part not only contributes to the overall interactivity but also effectively demonstrates my abilities across different technical and creative domains. However, I noticed the guideline on https://developer.apple.com/swift-student-challenge/eligibility stating that the interactive scene should be “experienced within three minutes.” While this does not appear to be a main requirement, my app playground significantly exceeds this timeframe. Could you kindly clarify whether exceeding the three-minute guideline could result in my submission being rejected, or if it might negatively impact the evaluation process? I would greatly appreciate any insights you can provide. Thank you for your time and consideration. I look forward to your response.
Replies: 2 · Boosts: 0 · Views: 657 · Feb ’25