This information assumes you have already read through the https://docs.vrcft.io/ docs and may have been as confused as I was when I started!
This is also EXTREMELY subjective; I am speaking from my experience creating many ARKit edits of my own and other people’s models! This is written as the type of guide I personally would’ve appreciated when I started learning how to do this. It is not a step-by-step tutorial; that is why it is called “advice” and not “tutorial”. If you don’t have the drive to use Google and YouTube, you may find the amount of information here lacking.
I've done 4ish ARKit edits now, and learning how to do this stuff is SO incredibly intimidating and confusing when I really don't think it needs to be. It's quite an easy and quick process once you understand what you're doing!
These are the 2 vids I've made on the subject!
https://www.youtube.com/watch?v=KpoM6FDHGgI
https://www.youtube.com/watch?v=1dyfmbUW9y8
I think the most important thing to have is testing data from someone who has an iPhone. The two apps I’ve had people use are the Rokoko app and the iFacialMocap app. The Rokoko app does not record eye rotation, while iFacialMocap does. I believe iFacialMocap can only export .fbx files that are incompatible with Blender and require really annoying conversion methods, but I can’t use the app myself to verify that information @_@
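Once you have a recording, you'll want it in a form you can scrub through while testing your shapes. Export formats vary by app, so this is purely a hypothetical sketch: it assumes a simple CSV where the header row is ARKit shape names and each following row is one frame of 0–1 weights. The function name and the sample data are mine, not from any real app export.

```python
# Hypothetical sketch: loading recorded face-tracking data so you can
# test shapes without owning an iPhone yourself. Real app exports
# differ; this assumes a plain CSV with ARKit shape names as the
# header and one row of 0-1 weights per frame.
import csv
import io

def load_frames(text):
    """Parse CSV text into a list of {shape_name: weight} dicts, one per frame."""
    reader = csv.DictReader(io.StringIO(text))
    return [{name: float(value) for name, value in row.items()} for row in reader]

# Tiny made-up recording: two frames of two shapes.
sample = "jawOpen,eyeBlinkLeft\n0.0,0.1\n0.8,0.95\n"
frames = load_frames(sample)
print(frames[1]["eyeBlinkLeft"])  # -> 0.95
```

Having the data as per-frame dicts makes it easy to check things like "does eyeBlinkLeft ever actually reach 1.0 in this recording", which tells you how far your shape key needs to travel.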
I recommend doing ARKit shapes over Unified Expressions shapes because they are much simpler to do: you can build most of the shapes off of preexisting shape keys. The main negative of an ARKit setup is that it won't support the tongue movement and eye dilation that some headsets track, but I've found it's perfect for my Quest Pro and is perfect for VTubers!
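To make "build shapes off of preexisting shape keys" concrete, here is a rough sketch of the idea in plain Python. In Blender you'd do this through the shape key panel or bpy, but the math is just a weighted blend of per-vertex offsets. The key names ("vrc.v_aa", "mouthSmile") and the vertex data are made-up placeholders; your model's names will differ.

```python
# Sketch of blending existing shape keys into a new ARKit-style shape.
# A shape key here is just a list of per-vertex offsets (x, y, z).
def mix_shape_keys(keys, recipe):
    """Blend existing shape keys into a new one.

    keys:   {key_name: [(dx, dy, dz), ...]}, all the same vertex count
    recipe: {key_name: weight}, how much of each source key to use
    """
    vertex_count = len(next(iter(keys.values())))
    out = [(0.0, 0.0, 0.0)] * vertex_count
    for name, weight in recipe.items():
        out = [
            (ox + weight * dx, oy + weight * dy, oz + weight * dz)
            for (ox, oy, oz), (dx, dy, dz) in zip(out, keys[name])
        ]
    return out

# Example: a rough "jawOpen" built mostly from an "aa" viseme key.
existing = {
    "vrc.v_aa": [(0.0, -1.0, 0.0), (0.0, -0.5, 0.0)],  # made-up offsets
    "mouthSmile": [(0.2, 0.1, 0.0), (-0.2, 0.1, 0.0)],
}
jaw_open = mix_shape_keys(existing, {"vrc.v_aa": 0.8})
```

In practice you set the source keys to the weights you want, then use "New Shape From Mix" in Blender to bake the result, which is exactly this blend done for you.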
For VRChat, I highly recommend using Jerry's templates. They are very optimized and not something your average creator could make!
For Jerry's templates, if you want to modify your eye blink sensitivity, you'll need to go into the FX layer where my screenshot is showing and adjust the thresholds. It helps a lot with jittering!
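The reason two thresholds help with jittering: with a single cutoff, a noisy eyelid value hovering right at the cutoff flips the blink on and off every frame. Two thresholds with a gap between them (hysteresis) keep the state stable. This is just an illustration of that idea in Python, not Jerry's actual animator logic, and the threshold values are made up.

```python
# Sketch of hysteresis: the blink only closes above one threshold and
# only reopens below a lower one, so noise between the two thresholds
# can't cause rapid flipping.
def blink_states(values, close_at=0.8, open_at=0.6):
    """Return the blink state (True = closed) for each tracked eyelid value."""
    closed = False
    states = []
    for v in values:
        if not closed and v >= close_at:
            closed = True
        elif closed and v <= open_at:
            closed = False
        states.append(closed)
    return states

# A noisy signal hovering around 0.75 doesn't flicker the blink:
print(blink_states([0.70, 0.76, 0.74, 0.85, 0.78, 0.65, 0.55]))
# -> [False, False, False, True, True, True, False]
```

Widening the gap between the two thresholds makes the blink more stable but less responsive, which is the trade-off you're tuning when you adjust those transition values.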