unity3d - Vuforia Android SDK samples explained?
I'd appreciate it if someone could provide sources that would help me understand how the Vuforia samples for Android work. I've installed them and can run them on my phone, but it's difficult to understand the project structure.
It would be great if there were a tutorial on how to create the simplest AR app in Android Studio (not Unity).
I've learnt how to create AR scenes in Unity, export them to Android and run them on a device, but I still don't understand how to work with the exported project in Android Studio any further.
My goal is to have one image target and several 3D objects. I want a simple UI ListView to choose which object to place on the target.
Also, is it possible to build the ListView in Android and, on an item's click event, switch the 3D object in a single scene created in Unity? I know I can dynamically load 3D models from a Unity C# script, but can I trigger a function in that script via Android?
I'd appreciate any advice.
Summary:
First you upload your picture to Vuforia, which returns an .xml and a .dat file; these are stored in StreamingAssets. The .dat file contains the marker information in binary format. The .xml file contains information such as the name and size, and it is linked to a C# component.
Vuforia also lets you create a runtime marker or a cloud marker, but I shall leave those out for now; the idea remains the same.
When you run the app, the camera hardware CH (not the Unity camera, keep that distinction in mind) provides a feed. That feed is rendered onto a texture in the Unity scene, with a Unity camera UC facing the texture. Both are fixed in space, but the content of the texture is updated with what CH provides each frame. This is the "reality" part of the app.
At the same time, Vuforia scans the CH feed and performs pattern recognition (https://en.wikipedia.org/wiki/pattern_recognition), trying to find a match with the .dat file provided. When the pattern is found, it performs a second pass to determine the distance and rotation of the pattern with respect to CH. This is possible since the .xml file contains the dimensions of the real marker: if the .xml says 50x50 but the marker appears as 25x25, it is twice as small as expected, so the system understands how much further away the marker is.
When the marker is recognized, Vuforia calls the state listener on the DefaultTrackableEventHandler (check the script on the parent of your model), which implements this method:
public void OnTrackableStateChanged(TrackableBehaviour.Status previousStatus,
                                    TrackableBehaviour.Status newStatus)
{
    // Any of these statuses means the marker is currently visible to the camera.
    if (newStatus == TrackableBehaviour.Status.DETECTED ||
        newStatus == TrackableBehaviour.Status.TRACKED ||
        newStatus == TrackableBehaviour.Status.EXTENDED_TRACKED)
    {
        OnTrackingFound();
    }
    else
    {
        OnTrackingLost();
    }
}
Basically, if Vuforia detects a change, it calls this method. You can propagate the event further by making OnTrackingFound/OnTrackingLost public events you can register to, or by creating a new script that implements ITrackableEventHandler and listens for whether the model got found or lost. In the sample, when the target is found it shows the model, and vice versa, and that is the most basic scenario that can be triggered.
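As an illustration, here is a minimal sketch of such a custom handler, assuming the classic Vuforia Unity API (TrackableBehaviour / ITrackableEventHandler) and that the script sits on the same GameObject as the ImageTarget; the class and event names are my own:

using UnityEngine;
using Vuforia;

// Hypothetical handler that re-exposes the Vuforia state changes as plain C# events.
public class MyTrackableHandler : MonoBehaviour, ITrackableEventHandler
{
    public event System.Action TrackingFound;   // other scripts subscribe here
    public event System.Action TrackingLost;

    private TrackableBehaviour mTrackableBehaviour;

    void Start()
    {
        // Register this script with the TrackableBehaviour on the ImageTarget.
        mTrackableBehaviour = GetComponent<TrackableBehaviour>();
        if (mTrackableBehaviour != null)
            mTrackableBehaviour.RegisterTrackableEventHandler(this);
    }

    public void OnTrackableStateChanged(TrackableBehaviour.Status previousStatus,
                                        TrackableBehaviour.Status newStatus)
    {
        bool found = newStatus == TrackableBehaviour.Status.DETECTED ||
                     newStatus == TrackableBehaviour.Status.TRACKED ||
                     newStatus == TrackableBehaviour.Status.EXTENDED_TRACKED;

        if (found)
        {
            if (TrackingFound != null) TrackingFound();
        }
        else
        {
            if (TrackingLost != null) TrackingLost();
        }
    }
}

Any other script (for example one driven by your ListView selection) can then subscribe to TrackingFound/TrackingLost instead of touching the Vuforia scripts directly.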
The result of those calculations represents a transform (position, rotation). That transform is passed on to a second Unity camera in the scene. The coordinates are defined with (0,0,0) at the position of the marker, and the camera aims at the 3D model placed there. Note that you can place the model anywhere in the scene; its position only acts as an offset. The Vuforia camera cannot be controlled: if you try to pass a value to its transform it gets overwritten by Vuforia, so you are not meant to play with those values. You can, on the other hand, turn it on and off, which affects whether anything is rendered, and so on.
The first UC has a lower depth, so it renders the "real" scene first; the second camera is rendered on top, augmenting reality with the 3D model. With the appropriate layer masks set, the second camera ignores the rest of the scene and only the model is considered.
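A rough sketch of that layering setup, assuming a layer called "ARContent" for the model and explicit references to both cameras (neither of which the sample defines under these names):

using UnityEngine;

// Hypothetical setup: the background camera renders first (lower depth),
// the AR camera renders on top and only sees the assumed "ARContent" layer.
public class CameraLayeringSketch : MonoBehaviour
{
    public Camera backgroundCamera; // the UC facing the video texture
    public Camera arCamera;         // the Vuforia-driven camera

    void Start()
    {
        backgroundCamera.depth = 0;
        arCamera.depth = 1;

        arCamera.clearFlags = CameraClearFlags.Depth;           // keep the background feed visible
        arCamera.cullingMask = LayerMask.GetMask("ARContent");  // render only the model's layer
    }
}

In the actual samples this is configured in the Inspector rather than in code; the sketch is only meant to show which camera properties are involved.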
You do not want to play around with the background feed, but you surely want to interact with the model, and that works like any normal scene: grab the Camera component of the Vuforia camera and raycast, for example in its forward direction. Check the hit and act on it.
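A minimal sketch of that interaction, assuming the model has a Collider and that arCamera is a reference you set to the Vuforia camera's Camera component (both names are assumptions):

using UnityEngine;

// Hypothetical interaction script: raycast from the AR camera and react to what is hit.
public class ModelInteractionSketch : MonoBehaviour
{
    public Camera arCamera; // assumed reference to the Vuforia camera's Camera component

    void Update()
    {
        // On tap/click, cast a ray straight ahead from the camera
        // (use arCamera.ScreenPointToRay(Input.mousePosition) to raycast through the touch point instead).
        if (Input.GetMouseButtonDown(0))
        {
            RaycastHit hit;
            if (Physics.Raycast(arCamera.transform.position, arCamera.transform.forward, out hit))
            {
                Debug.Log("Hit " + hit.collider.name);
                // e.g. swap or highlight the model here
            }
        }
    }
}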