Following on from the previous project and the lessons learned, I wanted to continue testing to see if I could improve the tracking outdoors.
The first thing I needed to sort out was the tinted film, as I realised it was covering the sensors as well as the visor you look through. This was easily corrected by modifying the template for the film and cutting out a new version.
The first video was just to show the same house from last week, but without the tinted film over the sensors, to see how much of a difference it made. My conclusion was that it did not make much difference, which meant other factors were far more influential in causing the jitter.
The next step was to try putting physical objects in the field to give the depth cameras something to lock onto with a bit more height than the grass. I also added some printed patterns to aid the RGB camera, as I discovered it is also used to help with the tracking. As the depth cameras only see about 5-10m, I think the boxes were not close enough to me.
I then moved the boxes nearer to the device to see how much that helped. I was hoping it would make a bigger difference than it did, but the model still jumped around, though perhaps not quite as much as before.
My last test was to go to a building site and try placing the model over a house that was mid-build. This was (a) to give the depth cameras a lot more to lock onto, and (b) to show how you might use mixed reality to see what a house will look like in the exact position where it is being built.
My conclusion is that something else still seems to be responsible for a lot of the jittering, although I am sure the steps I have taken have helped somewhat.
Things to test next time include:
- Testing whether wifi helps. I believe wifi allows a bigger spatial mesh to be established and tied to a fixed reference without losing track of where you are (which causes the meshing to start again).
- Testing with much lower poly models to see what difference that makes.
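On the lower-poly idea: rather than re-exporting the model at different resolutions by hand, the polygon count can be reduced programmatically. The sketch below is a minimal vertex-clustering decimator in plain Python, purely my own illustration (it is not tied to any particular headset SDK or 3D tool): vertices that land in the same grid cell are merged into one, and any triangle that collapses as a result is dropped.

```python
def decimate_by_clustering(vertices, triangles, cell_size):
    """Reduce a triangle mesh's polygon count by snapping vertices onto a
    coarse grid and merging those that fall into the same grid cell.

    vertices:  list of (x, y, z) tuples
    triangles: list of (i, j, k) vertex-index tuples
    cell_size: edge length of the clustering grid; larger = fewer polys
    """
    cell_to_new = {}   # grid cell -> index of its merged vertex
    old_to_new = []    # old vertex index -> new vertex index
    new_vertices = []

    for x, y, z in vertices:
        cell = (round(x / cell_size), round(y / cell_size), round(z / cell_size))
        if cell not in cell_to_new:
            cell_to_new[cell] = len(new_vertices)
            # Place the merged vertex at the centre of its grid cell.
            new_vertices.append((cell[0] * cell_size,
                                 cell[1] * cell_size,
                                 cell[2] * cell_size))
        old_to_new.append(cell_to_new[cell])

    new_triangles = []
    for i, j, k in triangles:
        a, b, c = old_to_new[i], old_to_new[j], old_to_new[k]
        # Drop triangles whose corners merged into the same vertex.
        if a != b and b != c and a != c:
            new_triangles.append((a, b, c))

    return new_vertices, new_triangles
```

A coarser `cell_size` gives a cruder but much lighter mesh, so the same source model can be tested at several polygon counts to see whether mesh complexity is contributing to the jitter.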