Adding a model to the scene and rotating it using gesture manipulation
Conducted by Fraser Savage
Overview:
This initial experiment focussed on getting one of the generated models from a scan into Unity and on allowing the user to rotate the model using a manipulation gesture.
Process:
1. Initial set-up of the Unity project
During experimentation the version of Unity used to develop the demo was switched. At points where the two versions differ significantly, this will be noted. The two versions used were:
- Initially: Unity 5.4.0f3-HTP (HoloLens Technical Preview)
- Currently: Unity 5.5.0b11 (Unity 5.5 Beta 11)
The reason for switching to Unity 5.5 Beta 11 was the addition of HoloLens simulation to the Unity previewer, as well as a slightly less convoluted build process.
The first step in setting up the project was to clone the GitHub repository for HoloToolkit-Unity and open the folder within Unity. With the project open in the editor I used the menu option `Assets -> Export Package...` to export the HoloToolkit as a Unity package ready for importing.

After exporting the HoloToolkit-Unity package I could create a new Unity project and use the menu item `Assets -> Import Package -> Custom Package...` to import the HoloToolkit-Unity package into the new project. Note: while doing this I made sure to uncheck the "HoloToolkit-Examples" folder in the import dialogue, as that resource was not needed.
Before adding any additional assets to the project I set up the Main Camera for working with the HoloLens. To do this I made sure that the `Hierarchy` pane was empty. Once this was done I performed the following steps to set up the scene camera for HoloLens development:
- Added the `Main Camera` prefab found within `HoloToolkit/Utilities/Prefabs`.
- Clicked on the `Main Camera` game object in the `Hierarchy` pane and then selected `Add Component` in the `Inspector` pane.
- In the `Add Component` dialogue, typed "Manual Camera Control" into the search bar and added that script to the game object.
Then, using the `HoloToolkit` menu, I configured the scene and project:

- Selecting `HoloToolkit -> Configure -> Apply HoloLens Scene Settings`
- Selecting `HoloToolkit -> Configure -> Apply HoloLens Project Settings`
Next I got the extra assets from the source for the Microsoft Holographic tutorials Holograms 210 and Holograms 211. The assets borrowed from `Scripts` are:

- `GestureAction.cs`
- `Interactible.cs`
- `InteractibleManager.cs`
2. Adding the db28 sample model to the project
As the `.stl` and `.vtk` files provided are not directly usable within Unity, I needed to convert the meshes into `.obj` files so that they could be added. For this I used a program called MeshLab and performed the following for each mesh provided in the folder (in this case the db28 folder):
- Created a new project with the `File -> New Empty Project...` menu item.
- Imported the mesh using the `File -> Import Mesh...` menu item.
- Exported the mesh as an `.obj` file using the `File -> Export Mesh As...` menu item.
Once this was done for each mesh in the folder, I moved the exported `.obj` files into the `Assets/Holograms` folder of the Unity project. To add all the meshes to the project I created an empty game object using the `Create -> Create Empty` menu option available in the `Hierarchy` pane, naming it "KidneyCollection". Each object from db28 was then dragged from the `Assets/Holograms` folder in the `Project` pane onto the newly created game object to add each mesh as a child.
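The same parenting could also be done from a script rather than by dragging in the editor. A minimal sketch, assuming the mesh objects are already in the scene and assigned to the hypothetical `meshObjects` array:

```csharp
using UnityEngine;

public class KidneyCollectionBuilder : MonoBehaviour
{
    // Hypothetical field: the mesh objects to group, assigned in the Inspector
    public GameObject[] meshObjects;

    void Start()
    {
        // Create the empty parent object, mirroring Create -> Create Empty
        var collection = new GameObject("KidneyCollection");

        foreach (var mesh in meshObjects)
        {
            // The second argument (false) keeps each child's local transform values
            mesh.transform.SetParent(collection.transform, false);
        }
    }
}
```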
To add different materials to each part of the "KidneyCollection" object I then created a `Materials` folder in `Assets/Holograms` and created a separate material for each child game object by right-clicking on the `Project` pane and selecting `Create -> Material`. The materials were then added to the game objects by selecting each of them in the `Hierarchy` pane and dragging the material from the `Project` pane to the `Inspector` pane.
Finally, to scale the "KidneyCollection" and place it at a better location, I selected the "KidneyCollection" object in the `Hierarchy` pane and then set the values of the `Transform` component to:

- Position: (0, -0.75, 2)
- Rotation: (-90, 0, 180)
- Scale: (0.005, 0.005, 0.005)
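The same values could equally be applied from a script instead of the Inspector; a minimal sketch, assuming the script is attached to the "KidneyCollection" object:

```csharp
using UnityEngine;

public class KidneyCollectionPlacement : MonoBehaviour
{
    void Start()
    {
        // Mirror the Transform values set in the Inspector
        transform.localPosition = new Vector3(0f, -0.75f, 2f);
        transform.localEulerAngles = new Vector3(-90f, 0f, 180f);
        transform.localScale = new Vector3(0.005f, 0.005f, 0.005f);
    }
}
```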
3. Setting up lighting
The next step was to add lighting to the scene so that the materials rendered as intended during runtime. Setting up lighting was a case of creating another empty game object in the scene using Create -> Create Empty
through the Hierarchy
pane again, this time calling it "Lights". Adding the lights to this object was a case of right-clicking on the newly created object in the Hierarchy
pane and selecting Light -> Directional Light
. I did this three times, with the directional light child objects named "Light 1", "Light 2" and "Light 3". Each light was set to point at the "KidneyCollection" object from different positions and at different angles.
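The light rig could also be built in code. A hedged sketch, assuming a `target` field pointing at the "KidneyCollection" object and illustrative (not the actual) light positions:

```csharp
using UnityEngine;

public class LightRig : MonoBehaviour
{
    // Hypothetical field: the object the lights should point at
    public Transform target;

    void Start()
    {
        // Illustrative positions only; the actual experiment placed lights by hand
        var positions = new[]
        {
            new Vector3(2f, 2f, 0f),
            new Vector3(-2f, 2f, 2f),
            new Vector3(0f, -1f, 4f)
        };

        for (var i = 0; i < positions.Length; i++)
        {
            // Create each light as a child of this "Lights" object
            var lightObject = new GameObject("Light " + (i + 1));
            lightObject.transform.SetParent(transform, false);
            lightObject.transform.position = positions[i];

            var light = lightObject.AddComponent<Light>();
            light.type = LightType.Directional;

            // Aim the light at the target; for directional lights only the
            // resulting rotation matters, not the position
            lightObject.transform.LookAt(target);
        }
    }
}
```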
4. Adding the manager scripts
Following the set-up of the "Lights" object I moved on to create an empty game object called "Managers" through `Create -> Create Empty`. Then, using the `Add Component` dialogue, I added the following scripts to the object:

- Gaze Manager (`GazeManager.cs`)
- Interactible Manager (`InteractibleManager.cs`)
- Gaze Stabilizer (`GazeStabilizer.cs`)
- Gesture Manager (`GestureManager.cs`)
- Hands Manager (`HandsManager.cs`)
5. Setting up the cursor
Before setting up the gesture for interaction with the "KidneyCollection" object I needed to set up the gaze cursor. To do this I first dragged the "Cursor" prefab from `HoloToolkit/Input/Prefabs` into the `Hierarchy` pane, making sure the `CursorManager.cs` component in the `Inspector` pane correctly had the `CursorOnHolograms` and `CursorOffHolograms` objects associated with it.
Once the "Cursor" object was added to the scene I added the Cursor Feedback.cs
component using the Add Component
dialogue in the Inspector
pane. Then to give the "Feedback Parent" an object I created an new game object called "CursorBillboard" through Create -> Create Empty
and added the Billboard.cs
script as a component of the object, setting this new object as the "Feedback Parent".
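For context, a billboard script of this kind typically rotates its object every frame so that it faces the user. A minimal sketch of the idea (not the HoloToolkit implementation):

```csharp
using UnityEngine;

public class SimpleBillboard : MonoBehaviour
{
    void Update()
    {
        // Orient the object so its forward axis points away from the camera,
        // keeping its front face visible to the user
        transform.rotation = Quaternion.LookRotation(
            transform.position - Camera.main.transform.position);
    }
}
```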
6. Adding the components to the KidneyCollection
The last task for this experiment was to get the gesture manipulation working with the "KidneyCollection" object. To do this I needed to add four components to the object using the `Add Component` dialogue:

- A `Mesh Collider`
- A `Mesh Renderer`
- The `Interactible.cs` script
- The `GestureAction.cs` script
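The same four components could be attached from a script; a minimal sketch, assuming it runs on the "KidneyCollection" object and that the `Interactible` and `GestureAction` classes from the Holograms 210/211 assets are in the project:

```csharp
using UnityEngine;

public class KidneyCollectionSetup : MonoBehaviour
{
    void Start()
    {
        // Mirror the four components added through the Add Component dialogue
        gameObject.AddComponent<MeshCollider>();
        gameObject.AddComponent<MeshRenderer>();
        gameObject.AddComponent<Interactible>();   // from the Holograms 210/211 assets
        gameObject.AddComponent<GestureAction>();  // from the Holograms 210/211 assets
    }
}
```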
All of the script files are available on the KidneyRotationExperiment branch, but the function that handles the rotation is as follows:
```csharp
private void PerformRotation()
{
    if (GestureManager.Instance.ManipulationInProgress)
    {
        // Get rotation values from the manipulation gesture's offset
        var horizontal_rotation = GestureManager.Instance.ManipulationOffset.x * RotationSensitivity;
        var vertical_rotation = GestureManager.Instance.ManipulationOffset.y * RotationSensitivity;

        // Rotate around the z axis using the horizontal offset
        transform.Rotate(new Vector3(0, 0, -1 * horizontal_rotation));

        // Rotate around the right axis through the collider's centre using the vertical offset
        transform.RotateAround(GetComponent<Collider>().bounds.center, Vector3.right, vertical_rotation);
    }
}
```
7. Building the project and running
- Select `File -> Build Settings`
- Select `Windows Store`
- Set the SDK to `Universal 10`

To build the project on Unity 5.5.0b11 this additional step was taken:

- Set Target Device to `HoloLens`

If building on Unity 5.4.0f3-HTP that step would be skipped. In both cases I would continue:

- Set UWP Build Type to `D3D`
- Set "Build and Run on" to `Local Machine`
- Check `Unity C# Projects`
- Click `Player Settings`, then check `Virtual Reality Supported` and select `Windows Holographic`
- Click `Build`
To run the project, I would then open the `.sln` file within the build directory and do the following:

- Change the solution configuration from `Debug` to `Release`
- Change the solution platform from `ARM` to `x86`
- Change the run target from `Local Machine` to `HoloLens Emulator 10.0.xxxxx.x`
- Select `Debug -> Start Without Debugging`
8. Conclusion
From this experiment I learnt about some of the properties the Gesture Manager exposes, giving me more options when creating gesture manipulations. One key finding was that the `ManipulationOffset` property was well suited to driving the rotation manipulation. The experiment also gave me extra perspective on which manipulations feel intuitive, and made me reconsider which aspects of rotation are needed for usability.