It has been over a month since I received my Master of Science degree from Delft University of Technology. I have somewhat recovered from the sleep deprivation that comes with a graduation project, so now it is time to show you what I worked on.
Human anatomy is inherently complex, and people have been trying to understand it since the late Bronze Age. Resources such as books and software exist to train students in their knowledge of the human body. Most books focus on specific parts of the body or try to give a general overview. Images in books present the information from a single viewpoint, since interaction is not possible. Several software tools have been developed to illustrate anatomy. The aims of these tools are diverse, including education, anatomical research, surgical training and pre-operative planning.
Let me introduce the Online Anatomical Human (OAH for short), an online browser and annotation system for real human anatomy (Figure 1). It makes 2D medical image data, and the 3D models that were created from it, available to everyone with an Internet connection. The system runs entirely inside a web browser, directly from the web. There is no need to install any plugins or other software.
Figure 1. The Online Anatomical Human browsing system for human anatomy. A 3D model of the human pelvis is shown alongside three orthogonal 2D views containing medical imaging data.
The application functions as an educational tool where users can not only retrieve, but also add and share information. Its main contribution is that it is the first system of its kind to offer real anatomical data in an online environment, with existing linked knowledge and the possibility to add new information. I describe this data as real anatomical data because it is obtained from medical imaging data and is not based on an idealized average anatomy.
Besides tools for exploring the data, an editor is available for annotations, which can be added directly to the 3D mesh to enrich the model.
The following types of annotations are available in OAH:

- landmark points
- regions, painted with a brush
- (closed) line segments
Figure 2 shows the annotation view with an annotated human pelvic bone. In this case, the main regions of the pelvic bone are annotated and a landmark point is placed.
Figure 2. An annotated human pelvic bone in the Online Anatomical Human.
The difficulty with annotations lies in the fact that they are placed on a 3D surface. For single points this is not a problem, as long as the point where the marker must be placed is visible in the current view. For regions and (closed) line segments, solutions are less trivial. For brushes we do not just need one point, but a range of points within a certain radius. For lines we need a strip of points to follow the curvature of the model.
For the region annotations I used a forward-search algorithm. First, the point on the mesh directly below the cursor is taken. Then the connected vertices within the selected radius are found using a forward search similar to Dijkstra's algorithm. Because the search only travels along mesh edges, only points on connected parts of the surface get colored. This prevents discontinuities in the coloring process, as you can see in Figure 3.
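The idea can be sketched as follows. This is a minimal illustration, not the OAH source: I assume the mesh is given as an array of vertex positions plus an adjacency list, and I use a linear scan instead of a priority queue to keep the sketch short.

```typescript
type Vec3 = [number, number, number];

function dist(a: Vec3, b: Vec3): number {
  return Math.hypot(a[0] - b[0], a[1] - b[1], a[2] - b[2]);
}

// Dijkstra-style forward search: starting from the vertex under the
// cursor, expand along mesh edges and keep every vertex whose
// accumulated edge-path distance stays within the brush radius.
// Disconnected surface parts are never reached, so they stay uncolored.
function selectRegion(
  positions: Vec3[],
  adjacency: number[][],
  seed: number,
  radius: number
): Set<number> {
  const best = new Map<number, number>([[seed, 0]]);
  const queue: number[] = [seed];
  while (queue.length > 0) {
    // Pop the closest frontier vertex (a heap would be faster; a
    // linear scan keeps this sketch short).
    let i = 0;
    for (let j = 1; j < queue.length; j++) {
      if (best.get(queue[j])! < best.get(queue[i])!) i = j;
    }
    const v = queue.splice(i, 1)[0];
    const dv = best.get(v)!;
    for (const n of adjacency[v]) {
      const dn = dv + dist(positions[v], positions[n]);
      if (dn <= radius && dn < (best.get(n) ?? Infinity)) {
        best.set(n, dn);
        queue.push(n);
      }
    }
  }
  return new Set(best.keys());
}

// Example: a strip 0-1-2 plus a disconnected edge 3-4.
const positions: Vec3[] = [[0, 0, 0], [1, 0, 0], [2, 0, 0], [10, 0, 0], [11, 0, 0]];
const adjacency = [[1], [0, 2], [1], [4], [3]];
const picked = selectRegion(positions, adjacency, 0, 1.5);
// Vertices 0 and 1 are selected; vertex 2 lies beyond the radius and
// the disconnected pair 3-4 is never reached.
```

Because the search measures distance along the surface rather than straight through space, a brush on one side of a thin bone does not bleed through to the other side.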
Figure 3. When drawing a region in OAH, disconnected parts do not get colored.
To find the initial point below the cursor I used an off-screen rendering pass. With this method, the mesh is rendered a second time in the background, with each triangle drawn in a unique color. Figure 4 shows what this would look like if rendered to the screen. To retrieve the triangle below the cursor, we can simply check the color of the corresponding pixel in the background rendering. This is much faster than raycasting techniques, and speed was a high priority in the web environment.
Figure 4. Each face of the model is rendered with a distinct color. By checking the color value directly below the cursor, you know the corresponding face.
In order not to make this article into a small book, I won't go into too much detail for now on the algorithms behind the line annotations. If you'd like me to write a separate post about it, let me know in the comments. In short, I wanted to give users of OAH the possibility to draw perfectly straight lines directly onto a 3D mesh. Figure 5 shows a result. From the angle the line was drawn at, it looks perfectly straight, but as you rotate the mesh, you see that it actually follows the curvature of the surface.
Figure 5. Left: the drawn line looks perfectly straight from the drawing angle. Right: a rotated view shows that the line follows the curvature of the mesh.
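One way to realize this effect (a sketch of the general idea, not necessarily the OAH implementation) is to sample the straight 2D line in screen space and project each sample onto the mesh with a picking function, such as the color-ID lookup described above. The resulting 3D polyline is straight from the drawing viewpoint but follows the surface in 3D.

```typescript
type Vec2 = [number, number];
type Vec3 = [number, number, number];

// Sample the straight screen-space line from `start` to `end` and
// project each sample onto the mesh. `pick` maps a screen position to
// the 3D surface point under it (or null if the cursor misses the mesh).
function projectLineOntoMesh(
  start: Vec2,
  end: Vec2,
  samples: number,
  pick: (p: Vec2) => Vec3 | null
): Vec3[] {
  const points: Vec3[] = [];
  for (let i = 0; i <= samples; i++) {
    const t = i / samples;
    const hit = pick([
      start[0] + t * (end[0] - start[0]),
      start[1] + t * (end[1] - start[1]),
    ]);
    if (hit) points.push(hit); // skip samples that miss the mesh
  }
  return points;
}
```

Viewed from the original camera, every sample projects back onto the straight 2D line, so the annotation looks perfectly straight; from any other angle the varying depth of the surface points reveals the curvature.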
When I got my hands on a Leap Motion device, I just had to try and see whether I could make it work within the application. The Leap Motion is a computer hardware sensor that takes hand and finger motions as input and does not require the user to touch anything. This was originally not even part of the project description, but sometimes trying something new at random can lead to beautiful things. In a short time, a basic prototype application was built showing a mesh of a pelvic bone. The Leap Motion can be used to control the camera and to paint on the mesh with a brush. I put a short video showing this interaction online. The reactions to this video were overwhelming; Leap Motion even covered it in their newsletter. At that moment I knew this was something that I needed to explore further. The result was Leap Mesh Painter, about which I wrote before on this blog. Although it started out as a mere proof-of-concept application, it has become a small project of its own.
OAH can be controlled with a mouse, Leap Motion, or both. The latter is mainly useful for annotation. You can use one hand in the air to rotate the model and zoom in or out. The other hand controls the mouse to do the actual annotation.
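A hypothetical mapping from a tracked palm position to camera motion could look like this. The state shape, axis assignments and sensitivity value below are illustrative assumptions, not the actual OAH bindings:

```typescript
interface CameraState {
  yaw: number;      // horizontal orbit angle, radians
  pitch: number;    // vertical orbit angle, radians
  distance: number; // distance from camera to the model
}

// Map a palm position (normalized to roughly [-1, 1] per axis) to an
// updated orbit camera: x/y rotate, z (pushing toward or away from the
// sensor) zooms. Called once per tracking frame.
function updateCamera(
  cam: CameraState,
  palm: [number, number, number],
  sensitivity = 0.05
): CameraState {
  return {
    yaw: cam.yaw + palm[0] * sensitivity,
    pitch: cam.pitch + palm[1] * sensitivity,
    // Clamp so the camera can never pass through the model.
    distance: Math.max(0.1, cam.distance * (1 + palm[2] * sensitivity)),
  };
}
```

Applying small increments per frame, rather than mapping the hand position to an absolute camera pose, keeps the view steady when the hand trembles slightly.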
The technique can be used for other applications as well. In the operating room, for instance, computers are often used to display information about the patient and medical imaging data such as pre-operative scans. When images on this screen need to be rotated or a different view is needed, the surgeon has to ask someone else to do it, because the surgeon's hands must remain sterile. With the Leap Motion, the surgeon would be able to control the screen directly, without touching any input devices.
In order to make Leap Mesh Painter controllable without ever needing a mouse, I had to think of ways to change the properties of the brush, such as the color and the diameter. To do this, I built in gestures to open these controls, one of which is Leap Color Picker. This color picker can be controlled by moving your hand in the air. A detailed description and tutorial to build your own Leap-supported color picker is available here.
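The core of such a picker is mapping a hand coordinate to a color. A minimal sketch, where the mapping from hand position to hue is my own illustrative assumption (the tutorial describes the actual gestures): the normalized horizontal palm position sweeps the hue wheel, and a standard HSV-to-RGB conversion produces the brush color.

```typescript
// Standard HSV-to-RGB conversion: h in [0, 360), s and v in [0, 1],
// returning 8-bit RGB channels.
function hsvToRgb(h: number, s: number, v: number): [number, number, number] {
  const c = v * s;
  const x = c * (1 - Math.abs(((h / 60) % 2) - 1));
  const m = v - c;
  let rgb: [number, number, number];
  if (h < 60) rgb = [c, x, 0];
  else if (h < 120) rgb = [x, c, 0];
  else if (h < 180) rgb = [0, c, x];
  else if (h < 240) rgb = [0, x, c];
  else if (h < 300) rgb = [x, 0, c];
  else rgb = [c, 0, x];
  return [
    Math.round((rgb[0] + m) * 255),
    Math.round((rgb[1] + m) * 255),
    Math.round((rgb[2] + m) * 255),
  ];
}

// Illustrative mapping: a palm x-position normalized to [-1, 1] sweeps
// the full hue wheel at full saturation and value.
function handToColor(palmX: number): [number, number, number] {
  const hue = ((palmX + 1) / 2) * 359.999; // keep hue just below 360
  return hsvToRgb(hue, 1, 1);
}
```

Keeping saturation and value fixed means one hand axis is enough to choose a hue; a second axis (or a second gesture) could control brightness the same way.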
You'd like to try OAH for yourself, you say? Great! I have good news and bad news. The bad news is that although the project was set up to be as open as possible, the data (medical images and 3D models), which were courteously made available by the Leiden University Medical Center, may not be shared. That means the tool cannot be released with the current dataset. The good news is that I am looking into datasets that are freely available. Once I find a suitable one, the tool will definitely become available at some point. If you are interested in using OAH, I'd love to hear from you!
In the meantime, check out a teaser video.
There have been live demos of Leap Mesh Painter at the Delft Data Science New Year's Event and at study information days for Clinical Technology. Leap Mesh Painter is currently available to try out for yourself.
An adaptation of this article was published in maCHazine, the magazine of the Applied Mathematics and Computer Science Study Association 'Christiaan Huygens'.