In the first part of this project, we created a Surgical Navigation System for the Neuroethology and Sensory Processing Lab at the Technion Faculty of Medicine. The system assists the surgeon during brain surgery by displaying the location of the surgical tool on the medical images, allowing the surgeon to mark points of interest, and providing indications and movement instructions toward these points. That system was a proof of concept and was not yet ready for continuous use in real surgery.
In the second part of the project, the goal was to make the adjustments necessary for the system to be used in real surgical situations and to assist the researchers in the various surgery modes performed in the lab, including adding a user interface and improving the system's reliability. We concluded the work after successfully creating a complete surgical navigation system capable of guiding the surgeon to the surgical destination. The system lets the surgeon mark an entry point and a destination point on CT and MRI scans, and guides the surgeon to these points through the user interface we created. The system meets the lab's needs and offers easy, intuitive operation, letting the surgeon focus on the surgery itself.