Meetings Summary

3rd October. Initial team meeting in the Systems Engineering lab class.
  • Learned how to hardware hack an Xbox Kinect 2.0.
  • Brief discussion after this of what our next tasks were to be.
  • Timings:
    -Hacking the Kinect approx. 2 hours.
    -Post tutorial team meeting approx. 10 minutes.
  • Team members present: all.

7th October. Extra lab class.
  • Meeting with other teams to hack Xbox Kinects so that each team can take one or more to work with.
  • Brief discussion about calling Dr Dunning, who informed us that he was free the following evening for an introductory call.
  • Timings:
    -Two hours spent with the Kinects.
    -Team discussion had while working on Kinect.
  • Team members present: all.

8th October. Phone call with client Dr Joel Dunning.
  • Met to call one of our clients, Dr Joel Dunning. Dr Dunning is a cardiothoracic surgeon who works at James Cook University Hospital.
  • The purpose of the call was to introduce our team to him and to hear a brief overview of his vision for the project, which is:
  • To redesign a surgical robot to be better suited to the task than current options.
  • To develop a camera system to be used with such a robot that would provide enhanced feedback of the situation.
  • The camera should ideally be able to be inserted into the patient through a single incision and then attached to the inside of the chest cavity, leaving the same incision free to be used to insert surgical tools into the patient.
  • The camera would ideally have a wide field of view (such as a fisheye lens) with software installed that is capable of focusing on different areas inside the body.
  • The camera should also be able to judge depth, and tell a user how far away the camera is from objects. This could be done either with a visual display of the approximate distance of an object, or by using a Geiger counter type system that beeps when the camera is getting close to an object.
  • A bonus would be if the camera could judge how far the tools are from parts of the body and from each other, and then send tactile feedback to the surgeon if the tools get too close to, or begin damaging, a part of the body without the surgeon’s awareness. One of the main disadvantages of current robotic systems is that they provide no tactile feedback.
  • Dr Dunning also invited the team to observe him performing surgery in the hopes that this would act as a catalyst for generating new ideas and ways of thinking about robotic surgery.
  • Timings: -Call lasted approx. 20 minutes.
  • Team members present: Ed Collins, Kirthi Muralikrishnan. (Tom Page was unaware of the meeting due to a Facebook messaging malfunction, which meant he did not receive the messages organising the meeting).

10th October. Kinect hacking lab session.
  • Further learning of skills such as soldering required to hack the Xbox Kinect 2.0.
  • Timing: -Lasted 2 hours.
  • Team members present: all.

14th October. Team meeting with Dr Dean Mohamedally.
  • Met with Dr Mohamedally to discuss our project. We had received two objectives from our client, a wireless camera and a depth-sensing system, and Dr Mohamedally advised us on which goal to pursue: the depth-sensing project with a Kinect 2.0.
  • Dr Mohamedally recommended where we should focus our research:
  • Learning C#.
  • Reading about Computer Graphics and collision detection algorithms.
  • Reading about using the Xbox Kinect.
  • We were also informed of the more specific nature of the project - that it is for research and development, and that we should aim to produce a working larger-than-scale prototype by the end of the project, not a fully working to-scale prototype.
  • Timings: -Meeting lasted approx. 30 minutes.
  • Team members present: all.

16th October. Team meeting with Dr Agapito.
  • Brought her up to speed with the current state and goals of the project.
  • Dr Agapito gave us advice on potential tools to use. These included:
  • Mesh Lab - Graphics tool for rendering clouds of 3D points.
  • Kinect Fusion SDK - allows the Kinect to produce high-resolution 3D reconstructions by fusing frames captured from different perspectives.
  • Loosely arranged to meet again in two weeks’ time.
  • Timing: -Lasted approx. 45 minutes
  • Team members present: Tom Page, Ed Collins. (Kirthi called away to an emergency).

17th October. Lab session focused on hacking the Kinect 2.0.
  • Formed a miniature production line to quickly prepare all USB 3.0 cables and all of the Kinects for soldering the following Friday.
  • Timing: -Lasted 2 hours
  • Team members present: all.

24th October. Lab session with Kinects.
  • The Kinects prepared the previous week had been misplaced, so we ended up preparing three new ones, then soldering and testing them.
  • We managed to get all three working, though only on one of the laptops on which we tested them. We suspect that the other laptop’s USB port may not have supplied enough power for the Kinects.
  • Timing: -Lasted 3 hours.
  • Team members present: all.

28th October. Hospital visit to observe surgery.
  • The team travelled to Middlesbrough to meet Dr Dunning, who invited us to watch surgery and see some of the issues that face surgeons today.
  • We watched three surgeries:
    -Open surgery to remove a tumour from a patient’s lung.
    -Endoscopy to inspect lymph nodes in the lungs.
    -Open surgery to remove a tumour from a different patient’s lung.
  • Dr Dunning also showed us the Da Vinci surgical robot. This consisted of three separate units, each of which was very large. The system costs in the region of £1.8 million and is so large that it does not fit in many operating theatres.
  • Team members present: all.

31st October. Lab session with Kinects.
  • This was the last session with our instructor who was showing us how to hack the Kinect. In this session we finished five Kinects, testing three of them to confirm that they were working.
  • Timing: -Lasted 3 hours.
  • Team members present: all.

14th November. Lab session.
  • This lab session was concerned with getting our HCI coursework reviewed by teaching assistants. The team then scheduled a meeting to discuss our separate HCI designs and create a final User Interface design.
  • Timing: -Lasted 2 hours.
  • Team members present: all.

18th November. Team meeting.
  • This meeting was concerned with comparing each team member’s User Interface design and deciding on the final design. The final design was then created in a single document to be discussed with a teaching assistant in the next lab session.
  • Timing: -Lasted 1 hour.
  • Team members present: all.

21st November. Lab session.
  • This session was heavily focused on learning to program the Kinect.
  • We spent the majority of the session discussing with PhD student Aron Monszpart how to program the Kinect, and then created a basic program to separate foreground from background by identifying objects that enter the camera’s field of view.
  • We then discussed the User Interface design with a teaching assistant to assess and finalise it.
  • Timing: -Lasted 2 hours.
  • Team members present: Tom Page, Ed Collins.
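
The foreground/background separation described above can be sketched as a simple depth-differencing step against a background model. This is a minimal Python/NumPy illustration, not our actual Kinect program; the threshold and frame values are assumptions:

```python
import numpy as np

def foreground_mask(depth_frame, background, threshold_mm=100):
    """Mark pixels whose depth differs from the background model.

    depth_frame, background: 2-D arrays of depth values in millimetres
    (0 means the sensor returned no reading for that pixel).
    """
    valid = (depth_frame > 0) & (background > 0)
    diff = np.abs(depth_frame.astype(np.int32) - background.astype(np.int32))
    return valid & (diff > threshold_mm)

# Hypothetical example: background at 1000 mm, an object enters at 600 mm.
background = np.full((4, 4), 1000, dtype=np.uint16)
frame = background.copy()
frame[1:3, 1:3] = 600  # object in the centre of the view
mask = foreground_mask(frame, background)
```

In practice the background model would be captured once (an empty scene) and each incoming depth frame compared against it.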

28th November. Lab session.
  • Team discussed work assignment and worked on the first iteration of a prototype program for the Kinect.
  • Discussed the User Interface with a teaching assistant - however, it is currently essential that the team focus on the core depth-sensing functionality before attempting to create the User Interface design.
  • Timing: Lasted 2 hours.
  • Team members present: all.

5th December. Lab session.
  • Team discussed work assignment for the final week and final parts of the project.
  • Discussed the structure of the website in detail.
  • Discussed the creation of the project video.
  • Discussed the prototype created.
  • Discussed with HCI expert about heuristic evaluation of the system.
  • Timing: Lasted 2 hours.
  • Team members present: all.

9th December. Meeting to discuss construction of team website.
  • Discussed potential tools to use, including Bootstrap and Adobe Dreamweaver.
  • Discussed content that should be included in the website.
  • Timing: Lasted 45 minutes.
  • Team members present: Kirthi, Ed.

12th December. Lab session.
  • Spent the session discussing the website layout and content, and the video presentation.
  • Decided who would edit the video.
  • Wrote Bi-Weekly Report 5.
  • Timing: Lasted 2 hours.
  • Team members present: Kirthi, Ed.

TERM TWO
16th January 2015. Lab session.
  • First meeting after the break; discussed possible ways of progressing the project and designing the algorithms.
  • Preliminary discussion about work assignment and elevator pitches.
  • Timing: Lasted two hours.
  • Team members present: Tom, Ed.

20th January. Meeting with Dr Agapito.
  • Discussed state of project, showed designs for user interface.
  • Received recommendations on direction of project:
  • Have another look at Kinect Fusion.
  • Create some foam and metal props ASAP.
  • Techniques for identifying surgical tools: a Gaussian model or the GrabCut algorithm, possibly available in Photoshop.
  • Google’s Project Tango - https://www.google.com/atap/projecttango/#project
  • Agreed to meet again in two weeks (3rd February).
  • Timing: Lasted around 45 minutes.
  • Team members present: all.

23rd January. Lab session.
  • Discussed work assignment and what needs to be completed next. Agreed that depth sensing functionality relating to our project is the most important thing to begin work on.
  • Agreed that:
    -Tom would begin working on a program to measure how far away a green highlighter was from the Kinect.
    -Ed would source some props for the project, that is fake surgical tools and body tissue.
    -Kirthi and Ed would get the Kinects working with their Macs and then experiment with programming them to get a feel for them.
  • Discussed with PhD student Aron Monszpart about our idea to use bright green tools to make tracking easy. He is of the opinion that we will still need to do statistical analysis to identify which green pixels in the camera view are part of the tool.
  • Timing: Lasted 1 and a half hours.
  • Team members present: all.
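
Aron’s point can be illustrated with a minimal per-pixel test: a naive threshold flags “green” pixels but says nothing about which of them actually belong to the tool, which is why further statistical analysis is needed. A hypothetical NumPy sketch (the ratio and brightness thresholds are made-up illustrative values):

```python
import numpy as np

def green_pixels(rgb, ratio=1.4, min_green=80):
    """Flag pixels where green clearly dominates red and blue.

    rgb: H x W x 3 array. A pixel counts as candidate 'tool' green when
    its G channel exceeds both R and B by the given ratio and is bright
    enough. Thresholds here are assumptions for illustration only.
    """
    r = rgb[..., 0].astype(np.float32)
    g = rgb[..., 1].astype(np.float32)
    b = rgb[..., 2].astype(np.float32)
    return (g > min_green) & (g > ratio * r) & (g > ratio * b)

img = np.zeros((2, 2, 3), dtype=np.uint8)
img[0, 0] = (30, 200, 40)    # bright green pixel: flagged
img[1, 1] = (200, 210, 190)  # whitish pixel: green does not dominate
mask = green_pixels(img)
```

A real tool tracker would then need to reason about which flagged pixels form the tool (e.g. the largest connected region), since lighting and reflections produce stray green pixels.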

30th January. Lab session.
  • Discussed work assignment document and work to be completed next:
  • Tom will work on creating a program to measure how far away a green highlighter is from the camera.
  • Kirthi and Ed will gain experience with the Kinect and C#, aiming to have a reasonable amount of experience by next Friday.
  • Timing: Lasted 1 hour.
  • Team members present: all.

4th February. Meeting with client Dr Agapito.
  • Dr Agapito informed us that she has emailed the Surgical Vision group to see if we can talk to them.
  • Advised us on some methods for identifying the tools, such as segmentation, and some basic algorithms.
  • Recommended that we have footage from the Kinect running one of our programs to show her at our next meeting, which will be Tuesday 10th February.
  • Timing: Lasted half an hour.
  • Team members present: all.

5th February. Meeting with Dr Dean Mohamedally for milestone review.
  • Recommended that we speak with Dr Dunning every two weeks to keep him informed of the project’s progress.
  • Recommended that we look into calculating the orientation of tools in the body.
  • Recommended that we look into Design Patterns, particularly the Strategy design pattern.
  • Would like us to produce a four page paper on the project to hopefully present at conferences to other surgeons.
  • Strongly emphasised the need to use design patterns.
  • Timing: Lasted half an hour.
  • Team members present: all.

6th February. Lab session.
  • Discussed aspects of the Kinect with Aron Monszpart.
  • Timing: Lasted an hour.
  • Team members present: Tom, Ed.

9th February. Meeting with Danail Stoyanov.
  • The team met with Dan, who is in charge of the Surgical Vision Group at UCL, to discuss the project and get his advice, as his team has already found solutions to many of the problems we are facing, such as tool identification.
  • Got to use a Da Vinci surgical robot that Dan and his team have for research purposes. The goal was to remove a small piece of wire from a prop without “damaging” the prop. We saw how our project could be applicable: it was extremely difficult to tell how far away the robotic tools were from the wire, even though the robot provides a 3D view.
  • Dan alerted us to the issue of props to test the system. As the Kinect’s camera is much larger than a laparoscopic camera, the props and experimental body tissue need to be scaled up accordingly, so the small aluminium rods we currently have as props may not be appropriate.
  • Discussed with PhD student Max Allan, who designed the algorithm that recognises tools in the laparoscope view. He gave us a brief overview of how he did this - using a form of machine learning and the library OpenCV.
  • Timing: Lasted 1 hour.
  • Team members present: all.

10th February. Meeting with Dr Agapito.
  • Reviewed what had been discussed with Dan in his lab the previous day.
  • Requested that we send her footage of some of our programs running with the Kinect.
  • Suggested organising a meeting between the team, Dean Mohamedally, Dan Stoyanov, Aron Monszpart and herself to discuss exactly where we should focus the project given the short time frame available to us.
  • Timing: Lasted 45 minutes.
  • Team members present: all.

13th February. Lab session.
  • Discussed with PhD student Aron Monszpart about how to map the Kinect’s depth feed to the colour feed so that we can identify and monitor the depth of a green highlighter.
  • Timing: Lasted 1 hour.
  • Team members present: all.
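
Conceptually, mapping the depth feed to the colour feed amounts to back-projecting each depth pixel into 3D and re-projecting it into the colour camera (the Kinect SDK provides a coordinate mapper that handles this). A simplified pinhole-camera sketch in Python; the intrinsics and camera baseline below are made-up illustrative values, not the Kinect’s real calibration:

```python
def depth_to_colour(u, v, z, K_depth, K_colour, baseline_x=0.052):
    """Project a depth pixel (u, v) with depth z (metres) into the
    colour image, assuming the two cameras are offset only along x
    by `baseline_x` metres. All numbers here are illustrative."""
    fx_d, fy_d, cx_d, cy_d = K_depth
    fx_c, fy_c, cx_c, cy_c = K_colour
    # Back-project to a 3-D point in the depth camera frame.
    x = (u - cx_d) * z / fx_d
    y = (v - cy_d) * z / fy_d
    # Translate into the colour camera frame and re-project.
    xc = x - baseline_x
    uc = fx_c * xc / z + cx_c
    vc = fy_c * y / z + cy_c
    return uc, vc

# A pixel at the depth camera's principal point, 1 m away, should land
# near the colour principal point, shifted by the baseline.
K_d = (365.0, 365.0, 256.0, 212.0)    # assumed depth intrinsics
K_c = (1060.0, 1060.0, 960.0, 540.0)  # assumed colour intrinsics
uc, vc = depth_to_colour(256.0, 212.0, 1.0, K_d, K_c)
```

Because the shift depends on depth, nearby objects map differently from far ones, which is what makes the mapping non-trivial.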

24th February. Meeting with Dr Agapito.
  • Discussed plans for finishing the project. We need to speak to our clients to determine exactly what they want to see when this system is demonstrated in April - what would be their ideal depth feedback system? How do they want it demonstrated, with a larger than scale model of the body?
  • Dr Agapito explained how depth is calculated by the camera, explaining the maths behind it.
  • Arranged to meet the following week.
  • Timing: 1 hour.
  • Team members present: Ed.

27th February. Lab session.
  • Discussed with PhD student Aron Monszpart, who showed us how to use MATLAB as a prototyping tool.
  • Discussed the state of the project.
  • Timing: 1 hour.
  • Team members present: all.

4th March. Phone call with client Dr Joel Dunning.
  • Informed Dr Dunning of the state of the project and how work was progressing, telling him of our idea to use green tools.
  • Discussed potential scenarios to test the system with. The two main suggestions he had were:
  • Slinging a blood vessel, which involves passing a piece of string around a large blood vessel to hold it out of the way. Can be particularly dangerous as surgeons often have to keep moving the tool until it hits the blood vessel to judge where it is.
  • Threading a needle, one of the first things that trainee laparoscopic surgeons are required to do.
  • Informed us that when performing laparoscopic surgery in the chest, the distance between the camera and the tools tends to be approximately 10cm. The Kinect’s minimum distance is approximately 50cm, so any scenarios we create will need to be scaled up by approximately a factor of 5.
  • Said that he was not particularly interested in tool orientation, so we may not need to work on this.
  • Timing: Half an hour.
  • Team members present: Ed, deliberately only one member present for convenience as phone call was late in the evening.

6th March. Lab session.
  • Discussed erosion and dilation image operations with Aron Monszpart, techniques that could be applied to our project.
  • Discussed state of the project and assigned work among team so that each member now knows exactly what pieces of work need to be completed and by when.
  • Timing: 1 hour.
  • Team members present: all.
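
The two operations work on binary masks: erosion shrinks a region by removing boundary pixels, dilation grows it, and applying one after the other can clean up a noisy tool mask. A minimal NumPy sketch of both, assuming a 3x3 square structuring element (an illustration, not any library’s implementation; pixels outside the image are treated as matching the border):

```python
import numpy as np

def dilate(mask):
    """3x3 square dilation, done separably: vertical pass, then horizontal.
    A pixel is set if any pixel in its 3x3 neighbourhood is set."""
    v = mask.copy()
    v[1:, :] |= mask[:-1, :]
    v[:-1, :] |= mask[1:, :]
    out = v.copy()
    out[:, 1:] |= v[:, :-1]
    out[:, :-1] |= v[:, 1:]
    return out

def erode(mask):
    """3x3 square erosion: a pixel survives only if its whole 3x3
    neighbourhood is set (missing neighbours at the border are ignored)."""
    v = mask.copy()
    v[1:, :] &= mask[:-1, :]
    v[:-1, :] &= mask[1:, :]
    out = v.copy()
    out[:, 1:] &= v[:, :-1]
    out[:, :-1] &= v[:, 1:]
    return out

# A 3x3 blob in a 5x5 image: erosion leaves its centre, dilation fills the image.
blob = np.zeros((5, 5), dtype=bool)
blob[1:4, 1:4] = True
eroded = erode(blob)
dilated = dilate(blob)
```

Eroding then dilating (an “opening”) removes isolated noise pixels smaller than the structuring element while roughly preserving the tool’s shape.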

10th March. Meeting with Dr Agapito.
  • Demonstrated some of our progress with the Kinect to Dr Agapito.
  • Discussed potential test scenarios, as given to us by Dr Dunning.
  • Dr Agapito tasked us with preparing a first demonstration of the system by Friday 20th March.
  • Timing: 45 minutes.
  • Team members present: all.

13th March. Lab session.
  • Discussed speeding up the system with Aron Monszpart; ideas included fast Fourier transforms and running parts of the algorithm on the GPU.
  • Timing: Lasted 1 hour.
  • Team members present: all.

17th March. Meeting with Dr Agapito.
  • Showed her recent progress.
  • Discussed issue of double images when mapping depth to colour picture.
  • Timing: Lasted 30 minutes.
  • Team members present: all.

20th March. Demonstration with Dr Agapito and Aron Monszpart.
  • Demonstrated the current state of the system to our client Dr Agapito and teaching assistant Aron Monszpart.
  • Augmentations that displayed general depth with a heat map and ones that coloured the tool based on how far away it was from the camera worked well.
  • The tool proximity augmentation ran extremely slowly and will need to be sped up before it can be presented to our surgeon clients. It also only coloured the edges of the tool, and will need to be expanded to colour the whole green part of the tool.
  • Dr Agapito told us we needed to work on this before it would be acceptable to present to our clients.
  • After the demonstration, discussed improvements to the algorithm with Aron.
  • Timing: Lasted 1 and a half hours.
  • Team members present: all.
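
The heat-map depth augmentation mentioned above can be sketched as a per-pixel mapping from depth to colour, e.g. blending from red (near) to blue (far). A hypothetical NumPy illustration; the near/far range values are assumptions, not the system’s actual parameters:

```python
import numpy as np

def depth_heat_map(depth_mm, near=500.0, far=4500.0):
    """Map each depth value to an RGB colour: red at `near`, blue at
    `far`, linearly blended in between. Pixels with no depth reading
    (value 0) are left black. Range limits are illustrative."""
    t = np.clip((depth_mm - near) / (far - near), 0.0, 1.0)
    h, w = depth_mm.shape
    rgb = np.zeros((h, w, 3), dtype=np.uint8)
    valid = depth_mm > 0
    rgb[..., 0] = np.where(valid, ((1.0 - t) * 255).astype(np.uint8), 0)
    rgb[..., 2] = np.where(valid, (t * 255).astype(np.uint8), 0)
    return rgb

# Four sample pixels: nearest, farthest, mid-range, and no reading.
depth = np.array([[500, 4500], [2500, 0]], dtype=np.float32)
img = depth_heat_map(depth)
```

The same idea extends to colouring only the tool pixels by their depth, as in the tool-colouring augmentation.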

27th March. Lab session.
  • Discussed with Aron Monszpart how the Tool Proximity algorithm of the system could be sped up.
  • Aron took a look at our code to see if he could suggest where we might speed it up.
  • Sorted out the work assignment for the Easter break.
  • Took some images of the final system running to send to our clients.
  • Timing: Lasted 1 and a half hours.
  • Team members present: all.