Saturday, July 30, 2011

Cloud computing in medical physics*: A snapshot - July 2011


These days, if you do almost anything with a computer hooked up to the internet, you've probably heard the term "cloud computing". Well, I'm here to tell you that cloud computing actually does mean something and that it is something useful for medical physics! In this post I'm going to take a quick look at the current state of cloud computing research in medical physics and how it got there.

First things first: what do I mean by cloud computing? Cloud computing generally refers to scalable computing resources, such as data storage, CPU time, or software access, offered over the internet with a pay-as-you-go pricing scheme (and sometimes for free). The service can be provided at a few levels: raw server-level access (infrastructure as a service, IaaS), pre-configured systems for specific tasks (platform as a service, PaaS), and complete applications (software as a service, SaaS), e.g. Gmail or Dropbox. The interesting thing about cloud computing is that, suddenly, anyone armed with a network connection, a credit card, and a little know-how has access to an unprecedented amount of computing resources. This opens up a huge number of computing possibilities for medical physicists, among others.

In the second half of 2009, our research group at the University of New Mexico, the UNM Particle Therapy Group, began investigating IaaS-style cloud computing as the basis for massively parallel medical physics calculations (read: very fast, very large calculations). Our very first results, demonstrating proof-of-concept with proton beam calculations on a "virtual Monte Carlo cluster" of nodes running on Amazon.com's EC2 service, were presented at the XVIth ICCR conference in Amsterdam in May, 2010. We presented a poster of our second set of results at the AAPM annual meeting in July, 2010 and then posted a paper to the arXiv titled "Radiation therapy calculations using an on-demand virtual cluster via cloud computing" http://arxiv.org/abs/1009.5282 (Sept. 2010).
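To give a flavor of what IaaS-style provisioning actually looks like, here is a minimal sketch using the boto Python library to rent a batch of EC2 worker nodes. The AMI ID, key pair name, node count, and instance type below are placeholders for illustration, not our actual configuration:

    # Minimal sketch: rent a batch of EC2 nodes for a virtual cluster.
    # All identifiers below are placeholders, not our real setup.
    import time
    from boto.ec2.connection import EC2Connection

    conn = EC2Connection()  # credentials come from the AWS_* environment variables

    # Ask EC2 for ten identical nodes to form the virtual Monte Carlo cluster.
    reservation = conn.run_instances(
        "ami-12345678",              # placeholder image with the MC code installed
        min_count=10,
        max_count=10,
        instance_type="c1.xlarge",   # a compute-heavy instance type
        key_name="mc-cluster-key",   # placeholder SSH key pair
    )

    # Wait until each node is up, then collect hostnames for job submission.
    for node in reservation.instances:
        while node.update() != "running":
            time.sleep(5)
        print(node.public_dns_name)

The point is that those ten nodes (or a hundred; it's just a parameter) exist only for the duration of the calculation, and you pay by the hour.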

It has been exciting to see the reactions to our posters, talks, and arXiv preprint, with most physicists immediately seeing the potential benefits of the new levels of access to computing resources that cloud services provide. Even more exciting is seeing the projects subsequently launched by other medical physics researchers.

So what’s happening now?
  • Chris Poole et al. (Queensland Univ. of Tech.) posted a note to the arXiv titled "Technical Note: Radiotherapy dose calculations using GEANT4 and the Amazon Elastic Compute Cloud" http://arxiv.org/abs/1105.1408 (May, 2011)
  • Chris Poole also posted code used in that project to http://code.google.com/p/manysim/ (May, 2011)
  • UNM held a workshop, sponsored in part by Amazon.com, that included tutorials on using cloud computing for medical physics calculations (May, 2011)
  • UNM launched a cloud computing webpage in conjunction with the workshop: http://www.cs.unm.edu/~compmed/PTG/cloud.html (May, 2011)


Compared to our single abstract at AAPM last year, you could claim there has been an "exponential explosion" of interest in cloud computing in medical physics since we presented our first results in 2010! Does this mean we'll see 50 abstracts mentioning cloud computing in 2012? (Two points do not a trend make?? ;) )

I look forward to seeing where this new technology will take our field.


*I'm deliberately leaving out all of the (mostly commercial) cloud-based PACS and related radiology software and services. Some of these are mentioned in our paper on the arXiv.




Friday, July 1, 2011

Antiproton Radiotherapy Experiments at CERN

At the moment we have a week of antiproton beam time at CERN for radiobiology and dosimetry experiments. The main experiment is to measure the relative biological effectiveness (RBE) of antiprotons. As the endpoint we use clonogenic survival of V79 Chinese hamster cells (in vitro).
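For readers who don't deal with RBE every day, it is simply the ratio of doses producing the same biological effect, with a reference radiation (typically a photon beam, e.g. 60Co) in the numerator:

    \[
    \mathrm{RBE} = \left.\frac{D_{\text{reference}}}{D_{\text{antiprotons}}}\right|_{\text{same effect}}
    \]

where "same effect" in our case means the same surviving fraction of V79 colonies.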

What makes this experiment so complicated is:
  • only a narrow-beam geometry is available at CERN
  • antiprotons are rare; we only get approx. 1 Gy / hour
  • the beam is highly pulsed, i.e. a 500 nanosecond spill every 90 seconds.
Therefore, we invest a lot of effort in performing precise dosimetry with multiple redundant systems. That is a long story which I will tell another time. Here, let me just show a few pictures...
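(But first, one quick back-of-the-envelope number, taking the figures above at face value: one spill every 90 seconds means 40 spills per hour, so each 500 ns spill carries roughly

    \[
    \frac{1\ \mathrm{Gy/h}}{40\ \mathrm{spills/h}} \approx 25\ \mathrm{mGy},
    \qquad
    \frac{25\ \mathrm{mGy}}{500\ \mathrm{ns}} \approx 5\times10^{4}\ \mathrm{Gy/s}
    \]

as the instantaneous dose rate within a spill. That enormous peak dose rate is part of what makes ionization chamber dosimetry so delicate here; ion recombination, for example, behaves very differently than in a continuous beam.)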

The antiproton beam line with a water phantom for dosimetry.
The entire experiment is located in an experimental zone at the Antiproton Decelerator (AD) at CERN. We share our zone with the AEgIS people, who want to find out whether antiprotons fly up or down in the Earth's gravitational field. :)

Franz-Joachim Kaiser messing with the water phantom; behind him, the AEgIS beam line.
Three ionization chambers (ICs) are visible here, from left to right: a custom-made "Advanced Roos" chamber, a Markus chamber, and the MicroLion liquid ionization chamber, all by PTW.
Gafchromic EBT film irradiated with antiprotons. The beam spot is about 1 cm FWHM. The narrow-beam geometry makes us very vulnerable to positioning errors...

... and therefore we also monitor the beam from spill to spill with a Mimotera detector.
We are usually 2-4 people on a shift. Tonight I will do the night shift with Franz-Joachim (to the left). Stefan will leave soon. Usually, I would do night shifts with Roy Keyes, but he couldn't be here this year.
I built this little box for the experiment: it interfaces the antiproton decelerator with the printer port of our data acquisition computer. No need for expensive I/O cards or fancy LabVIEW; basically it is just some TTL logic and optocouplers. On the server side, a daemon listens on the parallel port for a new spill of antiprotons coming in.
Once triggered, the server takes care of reading out all the data systems, such as the beam current transformers, ionization chambers, and scintillators.
Client programs, here running on the laptop to the left, can connect to the server and change various settings of the readout procedure.
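To make that concrete, here is a rough sketch of such a trigger-and-readout loop, written in Python for brevity rather than the actual C. I am assuming here that the optocoupler output is wired to the parallel port's ACK status line, and the ioctl numbers are the x86 values from <linux/ppdev.h> and <linux/parport.h>; check your own headers:

    # Rough sketch of the spill-trigger loop (Python stand-in for the C daemon).
    import fcntl
    import os
    import time

    PPCLAIM   = 0x708B       # _IO('p', 0x8b): claim the parallel port
    PPRELEASE = 0x708C       # _IO('p', 0x8c): release it again
    PPRSTATUS = 0x80017081   # _IOR('p', 0x81, unsigned char): read status lines
    ACK       = 0x40         # PARPORT_STATUS_ACK; assumed wired to the optocoupler

    def read_out_all_systems():
        """Placeholder for the real readout: beam current transformers,
        ionization chambers, scintillators, ..."""
        print("spill! reading out...")

    fd = os.open("/dev/parport0", os.O_RDWR)
    fcntl.ioctl(fd, PPCLAIM)
    try:
        prev = 0
        buf = bytearray(1)
        while True:                  # one spill every ~90 s, so polling is cheap
            fcntl.ioctl(fd, PPRSTATUS, buf)
            status = buf[0]
            if (status & ACK) and not (prev & ACK):   # rising edge = new spill
                read_out_all_systems()
            prev = status
            time.sleep(0.001)
    finally:
        fcntl.ioctl(fd, PPRELEASE)
        os.close(fd)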

Again, this is all home-brew. Earlier the data acquisition was some libncurses-based affair; this year is the first time we have had a traditional GUI for the client and a clear client/server separation. I wrote the client in C++/Qt4 and compiled it for Linux and win32. Stefan did a package for Mac. The server is pure C, Linux only. Sometimes I think the most valuable course I took as a student at our physics department in Aarhus was a C programming course. (And that course was only offered once! What a shame!)
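Schematically, the client/server part works something like the following, though this is a purely illustrative Python sketch: the wire protocol and setting names here are invented, not the real ones.

    # Illustrative-only settings server: clients connect over TCP and send
    # simple one-line text commands. The real protocol is not this one.
    import socketserver

    SETTINGS = {"spill_gate_ns": "500"}   # hypothetical readout settings

    class ControlHandler(socketserver.StreamRequestHandler):
        def handle(self):
            for raw in self.rfile:                  # one text command per line
                parts = raw.decode().strip().split()
                if len(parts) == 3 and parts[0] == "set":
                    SETTINGS[parts[1]] = parts[2]   # e.g. "set spill_gate_ns 400"
                    self.wfile.write(b"ok\n")
                elif parts == ["get", "all"]:
                    self.wfile.write((repr(SETTINGS) + "\n").encode())

    if __name__ == "__main__":
        with socketserver.TCPServer(("", 4711), ControlHandler) as server:
            server.serve_forever()

In this toy version, a client would just open a socket, send "set ..." lines, and read back the acknowledgements.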

Fiona from the Belfast QUB group is in charge of the film scans, and of making sure there are enough sweets for all of us.
The entire experimental zone in the AD hall, seen from above, from where our non-existent counting hut would be...
A bit off topic, but just over our heads there is a positron beam line, which delivers positrons to the antihydrogen "bottle" of ATRAP. The positrons go from the right side to the left.

Me, checking up on things... :-) I think this is the 8th or 9th time I have worked at the ACE experiment.
Alanine is one of the most reliable solid-state dosimeters for exotic beams such as antiprotons. Here a stack of pellets is prepared for irradiation.

Once the pellets have been irradiated with antiprotons, they are shipped for readout to our collaborators at the National Physical Laboratory (NPL) in Teddington, UK. (NPL also serves as a primary standards lab for radiation quantities.)


Alright then.. :/

Recursively posting this blog entry.
More pictures here.