Tuesday, June 20, 2017

About grants: What are "indirect costs"?

Before blogging further about science, I wanted to explain something about the way research grants work in the US.  Consider this part of my series of posts intended to educate students (and perhaps the public) about careers in academic research.

When you write a proposal to a would-be source of research funding, you have to include a budget.  As anyone would expect, that budget will list direct costs - items that are clearly research expenses.  Examples would include, say, $30K/yr for a graduate student's stipend, $7K for a piece of laboratory electronics essential to the work, and $2K/yr to support travel of the student and the principal investigator (PI) to conferences.   However, budgets also include indirect costs, sometimes called overhead.  The idea is that research involves certain costs that aren't easy to account for directly, like the electricity to run the lights and air conditioning in the lab, the costs to keep the laboratory building maintained so that the research can get done, or the (meta)costs for the university to administer the grant.  

So, how does the university figure out how much to tack on for indirect costs?  For US federal grants, the magic (ahem) is all hidden away in OMB Circular A21 (wiki about it, pdf of the actual doc).  Universities periodically go through an elaborate negotiation process with the federal government (see here for a description of this regarding MIT) to determine an indirect cost rate for that university.  The idea is that you take a version of the direct costs ("modified total direct costs" - for example, a piece of equipment that costs more than $5K is considered a capital expense and not subject to indirect costs) and multiply by a negotiated factor (in the case of Rice right now, 56.5%) to arrive at the indirect costs.  The rates are lower for research done off campus (like at CERN), the argument being that such work should be cheaper for the university.  (Effective indirect cost rates at US national labs tend to be much higher.)
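To make the arithmetic concrete, here is a minimal sketch of that calculation using the example budget items above and Rice's 56.5% rate.  This is purely illustrative - real MTDC rules have many more categories and exclusions (tuition remission, subaward caps, and so on) than the single equipment threshold shown here:

```python
# Illustrative sketch of computing indirect costs from modified total
# direct costs (MTDC).  Numbers are the examples from this post; real
# budgets involve many more categories and exclusions.

EQUIPMENT_THRESHOLD = 5_000  # items above this are capital equipment
INDIRECT_RATE = 0.565        # Rice's negotiated federal rate (as of this post)

def grant_budget(items):
    """items: list of (description, cost, is_equipment) tuples."""
    direct = sum(cost for _, cost, _ in items)
    # Capital equipment is excluded from the indirect-cost base:
    mtdc = sum(cost for _, cost, is_equip in items
               if not (is_equip and cost > EQUIPMENT_THRESHOLD))
    indirect = INDIRECT_RATE * mtdc
    return direct, indirect, direct + indirect

items = [("grad student stipend (1 yr)", 30_000, False),
         ("lab electronics", 7_000, True),
         ("conference travel (1 yr)", 2_000, False)]

direct, indirect, total = grant_budget(items)
print(f"direct: ${direct:,}  indirect: ${indirect:,.0f}  total: ${total:,.0f}")
# → direct: $39,000  indirect: $18,080  total: $57,080
```

Note that the $7K electronics item counts as direct cost but, being capital equipment over the $5K threshold, drops out of the base on which the 56.5% is assessed.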

Foundations and industry negotiate different rates with universities.  Foundations usually limit their indirect cost payments, arguing that they just can't afford to pay at the federal level.  The Bill and Melinda Gates Foundation, for example, only allows (pdf) 10% for indirect costs.   The effective indirect rate for a university, averaged over the whole research portfolio, is always quite a bit lower than the nominal A21 negotiated rate.  Vice provosts/presidents/chancellors for research at major US universities would be happy to explain at length that indirect cost recovery doesn't come close to covering the actual costs associated with doing university-based research.  

Indirect cost rates in the US are fraught with controversy, particularly now.  The current system is definitely complicated, and reasonable people can ask whether it makes sense (and adds administrative costs) to have every university negotiate its own rate with the feds.   It remains to be seen whether there are changes in the offing.

Saturday, June 17, 2017

Interesting reading material

Summer travel and other activities have slowed blogging, but I'll pick back up again soon.  In the meantime, here are a couple of interesting things to read:

  • Ignition!  An Informal History of Liquid Rocket Propellants (pdf) is fascinating, if rather chemistry-heavy.  Come for discussions of subtle side reactions involved in red fuming nitric acid slowly eating its storage containers and suggested (then rejected) propellants like dimethyl mercury (!!), and stay for writing like, "Miraculously, nobody was killed, but there was one casualty — the man who had been steadying the cylinder when it split. He was found some five hundred feet away, where he had reached Mach 2 and was still picking up speed when he was stopped by a heart attack."  This is basically the story, from start to finish (in practical terms), of the development of liquid propellants for rockets.   That book also led me to stumble onto this library of works, most of which are waaaaay too chemistry-oriented for me.  Update:  for a directly relevant short story, see here.
  • Optogenetics is the idea of using light to control and trigger the activation/inactivation of genes.  More recently, there has been a big upswing in the idea of magnetogenetics, using magnetic fields to do similar things.  One question at play is what physical mechanism could let magnetic fields really do much at room temperature, since magnetic effects tend to be weak.  (Crudely speaking, the energy scale of visible photons is ~eV, much larger than the thermal energy scale of \(k_{\mathrm{B}}T \sim\) 26 meV, and readily able to excite vibrations or drive electronic excitations.  However, the Zeeman energy of one electron spin in a reasonably accessible magnetic field of 1 Tesla is \(g \mu_{\mathrm{B}}B \sim\) 0.1 meV.)  Here is a nice survey article about the constraints on how magnetogenetics could operate.
  • For a tutorial in how not to handle academic promotion cases, see here.  
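The energy-scale comparison in the magnetogenetics item above is easy to check numerically.  This sketch hard-codes the physical constants and assumes, for illustration, a green (~550 nm) photon, room temperature (300 K), and the free-electron g-factor of about 2:

```python
# Back-of-the-envelope check of the energy scales quoted above.
# Constants are approximate CODATA values, hard-coded for self-containment.

h   = 6.626e-34   # Planck constant, J s
c   = 2.998e8     # speed of light, m/s
kB  = 1.381e-23   # Boltzmann constant, J/K
muB = 9.274e-24   # Bohr magneton, J/T
eV  = 1.602e-19   # joules per electron-volt
g   = 2.0         # electron g-factor (approximately)

# Visible photon (green light, ~550 nm):
E_photon = h * c / 550e-9 / eV             # ~2.3 eV
# Thermal scale at room temperature (300 K), in meV:
E_thermal = kB * 300 / eV * 1000           # ~26 meV
# Zeeman splitting of one electron spin in a 1 T field, in meV:
E_zeeman = g * muB * 1.0 / eV * 1000       # ~0.12 meV

print(f"photon:  {E_photon:.2f} eV")
print(f"thermal: {E_thermal:.1f} meV")
print(f"Zeeman:  {E_zeeman:.2f} meV")
```

The photon energy comes out roughly two orders of magnitude above the thermal scale, while the Zeeman energy sits more than two orders of magnitude below it - which is exactly why a room-temperature mechanism for magnetogenetics is puzzling.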

Tuesday, June 06, 2017

Follow-up: More 5nm/7nm/10nm transistors

Two years ago I wrote this post about IBM's announcement of "7 nm transistors", and it still gets many pageviews every week.   So, what's been going on since then?

I encourage you to read this article from Semiconductor Engineering - it's a nice, in-depth look at what the major manufacturers are doing to get very small transistors actually to market.  It also explains what those numerical designations of size really mean, and changes in lithography technology.  (This is one part of my book that I really will have to revise at some point.)

Likewise, here is a nice article about IBM's latest announcement of "5 nm" transistors.  The writing at Ars Technica on this topic is usually of high quality.  The magic words that IBM is now using are "gate all around", which means designing the device geometry so that the gate electrode, the one that actually controls the conduction, affects the channel (where the current flows) from all sides.  In old-school planar transistors, the gate only couples to the channel from one direction.

Later I will write a long, hopefully accessible article about transistors, as this year is the 70th anniversary of the Bell Labs invention that literally reshaped global technology.

Thursday, June 01, 2017

What should everyone know about physics?

A couple of weeks ago, Sean Carroll made an offhand remark (the best kind) on twitter about what every physics major should know.  That prompted a more thoughtful look at our expectations for physics majors by Chad Orzel, with which I broadly agree, as does ZapperZ, who points out that most physics majors don't actually go on to be physics PhDs (and that's fine, btw).  

Entertaining and thought-provoking as this was, it seems like it's worth having a discussion among practicing physicist popularizers about what we'd like everyone to know about physics.  (For the pedants in the audience, by "everyone" I'm not including very young children, and remember, this is aspirational.  It's what we'd like people to know, not what we actually expect people to know.)  I'm still thinking about this, but here are some basic ingredients that make the list, including some framing topics about science overall.
  • Science is a reason- and logic-based way to look at the natural world; part of science is figuring out "models" (ways of describing the natural world and how it works) that have explanatory (retrodictive) and predictive power.   
  • Basic science is about figuring out how the world works - figuring out the "rules of the game".  Engineering is about using that knowledge to achieve some practical goal.  The line is very blurry; lots of scientists are motivated by and think about eventual applications.
  • Different branches of science deal with different levels of complexity and have their own vocabularies.  When trying to answer a scientific question, it's important to use the appropriate level of complexity and the right vocabulary.  You wouldn't try to describe how a bicycle works by starting with the molecular composition of the grease on the chain....
  • Physics in particular uses mathematics as its language and a tool.  You can develop good intuition for how physics works, but to make quantitative predictions, you need math.
  • Ultimately, observation and experiment are the arbiters of whether a scientific model/theory is right or wrong, scientifically.  "Right" means "agrees with observation/experiment whenever checked", "wrong" means "predicts results at odds with reality".  Usually this means the model/theory needs an additional correction, or has only a limited range of applicability.  (Our commonplace understanding of how bicycles work doesn't do so well at speeds of thousands of miles an hour.  That doesn't mean we don't understand how bikes work at low speeds; it means that additional effects have to be considered at very high speeds.)
  • There are many branches of physics - it's not all particle physics and astrophysics, despite the impression you might get from TV or movies.
  • Physics explains light in all its forms (the microwaves that heat your food; the radio waves that carry your wifi and cell phone traffic; the light you see with your eye and which carries your internet data over fibers; the x-rays that can go through your skin; and the really high energy gamma rays that do not, in fact, turn you into an enormous green ragemonster).  
  • Physics includes not just looking at the tiniest building blocks of matter, but also understanding what happens when those building blocks come together in very large numbers - it can explain that diamond is hard and transparent, how/why water freezes and boils, and how little pieces of silicon in your computer can be used to switch electric current.  Physics provides a foundation for chemistry and biology, but in those fields often it makes much more sense to use chemistry and biology vocabulary and models.  
  • Quantum mechanics can be unintuitive and weird, but that doesn't mean it's magic, and it doesn't mean that everything we don't understand (e.g., consciousness) is deeply connected to quantum physics.  Quantum is often most important at small scales, like at the level of individual atoms and molecules.  That's one reason it can seem weird - your everyday world is much larger.
  • Relativity can also be unintuitive and weird - that's because it's most important at speeds near the speed of light, and again those are far from your everyday experience.   
  • We actually understand a heck of a lot of physics, and that's directly responsible for our enormous technological progress in the last hundred and fifty years.
  • Physicists enjoy being creative and speculative, but good and honest ones are careful to point out when they're hand waving or being fanciful. 
I'll add more to this list over time, but that's a start....