Saturday, December 16, 2017

Finding a quantum phase transition, part 2

See here for part 1.   Recall, we had been studying electrical conduction in V5S8, a funky material that is metallic, but on one type of vanadium site has local magnetic moments that order in a form of antiferromagnetism (AFM) below around 32 K.  We had found a surprising hysteresis in the electrical resistance as a function of applied magnetic field.  That is, at a given temperature, over some magnetic field range, the resistance takes different values depending on whether the magnitude of H is being swept up or back down. 

One possibility that springs to mind when seeing hysteresis in a magnetic material is domains - the idea that the magnetic order in the material has broken up into regions, and that the hysteresis is due to the domains rearranging themselves.  What speaks against that in this case is the fact that the hysteresis happens over the same field range when the field is in the plane of the layered material as when the field is perpendicular to the layers.   That'd be very weird for domain motion, but makes much more sense if the hysteresis is actually a signature of a first-order metamagnetic transition, a field-driven change from one kind of magnetic order to another.   First order phase transitions are the ones that have hysteresis, like when water can be supercooled below zero Celsius.

That's also consistent with the fact that the field scale for the hysteresis starts at low fields just below the onset of antiferromagnetism, and very rapidly goes to higher fields as the temperature falls and the antiferromagnetic state is increasingly stable.   Just at the ordering transition, when the AFM state is just barely favored over the paramagnetic state, it doesn't necessarily take much of a push to destabilize AFM order.... 

There was one more clue lingering in the literature.  In 2000, a paper reported a mysterious hysteresis in the magnetization as a function of H down at 4.2 K and way out near 17-18 T.  Could this be connected to our hysteresis?  Well, in the figure here at each temperature we plot a dot for the field that is at the middle of our hysteresis, and a horizontal bar to show the width of the hysteresis, including data for multiple samples.  The red data point is from the magnetization data of that 2000 paper.  

A couple of things are interesting here.   Notice that the magnetic field apparently required to kill the AFM state extrapolates to a finite value, around 18 T, as T goes to zero.  That means that this system has a quantum phase transition (as promised in the post title).  Moreover, in our experiments we found that the hysteresis seemed to get suppressed as the crystal thickness was reduced toward the few-layer limit.  That may suggest that the transition trends toward second order in thin crystals, though that would require further study.  That would be interesting, if true, since second order quantum phase transitions are the ones that can show quantum criticality.  It would be fun to do more work on this system, looking out there at high fields and thin samples for signatures of quantum fluctuations....

The bottom line:  There is almost certainly a lot of interesting physics to be done with magnetic materials approaching the 2d limit, and there are likely other phases and transitions lurking out there waiting to be found.

Saturday, December 09, 2017

Finding a quantum phase transition, part 1

I am going to try to get the post frequency back up now that some tasks are getting off the to-do list....

Last year, we found what seems to be a previously undiscovered quantum phase transition, and I think it's kind of a fun example of how this kind of science gets done, with a few take-away lessons for students.  The paper itself is here.

My colleague Jun Lou and I had been interested in low-dimensional materials with interesting magnetic properties for a while (back before it was cool, as the hipsters say).  The 2d materials craze continues, and a number of these are expected to have magnetic ordering of various kinds.  For example, even down to atomically thin single layers, Cr2Ge2Te6 is a ferromagnetic insulator (see here), as is CrI3 (see here).  The 2d material VS2 had been predicted to be a ferromagnet in the single-layer limit.  

In the pursuit of VS2, Prof. Lou's student Jiangtan Yuan found that the vanadium-sulphur phase diagram is rather finicky, and we ended up with a variety of crystals of V5S8 with thicknesses down to about 10 nm (a few unit cells).  

[Lesson 1:  Just because they're not the samples you want doesn't mean that they're uninteresting.]   

It turns out that V5S8  had been investigated in bulk form (that is, mm-cm sized crystals) rather heavily by several Japanese groups starting in the mid-1970s.  They discovered and figured out quite a bit.  Using typical x-ray methods they found the material's structure:  It's better to think of V5S8  as V0.25VS2.  There are VS2 layers with an ordered arrangement of vanadium atoms intercalated in the interlayer space.  By measuring electrical conduction, they found that the system as a whole is metallic.   Using neutron scattering, they showed that there are unpaired 3d electrons that are localized to those intercalated vanadium atoms, and that those local magnetic moments order antiferromagnetically below a Neel temperature of 32 K in the bulk.  The moments like to align (antialign) along a direction close to perpendicular to the VS2 layers, as shown in the top panel of the figure.   (Antiferromagnetism can be tough to detect, as it does not produce the big stray magnetic fields that we all associate with ferromagnetism. )

If a large magnetic field is applied perpendicular to the layers, having half the moments anti-aligned with the field becomes very costly.  It becomes energetically favorable for the spins to find some way to avoid that anti-alignment but still keep the antiferromagnetism.  The result is a spin-flop transition, in which the moments keep their antiferromagnetism but flop down toward the plane, as in the lower panel of the figure.  What's particularly nice in this system is that this ends up producing a kink in the electrical resistance vs. magnetic field that is a clear, unambiguous signature of the spin flop, and therefore a way of spotting antiferromagnetism electrically.

My student Will Hardy figured out how to make reliable electrical contact to the little, thin V5S8 crystals (not a trivial task), and we found the physics described above.  However, we also stumbled on a mystery that I'll leave you as a cliff-hanger until the next post:  Just below the Neel temperature, we didn't just find the spin-flop kink.  Instead, we found hysteresis in the magnetoresistance, over an extremely narrow temperature range, as shown here.

[Lesson 2:  New kinds of samples can make "old" materials young again.]

[Lesson 3:  Don't explore too coarsely.  We could easily have missed that entire ~ 2.5 K temperature window in which the hysteresis is visible within our magnetic field range.] 

Tune in next time for the rest of the story....

Tuesday, November 28, 2017

Very busy time....

Sorry for the light blogging - between departmental duties and deadline-motivated writing, it's been very difficult to squeeze in much blogging.  Hopefully things will lighten up again in the next week or two.   In the meantime, I suggest watching old episodes of the excellent show Scrapheap Challenge (episode 1 here).  Please feel free to put in suggestions of future blogging topics in the comments below.  I'm thinking hard about doing a series on phases and phase transitions.

Friday, November 17, 2017

Max the Demon and the Entropy of Doom

My readers know I've complained/bemoaned repeatedly how challenging it can be to explain condensed matter physics on a popular level in an engaging way, even though that's the branch of physics that arguably has the greatest impact on our everyday lives.  Trying to take such concepts and reach an audience of children is an even greater, more ambitious task, and teenagers might be the toughest crowd of all.  A graphic novel or comic format is one visually appealing approach that is a lot less dry and perhaps less threatening than straight prose.   Look at the success of xkcd and Randall Munroe!   The APS has had some reasonable success with their comics about their superhero Spectra.  Prior to that, Larry Gonick had done a very nice job on the survey side with the Cartoon Guide to Physics.  (On the parody side, I highly recommend Science Made Stupid (pdf) by Tom Weller, a key text from my teen years.  I especially liked Weller's description of the scientific method, and his fictional periodic table.)

Max the Demon and the Entropy of Doom is a new entry in the field, by Assa Auerbach and Richard Codor.  Prof. Auerbach is a well-known condensed matter theorist who usually writes more weighty tomes, and Mr. Codor is a professional cartoonist and illustrator.  The book is an entertaining explanation of the laws of thermodynamics, with a particular emphasis on the Second Law, using a humanoid alien, Max (the Demon), as an effective superhero.  

The comic does a good job, with nicely illustrated examples, of getting the point across about entropy as counting how many (microscopic) ways there are to do things.  One of Max's powers is the ability to see and track microstates (like the detailed arrangement and trajectory of every air molecule in this room), when mere mortals can only see macrostates (like the average density and temperature).    It also illustrates what we mean by temperature and heat with nice examples (and a not very subtle at all environmental message).   There's history (through the plot device of time travel), action, adventure, and a Bad Guy who is appropriately not nice (and has a connection to history that I was irrationally pleased about guessing before it was revealed).   My kids thought it was good, though my sense is that some aspects were too conceptually detailed for a 12-year-old and others were a bit too cute for a world-weary 15-year-old.  Still, a definite good review from a tough crowd, and efforts like this should be applauded - overall I was very impressed.

Tuesday, November 07, 2017

Taxes and grad student tuition

As has happened periodically over the last couple of decades (I remember a scare about this when Newt Gingrich's folks ran Congress in the mid-1990s), a tax bill has been put forward in the US House that would treat graduate student tuition waivers like taxable income (roughly speaking).   This is discussed a little bit here, and here.

Here's an example of why this is an ill-informed idea.  Suppose a first-year STEM grad student comes to a US university, and they are supported by, say, departmental fellowship funds or a TA position during that first year.  Their stipend is something like $30K.  These days the university waives their graduate tuition - that is, they do not expect the student to pony up tuition funds.  At Rice, that tuition is around $45K.  Under the proposed legislation, the student would end up getting taxed as if their income was $75K, when their actual gross pay is $30K.   

That would be extremely bad for both graduate students and research universities.  Right off the bat this would create unintended (I presume) economic incentives, for grad students to drop out of their programs, and/or for universities to play funny games with what they say is graduate tuition.   

This has been pitched multiple times before, and my hypothesis is that it's put forward by congressional staffers who do not understand graduate school (and/or think that this is the same kind of tuition waiver as when a faculty member's child gets a vastly reduced tuition for attending the parent's employing university).  Because it is glaringly dumb, it has been fixed whenever it's come up before.  In the present environment, the prudent thing to do would be to exercise caution and let legislators know that this is a problem that needs to be fixed.

Tuesday, October 31, 2017

Links + coming soon

Real life is a bit busy right now, but I wanted to point out a couple of links and talk about what's coming up.
  • I've been looking for ways to think about and discuss topological materials that might be more broadly accessible to non-experts, and I found this paper and videos like this one and this one.  Very cool, and I'm sorry I'd missed it back in '15 when it came out.
  • In the experimental literature talking about realizations of Majorana fermions in the solid state, a key signature is a peak in the conductance at zero voltage - that's an indicator that there is a "zero-energy mode" in the system.  There are other ways to get zero-bias peaks, though, and nailing down whether this has the expected properties (magnitude, response to magnetic fields) has been a lingering issue.  This seems to nail down the situation more firmly.
  • Discussions about "quantum supremacy" strictly in terms of how many qubits can be simulated on a classical computer right now seem a bit silly to me.  Ok, so IBM managed to simulate a handful of additional qubits (56 rather than 49).  It wouldn't shock me if they could get up to 58 - supercomputers are powerful and programmers can be very clever.  Are we going to get a flurry of news stories every time about how this somehow moves the goalposts for quantum computers?    
  • I'm hoping to put out a review of Max the Demon and the Entropy of Doom, since I received my beautifully printed copies this past weekend.

Wednesday, October 25, 2017

Thoughts after a NSF panel

I just returned from an NSF proposal review panel.  I wrote about NSF panels in the early days of this blog here, back when I may have been snarkier.

  • Some things have gotten better.  We can work from our own laptops, and I think we're finally to the point where everyone at these things is computer literate and can use the online review system.  The program officers do a good job making sure that the reviews get in on time (ahead of the meeting).
  • Some things remain the same.  I'm still mystified at how few people from top-ranked programs (e.g., Harvard, Stanford, MIT, Cornell, Cal Tech, Berkeley) I see at these.  Maybe I just don't move in the right circles.  
  • Best quote of the panel:  "When a review of one of my papers or proposals starts with 'Author says' rather than 'The author says', I know that the referee is Russian and I'm in trouble."
  • Why does the new NSF headquarters have tighter security screening than Reagan National Airport?  
  • The growth of funding costs and eight years of numerically flat budgets have made this process more painful.  Sure looks like morale is not great at the agency.  Really not clear where this is all going to go over the next few years.  There was a lot of gallows humor about having "tax payer advocates" on panels.  (Everyone on the panel is a US taxpayer already, though apparently that doesn't count for anything because we are scientists.)
  • NSF is still the most community-driven of the research agencies. 
  • I cannot overstate the importance of younger scientists going to one of these and seeing how the system works, so you learn how proposals are evaluated.




Monday, October 23, 2017

Whither science blogging?

I read yesterday of the impending demise of scienceblogs, a site that has been around since late 2005 in one form or other.  I guess I shouldn't be surprised, since some of its bloggers have shifted to other sites in recent years, such as Ethan Siegel and Chad Orzel, who largely migrated to Forbes, and Rhett Allain, who went to Wired.  Steinn Sigurðsson is going back to his own hosted blog in the wake of this.

I hope this is just indicative of a poor business model at Seed Media, and not a further overall decline in blogging by scientists.  It's wonderful that online magazines like Quanta and Aeon and Nautilus are providing high quality, long-form science writing.  Still, I think everyone benefits when scientists themselves (in addition to professional science journalists) carve out some time to write about their fields.



Friday, October 20, 2017

Neutron stars and condensed matter physics

In the wake of the remarkable results reported earlier this week regarding colliding neutron stars, I wanted to write just a little bit about how a condensed matter physics concept is relevant to these seemingly exotic systems.

When you learn high school chemistry, you learn about atomic orbitals, and you learn that electrons "fill up" those orbitals starting with the lowest energy (most deeply bound) states, two electrons of opposite spin per orbital.  (This is a shorthand way of talking about a more detailed picture, involving words like "linear combination of Slater determinants", but that's a detail in this discussion.)  The Pauli principle, the idea that (because electrons are fermions) all the electrons can't just fall down into the lowest energy level, leads to this.  In solid state systems we can apply the same ideas.  In a metal like gold or copper, the density of electrons is high enough that the highest kinetic energy electrons are moving around at ~ 0.5% of the speed of light (!).  

If you heat up the electrons in a metal, they get more spread out in energy, with some occupying higher energy levels and some lower energy levels being empty.   To decide whether the metal is really "hot" or "cold", you need a point of comparison, and the Fermi energy gives you that.  If most of the low energy levels are still filled, the metal is cold.  If the ratio of the thermal energy scale, \(k_{\mathrm{B}}T\), to the depth of the lowest energy levels (essentially the Fermi energy, \(E_{\mathrm{F}}\)) is much less than one, then the electrons are said to be "degenerate".  In common metals, \(E_{\mathrm{F}}\) is several eV, corresponding to a temperature of tens of thousands of kelvin.  That means that even near the melting point of copper, the electrons are effectively very cold.
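To see those numbers in one place, here is a quick back-of-envelope check, assuming the standard free-electron textbook value of about 7 eV for the Fermi energy of copper (an assumption on my part; the post doesn't quote a specific number):

```python
import math

# Back-of-envelope check: are conduction electrons in copper "cold"
# even at the metal's melting point? Assumes the textbook free-electron
# Fermi energy of ~7 eV for copper.
k_B = 8.617e-5            # Boltzmann constant, eV/K
m_e = 9.109e-31           # electron mass, kg
eV  = 1.602e-19           # J per eV
c   = 2.998e8             # speed of light, m/s

E_F = 7.0                 # Fermi energy of copper, eV (assumed)
T_melt = 1358.0           # melting point of copper, K

T_F = E_F / k_B                      # Fermi temperature, K
v_F = math.sqrt(2 * E_F * eV / m_e)  # Fermi velocity, m/s
ratio = (k_B * T_melt) / E_F         # k_B T / E_F at the melting point

print(f"Fermi temperature ~ {T_F:.2e} K")       # tens of thousands of K
print(f"Fermi velocity ~ {100 * v_F / c:.2f}% of c")
print(f"k_B T / E_F at melting ~ {ratio:.3f}")  # << 1: still degenerate
```

The Fermi velocity that comes out is about 0.5% of the speed of light, as claimed above, and even at 1358 K the thermal energy is under 2% of the Fermi energy.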

Believe it or not, a neutron star is a similar system.  If you squeeze a bit more than one solar mass into a sphere 10 km across, the gravitational attraction is so strong that the electrons and protons in the matter are crushed together to form a degenerate ball of neutrons.  Amazingly, by our reasoning above, the neutrons are actually very very cold.  The Fermi energy for those neutrons corresponds to a temperature of nearly \(10^{12}\) K.  So, right up until they smashed into each other, those two neutron stars spotted by the LIGO observations were actually incredibly cold, condensed objects.   It's also worth noting that the properties of neutron stars are likely affected by another condensed matter phenomenon, superfluidity.   Just as electrons can pair up and condense into a superconducting state under some circumstances, it is thought that cold, degenerate neutrons can do the same thing, even when "cold" here might mean \(5 \times 10^{8}\) K.
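The same free-fermion estimate can be pushed (crudely) to the neutron star. This sketch assumes 1.4 solar masses of noninteracting, nonrelativistic neutrons packed uniformly into a 10 km radius sphere - it ignores general relativity and nuclear interactions, and the mass and radius are round numbers I've picked, but it gets the scale right:

```python
import math

# Crude Fermi temperature estimate for neutrons in a neutron star,
# modeled as noninteracting, nonrelativistic fermions at uniform density.
hbar = 1.055e-34        # reduced Planck constant, J s
m_n  = 1.675e-27        # neutron mass, kg
k_B  = 1.381e-23        # Boltzmann constant, J/K
M    = 1.4 * 1.989e30   # star mass, kg (assumed 1.4 solar masses)
R    = 1.0e4            # star radius, m (assumed 10 km)

V = (4.0 / 3.0) * math.pi * R**3
n = (M / m_n) / V                       # neutron number density, 1/m^3
E_F = (hbar**2 / (2 * m_n)) * (3 * math.pi**2 * n)**(2.0 / 3.0)
T_F = E_F / k_B                         # Fermi temperature, K

print(f"number density n ~ {n:.1e} m^-3")
print(f"Fermi temperature ~ {T_F:.1e} K")   # close to 10^12 K
```

Even this cartoon model lands within a factor of order one of the \(10^{12}\) K scale quoted above, which is why a \(5 \times 10^{8}\) K neutron star still counts as "cold".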

Sunday, October 15, 2017

Gravitational waves again - should be exciting

There is going to be a big press conference tomorrow, apparently to announce that LIGO/VIRGO has seen an event (binary neutron star collision) directly associated with a gamma ray burst in NGC 4993.  Fun stuff, and apparently the worst-kept secret in science right now.  This may seem off-topic for a condensed matter blog, but there's physics in there which isn't broadly appreciated, and I'll write a bit about it after the announcement.

Tuesday, October 10, 2017

Piezo controller question - followup.

A couple of weeks ago I posted:

Anyone out there using a Newport NPC3SG controller to drive a piezo positioning stage, with computer communication successfully talking to the NPC3SG?  If so, please leave a comment so that we can get in touch, as I have questions.

No responses so far.  This is actually the same unit as this thing:
https://www.piezosystem.com/products/piezo_controller/piezo_controller_3_channel_version/nv_403_cle/

In our unit from Newport, communications simply don't work properly.  Timeout problems.  The LabVIEW code supplied by Newport (the same code paired with the link above) has these problems, as do many other ways of trying to talk with the instrument.  Has anyone out there had success in using a computer to control and read this thing?   At issue is whether this is a hardware problem with our unit, or whether there is a general problem with these.  The vendor has been verrrrrrrrry slow to figure this out.

Sunday, October 08, 2017

The Abnormal Force

How does the chair actually hold you up when you sit down?  What is keeping your car tires from sinking through the road surface?  What is keeping my coffee mug from falling through my desk?  In high school and first-year undergrad physics, we teach people about the normal force - that is, a force that acts normal (perpendicular) to a surface and takes on whatever value is needed so that solid objects don't pass through each other.

The microscopic explanation of the normal force is that the electrons in the atoms of my coffee mug (etc.) interact with the electrons in the atoms of the desk surface, through a combination of electrostatics (electrons repel each other) and quantum statistics (the Pauli principle means that you can't just shuffle electrons around willy-nilly).  The normal force is "phenomenological" shorthand.  We take the observation that solid objects don't pass through each other, deduce that whatever is happening microscopically, the effect is that there is some force normal to surfaces that touch each other, and go from there, rather than trying to teach high school students how to calculate it from first principles.  The normal force is an emergent effect that makes sense on macroscopic scales without knowing the details.  This is just like how we teach high school students about pressure as a useful macroscopic concept, without actually doing a statistical calculation of the average perpendicular force per area on a surface due to collisions with molecules of a gas or a liquid.  

You can actually estimate the maximum reasonable normal force per unit area.  If you tried to squeeze the electrons of two adjacent atoms into the volume occupied by one atom, even without the repulsion of like charges adding to the cost, the Pauli principle means you'd have to kick some of those electrons into higher energy levels.  If a typical energy scale for doing that for each electron is something like 1 eV, and you had a few electrons per atom, and the areal density of atoms is around \(10^{14}\) per cm\(^2\), then we can find the average force \(F_{\mathrm{av}}\) required to make a 1 cm\(^2\) area of two surfaces overlap with each other.   We'd have \(F_{\mathrm{av}} d \sim 10^{15}\) eV, where \(d\) is the thickness of an atom, around 0.3 nm.   That's around \(5.3 \times 10^{5}\) newtons/cm\(^2\), or around 5.3 GPa.   That's above almost all of the yield stresses for materials (usually quoted for tension rather than compression) - that just means that the atoms themselves will move around before you really push electrons around.
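Spelling out that arithmetic with the round numbers from the estimate (roughly \(10^{15}\) eV of Pauli energy cost per square centimeter of overlap, delivered over one atomic thickness):

```python
# The normal-force estimate above, in round numbers: overlapping two
# 1 cm^2 surfaces costs ~1 eV per electron, times a few electrons per
# atom, times ~1e14 atoms per cm^2, i.e. ~1e15 eV total, and that energy
# is paid over roughly one atomic thickness d ~ 0.3 nm.
eV = 1.602e-19               # J per eV
E_total = 1.0e15 * eV        # total energy cost, J (per cm^2 of overlap)
d = 0.3e-9                   # atomic thickness, m

F_av = E_total / d                       # average force, N (per cm^2)
pressure_GPa = F_av / 1.0e-4 / 1.0e9     # 1 cm^2 = 1e-4 m^2; Pa -> GPa

print(f"F_av ~ {F_av:.2e} N per cm^2")        # ~5e5 N
print(f"equivalent pressure ~ {pressure_GPa:.1f} GPa")
```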

Very occasionally, when two surfaces are brought together, there is a force that arises at the interface that is not along the normal direction.  A great example of that is in this video, which shows two graphite surfaces that spontaneously slide in the plane so that they are crystallographically aligned.  That work comes from this paper.

As far as I can tell, there is no official terminology for such a spontaneous in-plane force.  In the spirit of one of my professional heroes David Mermin, who coined the scientific term boojum, I would like to suggest that such a transverse force be known as the abnormal force.  (Since I don't actually work in this area and I'm not trying to name the effect after myself, hopefully the barrier to adoption will be lower than the one faced by Mermin, who actually worked on boojums :-)  ).

Tuesday, October 03, 2017

Gravitational radiation for the win + communicating science

As expected, LIGO was recognized by the Nobel Prize in physics this year.  The LIGO experiment is an enormous undertaking that combines elegant, simple theoretical ideas; incredible engineering and experimental capabilities; and technically virtuosic numerical theoretical calculations and data analysis techniques.  It's truly a triumph.

I did think it was interesting when Natalie Wolchover, one of the top science writers out there today, tweeted:   Thrilled they won, thrilled not to spend this morning speed-reading about some bizarre condensed matter phenomenon.

This sentiment was seconded by Peter Woit, who said he thought she spoke for all science journalists.

Friendly kidding aside, I do want to help.  Somehow it's viewed as comparatively easy and simple to write about this, or this, or this, but condensed matter is considered "bizarre".  

Sunday, October 01, 2017

Gravitational radiation redux + Nobel speculation

This past week, there was exciting news that the two LIGO detectors and the VIRGO interferometer had simultaneously detected the same event, a merger of black holes estimated to have taken place 1.6 billion lightyears away.  From modeling the data, the black hole masses are estimated at around 25 and 30 solar masses, and around 2.7 solar masses worth of energy (!) was converted in the merger into gravitational radiation.  The preprint of the paper is here.  Check out figure 1.  With just the VIRGO data, the event looks really marginal - by eye you would be hard pressed to pick it out of the fluctuating detector output.  However, when that data is thrown into the mix with that from the two LIGO detectors (completely independent of VIRGO), the case is quite strong.

This is noteworthy for (at least) two reasons.  First, there has been some discussion about the solidity of the previously reported LIGO results - this paper (see here for a critique of relevant science journalism) argues that there are some surprising correlations in the noise background of the two detectors that could make you wonder about the analysis.  After all, the whole point of having two detectors is that a real event should be seen by both, while one might reasonably expect background jitter to be independent since the detectors are thousands of miles apart.  Having a completely independent additional detector in the mix should be useful in quantifying any issues.  Second, having the additional detector helps nail down the spot in the sky where the gravitational waves appear to originate.  This image shows how previous detections could only be localized by two detectors to a band spanning lots of the sky, while this event can be localized down to a spot spanning a tenth as much solid angle.    This is key to turning gravitational wave detectors into serious astronomy tools, by trying to link gravitational event detection to observations across the electromagnetic spectrum.  There were rumors, for example, that LIGO had detected what was probably a neutron star collision (smaller masses, but far closer to earth), the kind of event thought to produce dramatic electromagnetic signatures like gamma ray bursts.

On that note, I realized Friday that this coming Tuesday is the announcement of the 2017 Nobel in physics.  Snuck up on me this time.  Speculate away in the comments.  Since topology in condensed matter was last year's award, it seems likely that this year will not be condensed matter-related (hurting the chances of people like Steglich and Hosono for heavy fermion and iron superconductors, respectively).  Negative index phenomena might be too condensed matter related.   The passing last year of Vera Rubin and Deborah Jin is keenly felt, and makes it seem less likely that galactic rotation curves (as evidence for dark matter) or ultracold fermions would get it this year.  Bell's inequality tests (Aspect, Zeilinger, Clauser) could be there.   The LIGO/VIRGO combined detection happened too late in the year to affect the chances of this being the year for gravitational radiation (which seems a shoo-in soon).

Tuesday, September 26, 2017

The terahertz gap

https://commons.wikimedia.org/wiki/File:Thz_freq_in_EM_spectrum.png?uselang=en-gb
At a thesis proposal talk yesterday, I realized that I hadn't ever written anything specifically about terahertz radiation (THz, or if you're trying to market something, t-rays).   Terahertz (\(10^{12}\) Hz) is the frequency of electromagnetic radiation higher than microwaves, but lower than what is traditionally labeled the far infrared.  Sometimes called "mm wave" radiation (1 THz would be a free-space wavelength of about 0.3 mm or 300 microns), THz is potentially very useful for communications (pdf, from here), imaging (here, here, here), and range detection (see here for an impressive google project; or here for an article about THz for self-driving cars), among other things.  It's also right around the frequency range of a lot of vibrations in molecules and solids, so it can be used for spectroscopy, though it's also around the energy range where water vapor in the atmosphere can be an efficient absorber.

This frequency region is an awkward middle ground, however.  That's sometimes why it's referred to as the "terahertz gap".

We tend to produce electromagnetic radiation by one of two approaches.  Classically, accelerating charges radiate electromagnetic waves.  In the low frequency limit, there are various ways to generate voltages that oscillate - we can in turn use those to drive oscillating currents and thus generate radio waves, for example.  See here for a very old school discussion.  It is not trivial to shake charges back and forth at THz frequencies, however.  It can be done, but it's very challenging.  One approach to generating a pulse of THz radiation is to use a photoconductive antenna.  Take two electrodes close together on a semiconductor substrate, with a voltage applied between them.  Smack the semiconductor with an ultrafast optical pulse that has a frequency high enough to photoexcite a bunch of charge carriers - those are then accelerated by the electric field between the electrodes and emit a pulse of radiation, including THz frequencies.

The other limit we often take in generating light is to work with some quantum system that has a difference in energy levels that is the same energy as the photons we want to generate.  This is the limit of atomic emission (say, having an electron drop from the 2p orbital to the 1s orbital of a hydrogen atom, and emitting an ultraviolet photon of energy around 10 eV) and also the way many solid state devices work (say, having an electron drop from the bottom of the conduction band to the top of the valence band in InGaAsP to produce a red photon of energy around 1.6 eV in a red LED).  The problem with this approach for THz is that the energy scale in question is very small - 1 THz is about 4 milli-electron volts (!).  As far as I know, there aren't naturally occurring solids with energy level splittings that small, so the approach from this direction has been to create artificial systems with such electronic energy gaps - see here.   (Ironically, there are some molecular systems with transitions considerably lower in energy than the THz that can be used to generate microwaves, as in this famous example.)
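The energy and wavelength conversions quoted above are easy to verify:

```python
# Why the THz range is awkward: the photon energy is tiny compared with
# typical electronic energy-level splittings in atoms and solids.
h  = 6.626e-34          # Planck constant, J s
eV = 1.602e-19          # J per eV
c  = 2.998e8            # speed of light, m/s
f  = 1.0e12             # 1 THz

E_meV = (h * f / eV) * 1000.0    # photon energy, milli-eV
wavelength_mm = (c / f) * 1000.0 # free-space wavelength, mm

print(f"photon energy ~ {E_meV:.1f} meV")                 # ~4 meV
print(f"free-space wavelength ~ {wavelength_mm:.2f} mm")  # ~0.3 mm
```

Compare that ~4 meV with the ~1.6 eV of a red LED photon: the gap between available electronic transitions and the THz scale is nearly three orders of magnitude.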

It looks like THz is starting to take off for technologies, particularly as more devices are being developed for its generation and detection.  SiGe-based transistors, for example, can operate at very high intrinsic speeds, and like in the thesis proposal I heard yesterday, these devices are readily made now and can be integrated into custom chips for exactly the generation and detection of radiation approaching a terahertz.  Exciting times.


Friday, September 22, 2017

Lab question - Newport NPC3SG

Anyone out there using a Newport NPC3SG controller to drive a piezo positioning stage, with computer communication successfully talking to the NPC3SG?  If so, please leave a comment so that we can get in touch, as I have questions.

Monday, September 18, 2017

Faculty position at Rice - theoretical astro-particle/cosmology

Assistant Professor Position at Rice University in

Theoretical Astro-Particle Physics/Cosmology


The Department of Physics and Astronomy at Rice University in Houston, Texas, invites applications for a tenure-track faculty position (Assistant Professor level) in Theoretical Astro-Particle physics and/or Cosmology. The department seeks an outstanding individual whose research will complement and connect existing activities in Nuclear/Particle physics and Astrophysics groups at Rice University (see http://physics.rice.edu). This is the second position in a Cosmic Frontier effort that may eventually grow to three members. The successful applicant will be expected to develop an independent and vigorous research program, and teach graduate and undergraduate courses. A PhD in Physics, Astrophysics or related field is required.

Applicants should send the following: (i) cover letter; (ii) curriculum vitae (including electronic links to 2 relevant publications); (iii) research statement (4 pages or less); (iv) teaching statement (2 pages or less); and (v) the names, professional affiliations, and email addresses of three references.  To apply, please visit: http://jobs.rice.edu/postings/11772.  Applications will be accepted until the position is filled, but only those received by Dec 15, 2017 will be assured full consideration. The appointment is expected to start in July 2018.  Further inquiries should be directed to the chair of the search committee, Prof. Paul Padley (padley@rice.edu).

Rice University is an Equal Opportunity Employer with commitment to diversity at all levels, and considers for employment qualified applicants without regard to race, color, religion, age, sex, sexual orientation, gender identity, national or ethnic origin, genetic information, disability or protected veteran status.



Faculty position at Rice - experimental condensed matter

Faculty Position in Experimental Condensed Matter Physics Rice University


The Department of Physics and Astronomy at Rice University in Houston, TX invites applications for a tenure-track faculty position in experimental condensed matter physics.  The department expects to make an appointment at the assistant professor level. This search seeks an outstanding individual whose research interest is in hard condensed matter systems, who will complement and extend existing experimental and theoretical activities in condensed matter physics on semiconductor and nanoscale structures, strongly correlated systems, topological matter, and related quantum materials (see http://physics.rice.edu/). A PhD in physics or related field is required. 

Applicants to this search should submit the following: (1) cover letter; (2) curriculum vitae; (3) research statement; (4) teaching statement; and (5) the names, professional affiliations, and email addresses of three references. For full details and to apply, please visit: http://jobs.rice.edu/postings/11782. Applications will be accepted until the position is filled. The review of applications will begin October 15 2017, but all those received by December 1 2017 will be assured full consideration. The appointment is expected to start in July 2018.  Further inquiries should be directed to the chair of the search committee, Prof. Emilia Morosan (emorosan@rice.edu).  

Rice University is an Equal Opportunity Employer with commitment to diversity at all levels, and considers for employment qualified applicants without regard to race, color, religion, age, sex, sexual orientation, gender identity, national or ethnic origin, genetic information, disability or protected veteran status.

Friday, September 15, 2017

DOE experimental condensed matter physics PI meeting, day 3

And from the last half-day of the meeting:

  • Because the mobile electrons in graphene have an energy-momentum relationship similar to that of relativistic particles, the physics of electrons bound to atomic-scale defects in graphene has much in common with the physics that sets the limits on the stability of heavy atoms - when the kinetic energy of the electrons in the innermost orbitals is high enough that relativistic effects become very important.  It is possible to examine single defect sites with a scanning tunneling microscope and look at the energies of bound states, and see this kind of physics in 2d.  
  • There is a ton of activity concentrating on realizing Majorana fermions, expected to show up in the solid state when topologically interesting "edge states" are coupled to superconducting leads.  One way to do this would be to use the edge states of the quantum Hall effect, but usually the magnetic fields required to get in the quantum Hall regime don't play well with superconductivity.  Graphene can provide a way around this, with amorphous MoRe acting as very efficient superconducting contact material.  The results are some rather spectacular and complex superconducting devices (here and here).
  • With an excellent transmission electron microscope, it's possible to carve out atomically well defined holes in boron nitride monolayers, and then use those to create confined potential wells for carriers in graphene.  Words don't do justice to the fabrication process - it's amazing.  See here and here.
  • It's possible to induce and see big collective motions of a whole array of molecules on a surface that each act like little rotors.
  • In part due to the peculiar band structure of some topologically interesting materials, they can have truly remarkable nonlinear optical properties.
My apologies for not including everything - side discussions made comprehensive note-taking tough, and the selection in these postings reflects that rather than any judgment of relative excitement.  Likewise, the posters at the meeting were very informative, but I did not take notes on those.

Wednesday, September 13, 2017

DOE experimental condensed matter PI meeting, day 2

More things I learned:

  • I've talked about skyrmions before.  It turns out that by coupling a ferromagnet to a strong spin-orbit coupling metal, one can stabilize skyrmions at room temperature.  They can be visualized using magnetic transmission x-ray microscopy - focused, circularly polarized x-ray studies.   The skyrmion motion can show its own form of the Hall effect.  Moreover, it is possible to create structures where skyrmions can be created one at a time on demand, and moved back and forth in a strip of that material - analogous to a racetrack memory.
  • Patterned arrays of little magnetic islands continue to be a playground for looking at analogs of complicated magnetic systems.  They're a kind of magnetic metamaterial.  See here.  It's possible to build in frustration, and to look at how topologically protected magnetic excitations (rather like skyrmions) stick around and can't relax.
  • Topological insulator materials, with their large spin-orbit effects and surface spin-momentum locking, can be used to pump spin and flip magnets.  However, the electronic structure of both the magnet and the TI are changed when one is deposited on the other, due in part to interfacial charge transfer.
  • There continues to be remarkable progress on the growth and understanding of complex oxide heterostructures and interfaces - too many examples and things to describe.
  • The use of nonlinear optics to reveal complicated internal symmetries (talked about here) continues to be very cool.
  • Antiferromagnetic layers can be surprisingly good at passing spin currents.  Also, I want to start working on yttrium iron garnet, so that I can use this at some point in a talk.
  • It's possible to do some impressive manipulation of the valley degree of freedom in 2d transition metal dichalcogenides, creating blobs of complete valley polarization, for example.  It's possible to use an electric field to break inversion symmetry in bilayers and turn some of these effects on and off electrically.
  • The halide perovskites actually can make fantastic nanocrystals in terms of optical properties and homogeneity.

Tuesday, September 12, 2017

DOE experimental condensed matter PI meeting, day 1

I'm pressed for time, so this is brief, but here are some things I learned yesterday:
  • An electric field perpendicular to the plane can split and shift the Landau levels of bilayer graphene.  See here.
  • The quantum Hall effect in graphene and other 2d systems still has a lot of richness and life in it.
  • I have one word for you...."polaritons".
  • It's possible to set up a tunneling experiment, from one "probe" 2d electron gas that has a small, tight Fermi surface, into a "sample" 2d electron gas of interest.  By playing with the in-plane magnetic field, the tunneling electrons can pick up momentum in the plane as they tunnel.  The result is, the tunneling current as a function of voltage and transverse fields lets you map out exactly the "sample" electronic states as a function of energy and momentum, like ARPES without the PES part.  See here.
  • Squeezing mechanically to apply pressure can actually produce dramatic changes (quantum phase transitions) in unusual fractional quantum Hall states.
  • How superconductivity dies in the presence of disorder, magnetic field, and temperature remains very rich and interesting.  The "Bose metal", when magnetic field kills global phase coherence without completely ripping apart Cooper pairs, can be an important part of that transition.  For related work, see here.
  • One should be very careful in interpreting ARPES data.  It's entirely possible that not everything identified as some exotic topological material really fits the bill - see here.  On the other hand, sometimes you do see real topologically interesting band structure.
  • The DOE still has laptops running Windows XP.

Sunday, September 10, 2017

DOE Experimental Condensed Matter PI meeting, 2017

The Basic Energy Sciences program is part of the US Department of Energy's Office of Science, and they are responsible for a lot of excellent science research funding.  The various research areas within BES have investigator meetings every two years, and at the beginning of this coming week is the 2017 PI meeting for the experimental condensed matter physics program.  As I've done in past years,  I will try to write up a bulleted list of things I learn.   (See here, here, and here for the 2013 meeting; see here, here, here, and here for the 2015 meeting).

Good luck and stay safe to those in Florida about to get hit by Hurricane Irma.  It's very different than Harvey (much more of a concern about wind damage and storm surge, much less about total rainfall), but still very dangerous.

Lastly, Amazon seems to have my book available for a surprisingly low price right now ($62, though the list is $85).  I (and my publisher) still have no idea how they can do this without losing money.  

Sunday, September 03, 2017

Capillary action - the hidden foe in the physics of floods

There is an enormous amount of physics involved in storms and floods.   The underlying, emergent properties of water are key to much of this.

An individual water molecule can move around, and it can vibrate and rotate in various ways, but it's not inherently wet.  Only when zillions of water molecules get together does something like "wetness" of water even take on meaning.  The zillions of molecules are very egalitarian:  They explore all possible microscopic arrangements (including how they're distributed in space and how they're moving) that are compatible with their circumstances (e.g., sitting at a particular temperature and pressure).  At some temperatures and pressures, the most arrangements correspond to the water molecules being close together as a liquid - the water molecules are weakly attracted to each other if they get close together; at other temperatures and pressures, the most arrangements correspond to the water molecules being spread out as a gas.    Big tropical systems are basically heat engines, powered by the temperature difference between the surface layers of seawater and the upper atmosphere.  That difference in temperatures leads to net evaporation, driving water into the gas phase (by the gigaton, in the case of Hurricane Harvey).  Up in the cold atmosphere, the water condenses again into droplets, heating the air.  If those droplets are small enough, the forces from adjacent air molecules bouncing off the droplets slow the droplets to the point where they are borne aloft by large-scale breezes - that's why clouds don't fall down even though they're made of water droplets.
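That last claim can be made quantitative with Stokes drag: a tiny droplet falls at the speed where viscous drag balances its weight.  A minimal sketch, using an illustrative 10-micron droplet radius and a typical value for the viscosity of air (my numbers, not the post's):

```python
# Stokes-law terminal velocity of a small sphere: v = 2*rho*g*r^2 / (9*eta).
RHO_WATER = 1000.0   # kg/m^3, density of the droplet
G = 9.81             # m/s^2
ETA_AIR = 1.8e-5     # Pa*s, dynamic viscosity of air (typical value, assumed)

def terminal_speed(radius_m):
    """Terminal fall speed (m/s) of a water droplet in air, Stokes regime."""
    return 2.0 * RHO_WATER * G * radius_m ** 2 / (9.0 * ETA_AIR)

# A 10-micron cloud droplet drifts down at only ~1 cm/s, so even gentle
# updrafts keep it aloft; the v ~ r^2 scaling is why raindrops do fall.
print(terminal_speed(10e-6))
```

(Millimeter-sized raindrops are well outside the Stokes regime, but the trend - bigger droplets fall much faster - is the right physics.)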

There is another feature that comes from the attraction of water molecules to one another, and the attraction between water molecules and their surroundings.   Because of the intermolecular attraction, water molecules would have less energy if they were close together, and therefore having a water-air interface costs energy.  One result is surface tension - the tendency for liquid droplets to pull into small blobs that minimize their (liquid/vapor interface) surface area.

However, sometimes the attractive interaction between a water molecule and some surface can be even stronger than the interaction between the water molecule and other water molecules.  When that happens, a water droplet on such a surface will spread out instead of "beading up".  The surface is said to be hydrophilic.  See here.  This is why some surfaces "like" to get wet, like your dirty car windshield.

Sneaking in here is actually the hidden foe that is known all too well to those who have ever dealt with flooding.  You've seen it daily, even if you've never consciously thought about it.  It's capillary action.  A network of skinny pores or very high surface area hydrophilic material can wick up water like crazy.  Again, the water is just exploring all possible microscopic arrangements, and it so happens that in a high surface area, hydrophilic environment, many many arrangements involve the water being spread out as much as possible on that surface.  This can be to our advantage sometimes - it helps get water to the top of trees, and it makes paper towels work well for drying hands.  However, it can also cause even a couple of cm of floodwater indoors to ruin the bottom meter of sheetrock, or bring water up through several cm of insulation into wood floors, or transport water meters up carpeted stairs.   Perhaps it will one day be economically and environmentally feasible to make superhydrophobic wall and flooring material, but we're not there yet.
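The wicking height can be estimated from Jurin's law, \(h = 2\gamma \cos\theta / (\rho g r)\).  A minimal sketch, assuming a perfectly wetting surface and pore radii that are illustrative guesses for porous building materials, not measured values:

```python
import math

# Jurin's law: capillary rise height h = 2*gamma*cos(theta) / (rho*g*r).
GAMMA = 0.072    # N/m, surface tension of water at room temperature
RHO = 1000.0     # kg/m^3, density of water
G = 9.81         # m/s^2

def capillary_rise(pore_radius_m, contact_angle_deg=0.0):
    """Equilibrium rise height (m) of water in a cylindrical pore."""
    theta = math.radians(contact_angle_deg)
    return 2.0 * GAMMA * math.cos(theta) / (RHO * G * pore_radius_m)

# Skinnier pores wick higher: ~15 cm for 100-micron pores,
# ~1.5 m for 10-micron pores - enough to ruin a meter of sheetrock.
for r in (100e-6, 10e-6):
    print(r, capillary_rise(r))
```

The \(1/r\) dependence is the whole story: the finer the pore network, the higher the water climbs.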

(To all my Houston readers, I hope you came through the storm ok!  My garage had 0.8m of water, which killed my cars, but the house is otherwise fine, and the university + lab did very well.)

Friday, August 25, 2017

Hurricanes, heat engines, etc.

Looks like it's going to be a wet few days, with the arrival of Harvey.   I've mentioned previously that hurricanes and tropical storm systems are heat engines - they basically use the temperature difference between the heated water in the ocean and the cooler air in the upper atmosphere to drive enormous flows of matter (air currents, water in vapor and liquid form).  A great explanation of how this works is here.  Even with very crude calculations one can see that the power involved in a relatively small tropical rain event is thousands of GW, hundreds of times greater than the power demands of a major city.   Scaling up to a hurricane, you arrive at truly astonishing numbers.  It's likely that Harvey is churning along at an average power some 200 times greater than the electrical generating capacity of the planet (!).  Conservative predictions right now are for total rainfall of maybe 40 cm across an area the size of the state of Louisiana, which would amount to around 5.2e10 metric tons of water.   Amazing.  I'm planning to write more in the future about some of this, time permitting.

Update:  For what it's worth, Vox has an article about Harvey, and they say it deposited 14-15 trillion gallons of water.  Each gallon of water has a mass of about 3.78 kg, meaning that the total mass deposited for 14 trillion gallons is 5.3e10 metric tons.  How's that for estimating accuracy in the above?
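Both versions of the estimate are easy to reproduce.  A quick sketch (the Louisiana area is my approximate figure; the gallon count is the one quoted from Vox):

```python
# Two routes to the total rainfall mass of Hurricane Harvey.
RHO_WATER = 1000.0            # kg/m^3
AREA_LOUISIANA_M2 = 1.35e11   # total area of Louisiana, ~52,000 sq mi (approx.)
RAIN_DEPTH_M = 0.40           # 40 cm of predicted rainfall

# Route 1: depth x area x density, as in the original estimate.
mass_kg = RAIN_DEPTH_M * AREA_LOUISIANA_M2 * RHO_WATER
print(mass_kg / 1e3)          # ~5.4e10 metric tons

# Route 2: Vox's 14 trillion gallons at ~3.78 kg of water per gallon.
mass_vox_kg = 14e12 * 3.78
print(mass_vox_kg / 1e3)      # ~5.3e10 metric tons
```

The two routes agree to within a few percent, which for an order-of-magnitude estimate is remarkably good.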


Friday, August 18, 2017

Invited symposia/speaker deadlines, APS 2018

For those readers who are APS members, a reminder.  The deadline for nominations for invited symposia for the Division of Condensed Matter Physics for the 2018 March Meeting is August 31.   See here.

Likewise, the Division of Materials Physics has their invited speaker (w/in focus topic) deadline of August 29.  See here.

Please nominate!

Power output of a lightsaber

It's been a long week and lots of seriously bad things have been in the news.   I intend to briefly distract myself by looking at that long-standing question: what is the power output of a lightsaber?  I'm talking about the power output when the lightsaber is actually slicing through something, not just looking cool.  We can get a very rough, conservative estimate from the documentary video evidence before us.  I choose not to use the prequels, on general principle, though the scene in The Phantom Menace when Qui-Gon Jinn cuts through the blast door would be a good place to start.  Instead, let's look at The Force Awakens, where Kylo Ren throws a tantrum and slices up an instrument panel, leaving behind dripping molten metal.  

With each low-effort swing of his arm, Ren's lightsaber, with a diameter of around 2 cm, moves at something more than 2 m/s, slicing metal to a depth of, say, 3 cm (actually probably deeper than that).  That is, the cutting part of the blade is sweeping out a volume of around 1200 cc/sec.  It is heating that volume of console material up to well above its melting point, so we need to worry about the energy it takes to heat the solid from room temperature (300 K) up to its melting point, and then the heat of fusion required to melt the material.  At a rough guess, suppose imperial construction is aluminum.  Aluminum has a specific heat of 0.9 J/g-K, a density of 2.7 g/cc when solid, a melting point of 933 K, and a heat of fusion of 10.7 kJ/mol.  In terms of volume, that's (10.7 kJ/mol)(1 mol/27 g)(2.7 g/cc) = 1070 J/cc.  So, the total power is around (933 K - 300 K)*(0.9 J/g-K)*(2.7 g/cc)*(1200 cc/sec) + (1070 J/cc)(1200 cc/sec) = 3.1e6 J/s = 3.1 MW.  
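The arithmetic is easy to check.  A sketch using the same blade geometry and the standard thermal properties of aluminum:

```python
# Lightsaber power: heat swept aluminum from 300 K to its melting point, then melt it.
diameter_cm, depth_cm, speed_cm_s = 2.0, 3.0, 200.0   # blade geometry, ~2 m/s swing
vol_rate_cc_s = diameter_cm * depth_cm * speed_cm_s   # ~1200 cc/s swept out

C_P = 0.9                       # J/(g*K), specific heat of solid aluminum
RHO = 2.7                       # g/cc, density of solid aluminum
T_MELT, T_ROOM = 933.0, 300.0   # K
L_FUS = 10.7e3 / 27.0           # J/g: heat of fusion, 10.7 kJ/mol over 27 g/mol

# Sensible heat plus latent heat, per second of slicing:
power_W = vol_rate_cc_s * RHO * (C_P * (T_MELT - T_ROOM) + L_FUS)
print(power_W / 1e6)            # ~3.1 MW
```

Note this is conservative - it ignores radiative losses, heat conducted away into the rest of the panel, and any superheating of the melt.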

Hot stuff.

Thursday, August 10, 2017

That's the way the ball bounces.

How does a ball bounce?  Why does a ball, dropped from some height onto a flat surface, not bounce all the way back up to its starting height?  The answers to these questions may seem obvious, but earlier this week, this paper appeared on the arxiv, and it does a great job of showing what we still don't understand about this everyday physics that is directly relevant for a huge number of sports.

The paper talks specifically about hollow or inflated balls.  When a ball is instantaneously at rest, mid-bounce, its shape has been deformed by its interaction with the flat surface.  The kinetic energy of its motion has been converted into potential energy, tied up in a combination of the elastic deformation of the skin or shell of the ball and the compression of the gas inside the ball.  (One surprising thing I learned from that paper is that high speed photography shows that the non-impacting parts of such inflated balls tend to remain spherical, even as part of the ball deforms flat against the surface.)  That gas compression is quick enough that heat transfer between the gas and the ball is probably negligible.  A real ball does not bounce back to its full height; equivalently, the ratio \(v_{f}/v_{i}\) of the ball's speed immediately after the bounce, \(v_{f}\), to that immediately before the bounce, \(v_{i}\), is less than one.  That ratio is called the coefficient of restitution.
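Since the ball is in free fall before and after impact, \(v \propto \sqrt{h}\), so the coefficient of restitution can be read off from drop and rebound heights alone.  A minimal sketch (the heights are illustrative, roughly what a basketball does):

```python
import math

def restitution_from_heights(h_drop, h_rebound):
    """Coefficient of restitution e = v_f / v_i = sqrt(h_rebound / h_drop)."""
    return math.sqrt(h_rebound / h_drop)

# A ball dropped from 1.8 m that rebounds to 1.0 m:
e = restitution_from_heights(1.8, 1.0)
print(e)             # ~0.75
# Fraction of the kinetic energy converted to heat/sound on each bounce:
print(1.0 - e ** 2)  # ~0.44
```

That last number is the point of the paper: nearly half the kinetic energy disappears each bounce, and we don't actually know which dissipation channel dominates.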

Somehow in the bounce process some energy must've been lost from the macroscopic motion of the ball, and since we know energy is conserved, that energy must eventually show up as disorganized, microscopic energy of jiggling atoms that we colloquially call heat.   How can this happen?

  • The skin of the ball might not be perfectly elastic - there could be some "viscous losses" or "internal friction" as the skin deforms.
  • As the ball impacts the surface, it can launch sound waves into the surface that eventually dissipate.
  • Similarly, the skin of the ball itself can start vibrating in a complicated way, eventually damping out to disorganized jiggling of the atoms.
  • As the ball's skin hits the ground and deforms, it squeezes air out from beneath the ball; the speed of that air can actually exceed the speed of sound in the surrounding medium (!), creating a shock wave that dissipates by heating the air, as well as ordinary sound vibrations.  (It turns out that clapping your hands can also create shock waves!  See here and here.)
  • There can also be irreversible acoustic processes in the gas inside the ball that heat the gas in there.
This paper goes through all of these, estimates how big those effects are, and concludes that, for many common balls (e.g., basketballs), we actually don't understand the relative importance of these different contributions.  The authors propose some experiments to figure out what's going on.  The whole thing is a nice exercise in mechanics and elasticity, and it's always fun to realize that there may still be some surprises lurking in the physics of the everyday.

Saturday, August 05, 2017

Highlights from Telluride

Here are a few highlights from the workshop I mentioned.  I'll amend this over the next couple of days as I have time.  There is no question that smaller meetings (this one was about 28 people) can be very good for discussions.
  • I learned that there is a new edition of Cuevas and Scheer that I should pick up.  (The authors are Juan Carlos Cuevas and Elke Scheer, a great theorist/experimentalist team-up.)
  • Apparently it's possible to make a guitar amplifier using tunnel junctions made from self-assembled monolayers.  For more detail, see here.
  • Some folks at Aachen have gotten serious about physics lab experiments you can do with your mobile phone.
  • Richard Berndt gave a very nice talk about light emission from atomic-scale junctions made with a scanning tunneling microscope.  Some of that work has been written about here and here.  A key question is, when a bias of \(eV\) is applied to such a junction, what is the mechanism that leads to the emission of photons of energies \(\hbar \omega > eV\)?  Clearly the processes involve multiple electrons, but exactly how things work is quite complicated, involving both the plasmonic/optical resonances of the junction and the scattering of electrons at the atomic-scale region.  Two relevant theory papers are here and here.
  • Latha Venkataraman showed some intriguing new results indicating room temperature Coulomb blockade-like transport in nanoclusters.  (It's not strictly Coulomb blockade, since the dominant energy scale seems to be set by single-particle level spacing rather than by the electrostatic charging energy of changing the electronic population by one electron).
  • Katharina Franke showed some very pretty data on single porphyrins measured via scanning tunneling microscope, as in here.  Interactions between the tip and the top of the molecule result in mechanical deformation of the molecule, which in turn tunes the electronic coupling between the transition metal in the middle of the porphyrin and the substrate.  This ends up being a nice system for tunable studies of Kondo physics.
  • Uri Peskin explained some interesting recent results that were just the beginning of some discussions about what kind of photoelectric responses one can see in very small junctions.  One recurring challenge:  multiple mechanisms that seem to be rather different physics can lead to similar experimentally measurable outcomes (currents, voltages).
  • Jascha Repp discussed some really interesting experiments combining STM and THz optics, to do true time-resolved measurements in the STM, such as watching a molecule bounce up and down on a metal surface (!).  This result is timely (no pun intended), as this remarkable paper just appeared on the arxiv, looking at on-chip ways of doing THz and faster electronics.
  • Jeff Neaton spoke about the ongoing challenge of using techniques like density functional theory to calculate and predict the energy level alignment between molecules and surfaces to which they're adsorbed or bonded.  This is important for transport, but also for catalysis and surface chemistry broadly.  A relevant recent result is here.
  • Jan van Ruitenbeek talked about their latest approach to measuring shot noise spectra in atomically small structures up to a few MHz, and some interesting things that this technique has revealed to them at high bias.  
  • There were multiple theory talks looking at trying to understand transport, inelastic processes, and dissipation in open, driven quantum systems.  Examples include situations where higher driving biases can actually make cooling processes more efficient; whether it's possible to have experiments in condensed matter systems that "see" many-body localization, an effect most explored in cold atom systems; using ballistic effects in graphene to do unusual imaging experiments or make electronic "beam splitters"; open systems from a quantum information point of view; what we mean by local effective temperature on very small scales; and new techniques for transport calculations. 
  • Pramod Reddy gave a really nice presentation about his group's extremely impressive work measuring thermal conduction at the atomic scale.  Directly related, he also talked about the challenges of measuring radiative heat transfer down to nm separations, where the Stefan-Boltzmann approach should be supplanted by near-field physics.  This was a very convincing lesson in how difficult it is to ensure that surfaces are truly clean, even in ultrahigh vacuum.
  • Joe Subotnik's talk about electronic friction was particularly striking to me, as I'd been previously unaware of some of the critical experiments (1, 2).  The key questions: when and how do electron-hole excitations in metals lead to big changes in the vibrational energy content of molecules, and how should we think about this?  These issues are related to these experiments as well.  
  • Ron Naaman spoke about chiral molecules and how electron transfer to and from these objects can have surprising, big effects (see here and here).
  • Gemma Solomon closed out the proceedings with a very interesting talk about whether molecules could be used to make effective insulating layers better at resisting tunneling current than actual vacuum, and a great summary of the whole research area, where it's been, and where it's going.