22nd March 2013, 18:54 JST

One of the good things about working at RIKEN is that the Brain Science Institute (BSI) is just across the road, and it hosts an appealing lecture series. Today there was a presentation by Giulio Tononi about why animals sleep. In other words: sleep must have a biological function, and the question of what that function is remains unsolved. He quickly discarded older ideas, for instance that sleep saves energy that cannot be spent usefully during the night, and argued that it must have to do with handling the experiences obtained during the day.

During sleep your brain activity differs from the waking state, and with current technologies such as functional MRI this can be pinpointed in ever greater detail. One observation is that, during the non-REM part of sleep, neurons in parts of the brain cease firing completely for a short period (milliseconds), and this happens about 1000 times per night.

The nervous system, and especially the brain, consists of neurons that are connected to each other; the connections are called synapses. If a synaptic link is strong, the chance that the next neuron fires when the previous one does is greater. The strength of the synaptic link can vary. It is quite uncontested that by using a particular neuron (for instance by hearing a sound or performing a manual action), the synapses involved are strengthened (potentiated, in the jargon).

There is a cost associated with having strong synaptic links; for instance, they consume more energy and space. Therefore, if some synapses are strengthened by new experiences, that must be balanced by weakening others, which is called renormalization. Tononi’s hypothesis is the following (if I understood correctly): while awake, the synapses you have used most recently (on that day) are the most excited ones. If you were to renormalize there and then, the experiences of that day would be represented too strongly, and you would forget other memories too soon. Instead, during sleep you are disconnected from the environment, and you can ‘sample’ all your experiences, past and present, to renormalize your synapses more effectively (whether this sampling is dreaming cannot be determined). Here being disconnected, hence sleeping, is essential.
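To make the idea of renormalization concrete, here is a toy numerical sketch (entirely my own illustration, not Tononi’s actual model, and with made-up numbers): a few synapses are potentiated during the “day”, and during “sleep” all weights are scaled back down so that the total synaptic strength returns to its baseline, while the relative ordering of strengths is preserved.

```python
# Hypothetical illustration: 20 synapses, all starting at baseline strength 1.0.
weights = [1.0] * 20

# "Day": a handful of synapses used today are potentiated.
used_today = [2, 5, 11]
for i in used_today:
    weights[i] *= 1.5

# "Sleep": renormalize so that the total strength returns to its baseline,
# weakening every synapse proportionally while preserving their ratios.
baseline_total = 20.0
scale = baseline_total / sum(weights)
weights = [w * scale for w in weights]

print(round(sum(weights), 6))   # total strength back at baseline: 20.0
print(weights[2] > weights[0])  # today's synapses remain relatively strong: True
```

The point of the sketch is only that the day’s experiences stay encoded (as relatively stronger synapses) even though every synapse has been weakened, so the overall cost is bounded.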

Obviously this is not my area of expertise, and I cannot judge how controversial this hypothesis is. It makes sense. In any case it’s very interesting, and the progress on this topic is impressive.

10th August 2012, 18:50 JST

Presently I am at the Abdus Salam International Centre for Theoretical Physics in Trieste, Italy, attending the meeting Innovations in Strongly Correlated Electronic Systems. Amusingly, this first week is a summer school, with extended lectures aimed at providing an introduction or overview of current research topics, whereas next week will be a regular workshop, where scientists present their newest results. The ICTP is a famous place for physics meetings, and they also have researchers and students working here. It is supported by UNESCO, with an explicit mission to support science in developing countries.

The summer school has been very nice so far. It has probably the most international audience I have ever been part of, and from the beginning people have been asking questions abundantly during the presentations. The scope is rather broad, which I like, yet everything is centred around phenomena in materials where the (electron) interactions are strong, as opposed to for instance regular metals and insulators, where the interactions are mostly screened and therefore weak in nature. The most popular compounds are high-temperature superconductors and Mott insulators.

We have live blogging, and video recordings of all the lectures are available online.

Trieste is a great place, at the north-easternmost extremity of Italy, on the Adriatic coast and surrounded on three sides by the Slovenian border. It was the only port of the Habsburg empire, and that former wealth shows in the many fine buildings in the rather small city. Now it’s mostly tourism, and the centre is filled with restaurants, bars, caffetterie, gelaterie, delicacy shops and so forth. The weather has been excellent so far, between 25 and 30 °C, and not as humid as Japan. I’m staying in a hotel in the city, which allows me to buy some of the exquisite cheeses, meats and bread to take for lunch (a great improvement on the disappointing food served in the canteen here).

update: The workshop was very interesting and successful. For future reference: you can rent a bicycle in Trieste at Surf. My favourite places are Osteria de Scarpon (food) and Osteria da Marino (wine).

4th July 2012

Today it is/will be announced that the Large Hadron Collider (LHC) accelerator experiment near Geneva has observed the Higgs particle. Although this is not my field of expertise, it is a huge discovery in physics, and I will write about it here. Some updates will probably follow, today and later, at the end of this post.

What is the Higgs particle?

Most importantly, it is the single missing piece in the puzzle of fundamental particles called the Standard Model. This model was developed from the 1960s onwards, and makes very precise predictions about which fundamental particles exist and how they interact. Roughly, there are two kinds of particles: matter particles and force particles.

The matter particles comprise quarks (building blocks of the protons and neutrons that make up atomic nuclei), electrons and the elusive neutrinos. They all have mass. The force particles cause the matter particles to interact with each other. For instance, the electromagnetic force is mediated by light particles, called photons. The force particles are themselves massless, which is also the reason why photons travel at the maximum velocity, therefore known as the speed of light.

Now it comes: there is yet one other particle, which is unlike all the others. This is the Higgs particle. It does not mediate a force by itself, but instead interacts with the other force particles. When this happens, that force turns from a long-range force into a short-range force. Light does not interact with the Higgs, and therefore we can feel a tiny force from stars billions of light years away. The particles that carry the weak nuclear force, responsible for radioactive decay, and unimaginatively called W- and Z-bosons, do interact with the Higgs, and therefore only operate within, say, the atomic nucleus. In physics, this turning into a short-range force is completely equivalent to the force particle “obtaining a mass”, that is, turning from a massless into a massive particle. This is the reason why you can read statements like “the Higgs gives mass to all other particles”. The Higgs particle also interacts with some matter particles, but since they are already massive, their mass just gets a little bit bigger.
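The equivalence between mass and range can be stated in one standard textbook formula (not from the announcement itself, added here for reference): a force mediated by a particle of mass m falls off with the Yukawa form, whose range is set by that mass.

```latex
V(r) \propto \frac{e^{-r/\lambda}}{r}, \qquad \lambda = \frac{\hbar}{mc}
% massless mediator (photon): m = 0, so \lambda \to \infty, a long-range 1/r force
% massive mediator (W, Z):    m > 0, finite \lambda, a short-range force
```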

Bottom line: the fact that the last piece of the puzzle has been found is of prime importance; hyped claims like “God particle” and whatnot should not be taken seriously.

Interestingly, in precisely the same way as I described above, a magnetic field dies off over a short length in a superconductor, so turns from long-range to short-range. In this context it is called the Anderson-Higgs mechanism, and it is of central importance in my thesis work.

What does an accelerator do?

The big accelerator ring on the Swiss/French border was upgraded some years ago, and was, after an unfortunate accident in 2008, fully operational in 2010. In this ring protons (the nuclei of hydrogen atoms) are accelerated to near light speed in two opposite directions (there are two pipes). At four places these two beams can be made to intersect, so that protons from one beam collide with those from the other.

What happens basically is that two protons combine into a lump of energy, which immediately afterwards splits up into other particles. More precisely, the quarks within the protons do this. They can turn into other quarks, neutrinos, other stuff, and also into the Higgs particle, according to the very precise rules laid out by the Standard Model. Those resultant particles then fly away from the collision centre and into the detectors. These are huge instruments that can be thought of as the CCD element in your digital camera: they notice when a particle hits them, and can also measure its energy and momentum, like your camera can measure colour.

However, most of the produced particles, including the Higgs, are terribly unstable and quickly decay into other particles. So what the detectors measure is not the Higgs particle itself, but its remnants. Because there is so much going on, with so many collisions at the same time, the detectors collect a huge amount of data. This is then processed, first by discarding about 99% of it. The relevant data is analyzed by many people and big computers, and finally interpreted statistically.

Anyone who has ever done something with statistics knows there is always noise. Apart from errors, there are just a lot of random events, for instance particles from space, that your detectors pick up. The real particle collisions add just a few events on top of all of those random ones. So what you have to do is keep measuring and accumulate data, so that the real events start to outnumber the random ones. If there really is a particle you are interested in, you look for the predicted collision products with the predicted energy in the predicted directions, and see whether a “signal” eventually rises out of the noise.
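A minimal numerical sketch of why accumulating data helps (my own illustration, with entirely made-up event rates): the background fluctuates by roughly the square root of its size, so a rough measure of how clearly a signal stands out is the signal count divided by the square root of the background count.

```python
import math

# Hypothetical rates, purely for illustration: a rare signal process
# sitting on top of a large random background.
signal_rate = 2.0        # signal events per day
background_rate = 400.0  # background events per day

def significance(days):
    """Rough signal significance S / sqrt(B) after a given number of days."""
    s = signal_rate * days
    b = background_rate * days
    return s / math.sqrt(b)

# The significance grows as the square root of the running time.
for days in (1, 100, 10000):
    print(days, round(significance(days), 2))
```

This is why the experiments simply keep colliding protons: with these toy numbers the significance grows like √t, and particle physicists conventionally claim discovery once a signal reaches about five standard deviations above the background.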

This signal is compared to computer simulations/calculations based on the Standard Model. These simulations are done beforehand, so that if people see a signal, they know to which particular collision process it can be attributed.

In short: you can never see the Higgs particle itself, you can only accumulate evidence that it must have been there.

What has been discovered now?

As mentioned, all particles predicted by the Standard Model have been found except for the Higgs. Therefore the upgrade to the LHC was geared specifically towards finding the Higgs (though other things are researched as well). Two experiments, ATLAS and CMS, each independently look for different collision processes involving the Higgs.

One major problem is that the Standard Model does not predict the mass of the Higgs particle itself (but once you know the mass, the interactions it can undergo are predicted with high precision). Therefore the experiments had to look for all the signatures that Higgs particles of different masses would produce. This is done by looking for the collision products that should occur. If, with statistical certainty, they are not there, then you have “excluded the existence of a Higgs particle with mass x”. Then you do the same for other mass values. In the end you exclude whole mass regions, and this has been done over the past few decades. Of course, what you hope for is that the data contradicts the hypothesis that the particle is not there. If that happens with enough certainty, you claim discovery of the particle.
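The mass-scan logic can be sketched in a few lines (a deliberately crude caricature with invented numbers, not the actual statistical procedure the experiments use): for each assumed Higgs mass, compare the observed event count in its predicted decay channel with the count expected if a Higgs of that mass existed, and exclude masses where the data falls far short of the prediction.

```python
# Made-up numbers: expected extra events if a Higgs of the given mass
# (in GeV) exists, versus what was actually observed above background.
expected_if_higgs = {115: 80, 120: 75, 125: 70, 130: 65, 135: 60}
observed          = {115: 12, 120: 10, 125: 68, 130:  9, 135: 11}

excluded = []
for mass, expected in expected_if_higgs.items():
    n = observed[mass]
    # Crude criterion: if the observation falls more than 5*sqrt(expected)
    # below the prediction, a Higgs at this mass is "excluded".
    if expected - n > 5 * expected ** 0.5:
        excluded.append(mass)

print(sorted(excluded))  # every mass hypothesis except 125 GeV is excluded
```

In the real analyses the comparison is done with full likelihood fits and confidence levels rather than a fixed threshold, but the structure — scan the mass, exclude region by region, and watch for the one hypothesis the data refuses to exclude — is the same.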

There is a convention in particle physics whereby the “observation” of a particle denotes less certainty than its “discovery”. Today they will probably announce “observation of a Higgs particle with mass around 125 GeV”. Most people are, however, now convinced that this will not be overturned. This is the greatest physics achievement of this decade, probably in all of science. It is also amazing that a particle predicted in 1964 is now finally confirmed.

What’s next?

First of all, they have to study the Higgs very carefully and check it against the Standard Model. There are many properties and interactions, for which there are precise predictions, to which the Higgs particle should conform. It is not at all certain that this will be the case, and there may even be more than one variety of the Higgs particle. The Higgs may also not be fundamental, but itself consist of other particles.

If after all this the conclusion is that there is precisely one variety of Higgs particle, which conforms exactly to the Standard Model, and no other particles are found either, then we face a conundrum. Namely, the Standard Model has known problems; it does not explain everything in the world around us. For instance, gravity is not part of it! The search for solutions to these problems is called Beyond the Standard Model physics. The LHC will continue to look for anything that can help answer those questions.


If you have time to spare, you can watch excited nerds talking in an alien language at the CERN live webcast. The best source for physicists is probably Tommaso Dorigo’s live blogging. And don’t believe everything you read and hear in the regular media.

Press releases: CERN, NIKHEF


I will try to answer questions you may have, in English or Dutch.

Update 5 Jul 11:00 JST To be clear about what has been found: both the ATLAS and the CMS experiment see clear evidence, with just about enough statistical significance to claim discovery, for a new, hitherto unobserved particle with a mass of about 126 GeV. The observed behaviour of this particle agrees with what a Standard Model Higgs particle would do, but there is not enough data to unambiguously claim that it is the Higgs. However, we can invoke the duck theorem here: if it walks, talks and quacks like a duck, it’s probably a duck.

There is a slight deviation in the so-called Higgs-to-two-photon decay channel: they observe more of those events than one would expect from the number of collisions that have occurred. This may be a statistical fluctuation (the number of events is quite small, just a few hundred), or it may be a real effect that the Standard Model cannot explain. People are hoping for the latter, as it might give a clue towards answering the unsolved problems.



From Wednesday until today I attended the International Conference on Topological Quantum Phenomena 2012 in Nagoya. This is one of the meetings held as part of a five-year program, funded by the Ministry of Education (MEXT) and headed by Yoshi Maeno, which combines the efforts of several condensed matter research groups at multiple locations in Japan, all related by the exploration of topologically non-trivial states of matter. The plenary talks were presented by Grigori Volovik, Yoichi Ando, Tony Leggett and Shoucheng Zhang, and the scope is indeed rather broad.

Topological matters have been important ever since Dirac proposed his monopole, but they usually remained outside mainstream physics. Also, the focus has mostly been on topological defects, which are localized excitations that nevertheless influence the whole system. However, since the discovery of the (fractional) quantum Hall effect, we know that there can also be topological ground states, such that the systems can only be understood by regarding them as a whole, as opposed to having a local order parameter. After so-called topological insulators were predicted and then found a couple of years ago, it seems that “topological” has become a real buzzword. For me this is interesting, as I have worked on topological matters from my Master’s thesis onwards.

As pointed out by Volovik already in the first talk, it seems that topology is a necessary ingredient in the general classification scheme of states of matter, which is the principal task of condensed matter physics. So on top of the broken-symmetry paradigm, topologically non-trivial aspects need to be taken into account for a full understanding. Moreover, some old knowledge may be better formulated in the language of topology; according to Volovik, the Fermi surface itself is topologically non-trivial. By the way, his famous book The Universe in a Helium Droplet is very instructive and the draft is freely available.

Overall it was a nice conference with a broad scope that nevertheless seemed to belong to the same endeavour. The scale was quite right with just over 100 people attending on average per day. The interactions during the poster sessions—sometimes a dull affair—were particularly lively. Unfortunately, the deadline for submission was before my arrival in Japan, but I had some interesting discussions anyway.

On Friday there was an excursion to the very large Higashiyama zoo and to Nagoya castle. The latter was destroyed by incendiary bombing in 1945, but has been rebuilt in ferroconcrete. Of the cuisine, kochin, a local breed of chicken, was especially good.


The past two days I attended the Tonomura FIRST International Symposium on “Electron Microscopy and Gauge Fields”, in honor of Akira Tonomura and his efforts in realizing a 1.2 MV electron microscope. Tonomura was the first person to conclusively confirm the Aharonov-Bohm effect, which is one of the wonders of quantum mechanics and one reason why students fall in love with physics (more below).

Incredibly sadly, Tonomura was diagnosed with pancreatic cancer one year ago, seemed to recover well from a major operation, but passed away exactly one week before the symposium. It was supposed to be a grand meeting of his scientific friends, but it rather turned into a memorial conference. It was impressive nonetheless, held on the top floor of the Keio hotel with talks by, amongst others, Yakir Aharonov and Nobel Prize laureates C.N. Yang, Tony Leggett and Makoto Kobayashi. I realized only on Wednesday morning that I had actually read one of his books; I even cited it in my Master’s thesis.

I particularly liked Aharonov’s talk, in which he explained how he came up with the prediction for the AB effect. This phenomenon is a variation on the quantum version of Young’s double-slit experiment, which is typically used to verify the wave nature of fields and particles. Here is a cute video portraying the single-electron double-slit experiment. Aharonov and Bohm suggested putting an infinitely thin solenoid right between the slits, so that there is a magnetic field between the trajectories but not on any trajectory the electron can take. In electrodynamics a magnetic field curves the motion of a charged particle (such as an electron); this is called the Lorentz force, but it only acts when the particle actually travels through regions of non-zero magnetic field. Aharonov and Bohm predicted that the interference pattern is nevertheless shifted from its central position, due to the presence of the magnetic field in between, even though the electron never ‘feels’ the magnetic field directly.
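The predicted shift can be stated compactly (the standard textbook result, added here for reference): the phase difference between the two paths is set entirely by the magnetic flux Φ enclosed between them, even though the field vanishes along both paths.

```latex
\Delta\varphi = \frac{e}{\hbar} \oint \mathbf{A} \cdot d\mathbf{l} = \frac{e\,\Phi}{\hbar}
% A is the vector potential, which is non-zero along the paths
% even where the magnetic field B itself vanishes
```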

In solid state physics, the periodic arrangement of the atoms or ions in a crystal lattice causes so-called band gaps in the energy spectrum, which is why some materials are electrical insulators and others are metals. The reason behind this is Bloch’s theorem; so, potentials periodic in space cause gaps in energy. Aharonov told how, as a graduate student, he thought about generalizing this to potentials periodic in time, which should analogously lead to gaps in momentum. This was the initial step towards the AB effect, which at the time caused a lot of fuss, with some eminent physicists unwilling to accept it.
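The statement “a periodic potential opens gaps in the energy spectrum” can be illustrated with the standard two-plane-wave (nearly-free-electron) model; this little sketch and its numbers are my own illustration, not from the talk. At the zone boundary k = G/2 the two free-electron levels are degenerate, and a single Fourier component V of the lattice potential splits them by 2|V|.

```python
import math

# Units with hbar^2/2m = 1; G is the reciprocal lattice vector and V is one
# Fourier component of the periodic potential (illustrative numbers).
G = 2.0
V = 0.3

def bands(k):
    """Eigenvalues of the 2x2 nearly-free-electron Hamiltonian
       [[k^2, V], [V, (k - G)^2]] at crystal momentum k."""
    a, b = k ** 2, (k - G) ** 2
    mean = 0.5 * (a + b)
    split = math.sqrt((0.5 * (a - b)) ** 2 + V ** 2)
    return mean - split, mean + split

# At the zone boundary k = G/2 the free levels k^2 and (k-G)^2 coincide,
# and the potential opens a gap of exactly 2|V|.
lower, upper = bands(G / 2)
print(round(upper - lower, 6))  # gap = 2|V| = 0.6
```

Away from the zone boundary the two bands approach the free-electron parabolas, so the gap is a genuinely periodic-potential effect, which is what makes Aharonov’s time-periodic generalization (gaps in momentum instead of energy) such a natural question.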

The mechanism behind the AB effect is by now well established and well understood, but it has deep implications for the inherently non-local nature of quantum mechanics. Aharonov proceeded by recasting the standard way of looking at it into the so-called Heisenberg picture, which he claimed identifies the non-locality issue much more clearly. This was very interesting and I’d have to look into it more carefully, perhaps by reading this article or his book.

The second day was devoted to the electron microscopy itself. What people basically do is shoot a beam of electrons at a target sample, and detect the beam a little further on. So you are essentially looking at the ‘shadow of the electron beam’, but, as with lasers, you can also probe how phase and coherence properties are altered by the interaction with the sample. The higher the energy you can put into the electron beam, the higher your resolution will be, and by now one can look at individual atoms. There’s a nice explanation on the project’s website itself. On this day I especially enjoyed the talk by David Smith, outlining step by step the current limits of the machines and indicating where improvements can be made.

The current project, which was initiated and supervised by Tonomura, aims to build a 1.2 MV electron microscope, making it the most powerful in the world. It is expected to be operational next year. It is funded by the FIRST program, which is essentially part of the money the government assigned to stimulate the economy after the financial crisis of 2008. My own group is also funded by this program.

© 2012 Aron Beekman